Animal Locomotion Analysis
Team

Oliver Mothes, Daniel Haase, Manuel Amthor

Motivation

The detailed understanding of animal locomotion plays an important role in many fields of research, e.g., biology, motion science, and robotics. To analyze the locomotor system in vivo, high-speed X-ray acquisition is applied, in which the localization of anatomical landmarks is of main interest. To this day, the evaluation of these sequences is a very time-consuming task, since human experts have to manually annotate anatomical landmarks in single images. Therefore, automating this task with a minimum of user interaction is urgently needed.
In this project, computer vision principles are combined with machine learning methods to tackle this automation task.

X-ray Animal Skeleton Tracking

The detailed understanding of animals in locomotion is a relevant field of research in biology, biomechanics, and robotics. To examine the locomotor system of birds in vivo and in a surgically non-invasive manner, high-speed X-ray acquisition is the state of the art. For a biological evaluation, it is crucial to locate relevant anatomical structures of the locomotor system. There is an urgent need for automating this task, as vast amounts of data exist and manual annotation is extremely time-consuming. We present a biologically motivated skeleton model tracking framework based on a pictorial structure approach which is extended by robust sub-template matching. This combination makes it possible to deal with severe self-occlusions and challenging ambiguities. As opposed to model-driven methods which require substantial amounts of labeled training data, our approach is entirely data-driven and can easily handle unseen cases. Thus, it is well suited for large-scale biological applications with a minimum of manual interaction. We validate the performance of our approach on 24 real-world X-ray locomotion datasets, and achieve results which are comparable to established methods while clearly outperforming more general approaches.
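The core of a pictorial structure approach is the joint optimization of part locations, trading off per-part appearance scores against deformation penalties between connected parts. As a rough sketch only (not the framework from the publication), the following illustrates this idea for a simple kinematic chain of skeleton parts, solved exactly by Viterbi-style dynamic programming; the `unary` and `pairwise` inputs stand in for appearance and deformation models that the actual framework would learn or design:

```python
import numpy as np

def chain_pictorial_structure(unary, pairwise):
    """Viterbi-style dynamic programming for a chain of skeleton parts.

    unary:    (P, L) appearance scores for P parts over L candidate
              locations (higher is better).
    pairwise: (L, L) deformation penalty between the locations of
              two consecutive parts in the chain.
    Returns the jointly optimal location index per part and its score.
    """
    P, L = unary.shape
    score = unary[0].astype(float).copy()
    back = np.zeros((P, L), dtype=int)
    for p in range(1, P):
        # total[i, j]: best score so far if part p-1 sits at location i
        # and part p at location j (appearance minus deformation cost)
        total = score[:, None] - pairwise
        back[p] = total.argmax(axis=0)
        score = total.max(axis=0) + unary[p]
    # backtrack the globally optimal part configuration
    path = [int(score.argmax())]
    for p in range(P - 1, 0, -1):
        path.append(int(back[p, path[-1]]))
    return path[::-1], float(score.max())
```

Because the parts form a chain (more generally, a tree), this global optimum is found in O(P·L²) time rather than by searching all L^P configurations, which is what makes pictorial structures tractable for skeleton models.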

Animal Locomotion Analysis using Augmented Active Appearance Models

For many fundamental problems and applications in biomechanics, biology, and robotics, an in-depth understanding of animal locomotion is essential. To analyze the locomotion of animals, high-speed X-ray videos are recorded, in which anatomical landmarks of the locomotor system are of main interest and must be located. To date, several thousand sequences have been recorded, which makes a manual annotation of all landmarks practically impossible. Therefore, automating X-ray landmark tracking in locomotion scenarios is worthwhile. However, tracking all landmarks of interest is a very challenging task, as severe self-occlusions of the animal and low contrast are present in the images due to the X-ray modality. For this reason, existing approaches are currently only applicable for very specific subsets of anatomical landmarks. In contrast, our goal is to present a holistic approach which models all anatomical landmarks in one consistent, probabilistic framework. While active appearance models (AAMs) provide a reasonable global modeling framework, they yield poor fitting results when applied on the full set of landmarks. In this paper, we propose to augment the AAM fitting process by imposing constraints from various sources. We derive a general probabilistic fitting approach and show how results of subset AAMs, local tracking, anatomical knowledge, and epipolar constraints can be included. The evaluation of our approach is based on 32 real-world datasets of five bird species which contain 175,942 ground-truth landmark positions provided by human experts. We show that our method clearly outperforms standard AAM fitting and provides reasonable tracking results for all landmark types. In addition, we show that the tracking accuracy of our approach is even sufficient to provide reliable three-dimensional landmark estimates for calibrated datasets.
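To give an intuition for combining several probabilistic constraints on a landmark, here is a minimal sketch, not the paper's actual formulation: assuming each source (e.g., a subset AAM, a local tracker, an anatomical prior) yields an independent Gaussian estimate of the landmark position, their product is again Gaussian, and the fused estimate is a precision-weighted combination:

```python
import numpy as np

def fuse_gaussian_estimates(means, covs):
    """Product of independent Gaussian densities over a landmark position.

    means: list of (d,) mean vectors, one per constraint source.
    covs:  list of (d, d) covariance matrices, one per source.
    Returns the fused mean and covariance (precision-weighted average):
    confident sources (small covariance) dominate the result.
    """
    precision = np.zeros_like(covs[0], dtype=float)
    information = np.zeros_like(means[0], dtype=float)
    for m, C in zip(means, covs):
        C_inv = np.linalg.inv(C)
        precision += C_inv            # precisions add
        information += C_inv @ m      # information vectors add
    fused_cov = np.linalg.inv(precision)
    return fused_cov @ information, fused_cov
```

For instance, fusing an AAM estimate at (0, 0), a local-tracker estimate at (2, 0) (both with unit covariance), and a weaker anatomical prior at (1, 1) with twice the covariance yields a fused mean of (1.0, 0.2), pulled only slightly toward the less confident prior.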


Online Tracking

Recent advances in the understanding of animal locomotion have proven it to be a key element of many fields in biology, motion science, and robotics. For the analysis of walking animals, high-speed X-ray videography is employed. For a biological evaluation of these X-ray sequences, anatomical landmarks have to be located in each frame. However, due to the motion of the animals, severe occlusions complicate this task and standard tracking methods cannot be applied. We present a robust tracking approach which is based on the idea of dividing a template into sub-templates to overcome occlusions. In contrast to other sub-template approaches, we allow soft decisions for the fusion of the single hypotheses, which greatly benefits tracking stability. In addition, we show how anatomical knowledge can be incorporated into the tracking process to further improve the performance. Experiments on real datasets show that our method achieves results superior to those of existing robust approaches.
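The benefit of soft decisions can be sketched in a few lines. In this hypothetical example (not the published method), each sub-template proposes a displacement of the full template together with a matching score; instead of a hard winner-takes-all choice, the hypotheses are blended with softmax weights, so an occluded sub-template with a poor score is smoothly suppressed:

```python
import numpy as np

def soft_fuse(displacements, scores, temperature=0.1):
    """Soft fusion of sub-template displacement hypotheses.

    displacements: (N, d) displacement proposed by each sub-template.
    scores:        (N,) matching scores (e.g., NCC, higher is better).
    temperature:   lower values approach a hard winner-takes-all choice.
    Returns the score-weighted mean displacement.
    """
    s = np.asarray(scores, dtype=float)
    # softmax over scores (shifted by the max for numerical stability)
    w = np.exp((s - s.max()) / temperature)
    w /= w.sum()
    return (w[:, None] * np.asarray(displacements, dtype=float)).sum(axis=0)
```

With one outlier hypothesis (e.g., a sub-template lost to occlusion proposing a large, wrong displacement at a low score), the fused result stays close to the consensus of the well-matching sub-templates, while the temperature parameter controls how softly disagreeing hypotheses are traded off.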

Publications
2022
Emanuel Andrada, Oliver Mothes, Heiko Stark, Matthew C. Tresch, Joachim Denzler, Martin S. Fischer, Reinhard Blickhan:
Limb, Joint and Pelvic Kinematic Control in the Quail Coping with Steps Upwards and Downwards.
Scientific Reports. 12 (1) : pp. 15901. 2022.
[bibtex] [pdf] [web] [doi] [abstract]
2021
Emanuel Andrada, Oliver Mothes, Dirk Arnold, Joachim Denzler, Martin S. Fischer, Reinhard Blickhan:
Uncovering Stability Principles of Avian Bipedal Uneven Locomotion.
26th Congress of the European Society of Biomechanics (ESB). 2021.
[bibtex]
2019
Oliver Mothes, Joachim Denzler:
One-Shot Learned Priors in Augmented Active Appearance Models for Anatomical Landmark Tracking.
Computer Vision, Imaging and Computer Graphics -- Theory and Applications. Pages 85-104. 2019.
[bibtex] [web] [doi] [abstract]
2017
Oliver Mothes, Joachim Denzler:
Anatomical Landmark Tracking by One-shot Learned Priors for Augmented Active Appearance Models.
International Conference on Computer Vision Theory and Applications (VISAPP). Pages 246-254. 2017.
[bibtex] [pdf] [web]
2015
Emanuel Andrada, Daniel Haase, Yefta Sutedja, John A. Nyakatura, Brandon M. Kilbourne, Joachim Denzler, Martin S. Fischer, Reinhard Blickhan:
Mixed Gaits in Small Avian Terrestrial Locomotion.
Scientific Reports. 5 : 2015.
[bibtex] [web] [doi] [abstract]
2014
Daniel Haase, John A. Nyakatura, Joachim Denzler:
Comparative Large-Scale Evaluation of Human and Active Appearance Model Based Tracking Performance of Anatomical Landmarks in X-ray Locomotion Sequences.
Pattern Recognition and Image Analysis. Advances in Mathematical Theory and Applications (PRIA). 24 (1) : pp. 86-92. 2014.
[bibtex] [web] [abstract]
2013
Daniel Haase, Emanuel Andrada, John A. Nyakatura, Brandon M. Kilbourne, Joachim Denzler:
Automated Approximation of Center of Mass Position in X-ray Sequences of Animal Locomotion.
Journal of Biomechanics. 46 (12) : pp. 2082-2086. 2013.
[bibtex]
Daniel Haase, Joachim Denzler:
2D and 3D Analysis of Animal Locomotion from Biplanar X-ray Videos Using Augmented Active Appearance Models.
EURASIP Journal on Image and Video Processing. 45 : pp. 1-13. 2013.
[bibtex] [pdf]
2012
Manuel Amthor, Daniel Haase, Joachim Denzler:
Fast and Robust Landmark Tracking in X-ray Locomotion Sequences Containing Severe Occlusions.
International Workshop on Vision, Modelling, and Visualization (VMV). Pages 15-22. 2012.
[bibtex] [abstract]
2011
Daniel Haase, Joachim Denzler:
Anatomical Landmark Tracking for the Analysis of Animal Locomotion in X-ray Videos Using Active Appearance Models.
Scandinavian Conference on Image Analysis (SCIA). Pages 604-615. 2011.
[bibtex] [pdf] [abstract]
Daniel Haase, Joachim Denzler:
Comparative Evaluation of Human and Active Appearance Model Based Tracking Performance of Anatomical Landmarks in Locomotion Analysis.
Open German-Russian Workshop on Pattern Recognition and Image Understanding (OGRW). Pages 96-99. 2011.
[bibtex] [pdf]
Daniel Haase, John A. Nyakatura, Joachim Denzler:
Multi-view Active Appearance Models for the X-ray Based Analysis of Avian Bipedal Locomotion.
Symposium of the German Association for Pattern Recognition (DAGM). Pages 11-20. 2011.
[bibtex] [pdf] [abstract]