Tracking animals without markers in the wild
Researchers from the Cluster of Excellence Collective Behaviour have developed a computer vision framework for posture estimation and identity tracking that can be used in indoor environments as well as in the wild.
UNIVERSITY OF KONSTANZ
Two pigeons are pecking at grains in a park in Konstanz. A third pigeon flies in. There are four cameras in the immediate vicinity. Doctoral students Alex Chan and Urs Waldmann from the Cluster of Excellence Collective Behaviour at the University of Konstanz are filming the scene. After an hour, they return with the footage to their office to analyze it with a computer vision framework for posture estimation and identity tracking. The framework detects and draws a box around all pigeons. It records central body parts and determines their posture, their position, and their interaction with the other pigeons around them. All of this happens without any markers being attached to the pigeons and without any human being called in to help. This would not have been possible just a few years ago.
3D-MuPPET, a framework to estimate and track 3D poses of up to 10 pigeons
Markerless methods for animal posture tracking have developed rapidly in recent years, but frameworks and benchmarks for tracking large animal groups in 3D are still lacking. To close this gap, researchers from the Cluster of Excellence Collective Behaviour at the University of Konstanz and the Max Planck Institute of Animal Behavior present 3D-MuPPET, a framework to estimate and track the 3D poses of up to 10 pigeons at interactive speed using multiple camera views. The related study was recently published in the International Journal of Computer Vision (IJCV).
Important milestone in animal posture tracking and automatic behavioural analysis
Urs Waldmann and Alex Chan recently finalized a new method called 3D-MuPPET, which stands for 3D Multi-Pigeon Pose Estimation and Tracking. 3D-MuPPET is a computer vision framework for posture estimation and identity tracking of up to 10 individual pigeons from 4 camera views, based on data collected both in captive environments and in the wild. “We trained a 2D keypoint detector and triangulated points into 3D, and also show that models trained on single pigeon data work well with multi-pigeon data,” explains Urs Waldmann. This is a first example of 3D animal posture tracking for an entire group of up to 10 individuals. The new framework thus provides a concrete method for biologists to design experiments and measure animal posture for automatic behavioural analysis. “This framework is an important milestone in animal posture tracking and automatic behavioural analysis,” say Alex Chan and Urs Waldmann.
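To illustrate the triangulation step mentioned in the quote, here is a minimal sketch of lifting matched 2D keypoints from two calibrated camera views into 3D. It uses OpenCV's cv2.triangulatePoints; the projection matrices, keypoint coordinates, and the triangulate_keypoints helper are illustrative assumptions, not code from 3D-MuPPET.

```python
import numpy as np
import cv2

def triangulate_keypoints(P1, P2, pts1, pts2):
    """Lift matched 2D keypoints from two calibrated views into 3D.

    P1, P2     : 3x4 camera projection matrices (intrinsics @ [R | t]).
    pts1, pts2 : Nx2 arrays of the same keypoints detected in each view.
    Returns an Nx3 array of 3D points.
    """
    pts_h = cv2.triangulatePoints(P1, P2,
                                  pts1.T.astype(np.float64),
                                  pts2.T.astype(np.float64))
    pts_h /= pts_h[3]            # homogeneous -> Euclidean coordinates
    return pts_h[:3].T

# Two hypothetical cameras: a reference camera and one shifted 0.2 units along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])

# One keypoint (e.g. the beak) detected in normalized image coordinates in each view.
pts1 = np.array([[0.51, 0.48]])
pts2 = np.array([[0.31, 0.48]])
print(triangulate_keypoints(P1, P2, pts1, pts2))   # approx. [[0.51, 0.48, 1.0]]
```

With four synchronized views, as in the pigeon recordings, pairs of views could be triangulated and averaged, or a multi-view direct linear transform could be solved instead.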
Framework can be used in the wild
In addition to tracking pigeons indoors, the framework was also extended to pigeons in the wild. “Using the Segment Anything Model, a model that can identify the outline of any object in an image, we further trained a 2D keypoint detector with masked pigeons from the captive data, then applied the model to pigeon videos outdoors without any extra model finetuning,” states Alex Chan. 3D-MuPPET presents one of the first case studies on how to transition from tracking animals in captivity towards tracking animals in the wild, allowing fine-scaled behaviours of animals to be measured in their natural habitats. The developed methods can potentially be applied to other species in future work, with potential applications in large-scale collective behaviour research and non-invasive species monitoring.
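As a rough illustration of the masking step Chan describes, the following sketch uses the Segment Anything Model to cut a pigeon out of its bounding box before the image is passed to a 2D keypoint detector. The checkpoint file name, model size, and the mask_pigeon helper are assumptions for illustration; this is not the authors' released code.

```python
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

# Load a SAM checkpoint (model size and file name are placeholders).
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth")
predictor = SamPredictor(sam)

def mask_pigeon(image, box):
    """Black out everything except the pigeon inside a detector's bounding box.

    image : HxWx3 RGB uint8 array.
    box   : [x0, y0, x1, y1] bounding box from an object detector.
    Returns the masked image that would be fed to the 2D keypoint detector.
    """
    predictor.set_image(image)
    masks, _, _ = predictor.predict(box=np.array(box), multimask_output=False)
    pigeon_mask = masks[0]                      # boolean HxW mask of the pigeon
    return np.where(pigeon_mask[..., None], image, 0)
```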
3D-MuPPET showcases a powerful and flexible framework for researchers who would like to use 3D posture reconstruction of multiple individuals to study collective behaviour in any environment or species. As long as a multi-camera setup and a 2D posture estimator are available, the framework can be applied to track the 3D postures of any animal.
Key facts:
- Publication: Waldmann, U. & Chan, A.H.H. et al. 3D-MuPPET: 3D Multi-Pigeon Pose Estimation and Tracking. Int J Comput Vis (2024). DOI: https://doi.org/10.1007/s11263-024-02074-y
- The article is part of a special issue on “Computer Vision Approaches for Animal Tracking and Modeling 2023” in IJCV.
- The framework is based on the 3D-POP dataset and the I-MuPPET framework, developed by researchers in the Cluster of Excellence Collective Behaviour.
- Alex Chan and Urs Waldmann are doctoral students in the fields of biology and computer science at the Cluster of Excellence Collective Behaviour. Senior author Fumihiro Kano is a junior group leader at the Cluster of Excellence Collective Behaviour.
- The Centre for the Advanced Study of Collective Behaviour at the University of Konstanz is a global hotspot for the study of collective behaviour across a wide range of species and across scales of organization. It is a Cluster of Excellence within the framework of the German Excellence Strategy of the federal and state governments.
Note to editors:
Watch a video about 3D-MuPPET here: https://youtu.be/GZZ_u53UpfQ?si=pW0DQRM2oelku41c
You can download photos here:
- https://www.uni-konstanz.de/fileadmin/pi/fileserver/2024_EXTRA/Tierbewegungen_in_freier_wildbahn.jpg
Caption: 3D-MuPPET offers a computer vision framework for 3D posture estimation and identity tracking of animals in indoor environments as well as in the wild.
Copyright: Alex Chan
New AI tool developed for advanced animal behavior analysis
The animal analysis model has vast applications, from biological research to robotics
Animal behavior analysis is a fundamental tool in various studies, ranging from basic neuroscience research to understanding the causes and treatments of diseases. It is widely applied not only in biological research but also across various industrial fields, including robotics. Recently, efforts have been made to analyze animal behaviors accurately using AI technology. However, AI still falls short of recognizing different behaviors as intuitively as human observers can.
Traditional animal behavior research primarily involved filming animals with a single camera and analyzing low-dimensional data such as the time and frequency of specific movements. This analysis method provided the AI with a labeled result for each piece of training data, akin to feeding it questions along with the answer key. While this method is straightforward, it requires time- and labor-intensive human supervision to build the dataset. Observer bias is also a factor, as the analysis results can be distorted by the experimenter's subjective judgment.
To overcome those limitations, a joint research team led by Director C. Justin LEE of the Center for Cognition and Sociality within the Institute for Basic Science, and CHA Meeyoung, the Chief Investigator (CI) of the Data Science Group at the IBS Center for Mathematical and Computational Sciences (also a professor in the School of Computing at KAIST), has developed a new analytical tool called 'SUBTLE' (Spectrogram-UMAP-Based Temporal-Link Embedding). SUBTLE classifies and analyzes animal behavior through AI learning based on 3D movement information.
First, the research team recorded the movements of mice using multiple cameras, extracting the coordinates of nine key points such as the head, legs, and hips to obtain 3D action skeleton movement data over time. They then reduced this time-series data into two dimensions for embedding, a process that creates a collection of vectors corresponding to each piece of data, allowing complex data to be represented more concisely and meaningfully.
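A minimal sketch of this pipeline, short-time spectrograms of the keypoint time series followed by a 2D UMAP embedding, might look as follows. The sampling rate, window length, and feature stacking are illustrative assumptions; only the general spectrogram-then-UMAP idea comes from the article.

```python
import numpy as np
from scipy.signal import spectrogram
import umap

# Hypothetical input: T frames of 9 keypoints in 3D from the multi-camera system.
T, fs = 6000, 30                      # 6000 frames at 30 Hz (illustrative values)
keypoints = np.random.rand(T, 9, 3)   # stand-in for real tracked coordinates
signals = keypoints.reshape(T, -1)    # (T, 27) time series, one column per coordinate

# Short-time spectrogram of every coordinate channel, then stack the
# frequency bins of all channels into one feature vector per time window.
features = []
for ch in signals.T:
    f, t, Sxx = spectrogram(ch, fs=fs, nperseg=64, noverlap=32)
    features.append(Sxx)              # (n_freq_bins, n_windows)
features = np.concatenate(features, axis=0).T   # (n_windows, 27 * n_freq_bins)

# Nonlinear reduction to 2D with UMAP for clustering and visualization.
embedding = umap.UMAP(n_components=2, random_state=0).fit_transform(features)
print(embedding.shape)                # (n_windows, 2)
```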
Next, the researchers clustered similar behavior states into subclusters and grouped these subclusters into superclusters representing standardized behavior patterns (repertoires), such as walking, standing, grooming, etc. During this process, they proposed a new metric called the Temporal Proximity Index (TPI) to evaluate behavior data clusters. This metric measures whether each cluster includes the same behavior state and effectively represents temporal movements, similar to how humans consider temporal information important when classifying behavior.
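The following sketch continues the example above: fine-grained subclusters are grouped into a few superclusters, and a simple temporal-coherence score is computed. The cluster counts are arbitrary, and the temporal_coherence function is only an illustrative stand-in; the paper's actual TPI definition is not given in this article.

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

# Stand-in for the 2D UMAP embedding from the previous sketch.
embedding = np.random.rand(186, 2)

# Fine-grained subclusters first, then grouped into a handful of superclusters
# (the standardized behavior repertoires).
sub_labels = KMeans(n_clusters=30, n_init=10, random_state=0).fit_predict(embedding)
centroids = np.array([embedding[sub_labels == k].mean(axis=0) for k in range(30)])
super_of_sub = AgglomerativeClustering(n_clusters=5).fit_predict(centroids)
super_labels = super_of_sub[sub_labels]        # supercluster of each time window

def temporal_coherence(labels):
    """Illustrative temporal-coherence score, NOT the paper's exact TPI:
    the fraction of consecutive time windows that stay in the same cluster.
    Higher values mean the clustering respects temporal continuity."""
    return float(np.mean(labels[1:] == labels[:-1]))

print(temporal_coherence(super_labels))
```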
CI CHA Meeyoung stated, "The introduction of new evaluation metrics and benchmark data to aid in the automation of animal behavior classification is a result of the collaboration between neuroscience and data science. We expect this algorithm to be beneficial in various industries requiring behavior pattern recognition, including the robotics industry, which aims to mimic animal movements."
Director C. Justin LEE of the Center for Cognition and Sociality, who led this research commented, "We have developed an effective behavior analysis framework that minimizes human intervention while understanding complex animal behaviors by applying human behavior pattern recognition mechanisms. This framework has significant industrial applications and can also be used as a tool to gain deeper insights into the principles of behavior recognition in the brain."
Additionally, the research team transferred SUBTLE technology to Actnova, a company specializing in AI-based clinical and non-clinical behavior test analysis, in April last year. The team utilized Actnova's animal behavior analysis system, AVATAR3D, to obtain 3D movement data of animals for this research.
The research team has also made SUBTLE's code open source, and a user-friendly graphical user interface (GUI) to facilitate animal behavior analysis is available through the SUBTLE web service (http://subtle.ibs.re.kr/) for researchers who are not familiar with programming.
This research was published online on May 20th in the prestigious AI computer vision journal, the International Journal of Computer Vision (IJCV).
Figure 2. Schematic of the SUBTLE framework
JOURNAL
International Journal of Computer Vision
METHOD OF RESEARCH
Experimental study
SUBJECT OF RESEARCH
Animals
ARTICLE TITLE
SUBTLE: An Unsupervised Platform with Temporal Link Embedding that Maps Animal Behavior
ARTICLE PUBLICATION DATE
28-May-2024