Tuesday, July 25, 2023


Studying animal behavior without markers


With a new markerless method, it is now possible to track the gaze and fine-scaled behaviours of every individual bird and how each animal moves in space with others.


Peer-Reviewed Publication

UNIVERSITY OF KONSTANZ




Researchers are still puzzling over how animal collectives behave, but recent advances in machine learning and computer vision are revolutionizing the possibilities for studying animal behaviour. Complex behaviours, such as social learning or collective vigilance, can now be deciphered with these new techniques.

An interdisciplinary research team from the Cluster of Excellence Centre for the Advanced Study of Collective Behaviour (CASCB) at the University of Konstanz and the Max Planck Institute of Animal Behavior has now succeeded in developing a novel markerless method to track bird postures in 3D using only video recordings. It is no longer necessary to attach position or movement transmitters to the animals. With this method, called 3D-POP (3D posture of pigeons), it is possible to record a group of pigeons and identify the gaze and fine-scaled behaviours of every individual bird, as well as how each animal moves in space with others. "With the dataset, researchers can study collective behaviour of birds by just using at least two video cameras, even in the wild," says Alex Chan, PhD student at the CASCB.
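
The underlying geometric idea is classical multi-view triangulation: once the cameras are calibrated and synchronised, a body keypoint detected in two or more images can be lifted to a single 3D point per frame. The sketch below illustrates that step with OpenCV's triangulatePoints; the camera parameters and pixel coordinates are made-up placeholders rather than values from 3D-POP, and the actual pipeline adds keypoint detection, identity assignment, and many keypoints per bird on top of this.

    import numpy as np
    import cv2

    # Shared intrinsics and two camera poses -- placeholder values for illustration,
    # not calibration data from the 3D-POP recordings.
    K = np.array([[1000.0,    0.0, 640.0],
                  [   0.0, 1000.0, 360.0],
                  [   0.0,    0.0,   1.0]])
    R2, _ = cv2.Rodrigues(np.array([0.0, 0.3, 0.0]))   # second camera rotated about y
    t2 = np.array([[-0.5], [0.0], [0.0]])              # ...and shifted 0.5 m sideways

    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # 3x4 projection matrix, camera 1
    P2 = K @ np.hstack([R2, t2])                       # 3x4 projection matrix, camera 2

    # 2D detections of the same keypoint (e.g. a beak tip) in each view, shape (2, N).
    pts_cam1 = np.array([[412.0], [305.0]])
    pts_cam2 = np.array([[398.0], [310.0]])

    # Triangulate to homogeneous 3D coordinates and convert to (N, 3).
    X_h = cv2.triangulatePoints(P1, P2, pts_cam1, pts_cam2)
    X = (X_h[:3] / X_h[3]).T
    print("triangulated 3D keypoint:", X[0])

Repeating this for every keypoint and frame yields the 3D postures and trajectories that the dataset provides.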

The dataset was released at the Conference on Computer Vision and Pattern Recognition (CVPR) in June 2023 and is available open access so that it can be reused by other researchers. The researchers Hemal Naik and Alex Chan see two potential application areas: scientists working with pigeons can use the dataset directly and, with at least two cameras, study the behaviour of multiple freely moving pigeons; and the annotation method can be applied to other birds or even other animals, so that researchers can soon decipher their behaviour as well.
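
A bird's "gaze" can then be read directly off such 3D keypoints: with one point on the beak tip and one on the head, a head-direction vector follows from simple vector arithmetic. The snippet below is a minimal, hypothetical illustration of that idea; the keypoint names and coordinates are invented for this example and are not the official 3D-POP annotation labels.

    import numpy as np

    def head_direction(beak_tip, head_centre):
        """Unit vector from the head centre towards the beak tip.

        Both arguments are 3D keypoints (x, y, z) of one bird in one frame,
        e.g. obtained by triangulating two camera views. The keypoint names
        are illustrative, not the official 3D-POP labels.
        """
        v = np.asarray(beak_tip) - np.asarray(head_centre)
        return v / np.linalg.norm(v)

    # Hypothetical triangulated keypoints (in metres) for one pigeon:
    beak = np.array([0.12, 0.03, 0.45])
    head = np.array([0.10, 0.02, 0.44])
    gaze = head_direction(beak, head)

    # Angle between the head direction and a neighbour's position -- a simple
    # proxy for whether that neighbour falls within the focal bird's field of view.
    neighbour = np.array([0.50, 0.00, 0.60])
    to_neighbour = (neighbour - head) / np.linalg.norm(neighbour - head)
    angle = np.degrees(np.arccos(np.clip(gaze @ to_neighbour, -1.0, 1.0)))
    print(f"head direction: {gaze}, angle to neighbour: {angle:.1f} deg")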

Key facts:

  • An interdisciplinary team of computer scientists, biologists, and comparative psychologists from the CASCB and the Max Planck Institute of Animal Behavior developed a new method for generating large-scale datasets of multiple animals.
  • Publication: Hemal Naik, Alex Hoi Hang Chan, Junran Yang, Mathilde Delacoux, Iain D. Couzin, Fumihiro Kano, Máté Nagy: "3D-POP - An Automated Annotation Approach to Facilitate Markerless 2D-3D Tracking of Freely Moving Birds with Marker-Based Motion Capture". Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023, pp. 21274-21284.
    https://openaccess.thecvf.com/content/CVPR2023/html/Naik_3D-POP_-_An_Automated_Annotation_Approach_to_Facilitate_Markerless_2D-3D_CVPR_2023_paper.html
  • The paper was published at the Conference on Computer Vision and Pattern Recognition (CVPR), one of the most prestigious computer science conferences in the world.
  • The dataset and code to apply the annotation method to other birds can be found here: https://github.com/alexhang212/Dataset-3DPOP
  • The study was funded by the Cluster of Excellence Centre for the Advanced Study of Collective Behaviour at the University of Konstanz.

Note to editors:

A photo is available here:
https://www.uni-konstanz.de/fileadmin/pi/fileserver/2023/tierverhalten_zukuenftig_markerlos.jpg

Caption: The posture of pigeons is recognized by automated image recognition, purely from video recordings.

Copyright: Alex Chan
