What’s the project about?
Visually driven collective behaviour is well understood: given enough light, each individual in
a group can sense its surroundings and thus follow or influence its neighbours. But what happens when individuals can't always sense their surroundings, yet still move as a group?
This is the situation in echolocating bat swarms: individuals 'jam' each other's sonar and somehow still manage to fly together! We collected multi-camera thermal video of bat groups in a cave to understand what acoustically driven collective behaviour looks like. In this project you will generate 3D trajectories of single and multiple flying bats from the multi-camera data, and interpret the flight with reference to a LiDAR scan of the cave. The insights from the tracking data will reveal, for the first time, what the flight behaviour of actively sensing groups looks like in their natural roosting sites.
For more on the project, check out the Ushichka project sites here and here.
Tasks
- 2D and 3D flight trajectory generation
- Flight trajectory verification
- Generation of simple statistics like alignment, inter-neighbour distance etc.
- Potentially drive trajectory analysis in your own direction!
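To give a flavour of the "simple statistics" above, here is a minimal sketch of how alignment (group polarisation) and nearest-neighbour distance could be computed from 3D tracking data. The function names and the example positions/velocities are hypothetical and just for illustration; they are not part of the project's actual codebase.

```python
import numpy as np

def alignment(velocities):
    """Polarisation: length of the mean unit heading vector.
    1.0 = everyone flies in the same direction, ~0 = disordered."""
    headings = velocities / np.linalg.norm(velocities, axis=1, keepdims=True)
    return np.linalg.norm(headings.mean(axis=0))

def nearest_neighbour_distances(positions):
    """Distance from each individual to its closest neighbour (N x 3 input)."""
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(dists, np.inf)  # exclude self-distances
    return dists.min(axis=1)

# Hypothetical snapshot: three bats flying in roughly the same direction
pos = np.array([[0.0, 0.0, 2.0], [1.0, 0.5, 2.2], [2.0, 0.1, 1.9]])
vel = np.array([[1.0, 0.1, 0.0], [1.0, 0.0, 0.1], [0.9, 0.2, 0.0]])
print(alignment(vel))                    # close to 1 for aligned flight
print(nearest_neighbour_distances(pos))  # metres, one value per bat
```

In practice these would be computed per video frame across a trajectory, but the per-frame calculation is exactly this.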
Who can apply?
The project is available to both BSc and MSc students.
What would be helpful?
It would be helpful if you're familiar with Python or MATLAB. While most of the project is built with graphical interfaces, there may be occasional cases where you'll need to run some code yourself.
What you'll learn
You will learn 2D and 3D animal tracking, a skill that transfers across many experimental paradigms in the lab and field, and has uses beyond biology. You'll also learn some
basic analyses of collective movement, and potentially drive pilot analyses in related directions such as spatial memory.
Who should I contact?
Thejasvi Beleyur & Iain Couzin, Department of Collective Behavior