Robotic bird created to study animal communication


A Human Frontier Science Program press release

A collaborative team funded by the Human Frontier Science Program has developed a new hyper-realistic robotic finch to study how songbirds learn their songs.

A zebra finch, as used in the study

In a study published in Methods in Ecology and Evolution, an interdisciplinary team funded by the Human Frontier Science Program (HFSP) has developed a robotic zebra finch, the so-called ‘RoboFinch’, to experimentally study the sensory cues songbirds need to learn their song.


Zebra finches are a popular model organism for studying vocal learning due to their ability to learn and modify their songs. By creating robotic models that closely resemble their real-life counterparts, researchers can study animal behaviour with stimuli that can be manipulated in ways that would be impossible with live animals.

Ralph Simon, one of the developers of the RoboFinch, said: “Robotic models allow us to study animal communication in totally new ways, for example, the poorly understood interaction between acoustic and visual components of a signal.”

There is substantial evidence that combined exposure to acoustic and visual cues enhances song learning in young birds. As the RoboFinch allows full experimental control over auditory and visual cues, the researchers can now investigate whether seeing how sound is produced is important for learning to sing well.

A young male zebra finch standing next to the RoboFinch. Credit: Ralph Simon

The robot produces beak movements that can be precisely matched to the presented song at high speed, suitable for the zebra finch’s highly acute vision. To validate the RoboFinch, the team raised young zebra finches alongside the robot throughout the sensitive phase for song learning.
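As a rough illustration of the kind of audio-visual synchronisation described above, the sketch below derives a schedule of beak-opening angles from the amplitude envelope of a song recording. This is not the authors’ published RoboFinch software: the file name “song.wav”, the 100 Hz command rate, and the maximum gape angle are all hypothetical assumptions.

```python
# Hypothetical sketch: deriving a beak-movement schedule from a song recording.
# Not the authors' actual RoboFinch software; file name, command rate, and the
# maximum gape angle are illustrative assumptions.
import numpy as np
from scipy.io import wavfile

FRAME_RATE_HZ = 100   # how often a new beak position would be issued (assumed)
MAX_GAPE_DEG = 25.0   # assumed maximum beak opening angle

def beak_schedule(wav_path: str):
    """Return a list of (time_s, angle_deg) commands matched to the song."""
    sample_rate, audio = wavfile.read(wav_path)
    if audio.ndim > 1:                        # mix stereo down to mono
        audio = audio.mean(axis=1)
    audio = audio.astype(np.float64)
    samples_per_frame = int(sample_rate / FRAME_RATE_HZ)

    schedule = []
    for i in range(0, len(audio) - samples_per_frame, samples_per_frame):
        frame = audio[i:i + samples_per_frame]
        envelope = np.sqrt(np.mean(frame ** 2))   # RMS amplitude of this frame
        schedule.append((i / sample_rate, envelope))

    # Normalise so the loudest frame maps to the widest beak opening.
    peak = max(e for _, e in schedule) or 1.0
    return [(t, MAX_GAPE_DEG * e / peak) for t, e in schedule]

if __name__ == "__main__":
    for t, angle in beak_schedule("song.wav")[:5]:
        print(f"t={t:.2f}s  beak angle={angle:.1f} deg")
```

In practice, each (time, angle) pair would be sent to whatever actuator drives the beak, so the visual movement stays locked to the audio playback.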

Judith Varkevisser, who conducted the behavioural experiments, explained: “Zebra finches are often careful with new objects, but they approached the RoboFinch immediately! They perched next to the robot, singing at him, but not when he sang or moved. They really seemed to be listening to it.”

3D-printed model of the upper and lower beak (left) and the inner mechanics of the RoboFinch (right). Credit: Rogier Elsinga

This demonstrates that the RoboFinch can be used in song-learning studies to address the question of whether seeing the movements associated with song production helps in learning the song. The researchers also hope to use the RoboFinch to gain insights into how experience affects the development of vocal behaviour, both in multiple species and in field settings.

The team has provided detailed descriptions and manuals alongside their paper, and all technical details and programs are open source in order to allow other researchers to replicate the RoboFinch or adapt it to other species.

The full article is free to read for a limited time:

Simon, R., Varkevisser, J., Mendoza, E., Hochradel, K., Elsinga, R., Wiersma, P. G., Middelburg, E., Zoeter, E., Scharff, C., Riebel, K., & Halfwerk, W. (2023). RoboFinch: A versatile audio-visual synchronised robotic bird model for laboratory and field research on songbirds. Methods in Ecology and Evolution, 00, 1–12. https://doi.org/10.1111/2041-210X.14063
