AI Advances Unlock Secrets of Dolphin Communication

A research team led by Laela Sayigh won the $100,000 Coller Dolittle Challenge for using AI to analyze dolphin whistles, suggesting these sounds function like words shared within communities. This breakthrough leverages decades of data and deep learning to explore animal communication, signaling a new era where AI accelerates understanding of complex interspecies languages.

Published May 26, 2025 at 01:14 PM EDT in Artificial Intelligence (AI)

Imagine cracking the code of dolphin language—sounds that might be as rich and structured as human words. A groundbreaking team of scientists from the Woods Hole Oceanographic Institution, led by Laela Sayigh, has taken a significant step toward this goal by winning the inaugural $100,000 Coller Dolittle Challenge prize. Their research used non-invasive hydrophones to capture dolphin whistles, revealing that these marine mammals may use specific whistles as alarm signals or responses to unfamiliar situations within their communities.

The challenge in decoding animal communication lies in the sheer volume of data required. Jonathan Birch, a judge for the prize and professor at the London School of Economics, highlights that unlike human language models like ChatGPT, which train on trillions of words, researchers lack comparable datasets for animals. However, the Sarasota Dolphin Research Program has amassed an extraordinary library of dolphin whistles over 40 years, enabling the application of deep learning techniques to analyze these sounds.
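To make the idea concrete, here is a minimal, self-contained sketch of how bioacoustic analysis pipelines of this kind typically begin: audio is converted into spectrograms with a short-time Fourier transform, and whistles are then compared by spectral similarity. This is an illustration only, not the Sarasota team's actual method; the signals, template names, and nearest-template classifier are all invented for demonstration.

```python
# Illustrative sketch: spectrogram extraction plus a crude
# nearest-template "whistle type" match. Real studies use recorded
# hydrophone data and deep neural networks, not synthetic tones.
import numpy as np

def spectrogram(signal, frame=256, hop=128):
    """Magnitude STFT: rows are time frames, columns are frequency bins."""
    window = np.hanning(frame)
    frames = [signal[i:i + frame] * window
              for i in range(0, len(signal) - frame + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))

def whistle(freq, sr=8000, dur=0.5):
    """Synthetic stand-in for a whistle: a pure tone at `freq` Hz."""
    t = np.arange(int(sr * dur)) / sr
    return np.sin(2 * np.pi * freq * t)

# Mean spectra serve as toy "signature" templates for two whistle types.
templates = {name: spectrogram(whistle(f)).mean(axis=0)
             for name, f in [("low", 500), ("high", 1500)]}

def classify(signal):
    """Assign a signal to the template with the closest mean spectrum."""
    spec = spectrogram(signal).mean(axis=0)
    return min(templates, key=lambda k: np.linalg.norm(spec - templates[k]))

print(classify(whistle(520)))  # a 520 Hz tone matches the "low" template
```

Deep-learning systems replace the hand-built templates with learned representations, which is why the size of the training library matters so much.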

The award ceremony also recognized teams studying communication in nightingales, marmoset monkeys, and cuttlefish, underscoring a global push to decode animal communication. The Coller Dolittle Challenge, a collaboration between the Jeremy Coller Foundation and Tel Aviv University, accepts submissions annually, fostering innovation in this emerging field.

AI is revolutionizing animal communication research. Kate Zacarian, CEO of the Earth Species Project, emphasizes that AI not only accelerates data processing but also enables entirely new inquiries into communication as a dynamic, structured phenomenon—something beyond human cognitive limits. Their open-source NatureLM audio model exemplifies this, analyzing sounds from orcas to jumping spiders, promising to unveil new insights soon.

The implications are profound: understanding animal languages could transform conservation efforts, enhance human-animal interactions, and deepen our grasp of cognition across species. This fusion of biology and AI heralds a new frontier where machines help us listen to the natural world in ways never before possible.

