AI is already changing the human world, and it may soon change the world of animals as well.
The idea of holding a conversation with a dolphin or understanding the intricate social calls of a bird has long been the stuff of science fiction. However, according to Drew Purves, co-lead of Google DeepMind’s Sustainability Program, this long-held fantasy is moving closer to scientific possibility. On a recent podcast, Purves articulated a vision in which advanced AI, specifically foundation models, could not only decode animal communication but fundamentally reshape humanity’s relationship with nature.
Purves, whose work sits at the intersection of ecology and artificial intelligence, believes that the same kind of technology powering large language models for humans can be adapted for the non-human world. “I like to think that we will be able to talk to animals using AI at some point, and that these embedding, foundational modeling approaches like Perch will be really important in that,” Purves stated. “But, of course, you’d need some other elements there.”
He immediately pointed to tangible progress in this field, offering a glimpse into how researchers are already laying the groundwork. “Interestingly, some colleagues of mine have been involved in producing this thing called DolphinGemma, which is a large language model that’s been adapted to be suitable for beginning to decode dolphin communication,” he explained. “It takes the sounds, separates them out, tokenizes them, and basically brings them into the world of large language modeling. It’s early days, but that’s an example of AI actively being used to study animal communication at a level that we really couldn’t reach before.”
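The pipeline Purves describes, from raw sound to a discrete token sequence a language model can consume, can be illustrated with a toy sketch. Everything below is invented for illustration: the frame sizes, the magnitude-spectrum features, and the random codebook are stand-ins, not DolphinGemma's actual (unpublished) tokenizer.

```python
import numpy as np

def frame_audio(signal, frame_len=256, hop=128):
    # Slice the waveform into overlapping frames.
    n = 1 + (len(signal) - frame_len) // hop
    return np.stack([signal[i * hop : i * hop + frame_len] for i in range(n)])

def spectral_features(frames):
    # Magnitude spectrum per frame: a crude stand-in for the learned
    # embeddings a real bioacoustics model would produce.
    return np.abs(np.fft.rfft(frames, axis=1))

def tokenize(features, codebook):
    # Map each frame to the index of its nearest codebook vector,
    # turning continuous audio into discrete tokens.
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)

rng = np.random.default_rng(0)
signal = rng.standard_normal(4096)            # placeholder "whistle" audio
feats = spectral_features(frame_audio(signal))
codebook = feats[rng.choice(len(feats), 16, replace=False)]  # toy codebook
tokens = tokenize(feats, codebook)
print(tokens[:10])
```

Once the audio is a sequence of integer tokens, it is in exactly the form that standard language-model training machinery expects, which is the conceptual bridge Purves is pointing at.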
The “Perch” model he references is a bioacoustics classifier developed by Google, trained on a vast library of bird vocalizations to identify thousands of species by their sounds. Projects like Perch and DolphinGemma represent a new frontier in bioacoustics, moving beyond simple cataloging toward understanding the very structure of animal vocalizations.
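At its simplest, this kind of species identification can be caricatured as a similarity lookup in an embedding space. The sketch below is purely illustrative: the species names, eight-dimensional vectors, and cosine-similarity rule are all hypothetical stand-ins, not Perch's actual architecture.

```python
import numpy as np

# Hypothetical per-species reference embeddings, as an embedding-based
# classifier might hold after training (names and vectors are invented).
rng = np.random.default_rng(1)
species = ["robin", "wren", "thrush"]
references = {name: rng.standard_normal(8) for name in species}

def classify(embedding):
    # Return the species whose reference embedding is most similar
    # (by cosine similarity) to the clip's embedding.
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(species, key=lambda s: cosine(embedding, references[s]))

# A clip embedding lying close to the "wren" reference is labeled wren.
clip = references["wren"] + 0.01 * rng.standard_normal(8)
print(classify(clip))
```

A production system would use embeddings learned from spectrograms and a trained classifier head rather than raw similarity, but the core idea of mapping a sound clip into a space where species cluster is the same.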
However, for Purves, the true potential of this work extends far beyond data analysis. He believes its most profound impact could be on human consciousness and our connection to the planet. “Most of the work we’re doing at the moment is filling known information gaps in known processes,” he noted. “But sometimes, the real change can come in the long run from these moments of awakening where people, almost overnight, can change their relationship with nature.”
To illustrate this point, Purves drew on two historical precedents that dramatically shifted public perception. “I think we’ve seen at least two examples in the past. One was the first picture of the Earth from the moon,” he said, referencing the iconic “Earthrise” photograph taken during the Apollo 8 mission in 1968. “People often trace some of the more recent increases in the modern conservation movement to that picture.”
His second example struck a more acoustic chord. “Another one is whale song. Just listening to whales, even if we can’t tell what they’re saying, really changed the way people thought about whales,” Purves recalled. The 1970 album “Songs of the Humpback Whale” is widely credited with galvanizing the “Save the Whales” movement and helping bring about the global moratorium on commercial whaling. These moments, he argues, created a powerful, intuitive connection that data alone could not.
This leads to the ultimate, transformative goal of his vision. It’s not just about listening in, but about the potential for a genuine dialogue. “So, the idea of being able to do that for more species with things like Perch, but then being able to decode that to say what they’re actually saying, and maybe one day even have some kind of conversation—if AI can help to empower that, that might in the long run be the most powerful role of AI.”
Purves’s vision is part of a burgeoning field where technology and conservation converge, and he is not alone in this endeavor. Organizations like the Earth Species Project, a non-profit, are also using AI to decode animal communication with the explicit goal of fostering a deeper connection to nature. Similarly, Project CETI (Cetacean Translation Initiative) is a large-scale, interdisciplinary effort to apply advanced machine learning and robotics to listening to, and ultimately translating, the communication of sperm whales. These initiatives, powered by breakthroughs in AI, signal a clear trend: humanity is on the cusp of listening to the planet in an entirely new way. Years ago, Google ran an April Fools’ prank claiming it had developed a way to communicate with animals. In 2025, the idea no longer seems so far from reality.