The naked mole rat may not be much to look at, but it has much to say. The wrinkled, whiskered rodents, which, like many ants, live in large, underground colonies, have an elaborate vocal repertoire. They whistle, trill and twitter; grunt, hiccup and hiss. And when two of the voluble rats meet in a dark tunnel, they exchange a standard salutation. “They’ll make a soft chirp, and then a repeating soft chirp,” said Alison Barker, a neuroscientist at the Max Planck Institute for Brain Research, in Germany. “They have a little conversation.”
Hidden in this everyday exchange is a wealth of social information, Dr. Barker and her colleagues discovered when they used machine-learning algorithms to analyse 36,000 soft chirps recorded in seven mole rat colonies. Not only did each mole rat have its own vocal signature, but each colony had its own distinct dialect, which was passed down, culturally, over generations. During times of social instability — as in the weeks after a colony’s queen was violently deposed — these cohesive dialects fell apart. When a new queen began her reign, a new dialect appeared to take hold. “The greeting call, which I thought was going to be pretty basic, turned out to be incredibly complicated,” said Dr. Barker, who is now studying the many other sounds the rodents make. “Machine-learning kind of transformed my research.”
Machine-learning systems, which use algorithms to detect patterns in large collections of data, have excelled at analysing human language, giving rise to voice assistants that recognise speech, transcription software that converts speech to text and digital tools that translate between human languages.
In recent years, scientists have begun deploying this technology to decode animal communication, using machine-learning algorithms to identify when squeaking mice are stressed or why fruit bats are shouting. Even more ambitious projects are underway — to create a comprehensive catalogue of crow calls, map the syntax of sperm whales and even to build technologies that allow humans to talk back.
“Let’s try to find a Google Translate for animals,” said Diana Reiss, an expert on dolphin cognition and communication at Hunter College and co-founder of Interspecies Internet, a think tank devoted to facilitating cross-species communication.

The field is young and many projects are still in their infancy; humanity is not on the verge of having a Rosetta Stone for whale songs or the ability to chew the fat with cats. But the work is already revealing that animal communication is far more complex than it sounds to the human ear, and the chatter is providing a richer view of the world beyond our own species.

“I find it really intriguing that machines might help us to feel closer to animate life, that artificial intelligences might help us to notice biological intelligences,” said Tom Mustill, a wildlife and science filmmaker and the author of the forthcoming book, “How to Speak Whale.” “This is like we’ve invented a telescope — a new tool that allows us to perceive what was already there but we couldn’t see before.”

Studies of animal communication are not new, but machine-learning algorithms can spot subtle patterns that might elude human listeners. For instance, scientists have shown that these programs can tell apart the voices of individual animals, distinguish between sounds that animals make in different circumstances and break their vocalisations down into smaller parts, a crucial step in deciphering meaning.
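To give a flavour of how a program might tell apart the voices of individual animals, here is a deliberately tiny sketch of a nearest-centroid classifier. Everything in it is invented for illustration — the feature choices, the numbers and the rat names are hypothetical, and real pipelines (including the mole-rat study) work on much richer acoustic features such as spectrograms — but the underlying idea is the same: learn each caller's typical feature pattern, then assign new calls to whichever pattern they sit closest to.

```python
import math
import random

random.seed(0)

# Hypothetical setup: each "chirp" is reduced to two acoustic features,
# say mean pitch (kHz) and duration (ms). These features and values are
# invented for illustration only.

def make_chirps(pitch, duration, n=50, spread=0.3):
    """Simulate n feature vectors scattered around a caller's typical chirp."""
    return [(random.gauss(pitch, spread), random.gauss(duration, spread))
            for _ in range(n)]

# Two hypothetical mole rats with slightly different vocal signatures.
training = {
    "rat_a": make_chirps(pitch=4.0, duration=6.0),
    "rat_b": make_chirps(pitch=5.5, duration=7.5),
}

# Summarise each caller by the mean of its training chirps (its centroid).
centroids = {
    rat: tuple(sum(vals) / len(vals) for vals in zip(*chirps))
    for rat, chirps in training.items()
}

def identify(chirp):
    """Attribute a new chirp to the caller with the nearest centroid."""
    return min(centroids, key=lambda rat: math.dist(chirp, centroids[rat]))

print(identify((4.1, 5.9)))  # near rat_a's signature → "rat_a"
print(identify((5.4, 7.6)))  # near rat_b's signature → "rat_b"
```

The same skeleton extends naturally to the other tasks the paragraph mentions: swap "individual caller" for "colony" and the centroids become dialect signatures; swap it for "context" and the classifier distinguishes the circumstances in which a sound was made.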