The project aims to strengthen human bonds with other living species, and to promote their protection, by using machine learning to decipher non-human communication.
A group of researchers at the Earth Species Project (ESP), an organization based in California, United States, intends to use machine learning to translate the “languages” of animals into something humans can understand, and wants to apply this to the entire animal kingdom.
Although it sounds like something out of science fiction, the plan, which the researchers liken in ambition to “going to the Moon”, is not really new: it is rooted in humans’ long-standing interest in the study of animal vocalizations.
For example, we know that the warning calls of different primates vary depending on the predator; we also know that dolphins communicate with characteristic whistles; and we know that some songbirds can rearrange the components of their calls to convey different meanings.
However, most experts refrain from calling this language, because no animal communication system meets all the requirements. And until recently, decoding it relied on careful observation.
Using machine learning
But now there has been a surge in the use of machine learning to handle the huge volumes of data that can be collected by contemporary animal sensors, according to The Guardian.
For example, Elodie Briefer, an associate professor who studies vocal communication in mammals, said machine learning is now being used to understand animal communication. Briefer co-developed an algorithm that determines from a pig’s grunts whether the animal is feeling happy or sad.
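The article doesn’t describe how Briefer’s algorithm works. As a purely illustrative sketch of the general idea, one could extract simple acoustic features from a grunt (its duration and dominant frequency) and apply a rule that maps them to an emotional valence. The features, thresholds, and labels below are made up for the example, not taken from the actual study:

```python
import numpy as np

def grunt_features(signal, sr=8000):
    """Extract two toy acoustic features: duration (s) and dominant frequency (Hz)."""
    duration = len(signal) / sr
    spectrum = np.abs(np.fft.rfft(signal))
    dominant_hz = np.fft.rfftfreq(len(signal), 1 / sr)[np.argmax(spectrum)]
    return duration, dominant_hz

def classify_valence(signal, sr=8000, max_dur=0.4, max_hz=300.0):
    """Toy rule: short, low-pitched grunts -> 'positive', otherwise 'negative'.
    The thresholds are invented for illustration only."""
    duration, dominant_hz = grunt_features(signal, sr)
    return "positive" if duration < max_dur and dominant_hz < max_hz else "negative"

# Two synthetic "grunts": a short low-frequency tone and a long high-frequency one.
sr = 8000
t_short = np.linspace(0, 0.2, int(sr * 0.2), endpoint=False)
t_long = np.linspace(0, 0.8, int(sr * 0.8), endpoint=False)
happy = np.sin(2 * np.pi * 150 * t_short)
sad = np.sin(2 * np.pi * 600 * t_long)
print(classify_valence(happy, sr), classify_valence(sad, sr))
```

A real system would learn such a decision boundary from thousands of labeled recordings rather than use hand-set thresholds, but the pipeline shape (features in, label out) is the same.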
Are you interested in how #bioacoustics data can help us research and understand how animals communicate? Check out this #tech4wildlife position from @earthspecies and find out how to apply:https://t.co/FUE8RiRs8z pic.twitter.com/5gnJIMb75O
— WILDLABS Community (@WILDLABSNET) July 23, 2022
Another project is DeepSqueak, which analyzes the ultrasonic calls of rodents to determine whether they are under stress; there is also CETI (short for Cetacean Translation Initiative), which aims to translate sperm whale communication using machine learning.
Apply it to the entire animal kingdom
For its part, the California-based nonprofit Earth Species Project (ESP), founded in 2017 with the help of Silicon Valley investors such as LinkedIn co-founder Reid Hoffman, says its approach is different because it does not focus on deciphering the communication of a single species, but of all of them.
ESP thus plans to first decipher animal communication using machine learning, and then make its findings available to everyone.
Aza Raskin, co-founder and president of ESP, says he wants to strengthen human ties with other living species, and to promote their protection, by using machine learning to decipher non-human communication.
“We are species agnostic,” Raskin told The Guardian, adding that the translation algorithms ESP is developing are designed to “work across all of biology, from worms to whales.”
Forms of non-verbal communication
As The Guardian explains, dolphin trainers communicate with their hands, signing “together” and “create”. The two trained dolphins exchange sounds, then swim off, turn around and raise their tails, devising a new trick of their own and executing it.
Raskin noted that this does not establish that the dolphins have a language, only that, had they had access to a linguistic communication tool, the task would have been much easier.
In the interview, Raskin also said that, like humans, animals have various forms of non-verbal communication, such as the special “waggle dance” bees perform to tell each other to land on a specific flower.
Advances in experimental algorithms
Despite the seemingly insurmountable challenges facing the group, the project has made at least some progress, such as an experimental algorithm that can supposedly detect which individual in a noisy group of animals is “talking.”
A second algorithm can supposedly generate imitated animal calls to “talk” directly to them. “It’s making the AI speak the language,” Raskin told The Guardian, “even though we don’t know what that means yet.”
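ESP has not published the details of its “who is talking” algorithm in this article. As a hypothetical toy version of the problem, if each animal in a group carried its own microphone, a call could be attributed to whichever channel carries the most energy. Everything here (the per-animal microphone setup, the synthetic signals, the RMS rule) is an assumption for illustration:

```python
import numpy as np

def loudest_channel(channels):
    """Attribute a call to the channel (animal-mounted mic) with the highest RMS energy."""
    rms = [np.sqrt(np.mean(np.square(c))) for c in channels]
    return int(np.argmax(rms))

# Three channels of synthetic background noise; animal 1 vocalizes (a 220 Hz tone).
rng = np.random.default_rng(0)
n, sr = 4000, 8000
channels = [0.05 * rng.standard_normal(n) for _ in range(3)]
t = np.arange(n) / sr
channels[1] = channels[1] + 0.8 * np.sin(2 * np.pi * 220 * t)
print(loudest_channel(channels))  # channel 1 carries the call
```

Real bioacoustic recordings are far messier (overlapping calls, moving animals, a single shared microphone), which is why a learned model rather than a simple energy rule is needed in practice.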
Skepticism about the new technology
Although this type of research has very interesting implications, not everyone is excited about the power of AI.
For example, Robert Seyfarth of the University of Pennsylvania believes the technology could be useful for problems such as identifying an animal’s vocal repertoire. But there are other areas, such as discovering the meaning and function of vocalizations, where he is skeptical that it adds much.
Only time will tell if the project will succeed. What is clear is the growing role of artificial intelligence and machine learning in the future of science.
“These are the tools that allow us to take off our human glasses to understand entire communication systems,” says Raskin.
Rachel Maga is a technology journalist currently working at Globe Live Media agency. She has been in the Technology Journalism field for over five years now. Her life’s biggest milestone is the inside tour of Tesla Industries, which was gifted to her by the legend Elon Musk himself.