How Scientists Are Using AI To Talk To Animals

In the 1970s, a young gorilla named Koko caught the world's attention with her ability to use human sign language. But skeptics contend that Koko and other animals that "learned" to speak (including chimpanzees and dolphins) could not truly understand what they were "saying," and that trying to make other species use human language, in which symbols can stand for things that are not physically present, is futile.

"One group of researchers wanted to see if animals communicate symbolically, and another group said, 'That's anthropomorphism.' We need it. Scientists are now using advanced sensors and artificial intelligence technology to monitor and decipher how various species, including plants, exchange information through their communication channels. This field of "digital bioacoustics " is the subject of Baker's new book " Sounds of ". How digital technology brings us closer to the animal and plant world .

Scientific American spoke with Bakker about how technology can help humans communicate with creatures such as bats and bees, and how those conversations are forcing us to reconsider our relationships with other species.

[Below is an edited transcript of the interview.]

Can you give us a brief history of human attempts to communicate with animals?

In the mid-20th century there were many attempts to teach human language to nonhuman primates such as Koko. Those efforts were somewhat controversial. Looking back, one view we hold today (which was perhaps less common then) is that those attempts were too anthropocentric in their design. The desire was to assess nonhuman intelligence by teaching animals to speak our language, when really we should have been thinking about their ability to engage in complex communication on their own terms, in their own embodied way, within their own worldview. One of the terms used in the book is the concept of the umwelt, which is the notion of the lived experience of an organism. If we are attentive to the umwelt of another organism, we would not expect a bee to speak human language, but we would take a very deep interest in the fascinating language of bees, which is vibrational and positional. Bees are sensitive to nuances such as the polarization of sunlight that we cannot even begin to convey with our own bodies. That is where the science is today. The field of digital bioacoustics, which is accelerating at breakneck speed and revealing remarkable findings about communication across the tree of life, approaches these animals not by asking, "Can they speak like humans?" but by asking, "Can they communicate complex information to one another? How are they doing so? What is significant to them?" And I would say that is a more biocentric approach, or at least a less anthropocentric one.

From a broader perspective, I also think it is important to recognize that there is a long and venerable tradition of listening to nature: "deep listening." It is an ancient art that is still practiced, and that requires no tools at all. Many Indigenous communities have deep, finely attuned traditions of listening to nonhuman voices. So if we combine digital listening, which is opening up vast new worlds of nonhuman sound and decoding that sound with artificial intelligence, with deep listening, I believe we are on the brink of two important discoveries. The first is language in nonhumans, and that is a very controversial claim we can dig into. The second is: I believe we are at the brink of interspecies communication.

What technologies have made this achievement possible?

Digital bioacoustics relies on very small, portable, lightweight digital recorders, which are like miniature microphones that scientists are installing everywhere from the Arctic to the Amazon. You can put these microphones on the backs of turtles or whales. You can put them deep in the ocean or on the highest mountaintop; you can attach them to birds. They can record continuously, 24 hours a day, in remote places that scientists cannot easily reach, even in the dark, and without the disturbance that human observers would create in an ecosystem.

These devices generate a deluge of data, and that is where artificial intelligence comes in, because the same natural-language-processing algorithms that we use to such great effect in tools such as Google Translate can also be used to detect patterns in nonhuman communication.
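
To make that pattern-detection step concrete, here is a minimal Python sketch of the kind of unsupervised pipeline such work can start from: each recorded clip is summarized as an acoustic feature vector, and a clustering algorithm groups similar vocalizations into candidate "call types." The file paths, feature choice, and cluster count are illustrative assumptions, not details from Bakker's book or any particular study.

```python
# A minimal sketch: represent each recorded clip as an acoustic
# feature vector and let an unsupervised algorithm group similar
# vocalizations. Paths and the cluster count are hypothetical.
import glob

import librosa  # audio loading and feature extraction
import numpy as np
from sklearn.cluster import KMeans

def clip_embedding(path: str) -> np.ndarray:
    """Summarize one clip as the mean of its MFCC frames."""
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Embed every clip in a (hypothetical) folder of field recordings.
paths = sorted(glob.glob("recordings/*.wav"))
X = np.stack([clip_embedding(p) for p in paths])

# Cluster the clips; each cluster is a candidate "call type" that a
# researcher can then compare against video of the animals' behavior.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
for path, label in zip(paths, labels):
    print(f"{path} -> call type {label}")
```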

What would be an example of this communication style?

In the chapter on bats, I discuss the research of Yossi Yovel. In one particular study, he monitored [nearly two dozen] Egyptian fruit bats for two and a half months and recorded their vocalizations. His team then adapted a voice-recognition program to analyze [15,000 of] the sounds, and the algorithm correlated specific sounds with specific social interactions captured on video, such as when two bats fought over food. Using this approach, the researchers were able to classify the majority of the bats' vocalizations. That is how Yovel and other researchers, such as Gerry Carter, have been able to show that bats have much more complex language than we previously understood. Bats argue over food; they actually distinguish between the sexes when they communicate with one another; they have individual names, or "signature calls." Mother bats speak to their pups in an equivalent of "motherese." But whereas human mothers raise the pitch of their voices when talking to babies, mother bats lower the pitch, which elicits a babble response in the pups, which learn to "speak" specific words or referential signals as they grow up. So bats engage in vocal learning.
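
As a rough illustration of that correlation step, the sketch below trains a classifier to predict a social-interaction context from the acoustic features of a call. This is a loose analogue of adapting voice-recognition software, not Yovel's actual pipeline; the features, context labels, and randomly generated stand-in data are all hypothetical, included only so the example runs end to end.

```python
# A loose, simplified analogue of the supervised step: given acoustic
# features for each bat call and a context label taken from video
# (e.g., a food squabble), train a classifier to predict context from
# sound alone. All data and labels below are stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# X: (n_calls, n_features) acoustic features; y: context labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = rng.choice(["food", "perch", "sleep", "mating"], size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)

# With real features and labels, per-context precision/recall here
# would indicate which call types are reliably distinguishable.
print(classification_report(y_te, model.predict(X_te)))
```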

This is a great example of how deep learning can derive patterns from all these instruments, all these sensors and microphones, and tell us things we could never detect with human ears. Most of bats' communication is ultrasonic, above our hearing range, and bats speak much faster than we do, so we have to slow their vocalizations down and lower their frequency in order to listen to them. So we cannot listen like a bat, but our computers can. The next insight, of course, is that our computers can also speak back to the bats. [Software] can create specific patterns and use those to communicate back to bat colonies or to beehives, and that is what researchers are now doing.
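
The slow-down trick itself fits in a few lines of Python. Writing a recording back out with a smaller declared sample rate stretches it in time and lowers its pitch in one step, pulling an ultrasonic call into the audible range. The file names and the 10x factor are illustrative assumptions.

```python
# A minimal sketch of the playback trick described above: an
# ultrasonic recording becomes audible if you slow it down, which
# also drops its pitch. Declaring a lower sample rate on output
# does both at once.
import librosa
import soundfile as sf

y, sr = librosa.load("bat_call.wav", sr=None)  # e.g., recorded at 250 kHz
slowdown = 10  # 10x slower; pitch falls by log2(10) ~ 3.3 octaves

# Writing the same samples with a 10x smaller sample rate stretches a
# 100 kHz call down to 10 kHz, well inside the human hearing range.
sf.write("bat_call_audible.wav", y, sr // slowdown)
```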

How do researchers talk to bees?

The honeybee research is marvelous. A [researcher] named Tim Landgraf studies bee communication, which, as I mentioned earlier, is vibrational and positional: when bees "speak" to one another, their body movements matter as well as their sounds. Computers, and particularly deep-learning algorithms, can now follow this, because they can combine computer vision with natural language processing. Researchers have refined these algorithms to the point where they can actually track individual bees and determine what impact one individual's communication has on another bee. From this emerges the ability to decode honeybee language. We have found that bees have specific signals. [Researchers have given these signals] funny names. Bees toot; they quack. There is a "hush" or "stop" signal and a whooping "danger" signal. They have piping signals [related to swarming] and begging and shaking signals, and those all direct collective and individual behavior.
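
To give a flavor of the tracking step, here is a toy Python sketch that links per-frame bee detections into individual trajectories by nearest-neighbor assignment. A real system, Landgraf's included, is far more sophisticated (detection networks, occlusion handling, identity markers); every detail here is illustrative.

```python
# A toy sketch of multi-object tracking: given per-frame bee
# positions (from a computer-vision detector, not shown), match
# detections across frames by minimizing total movement distance.
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_frames(prev_xy: np.ndarray, curr_xy: np.ndarray):
    """Match bees in the previous frame to bees in the current frame."""
    # Pairwise distances between all previous and current positions.
    dists = np.linalg.norm(prev_xy[:, None, :] - curr_xy[None, :, :],
                           axis=2)
    rows, cols = linear_sum_assignment(dists)
    return list(zip(rows.tolist(), cols.tolist()))

# Two consecutive frames with three detected bees each (toy data).
frame0 = np.array([[10.0, 12.0], [40.0, 41.0], [80.0, 15.0]])
frame1 = np.array([[41.0, 43.0], [11.0, 13.0], [79.0, 17.0]])
print(link_frames(frame0, frame1))  # [(0, 1), (1, 0), (2, 2)]
```

Chaining these frame-to-frame matches yields a trajectory per bee, which is what lets researchers ask how one individual's signal changes another individual's subsequent behavior.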

Landgraf's next step was to encode this information into a robot, which he called RoboBee. Eventually, after seven or eight prototypes, Landgraf had a "bee" that could enter the hive and essentially issue commands that the bees would obey. So Landgraf's robot bee can tell the other bees to stop, and they do. It can also do something more complex: the famous waggle dance, the communication pattern bees use to tell one another where a nectar source is located. It is a fairly simple experiment to run, in a way, because you put a nectar source in a spot that no bees from the hive are visiting, you instruct the robot to tell the bees where the nectar is, and then you see whether the bees fly there successfully. And indeed they did. It is an astonishing result.
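
The dance itself follows a well-studied code that a robot like RoboBee has to reproduce: the angle of the waggle run relative to vertical on the comb corresponds to the food's bearing relative to the sun, and the duration of the run scales with distance. Below is a small worked sketch of that mapping in Python; the one-kilometer-per-waggle-second constant is a commonly cited rough figure that varies between colonies, so treat it as an assumption.

```python
# A worked sketch of the waggle-dance code: dance angle (relative to
# vertical on the comb) encodes the food's bearing relative to the
# sun; waggle-run duration scales with distance. The calibration
# constant is a rough, colony-dependent assumption.
KM_PER_WAGGLE_SECOND = 1.0

def dance_for_target(sun_azimuth_deg: float,
                     target_bearing_deg: float,
                     target_distance_km: float):
    """Return (dance angle from vertical in degrees, waggle duration in s)."""
    angle = (target_bearing_deg - sun_azimuth_deg) % 360.0
    duration = target_distance_km / KM_PER_WAGGLE_SECOND
    return angle, duration

# Nectar 30 degrees clockwise of the sun's direction, 1.5 km away:
print(dance_for_target(sun_azimuth_deg=180.0,
                       target_bearing_deg=210.0,
                       target_distance_km=1.5))
# -> (30.0, 1.5): waggle 30 degrees right of vertical, 1.5 s per run
```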

This raises many philosophical and ethical questions. One could imagine using such a system to protect honeybees, for example by directing them to safe nectar sources rather than ones contaminated with high levels of pesticides. One could also imagine it as a way to further domesticate a species we have never fully tamed, or as an attempt to control the behavior of other wild species. And the notion that nonhuman communication can reach this level of complexity and sophistication raises some very important philosophical questions about the uniqueness of language as a human capacity.

How does this technology affect our understanding of nature?

The invention of digital bioacoustics is analogous to the invention of the microscope. When [the Dutch scientist Antonie] van Leeuwenhoek started looking through his microscopes, he discovered the microbial world, and in doing so he laid the foundation for countless later discoveries. So the microscope enabled humans to see anew with both our eyes and our imaginations. The analogy here is that digital bioacoustics, combined with artificial intelligence, is like a planetary-scale hearing aid that enables us to listen anew with both our technologically enhanced ears and our imaginations. This is slowly opening our minds not only to the wonderful sounds that nonhumans make but also to fundamental questions about the human/nonhuman divide and our relationship to other species. And it opens up new ways of thinking about the environment and our relationship to the planet. It is quite profound.
