Artificial Intelligence 'could let humans talk to animals' in real-life Dr Dolittle
So much effort has gone into teaching systems such as Siri and Alexa to understand us that using AI to decode animal communication could be a logical next step.
Talking to animals, in the way Dr Dolittle did, might seem like subject matter more appropriate for a family film than for serious scientific research.
But Professor Michael Bronstein at Imperial College London says that Artificial Intelligence could provide the key to unlocking completely alien languages – such as the complex songs of whales.
Bronstein is already working on an AI chatbot that could decipher sperm whales’ unique language even though it has no points of reference with any human communication.
He does concede, though, that the first conversations between humans and whales may be “only a rough approximation of the true depth and meaning of what they’re saying” because our lives and reference points are so completely different.
Nevertheless, Professor Bronstein heads a team at the Cetacean Translation Initiative – or CETI – which has been set up in the hope of one day decoding the secret language of sperm whale communication.
In a paper published in Scientific Reports in 2019, the team documented some encouraging first steps.
Sperm whales are able to communicate with others of their kind over huge distances
It might be a bit more complicated than it was for Dr Dolittle
They made thousands of recordings of whale communication which, after analysis, enabled them to make detailed predictions about which specific whale was likely to “speak” next.
But they still had no idea what the mighty beasts were actually saying.
Professor Michael Bronstein believes we could train an AI to interpret the songs of whales
It’s not even clear whether animals have a language in the same way that humans do – which could make any attempt at translation doomed to failure.
Professor Sophie Scott, a leading expert on the neuroscience of voices and speech, told the Daily Star that while many animals are vocalising all the time, most – for example pigs – “don’t seem to be saying very much.”
“However elephants and dolphins,” she adds, “seem to have huge complexity to their communication.”
Octopuses, too, she says, clearly have complex problem-solving intelligence. “But with no common frame of reference, it’s hard to see what we could find to talk about.”
Professor Scott told the Daily Star that elephants have their own complex language
Birds, too, she says, could be sharing complex information about themselves as part of the daily dawn chorus.
She says that Dr Alan McElligott’s research at the University of Roehampton into communication with goats and other animals shows there’s a lot we can learn from our four-legged friends even without AI.
The problem would be that the lives these creatures lead are so thoroughly alien that it will be a lot harder to learn their language than it was for – say – an explorer meeting a previously uncontacted group of humans.
Dr Alan McElligott has undertaken extensive research in the field of animal communication
When Europeans first met the people of Polynesia and Australia, after a separation lasting thousands of years, they still ate similar foods, and lived in broadly similar social groups.
The use of AI enables ideas and wider concepts to be interpreted, rather than words and sentences.
We are working with very different tools to American neuroscientist Dr John Lilly, who hoped to use dolphin communication to teach NASA how to talk to aliens.
If, or when, we eventually encounter alien intelligences, we will both need AI to interpret for us
In recent years, so much effort has gone into teaching computers to understand human speech – with systems such as Siri and Alexa – that decoding animal communication could be a logical next step.
Professor Bronstein told New Scientist: “I think it’s the right time, with the right data and with the right expertise, to possibly solve this problem.”
Ironically, communicating with actual extra-terrestrial intelligences – who may well have their own AIs that would “speak” to ours – could be an easier challenge than learning to chat with the wide and varied array of intelligences we’ve been sharing the planet with all along.