Will We Ever Speak With Animals?
Also about the Conclave, Google, and Apple
Executive Summary
A Bocconi University model predicted Robert Prevost as the new Pope.
Google launched the AI Futures Fund to back startups using DeepMind's AI tools.
Apple is considering AI search options for Safari as traditional searches decline, threatening its deal with Google.
Decoding animal talk faces huge hurdles.
Was this email forwarded to you? Subscribe here.
News
This model predicted the new Pope before the Conclave announced him. Link
Bocconi University developed a model that ranked cardinals on their status, information control, and coalition-building capacity; Robert Prevost came out on top in the "Status" category.
"Status," as the researchers define it, rewards cardinals who are connected not only to many cardinals but also to the most influential ones.
Based on this prediction, at least some secrets of the Conclave were uncovered: social influence and networking play a key role, as in any other social setting. No surprise there.
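That definition closely matches eigenvector centrality from network analysis. Whether the Bocconi model uses exactly this formula is our assumption, but a minimal sketch with networkx, on a hypothetical network of ties between cardinals, conveys the idea:

```python
# A minimal sketch, assuming the "Status" metric behaves like
# eigenvector centrality; the ties between cardinals are hypothetical.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("Prevost", "Parolin"), ("Prevost", "Tagle"), ("Prevost", "Zuppi"),
    ("Parolin", "Tagle"), ("Parolin", "Zuppi"),
])

# Eigenvector centrality scores a node higher when its neighbours are
# themselves central: "connected not only to many, but to the influential".
status = nx.eigenvector_centrality(G)
for cardinal, score in sorted(status.items(), key=lambda kv: -kv[1]):
    print(f"{cardinal}: {score:.3f}")
```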
Google has launched the AI Futures Fund. Link
A new initiative to invest in startups, from seed to late-stage, that are developing applications using AI tools from Google DeepMind.
Applications for the AI Futures Fund opened on May 12 and are accepted on a rolling basis, with no fixed deadlines; early participants include Viggle (a meme-creation platform) and Toonsutra (a webtoon app).
Google also runs its Google for Startups Founders Funds, which support founders from a range of industries and backgrounds, including those building AI companies.
Apple is considering AI search options for Safari. Link
Apple is "actively looking at" integrating AI search options into its Safari browser within the next year, having already held discussions with companies like Perplexity, OpenAI, and Anthropic.
This consideration comes as Eddy Cue, Apple's senior vice president of services, revealed during Google's antitrust trial that searches in Safari declined last month for the first time in 22 years, a trend he attributes to the growing use of AI.
In his testimony, Cue noted that Google pays Apple around $20 billion a year to remain the default search engine in Safari, and pointed out that if the number of searches goes down, Apple's earnings from the arrangement fall with it.
Discussion

Will we ever speak with animals?
For most of human history, we could deliver only simple pieces of information to members of other tribes and cultures. Gestures, symbols, and sounds were our main tools for cross-cultural communication.
As the world grew more interconnected, our communication across cultures became more advanced and we immersed ourselves in the languages of other nations. Through education and foreign-language learning, we became capable of delivering complex messages across regions. The most groundbreaking shift came only recently, with the advancement of language models.
Today, we can hold a conversation on any topic with a speaker of a language we have never heard before, assuming both sides have access to the technology.
Can this achievement be reused to go beyond human-to-human communication?
There are several projects that aim to achieve this. Project CETI is one of the most prominent. A team of more than 50 scientists has built a 20-kilometer by 20-kilometer underwater listening and recording studio off the coast of an Eastern Caribbean island.
Microphones are installed on buoys; robotic fish and aerial drones follow the sperm whales, and tags fitted to their backs record their movement, heartbeat, vocalisations, and depth. This setup accumulates as much information as possible about the whales' sounds, social lives, and behaviours, which is then decoded with the help of linguists and machine learning models.
There have already been achievements: the CETI team claims it can distinguish whale clicks from other noises and has established the existence of a whale alphabet and of dialects.
Before advanced machine learning models, separating the different sounds in a recording was a struggle, known as the 'cocktail party problem'. Project CETI now reports a success rate above 99% in identifying individual sounds.
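As an illustration of the underlying signal-processing idea (not CETI's actual pipeline, which is far more sophisticated), here is a toy sketch that flags click-like bursts in noisy audio using a short-time energy threshold:

```python
# A toy sketch of click detection via a short-time energy threshold;
# the audio is synthetic, purely for illustration.
import numpy as np

rate = 48_000
signal = 0.01 * np.random.randn(rate)      # one second of background noise

for start in (0.2, 0.5, 0.8):              # inject three ~5 ms "clicks"
    i = int(start * rate)
    signal[i:i + 240] += np.random.randn(240)

win = 240                                  # 5 ms analysis windows
energy = np.array([
    np.sum(signal[i:i + win] ** 2)
    for i in range(0, len(signal) - win, win)
])

# Flag windows whose energy far exceeds the median noise floor
threshold = 20 * np.median(energy)
clicks = np.where(energy > threshold)[0] * win / rate
print("click times (s):", clicks)          # -> [0.2 0.5 0.8]
```

Real recordings add the hard part: overlapping clicks from multiple whales, which is exactly the cocktail party problem the CETI models address.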
Nevertheless, this progress, remarkable as it is, remains far from an actual Google Translate between humans and whales, and there are serious reasons for that.
First, a 20 × 20 km patch of ocean is arguably too small to capture a meaningful slice of whale life. Sperm whales can travel more than 20,000 km annually, and even near Dominica there are, on average, only about 10 whales per 1,000 km² of ocean; at that density, the 400 km² study area holds roughly four whales at any given time. Such a limited observation area creates the so-called 'dentist office' issue.
David Gruber, the founder of CETI, provides a perfect explanation:
"If you only study English-speaking society and you're only recording in a dentist's office, you're going to think the words root canal and cavity are critically important to English-speaking culture, right?"
Second, consider how recent language models work: LLMs rely on semantic relationships between words, represented as vectors. If we imagine a language as a map of words, where the distance between words reflects how close their meanings are, then by overlaying two such maps we can translate from one language to another even without a pre-existing understanding of any individual word.
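In code, this "map overlay" is an embedding-alignment problem. The sketch below uses orthogonal Procrustes on tiny made-up embeddings; real systems such as MUSE align thousands of word vectors, and unsupervised variants find the mapping without the matched rows we cheat with here:

```python
# A toy sketch of overlaying two word maps via orthogonal Procrustes.
# The two 3-word "languages" are fabricated for illustration only.
import numpy as np
from scipy.linalg import orthogonal_procrustes

# Source embeddings: rows are unit vectors for (dog, cat, fish)
source = np.array([[1.0, 0.0], [0.8, 0.6], [0.0, 1.0]])

# Target "language" has the same geometry, rotated by 90 degrees
rotation = np.array([[0.0, -1.0], [1.0, 0.0]])
target = source @ rotation.T

# Recover the rotation that best overlays the two maps
R, _ = orthogonal_procrustes(source, target)

# "Translate": map source words into target space, pick nearest neighbour
mapped = source @ R
nearest = np.argmax(mapped @ target.T, axis=1)
print(nearest)  # [0 1 2]: each word finds its counterpart
```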
This strategy works well when languages belong to the same linguistic family; assuming it will also work between human and animal communication, however, is a very big leap.
Third, there is the issue of interpreting the collected animal sounds. Humans cannot put themselves into the body of a bat or a whale to experience the world the same way. A recording might be labeled as a fight over food while the animals are in fact communicating about something entirely different, beyond our frame of reference.
For example, the exchange could concern changes in the Earth's magnetic field, or something more exotic still. A lot of the collected data is labeled according to the interpretation of human researchers, which may well be wrong.
The opportunity to understand animal communication is one of those breakthroughs that could change our world once more. For now, we are likely to become capable of alerting animals to danger, but an actual Google Translate for animal communication faces fundamental challenges that will not be overcome any time soon.
Our service
Do you need help with your data ecosystem? Reply directly to this email or reach out to [email protected]. More information about our service is available here.
How did we do this week?