UAE launches new AI ecosystem for global agricultural development
CGIAR's AI innovations set to increase farmers' resilience
Image: UAE AI Ecosystem launch. Credit: International Affairs Office, UAE
- The global AI for Agriculture Ecosystem is a dynamic platform, built upon a collaborative network that includes the International Affairs Office at the UAE Presidential Court, Mohamed bin Zayed University of Artificial Intelligence (MBZUAI), NYU Abu Dhabi, ai71, and key international partners including the Gates Foundation, CGIAR, and the World Bank.
- The ecosystem’s capabilities are anchored in four landmark initiatives: the Institute for Agriculture and Artificial Intelligence (IAAI), the CGIAR AI Hub, AgriLLM, and The Agricultural Innovation Mechanism for Scale (AIM for Scale).
- The ecosystem brings together Abu Dhabi’s human talent, research excellence, and advanced AI capabilities to drive global agricultural resilience. By uniting scientific innovation with real-world deployment, it delivers AI-powered tools that help smallholder farmers adapt to extreme and unpredictable weather, protect their livelihoods, and strengthen food security in climate-vulnerable regions around the world.
Abu Dhabi, United Arab Emirates – The United Arab Emirates has launched Abu Dhabi’s AI Ecosystem for Global Agricultural Development, a platform designed to bring AI solutions to climate-exposed agricultural regions and support the communities most affected by shifting weather patterns.
The launch builds on the USD 200 million UAE–Gates Foundation partnership announced at COP28 to accelerate agricultural innovation.
With two years of progress and the partnership continuing to advance, the announcement was made in the presence of Her Excellency Mariam Almheiri, Head of the International Affairs Office at the UAE Presidential Court, Bill Gates, Chair of the Gates Foundation, and Ismahane Elouafi, Executive Managing Director, CGIAR. Following the announcement, Her Excellency and Bill Gates joined the UAE–Gates Partnership Showcase, an immersive overview of the ecosystem’s core pillars, demonstrating how Abu Dhabi’s research strength, technological capabilities, and AI leadership are being brought together to support vulnerable agricultural communities around the world.
These capabilities come together through four landmark initiatives: the CGIAR AI Hub, the Institute for Agriculture and Artificial Intelligence (IAAI), AgriLLM, and The Agricultural Innovation Mechanism for Scale (AIM for Scale).
Her Excellency Mariam Almheiri, Head of the International Affairs Office at the UAE Presidential Court, said: “The UAE is harnessing artificial intelligence for global good, to help protect the farmers and communities most exposed to climate volatility. By connecting our national research and AI capabilities with leading global partners, we are turning science into real tools that reach people on the ground. Through our partnership with the Gates Foundation, we are advancing Agri-AI solutions that support millions of smallholder farmers facing unpredictable weather, helping secure a more stable and hopeful future for communities worldwide.”
Bill Gates, Chair of the Gates Foundation added: “Around the world, smallholder farmers are facing the harshest impacts of climate change with the fewest tools to adapt. The AI for Agriculture Ecosystem helps change that by putting practical, data-driven solutions directly in farmers’ hands. I’m grateful for the UAE’s leadership: this initiative helps strengthen food security and support farmers in a warming world.”
The AI for Agriculture Ecosystem is built upon a collaborative network that includes the International Affairs Office at the UAE Presidential Court and the Abu Dhabi-based institutions Mohamed bin Zayed University of Artificial Intelligence (MBZUAI), NYU Abu Dhabi, and ai71, alongside key international partners including the Gates Foundation, CGIAR, and the World Bank.
Together, these partners form a unified system that turns advanced research and AI capabilities into practical solutions for farmers, governments, and development actors. The system spans the full chain of innovation, from scientific discovery to digital advisory services, open-source agricultural AI models, and field deployment in climate-vulnerable regions.
These initiatives are:
CGIAR AI Hub: A global collaborative workspace hosted in Abu Dhabi by ai71 as a core technology partner. It is set to revolutionize agriculture by driving digital transformation and innovation. The hub aims to establish Abu Dhabi as a leading center for AI in agriculture, leveraging over 50 years of CGIAR's extensive agricultural data and expertise from its 13 global research centers and partner networks.
“The AI Agriculture Ecosystem is rooted in science and powered by global collaboration,” said Ismahane Elouafi, CGIAR’s Executive Managing Director. “By combining AI expertise and insights from global partners, the AI Agriculture Ecosystem can develop innovations that strengthen decision-making, guide policies and investments, and accelerate the adoption of digital tools – supporting vulnerable communities in the Global South and farmers in the Global North facing similar challenges.”
Groundbreaking advances delivered by the CGIAR AI Hub so far include the AI-Water Project, an AI-powered hydrological model developed by IWMI that enhances transboundary water management and forecasting, with a global dashboard under development for the 2026 UN Water Conference; and the AI Genebank, a platform that speeds up crop improvement by rapidly screening over 700,000 plant samples for climate-resilient traits and linking them to research via an interactive chatbot.
Institute for Agriculture and Artificial Intelligence (IAAI): Based at MBZUAI in Abu Dhabi, the IAAI is a pioneering hub that provides governments and NGOs with digital advisory tools, training programs, and technical assistance teams. The institute is dedicated to improving the lives and livelihoods of over 43 million smallholder farmers affected by extreme weather.
AgriLLM: An open-source large language model developed by ai71 in Abu Dhabi to advance global agricultural intelligence. It is trained on deep agricultural data, including 350,000 agricultural documents, 50,000 research papers, and 120,000 real farming questions and answers, and is designed for multilingual understanding. Four AI agents built on AgriLLM are currently being tested with partners, demonstrating how the model can deliver region-specific, role-based guidance for climate adaptation, resource management, and food production. The model will be released as a public good, free for anyone to use, modify, or build upon; a brief usage sketch follows the list of initiatives below.
AIM for Scale: Jointly funded by the UAE and the Gates Foundation and based at NYU Abu Dhabi, AIM for Scale is driving global efforts to deliver AI-powered weather forecasting and digital advisory services to smallholder farmers. At COP30, AIM for Scale announced a shared ambition with its partners to reach 100 million farmers by 2030. This ambition is already backed by real progress: in 2025, the Government of India delivered AI-powered monsoon forecasts via SMS to 38 million farmers, the largest such deployment to date. Building on this momentum, MBZUAI and the University of Chicago launched an AI Weather Forecasting Training Program in Abu Dhabi, training officials from Bangladesh, Chile, Ethiopia, Kenya, and Nigeria, with plans to expand to 25 more countries by 2027.
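Because AgriLLM is planned as an open-source public good, it should be usable through standard open-model tooling once its weights are published. The following is a minimal sketch assuming a Hugging Face-style checkpoint; the model identifier shown is a placeholder open model rather than AgriLLM itself, and the prompt and generation settings are illustrative only.

```python
# Hypothetical sketch: querying an open agricultural LLM with Hugging Face
# transformers. "tiiuae/falcon-7b-instruct" is only a stand-in open model;
# AgriLLM's actual repository name, prompt format, and recommended settings
# have not been published.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="tiiuae/falcon-7b-instruct",  # placeholder; swap in AgriLLM once released
)

question = (
    "My maize seedlings in western Kenya are wilting after a two-week dry spell. "
    "What low-cost steps can I take this week?"
)

result = generator(question, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])
```

Region-specific, role-based agents of the kind described above would typically wrap such a model with retrieval over local agronomic data and weather feeds, but those components are not detailed in the announcement.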
ENDS
AI headphones automatically learn who you’re talking to — and let you hear them better
Image: Shyam Gollakota's team at the University of Washington combined off-the-shelf noise-canceling headphones with binaural microphones to create prototype "proactive listening" AI headphones, pictured here. Credit: Hu et al./EMNLP
Holding a conversation in a crowded room often leads to the frustrating “cocktail party problem,” or the challenge of separating the voices of conversation partners from a hubbub. It’s a mentally taxing situation that can be exacerbated by hearing impairment.
As a solution to this common conundrum, researchers at the University of Washington have developed smart headphones that proactively isolate all the wearer’s conversation partners in a noisy soundscape. The headphones are powered by an AI model that detects the cadence of a conversation and another model that mutes any voices which don’t follow that pattern, along with other unwanted background noises. The prototype uses off-the-shelf hardware and can identify conversation partners using just two to four seconds of audio.
The system’s developers think the technology could one day help users of hearing aids, earbuds and smart glasses to filter their soundscapes without the need to manually direct the AI’s “attention.”
The team presented the technology Nov. 7 in Suzhou, China at the Conference on Empirical Methods in Natural Language Processing. The underlying code is open-source and available for download.
“Existing approaches to identifying who the wearer is listening to predominantly involve electrodes implanted in the brain to track attention,” said senior author Shyam Gollakota, a UW professor in the Paul G. Allen School of Computer Science & Engineering. “Our insight is that when we’re conversing with a specific group of people, our speech naturally follows a turn-taking rhythm. And we can train AI to predict and track those rhythms using only audio, without the need for implanting electrodes.”
The prototype system, dubbed “proactive hearing assistants,” activates when the person wearing the headphones begins speaking. From there, one AI model begins tracking conversation participants by performing a “who spoke when” analysis and looking for low overlap in exchanges. The system then forwards the result to a second model, which isolates the participants and plays the cleaned-up audio for the wearer. The system is fast enough to avoid confusing audio lag for the user, and can currently juggle one to four conversation partners in addition to the wearer’s audio.
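As a rough illustration of that turn-taking cue (and not the team's released code), the sketch below represents each speaker as a list of voice-activity intervals and keeps the speakers whose speech rarely overlaps the wearer's, which is the low-overlap pattern the system looks for when identifying conversation partners.

```python
# Toy illustration of the "who spoke when" idea described above, not the
# authors' released system. Each speaker is represented by voice-activity
# intervals (start, end) in seconds; speakers whose speech rarely overlaps
# the wearer's are treated as conversation partners, consistent with
# natural turn-taking.

def overlap(a, b):
    """Total seconds during which two interval lists overlap."""
    return sum(
        max(0.0, min(a_end, b_end) - max(b_start, a_start))
        for a_start, a_end in a
        for b_start, b_end in b
    )

def conversation_partners(wearer, others, max_overlap_ratio=0.15):
    """Return speakers whose overlap with the wearer is below the threshold."""
    partners = []
    for name, segments in others.items():
        spoken = sum(end - start for start, end in segments)
        if spoken > 0 and overlap(wearer, segments) / spoken < max_overlap_ratio:
            partners.append(name)
    return partners

# Hypothetical 10-second snippet: speaker B alternates turns with the wearer,
# while speaker C talks over everyone (a background conversation).
wearer = [(0.0, 2.0), (4.0, 6.0), (8.0, 9.5)]
others = {
    "B": [(2.2, 3.8), (6.2, 7.8)],  # low overlap -> conversation partner
    "C": [(0.5, 5.0), (5.5, 9.0)],  # heavy overlap -> background voice
}
print(conversation_partners(wearer, others))  # ['B']
```

In the actual prototype these decisions are made by neural models operating on binaural audio within a real-time latency budget; the interval heuristic here only conveys the underlying intuition.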
The team tested the headphones with 11 participants, who rated qualities like noise suppression and comprehension with and without the AI filtration. Overall, the group rated the filtered audio more than twice as favorably as the baseline.
Gollakota’s team has been experimenting with AI-powered hearing assistants for the past few years. They developed one smart headphone prototype that can pick a person’s audio out of a crowd when the wearer looks at them, and another that creates a “sound bubble” by muting all sounds within a set distance of the wearer.
“Everything we’ve done previously requires the user to manually select a specific speaker or a distance within which to listen, which is not great for user experience,” said lead author Guilin Hu, a doctoral student in the Allen School. “What we’ve demonstrated is a technology that’s proactive — something that infers human intent noninvasively and automatically.”
Plenty of work remains to refine the experience. The more dynamic a conversation gets, the more the system is likely to struggle, as participants talk over one another or speak in longer monologues. Participants entering and leaving a conversation present another hurdle, though Gollakota was surprised by how well the current prototype performed in these more complicated scenarios. The authors also note that the models were tested on English, Mandarin and Japanese dialog, and that the rhythms of other languages might require further fine-tuning.
The current prototype uses commercial over-the-ear headphones, microphones and circuitry. Eventually, Gollakota expects to make the system small enough to run on a tiny chip within an earbud or a hearing aid. In concurrent work that appeared at MobiCom 2025, the authors demonstrated that it is possible to run AI models on tiny hearing aid devices.
Co-authors include Malek Itani and Tuochao Chen, UW doctoral students in the Allen School.
This research was funded by the Moore Inventor Fellows program.
For more information, contact proactivehearing@cs.washington.edu
Article Title
Proactive Hearing Assistants that Isolate Egocentric Conversations