Research & Benchmarks

Forest Listeners lets users explore Amazon and Atlantic forests to find species


In the Brazilian rainforests, a constant buzz of sounds goes mostly unheard, and many of those calls have never been recorded at all. Scientists need massive audio datasets to teach models how to tell species apart, but hauling recording gear into the Amazon or the Atlantic Forest is expensive and slow. That's why a new citizen-science site tries to turn any web browser into a makeshift field lab.

The platform renders the thick canopy and layered acoustics in three-dimensional detail, so anyone with an internet connection can "step" into a virtual forest. Volunteers click through this digital stand-in, listen to a clip that might belong to a hidden animal, then answer yes or no. Each answer feeds back into an algorithm that, the developers say, picks up acoustic patterns more quickly as labels accumulate.

If it works as hoped, the whole thing could speed up biodiversity cataloguing and give the public a real way to help AI-based conservation research.

Forest Listeners takes users into a virtual 3D forest, where they can:

- Search the Atlantic or Amazon rainforests for hidden species
- Train their ears to recognize the species' unique calls
- Click "yes" if they hear a call, or "no" if they don't

Every response helps. These contributions fine-tune Perch, an AI model from Google DeepMind, to accelerate and scale the process of monitoring biodiversity. By immersing audiences in this interactive and engaging experiment, we aim to inspire continued deeper learning about rainforests and provide valuable crowd-sourced support for expert-led conservation efforts.

Hear the unique calls of the rainforest to assess its health

We're able to gauge the health of a forest from the inside out by listening to the diversity and patterns of animal behavior. But analyzing thousands of hours of audio recordings is a challenge, and training data for audio models is lacking for many important species all over the world, including the Brazilian rainforests. That's why we're excited about the work we have done with Forest Listeners, bringing together scientists, citizen scientists and Google AI to monitor forest health, assess biodiversity and measure restoration success.

Train AI models with the global community

This AI experiment is built on more than 1.2 million audio recordings from the Atlantic and Amazon rainforests.


Forest Listeners lets anyone with a web connection step into a 3D slice of the Amazon and Atlantic rainforests. When a sound seems to belong to a hidden species, you click "yes" or "no," and that label feeds straight into Google's AI models; the project describes it as a way to fine-tune its research tools. The effort leans on Google Arts & Culture, DeepMind and WildMon, effectively turning citizen ears into a supplement for the usual monitoring rigs.

How much those crowd-sourced tags actually boost detection accuracy is still a bit fuzzy; the article doesn’t give numbers or a rollout schedule. The interface is clean, the idea is catchy, and the ask is simple: listen, decide, help. I can see skeptics wondering if casual users can really tell the calls apart, especially when it’s all simulated, but the developers seem to think that pooling many responses will iron out the occasional mistake.
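The developers' bet that pooled responses cancel out individual mistakes is a standard crowd-labeling technique, usually implemented as a majority vote with a consensus threshold. A minimal sketch of the idea (the function name and threshold are illustrative, not taken from the project):

```python
from collections import Counter

def aggregate_labels(responses):
    """Majority-vote a list of "yes"/"no" clicks for one audio clip.

    Returns the winning label and the fraction of volunteers who agreed,
    so low-consensus clips can be routed back for more listens.
    """
    counts = Counter(responses)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(responses)

# Ten volunteers label the same clip; one mistaken "no" gets outvoted.
clicks = ["yes"] * 9 + ["no"]
label, agreement = aggregate_labels(clicks)
print(label, agreement)  # yes 0.9

# A hypothetical quality gate: only accept high-consensus labels.
if agreement >= 0.8:
    print(f"clip labeled '{label}' for training")
```

In practice, platforms often weight votes by each volunteer's track record rather than counting them equally, but even plain majority voting illustrates why "many casual ears" can still yield usable training labels.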

So, it's a mix of game-like listening and AI training that hopes to add a little more insight into fragile rainforest ecosystems, though we'll have to wait and see how well it works.

Common Questions Answered

What is the primary function of the Forest Listeners platform?

Forest Listeners lets users explore a virtual 3D slice of the Amazon and Atlantic rainforests, listen to recorded animal calls, and label each call as a hidden species or not. These user responses generate valuable data that helps train biodiversity‑monitoring AI models.

How does Forest Listeners help improve the Perch AI model from Google DeepMind?

Each "yes" or "no" click on a species call provides a labeled audio example that is fed directly into Perch, Google DeepMind's biodiversity‑monitoring AI. This crowd‑sourced labeling fine‑tunes the model, enhancing its accuracy and scalability for detecting species in the rainforest.

Which organizations are collaborating on the Forest Listeners project?

The project is a partnership between Google Arts & Culture, DeepMind, and the conservation nonprofit WildMon. Together they combine web‑based citizen‑science tools, advanced AI research, and ecological expertise to turn ordinary browsers into virtual field labs.

Why is large‑scale audio data essential for monitoring biodiversity in the Amazon and Atlantic forests?

Audio recordings capture the myriad vocalizations of species that are often invisible to the human eye, especially in dense canopy layers. Large, diverse datasets are required to train machine‑learning models that can automatically recognize and track these calls across vast, hard‑to‑reach rainforest areas.