I Built an App That Listens to Your Garden and Tells You What's Disappearing
This is a submission for Weekend Challenge: Earth Day Edition
What I Built
I opened my window in London on a Thursday evening and just... listened.
House sparrows. A robin somewhere in the distance. The faint rhythmic call of a great tit.
Then I pointed Last Breath at that same window and found out that the swift I'd heard that morning has lost 47% of its UK population since 1995.
It's still out there. Singing. Disappearing.
Last Breath is a real-time acoustic biodiversity monitor that runs entirely in the browser. You press record, point your microphone at any outdoor environment for ten seconds, and it tells you:
- Every species present in that soundscape, identified by acoustic signature alone using Google Gemini 2.5 Flash
- The real IUCN Red List conservation status of each detected species
- Whether their population is increasing, stable, or in freefall
- For the ones that are falling: roughly how many breeding seasons remain in your region
- A biodiversity health score from 0–100 for the soundscape you just recorded
Every sighting can be logged as a verified conservation record through Auth0 authentication and stored in a live global map shared across every user of the app in real time.
The world is going quiet. Most people have no idea it's happening outside their window. Last Breath makes it impossible to ignore.
Demo
Try it live → last-breadth.vercel.app
Watch the demo → https://www.youtube.com/watch?v=HkCItSU0nvs
Best experienced outdoors or near an open window. Early morning gives the richest soundscapes.
Here's what a session looks like:
- Press record: a live waveform visualises your soundscape in real time
- Ten seconds. Gemini analyses every acoustic signature present
- Species cards slide in one by one as IUCN data loads in parallel
- One card turns red: Common Swift · Endangered · ↓ Decreasing · ~12 breeding seasons remaining in London
- You log the sighting. It pins to the global map. Someone in Lagos, Tokyo, or São Paulo sees it.
Code
Last Breath
An acoustic biodiversity monitor that listens to the world and tells you what's disappearing.
Point a microphone at any outdoor environment (a garden, a forest edge, a park at dawn) and Last Breath identifies every species by sound, cross-references each one against the IUCN Red List, and gives you a real-time biodiversity health score for that place. Verified sightings are logged to a live global map shared across all users.
The Problem
We are living through the sixth mass extinction. Most people have no idea it is happening outside their window. Acoustic monitoring, listening for the species present in a soundscape, is one of the most sensitive and non-invasive ways to measure biodiversity. Field researchers have done this for decades with expensive equipment and months of manual analysis. Last Breath brings the same capability to anyone with a browser.
Demo
Record a 10-second clip…
How I Built It
Why this exists
Apps like Merlin and BirdNET already identify birds by sound. But they stop at identification: they tell you what is there, not what it means. They don't show you that the species you just heard is in freefall. They don't give you a way to act on it. That gap is what Last Breath fills.
The audio pipeline
When you press record, two things happen simultaneously:
- A `MediaRecorder` captures the audio stream as a blob
- An `AnalyserNode` reads frequency data 60 times per second and draws a live waveform on canvas
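As a sketch (helper names are mine, not the repo's), both paths can share a single `getUserMedia` stream; `byteToY` is the pure scaling used to draw each frequency bin:

```javascript
// Hypothetical sketch of the dual capture path described above.
// Pure helper: scale an AnalyserNode byte value (0-255) to a bar height.
function byteToY(sample, canvasHeight) {
  return (sample / 255) * canvasHeight;
}

// Browser-only wiring: record a blob while drawing a live waveform.
async function startRecording(canvas, seconds = 10) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  // Path 1: MediaRecorder captures the stream as a blob.
  const chunks = [];
  const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm' });
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.start();

  // Path 2: AnalyserNode feeds the canvas ~60 times per second.
  const audioCtx = new AudioContext();
  const analyser = audioCtx.createAnalyser();
  audioCtx.createMediaStreamSource(stream).connect(analyser);
  const bins = new Uint8Array(analyser.frequencyBinCount);
  const g = canvas.getContext('2d');
  (function draw() {
    if (recorder.state !== 'recording') return;
    analyser.getByteFrequencyData(bins);
    g.clearRect(0, 0, canvas.width, canvas.height);
    const barWidth = canvas.width / bins.length;
    bins.forEach((v, i) => {
      const h = byteToY(v, canvas.height);
      g.fillRect(i * barWidth, canvas.height - h, barWidth, h);
    });
    requestAnimationFrame(draw);
  })();

  // Resolve with the finished blob once the ten seconds are up.
  return new Promise((resolve) => {
    recorder.onstop = () => {
      stream.getTracks().forEach((t) => t.stop());
      resolve(new Blob(chunks, { type: 'audio/webm' }));
    };
    setTimeout(() => recorder.stop(), seconds * 1000);
  });
}
```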
After ten seconds the blob is converted to base64 and sent to /api/analyse, a Vercel serverless function that injects the Gemini API key server-side and forwards the raw audio to Gemini 2.5 Flash with a bioacoustician system prompt. No preprocessing. No spectrograms. Raw audio in, structured conservation JSON out.
```javascript
// Sending audio directly to Gemini: no preprocessing needed
{
  contents: [{
    parts: [
      { text: BIOACOUSTICIAN_PROMPT },
      { inline_data: { mime_type: 'audio/webm', data: base64Audio } }
    ]
  }],
  generationConfig: { temperature: 0.1 } // low temperature for consistent species IDs
}
```
The conservation layer
For each detected species, two parallel IUCN API calls fire immediately: threat category plus population trend, and habitat narrative. All lookups run with `Promise.allSettled()` so a failed lookup never blocks the others. Cards render progressively as each resolves.
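As a sketch, the pattern might look like this (the function names and the `unknown` fallback are mine; in the real app the lookups go through the `/api/iucn` proxy, so `fetchStatus` is injectable here to show the pattern without a network call):

```javascript
// Hypothetical sketch of the parallel lookup pattern described above.
async function lookupAllSpecies(names, fetchStatus) {
  const settled = await Promise.allSettled(names.map((name) => fetchStatus(name)));
  // A failed lookup never blocks the others: resolved species render
  // immediately, failures degrade to an "unknown" status card.
  return settled.map((result, i) => ({
    species: names[i],
    status: result.status === 'fulfilled' ? result.value : 'unknown',
  }));
}
```

`Promise.allSettled()` guarantees the array settles even when individual IUCN calls time out, which is what lets the cards render progressively instead of waiting on the slowest lookup.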
The biodiversity health score is calculated from the threat categories present:
| Status | Score impact |
|---|---|
| CR (Critically Endangered) | -15 |
| EN (Endangered) | -10 |
| VU (Vulnerable) | -6 |
| NT (Near Threatened) | -3 |
| LC (Least Concern) | +2 |
| DD (Data Deficient) | 0 |
LC is the only status that adds to the score: hearing healthy species is a positive signal. The score renders as an animated SVG ring gauge, colour-coded green above 70, amber 40–70, red below 40.
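As a sketch, the scoring can be a single fold over the detected statuses. The weights come from the table above; the neutral baseline of 50 and the clamp to the gauge range are my assumptions:

```javascript
// Per-status weights, taken from the table above.
const STATUS_WEIGHTS = { CR: -15, EN: -10, VU: -6, NT: -3, LC: 2, DD: 0 };

// Hypothetical scoring sketch: start from an assumed neutral baseline of 50,
// apply each detected species' weight, clamp to the 0-100 gauge range.
function biodiversityScore(statuses) {
  const raw = statuses.reduce((sum, s) => sum + (STATUS_WEIGHTS[s] ?? 0), 50);
  return Math.max(0, Math.min(100, raw));
}
```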
The extinction countdown
This was the hardest design decision. Showing "~12 breeding seasons remaining" is blunt. But abstract statistics about wildlife decline stop meaning anything after a while; they become wallpaper. A countdown attached to a sound you just heard outside your window in London is different. It's not about the Amazon. It's about your garden. That specificity is the whole point.
Security
Every external API call is proxied through three Vercel serverless functions:
- `api/analyse.js`: injects `GEMINI_API_KEY`, handles 429/503 retries with automatic model fallback
- `api/iucn.js`: injects `IUCN_API_TOKEN`, handles the two-step v4 taxa → assessment lookup
- `api/sightings.js`: injects `SUPABASE_SERVICE_KEY`, validates all reads and writes
Open DevTools on the live app: you will not find a single API key anywhere in the browser.
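A condensed sketch of what `api/analyse.js` could look like. The endpoint path, the fallback model, and the retry counts are my assumptions; only the env var name and the 429/503 retry-with-fallback behaviour come from the post:

```javascript
// Sketch only: model fallback order and retry counts are assumptions.
const MODELS = ['gemini-2.5-flash', 'gemini-2.0-flash'];
const BIOACOUSTICIAN_PROMPT = '...'; // the real system prompt lives server-side

// Pure retry policy: a 429/503 retries the same model up to maxRetries
// times, then falls back to the next model in MODELS, then gives up.
function retryPlan(status, modelIndex, attempt, maxRetries = 2) {
  if (status !== 429 && status !== 503) return { done: true };
  if (attempt < maxRetries) return { modelIndex, attempt: attempt + 1 };
  if (modelIndex + 1 < MODELS.length) return { modelIndex: modelIndex + 1, attempt: 0 };
  return { done: true };
}

// Vercel serverless handler (exported as the default in the real file).
async function handler(req, res) {
  let plan = { modelIndex: 0, attempt: 0 };
  while (!plan.done) {
    const model = MODELS[plan.modelIndex];
    const r = await fetch(
      `https://generativelanguage.googleapis.com/v1beta/models/${model}:generateContent` +
        `?key=${process.env.GEMINI_API_KEY}`, // key injected server-side only
      {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          contents: [{ parts: [
            { text: BIOACOUSTICIAN_PROMPT },
            { inline_data: { mime_type: 'audio/webm', data: req.body.audio } },
          ] }],
          generationConfig: { temperature: 0.1 },
        }),
      },
    );
    if (r.ok) return res.status(200).json(await r.json());
    plan = retryPlan(r.status, plan.modelIndex, plan.attempt);
  }
  return res.status(502).json({ error: 'analysis unavailable' });
}
```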
The global map
Sightings live in Supabase PostgreSQL. Every user sees every other user's observations, not just their own. The map uses Leaflet with CartoDB Dark Matter tiles, markers colour-coded by IUCN status. The goal was a living document: a record of what the world still sounds like, built collaboratively by people who stopped and listened.
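The colour coding can be a plain lookup from IUCN code to marker colour. The hex values here are my guesses, not the app's actual palette:

```javascript
// Hypothetical IUCN status -> marker colour mapping (palette assumed).
const STATUS_COLOURS = {
  CR: '#e11d48', // Critically Endangered: red
  EN: '#f97316', // Endangered: orange
  VU: '#eab308', // Vulnerable: yellow
  NT: '#a3e635', // Near Threatened: yellow-green
  LC: '#22c55e', // Least Concern: green
  DD: '#94a3b8', // Data Deficient: grey
};

function markerColour(status) {
  return STATUS_COLOURS[status] ?? STATUS_COLOURS.DD;
}

// Leaflet usage (browser-side):
// L.circleMarker([s.lat, s.lng], { color: markerColour(s.status) }).addTo(map);
```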
Tech stack
| Layer | Technology |
|---|---|
| Frontend | Vanilla HTML, CSS, JS (zero frameworks) |
| AI analysis | Google Gemini 2.5 Flash |
| Conservation data | IUCN Red List API v4 |
| Authentication | Auth0 Universal Login |
| Database | Supabase (PostgreSQL) |
| Map | Leaflet.js + CartoDB Dark Matter |
| Hosting | Vercel (static + serverless functions) |
Prize Categories
Best Use of Google Gemini
Gemini 2.5 Flash is the entire engine of this application. It receives raw audio (no preprocessing, no feature extraction) and identifies species by acoustic signature alone, returning structured conservation-ready JSON. I used the multimodal audio input capability via the generateContent API with an inline base64 audio blob. The bioacoustician system prompt was iterated across many real outdoor recordings to balance species recall with reliable output. The api/analyse.js function includes automatic retry logic and model fallback for overload responses.
Best Use of Auth0 for Agents
Auth0 Universal Login handles all authentication via the SPA SDK. The auth layer is the trust mechanism for the shared global map β only verified, authenticated users can write to Supabase. Every logged sighting stores the user's Auth0 sub ID and display name alongside the observation, making every map pin attributable to a real verified observer. This transforms the map from a database into a ledger of provable human action β which felt important for something intended to be taken seriously as a conservation record.
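A minimal shape check of the kind `api/sightings.js` could run before a write. All field names here are assumptions based on the description above:

```javascript
// Hypothetical server-side validation before inserting into Supabase.
// Field names (species, auth0_sub, observer_name, lat, lng) are assumed.
function isValidSighting(s) {
  return Boolean(
    s &&
      typeof s.species === 'string' && s.species.length > 0 &&
      typeof s.auth0_sub === 'string' && s.auth0_sub.length > 0 &&
      typeof s.observer_name === 'string' &&
      Number.isFinite(s.lat) && s.lat >= -90 && s.lat <= 90 &&
      Number.isFinite(s.lng) && s.lng >= -180 && s.lng <= 180,
  );
}
```

Rejecting writes that lack a verified `auth0_sub` is what keeps every map pin attributable to a real observer.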
Monitored wildlife populations have declined by an average of 73% since 1970. That number is so large it stops meaning anything.
Last Breath is an attempt to make it small again. Local. Personal. The size of a garden in London.
We are not going to solve the sixth mass extinction with a web app. But we might start building a record of what we still have. And records matter. Silence is harder to ignore when you've heard what it's replacing.
Point your mic at the world. Find out what's still there.


