This is a submission for Weekend Challenge: Earth Day Edition
What I Built
By 2100, most of what you’re about to see is already gone.
This is a museum built from that absence - an archive of what humanity lost during the Anthropocene.
The premise is simple: curators from the future built this collection, looking back at us.
Every exhibit documents a real environmental loss. Not invented species, but things already gone, or measurably disappearing right now. The Great Barrier Reef. The monarch migration. The vaquita porpoise.
Visitors can do two things:
Nominate an exhibit - type a word, a phrase, anything.
"Fireflies." "The sound of a forest."
The AI archivist grounds that in real, documented science, turning it into a permanent museum card.
No fiction. Every exhibit is real data, real species, real loss.
Ask the Curator - a floating button opens a conversation with the museum's archivist.
It is 2100. Everything is already gone.
The curator speaks entirely in the past tense and answers from that weight.
You’re not chatting with an assistant.
You’re talking to someone who has already watched it disappear.
Demo
anthropocene-archive.vercel.app
Try nominating something you're afraid we'll lose. Then ask the Curator what happened to it.
Code
Future Museum of Extinct Things
It is the year 2100. You are standing in a digital archive built by those who remembered. This is what they chose to preserve.
Built for the DEV Earth Day Challenge 2026 - a contemplative digital museum set in the year 2100, where the exhibits are the species, places, sounds, and sensations that humanity lost during the Anthropocene. Visitors can nominate what they are afraid we will lose, and an AI curator - powered by Google Gemini - writes a scientifically grounded permanent exhibit for each one.
What It Is
The premise: a museum from the future, looking back at us. Every exhibit documents a real environmental loss: not invented, not speculative fiction, but things already gone or measurably disappearing. The Great Barrier Reef. The monarch migration. The sound of a full dawn chorus. Truly dark skies.
How I Built It
Stack: vanilla HTML, CSS, and JavaScript — no frameworks, no build step. GSAP for animation. Two Gemini-powered Vercel serverless functions.
I deliberately kept the stack minimal. This project isn’t about complexity - it’s about control over tone, pacing, and interaction.
The nomination feature sends user input to /api/gemini.js, where Gemini is prompted to translate a vague or emotional phrase into a real, documented environmental phenomenon.
The challenge wasn’t generating text - it was constraining it.
Without strict instructions, the model drifted into fiction. With too many constraints, it became sterile. The prompt had to balance both: enforce real species, real data, real locations - while still sounding like a human curator, not a report.
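A minimal sketch of what that constraint layer could look like. The function name, wording, and structure here are illustrative assumptions, not the project's actual prompt:

```javascript
// Hypothetical sketch of the prompt-constraint layer in /api/gemini.js.
// The exact wording of the real prompt differs; this only shows the balance
// between hard factual constraints and a human curator's voice.
function buildArchivistPrompt(userInput) {
  return [
    "You are the archivist of a museum in the year 2100.",
    "Ground the visitor's nomination in a real, documented environmental loss:",
    "a real species, place, sound, or sensation that is already gone or measurably declining.",
    "Name real locations and real, documented data. Never invent species or events.",
    "Write as a human curator, not a report: two short paragraphs, past tense.",
    "",
    `Visitor nomination: "${userInput}"`,
  ].join("\n");
}
```

The factual rules sit above the visitor's input so they frame every generation, while the voice instruction ("a human curator, not a report") is what keeps the output from reading like a dataset summary.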
The Curator chat lives in /api/curator.js and is treated as a separate system entirely.
It’s not just a chatbot; it’s a character with rules:
- year 2100
- speaks only in the past tense
- offers no solutions, only memory
It’s also context-aware. If a user opens an exhibit and asks a question, the Curator responds from within that specific loss rather than generically.
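The character rules and the exhibit context can be combined before the question ever reaches the model. A sketch of that idea, using a generic role/content message shape rather than Gemini's actual request format (the names `CURATOR_RULES` and `buildCuratorMessages` are illustrative):

```javascript
// Hypothetical sketch of how /api/curator.js could stay in character and
// answer from within a specific exhibit. Shapes and names are assumptions.
const CURATOR_RULES =
  "It is the year 2100. You are the museum's curator. " +
  "Speak only in the past tense. Offer no solutions, only memory.";

function buildCuratorMessages(question, exhibit) {
  // If the visitor has an exhibit open, the curator answers from inside
  // that specific loss; otherwise it answers from the museum at large.
  const context = exhibit
    ? `The visitor stands before the exhibit "${exhibit.title}": ${exhibit.summary}`
    : "The visitor is in the main hall, before no particular exhibit.";
  return [
    { role: "system", content: `${CURATOR_RULES}\n${context}` },
    { role: "user", content: question },
  ];
}
```

Because the rules travel with every request, the character cannot drift out of 2100 even across a long conversation.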
Both functions run server-side, keeping the API key completely off the client.
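The serverless shape that makes this work might look roughly like the following. This is a sketch under assumptions, not the project's actual handler; the environment variable name `GEMINI_API_KEY` is illustrative:

```javascript
// Hypothetical shape of a Vercel serverless function that keeps the key
// server-side. The key comes from an environment variable configured in the
// Vercel dashboard and never reaches the browser; the client only ever
// calls /api/curator. (In the real file this function would be the module's
// default export.)
async function handler(req, res) {
  if (req.method !== "POST") {
    return res.status(405).json({ error: "Method not allowed" });
  }
  const apiKey = process.env.GEMINI_API_KEY; // server-side only
  if (!apiKey) {
    return res.status(500).json({ error: "Server misconfigured" });
  }
  // ...forward the visitor's message to the Gemini API using apiKey,
  // then return the curator's reply...
  return res.status(200).json({ reply: "(curator reply)" });
}
```

From the browser, the call is then a plain `fetch("/api/curator", { method: "POST", ... })` with no key, token, or credentials anywhere in client code.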
Design-wise, everything supports the same idea: quiet loss.
Soil, bark, amber, parchment - materials that age and decay.
A subtle grain overlay; concentric rings that echo tree rings, ripples, or sonar - something searching, or remembering.
The goal wasn’t just to show information.
It was to make it feel like something already gone.
Prize Categories
Best use of Google Gemini: two integrations, both central to the concept:
- generating scientifically grounded exhibits from visitor input
- maintaining a consistent character voice from the year 2100