This is a submission for Weekend Challenge: Earth Day Edition
## What I Built
For this Earth Day Challenge, I wanted to build something that bridges the gap between raw climate data and actionable human insight. When extreme weather events occur, raw numbers (like humidity percentages or temperature drops) aren't enough. People need context.
I built Arara Watch, a Global Disaster Monitor. It takes real-time climate and satellite data and transforms it into an actionable Threat Matrix for natural disasters (wildfires, floods, extreme heat/cold, landslides).
To celebrate Earth Day, the system doesn't just look at the dangers; it also generates an AI-powered Ecological Overview of any selected region on Earth, highlighting its vegetation type, natural attractions, and overall ecosystem conservation status.
## Demo
🔗 Live Application: Arara Watch
## Code
💻 GitHub Repository: isaque21/dev_earth_day_challenge
## How I Built It
To ensure the application could handle sudden spikes in traffic (like during an actual natural disaster) without costing a fortune, I designed a 100% Serverless architecture deployed via Terraform.
### The Serverless Foundation
- Frontend: Vanilla HTML/CSS/JS hosted on an Amazon S3 Bucket and distributed globally via CloudFront. I used a Glassmorphism UI design over a high-resolution nature background to fit the Earth Day theme.
- Backend: An Amazon API Gateway routing requests to an AWS Lambda function (Python 3.12).
- Data Ingestion: The Lambda function fetches real-time weather from the OpenWeather API and active fire hotspots from NASA's FIRMS satellite data.
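The ingestion step can be sketched as a couple of small helpers inside the Lambda. This is a minimal illustration, not the project's actual code: the key names, the FIRMS area-CSV endpoint shape, and the bounding-box radius are assumptions for the sketch.

```python
import csv
import io
import urllib.parse

# Hypothetical config -- the real function would read these from Lambda env vars.
OPENWEATHER_KEY = "demo-key"
FIRMS_KEY = "demo-key"

def weather_url(lat: float, lon: float) -> str:
    """Build the OpenWeather 'current weather' request URL."""
    params = urllib.parse.urlencode({
        "lat": lat, "lon": lon, "units": "metric", "appid": OPENWEATHER_KEY,
    })
    return f"https://api.openweathermap.org/data/2.5/weather?{params}"

def firms_url(lat: float, lon: float, radius_deg: float = 0.5) -> str:
    """Build a FIRMS area-CSV URL for a bounding box around the clicked point.
    (Endpoint shape is an assumption based on the public FIRMS area API.)"""
    area = f"{lon - radius_deg},{lat - radius_deg},{lon + radius_deg},{lat + radius_deg}"
    return (f"https://firms.modaps.eosdis.nasa.gov/api/area/csv/"
            f"{FIRMS_KEY}/VIIRS_SNPP_NRT/{area}/1")

def parse_hotspots(csv_text: str) -> list[dict]:
    """Parse the FIRMS CSV payload into a list of active fire hotspots."""
    return [
        {"lat": float(r["latitude"]), "lon": float(r["longitude"]),
         "confidence": r["confidence"]}
        for r in csv.DictReader(io.StringIO(csv_text))
    ]

# Sample payload in the FIRMS CSV shape:
sample = "latitude,longitude,confidence\n-3.10,-60.02,h\n"
print(parse_hotspots(sample))
```

Keeping the URL-building and parsing in pure functions like this makes the Lambda easy to unit-test without hitting either external API.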
### The Core Challenge: Rate Limiting & Latency
Calling a Large Language Model on every map click in a real-time application quickly runs into API rate limits (HTTP 429 Too Many Requests) and adds several seconds of latency per request.
To solve this, I implemented a Proximity Caching Strategy using Amazon DynamoDB. When a user clicks on the map, the Lambda function rounds the coordinates to two decimal places (grouping nearby clicks into cells of roughly 1.1 km) and uses the rounded pair as a Global Secondary Index (GSI) key.
If another user checked that same neighborhood within the last 30 minutes, the Lambda bypasses the external requests and returns the cached analysis instantly. This cut my API calls by over 80% and reduced response times from about 8 seconds to just 200 milliseconds for cached locations! 🚀
## Prize Categories
I am submitting this project for the Best Use of Google Gemini category.
Instead of using the AI just for chatbot-style text generation, I utilized Google Gemini 2.5 Flash as a structured backend microservice.
By using strict prompt engineering, I instructed Gemini to act as both a Civil Defense Analyst and an Ecological Expert. The prompt feeds the raw NASA and OpenWeather data into the model and forces it to return a strictly formatted JSON response. This lets the Vanilla JS frontend map the AI's structured output directly into the DOM, rendering the colored Threat Matrix and the Ecological Overview cards without any parsing errors.
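The prompt-and-validate pattern can be sketched like this. The prompt wording, key names, and threat levels are illustrative assumptions, not the project's exact prompt, and the actual Gemini API call is omitted; the point is that the Lambda validates the reply before the frontend ever sees it.

```python
import json

# Keys the frontend expects -- names are assumptions for this sketch.
REQUIRED_KEYS = {"wildfire", "flood", "extreme_temp", "landslide", "ecology"}

def build_prompt(weather: dict, hotspot_count: int) -> str:
    """Instruct the model to act as analyst + ecologist and reply with JSON only."""
    return (
        "You are a Civil Defense Analyst and Ecological Expert.\n"
        f"Weather data: {json.dumps(weather)}\n"
        f"Active fire hotspots nearby: {hotspot_count}\n"
        "Respond with ONLY a JSON object containing exactly these keys: "
        + ", ".join(sorted(REQUIRED_KEYS))
        + ". Each threat key maps to one of: low, medium, high."
    )

def parse_reply(text: str) -> dict:
    """Validate the model reply so the frontend never receives malformed data."""
    # Models sometimes wrap JSON in markdown fences; strip them defensively.
    cleaned = text.strip().removeprefix("```json").removesuffix("```").strip()
    data = json.loads(cleaned)
    missing = REQUIRED_KEYS - data.keys()
    if missing:
        raise ValueError(f"model reply missing keys: {missing}")
    return data

reply = ('```json\n{"wildfire": "high", "flood": "low", "extreme_temp": "medium",'
         ' "landslide": "low", "ecology": "Amazon rainforest..."}\n```')
print(parse_reply(reply)["wildfire"])  # → high
```

Validating on the backend means a malformed or truncated model reply surfaces as a clean 5xx from the Lambda rather than a broken Threat Matrix in the browser.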