I recently completed the Google × Kaggle Agentic AI Capstone, and for my capstone project I took on something deeply meaningful: a “Sentinel AI” designed for disaster management. In just over a week, I built, from scratch, a multi-agent pipeline that can ingest data, analyze risk, and help guide early-warning and response efforts.
🔍 Why this project — and why now
Natural disasters are increasing in frequency and impact worldwide — floods, earthquakes, storms — often hitting vulnerable communities hardest. I wanted to explore how AI agents could contribute meaningfully to disaster resilience. The idea: build an intelligent, autonomous “sentinel” that helps identify risk patterns, highlight vulnerable areas, and support early-warning or response planning.
This felt like a real-world problem where automation, data analysis and smart decision-support could make an impact.
🧠 What is Sentinel AI?
Sentinel AI is a multi-agent system built in a Kaggle notebook. It combines six cooperating agents:
EDA Agent — scans disaster-related datasets (historical incident data, geospatial data, socioeconomic indicators, etc.) and does exploratory analysis to surface patterns.
Feature Engineering Agent — creates derived features (e.g. risk indices, population exposure, proximity to hazard zones, vulnerability scales) to enhance predictive capacity.
Model Builder Agent — trains multiple candidate models (e.g. classification or risk-score predictors) and selects the best according to validation metrics (a small model-selection sketch follows this list).
Evaluation Agent — assesses model performance (accuracy, ROC-AUC, confusion matrix) and generates diagnostic visualizations.
Report Writer Agent — collates the findings: data summaries, model behavior, risk predictions, and visualizations, bundled into an exportable report for decision-makers.
Coordinator Agent — orchestrates the entire workflow end-to-end (sketched just below).
With this architecture, Sentinel AI aims to accept new data, run the full pipeline autonomously, and deliver results in a reproducible and shareable format.
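To make the architecture concrete, here is a minimal sketch of how a coordinator might chain the agents. This is not the notebook's actual code: the class names, the shared `context` dictionary, and the `run()` interface are illustrative assumptions.

```python
# Illustrative sketch only: agent names and the run(context) interface
# are assumptions, not the capstone's actual implementation.
import pandas as pd


class EDAAgent:
    def run(self, context: dict) -> dict:
        df = context["raw_data"]
        # Surface basic patterns: size, missing values, summary statistics.
        context["eda_summary"] = {
            "rows": len(df),
            "missing_values": df.isna().sum().to_dict(),
            "summary": df.describe(include="all"),
        }
        return context


class FeatureEngineeringAgent:
    def run(self, context: dict) -> dict:
        # Derived risk features would be added here (see the later sketch).
        context["features"] = context["raw_data"].copy()
        return context


class CoordinatorAgent:
    """Runs each stage in order, passing a shared context dict along."""

    def __init__(self, agents):
        self.agents = agents

    def run(self, raw_data: pd.DataFrame) -> dict:
        context = {"raw_data": raw_data}
        for agent in self.agents:
            context = agent.run(context)
        return context


# The remaining stages (model builder, evaluation, report writer) plug into
# the same list as long as they expose the same run(context) method.
pipeline = CoordinatorAgent([EDAAgent(), FeatureEngineeringAgent()])
```

The pattern is deliberately simple: each agent reads from and writes to a shared context, so stages can be added, swapped, or re-run without touching the coordinator.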
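The model-building and evaluation steps can be illustrated the same way: train a few candidate classifiers, compare them on a validation metric such as ROC-AUC, and keep the best. The candidate list, feature matrix `X`, and target `y` here are placeholders, not the capstone's actual configuration.

```python
# Sketch of selecting the best candidate model by cross-validated ROC-AUC
# (candidates and hyperparameters are illustrative).
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def select_best_model(X, y):
    candidates = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
    }
    # Score each candidate with 5-fold cross-validated ROC-AUC.
    scores = {
        name: cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
        for name, model in candidates.items()
    }
    best_name = max(scores, key=scores.get)
    # Refit the winner on all the data before handing it to the evaluation agent.
    return candidates[best_name].fit(X, y), scores
```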
🚧 What I did — and what I learned
Developed a multi-agent pipeline (not just a simple model) that handles everything from raw data to final report.
Followed disciplined steps: data cleaning, feature engineering, modeling, evaluation, reporting — with each stage encapsulated in an agent for modularity.
Gained a better appreciation for pipeline design, reproducibility, and automation — key qualities when building AI for social good.
Faced challenges: balancing model complexity against interpretability; handling missing, inconsistent, or noisy real-world data; and defining meaningful risk features (a rough sketch of one such feature follows this list).
Learned that “AI for good” requires more than model performance — it demands thoughtful design, context-awareness, and clarity in outputs that decision-makers can act on.
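To make the risk-feature challenge concrete, here is a rough sketch of a composite risk index built from hypothetical columns (hazard probability, population exposure, vulnerability score). The column names and weights are assumptions for illustration; real weights would need domain expertise.

```python
# Hypothetical composite risk index; column names and weights are illustrative.
import pandas as pd


def add_risk_index(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    components = ["hazard_probability", "population_exposure", "vulnerability_score"]
    # Min-max normalize each component to 0-1 so the weights are comparable.
    for col in components:
        rng = out[col].max() - out[col].min()
        out[col + "_norm"] = (out[col] - out[col].min()) / rng if rng else 0.0
    # Weighted blend of the normalized components.
    out["risk_index"] = (
        0.4 * out["hazard_probability_norm"]
        + 0.3 * out["population_exposure_norm"]
        + 0.3 * out["vulnerability_score_norm"]
    )
    return out
```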
🎯 What the capstone achieved (and future potential)
A working, reusable template: drop in new disaster datasets (flood history, seismic zones, population density, etc.), run the pipeline, and get a risk analysis and report.
A proof-of-concept for AI-assisted disaster risk assessment & early warning pipelines.
A modular architecture that can be extended: integrate geospatial analysis, time-series forecasting, multi-hazard layers, alert-generation agents (a tiny example is sketched after this list), or real-time data ingestion (e.g. weather feeds, sensor data).
A step toward bridging AI + social impact: showing how agentic AI can contribute beyond academic tasks — toward resilience, safety, and human welfare.
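An alert-generation agent would likely be the simplest of those extensions: read the predicted risk scores and flag regions above a threshold. This hypothetical sketch follows the same `run(context)` interface assumed earlier; the threshold and column names are placeholders.

```python
# Hypothetical alert agent: flags regions whose predicted risk exceeds a threshold.
import pandas as pd


class AlertAgent:
    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold

    def run(self, context: dict) -> dict:
        # Assumes an upstream agent stored a DataFrame with 'region' and 'risk_score'.
        preds: pd.DataFrame = context["predictions"]
        alerts = preds[preds["risk_score"] >= self.threshold]
        context["alerts"] = alerts[["region", "risk_score"]].to_dict("records")
        return context
```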
✨ What’s next — and how you can try it
Want to tinker with Sentinel AI yourself? Here’s how:
Head to my Kaggle Notebook (link / repo) — add any disaster-relevant dataset (hazard zones, historical events, population, socioeconomic data, etc.)
Run the full pipeline — EDA, feature engineering, modeling, evaluation, and report generation
Adapt or extend it: add geospatial or time-series analysis, integrate alerting agents, or wrap it in a web dashboard or API for real-time use
If you care about disaster resilience, climate risk or humanitarian aid — I hope you’ll find this project a small but meaningful step.
♥️ Acknowledgements & thanks
Thanks to the Google × Kaggle Agentic AI course for enabling this capstone; to open-data sources; and to all the individuals and organizations working toward disaster risk reduction and climate resilience globally.