DEV Community

agenthustler


How to Build a Wildfire and Disaster Alert Monitor

Natural disasters demand real-time information. This guide shows you how to build a monitoring system that scrapes wildfire data, weather alerts, and emergency feeds to keep communities informed.

Architecture Overview

Our monitor pulls data from three sources: NASA FIRMS (fire data), NOAA weather alerts, and local emergency management websites. We aggregate everything into a single alerting pipeline.
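Before writing the scrapers, it helps to pin down the aggregation step. Here's a minimal sketch that merges the three feeds into one list of normalized events; the field names on the output dicts ("source", "kind", "summary", "severity") are illustrative choices, not anything the upstream APIs dictate:

```python
def normalize_events(fires, weather_alerts, page_updates):
    """Merge the three feeds into one flat list of alert events.
    Input shapes match the scraper methods built later in this guide."""
    events = []
    for f in fires:
        events.append({"source": "nasa_firms", "kind": "fire",
                       "summary": f"Fire {f['distance_km']} km away",
                       "severity": f.get("confidence", "N/A")})
    for a in weather_alerts:
        events.append({"source": "noaa", "kind": a["event"],
                       "summary": a["headline"], "severity": a["severity"]})
    for u in page_updates:
        events.append({"source": "local", "kind": "update",
                       "summary": u["title"], "severity": "unknown"})
    return events
```

Having one common shape means the alerting pipeline downstream only has to understand a single event format, no matter which source fired.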

Core Scraper

pip install requests beautifulsoup4 geopy schedule
import requests
from bs4 import BeautifulSoup
from datetime import datetime
from geopy.distance import geodesic

class DisasterMonitor:
    def __init__(self, api_key, lat, lon, radius_km=100,
                 firms_key="YOUR_FIRMS_MAP_KEY"):
        self.api_key = api_key      # ScraperAPI key, used for page scraping
        self.firms_key = firms_key  # NASA FIRMS MAP_KEY (free, separate signup)
        self.center = (lat, lon)
        self.radius = radius_km
        self.alerts = []

    def check_nasa_firms(self):
        # FIRMS expects the MAP_KEY and query as path segments:
        # /api/area/csv/{MAP_KEY}/{source}/{west,south,east,north}/{day_range}
        lat, lon = self.center
        bbox = f"{lon-1},{lat-1},{lon+1},{lat+1}"
        url = (f"https://firms.modaps.eosdis.nasa.gov/api/area/csv/"
               f"{self.firms_key}/VIIRS_SNPP_NRT/{bbox}/1")
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        fires = []
        for line in resp.text.strip().split("\n")[1:]:
            parts = line.split(",")
            if len(parts) < 2:
                continue
            try:
                fire_loc = (float(parts[0]), float(parts[1]))
            except ValueError:
                continue  # skip malformed rows
            dist = geodesic(self.center, fire_loc).km
            if dist <= self.radius:
                fires.append({
                    "lat": parts[0], "lon": parts[1],
                    # confidence is the tenth column in VIIRS CSV output
                    "confidence": parts[9] if len(parts) > 9 else "N/A",
                    "distance_km": round(dist, 1)
                })
        return fires

    def check_noaa_alerts(self):
        lat, lon = self.center
        url = f"https://api.weather.gov/alerts/active?point={lat},{lon}"
        # api.weather.gov requires a descriptive User-Agent header
        resp = requests.get(url, headers={"User-Agent": "DisasterMonitor/1.0"},
                            timeout=30)
        resp.raise_for_status()
        data = resp.json()
        alerts = []
        for feature in data.get("features", []):
            props = feature["properties"]
            alerts.append({
                "event": props.get("event", ""),
                "severity": props.get("severity", "Unknown"),
                "headline": props.get("headline") or "",
                "expires": props.get("expires", "")
            })
        return alerts

    def scrape_emergency_page(self, url):
        # Pass the target URL as a query param so requests URL-encodes it
        resp = requests.get("http://api.scraperapi.com/",
                            params={"api_key": self.api_key, "url": url},
                            timeout=60)
        soup = BeautifulSoup(resp.text, "html.parser")
        updates = []
        for item in soup.select(".alert-item, .emergency-update, .news-item"):
            title = item.select_one("h2, h3, .title")
            date = item.select_one(".date, time")
            if title:
                updates.append({
                    "title": title.get_text(strip=True),
                    "date": date.get_text(strip=True) if date else str(datetime.now())
                })
        return updates

# Usage
monitor = DisasterMonitor("YOUR_SCRAPERAPI_KEY", 34.05, -118.24, radius_km=50)
fires = monitor.check_nasa_firms()
alerts = monitor.check_noaa_alerts()
print(f"Active fires nearby: {len(fires)}")
print(f"Weather alerts: {len(alerts)}")

Adding Notification Support

import smtplib
from email.mime.text import MIMEText

def send_disaster_alert(alerts, recipient):
    if not alerts:
        return  # don't send an empty email
    body = "\n\n".join(
        f"WARNING: {a['event']} ({a['severity']})\n{a['headline']}"
        for a in alerts
    )
    msg = MIMEText(body)
    msg["Subject"] = f"Disaster Alert: {len(alerts)} active warnings"
    msg["From"] = "your_email@gmail.com"
    msg["To"] = recipient
    with smtplib.SMTP("smtp.gmail.com", 587) as server:
        server.starttls()
        # Gmail requires an app password here, not your account password
        server.login("your_email@gmail.com", "your_app_password")
        server.send_message(msg)

Scaling with Proxy Infrastructure

Emergency sites get hammered during disasters. Routing requests through a proxy service such as ScraperAPI helps maintain access under heavy load. For continuous monitoring, residential proxies like ThorData reduce the chance of IP blocks when polling frequently, and a tool like ScrapeOps can track the health of your monitoring pipeline.
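Proxies aside, the cheapest reliability win is retrying failed requests. Here's a minimal sketch of a GET helper with exponential backoff; `fetch_with_retries` and its parameters are my own naming, not part of any library:

```python
import time
import requests

def fetch_with_retries(url, retries=3, backoff=2.0, **kwargs):
    """GET with exponential backoff -- emergency feeds often fail under load."""
    for attempt in range(retries):
        try:
            resp = requests.get(url, timeout=10, **kwargs)
            resp.raise_for_status()
            return resp
        except requests.RequestException:
            if attempt == retries - 1:
                raise  # out of attempts; let the caller decide what to do
            time.sleep(backoff * (2 ** attempt))  # 2s, 4s, 8s, ...
```

Any of the monitor's `requests.get` calls can be swapped for this helper without other changes.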

Running Continuously

import schedule
import time

monitor = DisasterMonitor("YOUR_KEY", 34.05, -118.24)

def check_cycle():
    try:
        fires = monitor.check_nasa_firms()
        alerts = monitor.check_noaa_alerts()
    except Exception as exc:
        print(f"Check failed: {exc}")  # one bad poll shouldn't kill the loop
        return
    if fires:
        print(f"Active fires within radius: {len(fires)}")
    if alerts:  # the email body is built from weather alerts only
        send_disaster_alert(alerts, "team@example.com")

schedule.every(15).minutes.do(check_cycle)
while True:
    schedule.run_pending()
    time.sleep(60)
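One caveat with a 15-minute loop: the same NOAA alert stays active for hours, so you'd re-email it every cycle. A minimal dedup sketch, keyed on the alert's id when the feed provides one and falling back to the headline (an assumption; use whatever stable field your feed exposes):

```python
def filter_new_alerts(alerts, seen):
    """Return only alerts whose key hasn't been seen before;
    mutates `seen` so subsequent calls skip them."""
    fresh = []
    for alert in alerts:
        key = alert.get("id") or alert.get("headline")
        if key and key not in seen:
            seen.add(key)
            fresh.append(alert)
    return fresh
```

Keep `seen = set()` at module level (or persist it to disk) and wrap the `send_disaster_alert` call with this filter so only genuinely new warnings go out.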

Conclusion

A disaster monitor is a high-impact project combining multiple data sources. Start with the NASA and NOAA APIs, add local emergency page scraping, and build outward from there. The key is reliability: when disasters strike, your monitor needs to work every time.
