Over years of organizing cybersecurity events and working with CISOs and IT leaders, I kept hearing the same problem — too much noise, not enough signal. Security teams were spending their mornings piecing together what matters from dozens of feeds, newsletters, and dashboards before they could even start their actual work.
So I built defend.network — a free platform that publishes daily threat briefings and weekly vulnerability reports. Every briefing is structured, categorized, and paired with action checklists so a security team can scan it in 5 minutes and know exactly what to prioritize.
The whole thing runs fully automated for about $2-3/month. No manual intervention. No database. No framework. Here's how I built it.
The Architecture
I wanted the simplest possible stack that could still produce a professional, fast, SEO-friendly platform. The answer turned out to be: no stack at all.
defend.network is a static site on Netlify's free tier. Pure HTML, CSS, and JavaScript. No React. No Next.js. No database. The entire deployed site is 366KB.
Content isn't written by hand — it's generated by serverless functions that run on a schedule, produce HTML pages, and push them to a GitHub repository. When GitHub receives the new files, Netlify detects the commit and auto-deploys the site. The new briefing is live within seconds.
The full architecture in one sentence: Make.com → Netlify Function → Claude Haiku API → HTML Template → GitHub → Netlify Auto-Deploy.
That's it. Six moving parts, and only one of them — the Claude API call — costs money.
```
Static Site (Netlify Free Tier)
├── index.html
├── briefings/
│   ├── index.html                    (archive with filters)
│   └── [keyword-slug-YYYY-MM-DD].html
├── threats/
│   └── [threat-type].html            (16 category pages)
├── industries/
│   └── [industry].html               (13 category pages)
├── vulnerabilities/
│   └── index.html                    (weekly reports)
├── tools/
│   └── index.html                    (59-tool directory)
├── sitemap.xml                       (auto-generated)
├── feed.xml                          (RSS)
└── data/
    └── schema.json                   (taxonomy definitions)
```
No build step. No compile. No bundler. Just HTML files served from a CDN.
The Automation Pipeline
Every morning at 6 AM UTC, the entire pipeline runs without any human involvement. Here's what happens in about 2-3 minutes:
Step 1: The Trigger
A Make.com scenario (free tier) fires an HTTP request to a Netlify background function at 6 AM UTC daily. Make.com handles the scheduling — that's all it does. One webhook, one schedule, zero cost.
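The entry point can be sketched as a Netlify background function. The shared-secret header check is my assumption (the real function may guard the endpoint differently), but something like it keeps random visitors from triggering a publish:

```javascript
// netlify/functions/daily-briefing-background.js
// The "-background" filename suffix tells Netlify to run this as a
// background function: it acknowledges with 202 immediately and can
// keep working for several minutes afterward.
// WEBHOOK_SECRET and the header name are illustrative assumptions.

function isAuthorized(event, secret) {
  // Make.com would send the shared secret as a header on its daily call
  return Boolean(secret) && event.headers["x-webhook-secret"] === secret;
}

async function handler(event) {
  if (!isAuthorized(event, process.env.WEBHOOK_SECRET)) {
    return { statusCode: 401, body: "unauthorized" };
  }
  // ...fetch feeds, analyze with Claude, build HTML, push to GitHub...
  return { statusCode: 202 };
}

exports.handler = handler;
```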
Step 2: Fetch & Parse
The Netlify background function fetches RSS feeds from 5 cybersecurity publications:
- The Hacker News
- BleepingComputer
- Krebs on Security
- Dark Reading
- The Record
Plus CISA advisories and vendor security bulletins.
The function parses these feeds and selects the top 20 most significant articles based on recency and topic diversity. This pre-filtering keeps the API prompt focused and the cost per briefing low.
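The pre-filter itself can be a small pure function. This is a sketch of one way to balance recency against diversity — the field names (`pubDate`, `source`) are assumptions about the parsed feed items, and the real selection logic may weigh things differently:

```javascript
// Rank items newest-first, then cap how many come from any one feed
// so the top 20 isn't dominated by a single publication.
function selectTop(items, limit = 20, perSourceCap = 6) {
  const sorted = [...items].sort(
    (a, b) => new Date(b.pubDate) - new Date(a.pubDate)
  );
  const perSource = {};
  const picked = [];
  for (const item of sorted) {
    const seen = perSource[item.source] || 0;
    if (seen >= perSourceCap) continue; // enforce source diversity
    perSource[item.source] = seen + 1;
    picked.push(item);
    if (picked.length === limit) break;
  }
  return picked;
}
```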
Step 3: AI Analysis
The curated articles go to Anthropic's Claude Haiku API with a structured prompt that asks Claude to:
- Identify the top 5 threats of the day
- Assign severity levels (Critical / High / Medium / Low)
- Tag each threat with types from a predefined taxonomy (16 categories)
- Tag affected industries (13 categories)
- Write an executive summary
- Generate actionable remediation steps
- Produce a prioritized action checklist
The response comes back as structured JSON — not free-form text. The taxonomy is enforced at generation time, so every briefing uses consistent categories that power filters and category pages downstream.
```json
{
  "severity": "critical",
  "threatTypes": ["supply-chain", "malware", "apt"],
  "industries": ["technology", "healthcare", "finance"],
  "executiveSummary": "...",
  "threats": [
    {
      "title": "Trivy Supply Chain Attack",
      "severity": "critical",
      "industry": "technology",
      "description": "...",
      "actions": ["Audit Trivy versions...", "Rotate credentials..."]
    }
  ],
  "actionChecklist": ["URGENT: Audit Trivy scanner versions...", "..."]
}
```
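The call itself is a single Messages API request. This sketch assumes a current Haiku model id and a simplified prompt — the real prompt is longer and embeds the full taxonomy — and the fence-stripping helper guards against the model wrapping its JSON in markdown:

```javascript
// Sketch of the analysis step. Model id and prompt wording are
// assumptions; the production prompt enforces the site's taxonomy.
async function analyze(articles, apiKey) {
  const res = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "x-api-key": apiKey,
      "anthropic-version": "2023-06-01",
      "content-type": "application/json",
    },
    body: JSON.stringify({
      model: "claude-3-5-haiku-20241022", // assumed Haiku model id
      max_tokens: 4096,
      messages: [
        {
          role: "user",
          content:
            "Analyze these articles. Return ONLY JSON matching the briefing schema.\n\n" +
            JSON.stringify(articles),
        },
      ],
    }),
  });
  const data = await res.json();
  return extractJson(data.content[0].text);
}

// Models sometimes wrap JSON in code fences; grab the outermost object.
function extractJson(text) {
  const match = text.match(/\{[\s\S]*\}/);
  if (!match) throw new Error("no JSON object in model response");
  return JSON.parse(match[0]);
}
```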
Cost per briefing: ~$0.03-0.05. At 30 briefings per month, that's roughly $1-1.50/month for the AI analysis.
Step 4: Build HTML
The function injects the JSON into a stored HTML template. The template handles severity badges, threat type tags (linked to category pages), industry tags, the action checklist with checkboxes, and SEO metadata in the `<head>`.
Briefing URLs are generated keyword-first, date-second: `/briefings/trivy-supply-chain-north-korea-wiper-2026-03-24` rather than `/briefings/2026-03-24`. Primary keywords appear in the URL slug before the date — deliberate for SEO.
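Slug generation is a few lines of string cleanup. A minimal sketch — the keyword list here is illustrative; in the real pipeline the keywords would come from the briefing's top threats:

```javascript
// Build a keyword-first, date-second briefing URL.
function briefingSlug(keywords, dateISO) {
  const kw = keywords
    .join(" ")
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, "") // strip punctuation
    .trim()
    .replace(/\s+/g, "-");        // spaces become hyphens
  return `/briefings/${kw}-${dateISO}`;
}
```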
Step 5: Push to GitHub
The HTML file is pushed to the repo via the GitHub Contents API. In the same run, the function also updates:
- briefings/index.json — master index powering the archive page
- sitemap.xml — new URL added
- feed.xml — new RSS entry
- 29 category pages — each threat type and industry page regenerated
One function execution updates the entire site.
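Each write is one Contents API call. A hedged sketch — owner, repo, and branch are placeholders, and a real run must also fetch and pass the existing file's `sha` when updating a file the repo already has (the API rejects updates without it):

```javascript
// Push one generated HTML file to the repo via the GitHub Contents API.
async function pushFile(path, html, token) {
  const res = await fetch(
    `https://api.github.com/repos/OWNER/REPO/contents/${path}`,
    {
      method: "PUT",
      headers: {
        Authorization: `Bearer ${token}`,
        Accept: "application/vnd.github+json",
      },
      body: JSON.stringify({
        message: `Publish ${path}`,
        // Contents API requires base64-encoded file content
        content: Buffer.from(html, "utf8").toString("base64"),
        branch: "main",
      }),
    }
  );
  if (!res.ok) throw new Error(`GitHub push failed: ${res.status}`);
  return res.json();
}
```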
Step 6: Auto-Deploy
GitHub receives the commit. Netlify detects it. Site deploys in seconds. The new briefing is live, sitemap updated, RSS feed current, and every relevant category page links to it.
Zero manual intervention. The site just gets smarter every morning.
The Structured Taxonomy
This is the part that makes the platform useful for security teams — and it's what drives the SEO strategy.
Every briefing is tagged from a predefined taxonomy:
- 16 threat types: Ransomware, Zero-Day, Phishing, APT, Supply Chain, Data Breach, Malware, Vulnerability Exploit, Credential Theft, DDoS, BEC, Insider Threat, IoT/OT, Mobile Malware, Compliance, Cryptojacking
- 13 industries: Healthcare, Finance, Government, Technology, Energy, Education, Manufacturing, Retail, Legal, Telecom, Transportation, Media, Defense
- 4 severity levels: Critical, High, Medium, Low
These aren't free-form tags — they're enforced in the API prompt, defined in schema.json, and validated before the HTML is generated.
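The validation step can be sketched as a check against the schema before anything is rendered. The schema object below is truncated for illustration, and the exact field names in `schema.json` are my assumptions:

```javascript
// Truncated stand-in for data/schema.json (16 threat types, 13
// industries in the real file).
const schema = {
  threatTypes: ["ransomware", "zero-day", "supply-chain", "malware", "apt"],
  industries: ["healthcare", "finance", "technology"],
  severities: ["critical", "high", "medium", "low"],
};

// Collect every tag the model emitted that isn't in the taxonomy;
// an empty result means the briefing is safe to render.
function validateBriefing(b) {
  const errors = [];
  if (!schema.severities.includes(b.severity))
    errors.push(`unknown severity: ${b.severity}`);
  for (const t of b.threatTypes)
    if (!schema.threatTypes.includes(t)) errors.push(`unknown threat type: ${t}`);
  for (const i of b.industries)
    if (!schema.industries.includes(i)) errors.push(`unknown industry: ${i}`);
  return errors;
}
```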
This taxonomy powers two things:
Category pages. Each threat type and industry has its own landing page (/threats/ransomware.html, /industries/healthcare.html, etc.) — 29 pages total. They aggregate every briefing with that tag, show statistics, and include unique intro content.
Dynamic filters. The briefings archive page loads filter options from schema.json so users can drill down by threat type, industry, or severity. One source of truth, client-side rendering.
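The client-side filtering reduces to one function over the index entries. A sketch, assuming the entries in `briefings/index.json` carry the same taxonomy fields the generator writes:

```javascript
// Filter the archive list; any omitted criterion matches everything.
function filterBriefings(briefings, { threatType, industry, severity }) {
  return briefings.filter(
    (b) =>
      (!threatType || b.threatTypes.includes(threatType)) &&
      (!industry || b.industries.includes(industry)) &&
      (!severity || b.severity === severity)
  );
}

// On the page itself, the filter options come straight from the schema:
// fetch("/data/schema.json").then((r) => r.json()).then(renderFilterBar);
```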
The SEO Strategy
With 29 auto-generated category pages and daily briefings, the SEO strategy mostly builds itself:
Keyword-rich URLs. Every briefing URL leads with keywords, not dates: /briefings/trivy-supply-chain-north-korea-wiper-2026-03-24.
29 category landing pages. Each targets keywords like "ransomware threat intelligence" or "healthcare cybersecurity threats." They accumulate more briefings over time — becoming more authoritative as content grows.
Dense internal linking. Every tag on a briefing links to its category page. Every category page links back to briefings. Google can crawl the full topical graph.
Prev/next navigation. Briefings are chained sequentially — search engines can follow the entire archive.
Auto-generated sitemap & RSS. Regenerated on every publish. New content is discoverable within minutes.
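Regenerating the sitemap is a straight transform of the master index. A minimal sketch, assuming each index entry exposes a path and a publish date (the real entries likely carry more fields):

```javascript
// Rebuild sitemap.xml from the briefing index on every publish.
function buildSitemap(urls) {
  const entries = urls
    .map(
      (u) =>
        `  <url><loc>https://defend.network${u.path}</loc>` +
        `<lastmod>${u.date}</lastmod></url>`
    )
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</urlset>`
  );
}
```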
The compounding effect. Every day the site publishes, it gets harder for competitors to catch up. Each briefing adds a new indexed page, strengthens category pages, extends internal links, and feeds the RSS. 30 new pages per month. 365 per year. All structured, all interlinked, all targeting long-tail cybersecurity keywords.
The Cost
| Service | Role | Monthly Cost |
|---|---|---|
| Claude Haiku API | AI analysis (~30 briefings × $0.03-0.05) | ~$1.00-1.50 |
| Netlify | Hosting + serverless functions (free tier) | $0 |
| GitHub | Repository + Contents API | $0 |
| Make.com | Daily scheduling trigger (free tier) | $0 |
| Beehiiv | Email newsletter (free plan) | $0 |
| Total | | ~$2-3/month |
For context: a comparable threat intelligence subscription from a commercial vendor runs $10,000-50,000+ per year.
What I'd Do Differently
Add individual CVE pages earlier. Every CVE in a vulnerability report could have its own page at /cve/CVE-2026-XXXXX.html — high-intent, low-competition queries. That's 10-20 free indexed pages per week.
Build tool comparison pages. The 59-tool directory is static. Pages like /compare/crowdstrike-vs-sentinelone.html would target valuable "vs" keywords with zero additional API cost.
Try It
The platform is live at defend.network. Daily briefings publish every morning. Weekly vulnerability reports drop every Monday. RSS feed at defend.network/feed.xml.
Completely free. No signup required. No paywall. No ads.
I built this to give back to the cybersecurity community that shaped my career through years of organizing IT and cybersecurity events. If it saves one security team 20 minutes on a Monday morning, it was worth building.
I'd love feedback — what's useful, what's missing, what would make this fit into your daily workflow. Drop a comment below.
If you found this useful, consider sharing it with a security professional who might benefit from a structured daily briefing.