<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: n74459944-web</title>
    <description>The latest articles on DEV Community by n74459944-web (@n74459944web).</description>
    <link>https://dev.to/n74459944web</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3836925%2Fc8ea40fc-a992-4d87-856d-1d10a824dcb1.png</url>
      <title>DEV Community: n74459944-web</title>
      <link>https://dev.to/n74459944web</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/n74459944web"/>
    <language>en</language>
    <item>
      <title>I Built an AI-Powered News Digest That Runs Itself for $5/Month — Here's How</title>
      <dc:creator>n74459944-web</dc:creator>
      <pubDate>Sat, 21 Mar 2026 11:36:17 +0000</pubDate>
      <link>https://dev.to/n74459944web/i-built-an-ai-powered-news-digest-that-runs-itself-for-5month-heres-how-3p29</link>
      <guid>https://dev.to/n74459944web/i-built-an-ai-powered-news-digest-that-runs-itself-for-5month-heres-how-3p29</guid>
      <description>&lt;p&gt;I wanted one place to see the day's most important stories, organized by category, without the noise of a 24/7 news cycle. Nothing I found did exactly that — so I built it in a day.&lt;br&gt;
It's called Le Bref (French for "The Brief"). It scrapes 25+ RSS feeds twice a day, uses Claude AI to pick the top stories, summarizes them into 3 tiers of depth, and stores everything in a permanent searchable archive.&lt;br&gt;
Total monthly cost to run: about $5-10. Here's the full breakdown.&lt;br&gt;
The Problem&lt;br&gt;
Every news app I tried had the same issues:&lt;/p&gt;

&lt;p&gt;Paywalled (The Economist's Espresso, premium newsletters)&lt;br&gt;
Ephemeral (today's digest replaces yesterday's — no archive)&lt;br&gt;
Uncategorized (everything dumped into one feed)&lt;br&gt;
Too long (I want the gist, not a 2,000-word article)&lt;/p&gt;

&lt;p&gt;I wanted a site where I could glance at today's highlights in 30 seconds, read more if something caught my eye, and go back to any past date to see what happened.&lt;br&gt;
The Architecture&lt;br&gt;
[25 RSS Feeds] → [Python Cron Job] → [Claude API] → [Supabase] → [Next.js Frontend]&lt;br&gt;
Five moving parts, all on free or near-free tiers.&lt;br&gt;
Backend: Python on Railway&lt;br&gt;
A cron job runs twice daily (6 AM and 6 PM UTC). It does three things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Scrape — feedparser pulls the latest articles from 25 RSS feeds across 7 categories (Politics, Economics, Tech, World, Health, Crime, Breaking).&lt;/li&gt;
&lt;li&gt;Summarize — All articles get sent to the Claude API in a single call. The prompt asks Claude to act as an editor: pick the top 10-14 stories, categorize them, and generate three tiers of content for each:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;TL;DR: One punchy sentence with a key number or fact&lt;br&gt;
Summary: 3-4 factual sentences&lt;br&gt;
Why It Matters: 2 sentences on broader significance&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;Store — Results go into Supabase (PostgreSQL) with deduplication via headline hashing.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The key cost insight: I pre-generate everything at cron time. No API calls happen when users visit the site. This dropped costs from potentially $50+/day (if every user triggered an API call) to about $0.15/day flat.&lt;br&gt;
Frontend: Next.js on Vercel&lt;br&gt;
The frontend reads directly from Supabase's REST API — no backend server needed for reads, since Row Level Security allows public SELECT access with the anon key.&lt;br&gt;
Features that took the most thought:&lt;/p&gt;

&lt;p&gt;3-tier expandable cards — TL;DR always visible, click to expand full summary&lt;br&gt;
"Go deeper with Claude" button — This links to claude.ai/new?q=... with a pre-filled prompt asking Claude to explain the story in depth. The user uses their own Claude account. Cost to me: $0. Value to the user: massive.&lt;br&gt;
Dark mode — CSS variables swap between light and dark themes, persisted in localStorage, defaults to system preference&lt;br&gt;
Breaking news — Claude can tag one story per day as "breaking." When it does, a pulsing red tab appears in the category bar&lt;br&gt;
Trending topics — If 2+ articles share a subcategory (like "iran" or "fed"), it surfaces as a trending topic&lt;/p&gt;
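&lt;p&gt;The trending-topics rule is easy to sketch. Here is a minimal Python version, assuming each article is a dict with a "subcategory" field (the field name is my guess, not the actual schema):&lt;/p&gt;

```python
from collections import Counter

def trending_topics(articles, min_count=2):
    """Return subcategories shared by min_count or more articles.

    Each article is a dict with a 'subcategory' key (assumed shape).
    """
    counts = Counter(a["subcategory"] for a in articles if a.get("subcategory"))
    return [topic for topic, n in counts.items() if n >= min_count]

articles = [
    {"headline": "Fed holds rates", "subcategory": "fed"},
    {"headline": "Markets react to Fed pause", "subcategory": "fed"},
    {"headline": "Storm hits coast", "subcategory": "weather"},
]
print(trending_topics(articles))  # prints ['fed']
```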

&lt;p&gt;Database: Supabase&lt;br&gt;
Simple schema — one articles table with columns for all 3 content tiers, category, subcategory, sources, and a headline hash for deduplication. Row Level Security is configured so the anon key can only read, and only the service role can write.&lt;br&gt;
I also added a subscribers table for newsletter signups — anyone can INSERT their email, but only the service role can SELECT (so emails stay private).&lt;br&gt;
SEO&lt;br&gt;
This part was important since news content is inherently search-friendly:&lt;/p&gt;

&lt;p&gt;Dynamic sitemap at /sitemap.xml listing every article page&lt;br&gt;
Individual article URLs like /article/2026-03-21/fed-holds-rates&lt;br&gt;
Open Graph images generated dynamically at /api/og&lt;br&gt;
RSS feed at /api/rss for Feedly subscribers&lt;/p&gt;
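&lt;p&gt;For the dynamic sitemap, here is a dependency-free sketch using Python's stdlib ElementTree. The URL pattern follows the /article/date/slug scheme above; the date and slug field names are my assumptions:&lt;/p&gt;

```python
import xml.etree.ElementTree as ET

def build_sitemap(base_url, articles):
    """Build a sitemap.xml string from article date/slug records (assumed schema)."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for a in articles:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = "{}/article/{}/{}".format(
            base_url, a["date"], a["slug"]
        )
        ET.SubElement(url, "lastmod").text = a["date"]
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(
    "https://lebref.news",
    [{"date": "2026-03-21", "slug": "fed-holds-rates"}],
)
```

&lt;p&gt;In the real app this would live inside the Next.js /sitemap.xml route rather than Python, but the output shape is the same.&lt;/p&gt;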

&lt;p&gt;The "Go Deeper" Trick&lt;br&gt;
This is my favorite architectural decision. The typical approach for an AI-powered feature would be:&lt;/p&gt;

&lt;p&gt;User clicks "explain more"&lt;br&gt;
Your server calls an LLM API&lt;br&gt;
You pay for the tokens&lt;br&gt;
You serve the response&lt;/p&gt;
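&lt;p&gt;A quick back-of-envelope for that flow. The per-call price is an assumption (it depends on model and response length), but it shows how server-side explainers scale linearly with traffic:&lt;/p&gt;

```python
def daily_api_cost(users, clicks_per_user, cost_per_call):
    """Estimated daily spend if every 'explain more' click hits your own API key."""
    return users * clicks_per_user * cost_per_call

# Assumed price range per explainer call, depending on model and output length.
low = daily_api_cost(1000, 5, 0.003)   # about 15 dollars/day
high = daily_api_cost(1000, 5, 0.01)   # about 50 dollars/day
```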

&lt;p&gt;The problem: if 1,000 users each click it 5 times a day, you're looking at $15-50/day in API costs.&lt;br&gt;
My approach: the "Go deeper with Claude" button is just a link:&lt;br&gt;
&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;function buildClaudeLink(article) {
  var prompt = "Explain this news story in depth...\n\n"
    + "Headline: " + article.headline
    + "\nSummary: " + article.summary;
  return "https://claude.ai/new?q=" + encodeURIComponent(prompt);
}&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;
It opens Claude's web interface with a pre-filled prompt. The user gets a full expert-level explainer. I pay nothing. It works because Claude.ai has a free tier, and the pre-filled prompt means the user doesn't have to think about what to ask.&lt;br&gt;
Cost Breakdown&lt;br&gt;
Claude API (20 articles/day across 2 editions): ~$3-5/month&lt;br&gt;
Supabase (database, free tier): $0&lt;br&gt;
Vercel (frontend hosting, free tier): $0&lt;br&gt;
Railway (backend cron, starter plan): $0-5&lt;br&gt;
Domain (lebref.news): ~$12/year&lt;br&gt;
Total: ~$5-10/month&lt;br&gt;
What I'd Do Differently&lt;br&gt;
Start with fewer RSS feeds. I launched with 25 feeds, which means ~125 articles per scrape. Claude handles it fine, but debugging prompt issues is harder with more input.&lt;br&gt;
Test the prompt extensively. The summarization prompt is the most important code in the entire project. Small wording changes dramatically affect output quality. I went through several iterations to get Claude to consistently produce the right length and tone.&lt;br&gt;
Set up dedup from day one. My first run produced duplicate stories because the morning and evening editions covered the same events. I now hash the first 5 significant words of each headline per date to catch near-duplicates.&lt;br&gt;
Try It&lt;br&gt;
lebref.news — free, no account needed, dark mode, RSS feed available.&lt;br&gt;
The archive grows every day. The longer it runs, the more valuable the searchable history becomes. It's like compound interest for content.&lt;br&gt;
If you have questions about the architecture or want to build something similar, drop a comment — happy to share more details.&lt;/p&gt;
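&lt;p&gt;As a concrete example of the near-duplicate hashing described above: hash the first 5 significant words of the headline plus the date. The stopword list and field shapes here are my assumptions, not the production code:&lt;/p&gt;

```python
import hashlib

# Assumed minimal stopword list; the real one may differ.
STOPWORDS = {"a", "an", "the", "of", "to", "in", "on", "for", "and", "as"}

def headline_hash(headline, date):
    """Hash the first 5 significant words of a headline, scoped to a date,
    so the morning and evening editions dedupe the same story."""
    words = [w.strip(".,:;!?").lower() for w in headline.split()]
    significant = [w for w in words if w and w not in STOPWORDS][:5]
    key = date + ":" + " ".join(significant)
    return hashlib.sha256(key.encode("utf-8")).hexdigest()

a = headline_hash("Fed Holds Rates Steady Amid Inflation Fears", "2026-03-21")
b = headline_hash("The Fed holds rates steady amid inflation", "2026-03-21")
# a == b: both headlines reduce to 'fed holds rates steady amid'
```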

</description>
      <category>webdev</category>
      <category>nextjs</category>
      <category>python</category>
      <category>ai</category>
    </item>
  </channel>
</rss>
