I built a thing. It's not revolutionary, but it solves a real problem I had.
Like most developers, I consume way too much tech news. HackerNews, Reddit, Dev.to, Medium, random RSS feeds… it's a lot. The problem isn't finding content. It's finding the good content without spending half my day scrolling.
Tech news isn’t scarce. Attention is. So I built a news aggregator that sends me only what’s worth reading, with a suggested comment attached to each article so I can jump straight into the conversation.
What does it actually do?
It saves me time and makes it easy to jump into some of the big conversations happening in the wild.
Pretty simple concept: scrape sources every couple of hours, let AI score the relevance of each article, group related stories into topics, and send me a digest twice a day with only the stuff worth reading. Then it suggests comments or ways for me to weigh in.
The setup is straightforward:
- Scrapes HackerNews, RSS feeds, Reddit, Dev.to, and Medium every 2 hours (sketched below)
- AI analyzes each article and scores it 0-10 for relevance
- Groups high-scoring articles into major/minor topics
- Suggests a comment you can post to join the conversation immediately
- Sends email digests at 9 AM and 6 PM with articles scoring 7+
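To give a feel for the scraping step, here's roughly what pulling the HackerNews front page looks like. This is a simplified sketch using the official HN Firebase API and Node 18+'s built-in fetch, not the exact code in the repo, which also covers RSS, Reddit, Dev.to, and Medium:

```javascript
// Simplified sketch of the HackerNews scraping step.
// The real scraper also handles RSS feeds, Reddit, Dev.to, and Medium.
async function scrapeHackerNews(limit = 30) {
  const ids = await fetch('https://hacker-news.firebaseio.com/v0/topstories.json')
    .then((res) => res.json());

  const items = await Promise.all(
    ids.slice(0, limit).map((id) =>
      fetch(`https://hacker-news.firebaseio.com/v0/item/${id}.json`).then((res) => res.json())
    )
  );

  // Keep only the fields the rest of the pipeline cares about
  return items
    .filter((item) => item && item.type === 'story')
    .map((item) => ({
      source: 'hackernews',
      title: item.title,
      url: item.url ?? `https://news.ycombinator.com/item?id=${item.id}`,
      points: item.score,
      comments: item.descendants ?? 0,
    }));
}
```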
Why did I build this instead of using existing tools?
Honestly? Because existing aggregators either suck at filtering or cost too much for what they do.
We live in a time where we can leverage our technical skills alongside Claude and just... create.
The tech stack
Nothing fancy here:
- Node.js for the scraping and scheduling
- Xano for the backend (database, API, AI tools)
- PM2 to keep it running in the background
- Various AI providers (Claude, GPT-4, Gemini) for content analysis
The app runs on your local machine. Just clone, run the setup wizard, and it handles everything else.
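If you're curious what the PM2 piece looks like, a minimal ecosystem config is roughly this. The file name and script path are assumptions on my part; the setup wizard configures its own:

```javascript
// ecosystem.config.js - a minimal PM2 config sketch (name and entry point are assumptions)
module.exports = {
  apps: [
    {
      name: 'tech-news-aggregator',
      script: './index.js',       // entry point that runs the scrape/digest loop
      autorestart: true,          // restart if the process crashes
      max_memory_restart: '300M', // recycle the process if memory creeps up
    },
  ],
};
```

Start it with pm2 start ecosystem.config.js and it keeps running in the background.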
What makes it useful?
The AI scoring is surprisingly good at filtering noise. It looks for technical depth, discussion-worthy topics, and actual insights rather than just clickbait or rehashed press releases.
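For illustration, the scoring step boils down to something like this. It's a hedged sketch using the Anthropic SDK; the actual prompt and criteria in the repo are more detailed, and the model name is just an example:

```javascript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

// Rough sketch of the relevance-scoring call; assumes the model returns clean JSON.
async function scoreArticle(article) {
  const response = await anthropic.messages.create({
    model: 'claude-sonnet-4-20250514', // example model name
    max_tokens: 200,
    messages: [
      {
        role: 'user',
        content:
          `Score this article 0-10 for a software developer's news digest. ` +
          `Reward technical depth, discussion-worthy topics, and actual insight; ` +
          `penalize clickbait and rehashed press releases. ` +
          `Reply with JSON like {"score": 7, "reason": "..."}.\n\n` +
          `Title: ${article.title}\nURL: ${article.url}`,
      },
    ],
  });

  return JSON.parse(response.content[0].text);
}
```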
The topic extraction is probably the most useful feature. Instead of getting 50 random articles, I get 5-6 topics with related articles grouped together. Much easier to process.
And the wake-from-sleep scheduling means my computer automatically wakes up to send digests, even if I'm not around.
Is this actually better than just checking HackerNews?
For me, yeah. I went from spending 30-45 minutes a day browsing tech news to spending maybe 10 minutes reading curated digests. The signal-to-noise ratio is way better.
What I enjoy most is the suggested comment: it lets me jump into the conversation right away. The AI provider you select during setup is what drafts the suggestion, and personally I’ve found Claude to be amazing.
The setup process
The setup wizard handles most of the complexity. You need:
- A Xano account (free tier works)
- An AI API key (Anthropic, OpenAI, or Google)
- Optionally, a Gmail app password for email digests
First, create a free account at Xano.com, then copy your base URL and create your metadata token.
![Base URL, underlined](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/76tksu3i0v86090z9rdb.png)
Your base URL is underlined in red. Include the https://
Then, in PowerShell or Terminal, cd into the cloned directory and run npm run setup. It walks you through everything: configuring services, creating database tables, and setting up scheduled tasks. Takes maybe 10 minutes.
From then on, it scrapes every two hours and sends you a digest at 9 AM and 6 PM.
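Under the hood, that schedule is just two cron expressions. Here's a sketch with node-cron purely for illustration; the setup wizard actually wires this up as OS-level scheduled tasks (which is what makes wake-from-sleep possible), and the function names below are placeholders:

```javascript
const cron = require('node-cron');

// Illustration only: the repo uses OS-level scheduled tasks instead,
// but the timing boils down to these two expressions.
// scrapeAllSources and sendDigestEmail are placeholder names.
cron.schedule('0 */2 * * *', () => scrapeAllSources()); // every 2 hours
cron.schedule('0 9,18 * * *', () => sendDigestEmail()); // 9 AM and 6 PM
```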
Bonus feature
If you head into Xano after it scrapes, you can populate your voice_examples db table with past writings. Then, you can use the AI agent I bundled with the repo to help generate your own blog articles for Medium, Dev.to, and more.
Believe it or not, it’s a scraper and a content writer.
What I learned building this
My biggest takeaway? Batching is awesome. Store data in memory, send it to Xano in one big batch, and I don’t have to worry about hammering API limits. Going from ~600 requests down to ~20 makes a huge difference!
I went through several iterations, from sending one article at a time to the database to sending everything in batches. I’d recommend sending big chunks of data in batch requests!
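In code, the difference is basically this. The Xano endpoint path below is hypothetical (it depends on how you name your API), but the idea is one request carrying an array of records instead of hundreds of single inserts:

```javascript
// Before (hypothetical): one request per article, ~600 round trips per scrape.
// After: accumulate articles in memory, then send them in one batch.
async function saveArticlesBatch(articles) {
  const res = await fetch(`${process.env.XANO_BASE_URL}/articles/bulk`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ articles }), // the whole scrape in a single payload
  });
  if (!res.ok) throw new Error(`Xano batch insert failed: ${res.status}`);
  return res.json();
}
```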
The AI analysis piece was actually the easiest part. Modern LLMs are pretty good at content evaluation when you give them clear criteria.
The scheduling and wake-from-sleep stuff was more complex than expected, especially making it work across both Windows and macOS.
Ultimately, this was a fun project and you can definitely expect more.
Worth building yourself?
If you want full control over sources, scoring criteria, and digest timings, building your own makes sense. Feel free to fork the repo and modify it as you see fit. This one has been outfitted for tech news, but the scope can be whatever you set. With fully customizable scheduling options as well, the era of highly personalized microapps is upon us.
The GitHub repo is here: https://github.com/drippinrizz/tech-news-aggregator
Happy building!
