<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Maxwell Jensen</title>
    <description>The latest articles on DEV Community by Maxwell Jensen (@maxwelljensen).</description>
    <link>https://dev.to/maxwelljensen</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3906563%2Ffcbab85b-478a-404c-b715-deda006921b4.png</url>
      <title>DEV Community: Maxwell Jensen</title>
      <link>https://dev.to/maxwelljensen</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/maxwelljensen"/>
    <language>en</language>
    <item>
      <title>LLM Aggregator: aggregate RSS feeds and summarise them with LLMs</title>
      <dc:creator>Maxwell Jensen</dc:creator>
      <pubDate>Thu, 30 Apr 2026 18:14:03 +0000</pubDate>
      <link>https://dev.to/maxwelljensen/llm-aggregator-aggregate-rss-feeds-and-summarise-them-with-llms-42c6</link>
      <guid>https://dev.to/maxwelljensen/llm-aggregator-aggregate-rss-feeds-and-summarise-them-with-llms-42c6</guid>
      <description>&lt;p&gt;I’d like to share a tool I’ve been developing for my own workflow: &lt;a href="https://codeberg.org/maxwelljensen/llm_aggregator" rel="noopener noreferrer"&gt;&lt;code&gt;llm_aggregator&lt;/code&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is it?&lt;/strong&gt; A free-software CLI tool written in Go that fetches articles from multiple RSS feeds, optionally filters them by date or keywords, and sends them as a query to any LLM through an OpenAI-compatible API to produce a concise summary, an analysis, or whatever else you prompt it for.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I built it:&lt;/strong&gt; I like some of my news sources, but I have no desire to keep up with hundreds of articles a day. I wanted something that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Works completely from the terminal.&lt;/li&gt;
&lt;li&gt;Does one thing well, the Unix way: fetch, filter, summarise.&lt;/li&gt;
&lt;li&gt;Works with any LLM provider.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;How it works: a quick example&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Feed file (one URL per line)&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cat &lt;/span&gt;feeds.txt
https://news.ycombinator.com/rss
https://lwn.net/headlines/newrss
https://opensource.com/feed

&lt;span class="c"&gt;# Basic usage: summarise recent tech news&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;llm_aggregator &lt;span class="nt"&gt;--api-key&lt;/span&gt; &amp;lt;API_KEY&amp;gt; &lt;span class="nt"&gt;--base-url&lt;/span&gt; &amp;lt;API_URL&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--feeds-file&lt;/span&gt; feeds.txt &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--prompt&lt;/span&gt; &lt;span class="s2"&gt;"What are the latest trends in open-source AI?"&lt;/span&gt;

&lt;span class="c"&gt;# Power-user mode: filter, limit, output to JSON&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;llm_aggregator &lt;span class="nt"&gt;-f&lt;/span&gt; feeds.txt &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-p&lt;/span&gt; &lt;span class="s2"&gt;"Summarize Linux kernel news"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--include-keywords&lt;/span&gt; linux,kernel &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--max-days-old&lt;/span&gt; 2 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--max-total-articles&lt;/span&gt; 15 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--output&lt;/span&gt; json &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--output-file&lt;/span&gt; kernel_summary.json

&lt;span class="c"&gt;# Bonus: a bubbletea TUI with live progress bars!&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;llm_aggregator &lt;span class="nt"&gt;--feeds-file&lt;/span&gt; feeds.txt &lt;span class="nt"&gt;--prompt&lt;/span&gt; &lt;span class="s2"&gt;"Tech highlights"&lt;/span&gt; &lt;span class="nt"&gt;--tui&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;Technical highlights&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Written in Go:&lt;/strong&gt; a single binary that builds for Linux, macOS, and Windows, with zero runtime dependencies. &lt;code&gt;go build ./cmd/llm_aggregator.go&lt;/code&gt; is all you need.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Feed parsing&lt;/strong&gt; by &lt;code&gt;gofeed&lt;/code&gt;: handles RSS, Atom, and JSON Feed.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;LLM integration via &lt;code&gt;openai-go&lt;/code&gt;:&lt;/strong&gt; use any OpenAI-compatible endpoint (DeepSeek, Claude, Ollama, etc.) by changing a few parameters.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Filtering &amp;amp; processing pipeline:&lt;/strong&gt; articles are fetched, filtered (date/keywords), content extracted (with goquery fallback when feeds are snippet-only), and assembled into a context-aware prompt.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexible output:&lt;/strong&gt; plain text, Markdown, or structured JSON (optionally including the original articles).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sensible defaults:&lt;/strong&gt; silent by default, verbose logging behind &lt;code&gt;-v/--verbose&lt;/code&gt;, and the API key can be supplied via an environment variable.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;TUI:&lt;/strong&gt; built with &lt;code&gt;bubbletea&lt;/code&gt; &amp;amp; &lt;code&gt;lipgloss&lt;/code&gt;. Still rough, but should be serviceable.&lt;/li&gt;
&lt;/ul&gt;
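
&lt;p&gt;As an example of the provider flexibility: pointing the tool at a local Ollama instance (which serves an OpenAI-compatible API under &lt;code&gt;/v1&lt;/code&gt; on port 11434) should only require swapping the endpoint. Ollama ignores the API key, so any placeholder value works:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Summarise feeds with a local model via Ollama
$ llm_aggregator --api-key ollama --base-url http://localhost:11434/v1 \
  --feeds-file feeds.txt \
  --prompt "Summarise this week's highlights"
&lt;/code&gt;&lt;/pre&gt;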

&lt;h2&gt;Configuration&lt;/h2&gt;

&lt;p&gt;Every option can be set via command-line flags, a TOML file at &lt;code&gt;~/.config/llm_aggregator/config.toml&lt;/code&gt;, or environment variables prefixed with &lt;code&gt;LLM_AGGREGATOR_&lt;/code&gt;. There is more information on this in the repository; I explicitly designed it to fit into any Linux workflow.&lt;/p&gt;
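
&lt;p&gt;As a quick sketch of the latter two mechanisms (I am assuming here that the TOML keys and variable names mirror the flag names; the repository documents the exact schema):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# ~/.config/llm_aggregator/config.toml (hypothetical keys)
base-url = "https://api.deepseek.com"
feeds-file = "~/feeds.txt"
max-days-old = 2

# The same settings as environment variables:
$ export LLM_AGGREGATOR_API_KEY=&amp;lt;API_KEY&amp;gt;
$ export LLM_AGGREGATOR_FEEDS_FILE=~/feeds.txt
&lt;/code&gt;&lt;/pre&gt;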

&lt;h2&gt;What I’d love feedback on&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;The TUI (&lt;code&gt;-t/--tui&lt;/code&gt;) experience: is it genuinely useful, and if so, what would improve it?&lt;/li&gt;
&lt;li&gt;Your personal use case, and whether anything is missing that would help your workflow.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Nobody else has tried this software yet, so expect bugs or obvious things I might have missed. That said, I already use it successfully to produce a personal daily digest from about 25 feeds, with a Python script that compiles the output into a LaTeX newspaper.&lt;/p&gt;
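
&lt;p&gt;If you want to automate a similar digest, a cron entry using only the flags shown above could look something like this (assuming the API key is exported in the crontab through the &lt;code&gt;LLM_AGGREGATOR_&lt;/code&gt; environment prefix):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Every morning at 07:00: the last day of news, saved as JSON
0 7 * * * llm_aggregator -f ~/feeds.txt -p "Daily digest" --max-days-old 1 --output json --output-file ~/digest.json
&lt;/code&gt;&lt;/pre&gt;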

&lt;p&gt;Interested? Check out &lt;a href="https://codeberg.org/maxwelljensen/llm_aggregator/releases" rel="noopener noreferrer"&gt;releases&lt;/a&gt; in the repository and grab a binary for your platform.&lt;/p&gt;

&lt;p&gt;Happy to answer questions. I want this program to benefit as many people as possible.&lt;/p&gt;

</description>
      <category>showdev</category>
      <category>ai</category>
      <category>opensource</category>
      <category>go</category>
    </item>
  </channel>
</rss>
