samson asekome

How I Built an AI-Powered Editorial Layer for a News PWA Using Notion. A submission for the DEV x Notion MCP Challenge 2026

Notion MCP Challenge Submission 🧠

The Problem I Was Solving
I built TeqHub, a Progressive Web App that aggregates news, tech articles, and videos from over 30 RSS feeds and delivers them to users through a polished mobile-first interface. The app runs on Render, syncs content automatically every 6 hours, and works offline.
The problem with automated content aggregation is quality. RSS feeds are messy. You get duplicate stories from different sources, poorly written summaries, metadata dumps instead of real content, and occasionally articles you simply do not want on your platform. When your entire pipeline is automated, how do you stay in control of what your users see without manually reviewing every article before it publishes?
That was the editorial challenge. And Notion became the answer.

The Architecture
Before explaining the Notion integration, here is how the full pipeline works:
• RSS feeds (30+ sources) flow into sync.js, which fetches and parses articles.
• ai-pipeline.js processes each article in three steps: a similarity dedup that blocks near-duplicate stories, a Groq AI rewrite that rewrites articles over 300 words using Llama 3, and insertion into the SQLite database with a log entry in Notion.
• TeqHub then serves the content to users.
• At the start of every sync run, Notion is checked for any removed articles, which are then deleted from the TeqHub database.
Every piece of content that reaches TeqHub users has passed through this pipeline. Notion sits at the end of the insert step and at the start of every sync run.
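A minimal sketch of that run order, with the stage functions injected. The names (processRemovals, fetchFeeds, runAiPipeline) are illustrative, not the actual TeqHub source:

```javascript
// Sketch of one sync run in the order described above.
async function syncRun({ processRemovals, fetchFeeds, runAiPipeline }) {
  // 1. Honor Notion-side removals before importing anything new
  await processRemovals();

  // 2. Fetch and parse every configured RSS feed
  const articles = await fetchFeeds();

  // 3. Dedup, rewrite, insert, and log each article
  for (const article of articles) {
    await runAiPipeline(article);
  }
  return articles.length;
}
```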

What Gets Logged to Notion
Every new article triggers a Notion page creation with these properties:
• Title: Article headline
• Source: RSS feed name (BBC Sport, Dev.to, etc.)
• Category: Tech News, Sports, Politics, etc.
• Published: Original article publish date
• Summary: AI-generated 2-sentence summary
• URL: Link to original article
• TeqHub ID: Internal database row ID
• Status: Published or Removed
This gives me a clean, readable view of everything on TeqHub at any moment, on desktop or mobile, without opening the admin panel or querying a database.
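Assuming the standard Notion API payload shape, the per-article log entry would look roughly like this. The property names mirror the schema above; buildNotionPage and the database ID are placeholders, and with the official @notionhq/client SDK this object would be passed to notion.pages.create:

```javascript
// Builds the Notion page payload for one article (sketch).
function buildNotionPage(databaseId, article) {
  return {
    parent: { database_id: databaseId },
    properties: {
      Title: { title: [{ text: { content: article.title } }] },
      Source: { rich_text: [{ text: { content: article.source } }] },
      Category: { select: { name: article.category } },
      Published: { date: { start: article.publishedAt } },
      Summary: { rich_text: [{ text: { content: article.summary } }] },
      URL: { url: article.url },
      'TeqHub ID': { number: article.id },
      // Every new article starts out Published; flipping this to
      // Removed in Notion is the editorial kill switch
      Status: { select: { name: 'Published' } },
    },
  };
}
```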

The Bidirectional Part: Removal via Notion
This is where the integration becomes genuinely useful rather than just decorative.
Most Notion integrations are one-directional: push data in, read it, done. This one flows both ways. The Status property on each Notion page is an active control switch for the live application.
Here is what happens when I change a Status to Removed:

  1. On the next sync run (every 6 hours, or triggered manually from admin), TeqHub queries the Notion database for all pages with Status equal to Removed
  2. For each one, it reads the TeqHub ID property
  3. That article is deleted from the SQLite database
  4. The Notion page is archived automatically, so it is not processed again
  5. The article disappears from the app for all users on their next refresh

The entire removal, from Notion to database to user-facing app, requires exactly one action from me: changing a dropdown.
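The removal pass can be sketched like this, with the Notion client and database handle injected so the flow is testable. Function and table names are assumptions, but the Status filter and the archive call follow the steps above:

```javascript
// Runs at the start of each sync: delete Notion-flagged articles (sketch).
async function processRemovals(notion, db, databaseId) {
  // Find every page an editor has flagged as Removed
  const { results } = await notion.databases.query({
    database_id: databaseId,
    filter: { property: 'Status', select: { equals: 'Removed' } },
  });

  for (const page of results) {
    const teqhubId = page.properties['TeqHub ID'].number;
    // Delete the article from the app database (table name is hypothetical)
    await db.run('DELETE FROM articles WHERE id = ?', teqhubId);
    // Archive the Notion page so it is not processed again
    await notion.pages.update({ page_id: page.id, archived: true });
  }
  return results.length;
}
```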

The AI Layer
The Groq rewrite step is what makes the Notion log actually useful as an editorial tool. Raw RSS content is often truncated, poorly formatted, or written in a style inconsistent with the platform. Before an article is inserted and logged:
• Articles under 300 words are left as-is (too short to rewrite meaningfully)
• Raw metadata content, such as Hacker News-style URL dumps, is detected and skipped
• Articles over 300 words are sent to Groq's llama-3.1-8b-instant model
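A rough sketch of that pre-rewrite gate. The metadata-dump heuristic here (a body that is mostly bare URLs) is illustrative, since the post does not show the exact check:

```javascript
// Decide what the pipeline does with raw RSS content (sketch).
function rewriteDecision(content) {
  const words = content.trim().split(/\s+/).filter(Boolean);

  // Skip raw metadata dumps, approximated here as mostly-URL bodies
  const urlish = words.filter((w) => /^https?:\/\//.test(w)).length;
  if (words.length > 0 && urlish / words.length > 0.5) return 'skip';

  // Short articles are kept as-is; longer ones go to the Groq rewrite
  return words.length > 300 ? 'rewrite' : 'keep';
}
```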
The prompt: Rewrite the following article in clear, well-structured English. Keep the same meaning and facts. Do not remove important information. Do not add new ideas or opinions. Write in third person, present tense, where appropriate. Keep it concise; aim for 60 to 80 percent of the original length. Then write a 2-sentence summary starting with SUMMARY on a new line.
The rewritten content goes into the database. The 2-sentence summary goes into the Notion log. So when I review the Notion dashboard, I am reading AI-cleaned summaries of every article, which makes editorial decisions fast.
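Based on the prompt's "SUMMARY on a new line" convention, the model output can be split into body and summary like this. The Groq call itself goes through the OpenAI-compatible chat completions endpoint; the exact request details here are assumptions, not the TeqHub source:

```javascript
// Split a model response into rewritten body and 2-sentence summary.
function splitRewrite(modelOutput) {
  const marker = modelOutput.indexOf('\nSUMMARY');
  if (marker === -1) {
    // No summary marker: keep the whole output as the article body
    return { body: modelOutput.trim(), summary: '' };
  }
  const body = modelOutput.slice(0, marker).trim();
  const summary = modelOutput
    .slice(marker)
    .replace(/^\s*SUMMARY:?\s*/i, '') // strip the marker itself
    .trim();
  return { body, summary };
}

// Sketch of the rewrite call (endpoint and payload are assumptions).
async function groqRewrite(articleText, prompt, apiKey) {
  const res = await fetch('https://api.groq.com/openai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: 'llama-3.1-8b-instant',
      messages: [{ role: 'user', content: `${prompt}\n\n${articleText}` }],
    }),
  });
  const data = await res.json();
  return splitRewrite(data.choices[0].message.content);
}
```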

Why This Matters
Two things make this integration meaningful beyond a basic demo.
First, Notion is infrastructure here, not documentation. The Status field on a Notion page has a direct, automated effect on a production database. Notion is not just storing information; it is issuing instructions to a live system.
Second, it solves a real problem for small teams and solo builders. If you are running an automated content platform alone, you cannot review every article before it publishes. But you also cannot afford to let everything through unfiltered. The Notion editorial layer gives you passive oversight (the log is always there if you want to check) and active control (you can remove anything with one tap) without slowing down the automation.

What I Would Build Next
The natural next step is to add a filter view in Notion that shows only articles from specific sources or categories, so if a particular RSS feed starts producing bad content, I can scan and remove it in bulk. Notion's filtering and grouping make this straightforward.
A second step would be a quality score property, computed at insert time from word count, image presence, and rewrite ratio, so low-quality articles are visually flagged in the Notion dashboard before I even read them.
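One hypothetical scoring function along those lines; the weights and ranges are illustrative, not from the post:

```javascript
// Combine the three signals into a 0-100 score (hypothetical weights).
function qualityScore({ wordCount, hasImage, rewriteRatio }) {
  let score = 0;
  // Longer articles score higher, capped at 50 points around 800 words
  score += Math.min(wordCount / 800, 1) * 50;
  // An image is worth a flat 20 points
  if (hasImage) score += 20;
  // A rewrite ratio near the 60-80% target earns up to 30 points
  const distance = Math.abs(rewriteRatio - 0.7);
  score += Math.max(0, 1 - distance / 0.7) * 30;
  return Math.round(score);
}
```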

A Note on the Build Process
TeqHub was built collaboratively with Claude (Anthropic), not just for code generation but as a thinking partner throughout: shaping the architecture, debugging the Notion integration, refining the AI pipeline, and iterating on every feature in this post. Using an AI assistant to build a platform that itself uses AI for editorial automation felt like an appropriate way to explore what this tooling can actually do in production.
Tech Stack
• TeqHub backend: Node.js + Express + SQLite + Turso
• Content sync: Custom RSS parser
• AI rewrite: Groq API (llama-3.1-8b-instant, free tier)
• Editorial layer: Notion API
• Frontend: Single-file PWA (vanilla HTML/CSS/JS)
• Hosting: Render (free tier)

One Line
I used Notion as a real-time editorial control panel for a news aggregation PWA: every AI-rewritten article is logged automatically, and changing a status field in Notion removes it from the live app on the next sync.

#notion #javascript #webdev #challenge
