Built something for a DEV Challenge but feel like not enough people saw it?
This post is for exactly that.
Sometimes great submissions get buried...
This is such a cool idea, thanks for starting the discussion!
Receiving such feedback from someone with an 8-year club badge is flattering! Haha.
Thanks, man :)
Will keep posting new versions as more challenges come!
DevStretch
An installable PWA that interrupts your coding session with dev-themed movement breaks.
Highlights: terminal dark aesthetic, voice guidance, CLI progress bar, stand-up reminders, zero dependencies, and vanilla JS only :)
Submission link
Feedback welcome on anything, but especially curious if the notification flow works for you (it's my current open issue 😅)
Hey! Love the project - the terminal aesthetic and zero-dependency approach is really clean.
I put together a PR that replaces the browser TTS with pre-generated ElevenLabs audio for higher quality voice guidance, while keeping full offline support through the Service Worker cache: PR #2
It replaces window.speechSynthesis (browser TTS) with high-quality pre-generated ElevenLabs audio files, and includes a generation script (scripts/generate-tts.js) for reproducibility.
Would love to hear what you think!
Thank you so much for this, I really appreciate the effort! 🙏
The audio quality is impressive, but I want to keep the app fully lightweight and dependency-free; that was one of the core goals from the start.
Also tbh, the slightly robotic browser TTS fits the terminal aesthetic better than I expected, it feels almost intentional 🤭
So I’ll stick with the native browser APIs for now. Though this is a very cool approach and a great reference for anyone who wants higher-quality audio in their own fork!
No worries. It's up to you :)
The PR didn't add any dependencies, but it did increase the overall size of the project by about 4 MB because of the audio files shipped to users. The Vercel free plan's bandwidth is good enough for that, and I set the files up to be cached in the browser, so it shouldn't be a big deal either.
But again, all good! You can close that PR if there is no need.
Thank you for clarifying! Really appreciate the thought you put into this, especially the caching approach 😊 I’ll keep it in mind
🇬🇪 🇬🇪 🇬🇪
Built a knowledge evaluator on Cloudflare Workers that scores conversation excerpts using Workers AI and routes them to Notion based on confidence — high signal auto-promotes, ambiguous items land in a Review Queue for human judgment, Claude Desktop reads them back via MCP. The bidirectional loop was the part most submissions missed.
dev.to/dannwaneri/i-built-a-knowle...
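The confidence routing described there can be sketched as a tiny pure function (an illustrative sketch only; the thresholds and destination names are assumptions, not the actual values the project uses):

```javascript
// Illustrative sketch of confidence-based routing (thresholds are assumed).
// High-signal items auto-promote to Notion; ambiguous ones land in a human
// Review Queue; low-signal items are dropped.
function routeExcerpt(score, { promoteAt = 0.85, reviewAt = 0.5 } = {}) {
  if (score >= promoteAt) return "knowledge-base"; // auto-promote
  if (score >= reviewAt) return "review-queue";    // human judgment
  return "discard";                                // low signal
}

console.log(routeExcerpt(0.9)); // "knowledge-base"
console.log(routeExcerpt(0.6)); // "review-queue"
console.log(routeExcerpt(0.2)); // "discard"
```

The interesting part of the real system is everything around this function (Workers AI producing the score, Notion receiving the result), but the routing decision itself reduces to a threshold check like this.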
Nice one!
Anyone with experience in Cloudflare Workers?
Yes, I've been building on Workers for a few years in production. The evaluator was the first time I paired Workers AI with Notion as a judgment surface rather than just storage. Happy to answer anything on the Workers or Workers AI side.
Project Title
JS Daily Byte - Daily JavaScript Quotes Telegram Channel
Short Description
JS Daily Byte is an automated Telegram channel that posts one JavaScript concept every day. It uses GitHub Actions as a scheduler, reads quotes from a JSON file, and sends the daily post to Telegram using the Bot API. The whole system runs without any server, database, or hosting.
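The flow above can be sketched in a few lines of Node (a rough illustration; the env var names and the date-based quote selection are my assumptions, not necessarily what the project does):

```javascript
// Sketch of a serverless daily-post script run by a scheduled GitHub Action.
// In the real setup the quote list would be read from a JSON file in the repo.
const quotes = [
  { text: "Array.prototype.at(-1) returns the last element." },
  { text: "Object.freeze() is shallow, not deep." },
];

// Deterministic day-of-epoch index: every run on the same date picks the same quote.
function pickQuote(list, date = new Date()) {
  const dayIndex = Math.floor(date.getTime() / 86_400_000);
  return list[dayIndex % list.length];
}

// Posts via the Telegram Bot API; BOT_TOKEN / CHANNEL_ID come from Actions secrets.
async function postDaily() {
  if (!process.env.BOT_TOKEN || !process.env.CHANNEL_ID) return; // dry run without secrets
  const url = `https://api.telegram.org/bot${process.env.BOT_TOKEN}/sendMessage`;
  await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ chat_id: process.env.CHANNEL_ID, text: pickQuote(quotes).text }),
  });
}

postDaily();
```

A cron trigger in the workflow file (`on: schedule:`) is the only "infrastructure" needed, which is what makes the server-free design work.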
Submission Link
dev.to/sushantrahate/js-daily-byte...
Feedback I’m Looking For
I’d love feedback on:
I love the simplicity of this project! There are many ideas for making it "better", but most of them would add complexity.
One thing you could do is open a discussion group linked to the channel.

And maybe add more channels to distribute to, like X, Mastodon, and Bluesky.
Thanks for the suggestion! I tried to keep the system intentionally simple with GitHub Actions only. ♥️
The discussion group is a really good idea actually. It could turn the channel into more of a learning community. Also thinking about distributing the posts to other platforms later.
Happy you found my suggestions useful :)
You could also add inline buttons under each post that do something useful. Not sure what exactly, maybe upvotes. Or, for upvotes, you could use reactions: create some styled emojis and allow only those on posts. You could also allow paid reactions if someone wants to support the project.
this is how you limit reactions:
Thanks! I’ve updated the reactions and decided not to enable paid reactions.

Nice! I dropped one custom emoji in the discussion chat; if you like it, you can add it :)
Good call
Thanks, Ben.
I’m thinking of turning this into a recurring series, like “Welcome Thread v1, v2, v3” as new challenges go live. Since DEV only allows 4 tags per post, that also helps keep each thread focused on active challenges and recent projects.
Yeah, I agree. Something like this :)
Let me start 👇
Here are a few of my recent projects and challenge submissions.
1. Notion Skills Registry
Notion Skills Registry: A Package Manager for AI Agent Skills with MCP
Nikoloz Turazashvili (@axrisi) ・ Mar 16
A Notion-based registry designed to manage AI agent skills like packages.
Instead of duplicating prompts and workflows across projects, this system lets you version, organize, and distribute skills from a central workspace. It works well for teams building with MCP and agent frameworks where skills evolve quickly.
2. Gemini-Powered History Narrator
I Built a Gemini-Powered History Narrator 6 Months Ago. Here's What I'd Tell You Now.
Nikoloz Turazashvili (@axrisi) ・ Feb 28
An AI storytelling project that turns historical events into engaging narrated experiences using Gemini.
The post reflects on building the project months ago and shares lessons learned about AI narration, storytelling prompts, and what I would do differently today.
3. Issue Discovery Tool for Open Source Contributors
Ever Spent Hours Looking for an Open Source Issue to Contribute To? Those Days Are Over.
Nikoloz Turazashvili (@axrisi) ・ Feb 28
A tool built to help developers find open-source issues to contribute to faster.
Instead of browsing GitHub endlessly, it surfaces relevant issues and helps contributors discover projects where they can actually make an impact.
Curious to hear what you think.
Now it’s your turn. Drop your challenge submission below 👇
Hey 👋
I’d like to submit my project, TableCraft
🔗 GitHub: github.com/jacksonkasi1/TableCraft
It’s a developer-first tool focused on making data tables more efficient, flexible, and easy to use in real-world apps. Built with a strong focus on performance and usability.
Would love your feedback! 🚀
hey!
Have you submitted the project using the template of the challenge?
Link to challenges: dev.to/challenges
Do it first if you haven't done it. This thread is to get feedback, not to submit projects for challenges :)
Oops! Sorry, I didn’t notice that. Thanks for pointing it out.
This is an awesome tool! Have you listed it on forg.to?
1/2
This is a unique niche!
One thing I can suggest right away is to turn that prompt you copy-pasted into a reusable skill and add it everywhere you need it. Here is an example, but I didn't dig deep enough to check whether it fits the workflow 100%, so please double-check:
2/2
Another thing: from what I can see in the repo, it only works on source CSVs, right?
Maybe there are public APIs for getting the data, so the AI could handle that alongside the general analysis, database queries, and Notion updates?
In my cursory look at 'rarity state' resources for things like books, coins, etc., I didn't find one that matched what I was looking for in this example application. The idea is also extensible beyond books, right? Postage stamps, coins, sports memorabilia, and so on all have high-dollar-value items that the same approach could cover.
Nice job on the prompt, btw.
I guess that's the idea for your next billion-dollar SaaS project: a service providing that info for all collectibles? ))
That's not a project I personally want to tackle, but it would be interesting.
CosmoDex
Now you can simply 'speak' to NASA NEO data, thanks to Gemini! Check it out here: dev.to/astrodeeptej/fusing-nasa-da...
Hey everyone, here's my submission: Plante, an automated greenhouse and gamified learning platform. It even embeds the little game from the app, so you can play it within the blog 😀 Would love feedback on the app, and also tips for getting better at working with hardware. Thanks!
From Deadlocks to Green Streaks: Building an AI Greenhouse with Gemini in 36 Hours 🧑🌾
Jowi A ・ Mar 3
flintwork-token-sync: Designers edit design tokens in a Notion table, an AI agent validates everything and builds CSS. Broken references get caught before they hit code.
Built it with two MCP servers - Notion MCP handles the database, a custom one handles validation and build. Claude orchestrates both.
dev.to/vmvenkatesh78/i-built-an-mc...
Here's the link to a demo: Watch the demo →
Would love to know if the dual-MCP architecture makes sense from the writeup or if it needs more explanation.
Hey, just checked the submission. Please include the demo video. I think it is part of the requirements anyway.
It would be great if you could show how this is intended to be used before anyone can give suggestions. :)
Hey @axrisi
I've attached a video link to the demo under the section titled Video Demo to the post
Please do let me know if you are unable to access it
Here's the link to it: Watch the demo →
I just watched it and now get the point in full.
We can definitely see it working, and it's a really interesting project.
Do you think this could work even better if Figma variables synced into Notion, so designers would not have to maintain both manually?
Feels like that might make the workflow more natural for actual design teams: designers keep changing values in Figma, Notion stays as the structured token/control layer, and your existing validation + build pipeline still does the rest.
Also curious whether Figma variables could map directly to your Notion token schema, or whether Figma MCP could handle that first and then feed the rest of your existing flow.
Yeah, that's a good idea actually. Figma variables would map pretty cleanly to the three tiers: colors as globals, light/dark modes as semantic tokens.
A Figma MCP could write into Notion and the rest of the pipeline wouldn't need to change since it just reads from Notion regardless of how tokens got there.
Haven't built that part yet but it's definitely something I want to explore. Notion stays as the control layer, Figma just becomes another input.
Thanks for the suggestion.😀
Yes! Amazing that you also think it could work.
I just imagined a manager who needs to tell designers they have to maintain the Notion database for all changes they make in Figma. 🫣😁
This could make their life easier, and it would probably be less error-prone too.
WellInsightEngine
A prototype platform that turns oil well telemetry into AI-generated operational insights.
It combines TimescaleDB for KPI precomputation with Gemini for reasoning, so AI explains instead of calculating.
It detects things like instability, repeated choke adjustments, and abnormal patterns, and generates actionable insights for engineers.
Submission:
From Metrics to Meaning: Building WellInsightEngine with Gemini
Vlad ・ Mar 4
Demo:
Would appreciate feedback on:
• insight quality and usefulness
• architecture decisions
• potential real-world applicability
Interesting to hear about new niches I never thought of!
One question I have: would it be correct to say that any professional looking at those metrics would immediately understand everything, without the Gemini summaries? Or is it more complex, requiring real digging, so that plugging in Gemini saves real human hours?
Also, what about cron jobs to generate those reports both periodically and on demand?
I couldn't see whether there is a database for saving those Gemini reports so you can look at them after some time has passed. Maybe even export them as PDFs, in case anyone needs to show them to people who don't have access to those metrics/reports.
And one more question/suggestion: I think making an MCP for this service would be incredible. It would be an interactive chat with an agent where it gets all metrics, generates a report, and also digs deep into the solution if anything needs to be fixed.
Also, please check the Cloud Run deployment:
Good questions, thanks for taking a close look.
Metrics vs Gemini
I checked this with a professional from DTEK. The feedback was that the reasoning is accurate and matches how engineers would interpret these metrics in practice. It's not that metrics are unreadable, but they do require digging and context switching.
Cron jobs
Right now it's more on-demand, but the infrastructure is already in place for scheduled jobs. We can leverage Cloud Tasks to trigger the insight generation endpoint, since the system is built natively on GCP.
Persistence
Insights are stored in the database and linked to the data used during reasoning, so they can be revisited later.
You can also share them: well-insight-engine-frontend-servi...
Export
PDF export makes sense, especially for people without access to the system.
MCP
Yeah, that direction makes a lot of sense.
Cloud Run downtime
Thanks for pointing it out. GCP credits ran out at some point, I topped up the balance, so it should be accessible now, feel free to check: well-insight-engine-frontend-servi...
Note: this is a prototype with pre-seeded data, so there's no real-time ingestion. Better to filter from February in the calendar to see data:

Thanks for explaining in detail :)
Wish you luck in the challenge.
Notion Memory Engine: I built a system that lets AIs share the same Notion database as a brain. You finish a conversation in Claude, type the @memory -create command, and the full context gets saved to Notion. Open Copilot in VS Code and it reads from that same database: no copy-pasting, no starting over.
I'm not super confident in the demo video yet; I think I might need to redo it. I'd also love feedback on whether the workflow comes through clearly without a live narrator.
I Was Tired of Explaining My Project to Every New AI Tab I Opened
Isah Alamin ・ Mar 19
Nothing much to critique here, honestly. It’s simple, clear, and easy to install, which is exactly what makes it good. Nice work.
I also think this is a great example of the kind of use case my submission is meant to support, especially since you’re working with skills:
Notion Skills Registry: A Package Manager for AI Agent Skills with MCP
Nikoloz Turazashvili (@axrisi) ・ Mar 16
It can help manage skills like @memory -read and @memory -create full more cleanly. So if you ever want to update or change a skill later, you can do it once in Notion and then install or update it across every workspace where you use it.
It also creates a more solid infrastructure around your skills, so they become easier to maintain, version, and reuse as your setup grows.
Appreciate that! And your Skills Registry project looks solid; it definitely feels like they'd pair well together. Will check it out!
ENGRAM — Engineering Intelligence That Lives in Your Notion
A self-hosted Rust binary that listens to your GitHub repos via webhooks, runs every event through 9 specialized AI agents (Claude), and writes structured intelligence into 23 interconnected Notion databases — all via MCP.
No local database. Notion IS the persistence layer.
What it covers out of the box:
🔐 Security audits on every PR
📉 Performance regression tracking
📋 RFC lifecycle management
🚀 Auto-generated onboarding docs
🔍 Code review pattern analysis
💯 Codebase health scoring
One binary. Zero config files. Browser-based setup wizard.
ENGRAM is a self-organizing engineering intelligence platform. It connects your GitHub repositories, Notion workspace, and Claude AI into a single autonomous system that continuously analyzes your codebase and writes structured intelligence directly into Notion.
Engineering Intelligence, etched in Notion.
No polling. No manual data entry. GitHub webhooks push events to ENGRAM, 9 specialized AI agents interpret them using Claude, and every insight — security audits, performance regressions, architecture maps, RFC lifecycle tracking, team health reports, onboarding documents — is written as structured, queryable, relational data in your Notion workspace.
Notion is the central nervous system. Every metric, every decision, every piece of intelligence lives in 23 interconnected databases in your workspace.
Key Features
All assets are embedded via rust-embed. Just download and run.
🔗 Full submission: dev.to/manojpisini/engram-ai-power...
Feedback I'd genuinely love:
Resume Tailor is an AI-powered resume and cover letter tailoring tool. Give it a job posting and your resume, and it outputs a tailored resume and cover letter as downloadable PDFs.
Hey, maybe I missed it, but how is Notion MCP actually used?
You are calling the client directly, no?
You’re right! Thanks for pointing that out; in my current setup I’m calling the Notion client directly, not using MCP.
I treated Notion as a structured data source in a deterministic pipeline, so I didn’t delegate control to the LLM.
To fully meet the requirement, I’d expose Notion via an MCP server and let the model orchestrate those calls instead.
Thanks for creating this space! It is a huge help for those of us whose posts got a bit buried in the main feed.
Project Title: Stop Context Switching: How I Built an Autonomous AI Shield (Synapse)
What it does: It’s an AI "bouncer" built with Node.js, Groq (Llama-3), and Notion MCP. It intercepts incoming Slack messages and autonomously decides whether to create a bug ticket in a Notion Sprint database or search company docs to answer questions, protecting developer focus time.
Link:
I Let AI Handle My Slack Messages So I Could Actually Code (Notion MCP)
Balkaran Singh ・ Mar 15
Feedback Wanted: As a CS undergrad, I am actively trying to learn how industry veterans evaluate agent architecture. I’ve already gotten some fantastic pointers on webhook security and tool routing, but I’d love to hear any thoughts on handling LLM tool-calling edge cases or structuring the backend for a production environment.
I won't repost the feedback I already gave you on your post here. But maybe others have something more to say!
Just submitted my Notion MCP Challenge project:
Ghost Maintainer — An AI Junior Partner for Open Source
Solo maintainers wear too many hats. Ghost Maintainer takes over the repetitive parts — triaging issues, reading code, writing fixes, and opening PRs — so you can focus on the work that actually needs a human.
It uses Notion as an operations center. Bugs get triaged and fixed automatically. Features queue up until you're ready. Everything stays visible in Notion so you never lose track.
Submission Link: dev.to/sbis04/ghost-maintainer-an-...
If you are an open source project maintainer, let me know what challenges you face in your day-to-day and if a tool like this would be useful. Thank you!
Hi. Actually, a very well thought-out project.
Couple of open questions:
1) Does the challenge use MCP or rather direct Notion API?
2) What happens if someone bombards issues? If malicious actors do that, would that burn through AI budget?
Thanks for checking it out and asking questions — they actually pushed me to rethink a couple of things!
1) MCP vs direct Notion API:
Both, and your question actually made me tighten up the MCP side. Ghost Maintainer now runs two MCP servers side by side:
Notion MCP — this is Notion's official open-source MCP server. It gives the AI direct access to the workspace — searching databases, reading pages, creating content, and appending blocks. It's the foundation layer.
Ghost Maintainer MCP (custom Dart server) — this sits on top and adds the maintenance-specific tools: ghost_triage_issue, ghost_investigate_issue, ghost_deploy_fix, etc. The prompts are written to use both servers together — for example, the triage prompt tells the AI to search for duplicates through Notion MCP before classifying the issue.
You connect both to your AI client (Gemini CLI, Claude, Cursor), and they work as a pair. Notion MCP handles the "see and touch Notion" part, Ghost Maintainer handles the "think about the code and open PRs" part.
The GitHub Actions automation pipeline does still call the Notion API directly for the event-driven parts (ingesting new issues, updating triage results) — that made more sense than routing everything through MCP for server-to-server communication. But the interactive workflow is fully MCP-driven.
2) Issue spam / burning through AI budget:
Yeah, this is a real thing to worry about. Right now, there's no rate limiting, so a flood of issues would hit the Gemini API for each one. A few things help in practice though:
You can turn auto-fixing off with ghost_maintainer config --auto-fix-bugs=false. That way only the triage step runs automatically (which is lightweight). The investigation + PR creation, which is the expensive part, only happens when you explicitly run ghost_maintainer fix <issue>, so you control which issues get the full treatment. For a solo maintainer getting a handful of issues a week, it's a non-issue. For a popular repo, you'd definitely want those guardrails in place first.
Happy you found it useful, man :)
Wish you luck in the challenge :)
Built a tool that grades MCP schemas A+ to F. Pointed it at Notion's own server.
Notion: F. 19.8/100. 22 tools, 4,463 tokens. Every tool name violates the spec.
Then I used Notion MCP to build a live dashboard showing 201 graded servers.
dev.to/0coceo/i-built-a-tool-that-...
🚀 Project Title:
I Built an AI That Runs Your Entire Academic Life Automatically (Notion MCP)
🧠 What it does:
I didn’t just build a study tool… I built a full AI Academic Operating System.
ACADEMICOS takes a raw syllabus and turns it into a fully automated, self-managing study system inside Notion.
What used to take hours of planning now happens in seconds:
👉 This isn’t productivity. This is autonomous academic execution.
It literally feels like having an AI that thinks, plans, and adapts like a top 1% student — but works for you 24/7.
🔗 Submission Link:
dev.to/afsal_ahmed/i-built-an-ai-t...
💬 Looking for feedback on:
If you’ve ever struggled with planning, consistency, or execution —
this might be something interesting to explore.
Would love your thoughts! 🙌🚀
Cool idea and strong use case.
My only doubt is challenge fit: from the repo/write-up it looks more like a direct Notion API integration than actual Notion MCP usage. So the project itself is interesting, but I’d make the MCP part much clearer if that’s meant to be core.
ENGRAM — AI-Powered Engineering Intelligence That Lives in Your Notion
A self-hosted Rust binary that listens to your GitHub repos via webhooks, runs every event through 9 specialized AI agents (Claude), and writes structured intelligence into 23 interconnected Notion databases. No local database — Notion IS the persistence layer.
Security audits, performance regressions, RFC lifecycle tracking, auto-generated onboarding docs, code review analysis, health scoring: all written as queryable, relational data in your workspace. One binary. Zero config files. Setup wizard in the browser.
Submission: dev.to/manojpisini/engram-ai-power...
Feedback I'd appreciate:
Is the Notion-as-database architecture convincing, or does it feel like a limitation?
Does the 9-agent breakdown make sense, or would fewer, broader agents be better?
Any dashboard panels you'd want to see that aren't there?
I created something that turns your Notion workspace into a full RPG.
dev.to/asynchronope/questboard-tur...
🚨 Project Valkyrie — AI for Crisis Logistics (Notion + MCP)
What happens when a storm is about to hit your logistics hub?
Most teams scramble across tools and lose critical time. I built Valkyrie to fix that.
It:
• detects threats near your assets
• creates incident reports automatically in Notion
• suggests response actions
• keeps humans in control before execution
💡 Example:
A storm approaches a Singapore hub → Valkyrie detects it → creates an incident → operator approves → response begins.
🔗 Project: dev.to/kanyingidickson-dev/project...
Would love feedback on:
• Is the problem clearly defined?
• Does this feel like a real-world system or just a concept?
• What would make this production-ready?
Happy to check out your projects too 👀
Project Title
Nudgen - AI-Powered Retention & Outreach Email Automation
Short Description
Nudgen helps small businesses and SaaS teams automate both retention and outreach email campaigns with minimal effort.
It personalizes every message at the contact level using your own branding voice, so emails feel natural instead of generic. Campaigns are behavior-driven, adapting based on user actions and stopping automatically when users engage.
Beyond retention, Nudgen also supports a common real-world scenario: finding potential lead contacts and running cold email outreach campaigns automatically.
Instead of manually researching leads and writing sequences, Nudgen can help you:
We also make it friendly for builders: Nudgen can integrate into AI workflows via CLI, allowing you to run campaigns using tools like OpenClaw, Claude, or your own agents.
The goal is simple: help teams grow (acquire + retain users) without complex setup or constant manual work.
Submission Link
nudgen.net/
Feedback I’m Looking For
I’d love feedback on:
Project Title: GitNotion - GitHub → Notion + AI Reports
Short Description: Built an MCP server that pulls GitHub activity into Notion and writes reports on top of it. Point it at any repo and it syncs issues, PRs, commits into databases. Then generates weekly summaries, release notes, contributor breakdowns. Ships via npx gitnotion so you get 8 tools in Claude Desktop without any setup.
Submission Link: dev.to/dax-side/i-built-an-mcp-ser...
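The weekly-summary grouping step can be sketched like this (illustrative only; the field names and Monday-based weeks are my assumptions, not GitNotion's actual schema):

```javascript
// Illustrative sketch: bucket synced commits by the Monday of their week,
// so a summary can be generated per bucket. Not GitNotion's real code.
function weekKey(dateStr) {
  const d = new Date(dateStr);
  const day = (d.getUTCDay() + 6) % 7; // 0 = Monday
  const monday = new Date(d.getTime() - day * 86_400_000);
  return monday.toISOString().slice(0, 10); // YYYY-MM-DD of week start
}

function groupByWeek(commits) {
  const buckets = {};
  for (const c of commits) {
    (buckets[weekKey(c.date)] ??= []).push(c);
  }
  return buckets;
}

const commits = [
  { sha: "a1", date: "2024-03-04T10:00:00Z" }, // Monday
  { sha: "b2", date: "2024-03-06T10:00:00Z" }, // same week
  { sha: "c3", date: "2024-03-12T10:00:00Z" }, // next week
];
console.log(Object.keys(groupByWeek(commits))); // ["2024-03-04", "2024-03-11"]
```

Once the data is bucketed like this, each bucket can be handed to the report generator independently, which is what makes the weekly cadence cheap.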
Feedback I want:
How would you use this? Solo dev tracking your own projects? Team lead wanting automated updates?
What's missing? I sync basic GitHub data but wondering what other stuff would make this worthwhile for daily use
Does anyone want automated project reports? Or is this solving a problem nobody has?
Interesting idea.
Have you thought about using Notion’s existing GitHub sync as the foundation, then layering your AI reports on top of that?
Feels like that could simplify the workflow and make the value prop clearer, because the most differentiated part of this project is really the summaries, release notes, and contributor insights.
Built FlowMind, a unified AI-powered system that transforms how individuals and teams manage their digital lives by centralizing workflows, communication, and automation into a single intelligent interface.
dev.to/kanew/flowmind-ai-powered-l...
FlowMind/lib/notion.ts
From the code, this looks like a direct Notion SDK/API integration on the server rather than Notion MCP. That does not make the project bad, but it does make the challenge fit less clear if Notion MCP is supposed to be a core part of the build.
Thanks for the feedback — that’s fair and helpful.
You’re right that the current implementation uses the Notion server SDK directly for core data paths. I built this first for speed and reliability, but for the Notion MCP Challenge, MCP should be central, not peripheral.
I'm now refactoring so Notion MCP becomes the primary execution layer for task, user, and workflow operations, with SDK only as a temporary fallback during migration. I will also document and demo MCP-first flows in the final submission so the challenge fit is explicit.
Appreciate you flagging this early — I'm actively implementing the change now.
I created Animu, a web app designed to solve two big problems for returning or new anime viewers.
First, the app has a cross-season Arc Search. A user might only remember that they stopped watching around the time "Killua met Biski." They can type that phrase into the search bar. The app pulls data for the entire anime franchise and ranks the episodes by relevance. This helps them find the exact episode they need.
Second, the app includes a Sentiment Map. This feature uses AI to read the synopsis of every episode in a series and grades the emotional tone. The app then graphs these scores over the course of the show. If a viewer wants to know if an anime stays lighthearted or eventually turns dark, they can see the overall emotional arc at a glance before they decide to start watching.
Submission post: dev.to/quodline/finding-where-you-...
I use Cursor every day. Sonnet, the frontier models — constantly. But Composer and Auto? I'd open them, poke around, and close them. Those tokens just expire every month doing nothing.
So I built Iynx.
What it does
Why I built it
Try it
github.com/amit221/Iynx
MIT licensed. Happy to answer questions!
RevOps AI. I Turned Notion into an AI-Powered CRM with Gemini and MCP
dev.to/pooyagolchian/revops-ai-i-t...
Hey. Quick question: since the AI chat can call MCP tools directly in the app, how are you handling chat history, context limits, and compaction over time? Are you using any RAG or memory layer?
The chat is intentionally stateless. History is kept in-browser with a 10-message sliding window. Since the AI has live MCP tool access to Notion, it can always fetch current data on demand rather than relying on conversation memory. For a CRM assistant where interactions are typically short and task-oriented ("update this deal", "score these leads"), this is a pragmatic tradeoff. For a production version, I'd add conversation persistence, token-aware compaction, and optional summarization of older turns.
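The sliding-window part of that tradeoff is simple to sketch (illustrative; only the window size of 10 comes from the description above, the rest is assumed):

```javascript
// Minimal sketch of a 10-message sliding window kept in the browser.
// Older turns simply fall off; live data is re-fetched via MCP tools instead
// of being carried in conversation memory.
const WINDOW_SIZE = 10;

function appendMessage(history, message) {
  return [...history, message].slice(-WINDOW_SIZE); // keep the most recent N turns
}

let history = [];
for (let i = 1; i <= 12; i++) {
  history = appendMessage(history, { role: i % 2 ? "user" : "assistant", text: `msg ${i}` });
}
console.log(history.length);  // 10
console.log(history[0].text); // "msg 3"
```

Because the window is a pure function of the previous state, persistence or token-aware compaction can later be layered on without changing the chat loop itself.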
dev.to/saint_zero_day/i-built-a-ta...
From Hallucinations to Grounded AI: Building a Gemini RAG System with Qdrant
Soham Sharma ・ Mar 4