
live ASMR

I built a privacy-first AI text splitter because LLM token limits kept getting in the way.

Copilot: "Message too long"

I ran into this exact issue last week while trying to analyze a massive PDF for my research. It’s incredibly frustrating when Copilot says "Message too long" or ChatGPT just hallucinates the second half of the document.

Every AI model has a "Context Window." Even though they claim large token limits, the web interfaces often cap your input (Copilot is notoriously strict, often capping around 10k characters).

You can manually copy-paste your text into Notepad, check the word count, and cut it into blocks. But the problem is losing context: if you don't tell the AI to "wait for the next part," it tries to answer from incomplete data.

I stopped doing it manually because it was taking too long. Instead, I built a browser-based tool called Free AI Text Splitter that automates this.

It basically takes your PDF or text and chops it into segments that fit perfectly into Copilot or any LLM you use. It also adds a "prompt" to every segment telling the AI to wait until the whole file is uploaded.
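The core idea can be sketched in a few lines. This is a minimal illustration of the technique, not the tool's actual source: the character cap, the prompt wording, and the `splitForLLM` function name are all my assumptions.

```javascript
// Hypothetical sketch of the splitting logic: cut text into chunks that fit
// under a character cap (e.g. ~10k for Copilot), prepending to each chunk an
// instruction telling the AI to wait for the remaining parts.
function splitForLLM(text, maxChars = 10000) {
  // Instruction prepended to every segment; wording is illustrative.
  const header = (i, n) =>
    `[Part ${i} of ${n}] Do not answer yet. Wait until all parts are uploaded.\n\n`;

  // Reserve rough headroom for the header so each final chunk stays under the cap.
  const budget = maxChars - 120;

  const bodies = [];
  for (let start = 0; start < text.length; start += budget) {
    bodies.push(text.slice(start, start + budget));
  }
  return bodies.map((body, i) => header(i + 1, bodies.length) + body);
}
```

Each returned string is ready to paste straight into the chat box; only the last part is followed by the actual question you want answered.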

You can copy each split part with a single click and paste it into your LLM, repeating until all sections are uploaded. It's straightforward, efficient, and quick.

The best part for me was that it runs locally in the browser, so I didn't have to upload my private docs to a server.

Here is the link if you want to try it: https://free-ai-text-splitter.pages.dev

Hope this saves you some time.
