If you run a public-facing website today, you might have noticed some strange, massive spikes in your traffic. You get excited, thinking your app finally went viral—only to realize it's just AI bots hammering your servers.
I recently experienced this exact scenario with my project, Viconic.dev, a platform for finding and sharing icons. Here's the story of how AI crawlers almost took down my site—and how I regained control without losing AI search visibility.
🤖 The "Hug of Death" by AI Bots
Initially, I hosted Viconic on Vercel. It was fast, easy to deploy, and worked perfectly for my needs.
However, as the site grew, I started getting hit by massive waves of automated requests.
Web crawlers from AI companies like Perplexity, OpenAI (ChatGPT), and others were aggressively scraping my site for training data and real-time search results.
Within a short period, these bots completely exhausted my Vercel request limits.
👉 My site was choking.
👉 I was facing potential downtime.
👉 And unexpected overage costs were looming.
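Before reaching for blunt blocks, it helps to measure who is actually hitting you. Here's a minimal sketch in Python that tallies AI-crawler requests from raw access-log lines by matching the vendors' published user-agent tokens (the log format and the `count_ai_bot_hits` function name are my own assumptions, not something from Viconic's stack):

```python
from collections import Counter

# User-agent substrings of well-known AI crawlers: GPTBot and
# ChatGPT-User are OpenAI's, PerplexityBot is Perplexity's,
# CCBot is Common Crawl's. Extend the tuple as new bots appear.
AI_BOT_TOKENS = ("GPTBot", "ChatGPT-User", "PerplexityBot", "CCBot")

def count_ai_bot_hits(log_lines):
    """Count requests per AI crawler by user-agent substring match."""
    hits = Counter()
    for line in log_lines:
        for token in AI_BOT_TOKENS:
            if token in line:
                hits[token] += 1
                break  # count each request line at most once
    return hits
```

Running this over a day of logs tells you whether you're dealing with a trickle or a flood, and which bots deserve their own rate-limit rules.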
⚙️ Step 1: Moving for Control (Vercel → DigitalOcean)
To handle the load and get more predictable server control, I decided to migrate my infrastructure from Vercel to DigitalOcean.
Having my own server environment gave me:
- Better stability
- No serverless request limits
- Predictable pricing
- More control over traffic handling
I was no longer at the mercy of request quotas—and I finally had the raw power to handle traffic spikes.
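With your own server, "control over traffic handling" can be concrete. Here's a sketch of per-IP rate limiting for a droplet fronted by nginx (an assumption on my part; the post doesn't name the web server, and the upstream port is illustrative):

```nginx
# Shared zone keyed by client IP: 10 MB of state,
# sustained rate of 5 requests/second per IP.
limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

server {
    listen 80;
    server_name viconic.dev;

    location / {
        # Absorb short bursts; reject the rest with 429, not 503.
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;
        proxy_pass http://127.0.0.1:3000;  # assumed app port
    }
}
```

Even a crude cap like this means one misbehaving crawler can no longer monopolize the box.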
🛡️ Step 2: Taming the Bots with Cloudflare
Moving servers solved the capacity issue—but letting bots consume all my bandwidth was still a bad idea.
At the same time, I didn't want to block them completely.
AI search engines are the future of discovery, and I still wanted Viconic to be referenced by tools like ChatGPT and Perplexity.
So I brought in Cloudflare.
My "Smart Defense" Strategy:
🚫 Throttling aggressive behavior
- Blocked or rate-limited known AI bots
- Prevented crawling on resource-heavy endpoints
✅ Allowing targeted access
- Whitelisted bots for:
  - Icon portfolio pages
  - Search endpoints
This created a perfect balance:
- Bots can still index my content
- But they can't overload my infrastructure
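The robots.txt side of that balance might look like the sketch below. Well-behaved AI crawlers honor it, while Cloudflare's bot rules enforce it against the rest. The paths here are illustrative, not Viconic's actual routes:

```txt
# Let AI crawlers index the pages we want discovered,
# keep them off resource-heavy endpoints.
User-agent: GPTBot
User-agent: PerplexityBot
Allow: /icons/
Allow: /search
Disallow: /

# Everyone else: normal crawling.
User-agent: *
Allow: /
```

Under the current robots standard (RFC 9309), the most specific matching rule wins, so `Allow: /icons/` takes precedence over `Disallow: /` for those bots.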
💡 The Takeaway
AI is changing how traffic works on the web.
As developers:
- We can’t block all bots
- But we also can’t let them drain our resources
A solid setup like:
DigitalOcean (VPS) + Cloudflare (Smart Routing & Bot Control)
…is a powerful way to survive in the AI scraping era.
💬 Final Thoughts
Have you experienced AI bots crashing your projects?
How are you handling this new wave of AI scrapers?
Let me know in the comments 👇
🔗 Check out Viconic
If you're looking for high-quality icons for your next project, feel free to explore Viconic.dev.
Cheers 🚀