The Problem
I wanted to move to Latin America, but cost-of-living sites like Numbeo rely on crowdsourced data that is sparse and outdated. So I built my own.
The Solution
NomadInflation.com scrapes real prices from local sources every week. No estimates, just actual data.
How I Built It (in 24 hours)
Full transparency: I'm a "vibe coder". I've been programming for 4 years, but I'm not brilliant, and AI did the heavy lifting.
Tech Stack:
- IDE: Windsurf + Claude Sonnet 4.5 integration
- Backend: PHP + SQLite
- Frontend: Vanilla JS + Leaflet.js
- Hosting: DigitalOcean VPS
- Data: Web scraping from local sources
Process:
- ChatGPT designed the architecture
- Claude wrote most of the code via Windsurf
- I reviewed, tested, and iterated
- Deployed in one day
The Hard Parts
Finding scrapable data sources:
- Most sites block bots
- Had to use a mix of government data and large retailers
- Currency conversion for 6+ currencies
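The currency work boils down to normalizing every scraped price into a base currency before storing it. A minimal sketch in JavaScript, assuming USD as the base; the rates and the `toUsd` helper below are illustrative placeholders, not the site's actual code, and a real version would pull live exchange rates:

```javascript
// Illustrative USD conversion rates (placeholders, not live data).
const USD_PER_UNIT = {
  USD: 1, EUR: 1.08, AUD: 0.65, CAD: 0.73, MXN: 0.055, PYG: 0.00013,
};

// Hypothetical helper: convert a scraped local price into USD.
function toUsd(amount, currency) {
  const rate = USD_PER_UNIT[currency];
  if (rate === undefined) throw new Error(`Unsupported currency: ${currency}`);
  // Round to cents so stored values avoid floating-point noise.
  return Math.round(amount * rate * 100) / 100;
}
```

Storing everything in one base currency means the other five display currencies are a single multiplication at render time.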
Making it actually work:
- Testing scraper reliability
- Handling rate limits
- Ensuring data accuracy
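Flaky sources and rate limits can both be handled by wrapping each request in a retry loop with a growing delay. This is a sketch of that pattern, not the production scraper; `politeFetch` and the injected `fetchFn` are assumptions made so the logic is testable without hitting the network:

```javascript
// Small promise-based sleep helper.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Hypothetical wrapper: retry a request up to `retries` times,
// waiting longer between each attempt (linear backoff).
async function politeFetch(url, fetchFn, { retries = 3, delayMs = 1000 } = {}) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await fetchFn(url);
    } catch (err) {
      if (attempt === retries) throw err; // out of retries: surface the error
      await sleep(delayMs * attempt);     // back off before trying again
    }
  }
}
```

Spacing requests out like this also keeps the scraper polite toward the government and retailer sites it depends on.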
Current Status
✅ Tracking Asunción, Paraguay ($966/month cost of living)
✅ Interactive map with 23+ LATAM cities
✅ Multi-currency support (USD, EUR, AUD, CAD, MXN, PYG)
✅ Weekly automated updates
✅ Email capture for alerts
What's Next
- Add São Paulo, Mexico City, Buenos Aires
- Price change notifications
- Historical data charts
- Validate if people actually want this
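Price-change notifications are mostly a diff between two weekly snapshots. A possible sketch, assuming prices per city are stored as item-to-price maps; `priceChanges` and the 5% threshold are hypothetical choices for illustration:

```javascript
// Hypothetical helper: compare two weekly snapshots ({ item: price })
// and return items whose price moved at least `thresholdPct` percent.
function priceChanges(prev, curr, thresholdPct = 5) {
  const changes = [];
  for (const [item, newPrice] of Object.entries(curr)) {
    const oldPrice = prev[item];
    if (oldPrice === undefined || oldPrice === 0) continue; // new or unpriced item
    const pct = ((newPrice - oldPrice) / oldPrice) * 100;
    if (Math.abs(pct) >= thresholdPct) {
      changes.push({ item, oldPrice, newPrice, pct: Math.round(pct * 10) / 10 });
    }
  }
  return changes;
}
```

Anything this returns after the weekly scrape could feed both the alert emails and, later, the historical charts.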
Lessons Learned
In 2025, execution > technical perfection.
You don't need to be a 10x engineer. You need:
- A real problem
- AI tools (Claude, ChatGPT)
- Willingness to ship fast
Took the Pieter Levels approach: Build in public, ship fast, iterate based on feedback.
Check it out: https://nomadinflation.com
Feedback welcome! What cities should I add? What features matter most?
Also posted on Hacker News today if you want to follow the discussion there.