darjus
Power of Agent assisted coding and learning to achieve goals faster and cheaper

There are a lot of AI skeptics in tech today saying that AI is opening Pandora’s box of security, performance, and maintainability issues because engineers rely on agent-assisted (a.k.a. vibe) coding.

This partially makes sense, but the blame is often misplaced. Good engineers know how to benefit from new technologies and how to adapt to a changing technology market.

In this post, I will show you how I used AI to design a solution that helped me:

  • Achieve my goal in a very short time
  • Work with limited resources
  • Move forward despite limited domain knowledge

Goal and Background

I already knew about open data from OpenStreetMap.

I also knew about the tool https://osm2pgsql.org, which can import OSM data into PostgreSQL with PostGIS.

Even before LLMs were popular, I tried to run such imports on a cheap VPS, limiting the scope to a single country or continent.

I always ended up with out-of-memory errors, even when importing a single large country. The VPS had around 16 GB RAM plus NVMe swap.

After researching, I discovered that importing the whole world usually requires 64–128 GB RAM, which is extremely expensive on a VPS.

But my goal was never to store everything.

I only needed:

  • Hotel/accommodation data
  • Nearby POIs

so that I could build tools to help me choose vacation destinations.


Original Architecture Idea (That Demotivated Me)

My initial idea was:

  1. One server with PostGIS + osm2pgsql
  2. Import one country
  3. A second server with a final relational database
  4. A service that extracts only hotels + POIs into the final DB

However, this required:

  • Lua scripting
  • Advanced SQL
  • Deep PostGIS knowledge

None of this is impossible, but the amount of work felt huge compared to my simple goal.
So I abandoned the idea.


Enter AI

Everything changed when I started using AI for coding and learning. I revived my idea with AI-assisted development. I started with the cheapest Claude Code subscription, but quickly upgraded once I saw how fast I was progressing.


Step 1: Infrastructure

I rented a VPS with:

  • 24 GB RAM
  • 8 vCPU cores
  • 200 GB NVMe disk

Cost: < 20 EUR/month

In hindsight, even 16 GB would probably have been enough.

I installed:

  • Docker
  • Dokploy

Step 2: osm2pgsql + Lua Schema

Claude Code helped me generate:

  • A Docker Compose setup for osm2pgsql
  • Lua scripts that define only the tables I care about

Example table definition:

```lua
-- Accommodations (hotels, hostels, guest houses, etc.)
local accommodations = osm2pgsql.define_table({
    name = 'accommodations',
    ids = { type = 'any', id_column = 'osm_id', type_column = 'osm_type' },
    columns = {
        { column = 'name', type = 'text' },
        { column = 'tourism', type = 'text' },
        { column = 'stars', type = 'int4' },
        { column = 'rooms', type = 'int4' },
        { column = 'website', type = 'text' },
        { column = 'phone', type = 'text' },
        { column = 'email', type = 'text' },
        { column = 'wheelchair', type = 'text' },
        { column = 'internet_access', type = 'text' },
        { column = 'swimming_pool', type = 'text' },
        { column = 'sauna', type = 'text' },
        { column = 'country', type = 'text' },
        { column = 'tags', type = 'jsonb' },
        { column = 'geom', type = 'geometry', projection = 4326 }
    }
})
```

This drastically simplified the PostGIS schema.


Why I Use --create Mode

osm2pgsql is always started with --create mode.

That means:

  • Each run wipes the PostGIS database
  • Imports exactly one country
  • The container stops after finishing

Why?
Because PostGIS here is just a temporary staging database.
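Translated into code, the staging import for one country could be sketched like this (the Lua file name, paths, and database name are my assumptions, not the author's actual setup):

```typescript
// Build the osm2pgsql invocation for a single country.
// All paths and the database name are hypothetical.
function buildImportCommand(country: string): string[] {
  return [
    "osm2pgsql",
    "--create",                            // wipe and re-create the staging tables on every run
    "--output=flex",                       // flex output: the Lua script defines the tables
    "--style=/scripts/accommodations.lua", // the table definitions shown above
    "--database=staging",
    `/data/${country}-latest.osm.pbf`,     // country extract, e.g. from Geofabrik
  ];
}
```

Because `--create` drops and re-creates the tables, the staging database never grows beyond a single country's worth of data.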


Step 3: Second Server (Final Relational DB)

On another VPS I run:

  • PostgreSQL
  • A simple schema with tables like hotels
  • Fields like distance_to_beach, beach_type, etc.

This database is optimized for my application queries.
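For illustration, a row in that hotels table might map to a shape like the following (only distance_to_beach and beach_type come from the setup above; every other field and all types are assumptions):

```typescript
// Hypothetical row shape for the final hotels table.
interface Hotel {
  osmId: number;
  name: string;
  country: string;
  distanceToBeach: number | null; // metres; null when no beach is nearby
  beachType: string | null;       // e.g. "sand", "pebbles"
  stars: number | null;
}

const example: Hotel = {
  osmId: 123456,
  name: "Seaside Guest House",
  country: "lt",
  distanceToBeach: 250,
  beachType: "sand",
  stars: 3,
};
```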


Step 4: Webhook-Based Automation

Flow:

  1. osm2pgsql finishes importing a country
  2. Docker Compose runs a small curl command
  3. A webhook is sent to the second server with the country code

On the second server:

  • A Bun.js controller receives the webhook
  • It triggers a CLI script on a worker
  • The script queries PostGIS tables
  • Inserts normalized hotel data into the final PostgreSQL database

The entire script was generated by AI and works great.
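A minimal sketch of the receiving side, assuming the webhook is a JSON POST carrying a country code (the payload shape and script names are my assumptions, not the author's actual code):

```typescript
// Parse and validate the webhook payload sent after an import finishes.
type WebhookPayload = { country: string };

function parseWebhook(body: string): WebhookPayload {
  const data = JSON.parse(body);
  if (typeof data.country !== "string" || data.country.length === 0) {
    throw new Error("webhook payload must contain a non-empty country code");
  }
  return { country: data.country };
}

// Command the controller would hand to the worker CLI.
function workerCommand(payload: WebhookPayload): string[] {
  return ["bun", "run", "import-country.ts", "--country", payload.country];
}
```

In Bun, the controller itself can be a few lines of `Bun.serve()` wrapped around helpers like these.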


Step 5: Automatic Loop

After the CLI import finishes:

  • I call the Dokploy REST API
  • Start the osm2pgsql container again

Claude Code built the Compose file in an idempotent way:

  • It tracks already imported countries
  • Automatically picks the next one

Result:

A fully automatic country-by-country pipeline.
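The idempotent part of that loop boils down to picking the next not-yet-imported country. A sketch, assuming the country list and imported-state tracking look something like this:

```typescript
// Ordered list of country extracts to import (hypothetical).
const COUNTRIES = ["lithuania", "latvia", "estonia", "poland"];

// Return the next country that has not been imported yet, or null when done.
function nextCountry(imported: Set<string>): string | null {
  for (const c of COUNTRIES) {
    if (!imported.has(c)) return c;
  }
  return null; // all countries imported; the pipeline stops restarting itself
}
```

After each webhook, the controller would call something like this and, when it gets a country back, hit the Dokploy REST API to start the osm2pgsql container again.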


Final Result

I now have:

  • No monster servers
  • No 128 GB RAM
  • Fully automatic imports
  • A clean relational hotel database

Time to build: ~2 days

Monthly cost: ~30 EUR (2 VPS + Claude Code subscription)


Closing Thoughts

AI did not replace engineering; it gives you superpowers, and you have to learn how to control them. And don't forget that you still have to:

  • Design the architecture
  • Validate assumptions
  • Debug issues
  • And, most important of all, have interesting ideas
