Every property valuation starts with the same question: what did similar properties sell for nearby? The data exists — government
land registries publish it — but it is scattered across different formats, languages, and access methods. UK Land Registry uses CSV
dumps. France publishes DVF files. Singapore has a government portal. New York City releases rolling Excel files. None of them
talk to each other.
I built a unified API that normalises all of it into a single interface. One endpoint, one schema, eleven markets.
What It Covers
| Market | Source | Transactions | Location Param |
| --- | --- | --- | --- |
| UK (England & Wales) | HM Land Registry | 31M+ | postcode |
| France | DVF | 8.3M+ | code_postal |
| Singapore | HDB Resale Data | 973K+ | postal_code |
| New York City | Dept of Finance | 51K+ | zip_code |
| Chicago | Cook County | 480K+ | zip_code |
| Dubai | DLD Transactions | 1.3M+ | area_name |
| Miami | Miami-Dade County | 190K+ | zip_code |
| Philadelphia | OPA Sales Data | 350K+ | zip_code |
| Connecticut | Town Clerk Records | 280K+ | town |
| Ireland | Property Price Register | 450K+ | county |
| Taiwan | Ministry of the Interior | 180K+ | city_en |
All data is government-sourced. No scraped listings. No estimates. Actual recorded sale prices.
Three Endpoints
- Comparable Sales
  `GET /v1/comps?market=nyc&zip_code=10001&months=12&limit=20`
  Returns recent sales near a location: address, sale price, date, property type, size, and price per square foot/metre. Filter by property type, bedroom count, and date range.
- Area Statistics
  `GET /v1/stats?market=uk&postcode=SW1A1DA`
  Median price, price per area unit, transaction volume, price trends, and property type breakdown: the numbers you need to understand a micro-market at a glance.
- Monthly Trends
  `GET /v1/trends?market=fr&code_postal=75001&months=24`
  Month-by-month median prices and volumes. Useful for spotting whether a market is heating up or cooling down.
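Because each market keys its location by a different query parameter (see the table above), a thin client helper can hide that mapping. A minimal sketch: the base URL and the `uk`/`fr`/`nyc` market codes appear in the post, but the remaining market codes and the helper itself are illustrative assumptions, not the API's official client.

```python
# Build /v1/comps URLs, mapping each market to its native location
# parameter. Market codes other than uk/fr/nyc are assumptions.
from urllib.parse import urlencode

BASE_URL = "https://api.nwc-advisory.com/v1"

LOCATION_PARAM = {
    "uk": "postcode", "fr": "code_postal", "sg": "postal_code",
    "nyc": "zip_code", "chicago": "zip_code", "dubai": "area_name",
    "miami": "zip_code", "philadelphia": "zip_code", "ct": "town",
    "ie": "county", "tw": "city_en",
}

def comps_url(market: str, location: str, months: int = 12, limit: int = 20) -> str:
    """Return the comps endpoint URL using the market's location param."""
    try:
        param = LOCATION_PARAM[market]
    except KeyError:
        raise ValueError(f"unknown market: {market!r}") from None
    query = urlencode({"market": market, param: location,
                       "months": months, "limit": limit})
    return f"{BASE_URL}/comps?{query}"
```

The same lookup works for `/v1/stats` and `/v1/trends`, since all three endpoints share the market and location parameters.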
How It Works
Each market has its own backend API (FastAPI + SQLite) running on a dedicated port. A unified gateway at api.nwc-advisory.com routes requests by the market parameter to the correct backend, normalises the response schema, and returns consistent JSON regardless of which country you are querying.
```
Client request
  → api.nwc-advisory.com (gateway, port 8080)
  → market backend (ports 8060-8070)
  → SQLite database (government data, indexed)
  ← normalised JSON response
  ← consistent schema
```
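The routing step reduces to a market-to-port lookup. In the real service this lookup sits inside the FastAPI gateway's proxy handler; the port range is from the post, but the specific market-to-port assignments below are hypothetical.

```python
# Gateway routing sketch: resolve the `market` query parameter to the
# backend that serves it. Ports 8060-8070 are from the post; which
# market gets which port is an assumption.
BACKEND_PORT = {
    "uk": 8060, "fr": 8061, "sg": 8062, "nyc": 8063, "chicago": 8064,
    "dubai": 8065, "miami": 8066, "philadelphia": 8067, "ct": 8068,
    "ie": 8069, "tw": 8070,
}

def backend_url(market: str, path: str, host: str = "127.0.0.1") -> str:
    """Return the internal URL of the backend serving this market."""
    port = BACKEND_PORT.get(market)
    if port is None:
        raise ValueError(f"unknown market: {market!r}")
    return f"http://{host}:{port}{path}"
```

The gateway then forwards the remaining query string to that URL and rewrites the backend's response into the shared schema before returning it.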
The SQLite databases are built from government bulk downloads. Each has compound indexes on location + property type + date for
fast radius searches. The densest queries (central London postcodes covering thousands of transactions) return in under 5 seconds.
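A sketch of what one of those per-market tables and its compound index might look like. Column names and the schema are illustrative; the point is that a single index on (location, property type, date) serves the common comps query with equality matches on the first two columns and a range scan on the third.

```python
# In-memory sketch of a per-market sales table with the compound
# index described above. Real schemas differ per market.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (
        address       TEXT,
        postcode      TEXT,
        property_type TEXT,
        sale_date     TEXT,    -- ISO 8601, so string comparison sorts correctly
        price         INTEGER
    );
    CREATE INDEX idx_loc_type_date
        ON sales (postcode, property_type, sale_date);
""")
con.execute(
    "INSERT INTO sales VALUES ('1 Example St', 'SW1A 1AA', 'F', '2024-06-01', 750000)"
)

# The comps query uses the index left-to-right: equality on postcode
# and property_type, then a range scan on sale_date.
rows = con.execute(
    """SELECT address, price FROM sales
       WHERE postcode = ? AND property_type = ? AND sale_date >= ?
       ORDER BY sale_date DESC LIMIT 20""",
    ("SW1A 1AA", "F", "2024-01-01"),
).fetchall()
```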
The Hardest Parts
Address normalisation across 11 countries. UK postcodes, French postal codes, Singapore 6-digit codes, US ZIP codes, Dubai area
names, Irish counties, Taiwanese districts in both English and Traditional Chinese. Each market has its own geocoding logic and
search radius behaviour.
Currency and unit consistency. Price per square foot in the US, price per square metre in Europe and Asia. GBP, EUR, USD, SGD, AED, NTD. The API returns values in each market's native currency and unit; no forced conversions that would confuse local users.
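That per-market convention can be captured in one lookup. The US square-foot versus Europe/Asia square-metre split follows the paragraph above; the Dubai and Taiwan entries, and the exact market codes, are assumptions for illustration.

```python
# Native currency and area unit per market, returned as-is with no
# conversion. Dubai's square-foot entry and Taiwan's square-metre
# entry are assumptions.
MARKET_UNITS = {
    "uk": ("GBP", "sqm"), "fr": ("EUR", "sqm"), "sg": ("SGD", "sqm"),
    "ie": ("EUR", "sqm"), "tw": ("NTD", "sqm"),
    "nyc": ("USD", "sqft"), "chicago": ("USD", "sqft"),
    "miami": ("USD", "sqft"), "philadelphia": ("USD", "sqft"),
    "ct": ("USD", "sqft"), "dubai": ("AED", "sqft"),
}

def price_per_area(market: str, price: float, area: float) -> dict:
    """Unit price in the market's own currency and area unit."""
    currency, unit = MARKET_UNITS[market]
    return {"currency": currency, "unit": unit,
            "value": round(price / area, 2)}
```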
Data freshness. Each market publishes on a different schedule. UK Land Registry is monthly with a 6-week delay. France DVF is semi-annual. Singapore HDB is monthly with minimal delay. The update pipeline checks each source, downloads new data, ingests it into SQLite, deploys to the server, and verifies the health endpoint — all scripted.
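The scheduled check can be as simple as comparing the last ingest date against each source's publication cadence, and only then kicking off the full download/ingest/deploy/verify cycle. The cadences below follow the post; the function and the day counts are an illustrative sketch, not the actual pipeline code.

```python
# Staleness check for the cron-scheduled pipeline: run the full update
# cycle only when a source may have published since the last ingest.
import datetime as dt

PUBLISH_CADENCE_DAYS = {
    "uk": 30,   # monthly (published data itself lags ~6 weeks)
    "fr": 182,  # DVF is semi-annual
    "sg": 30,   # monthly, minimal delay
}

def is_stale(market: str, last_ingested: dt.date, today: dt.date) -> bool:
    """True when the cadence suggests new data since the last ingest."""
    return (today - last_ingested).days >= PUBLISH_CADENCE_DAYS[market]
```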
Free Tier
The API is available on RapidAPI with a free tier:
- Free: 50 requests/month, 10 results per search, 3 markets (UK, NYC, Singapore)
- Pro ($29/mo): 1,000 requests/month, 100 results, all 11 markets
- Ultra ($99/mo): 10,000 requests/month, 500 results, all 11 markets
Interactive docs at api.nwc-advisory.com/docs (Swagger UI). You can test queries directly in the browser.
MCP Server (For AI Agents)
If you are building with Claude, GPT, or any LLM agent framework, there is an MCP server that wraps the API into three tools:
search_property_comps, get_area_stats, and list_markets. The repo is open source on GitHub: Tianning-lab/property-comps-mcp-server.
This means an AI agent can answer questions like "What did 3-bedroom flats sell for near SW1A in the last year?" by calling the tool directly; no prompt engineering around raw API calls needed.
What I Would Do Differently
Start with fewer markets. I launched UK and France first, then added nine more. Each new market took 2-3 days of data pipeline work plus frontend localisation (French, Traditional Chinese, Spanish for NYC). If I did it again, I would ship three markets and validate demand before building the rest.
Billing from day one. The free property apps get traffic but almost nobody signs up for paid tiers. Building the RapidAPI listing earlier and focusing on API-first distribution (developers, agents, fintech integrations) would have been a better monetisation path than consumer web apps.
Stack
- Backend: Python 3.13, FastAPI, SQLite (one DB per market)
- Gateway: FastAPI proxy with OpenAPI 3.1 spec
- Frontend: Static HTML/CSS/JS (11 localised versions)
- Server: Single Ubuntu box, nginx, Let's Encrypt
- Data pipeline: Python ingest scripts per market, cron-scheduled checks
No cloud services, no managed databases, no Kubernetes. One server handles all 11 markets comfortably.
The API docs are at api.nwc-advisory.com/docs. The MCP server is at github.com/Tianning-lab/property-comps-mcp-server. If you are building anything in proptech or real estate analytics, I would like to hear what data you wish existed.