DEV Community

Domonique Luchin

County Record Scraping: How CrawlOS Finds 125 Leads Per Night Across 14 Texas Counties

Every night while I sleep, a Playwright scraper hits 14 Texas county deed record portals and logs 125 qualified leads to my CRM.

Here is how CrawlOS works.

The Problem With Paid Data

Real estate lead services charge $200 to $500 per month for the same public data that county appraisal districts publish for free.

I built a scraper instead.

The Architecture

CrawlOS is a Python + Playwright system running on lb-telecom-01 (my Vultr VPS in Dallas).

It hits the public-facing portal for each county, extracts recent deed filings, filters by acquisition criteria, and inserts qualifying records into a Supabase table.
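That flow — fetch, extract, filter, insert — can be sketched as a small orchestration loop. This is an illustrative skeleton, not CrawlOS itself: the `CountyPortal` config, function names, and the idea of injecting `fetch_filings`/`insert_lead` as callables are all assumptions, since the post doesn't publish the real code.

```python
from dataclasses import dataclass

# Hypothetical per-county config; the real portal URLs and page
# selectors are not published in the post.
@dataclass
class CountyPortal:
    name: str
    search_url: str

def run_nightly(portals, fetch_filings, passes_filters, insert_lead):
    """One nightly run: for each county portal, fetch recent deed
    filings, keep the ones that pass the acquisition filters, and
    hand them to the CRM insert function. Returns the insert count."""
    inserted = 0
    for portal in portals:
        for filing in fetch_filings(portal):
            if passes_filters(filing):
                insert_lead(filing)
                inserted += 1
    return inserted
```

In the real system, `fetch_filings` would be the Playwright scrape of a county portal and `insert_lead` a Supabase write; keeping them as injected callables makes the loop testable without a browser or a database.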

The 14 Counties

Harris, Fort Bend, Montgomery, Brazoria, Galveston, Chambers, Liberty, Waller, Austin, Colorado, Wharton, Matagorda, Jackson, and Victoria.

All Gulf Coast and surrounding markets — the territory I know from six years of O&G structural work in the region.

The Filter Logic

Not every deed transfer is a lead. The scraper applies four filters:

  • Transfer type: warranty deed, not quitclaim
  • Price range: within acquisition parameters
  • Property type: residential or light commercial
  • Recency: filed within the last 30 days

Anything passing all four filters gets inserted. Everything else is discarded.
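The four filters above amount to a single predicate. Here's a minimal sketch — the field names and the price/property-type thresholds are placeholders, since the post deliberately doesn't publish the actual acquisition parameters.

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative thresholds only; the real acquisition parameters
# are not disclosed in the post.
MIN_PRICE, MAX_PRICE = 75_000, 450_000
ALLOWED_PROPERTY_TYPES = {"residential", "light_commercial"}

def passes_filters(filing: dict, today: Optional[date] = None) -> bool:
    """Return True only if a deed filing clears all four filters."""
    today = today or date.today()
    return (
        filing["transfer_type"] == "warranty_deed"            # not quitclaim
        and MIN_PRICE <= filing["amount"] <= MAX_PRICE        # price range
        and filing["property_type"] in ALLOWED_PROPERTY_TYPES  # property type
        and today - filing["filed_on"] <= timedelta(days=30)  # recency
    )
```

Because the conditions are ANDed, the first failing check short-circuits, so the cheap string comparisons run before the date math.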

The Output

125 leads per night on average. Each record includes parcel ID, grantor/grantee, transfer amount, address, and filing date.
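Those five fields map naturally onto one row per lead. A sketch of the record shape — the class, column names, and the `leads` table name are assumptions, not the actual Supabase schema:

```python
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class Lead:
    # Fields named in the post; column names are assumed.
    parcel_id: str
    grantor: str
    grantee: str
    amount: int
    address: str
    filed_on: date

    def to_row(self) -> dict:
        """Serialize for a JSON insert (dates as ISO strings)."""
        row = asdict(self)
        row["filed_on"] = self.filed_on.isoformat()
        return row

# With the supabase-py client, the insert would look roughly like:
#   supabase.table("leads").insert(lead.to_row()).execute()
```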

Load Bearing Capital works the list. Petroleum Noir runs a parallel filter on mineral rights transfers from the same pipeline.

Why This Matters

The data is public. It has always been public. The advantage is not access — it is automation.

Anyone can pull Harris County deed records manually. Almost nobody does it every night across 14 counties without lifting a finger.
