<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Domonique Luchin</title>
    <description>The latest articles on DEV Community by Domonique Luchin (@domoniqueluchin).</description>
    <link>https://dev.to/domoniqueluchin</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3816446%2F43258dea-c4c5-4bc1-95b1-9e05955201f1.png</url>
      <title>DEV Community: Domonique Luchin</title>
      <link>https://dev.to/domoniqueluchin</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/domoniqueluchin"/>
    <language>en</language>
    <item>
      <title>I Scraped 14 Texas Counties Every Night to Find Distressed Properties</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:52:10 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/i-scraped-14-texas-counties-every-night-to-find-distressed-properties-44fd</link>
      <guid>https://dev.to/domoniqueluchin/i-scraped-14-texas-counties-every-night-to-find-distressed-properties-44fd</guid>
      <description>&lt;p&gt;Real estate wholesaling is a data game. You need to find distressed properties before anyone else does. Foreclosure filings, delinquent tax records, code violations, 311 complaints. These are all public records. Most wholesalers check them manually. I automated it.&lt;/p&gt;

&lt;h2&gt;
  
  
  The System: Crawl OS
&lt;/h2&gt;

&lt;p&gt;Crawl OS is a set of 5 Supabase edge functions that scrape 14 Texas counties every night. Each function handles a different data source:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Foreclosure filings&lt;/strong&gt; from county clerk websites&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Delinquent tax records&lt;/strong&gt; from county tax assessor portals&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Code violations&lt;/strong&gt; from city code enforcement databases&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;311 complaints&lt;/strong&gt; from municipal service request systems&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Normalization&lt;/strong&gt; function that standardizes addresses and deduplicates across sources&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  The Pipeline
&lt;/h2&gt;

&lt;p&gt;Each night at 2 AM Central, pg_cron triggers the scraping functions. They hit the county websites, parse the HTML or API responses, extract the relevant records, and insert them into staging tables.&lt;/p&gt;

&lt;p&gt;The normalization function runs at 3 AM. It standardizes addresses (matching "123 Main St" with "123 Main Street"), geocodes new properties, and merges records from different sources into a single lead record.&lt;/p&gt;
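
&lt;p&gt;A minimal Python sketch of that suffix matching, assuming a hand-built abbreviation table (the real function is not shown in this post):&lt;/p&gt;

```python
# Hypothetical sketch of the address standardization step: lowercase,
# strip punctuation, and expand common suffix abbreviations so that
# "123 Main St" and "123 Main Street" collapse to the same key.
SUFFIXES = {
    "st": "street", "ave": "avenue", "dr": "drive",
    "rd": "road", "ln": "lane", "blvd": "boulevard",
}

def normalize_address(raw):
    parts = raw.lower().replace(".", "").replace(",", " ").split()
    return " ".join(SUFFIXES.get(p, p) for p in parts)
```

&lt;p&gt;Merging on the normalized string before geocoding keeps one property from showing up as two leads.&lt;/p&gt;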

&lt;p&gt;At 4 AM, the scoring function runs. Each lead gets a score based on:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Number of distress signals (foreclosure + tax delinquent = higher score)&lt;/li&gt;
&lt;li&gt;Property value (from county appraisal data)&lt;/li&gt;
&lt;li&gt;Days since first distress signal&lt;/li&gt;
&lt;li&gt;Neighborhood trend data&lt;/li&gt;
&lt;/ul&gt;
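
&lt;p&gt;A hypothetical version of that scoring logic; the weights and caps here are illustrative assumptions, not the production formula:&lt;/p&gt;

```python
def score_lead(signal_count, property_value, days_since_first, trend_factor):
    # Stacked distress signals dominate: foreclosure plus tax delinquency
    # outranks either signal alone.
    score = signal_count * 30
    # Property value contributes, capped so high-value outliers
    # do not swamp the score.
    score += min(property_value / 10_000, 25)
    # Fresher signals score higher than stale ones.
    score += max(25 - days_since_first, 0)
    # Neighborhood trend acts as a final multiplier.
    return round(score * trend_factor)
```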

&lt;p&gt;By 5 AM, the dashboard shows fresh scored leads. The AI outreach agent starts making calls at 9 AM.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Numbers
&lt;/h2&gt;

&lt;p&gt;14 counties. Roughly 125 scored leads per nightly run. The leads feed into Load Bearing Capital's wholesale pipeline at loadbearingcapitaltx.com.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Learned
&lt;/h2&gt;

&lt;p&gt;County websites are terrible. Different formats, different update schedules, different levels of data quality. Some counties have APIs. Most have HTML tables from 2004. You write a scraper for each one and you handle failures gracefully because at least 2 of the 14 will be down on any given night.&lt;/p&gt;

&lt;p&gt;Isolation matters. Each county gets its own pg_cron job. If Harris County's website is down, it does not block the Galveston County scrape. Failures are logged and retried the next night.&lt;/p&gt;
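
&lt;p&gt;The isolation pattern is simple enough to sketch. This assumes each scraper is a callable; the real jobs are separate pg_cron entries, but the failure handling looks the same:&lt;/p&gt;

```python
def run_scrapers(scrapers):
    # Run each county scraper independently so one broken county
    # website never blocks the others.
    results = {}
    for county, scrape in scrapers.items():
        try:
            results[county] = {"status": "ok", "rows": scrape()}
        except Exception as exc:
            # Log the failure and move on; tomorrow night's run retries it.
            results[county] = {"status": "failed", "error": str(exc)}
    return results
```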

&lt;h2&gt;
  
  
  The Legal Part
&lt;/h2&gt;

&lt;p&gt;All of this data is public record. County governments publish it specifically so citizens can access it. Scraping public records is legal. Using that data to contact property owners with legitimate purchase offers is legal. Just do not pretend to be a government agency and you are fine.&lt;/p&gt;

&lt;h2&gt;
  
  
  Build It Yourself
&lt;/h2&gt;

&lt;p&gt;If you are in real estate and you are still checking county websites manually, you are giving your competition a 12-hour head start every single day. Automate it.&lt;/p&gt;

</description>
      <category>python</category>
      <category>webscraping</category>
      <category>realestate</category>
      <category>automation</category>
    </item>
    <item>
      <title>Why I Build Everything From My Phone (And You Probably Can Too)</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:51:34 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/why-i-build-everything-from-my-phone-and-you-probably-can-too-42h</link>
      <guid>https://dev.to/domoniqueluchin/why-i-build-everything-from-my-phone-and-you-probably-can-too-42h</guid>
      <description>&lt;p&gt;I do not have a home office. I do not have a standing desk with three monitors. I build from my phone. Late at night. In parking lots. On lunch breaks. Wherever I am.&lt;/p&gt;

&lt;p&gt;This is not a flex. It is a constraint that shaped how I architect everything.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Workflow
&lt;/h2&gt;

&lt;p&gt;My phone talks to Claude. Claude talks to my infrastructure. The infrastructure does the work.&lt;/p&gt;

&lt;p&gt;I describe what I need in plain English. Claude writes the SQL, the edge function, the deployment script. I review it, approve it, and it ships. No IDE. No terminal. No laptop required.&lt;/p&gt;

&lt;h2&gt;
  
  
  What This Forces You to Do
&lt;/h2&gt;

&lt;p&gt;When your only interface is a chat window on a 6-inch screen, you make different architectural decisions:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Everything goes in the database.&lt;/strong&gt; Configuration, state, schedules, content. If it is in the database, I can query it and modify it from anywhere. If it is in a config file on a server, I need SSH access.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Edge functions over microservices.&lt;/strong&gt; I do not want to manage containers from my phone. Supabase edge functions deploy through the dashboard or the CLI. One file, one function, done.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;pg_cron over external schedulers.&lt;/strong&gt; GitHub Actions and AWS Lambda are great but they require navigating complex UIs. pg_cron is one SQL statement.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AI agents over manual processes.&lt;/strong&gt; I cannot be on the phone with leads while I am at my day job. AI agents handle inbound calls, qualify leads, and book appointments. I check the results when I have time.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Tools
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Claude (mobile app) for all development and planning&lt;/li&gt;
&lt;li&gt;Supabase dashboard (mobile browser) for database management&lt;/li&gt;
&lt;li&gt;GitHub mobile app for quick code reviews&lt;/li&gt;
&lt;li&gt;VoIP.ms portal for phone system management&lt;/li&gt;
&lt;li&gt;dev.to app for checking published articles&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That is the entire stack. Five apps on my phone.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Point
&lt;/h2&gt;

&lt;p&gt;You do not need a $3,000 setup to build real things. You need a system that works from wherever you are. If your architecture requires you to be at a specific desk to make changes, it is too fragile.&lt;/p&gt;

&lt;p&gt;Build for the phone. Build for the parking lot. Build for 11 PM on a Tuesday when you have 20 minutes before you crash.&lt;/p&gt;

&lt;p&gt;That is when the real work happens.&lt;/p&gt;

</description>
      <category>mobile</category>
      <category>productivity</category>
      <category>startup</category>
      <category>devops</category>
    </item>
    <item>
      <title>Row Level Security Is Not Optional: How I Locked Down a Multi-Tenant Supabase App</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:50:58 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/row-level-security-is-not-optional-how-i-locked-down-a-multi-tenant-supabase-app-3hgn</link>
      <guid>https://dev.to/domoniqueluchin/row-level-security-is-not-optional-how-i-locked-down-a-multi-tenant-supabase-app-3hgn</guid>
      <description>&lt;p&gt;If you are using Supabase and you have not enabled Row Level Security on every table, stop what you are doing and fix that right now. I am serious.&lt;/p&gt;

&lt;p&gt;Supabase exposes your Postgres database through a REST API using the anon key. That key is public. It is in your frontend code. Anyone can see it. The only thing standing between your data and the internet is RLS.&lt;/p&gt;

&lt;h2&gt;
  
  
  What RLS Does
&lt;/h2&gt;

&lt;p&gt;Row Level Security is a Postgres feature that adds WHERE clauses to every query automatically. You define policies that say "this user can only see rows where user_id matches their auth ID." Postgres enforces it at the database level. No middleware. No application logic. Just policy.&lt;/p&gt;

&lt;h2&gt;
  
  
  My Setup
&lt;/h2&gt;

&lt;p&gt;I run a multi-tenant system. 6 businesses, shared database, different contexts. Here is how I structured it:&lt;/p&gt;

&lt;h3&gt;
  
  
  Brand Isolation
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="n"&gt;POLICY&lt;/span&gt; &lt;span class="nv"&gt;"Users see own brand data"&lt;/span&gt;
&lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;leads&lt;/span&gt; &lt;span class="k"&gt;FOR&lt;/span&gt; &lt;span class="k"&gt;SELECT&lt;/span&gt;
&lt;span class="k"&gt;USING&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;brand_id&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;brand_id&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;user_brands&lt;/span&gt;
  &lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;user_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;auth&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;uid&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;));&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Every table that holds business data has a &lt;code&gt;brand_id&lt;/code&gt; column. Users only see data for brands they are assigned to.&lt;/p&gt;

&lt;h3&gt;
  
  
  Service Role Separation
&lt;/h3&gt;

&lt;p&gt;Edge functions that need cross-brand access use the service role key. That key bypasses RLS. It never touches the frontend. Every view that edge functions expose uses &lt;code&gt;SECURITY INVOKER&lt;/code&gt; so the calling user's permissions apply.&lt;/p&gt;

&lt;h3&gt;
  
  
  Foreign Key Indexes
&lt;/h3&gt;

&lt;p&gt;This is the one people miss. If you have RLS policies that join against other tables, those joins need indexes. Without them, your queries slow to a crawl as your data grows. I added foreign key indexes on every table during a security hardening pass.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Audit
&lt;/h2&gt;

&lt;p&gt;I ran a full security audit across both of my Supabase projects:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Listed every table without RLS enabled&lt;/li&gt;
&lt;li&gt;Added policies to each one&lt;/li&gt;
&lt;li&gt;Changed all views to SECURITY INVOKER&lt;/li&gt;
&lt;li&gt;Added missing foreign key indexes&lt;/li&gt;
&lt;li&gt;Tested with the anon key to verify no data leaks&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Total time: about 4 hours. That is 4 hours to prevent a data breach that could end your business.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Rule
&lt;/h2&gt;

&lt;p&gt;If a table has data that belongs to a user or a brand or an organization, it gets RLS. No exceptions. No "I will add it later." The moment you create the table, you add the policy.&lt;/p&gt;

&lt;p&gt;Supabase makes this easy. Do not skip it.&lt;/p&gt;

</description>
      <category>supabase</category>
      <category>security</category>
      <category>postgres</category>
      <category>webdev</category>
    </item>
    <item>
      <title>How I Use Claude API to Auto-Draft Technical Articles From Build Logs</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:50:22 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/how-i-use-claude-api-to-auto-draft-technical-articles-from-build-logs-48e9</link>
      <guid>https://dev.to/domoniqueluchin/how-i-use-claude-api-to-auto-draft-technical-articles-from-build-logs-48e9</guid>
      <description>&lt;p&gt;Every time I build something, I talk to Claude. That conversation becomes a build log. That build log becomes an article. The whole thing is automated.&lt;/p&gt;

&lt;p&gt;Here is how the pipeline works.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: The Build Session
&lt;/h2&gt;

&lt;p&gt;I build everything through Claude conversations. Setting up a VoIP system? That is a conversation. Deploying an edge function? Conversation. Debugging a PJSIP trunk? Long conversation.&lt;/p&gt;

&lt;p&gt;Each conversation is a build log. It has the problem, the solution, the code, and the context.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 2: Auto-Draft
&lt;/h2&gt;

&lt;p&gt;A Supabase edge function called &lt;code&gt;auto-draft-content&lt;/code&gt; takes a session summary and sends it to Claude Haiku with a system prompt:&lt;/p&gt;

&lt;p&gt;"You are a technical writer for dev.to. Convert this build session into a practical, first-person technical article. Use short sentences. No fluff. No em dashes. Include code snippets where relevant. Target 800 to 1200 words."&lt;/p&gt;

&lt;p&gt;Claude Haiku is fast and cheap. At $0.25 per million input tokens, drafting an article costs less than a penny. The output goes into the &lt;code&gt;drafted_content&lt;/code&gt; column of the &lt;code&gt;build_content_queue&lt;/code&gt; table.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Publish
&lt;/h2&gt;

&lt;p&gt;Another edge function called &lt;code&gt;publish-build-content&lt;/code&gt; takes articles with status &lt;code&gt;ready&lt;/code&gt;, parses the frontmatter (title, tags), checks for duplicates against my dev.to profile, and publishes via the dev.to API. Rate limiting is built in with 35-second delays between posts.&lt;/p&gt;
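
&lt;p&gt;The frontmatter step can be sketched in plain Python. This is an illustrative parser, not the actual edge function, and the field names are assumptions:&lt;/p&gt;

```python
# Hypothetical sketch: pull title and tags out of a dev.to-style
# markdown draft with a frontmatter block delimited by "---" lines.
def parse_frontmatter(draft):
    lines = draft.split("\n")
    if lines[0] != "---":
        raise ValueError("draft missing frontmatter")
    meta = {}
    for i, line in enumerate(lines[1:], start=1):
        if line == "---":
            # Everything after the closing delimiter is the article body.
            return meta, "\n".join(lines[i + 1:]).strip()
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    raise ValueError("unterminated frontmatter")
```

&lt;p&gt;The returned metadata feeds the publish payload; the duplicate check compares titles against articles already on the profile.&lt;/p&gt;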

&lt;h2&gt;
  
  
  Step 4: Schedule
&lt;/h2&gt;

&lt;p&gt;A pg_cron job fires at 9 AM Central every day and publishes up to 3 articles. Consistent cadence without manual intervention.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Results
&lt;/h2&gt;

&lt;p&gt;32 articles published in under 2 weeks. All from real build sessions. Not AI slop. Real problems, real solutions, real code. The AI just handles the formatting and publishing.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Cost
&lt;/h2&gt;

&lt;p&gt;Claude Haiku for drafting: roughly $0.005 per article&lt;br&gt;
dev.to API: free&lt;br&gt;
Supabase edge functions: free tier&lt;br&gt;
pg_cron: included with Supabase&lt;/p&gt;

&lt;p&gt;Total monthly cost for an automated content pipeline: less than $1.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why This Matters for Your Brand
&lt;/h2&gt;

&lt;p&gt;Every developer has build sessions. Most of them disappear into Slack threads and terminal history. If you capture them and publish them, you build a public record of your work. That record becomes your resume, your portfolio, and your lead generation engine.&lt;/p&gt;

&lt;p&gt;I got asked about my content during a job interview. That pipeline is not just content. It is career infrastructure.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>claude</category>
      <category>automation</category>
      <category>devto</category>
    </item>
    <item>
      <title>How I Built an AI Peer Review System for Structural Engineering Calculations</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:49:40 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/how-i-built-an-ai-peer-review-system-for-structural-engineering-calculations-5h6a</link>
      <guid>https://dev.to/domoniqueluchin/how-i-built-an-ai-peer-review-system-for-structural-engineering-calculations-5h6a</guid>
      <description>&lt;p&gt;I am a structural engineer. I design steel connections, concrete foundations, and equipment supports for oil and gas facilities. Every calculation package I produce has to be checked by another engineer before it goes to fabrication.&lt;/p&gt;

&lt;p&gt;That peer review process takes days. Sometimes weeks. The reviewer has to verify every load combination, every member check, every connection detail. It is tedious, critical, and expensive.&lt;/p&gt;

&lt;p&gt;So I built an AI system that does the first pass.&lt;/p&gt;

&lt;h2&gt;
  
  
  What StructCalc AI Does
&lt;/h2&gt;

&lt;p&gt;StructCalc AI is a web app built with React, Supabase, and FastAPI on Railway. You input your structural members, loads, and geometry. It runs checks against AISC 360-22, ACI 318-19, and ASCE 7-22 load combinations. Then it generates a PDF calculation package with an AI peer review section.&lt;/p&gt;

&lt;p&gt;The AI peer review does not replace a licensed engineer. It catches the obvious stuff. Did you forget a load combination? Is your demand-to-capacity ratio above 0.95? Did you specify A36 steel when the spec calls for A992? These are the mistakes that waste a reviewer's time.&lt;/p&gt;
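
&lt;p&gt;Conceptually, the first pass is a rules check that runs before the AI review. A hypothetical sketch, with field names and thresholds taken from the examples above:&lt;/p&gt;

```python
# Illustrative first-pass checks, not the production rule set.
def first_pass_review(member):
    flags = []
    if member["dcr"] > 0.95:
        flags.append("demand-to-capacity ratio above 0.95")
    if member["material"] != member["spec_material"]:
        flags.append("material grade does not match the project spec")
    # Any required load combination that was never checked gets flagged.
    missing = set(member["required_combos"]) - set(member["checked_combos"])
    for combo in sorted(missing):
        flags.append("load combination not checked: " + combo)
    return flags
```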

&lt;h2&gt;
  
  
  The Tech Stack
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Frontend&lt;/strong&gt; — React with Vite. Clean input forms for member properties, loads, and boundary conditions. No drag-and-drop modeling. Just numbers and dropdowns.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Backend&lt;/strong&gt; — FastAPI on Railway. Handles the actual calculations. AISC steel member checks use the interaction equations from Chapter H. ACI concrete checks follow the strength reduction factors from Chapter 21.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Database&lt;/strong&gt; — Supabase. Stores projects, calculation history, and user preferences. Row Level Security keeps each engineer's work private.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AI Layer&lt;/strong&gt; — Claude API. Takes the calculation output and reviews it against code requirements. Flags anything that looks wrong or marginal.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PDF Export&lt;/strong&gt; — ReportLab. Generates a professional calculation package with cover sheet, assumptions, calculations, and the AI review summary.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why This Matters
&lt;/h2&gt;

&lt;p&gt;A junior engineer spends 4 to 6 hours on a typical steel connection package. The peer review takes another 2 to 3 hours. StructCalc AI cuts the review portion down to minutes.&lt;/p&gt;

&lt;p&gt;That does not mean the reviewer stops checking. It means they spend their time on the hard stuff instead of verifying that you used the right Cb value for lateral-torsional buckling.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Business Case
&lt;/h2&gt;

&lt;p&gt;I showed this tool during my interview at ESI Inc. They were interested enough to ask follow-up questions about it in the third round. Structural firms spend significant money on QA/QC. A tool that reduces review cycles is worth real money.&lt;/p&gt;

&lt;p&gt;The app is live at structcalc-ai.vercel.app. Still early, but the core engine works.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>engineering</category>
      <category>python</category>
      <category>react</category>
    </item>
    <item>
      <title>From Section 8 to Software: The Builder Mindset That Got Me Here</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:49:04 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/from-section-8-to-software-the-builder-mindset-that-got-me-here-4mep</link>
      <guid>https://dev.to/domoniqueluchin/from-section-8-to-software-the-builder-mindset-that-got-me-here-4mep</guid>
      <description>&lt;p&gt;I grew up in Section 8 housing. Government assistance. Free lunch program. The kind of childhood where you learn early that nobody is coming to save you.&lt;/p&gt;

&lt;p&gt;That lesson turned out to be the most valuable thing I ever learned.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Path
&lt;/h2&gt;

&lt;p&gt;I studied civil engineering at Prairie View A&amp;amp;M. Graduated in 2020 with a BS in Civil Engineering. Went into oil and gas infrastructure. Spent 6 years designing pipe supports, equipment foundations, and structural steel for facilities on the Gulf Coast and in the Permian Basin.&lt;/p&gt;

&lt;p&gt;Good money. Stable work. But I was building someone else's thing.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Shift
&lt;/h2&gt;

&lt;p&gt;In February 2026, I started building my own. Not one business. Six. Real estate wholesaling, demolition, structural steel detailing, valet trash for apartments, credit repair, and mineral rights acquisition. All under one holding company called Load Bearing Empire.&lt;/p&gt;

&lt;p&gt;People said I was crazy. Six businesses at once? While working full time? From your phone?&lt;/p&gt;

&lt;p&gt;They were not wrong about the workload. They were wrong about the approach.&lt;/p&gt;

&lt;h2&gt;
  
  
  The System
&lt;/h2&gt;

&lt;p&gt;I did not try to manually run 6 businesses. I built infrastructure. One Supabase database with 142 tables. 48 edge functions. 38 scheduled jobs. AI agents handling phone calls, lead scoring, content publishing, and email outreach.&lt;/p&gt;

&lt;p&gt;The businesses run on the system. I build the system. That is the difference between working IN your business and working ON your business.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Mindset
&lt;/h2&gt;

&lt;p&gt;When you grow up with nothing, you learn to build with whatever you have. I did not have a dev team. I had Claude. I did not have a marketing budget. I had a content pipeline. I did not have office space. I had a phone and a VPS.&lt;/p&gt;

&lt;p&gt;The constraint is the advantage. When you cannot buy your way out of a problem, you build your way out. And what you build, you own.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where I Am Now
&lt;/h2&gt;

&lt;p&gt;32 articles published on dev.to. A structural calculation tool live in production. An 8-agent AI system managing operations. A new role as Lead Structural Engineer starting April 20th. And I am writing this on Easter Sunday because the build does not stop for holidays.&lt;/p&gt;

&lt;p&gt;If you are reading this from a place that looks nothing like where you want to be, keep building. The gap between where you are and where you want to be is just infrastructure you have not built yet.&lt;/p&gt;

</description>
      <category>career</category>
      <category>motivation</category>
      <category>startup</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Building a VoIP Phone System for 6 Businesses on One Server</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:48:29 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/building-a-voip-phone-system-for-6-businesses-on-one-server-348j</link>
      <guid>https://dev.to/domoniqueluchin/building-a-voip-phone-system-for-6-businesses-on-one-server-348j</guid>
      <description>&lt;p&gt;I run 6 businesses. Each one needs its own phone number, its own voicemail, its own AI receptionist. If I used a hosted VoIP provider for all of them, I would be paying $150 to $300 a month just for phone lines and auto-attendants.&lt;/p&gt;

&lt;p&gt;Instead I built the whole thing on one VPS with Asterisk, VoIP.ms, and AI agents.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;VoIP.ms&lt;/strong&gt; — SIP trunk provider. I have 6 DIDs (phone numbers) in the 832 area code, all on flat rate billing through the Dallas POP. Total cost: about $6/month for the DIDs plus usage.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Asterisk&lt;/strong&gt; — Open source PBX running on my VPS (lb-telecom-01). PJSIP trunk registered to VoIP.ms. Each DID routes to a different dialplan context based on the business.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AI Agents&lt;/strong&gt; — Each business has its own AI receptionist. When a call comes in, Asterisk sends the audio through a pipeline: Whisper for speech-to-text, Claude for intent classification and response generation, OpenAI TTS for text-to-speech. The caller talks to an AI that knows about that specific business.&lt;/p&gt;

&lt;h2&gt;
  
  
  The DID Routing
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;832-XXX-1315 → Load Bearing Capital (real estate)
832-XXX-1316 → Load Bearing Demo (demolition)
832-XXX-1317 → Quiet Hours Valet (valet trash)
832-XXX-1318 → Luchin Credit Repair
832-XXX-1319 → Petroleum Noir (mineral rights)
832-XXX-1321 → Load Bearing Detailing (steel)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Each number hits a different AGI script that loads the right system prompt, greeting, and business context.&lt;/p&gt;
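
&lt;p&gt;A minimal sketch of that dispatch, keyed off the last four digits since the full numbers are redacted above (the context names are illustrative):&lt;/p&gt;

```python
# Hypothetical per-DID dispatch table. Each context maps to its own
# system prompt, greeting, and business data in the AGI script.
CONTEXTS = {
    "1315": "load_bearing_capital",
    "1316": "load_bearing_demo",
    "1317": "quiet_hours_valet",
    "1318": "luchin_credit_repair",
    "1319": "petroleum_noir",
    "1321": "load_bearing_detailing",
}

def route_call(did):
    context = CONTEXTS.get(did[-4:])
    if context is None:
        raise ValueError("no AI receptionist configured for " + did)
    return context
```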

&lt;h2&gt;
  
  
  Cost Comparison
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Hosted solution&lt;/strong&gt; (RingCentral, Grasshopper, etc.): $25 to $50 per line per month = $150 to $300/month&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;My setup&lt;/strong&gt;: $6/month DIDs + $20/month VPS + per-minute AI costs (under $0.03/min) = roughly $30 to $40/month total&lt;/p&gt;

&lt;p&gt;Annual savings: roughly $1,300 to $3,200&lt;/p&gt;

&lt;h2&gt;
  
  
  The Hard Part
&lt;/h2&gt;

&lt;p&gt;Asterisk configuration is not fun. The dialplan syntax is archaic. PJSIP debugging will make you question your life choices. But once it works, it works. And you own every piece of it.&lt;/p&gt;

&lt;p&gt;If you are running multiple brands and paying per-seat for phone service, consider building your own. The upfront pain pays dividends.&lt;/p&gt;

</description>
      <category>voip</category>
      <category>selfhosted</category>
      <category>asterisk</category>
      <category>automation</category>
    </item>
    <item>
      <title>Happy Easter From the Build: Why I Code on Holidays</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:47:47 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/happy-easter-from-the-build-why-i-code-on-holidays-3j62</link>
      <guid>https://dev.to/domoniqueluchin/happy-easter-from-the-build-why-i-code-on-holidays-3j62</guid>
      <description>&lt;p&gt;Most people take holidays off. I get it. Family, food, rest. All valid.&lt;/p&gt;

&lt;p&gt;But if you are building something from nothing, holidays hit different. They are the only days nobody is emailing you. Nobody is calling. Nobody needs a response by EOD. The world goes quiet and that is when I do my best work.&lt;/p&gt;

&lt;p&gt;This Easter Sunday I published 19 articles to dev.to in a single session. Not because I had to. Because the queue was full and I had the time.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Setup
&lt;/h2&gt;

&lt;p&gt;I run 6 businesses under one holding company called Load Bearing Empire. Real estate wholesaling, demolition, structural steel detailing, valet trash, credit repair, and mineral rights. Every single one of them runs on the same Supabase backend with AI agents handling intake, scheduling, and outreach.&lt;/p&gt;

&lt;p&gt;The content pipeline is one of those agents. It drafts articles from my build logs, formats them with frontmatter, deduplicates against my dev.to profile, and publishes with rate limiting built in. All I do is say "push it."&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Holidays Are Build Days
&lt;/h2&gt;

&lt;p&gt;Here is the math. I work a full time structural engineering job Monday through Friday. I run 6 businesses at night. The only uninterrupted blocks I get are weekends and holidays.&lt;/p&gt;

&lt;p&gt;So while everyone else is hunting eggs, I am hunting bugs. And I am not apologizing for it.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Real Point
&lt;/h2&gt;

&lt;p&gt;If you are reading this on Easter, you are probably like me. You are building something. You are not where you want to be yet but you are closer than you were last month. Keep going.&lt;/p&gt;

&lt;p&gt;The people who build on holidays are the same people who do not need holidays later. That is the trade.&lt;/p&gt;

&lt;p&gt;Happy Easter. Now get back to work.&lt;/p&gt;

</description>
      <category>startup</category>
      <category>productivity</category>
      <category>motivation</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Supabase pg_cron Changed How I Run My Businesses</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:47:11 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/supabase-pgcron-changed-how-i-run-my-businesses-3phn</link>
      <guid>https://dev.to/domoniqueluchin/supabase-pgcron-changed-how-i-run-my-businesses-3phn</guid>
      <description>&lt;p&gt;pg_cron is a Postgres extension that lets you schedule SQL jobs inside your database. No external scheduler. No Lambda functions. No cron servers. Just SQL on a timer.&lt;/p&gt;

&lt;p&gt;I have 38 pg_cron jobs running across my Supabase project right now. They handle everything from publishing articles to scraping county records to refreshing lead scores. Here is how I set it up and why it matters.&lt;/p&gt;

&lt;h2&gt;
  
  
  What pg_cron Actually Does
&lt;/h2&gt;

&lt;p&gt;You write a SQL statement. You give it a cron expression. Postgres runs it on schedule. That is it.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;cron&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;schedule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="s1"&gt;'publish-content-daily'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="s1"&gt;'0 9 * * *'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="err"&gt;$$&lt;/span&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;net&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;http_post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'https://yourproject.supabase.co/functions/v1/publish-build-content'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;headers&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'{"Content-Type": "application/json"}'&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="n"&gt;jsonb&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;body&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'{"platform": "devto", "limit": 3}'&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="n"&gt;jsonb&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;&lt;span class="err"&gt;$$&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That job fires every day at 9 AM Central (14:00 UTC) and publishes 3 articles from my content queue. No server. No GitHub Action. No third party.&lt;/p&gt;

&lt;h2&gt;
  
  
  My 38 Jobs
&lt;/h2&gt;

&lt;p&gt;Here is the breakdown by category:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Content Pipeline (4 jobs)&lt;/strong&gt; — Auto-draft articles from build logs, publish to dev.to, refresh content calendar, clean up failed drafts.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lead Pipeline (14 jobs)&lt;/strong&gt; — Scrape 14 Texas counties nightly for foreclosures, delinquent taxes, code violations, and 311 complaints. Each county gets its own job so failures are isolated.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lead Scoring (3 jobs)&lt;/strong&gt; — Recalculate scores daily, flag hot leads, archive stale ones older than 90 days.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Communication (5 jobs)&lt;/strong&gt; — Send follow-up sequences, check voicemail transcriptions, rotate DID assignments, refresh VAPI agent configs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Maintenance (12 jobs)&lt;/strong&gt; — Vacuum tables, refresh materialized views, rotate logs, check API key expiry, monitor edge function error rates.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why This Beats External Schedulers
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Zero cold starts&lt;/strong&gt; — The job runs inside Postgres. No Lambda spin-up. No container boot.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Transactional&lt;/strong&gt; — Your scheduled job can read and write the same database in one transaction. Try doing that with a Zapier webhook.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Free&lt;/strong&gt; — pg_cron is included in every Supabase plan. No per-execution billing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Observable&lt;/strong&gt; — Query &lt;code&gt;cron.job_run_details&lt;/code&gt; to see every execution, runtime, and error.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  The Gotcha
&lt;/h2&gt;

&lt;p&gt;pg_cron runs in UTC, and cron expressions have no time zone awareness. If you are in Central Time like me, 9 AM CT is &lt;code&gt;0 14 * * *&lt;/code&gt; during daylight saving time and &lt;code&gt;0 15 * * *&lt;/code&gt; the rest of the year. Get this wrong and your jobs fire at 3 AM. Ask me how I know.&lt;/p&gt;
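&lt;p&gt;To make the conversion mechanical, a tiny helper works. This is an illustrative sketch, not part of pg_cron:&lt;/p&gt;

```typescript
// Convert a local "run at this hour" into a UTC cron expression,
// given the local UTC offset (Central Daylight Time is UTC-5, so -5).
function toUtcCron(localHour: number, utcOffset: number): string {
  // Double modulo handles wraparound past midnight in either direction.
  const utcHour = ((localHour - utcOffset) % 24 + 24) % 24
  return `0 ${utcHour} * * *`
}
```

&lt;p&gt;&lt;code&gt;toUtcCron(9, -5)&lt;/code&gt; gives &lt;code&gt;0 14 * * *&lt;/code&gt;, and a 10 PM CDT job correctly wraps to the next UTC day.&lt;/p&gt;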

&lt;h2&gt;
  
  
  Bottom Line
&lt;/h2&gt;

&lt;p&gt;If you are on Supabase and you are not using pg_cron, you are leaving automation on the table. It is the closest thing to a free operations team I have found.&lt;/p&gt;

</description>
      <category>supabase</category>
      <category>postgres</category>
      <category>automation</category>
      <category>startup</category>
    </item>
    <item>
      <title>I Replaced 7 SaaS Subscriptions With Supabase Edge Functions</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:46:36 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/i-replaced-7-saas-subscriptions-with-supabase-edge-functions-2ac</link>
      <guid>https://dev.to/domoniqueluchin/i-replaced-7-saas-subscriptions-with-supabase-edge-functions-2ac</guid>
      <description>&lt;p&gt;I was spending $1,069 a month on software. CRM, email automation, phone system, scheduling, lead scoring, content management, and analytics. Seven different platforms. Seven different logins. Seven different bills.&lt;/p&gt;

&lt;p&gt;Now I spend $140. Here is how.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Problem
&lt;/h2&gt;

&lt;p&gt;When you run multiple businesses, every SaaS vendor wants to charge you per seat, per business, per feature. A $29/month tool becomes $174 when you need it across 6 brands. Multiply that across 7 categories and you are bleeding cash before you make a dollar.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Solution: One Database, Edge Functions for Everything
&lt;/h2&gt;

&lt;p&gt;Supabase gives you a Postgres database, auth, storage, edge functions, and cron jobs. All on one bill. I built everything I needed as edge functions that share the same database.&lt;/p&gt;

&lt;h3&gt;
  
  
  What I Replaced
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;CRM&lt;/strong&gt; — Was using HubSpot free tier across 6 brands. Hit limits fast. Replaced with a &lt;code&gt;leads&lt;/code&gt; table, a &lt;code&gt;lead_interactions&lt;/code&gt; table, and a scoring function that runs on insert.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Email Automation&lt;/strong&gt; — Mailchimp wanted $60/month for 6 audiences. Replaced with a Resend integration inside an edge function. Sequences stored in the database. pg_cron triggers the sends.&lt;/p&gt;
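&lt;p&gt;Here is roughly what the send step looks like. The step shape and field names are illustrative, not my exact schema; the cron job calls an edge function that finds due steps and posts each one to Resend:&lt;/p&gt;

```typescript
// A sequence step stored in the database (shape assumed for illustration).
interface SequenceStep {
  to: string
  subject: string
  html: string
  sendAfterDays: number
  enrolledAt: string // ISO timestamp when the contact entered the sequence
}

// A step is due once its delay has elapsed since enrollment.
function isDue(step: SequenceStep, now: Date): boolean {
  const dueAt = Date.parse(step.enrolledAt) + step.sendAfterDays * 86_400_000
  return now.getTime() >= dueAt
}

// Build the JSON body for Resend's POST /emails endpoint.
function resendPayload(step: SequenceStep, from: string) {
  return { from, to: [step.to], subject: step.subject, html: step.html }
}

// Inside the edge function, each due step becomes one API call:
// await fetch("https://api.resend.com/emails", {
//   method: "POST",
//   headers: { Authorization: "Bearer " + RESEND_API_KEY, "Content-Type": "application/json" },
//   body: JSON.stringify(resendPayload(step, "hello@yourdomain.com")),
// })
```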

&lt;p&gt;&lt;strong&gt;Phone System&lt;/strong&gt; — VAPI was $0.15/minute. Built Load Bearing Voice with Asterisk, Whisper STT, Claude, and OpenAI TTS. Self-hosted. Per-minute cost dropped to under $0.03.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scheduling&lt;/strong&gt; — Calendly wanted $12/seat. Built a booking edge function that checks availability against a &lt;code&gt;schedules&lt;/code&gt; table and sends confirmation via Resend.&lt;/p&gt;
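&lt;p&gt;The availability check is just a range test. A sketch, assuming the &lt;code&gt;schedules&lt;/code&gt; table loads as an array of slots:&lt;/p&gt;

```typescript
// An open or booked window from the schedules table (shape assumed).
interface Slot {
  start: string // ISO timestamp
  end: string
  booked: boolean
}

// Return the first open slot covering the requested start time, or null.
function findSlot(slots: Slot[], requestedStart: string): Slot | null {
  const t = Date.parse(requestedStart)
  for (const s of slots) {
    if (s.booked) continue
    if (t >= Date.parse(s.start)) {
      if (Date.parse(s.end) > t) return s
    }
  }
  return null
}
```

&lt;p&gt;The edge function runs this against the table, marks the slot booked, and hands off to Resend for the confirmation email.&lt;/p&gt;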

&lt;p&gt;&lt;strong&gt;Lead Scoring&lt;/strong&gt; — Was using a Zapier workflow that cost $49/month. Replaced with a Postgres trigger that scores leads on insert based on property value, county, and distress signals.&lt;/p&gt;
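&lt;p&gt;The trigger body is SQL, but the scoring logic is simple enough to sketch. Weights and field names here are illustrative, not the production values:&lt;/p&gt;

```typescript
// Distress signals attached to a lead (illustrative shape).
interface LeadSignals {
  hasForeclosureNotice: boolean
  hasDelinquentTaxes: boolean
  hasCodeViolations: boolean
  has311Complaints: boolean
  estEquity: number
}

// Additive scoring: each signal contributes a fixed weight, capped at 100.
function scoreLead(s: LeadSignals): number {
  let score = 0
  if (s.hasForeclosureNotice) score += 30
  if (s.hasDelinquentTaxes) score += 25
  if (s.hasCodeViolations) score += 20
  if (s.has311Complaints) score += 10
  if (20000 > s.estEquity) score += 15 // low equity signals motivation
  return score
}
```

&lt;p&gt;In production the same arithmetic lives in a Postgres trigger, so every insert is scored with zero external calls.&lt;/p&gt;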

&lt;p&gt;&lt;strong&gt;Content Management&lt;/strong&gt; — Was manually posting everywhere. Built &lt;code&gt;build_content_queue&lt;/code&gt; table with &lt;code&gt;auto-draft-content&lt;/code&gt; and &lt;code&gt;publish-build-content&lt;/code&gt; edge functions. Drafts with Claude Haiku, publishes to dev.to automatically.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Analytics&lt;/strong&gt; — Replaced Google Analytics with a simple &lt;code&gt;page_views&lt;/code&gt; table and a lightweight script. No cookie banners needed.&lt;/p&gt;
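&lt;p&gt;The script just posts a small payload per page load. This sketch assumes a hypothetical &lt;code&gt;track-page-view&lt;/code&gt; edge function name:&lt;/p&gt;

```typescript
// Build a cookie-free page view event. No visitor ID, so no consent banner.
function pageView(path: string, referrer: string) {
  return {
    path,
    referrer,
    viewed_at: new Date().toISOString(),
  }
}

// In the browser snippet, fire and forget on load:
// navigator.sendBeacon(
//   "https://yourproject.supabase.co/functions/v1/track-page-view",
//   JSON.stringify(pageView(location.pathname, document.referrer)),
// )
```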

&lt;h2&gt;
  
  
  The Numbers
&lt;/h2&gt;

&lt;p&gt;Before: $1,069/month ($12,828/year)&lt;br&gt;
After: $140/month ($1,680/year)&lt;/p&gt;

&lt;p&gt;Annual savings: $11,148&lt;/p&gt;

&lt;p&gt;That is not optimization. That is elimination.&lt;/p&gt;

&lt;h2&gt;
  
  
  When This Makes Sense
&lt;/h2&gt;

&lt;p&gt;This approach works if you are technical enough to write SQL and deploy edge functions. If you are not, the SaaS tools are worth it. But if you can build, you should build. Every subscription is someone else profiting from your workflow.&lt;/p&gt;

&lt;p&gt;Own your infrastructure. Own your margins.&lt;/p&gt;

</description>
      <category>supabase</category>
      <category>saas</category>
      <category>startup</category>
      <category>selfhosted</category>
    </item>
    <item>
      <title>How a Structural Engineer Built a 6-Business AI System From His Phone</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:35:55 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/how-a-structural-engineer-built-a-6-business-ai-system-from-his-phone-7pn</link>
      <guid>https://dev.to/domoniqueluchin/how-a-structural-engineer-built-a-6-business-ai-system-from-his-phone-7pn</guid>
      <description>&lt;p&gt;My day job is structural engineering. I check steel connections and review load calculations for oil and gas infrastructure across the Gulf Coast.&lt;/p&gt;

&lt;p&gt;I also run six businesses. On my phone. Using AI agents I built myself.&lt;/p&gt;

&lt;p&gt;Here is how that happened.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Six Businesses
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Load Bearing Capital&lt;/strong&gt; — Real estate wholesaling. AI scrapes 14 Texas counties for distressed properties, calls homeowners, and qualifies leads.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Load Bearing Demo&lt;/strong&gt; — Demolition contracting. AI monitors Gmail for RFQs, parses drawings, generates bids using GPT-4o, and sends them after Telegram approval.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Load Bearing Detailing&lt;/strong&gt; — Structural steel shop drawings. AISC-compliant. The same engineering skills from my day job, monetized as a service.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Quiet Hours Valet&lt;/strong&gt; — Valet trash for apartment complexes. AI receptionist handles inbound calls and books property manager demos.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Luchin Credit Repair&lt;/strong&gt; — FCRA dispute engine. Automated 30-day dispute timelines and follow-up sequences.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Petroleum Noir&lt;/strong&gt; — Mineral rights and landman services. AI scrapes Texas Railroad Commission for new drilling permits and reaches out to operators.&lt;/p&gt;

&lt;p&gt;All six run on one $28/month VPS and one $25/month Supabase instance.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why a Structural Engineer
&lt;/h2&gt;

&lt;p&gt;Engineering teaches you to think in systems. Every structure has load paths, redundancy, and failure modes. Building an AI agent system is the same problem with different terminology.&lt;/p&gt;

&lt;p&gt;Your data flows are load paths. Your retry logic is redundancy. Your approval gates are failure mode analysis.&lt;/p&gt;

&lt;p&gt;The code itself is not harder than structural calculations. It is different notation for similar thinking.&lt;/p&gt;

&lt;h2&gt;
  
  
  Building From a Phone
&lt;/h2&gt;

&lt;p&gt;I do not have a home office setup. I work from my phone most of the time.&lt;/p&gt;

&lt;p&gt;Claude on mobile handles architecture decisions. Claude Code on the server executes the builds. Supabase MCP connects Claude directly to the database without me writing queries.&lt;/p&gt;

&lt;p&gt;The workflow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;I send a task to Claude: "Build a lead scoring function for Harris County properties"&lt;/li&gt;
&lt;li&gt;Claude writes the SQL migration and inserts it into &lt;code&gt;claude_code_queue&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Claude Code on the server picks up the task and executes it&lt;/li&gt;
&lt;li&gt;Result comes back to Supabase. I query it from my phone.&lt;/li&gt;
&lt;/ol&gt;
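&lt;p&gt;Steps 2 and 3 boil down to a claim-next-pending operation. An in-memory sketch (the real &lt;code&gt;claude_code_queue&lt;/code&gt; is a Postgres table, and these status values are illustrative):&lt;/p&gt;

```typescript
// A queued task for the server-side worker (shape assumed).
interface QueueTask {
  id: number
  prompt: string
  status: "pending" | "running" | "done"
}

// The worker claims the oldest pending task and marks it running
// before executing, so a second poll never picks up the same row.
function claimNextTask(queue: QueueTask[]): QueueTask | null {
  const next = queue.find(t => t.status === "pending")
  if (next === undefined) return null
  next.status = "running"
  return next
}
```

&lt;p&gt;The Postgres version does the same thing with &lt;code&gt;FOR UPDATE SKIP LOCKED&lt;/code&gt; so concurrent workers never double-claim.&lt;/p&gt;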

&lt;p&gt;I am the decision maker. The agents are the execution layer.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Part That Surprised Me
&lt;/h2&gt;

&lt;p&gt;Building the stack was not the hard part.&lt;/p&gt;

&lt;p&gt;The hard part was building in public while working a full-time job and relocating to a new city for a lead structural engineering position.&lt;/p&gt;

&lt;p&gt;The system has to run without me. That forced better architecture than anything else. If I had to touch it every day, it was not good enough.&lt;/p&gt;

&lt;p&gt;Every agent writes its results to Supabase. Every failure gets logged. Every approval request comes to my Telegram. I check it once in the morning like reading emails.&lt;/p&gt;

&lt;p&gt;The goal is not to work less. The goal is to do $10/hour tasks with agents so I can spend my time on $10,000/hour decisions.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Is Next
&lt;/h2&gt;

&lt;p&gt;RunPod GPU pipeline for AI avatar videos. YouTube and TikTok at scale, all six brands, fully automated.&lt;/p&gt;

&lt;p&gt;The voice samples are in Supabase Storage. The face photos are there too. Coqui XTTS handles voice cloning. MuseTalk handles lip sync. FFmpeg handles the final render.&lt;/p&gt;

&lt;p&gt;One command will produce a video for any brand on any topic in under 10 minutes.&lt;/p&gt;

&lt;p&gt;Building it now.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>python</category>
      <category>entrepreneurship</category>
      <category>automation</category>
    </item>
    <item>
      <title>How I Went From Spreadsheets to a 14-County Real Estate Scraper</title>
      <dc:creator>Domonique Luchin</dc:creator>
      <pubDate>Mon, 06 Apr 2026 12:35:19 +0000</pubDate>
      <link>https://dev.to/domoniqueluchin/how-i-went-from-spreadsheets-to-a-14-county-real-estate-scraper-5cnb</link>
      <guid>https://dev.to/domoniqueluchin/how-i-went-from-spreadsheets-to-a-14-county-real-estate-scraper-5cnb</guid>
      <description>&lt;p&gt;Six months ago my lead list was a Google Sheet with 300 rows I updated manually.&lt;/p&gt;

&lt;p&gt;Now five edge functions scrape 14 Texas counties every night and score 125+ leads automatically.&lt;/p&gt;

&lt;p&gt;Here is the full build.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Scrape
&lt;/h2&gt;

&lt;p&gt;Four data sources per county:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Foreclosure filings&lt;/strong&gt; — Properties where the owner is behind on their mortgage&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Delinquent tax rolls&lt;/strong&gt; — Properties with unpaid property taxes&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Code violations&lt;/strong&gt; — Properties with open city violations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;311 complaints&lt;/strong&gt; — Nuisance complaints filed by neighbors&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Each one is a signal of seller motivation. Stack all four on one address and you have a highly distressed property whose owner may want to sell fast and cheap.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Database
&lt;/h2&gt;

&lt;p&gt;Four raw tables:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;raw_foreclosures&lt;/span&gt;
&lt;span class="n"&gt;raw_delinquent_taxes&lt;/span&gt;
&lt;span class="n"&gt;raw_code_violations&lt;/span&gt;
&lt;span class="n"&gt;raw_311_complaints&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;One normalized table:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;properties&lt;/span&gt;  &lt;span class="c1"&gt;-- 54 columns, one row per address&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The normalization function matches addresses across all four sources and merges signals. It calculates a &lt;code&gt;distress_score&lt;/code&gt; from 0-100 based on how many signals overlap.&lt;/p&gt;
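&lt;p&gt;Address matching only works if every source spelling maps to one key. A sketch of that normalizer, with the suffix table abbreviated for illustration:&lt;/p&gt;

```typescript
// Common street suffix abbreviations (abbreviated table for illustration).
const SUFFIXES: { [word: string]: string } = {
  STREET: "ST",
  AVENUE: "AVE",
  DRIVE: "DR",
  ROAD: "RD",
  LANE: "LN",
  BOULEVARD: "BLVD",
}

// Uppercase, strip punctuation, collapse whitespace, standardize suffixes,
// so "123 Main Street" and " 123  MAIN St. " produce the same key.
function normalizeAddress(raw: string): string {
  return raw
    .toUpperCase()
    .replace(/[.,#]/g, "")
    .split(/\s+/)
    .filter(w => w.length > 0)
    .map(w => SUFFIXES[w] ?? w)
    .join(" ")
}
```

&lt;p&gt;Every raw record gets normalized on ingest, so the merge is a plain equality join on the normalized column.&lt;/p&gt;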

&lt;h2&gt;
  
  
  The Scraper Architecture
&lt;/h2&gt;

&lt;p&gt;Five Supabase Edge Functions. Each one targets a specific county data source.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// lbc-hcad-scraper/index.ts&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;createClient&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@supabase/supabase-js&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;sb&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;createClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Deno&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;SUPABASE_URL&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;Deno&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;SUPABASE_SERVICE_KEY&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nx"&gt;Deno&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;serve&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://county-data-api.com/delinquent&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;records&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;sb&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;from&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;raw_delinquent_taxes&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;upsert&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="nx"&gt;records&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;r&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="na"&gt;account_number&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;acct&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;owner_name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;owner&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;address&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;situs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;balance&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;balance&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;county&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;harris&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}))&lt;/span&gt;
  &lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Response&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;inserted&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;records&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="p"&gt;}))&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;pg_cron runs each scraper nightly:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;cron&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;schedule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="s1"&gt;'scrape-harris-delinquent'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="s1"&gt;'0 2 * * *'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="err"&gt;$$&lt;/span&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;net&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;http_post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'https://your-project.supabase.co/functions/v1/lbc-hcad-scraper'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;headers&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;jsonb_build_object&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'Authorization'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Bearer '&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;current_setting&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'app.service_key'&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt;&lt;span class="err"&gt;$$&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Scoring and Prioritization
&lt;/h2&gt;

&lt;p&gt;After normalization, a scoring function runs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;UPDATE&lt;/span&gt; &lt;span class="n"&gt;properties&lt;/span&gt; &lt;span class="k"&gt;SET&lt;/span&gt; &lt;span class="n"&gt;distress_score&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;has_foreclosure_notice&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt; &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
  &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;has_delinquent_taxes&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="mi"&gt;25&lt;/span&gt; &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
  &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;has_code_violations&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt; &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
  &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;has_311_complaints&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt; &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
  &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;est_equity&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;20000&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt; &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;UPDATE&lt;/span&gt; &lt;span class="n"&gt;properties&lt;/span&gt; &lt;span class="k"&gt;SET&lt;/span&gt; &lt;span class="n"&gt;distress_tier&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;CASE&lt;/span&gt;
  &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;distress_score&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'A'&lt;/span&gt;
  &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;distress_score&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="mi"&gt;40&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'B'&lt;/span&gt;
  &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="s1"&gt;'C'&lt;/span&gt;
&lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Tier A leads go into the dial queue immediately. Tier B gets a 24-hour delay. Tier C gets a weekly batch.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Skip Trace Step
&lt;/h2&gt;

&lt;p&gt;Raw records have addresses. They do not have phone numbers.&lt;/p&gt;

&lt;p&gt;Skip tracing finds the owner phone from the name and address. I use a batch API that costs about $0.10 per record. Only run it on Tier A and B leads to control cost.&lt;/p&gt;

&lt;p&gt;The result lands in &lt;code&gt;properties.owner_phone&lt;/code&gt;, &lt;code&gt;owner_phone_2&lt;/code&gt;, &lt;code&gt;owner_phone_3&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Then the dial queue picks it up and the AI starts calling.&lt;/p&gt;
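&lt;p&gt;The tier gate and cost math above can be sketched as follows (lead shape assumed, per-record price from the batch API):&lt;/p&gt;

```typescript
// A scored lead as it leaves the normalization step (illustrative shape).
interface ScoredLead {
  address: string
  tier: "A" | "B" | "C"
}

const COST_PER_RECORD = 0.10 // batch skip-trace price per record

// Only Tier A and B leads are worth the skip-trace spend;
// Tier C stays in the database until its score improves.
function skipTraceBatch(leads: ScoredLead[]) {
  const eligible = leads.filter(l => l.tier !== "C")
  return {
    eligible,
    estimatedCost: eligible.length * COST_PER_RECORD,
  }
}
```

&lt;p&gt;Running the gate before the API call keeps the nightly skip-trace bill proportional to hot leads, not total scrape volume.&lt;/p&gt;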

&lt;h2&gt;
  
  
  Results
&lt;/h2&gt;

&lt;p&gt;14 counties. 125+ new scored leads per night. Zero manual work after the initial build.&lt;/p&gt;

&lt;p&gt;The build took about 3 weekends. The scraper has been running for two months without me touching it.&lt;/p&gt;

</description>
      <category>python</category>
      <category>scraping</category>
      <category>realestate</category>
      <category>supabase</category>
    </item>
  </channel>
</rss>
