<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Johann</title>
    <description>The latest articles on DEV Community by Johann (@methamorphe).</description>
    <link>https://dev.to/methamorphe</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3668816%2Ff7b6681a-346c-4a4d-b34d-b84f66df086c.jpeg</url>
      <title>DEV Community: Johann</title>
      <link>https://dev.to/methamorphe</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/methamorphe"/>
    <language>en</language>
    <item>
      <title>How I Built a Desktop SEO Crawler That Handles 100k+ Pages (React + Electron + SQLite)</title>
      <dc:creator>Johann</dc:creator>
      <pubDate>Thu, 18 Dec 2025 11:26:49 +0000</pubDate>
      <link>https://dev.to/methamorphe/how-i-built-a-desktop-seo-crawler-that-handles-100k-pages-react-electron-sqlite-lkm</link>
      <guid>https://dev.to/methamorphe/how-i-built-a-desktop-seo-crawler-that-handles-100k-pages-react-electron-sqlite-lkm</guid>
      <description>&lt;h2&gt;
  
  
  How I Built a Desktop SEO Crawler That Handles 100k+ Pages
&lt;/h2&gt;

&lt;p&gt;As an indie dev, I used to pay $200/month for cloud-based crawlers. Desktop alternatives? Mostly Java apps with UIs from 2005.&lt;/p&gt;

&lt;p&gt;So I decided to build my own: &lt;strong&gt;Spider Pro&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F649igl9voybn9so6682c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F649igl9voybn9so6682c.png" alt="Spider Pro Dashboard" width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Stack
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Frontend
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;React 19&lt;/strong&gt; with Vite for a fast dev experience&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;TanStack Table&lt;/strong&gt; for virtualized tables (100k+ rows without lag)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Zustand&lt;/strong&gt; for lightweight state management&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Recharts&lt;/strong&gt; for analytics dashboards&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;react-force-graph&lt;/strong&gt; for 3D link visualization&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Backend
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Node.js&lt;/strong&gt; with Fastify&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Crawlee&lt;/strong&gt; (from the Apify team) for the crawl engine&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Playwright&lt;/strong&gt; for JavaScript rendering&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;better-sqlite3&lt;/strong&gt; for local storage&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Desktop
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Electron&lt;/strong&gt; for cross-platform distribution (Win/Mac/Linux)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Interesting Technical Challenges
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. Real-Time Updates with Socket.io
&lt;/h3&gt;

&lt;p&gt;When crawling, users want to see results immediately. I used Socket.io to stream each crawled page to the frontend:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Backend emits each page as it's crawled&lt;/span&gt;
&lt;span class="nx"&gt;socket&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;emit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;crawl_update&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;issues&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="c1"&gt;// Frontend updates the table in real-time&lt;/span&gt;
&lt;span class="nf"&gt;useSocket&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;crawl_update&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nf"&gt;addPageToTable&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
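&lt;p&gt;At 100k+ pages, emitting one event per crawled page can flood the renderer and trigger a re-render per page. A small batching helper (a hypothetical sketch, not code from Spider Pro; the &lt;code&gt;crawl_update_batch&lt;/code&gt; event name and thresholds are made up) coalesces updates so the table redraws once per batch:&lt;/p&gt;

```javascript
// Hypothetical batcher: coalesce per-page crawl updates so the frontend
// re-renders once per batch instead of once per page.
function createBatcher(emit, { maxBatch = 50, intervalMs = 250 } = {}) {
  let buffer = [];
  let timer = null;

  function flush() {
    if (timer) { clearTimeout(timer); timer = null; }
    if (buffer.length === 0) return;
    emit('crawl_update_batch', buffer); // one socket event for many pages
    buffer = [];
  }

  return {
    push(page) {
      buffer.push(page);
      if (buffer.length >= maxBatch) flush();                   // size threshold
      else if (!timer) timer = setTimeout(flush, intervalMs);   // time threshold
    },
    flush, // call once when the crawl finishes, to drain the tail
  };
}
```

&lt;p&gt;The backend would call &lt;code&gt;batcher.push(page)&lt;/code&gt; from the crawler's request handler instead of emitting directly, and the frontend would append the whole array in one state update.&lt;/p&gt;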



&lt;h3&gt;
  
  
  2. Handling 100k+ Rows in a Table
&lt;/h3&gt;

&lt;p&gt;Traditional React tables would choke on 100k rows. TanStack Table's virtualization only renders visible rows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;TanStackTable&lt;/span&gt;
  &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;pages&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="c1"&gt;// 100k+ items&lt;/span&gt;
  &lt;span class="na"&gt;columns&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;columns&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
  &lt;span class="na"&gt;enableVirtualization&lt;/span&gt;
&lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
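&lt;p&gt;Under the hood, virtualization boils down to one calculation: given the scroll position and row height, which slice of the 100k rows is actually on screen? A minimal sketch of that windowing math (illustrative only; TanStack's virtualizer does this for you, with overscan rows to avoid blank flashes while scrolling):&lt;/p&gt;

```javascript
// Sketch of the windowing math behind row virtualization: from scroll
// position + row height, compute the only rows worth rendering.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan = 5) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan
  );
  return { first, last, count: last - first + 1 };
}
```

&lt;p&gt;However deep you scroll into a 100k-row table, only a few dozen row components ever exist in the DOM.&lt;/p&gt;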



&lt;h3&gt;
  
  
  3. JavaScript Rendering with Playwright
&lt;/h3&gt;

&lt;p&gt;SPAs ship little or no content in their initial HTML; it's rendered client-side. I added a toggle to switch between:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;CheerioCrawler&lt;/strong&gt; (fast, HTTP-only)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;PlaywrightCrawler&lt;/strong&gt; (slower, full JS rendering)
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;crawler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;jsRenderingEnabled&lt;/span&gt;
  &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;PlaywrightCrawler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;CheerioCrawler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
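&lt;p&gt;One way to take the guesswork out of that toggle (a hypothetical heuristic, not something Spider Pro necessarily does): fetch the static HTML first, and flag pages that look like an empty SPA shell so they can be re-crawled with Playwright.&lt;/p&gt;

```javascript
// Hypothetical heuristic: detect an SPA shell from static HTML, so the
// crawler knows a page needs full JS rendering.
function needsJsRendering(html) {
  // Strip scripts and tags to estimate how much visible text the static HTML carries.
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
  // A bare mount point like <div id="root"> plus almost no text is a strong SPA signal.
  const hasAppRoot = /<div[^>]+id=["'](root|app)["']/i.test(html);
  return hasAppRoot && text.length < 200;
}
```

&lt;p&gt;The 200-character threshold and the &lt;code&gt;root&lt;/code&gt;/&lt;code&gt;app&lt;/code&gt; id list are arbitrary starting points you'd tune against real sites.&lt;/p&gt;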



&lt;h3&gt;
  
  
  4. SQLite for Portable Storage
&lt;/h3&gt;

&lt;p&gt;Each project is a single &lt;code&gt;.db&lt;/code&gt; file. No PostgreSQL server needed. Users can back up, share, or move projects easily.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Learned
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Electron apps can be fast&lt;/strong&gt; if you're careful with IPC&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SQLite is underrated&lt;/strong&gt; for desktop apps&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Virtualization is essential&lt;/strong&gt; for large datasets&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Socket.io makes real-time UX trivial&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Try It Out
&lt;/h2&gt;

&lt;p&gt;Spider Pro is launching on Product Hunt soon. Try it free for 14 days:&lt;/p&gt;

&lt;p&gt;👉 &lt;a href="https://spiderpro.app/" rel="noopener noreferrer"&gt;https://spiderpro.app/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>seo</category>
      <category>saas</category>
      <category>programming</category>
    </item>
  </channel>
</rss>
