<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Joshua Gutierrez</title>
    <description>The latest articles on DEV Community by Joshua Gutierrez (@joshua_gutierrez).</description>
    <link>https://dev.to/joshua_gutierrez</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3828076%2F8247521a-4f34-423c-80e0-2f1b53f2d32a.jpg</url>
      <title>DEV Community: Joshua Gutierrez</title>
      <link>https://dev.to/joshua_gutierrez</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/joshua_gutierrez"/>
    <language>en</language>
    <item>
      <title>State of Small Business Websites (2026 Study). 96.9% Fail Core Web Vitals.</title>
      <dc:creator>Joshua Gutierrez</dc:creator>
      <pubDate>Fri, 24 Apr 2026 17:13:34 +0000</pubDate>
      <link>https://dev.to/joshua_gutierrez/state-of-small-business-websites-2026-study-969-fail-core-web-vitals-106i</link>
      <guid>https://dev.to/joshua_gutierrez/state-of-small-business-websites-2026-study-969-fail-core-web-vitals-106i</guid>
      <description>&lt;p&gt;Every performance article on Dev.to quotes the same two stats. You know them. Amazon. Walmart. The 100ms one. Everyone has seen them.&lt;/p&gt;

&lt;p&gt;Nobody measures what their own prospect list looks like. We did. 292 real B2B websites ran through a headless Chromium audit in April 2026. The results are not academic.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;96.9% fail at least one Core Web Vital on mobile. Only 3.1% pass all three. 100% of 191 sites with valid accessibility data failed the axe-core Link Labels rule.&lt;/strong&gt; The full dataset is open. The methodology is reproducible. This post walks through the technical failures one by one with the fixes.&lt;/p&gt;

&lt;p&gt;Report: &lt;a href="https://www.axiondeepdigital.com/research/state-of-small-business-websites-2026" rel="noopener noreferrer"&gt;axiondeepdigital.com/research/state-of-small-business-websites-2026&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Key stats (copy-paste friendly)
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Metric&lt;/th&gt;
&lt;th&gt;Sample&lt;/th&gt;
&lt;th&gt;Result&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Fail at least one Core Web Vital (mobile)&lt;/td&gt;
&lt;td&gt;n=191&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;96.9%&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Pass all three Core Web Vitals (mobile)&lt;/td&gt;
&lt;td&gt;n=191&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;3.1%&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Fail axe-core Link Labels rule&lt;/td&gt;
&lt;td&gt;n=191&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;100%&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Post "poor" mobile LCP (&amp;gt;4.0s)&lt;/td&gt;
&lt;td&gt;n=191&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;86.4%&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Post "poor" mobile FCP (&amp;gt;3.0s)&lt;/td&gt;
&lt;td&gt;n=191&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;72.4%&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Pass mobile CLS (&amp;lt;0.1)&lt;/td&gt;
&lt;td&gt;n=191&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;85.5%&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Total domains scanned&lt;/td&gt;
&lt;td&gt;n=292&lt;/td&gt;
&lt;td&gt;—&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;em&gt;Source: State of Small Business Websites 2026, Axion Deep Digital Research. April 2026.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How we measured
&lt;/h2&gt;

&lt;p&gt;This study runs on our production audit infrastructure. &lt;a href="https://www.axiondeepdigital.com/free-seo-audit" rel="noopener noreferrer"&gt;DeepAudit AI&lt;/a&gt; is the same pipeline we run against client sites daily. The dataset below is 292 consecutive scans pulled from a single week in April 2026. Because the pipeline runs continuously, future snapshots will replicate this analysis against a larger and eventually randomized sample, and the methodology stays frozen so the comparisons hold.&lt;/p&gt;

&lt;p&gt;Each scan runs a real Chromium instance inside a Lambda (Puppeteer plus &lt;a href="https://github.com/Sparticuz/chromium" rel="noopener noreferrer"&gt;@sparticuz/chromium&lt;/a&gt;), renders the page the way Googlebot would, waits for hydration, and then measures against:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Lighthouse mobile Core Web Vitals (LCP, FCP, CLS) using Google's published &lt;a href="https://web.dev/articles/vitals" rel="noopener noreferrer"&gt;thresholds&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/dequelabs/axe-core" rel="noopener noreferrer"&gt;axe-core&lt;/a&gt; accessibility evaluations&lt;/li&gt;
&lt;li&gt;60+ technical SEO checks (meta, headings, schema, canonicals, alt text)&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.domcop.com/openpagerank/" rel="noopener noreferrer"&gt;Open PageRank&lt;/a&gt; for domain authority&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;191 of the 292 sites returned valid Lighthouse mobile data. 191 returned valid axe-core data. 260 had valid Open PageRank. Every finding below is gated on the subset of sites where the relevant measurement succeeded, not averaged over missing data.&lt;/p&gt;

&lt;h3&gt;
  
  
  What this study does and does not prove
&lt;/h3&gt;

&lt;p&gt;Before the numbers, the honest scope:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;This is a purposive sample, not a random one.&lt;/strong&gt; The 292 domains came from a North American B2B prospect list, which skews toward construction, trades, and small professional services. It is not representative of the web as a whole.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;What it does represent&lt;/strong&gt; is the category of business that small digital agencies actually get hired to fix. If you pitch to this market, the sites you convert will look like this sample.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;One snapshot per site.&lt;/strong&gt; Results reflect the state of each site on the day it was scanned.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lighthouse mobile scores vary.&lt;/strong&gt; Google's Lighthouse CI documentation acknowledges a ±5 point variance across runs. None of the findings below hinge on a difference smaller than that. The 96.9% failure rate and the 100% Link Labels failure are robust to that noise.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The full dataset is open. Every row, every score, every axe-core violation is downloadable from the &lt;a href="https://www.axiondeepdigital.com/research/state-of-small-business-websites-2026/methodology" rel="noopener noreferrer"&gt;methodology page&lt;/a&gt;. If you disagree with a finding, you can reproduce it or refute it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Finding 1: LCP is the silent killer
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;86.4% of sites post poor mobile LCP. Only 5.2% are good.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Largest Contentful Paint over 4.0 seconds is Google's "poor" threshold. Most of the sites we measured clock in between 4 and 8 seconds on mobile. Some over 10.&lt;/p&gt;

&lt;p&gt;What is causing it, in order of frequency:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Hero images served as full-resolution PNG
&lt;/h3&gt;

&lt;p&gt;One site we looked at ships a 2400x1600 PNG as its hero, served at 3.2 MB. Mobile user on 4G? That is 6 seconds of transfer on a good connection. The image renders last, so LCP measures the whole download.&lt;/p&gt;

&lt;p&gt;The fix is trivial if you are on Next.js:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;Image&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;next/image&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Image&lt;/span&gt;
  &lt;span class="na"&gt;src&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"/hero.jpg"&lt;/span&gt;
  &lt;span class="na"&gt;alt&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"..."&lt;/span&gt;
  &lt;span class="na"&gt;width&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="mi"&gt;1200&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
  &lt;span class="na"&gt;height&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="mi"&gt;600&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
  &lt;span class="na"&gt;priority&lt;/span&gt;
  &lt;span class="na"&gt;sizes&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"(max-width: 768px) 100vw, 1200px"&lt;/span&gt;
  &lt;span class="na"&gt;quality&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="mi"&gt;75&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;priority&lt;/code&gt; injects a preload hint. &lt;code&gt;sizes&lt;/code&gt; tells the browser to pick the right srcset variant for the viewport. &lt;code&gt;quality=75&lt;/code&gt; on JPEG looks identical to 90 in blind testing. Next.js serves WebP or AVIF automatically based on the Accept header.&lt;/p&gt;

&lt;p&gt;If you are not on Next.js, the same rules apply. &lt;code&gt;&amp;lt;link rel="preload" as="image"&amp;gt;&lt;/code&gt;, &lt;code&gt;&amp;lt;picture&amp;gt;&lt;/code&gt; with AVIF first, and a &lt;code&gt;srcset&lt;/code&gt; + &lt;code&gt;sizes&lt;/code&gt; combo.&lt;/p&gt;
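&lt;p&gt;A minimal plain-HTML sketch of that combo (file names and dimensions are placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&amp;lt;link rel="preload" as="image" href="/hero-1200.avif"&amp;gt;

&amp;lt;picture&amp;gt;
  &amp;lt;source type="image/avif"
          srcset="/hero-600.avif 600w, /hero-1200.avif 1200w"
          sizes="(max-width: 768px) 100vw, 1200px"&amp;gt;
  &amp;lt;img src="/hero-1200.jpg"
       srcset="/hero-600.jpg 600w, /hero-1200.jpg 1200w"
       sizes="(max-width: 768px) 100vw, 1200px"
       width="1200" height="600" alt="..."&amp;gt;
&amp;lt;/picture&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;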

&lt;h3&gt;
  
  
  2. Render-blocking JavaScript in the head
&lt;/h3&gt;

&lt;p&gt;Tag Manager. Hotjar. Intercom. Drift. One of every four sites we audited had three or more synchronous third-party scripts in the head. Each one blocks HTML parsing until it is fetched, parsed, and executed.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;defer&lt;/code&gt; attribute is older than most React developers. Use it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;script &lt;/span&gt;&lt;span class="na"&gt;src=&lt;/span&gt;&lt;span class="s"&gt;"/vendor/gtm.js"&lt;/span&gt; &lt;span class="na"&gt;defer&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&amp;lt;/script&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or on Next.js:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;Script&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;next/script&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Script&lt;/span&gt; &lt;span class="na"&gt;src&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"https://www.googletagmanager.com/gtm.js"&lt;/span&gt; &lt;span class="na"&gt;strategy&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"afterInteractive"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Script&lt;/span&gt; &lt;span class="na"&gt;src&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"https://widget.intercom.io/widget.js"&lt;/span&gt; &lt;span class="na"&gt;strategy&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"lazyOnload"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;afterInteractive&lt;/code&gt; loads after hydration. &lt;code&gt;lazyOnload&lt;/code&gt; waits until the browser is idle. Neither blocks the LCP element.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Custom fonts with no &lt;code&gt;font-display&lt;/code&gt; directive
&lt;/h3&gt;

&lt;p&gt;In most browsers the default behavior (&lt;code&gt;font-display: auto&lt;/code&gt;) hides text set in a custom font for up to 3 seconds while the file loads. If your LCP element contains that text, it cannot paint until the font arrives.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight css"&gt;&lt;code&gt;&lt;span class="k"&gt;@font-face&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;font-family&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;'Inter'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nl"&gt;src&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sx"&gt;url('/fonts/inter.woff2')&lt;/span&gt; &lt;span class="n"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;'woff2'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="py"&gt;font-display&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;swap&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;swap&lt;/code&gt; tells the browser to render with a system fallback immediately, then swap to the custom font once it arrives. On Next.js, &lt;code&gt;next/font&lt;/code&gt; does this automatically. On anything else, you write it in three lines.&lt;/p&gt;
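&lt;p&gt;For reference, a minimal &lt;code&gt;next/font&lt;/code&gt; sketch (assuming the App Router and Google-hosted Inter):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;import { Inter } from 'next/font/google';

// Self-hosted at build time; font-display: swap is applied for you
const inter = Inter({ subsets: ['latin'], display: 'swap' });

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    &amp;lt;html lang="en" className={inter.className}&amp;gt;
      &amp;lt;body&amp;gt;{children}&amp;lt;/body&amp;gt;
    &amp;lt;/html&amp;gt;
  );
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;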

&lt;h2&gt;
  
  
  Finding 2: First Contentful Paint is worse than LCP, and nobody talks about it
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;72.4% of sites post poor mobile FCP.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;FCP is the time to the first pixel of anything meaningful. It is the metric that correlates most directly with user perception of speed. If it takes 4 seconds to paint a single headline, the user has already decided the site is broken, regardless of how fast the rest of the page loads.&lt;/p&gt;

&lt;p&gt;The causes overlap with LCP but there is one specific pattern we saw repeatedly: &lt;strong&gt;a client-side routing framework rehydrating the whole page before rendering anything&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Here is the anti-pattern. A site ships a React SPA that serves an empty shell HTML, boots up, fetches the route data, then renders. The browser sees nothing visible until JavaScript finishes executing. The fix is either:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Server-side render the first paint (Next.js App Router, Remix, SvelteKit, pick one)&lt;/li&gt;
&lt;li&gt;Ship static HTML for the first paint, hydrate on demand (Astro model)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you are shipping a React SPA as the entire site in 2026, you are fighting gravity. The frameworks that win on FCP are the ones that put real HTML on the first GET.&lt;/p&gt;

&lt;h2&gt;
  
  
  Finding 3: 100% of sites failed Link Labels
&lt;/h2&gt;

&lt;p&gt;This is the number that should bother you most.&lt;/p&gt;

&lt;p&gt;axe-core's Link Labels rule checks that every &lt;code&gt;&amp;lt;a&amp;gt;&lt;/code&gt; element has a discernible accessible name. Every single one of the 191 sites we evaluated had at least one link that failed this rule. Every one.&lt;/p&gt;

&lt;p&gt;What does a failing link look like? It looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;a&lt;/span&gt; &lt;span class="na"&gt;href=&lt;/span&gt;&lt;span class="s"&gt;"/products"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;svg&amp;gt;&lt;/span&gt;...&lt;span class="nt"&gt;&amp;lt;/svg&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/a&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A screen reader reads "link" and stops. There is no text content and no &lt;code&gt;aria-label&lt;/code&gt;. The user has no idea where the link goes.&lt;/p&gt;

&lt;p&gt;The fix is one attribute:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;a&lt;/span&gt; &lt;span class="na"&gt;href=&lt;/span&gt;&lt;span class="s"&gt;"/products"&lt;/span&gt; &lt;span class="na"&gt;aria-label=&lt;/span&gt;&lt;span class="s"&gt;"Our products"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;svg&lt;/span&gt; &lt;span class="na"&gt;aria-hidden=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;...&lt;span class="nt"&gt;&amp;lt;/svg&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/a&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Social icon sets are the most common offender. Every Twitter, LinkedIn, and Instagram icon in every footer. Almost every one of them was an unlabeled link. Across 191 sites, we did not find a single site that had labeled every one of its icon-only links.&lt;/p&gt;

&lt;p&gt;This is not an edge case. This is the baseline.&lt;/p&gt;

&lt;p&gt;The broader pattern is: &lt;strong&gt;developers ship component library icons without reading the a11y docs&lt;/strong&gt;. React-icons, Font Awesome, Lucide, Heroicons. All of them render bare SVGs, and unless you explicitly add an &lt;code&gt;aria-label&lt;/code&gt; or visible sibling text, an icon-only anchor around them has no accessible name.&lt;/p&gt;

&lt;p&gt;If you ship a component with an icon-only link, wrap it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;SocialLink&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;href&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;label&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;children&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;a&lt;/span&gt; &lt;span class="na"&gt;href&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;href&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;aria-label&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;label&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt; &lt;span class="na"&gt;aria-hidden&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;children&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;a&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A few lines, and every icon-only link now passes the Link Labels rule.&lt;/p&gt;

&lt;h2&gt;
  
  
  Finding 4: CLS is the one metric most sites get right
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;85.5% of sites pass mobile CLS.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Cumulative Layout Shift is the one Core Web Vital developers have actually internalized. It helps that every modern framework bakes in &lt;code&gt;width&lt;/code&gt; and &lt;code&gt;height&lt;/code&gt; on images by default, and that &lt;code&gt;font-display: swap&lt;/code&gt; cuts down on font-swap shifts. The sites that fail CLS are doing something unusual: late-loading ads, &lt;code&gt;position: sticky&lt;/code&gt; headers without reserving space, or hero carousels that resize after the first slide loads.&lt;/p&gt;

&lt;p&gt;If your site passes everything else but fails CLS, the cause is almost always one of those three.&lt;/p&gt;
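&lt;p&gt;The reservation fix for the first two is a couple of CSS declarations (selector names are hypothetical):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight css"&gt;&lt;code&gt;/* Reserve the slot so a late-loading ad cannot push content down */
.ad-slot {
  min-height: 250px;
}

/* Fix the carousel box so slides of different sizes cannot resize it */
.hero-carousel {
  aspect-ratio: 16 / 9;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;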

&lt;h2&gt;
  
  
  The takeaways for developers
&lt;/h2&gt;

&lt;p&gt;If you read nothing else from this, read this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Priority-load one image and one font.&lt;/strong&gt; Your LCP element and your primary typeface. Everything else lazy-loads. This single change moves most sites from "poor" LCP to "good."&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Defer or lazy-load every third-party script.&lt;/strong&gt; If Tag Manager is synchronous in your &lt;code&gt;&amp;lt;head&amp;gt;&lt;/code&gt;, you are paying for it on every paint. Move it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Run axe-core on every PR.&lt;/strong&gt; A CI check that runs &lt;code&gt;@axe-core/cli&lt;/code&gt; against your built site will catch unlabeled links before they ship. It takes 10 minutes to configure and it will save you from the baseline failure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Test on mobile, not desktop.&lt;/strong&gt; Every site we audited looked fine on desktop. 96.9% of them fail on mobile. Chrome DevTools can emulate a mid-range phone with CPU and network throttling, and Lighthouse has a mobile mode. Use one of them on every build.&lt;/li&gt;
&lt;/ol&gt;
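&lt;p&gt;Takeaway 3 can be sketched in two commands (the port and build directory are assumptions; adapt to your setup):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Serve the built site, then fail the job if axe-core finds violations
npx serve dist &amp;amp;
npx axe http://localhost:3000 --exit
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;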

&lt;h2&gt;
  
  
  The dataset
&lt;/h2&gt;

&lt;p&gt;The full 292-site dataset, methodology, and reproducibility notes are open. If you want to cite any of these findings in a post, article, or conference talk, you can pull the raw scan JSON and reproduce every chart.&lt;/p&gt;

&lt;p&gt;Report: &lt;a href="https://www.axiondeepdigital.com/research/state-of-small-business-websites-2026" rel="noopener noreferrer"&gt;State of Small Business Websites 2026&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Suggested citation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Gutierrez, J.R. (2026). State of Small Business Websites 2026.
Axion Deep Digital Research, n=292.
axiondeepdigital.com/research/state-of-small-business-websites-2026
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you find an error, contact us and we will update the public methodology page with the correction and the reporter credit.&lt;/p&gt;

&lt;h2&gt;
  
  
  The takeaway, in one sentence
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The small business web is not sophisticated enough to be slow. It is slow because nothing is prioritized, nothing is deferred, and nothing is labeled.&lt;/strong&gt; Every one of the findings above is fixable in a single afternoon by a developer who knows the four keywords: &lt;code&gt;priority&lt;/code&gt;, &lt;code&gt;defer&lt;/code&gt;, &lt;code&gt;font-display: swap&lt;/code&gt;, and &lt;code&gt;aria-label&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;That is what the 3.1% do.&lt;/p&gt;

&lt;h2&gt;
  
  
  Further reading
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://web.dev/articles/vitals" rel="noopener noreferrer"&gt;Core Web Vitals thresholds&lt;/a&gt; (Google)&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://developer.chrome.com/docs/lighthouse/performance/performance-scoring" rel="noopener noreferrer"&gt;Lighthouse scoring weights&lt;/a&gt; (Google)&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://dequeuniversity.com/rules/axe/" rel="noopener noreferrer"&gt;axe-core rule descriptions&lt;/a&gt; (Deque)&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://web.dev/articles/rail" rel="noopener noreferrer"&gt;The RAIL performance model&lt;/a&gt; (Google)&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://www.axiondeepdigital.com/blog/state-of-small-business-websites-study-2026" rel="noopener noreferrer"&gt;axiondeepdigital.com&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>webperf</category>
      <category>a11y</category>
      <category>seo</category>
      <category>nextjs</category>
    </item>
    <item>
      <title>Your Website Exists. AI Doesn't Know That.</title>
      <dc:creator>Joshua Gutierrez</dc:creator>
      <pubDate>Fri, 10 Apr 2026 00:02:55 +0000</pubDate>
      <link>https://dev.to/joshua_gutierrez/your-website-exists-ai-doesnt-know-that-56cn</link>
      <guid>https://dev.to/joshua_gutierrez/your-website-exists-ai-doesnt-know-that-56cn</guid>
      <description>&lt;p&gt;Someone just asked ChatGPT for a recommendation in your industry.&lt;/p&gt;

&lt;p&gt;Your competitor was in the answer. You weren't.&lt;/p&gt;

&lt;p&gt;Not because they're better. Because the AI knew what they do. It didn't know what you do.&lt;/p&gt;

&lt;p&gt;That gap is growing every day. And it has a very simple fix.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Problem No One Is Talking About
&lt;/h2&gt;

&lt;p&gt;Google crawls your site. It reads your HTML, follows your links, indexes your pages. That system is 25 years old and it works.&lt;/p&gt;

&lt;p&gt;AI assistants don't do that.&lt;/p&gt;

&lt;p&gt;When Perplexity, ChatGPT, or Claude constructs an answer, it pulls from whatever structured context it can find. If your site is a JavaScript-heavy single-page app with no clear text about what you offer, where you operate, or what you charge -- the AI has nothing. It skips you. Or worse, it gets your details wrong and recommends someone else.&lt;/p&gt;

&lt;p&gt;This is not theoretical.&lt;br&gt;
It is happening right now.&lt;br&gt;
Across every industry.&lt;/p&gt;
&lt;h2&gt;
  
  
  The Fix: One File, 10 Minutes
&lt;/h2&gt;

&lt;p&gt;The solution is called &lt;code&gt;llms.txt&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Think of it as &lt;code&gt;robots.txt&lt;/code&gt; for AI. Where &lt;code&gt;robots.txt&lt;/code&gt; tells search crawlers what they may crawl, &lt;code&gt;llms.txt&lt;/code&gt; tells AI assistants how to understand and recommend your business.&lt;/p&gt;

&lt;p&gt;It's a plain Markdown file at the root of your domain. No plugins. No API. No special server config. Just text that tells AI exactly who you are.&lt;/p&gt;

&lt;p&gt;The standard defines two files:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;llms.txt&lt;/code&gt;&lt;/strong&gt; is the summary. Your name, what you do, key links. Think elevator pitch in plain text.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;llms-full.txt&lt;/code&gt;&lt;/strong&gt; is the deep reference. Every service, pricing, process, FAQs, team, locations, and URLs. Everything an AI needs to answer any question about your business without visiting another page.&lt;/p&gt;
&lt;h2&gt;
  
  
  What Goes In Each File
&lt;/h2&gt;

&lt;p&gt;Here's exactly what the AI needs to see. Miss any of this, and you're invisible again.&lt;/p&gt;
&lt;h3&gt;
  
  
  llms.txt -- Keep It Tight
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;&lt;span class="gh"&gt;# Your Business Name&lt;/span&gt;
&lt;span class="gt"&gt;
&amp;gt; What you do, in one sentence.&lt;/span&gt;
&lt;span class="p"&gt;
-&lt;/span&gt; Full details: https://yourdomain.com/llms-full.txt

&lt;span class="gu"&gt;## Services&lt;/span&gt;
&lt;span class="p"&gt;
-&lt;/span&gt; &lt;span class="gs"&gt;**Service 1**&lt;/span&gt;: What it is. What makes it different.
&lt;span class="p"&gt;-&lt;/span&gt; &lt;span class="gs"&gt;**Service 2**&lt;/span&gt;: What it is. What makes it different.

&lt;span class="gu"&gt;## Free Tools&lt;/span&gt;
&lt;span class="p"&gt;
-&lt;/span&gt; &lt;span class="gs"&gt;**Tool Name**&lt;/span&gt;: What it does. Direct URL.

&lt;span class="gu"&gt;## Areas Served&lt;/span&gt;
&lt;span class="p"&gt;
-&lt;/span&gt; City, State (headquarters)
&lt;span class="p"&gt;-&lt;/span&gt; City, State

&lt;span class="gu"&gt;## Contact&lt;/span&gt;
&lt;span class="p"&gt;
-&lt;/span&gt; Website: https://yourdomain.com
&lt;span class="p"&gt;-&lt;/span&gt; Email: hello@yourdomain.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;No HTML. No JavaScript. No images. Pure Markdown.&lt;/p&gt;
&lt;h3&gt;
  
  
  llms-full.txt -- Go Deep
&lt;/h3&gt;

&lt;p&gt;This is where specificity wins. For each service, include what's included, how the process works, pricing ranges, timelines, and FAQs with real answers. For each location, include the city page URL, population, and key industries. For your team, include names, roles, and credentials.&lt;/p&gt;

&lt;p&gt;The goal: an AI reading this file should be able to answer any question a potential customer would ask. Without visiting your site.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The single biggest mistake people make is being vague.&lt;/strong&gt; "We offer digital solutions" tells an AI nothing. "Custom web development with React and Next.js. 95+ Lighthouse scores. Starting at $2,500." gets quoted verbatim.&lt;/p&gt;
&lt;h2&gt;
  
  
  Implementation
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Next.js / Static Sites
&lt;/h3&gt;

&lt;p&gt;Drop both files in &lt;code&gt;/public&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public/
  llms.txt
  llms-full.txt
  robots.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Served automatically. Done.&lt;/p&gt;

&lt;h3&gt;
  
  
  WordPress
&lt;/h3&gt;

&lt;p&gt;Upload to your root directory via FTP. Or use any plugin that serves static files from document root.&lt;/p&gt;

&lt;h3&gt;
  
  
  Optional: Reference in robots.txt
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight conf"&gt;&lt;code&gt;&lt;span class="n"&gt;Sitemap&lt;/span&gt;: &lt;span class="n"&gt;https&lt;/span&gt;://&lt;span class="n"&gt;yourdomain&lt;/span&gt;.&lt;span class="n"&gt;com&lt;/span&gt;/&lt;span class="n"&gt;sitemap&lt;/span&gt;.&lt;span class="n"&gt;xml&lt;/span&gt;
&lt;span class="n"&gt;Llms&lt;/span&gt;: &lt;span class="n"&gt;https&lt;/span&gt;://&lt;span class="n"&gt;yourdomain&lt;/span&gt;.&lt;span class="n"&gt;com&lt;/span&gt;/&lt;span class="n"&gt;llms&lt;/span&gt;.&lt;span class="n"&gt;txt&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Not part of the formal spec yet. But it makes the file more discoverable.&lt;/p&gt;

&lt;h2&gt;
  
  
  Three Mistakes That Kill Your AI Visibility
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Writing marketing copy instead of facts.&lt;/strong&gt; AI assistants ignore superlatives. "Award-winning" and "industry-leading" get filtered out. Specific numbers and concrete descriptions get quoted. Write like you're filling out a form, not a brochure.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Skipping the full version.&lt;/strong&gt; &lt;code&gt;llms.txt&lt;/code&gt; alone is a business card. Without &lt;code&gt;llms-full.txt&lt;/code&gt;, the AI fills in the blanks with guesses -- or with your competitor's details. The full file is where the real value lives.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Forgetting to update it.&lt;/strong&gt; New service? New city page? Changed pricing? Both files need to reflect current state. Stale information is worse than no information, because the AI will confidently repeat it.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Know It's Working
&lt;/h2&gt;

&lt;p&gt;Visit &lt;code&gt;yourdomain.com/llms.txt&lt;/code&gt; in your browser. If you see plain text, it's live.&lt;/p&gt;
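&lt;p&gt;Or from the command line (domain is a placeholder):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Should print the Markdown, not an HTML error page
curl -s https://yourdomain.com/llms.txt | head -n 5
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;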

&lt;p&gt;Then test it. Ask ChatGPT or Perplexity about your business. Ask about your services, your pricing, your locations. If the AI references details from your &lt;code&gt;llms.txt&lt;/code&gt;, it's being consumed.&lt;/p&gt;

&lt;p&gt;This is new enough that the feedback loop is fast. Add the file today, and AI assistants can reference it within days.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Bigger Picture
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;llms.txt&lt;/code&gt; is one piece of a broader shift. AI discoverability is becoming as important as search engine optimization was 10 years ago.&lt;/p&gt;

&lt;p&gt;Structured data (JSON-LD) on every page tells Google's AI Overviews about your services and FAQs. City-specific landing pages let AI assistants answer location queries. FAQ schema gives AI direct question-answer pairs to pull from. Clean semantic HTML gives crawlers content they can parse without rendering JavaScript.&lt;/p&gt;
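&lt;p&gt;As a sketch, FAQ schema is just a JSON-LD block in the page head. The question, answer, and wording below are placeholders; validate your own markup with Google's Rich Results Test:&lt;/p&gt;

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you serve the Austin area?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes. We serve Austin and surrounding suburbs within 30 miles."
    }
  }]
}
</script>
```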

&lt;p&gt;The businesses that build for AI discoverability now will own the answers for the next 2-3 years. The ones that wait will wonder why they keep getting skipped.&lt;/p&gt;

&lt;p&gt;The file takes 10 minutes to write. The cost of not having it compounds every day.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Written by Joshua R. Gutierrez in collaboration with the engineering team at &lt;a href="https://www.axiondeepdigital.com" rel="noopener noreferrer"&gt;Axion Deep Digital&lt;/a&gt;. We build high-performance websites, rank them on Google, and make sure AI knows they exist.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Right now, AI is recommending someone else. &lt;a href="https://www.axiondeepdigital.com/blog/why-your-website-not-showing-up-on-google" rel="noopener noreferrer"&gt;Find out if you're even in the conversation&lt;/a&gt;. 60 seconds. No signup.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Try DeepAudit AI | SEO Scanning Tool for Free&lt;/em&gt; &lt;a href="https://www.axiondeepdigital.com/free-seo-audit" rel="noopener noreferrer"&gt;&lt;em&gt;(No Signup Required):&lt;/em&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>seo</category>
      <category>ai</category>
      <category>marketing</category>
      <category>webdev</category>
    </item>
    <item>
      <title>What a 98 Lighthouse Score Actually Takes</title>
      <dc:creator>Joshua Gutierrez</dc:creator>
      <pubDate>Tue, 31 Mar 2026 21:41:06 +0000</pubDate>
      <link>https://dev.to/joshua_gutierrez/what-a-98-lighthouse-score-actually-takes-1539</link>
      <guid>https://dev.to/joshua_gutierrez/what-a-98-lighthouse-score-actually-takes-1539</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn03444cg0ww0kan7nn2i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn03444cg0ww0kan7nn2i.png" alt=" " width="800" height="533"&gt;&lt;/a&gt;&lt;br&gt;
Everybody talks about Lighthouse scores, but almost nobody actually has a high one.&lt;/p&gt;

&lt;p&gt;Across the sites we have audited at Axion Deep Digital, the average score is 44 out of 100. Not a typo. And these are not abandoned websites. These are real businesses actively spending on ads, running campaigns, and trying to grow.&lt;/p&gt;

&lt;p&gt;The problem is that most people treat performance like a nice to have. It is not. Google uses Core Web Vitals as a ranking signal, and users feel performance immediately. A slow site does not just rank lower, it converts worse. You end up paying for traffic that leaves before it even engages.&lt;/p&gt;

&lt;p&gt;We rebuilt our own site with one clear goal. Reach a 98 plus Lighthouse score across all categories on mobile, under real world conditions. Not on a fast laptop, but on a throttled connection the way Google actually tests it.&lt;/p&gt;

&lt;p&gt;Here is what that actually required.&lt;/p&gt;

&lt;p&gt;The biggest shift starts with how your site is built. Your framework sets your ceiling. If your stack is heavy or overly dynamic, you are working against yourself from the beginning. We use static generation so pages are pre rendered and delivered instantly. That removes a huge amount of overhead before optimization even begins.&lt;/p&gt;

&lt;p&gt;From there, images are almost always the largest issue. On most sites, they make up the majority of page weight. It is common to see oversized, uncompressed assets that slow everything down. We convert everything to modern formats, resize images to their actual display size, and lazy load anything below the fold. Small changes here can dramatically reduce load time.&lt;/p&gt;
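&lt;p&gt;A minimal sketch of those image fixes in plain HTML (file names are placeholders): a modern format with a fallback, explicit dimensions so the layout does not shift, and native lazy loading for anything below the fold:&lt;/p&gt;

```html
<picture>
  <source srcset="/img/hero-800.webp" type="image/webp">
  <!-- width and height reserve space and prevent layout shift -->
  <img src="/img/hero-800.jpg" alt="Storefront of the business at dusk"
       width="800" height="533" loading="lazy" decoding="async">
</picture>
```

&lt;p&gt;One caveat: do not lazy load the hero image at the top of the page, or you will delay your Largest Contentful Paint instead of improving it.&lt;/p&gt;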

&lt;p&gt;Fonts are another silent performance killer. Many sites load fonts from external providers with multiple round trips before text even appears. We self host, reduce font sizes, and ensure text renders immediately. That alone can shave seconds off perceived load time.&lt;/p&gt;
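&lt;p&gt;The usual shape of the fix, assuming a subsetted WOFF2 file you host yourself (the font name and path are placeholders):&lt;/p&gt;

```css
/* Self-hosted, subsetted font file served from your own domain */
@font-face {
  font-family: "Inter";
  src: url("/fonts/inter-latin.woff2") format("woff2");
  font-weight: 400;
  font-style: normal;
  /* Show fallback text immediately, swap when the font arrives */
  font-display: swap;
}
```

&lt;p&gt;Preloading the file in the document head can start the download even earlier.&lt;/p&gt;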

&lt;p&gt;JavaScript is where things really add up. Every piece of it has to be downloaded, parsed, and executed. Most sites are carrying far more than they need. We aggressively remove unused code, split what remains, and defer anything that is not critical. The result is a much lighter, faster experience.&lt;/p&gt;

&lt;p&gt;CSS also plays a role in how quickly a page becomes visible. Instead of forcing the browser to wait for a full stylesheet, we inline only what is needed for the initial view and load the rest afterward. That allows the page to render immediately.&lt;/p&gt;
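&lt;p&gt;One common way to do this (a sketch; file names are placeholders) is to inline the critical rules and load the full stylesheet without blocking render:&lt;/p&gt;

```html
<head>
  <!-- Critical rules for the first viewport, inlined -->
  <style>
    /* above-the-fold styles only */
  </style>
  <!-- Full stylesheet, fetched without blocking first paint -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```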

&lt;p&gt;All of this feeds into the metrics that actually matter. Core Web Vitals are not abstract scores. They directly measure how fast your content appears, how stable the layout is, and how responsive the page feels. Getting those right requires coordination across everything above.&lt;/p&gt;

&lt;p&gt;Even with a well built frontend, server configuration still matters. Compression, caching, and proper headers all contribute to how efficiently your site is delivered. Many sites overlook this completely.&lt;/p&gt;
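&lt;p&gt;As a rough sketch in nginx terms (directives vary by server; treat the durations as starting points, not recommendations):&lt;/p&gt;

```nginx
# Compress text assets on the way out
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;

# Fingerprinted static assets can be cached aggressively
location ~* \.(js|css|woff2|webp|avif)$ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}
```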

&lt;p&gt;Accessibility is another area that is often ignored, even though it overlaps heavily with both usability and SEO. Simple things like proper headings, alt text, and labeled inputs make a measurable difference.&lt;/p&gt;

&lt;p&gt;When you put it all together, the work is not about chasing a score. It is about building a site that loads quickly, behaves predictably, and works for real users in real conditions.&lt;/p&gt;

&lt;p&gt;That is also why most testing is misleading. If you are checking your site on fast WiFi on a new device, you are not seeing what your customers see. The real test is a slower device on a weaker connection.&lt;/p&gt;

&lt;p&gt;We built DeepAudit AI to evaluate sites the way Google does. It runs your site in a real browser and analyzes what actually renders, not just the source code.&lt;/p&gt;

&lt;p&gt;It is free, takes about a minute, and gives you a clear picture of what is holding your site back.&lt;/p&gt;

&lt;p&gt;If you want to see where your site really stands, you can run it here&lt;br&gt;
axiondeepdigital.com/free-seo-audit&lt;/p&gt;

</description>
      <category>freeseotool</category>
      <category>seo</category>
      <category>seotools</category>
      <category>webdev</category>
    </item>
    <item>
      <title>One Button Too Many</title>
      <dc:creator>Joshua Gutierrez</dc:creator>
      <pubDate>Wed, 25 Mar 2026 10:39:13 +0000</pubDate>
      <link>https://dev.to/joshua_gutierrez/one-button-too-many-3jnh</link>
      <guid>https://dev.to/joshua_gutierrez/one-button-too-many-3jnh</guid>
      <description>&lt;p&gt;We had five OAuth login options on our sign in page.&lt;/p&gt;

&lt;p&gt;Google. GitHub. LinkedIn. X. Facebook.&lt;/p&gt;

&lt;p&gt;It looked complete. It felt right. More options, more flexibility.&lt;/p&gt;

&lt;p&gt;Then one night around 11pm, we were staring at logs trying to figure out why our X integration kept connecting the wrong account.&lt;/p&gt;

&lt;p&gt;That is when things got interesting.&lt;/p&gt;




&lt;h2&gt;
  
  
  What We Build
&lt;/h2&gt;

&lt;p&gt;We are building Made4Founders, a business platform for startup founders.&lt;/p&gt;

&lt;p&gt;One of the features is social posting. You connect your X account, write a post in the dashboard, and it publishes for you.&lt;/p&gt;

&lt;p&gt;Simple idea. Should be straightforward.&lt;/p&gt;

&lt;p&gt;We also let users sign in with X.&lt;/p&gt;

&lt;p&gt;At the time, it felt obvious. If we support X, we should support logging in with it too.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Went Wrong
&lt;/h2&gt;

&lt;p&gt;Here is the problem.&lt;/p&gt;

&lt;p&gt;A user signs in using their personal X account. That is the account already active in their browser.&lt;/p&gt;

&lt;p&gt;Later, they go to connect their company account for posting.&lt;/p&gt;

&lt;p&gt;But X does not give you an account picker. It does not ask which account you want. It just grabs whatever session is active.&lt;/p&gt;

&lt;p&gt;So it silently reconnects the personal account.&lt;/p&gt;

&lt;p&gt;The user thinks they connected their business account. They write a company update, hit post, and it goes to their personal feed.&lt;/p&gt;

&lt;p&gt;Not ideal.&lt;/p&gt;




&lt;h2&gt;
  
  
  The “Fixes” That Did Not Work
&lt;/h2&gt;

&lt;p&gt;We tried to patch it.&lt;/p&gt;

&lt;p&gt;We added a hint telling users to switch accounts on X first.&lt;/p&gt;

&lt;p&gt;It did not help.&lt;/p&gt;

&lt;p&gt;Even after switching inside X, OAuth still defaulted to the primary session.&lt;/p&gt;

&lt;p&gt;The only workaround was telling users to:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open an incognito window
&lt;/li&gt;
&lt;li&gt;Log into their business account only
&lt;/li&gt;
&lt;li&gt;Come back and connect
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We actually wrote that into a tooltip.&lt;/p&gt;

&lt;p&gt;Then we looked at it and realized how ridiculous that was.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Question We Should Have Asked Earlier
&lt;/h2&gt;

&lt;p&gt;We stopped debugging and checked our data.&lt;/p&gt;

&lt;p&gt;How many users had actually signed up using “Sign in with X”?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Zero.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Not low. Zero.&lt;/p&gt;

&lt;p&gt;Every user used Google or email. A few used GitHub or LinkedIn.&lt;/p&gt;

&lt;p&gt;Nobody used X.&lt;/p&gt;

&lt;p&gt;So we had been maintaining:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;OAuth flows
&lt;/li&gt;
&lt;li&gt;Token refresh logic
&lt;/li&gt;
&lt;li&gt;API credentials
&lt;/li&gt;
&lt;li&gt;Debugging time
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For a feature nobody used.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Fix Took 10 Minutes
&lt;/h2&gt;

&lt;p&gt;We deleted the button.&lt;/p&gt;

&lt;p&gt;Removed the frontend component. Cleaned up the API call.&lt;/p&gt;

&lt;p&gt;About 20 to 30 lines of code.&lt;/p&gt;

&lt;p&gt;Then we separated our X apps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;One app for login (now disabled)
&lt;/li&gt;
&lt;li&gt;One app for posting
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Same pattern we already used for LinkedIn.&lt;/p&gt;

&lt;p&gt;Immediately, everything worked.&lt;/p&gt;

&lt;p&gt;Four login options. No conflicts. Posting works as expected.&lt;/p&gt;




&lt;h2&gt;
  
  
  What We Actually Learned
&lt;/h2&gt;

&lt;p&gt;This was not really about X.&lt;/p&gt;

&lt;p&gt;It was about a common engineering habit.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Adding things because we can, not because they are needed.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Each OAuth provider feels cheap to add.&lt;/p&gt;

&lt;p&gt;Just another button. Another callback. Another env variable.&lt;/p&gt;

&lt;p&gt;But they are not free.&lt;/p&gt;

&lt;p&gt;Each one adds:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Token flows that can break
&lt;/li&gt;
&lt;li&gt;Credentials that expire
&lt;/li&gt;
&lt;li&gt;APIs that change
&lt;/li&gt;
&lt;li&gt;Edge cases you do not control
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And sometimes, they conflict with other features.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Real Cost of “Just One More Option”
&lt;/h2&gt;

&lt;p&gt;We spent hours debugging X.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Token issues
&lt;/li&gt;
&lt;li&gt;Session conflicts
&lt;/li&gt;
&lt;li&gt;API limitations
&lt;/li&gt;
&lt;li&gt;OAuth redirect loops
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;X has also become harder to work with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Pay per use pricing
&lt;/li&gt;
&lt;li&gt;Limited free tier
&lt;/li&gt;
&lt;li&gt;Hidden requirements in the developer console
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All of that for a login button nobody clicked.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Broader Point
&lt;/h2&gt;

&lt;p&gt;Every feature has a maintenance cost.&lt;/p&gt;

&lt;p&gt;Not just code.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Debugging time
&lt;/li&gt;
&lt;li&gt;Support tickets
&lt;/li&gt;
&lt;li&gt;Mental overhead
&lt;/li&gt;
&lt;li&gt;Edge cases
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;OAuth is especially messy.&lt;/p&gt;

&lt;p&gt;Every provider behaves differently:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Google gives an account picker
&lt;/li&gt;
&lt;li&gt;LinkedIn requires separate apps
&lt;/li&gt;
&lt;li&gt;Facebook has business requirements
&lt;/li&gt;
&lt;li&gt;X uses whatever session is active
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There is no standard. Only variations.&lt;/p&gt;




&lt;h2&gt;
  
  
  Our Rule Now
&lt;/h2&gt;

&lt;p&gt;Before adding a login option, we ask one question:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Would our target user actually sign up this way?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;For founders using a business tool:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Google and email cover almost everything
&lt;/li&gt;
&lt;li&gt;GitHub works for technical users
&lt;/li&gt;
&lt;li&gt;LinkedIn makes sense
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;X and Facebook do not.&lt;/p&gt;

&lt;p&gt;If the answer is no, we do not add it.&lt;/p&gt;




&lt;h2&gt;
  
  
  If You Are Building Something Similar
&lt;/h2&gt;

&lt;p&gt;A few things we learned the hard way.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Do not mix login and integration OAuth
&lt;/h3&gt;

&lt;p&gt;Session conflicts are real and you cannot control them.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Separate your developer apps
&lt;/h3&gt;

&lt;p&gt;Authentication and integrations should not share the same setup.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Test with multiple accounts
&lt;/h3&gt;

&lt;p&gt;Most bugs only appear in real world usage patterns.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Remove what is not earning its place
&lt;/h3&gt;

&lt;p&gt;It feels wrong, but it is often the right move.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Result
&lt;/h2&gt;

&lt;p&gt;Four login buttons.&lt;/p&gt;

&lt;p&gt;No conflicts.&lt;/p&gt;

&lt;p&gt;No late night debugging sessions over something nobody uses.&lt;/p&gt;

&lt;p&gt;Sometimes the best feature you can ship is a deletion.&lt;/p&gt;




&lt;h2&gt;
  
  
  Closing
&lt;/h2&gt;

&lt;p&gt;We are building Made4Founders.&lt;/p&gt;

&lt;p&gt;One dashboard to replace a stack of disconnected tools for startup founders.&lt;/p&gt;

&lt;p&gt;If you are building in this space, you already know how messy it gets.&lt;/p&gt;

&lt;p&gt;We are trying to make it simpler.&lt;/p&gt;

</description>
      <category>devjournal</category>
      <category>socialmedia</category>
      <category>startup</category>
      <category>ux</category>
    </item>
    <item>
      <title>I Audited 61 Business Websites With AI. Here’s What Broke on Almost Every Single One.</title>
      <dc:creator>Joshua Gutierrez</dc:creator>
      <pubDate>Mon, 23 Mar 2026 23:01:13 +0000</pubDate>
      <link>https://dev.to/joshua_gutierrez/-i-audited-61-business-websites-with-ai-heres-what-broke-on-almost-every-single-one-5g54</link>
      <guid>https://dev.to/joshua_gutierrez/-i-audited-61-business-websites-with-ai-heres-what-broke-on-almost-every-single-one-5g54</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9db0hw5i4leuuhrsnmet.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9db0hw5i4leuuhrsnmet.webp" alt=" " width="800" height="533"&gt;&lt;/a&gt;&lt;br&gt;
I run a small engineering shop. We build websites, handle technical SEO, and set up lead capture systems for businesses that want their site to actually do something, not just sit there.&lt;/p&gt;

&lt;p&gt;A few months ago we built an internal tool called DeepAudit AI. It uses a real headless browser to crawl a site and score it across more than 60 checks. Technical SEO. Content. Performance. Accessibility. Security. Structured data.&lt;/p&gt;

&lt;p&gt;Not basic HTML scraping. It renders the page the same way Google does.&lt;/p&gt;

&lt;p&gt;At first we used it to prep for sales calls. Run the audit, find the weak points, have a real conversation instead of guessing. But after running it on 61 business websites back to back, the same problems kept showing up.&lt;/p&gt;

&lt;p&gt;So I wrote them down.&lt;/p&gt;

&lt;p&gt;These were not random personal projects. These were established businesses. Agencies. SaaS companies. Consultants. Real revenue, real clients. The kind of teams that should not be missing the basics.&lt;/p&gt;

&lt;p&gt;Here is what we found.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Numbers
&lt;/h2&gt;

&lt;p&gt;Out of 61 websites:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Average score: 71 out of 100. Barely passing.&lt;/li&gt;
&lt;li&gt;Only 1 in 5 scored 80 or higher.&lt;/li&gt;
&lt;li&gt;Nearly a third were below 70.&lt;/li&gt;
&lt;li&gt;The lowest score was 21. That site had no title tag, no meta description, no viewport tag, no charset, and no H1. It was basically invisible.&lt;/li&gt;
&lt;li&gt;The highest was 88. Good, but still not clean.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The median was 73. So if your site feels average, it is still failing about a third of what actually matters.&lt;/p&gt;




&lt;h2&gt;
  
  
  The 7 Most Common Problems
&lt;/h2&gt;

&lt;p&gt;These are ranked by how often they showed up as top issues across all audits. This is not edge case stuff. This is what most sites are doing wrong.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. HTML Validation Errors (43%)
&lt;/h3&gt;

&lt;p&gt;This was the most common issue by far.&lt;/p&gt;

&lt;p&gt;Unclosed tags. Duplicate IDs. Deprecated attributes. Broken markup. One site had more than 60 errors on a single page.&lt;/p&gt;

&lt;p&gt;Most developers never validate after launch. If it looks fine in Chrome, it ships. But crawlers do not behave like browsers. They read your markup literally. If your HTML is broken, parts of your content might not even exist to them.&lt;/p&gt;




&lt;h3&gt;
  
  
  2. Missing or Broken H1 (40%)
&lt;/h3&gt;

&lt;p&gt;The H1 tells Google what the page is about. It is one of the most basic signals you can send.&lt;/p&gt;

&lt;p&gt;Four out of ten sites either had no H1, had multiple competing H1s, or used something vague like “Welcome.”&lt;/p&gt;

&lt;p&gt;Your homepage should have one clear H1 that explains exactly what you do. Not clever. Not abstract. Clear.&lt;/p&gt;




&lt;h3&gt;
  
  
  3. Accessibility and Form Issues (26%)
&lt;/h3&gt;

&lt;p&gt;This one stood out.&lt;/p&gt;

&lt;p&gt;More than a quarter of sites had form fields with no labels. That means screen readers cannot tell users what to enter.&lt;/p&gt;

&lt;p&gt;Beyond usability, this is a compliance risk.&lt;/p&gt;

&lt;p&gt;Looking deeper, the average accessibility score across all sites was under 50. Missing focus states. Links with no readable text. Poor contrast everywhere.&lt;/p&gt;

&lt;p&gt;If you work with enterprise clients, this matters more than most people realize. Procurement teams are starting to check for this before signing anything.&lt;/p&gt;




&lt;h3&gt;
  
  
  4. No Sitemap.xml (23%)
&lt;/h3&gt;

&lt;p&gt;Almost a quarter of sites had no sitemap.&lt;/p&gt;

&lt;p&gt;A sitemap is just a list of your pages for search engines. It tells them what exists and what has changed.&lt;/p&gt;

&lt;p&gt;Without it, Google has to guess by following links. That means some pages never get indexed.&lt;/p&gt;

&lt;p&gt;If you are on WordPress, this is automatic. If you are on a custom setup, it takes maybe 15 minutes to create.&lt;/p&gt;
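&lt;p&gt;The file itself is small. A minimal hand-written sketch (URLs and dates are placeholders):&lt;/p&gt;

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2026-03-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/services</loc>
    <lastmod>2026-02-14</lastmod>
  </url>
</urlset>
```

&lt;p&gt;Reference it from robots.txt with a &lt;code&gt;Sitemap:&lt;/code&gt; line and submit it in Google Search Console.&lt;/p&gt;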




&lt;h3&gt;
  
  
  5. Missing Meta Description (21%)
&lt;/h3&gt;

&lt;p&gt;One in five sites had no meta description.&lt;/p&gt;

&lt;p&gt;That means Google is pulling random text from the page to display in search results.&lt;/p&gt;

&lt;p&gt;This is your chance to control how your site shows up. It is the short block of text that decides whether someone clicks your link or skips it.&lt;/p&gt;

&lt;p&gt;Keep it under 155 characters. Say what you do. Give a reason to click.&lt;/p&gt;
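&lt;p&gt;A sketch of what that looks like in the head (the business details here are invented):&lt;/p&gt;

```html
<head>
  <title>Commercial HVAC Repair in Denver | Acme Mechanical</title>
  <!-- Under 155 characters: what you do, plus a reason to click -->
  <meta name="description"
        content="24/7 commercial HVAC repair across Denver. Licensed techs, two-hour response, upfront pricing. Get a free same-day quote.">
</head>
```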




&lt;h3&gt;
  
  
  6. No Structured Data (19%)
&lt;/h3&gt;

&lt;p&gt;Structured data tells search engines what your business actually is.&lt;/p&gt;

&lt;p&gt;Name. Services. Location. Reviews. Hours.&lt;/p&gt;

&lt;p&gt;Without it, you miss out on enhanced search results like star ratings and rich listings.&lt;/p&gt;

&lt;p&gt;Almost 1 in 5 sites had none at all. Adding basic schema takes minutes and immediately improves how your site appears in search.&lt;/p&gt;
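&lt;p&gt;A minimal sketch of LocalBusiness schema (every value below is a placeholder):&lt;/p&gt;

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Mechanical",
  "url": "https://yourdomain.com",
  "telephone": "+1-303-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "100 Example St",
    "addressLocality": "Denver",
    "addressRegion": "CO",
    "postalCode": "80202"
  },
  "openingHours": "Mo-Fr 08:00-17:00"
}
</script>
```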




&lt;h3&gt;
  
  
  7. Broken Internal Links (17%)
&lt;/h3&gt;

&lt;p&gt;Nearly 1 in 5 sites had links pointing to pages that no longer exist.&lt;/p&gt;

&lt;p&gt;Every broken link is a dead end. Users hit it. Crawlers hit it. Both stop.&lt;/p&gt;

&lt;p&gt;One of the worst cases we saw was a JavaScript error leaking into the HTML and generating links to something that was not even a real URL. The site was literally creating broken paths on its own.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Pattern Nobody Talks About
&lt;/h2&gt;

&lt;p&gt;These problems do not show up alone. They stack.&lt;/p&gt;

&lt;p&gt;A site missing meta descriptions is usually missing Open Graph tags too. A site with no H1 often has no structured data. A site with broken HTML usually has accessibility issues and bad links on top of it.&lt;/p&gt;

&lt;p&gt;The lowest scoring sites were not failing one thing. They were failing everything.&lt;/p&gt;

&lt;p&gt;Nobody had ever done a real technical audit. The site looked fine, so everyone assumed it was fine.&lt;/p&gt;

&lt;p&gt;The higher scoring sites were different. Not perfect, but clearly maintained. Someone had actually looked under the hood.&lt;/p&gt;




&lt;h2&gt;
  
  
  What This Means for Your Business
&lt;/h2&gt;

&lt;p&gt;If your website has never gone through a real technical audit, not a design review, not a marketing check, but an actual engineering level audit, there is a good chance you are losing traffic and leads without realizing it.&lt;/p&gt;

&lt;p&gt;The fixes themselves are not hard.&lt;/p&gt;

&lt;p&gt;Writing a meta description takes seconds. Adding structured data takes minutes. Fixing an H1 is one line.&lt;/p&gt;

&lt;p&gt;The hard part is knowing what is broken.&lt;/p&gt;

&lt;p&gt;That is why we made DeepAudit AI free. No signup. No call. Just paste your URL and get a full report in about a minute.&lt;/p&gt;

&lt;p&gt;👉 &lt;a href="https://axiondeepdigital.com/free-seo-audit" rel="noopener noreferrer"&gt;https://axiondeepdigital.com/free-seo-audit&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You might be sitting at an 88. You might be at a 21.&lt;/p&gt;

&lt;p&gt;Either way, you should know.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Joshua R. Gutierrez, M.S.&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
CEO, Axion Deep Labs&lt;br&gt;&lt;br&gt;
Founder, Axion Deep Digital&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>seo</category>
      <category>marketing</category>
    </item>
    <item>
      <title>We Built 3 Websites. None of Them Ranked.</title>
      <dc:creator>Joshua Gutierrez</dc:creator>
      <pubDate>Fri, 20 Mar 2026 17:08:53 +0000</pubDate>
      <link>https://dev.to/joshua_gutierrez/we-built-3-websites-none-of-them-ranked-4dd5</link>
      <guid>https://dev.to/joshua_gutierrez/we-built-3-websites-none-of-them-ranked-4dd5</guid>
      <description>&lt;p&gt;We launched three products this year at Axion Deep Labs. Three websites. All built on Next.js, all statically exported, all "optimized for SEO." Except they weren't. Not even close. Here's every real mistake we made and how we found out the hard way.&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Our Entire Website Was Invisible to Google
&lt;/h2&gt;

&lt;p&gt;This one still hurts.&lt;/p&gt;

&lt;p&gt;Next.js has a file called &lt;code&gt;loading.tsx&lt;/code&gt;. Drop it in your app directory and it shows a loading spinner while the page hydrates. Great for UX. Terrible for SEO.&lt;/p&gt;

&lt;p&gt;That file creates a Suspense boundary around ALL page content in the static HTML export. When Google's crawler hit our site, it didn't see our homepage, our service pages, or our blog posts. It saw a loading spinner. On every single page.&lt;/p&gt;

&lt;p&gt;We spent days fixing heading hierarchy, anchor text, meta descriptions. None of it mattered. The crawler was looking at an empty page the whole time. The fix was deleting one file.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lesson:&lt;/strong&gt; Don't trust your source code for SEO. Run the build, open the actual HTML output, and look at what the crawler sees. We wrote a script that counts words and checks for spinners in our built files now.&lt;/p&gt;
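&lt;p&gt;Our script is internal, but the idea is simple enough to sketch in a few lines of Node. The spinner marker (&lt;code&gt;animate-spin&lt;/code&gt;) and the 50-word threshold are assumptions; tune both for your own build output:&lt;/p&gt;

```javascript
// Sketch of a post-build sanity check: does the exported HTML contain
// real text, or just a loading spinner? Marker string and threshold
// are assumptions; adjust for your project.
function auditHtml(html) {
  const text = html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, " ") // drop script/style bodies
    .replace(/<[^>]+>/g, " ")                        // drop remaining tags
    .replace(/\s+/g, " ")
    .trim();
  const wordCount = text ? text.split(" ").length : 0;
  return {
    wordCount,
    hasSpinner: html.includes("animate-spin"),
    ok: wordCount >= 50 && !html.includes("animate-spin"),
  };
}
```

&lt;p&gt;Run it over every file in your build output after &lt;code&gt;next build&lt;/code&gt; and fail CI when &lt;code&gt;ok&lt;/code&gt; is false.&lt;/p&gt;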




&lt;h2&gt;
  
  
  2. Google Analytics Said Our Tag Was Working. It Wasn't.
&lt;/h2&gt;

&lt;p&gt;Google has a tag verification tool. You click "Test my website," get a green checkmark, and move on. We got the green checkmark. Two weeks later: zero data. Not low traffic. Zero.&lt;/p&gt;

&lt;p&gt;The problem was Google Consent Mode v2. We'd implemented a cookie consent banner that defaulted &lt;code&gt;analytics_storage&lt;/code&gt; to "denied" and only switched to "granted" when someone clicked "Accept." Most people don't click cookie banners. They ignore them.&lt;/p&gt;

&lt;p&gt;Google's tag test checks if the script is present in the HTML. It doesn't check if consent mode is blocking data collection. Green checkmark. Zero data. Weeks of analytics, gone.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lesson:&lt;/strong&gt; If you're a US company, you probably don't need opt-in consent for analytics. Default to granted, make the banner an opt-out. And check your Realtime report after deploying, don't just trust the tag test.&lt;/p&gt;
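&lt;p&gt;For reference, the default we switched to looks roughly like this (a sketch, not legal advice; Consent Mode has more keys than these two, and your obligations depend on where your visitors are):&lt;/p&gt;

```javascript
// Consent Mode v2 defaults, set before the gtag.js config call.
// In the browser, globalThis is the window object.
globalThis.dataLayer = globalThis.dataLayer || [];
function gtag() { dataLayer.push(arguments); }

// Default analytics to "granted" and let the banner opt users out,
// instead of defaulting to "denied" and silently collecting nothing.
gtag("consent", "default", {
  ad_storage: "denied",
  analytics_storage: "granted",
});
```

&lt;p&gt;Whatever default you pick, verify it in the GA4 Realtime report after deploying.&lt;/p&gt;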




&lt;h2&gt;
  
  
  3. We Had Five H1 Tags on One Page
&lt;/h2&gt;

&lt;p&gt;Our homepage had a server-rendered article with an H1. A client component with another H1. Service cards with H1s inside them. Google doesn't know which one matters when there are five, so it kind of shrugs.&lt;/p&gt;

&lt;p&gt;We also had H4 tags in the footer that appeared in the DOM before the H1. Heading hierarchy was completely broken. We didn't notice because visually it looked fine.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lesson:&lt;/strong&gt; One H1 per page. Everything else is H2 or lower. The H1 has to be the first heading in the DOM, not just the first one visible on screen.&lt;/p&gt;




&lt;h2&gt;
  
  
  4. Every Link on Our Site Said "Learn More"
&lt;/h2&gt;

&lt;p&gt;Twelve service cards. Every single one had a "Learn more" link. Identical anchor text, twelve times, pointing to twelve different pages.&lt;/p&gt;

&lt;p&gt;Google uses anchor text to understand what the linked page is about. When every anchor says "Learn more," you're telling Google twelve different pages are about nothing specific.&lt;/p&gt;

&lt;p&gt;We replaced them with descriptive anchors. "Explore our SEO services." "See our web development process." "Learn how lead capture works." Took 20 minutes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lesson:&lt;/strong&gt; Every link on your page should have unique, descriptive anchor text. If you can't tell what the target page is about from the anchor text alone, rewrite it.&lt;/p&gt;
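&lt;p&gt;Before and after, with a hypothetical service page:&lt;/p&gt;

```html
<!-- Before: twelve of these, identical to a crawler -->
<a href="/services/seo">Learn more</a>

<!-- After: the anchor alone says what the target page is about -->
<a href="/services/seo">Explore our SEO services</a>
```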




&lt;h2&gt;
  
  
  5. Client Components Killed Our Server-Rendered Content
&lt;/h2&gt;

&lt;p&gt;Next.js server components render to HTML on the server. Client components need JavaScript to render, which means crawlers might not see them.&lt;/p&gt;

&lt;p&gt;We had components using &lt;code&gt;usePathname()&lt;/code&gt; from Next.js. That single hook forces the entire component to become a client component. Our content shell and footer were client components for no good reason. Just to highlight the current nav item.&lt;/p&gt;

&lt;p&gt;We refactored them to server components, moved the pathname logic to a tiny client child, and suddenly our static HTML had actual content instead of empty divs waiting for hydration.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lesson:&lt;/strong&gt; If you're using Next.js, grep your components for &lt;code&gt;"use client"&lt;/code&gt; and ask if each one truly needs to be client-side. One misplaced hook can make an entire page invisible to crawlers.&lt;/p&gt;




&lt;h2&gt;
  
  
  6. Our Privacy Page Had the Wrong Analytics ID
&lt;/h2&gt;

&lt;p&gt;We switched Google Analytics properties and updated the tag in our layout. But our privacy policy still had the old measurement ID hardcoded in the text. Anyone trying to verify what we track would have seen a different ID than what was actually running.&lt;/p&gt;

&lt;p&gt;Not an SEO issue technically. But it's a trust issue. And trust issues become bounce rate issues.&lt;/p&gt;




&lt;h2&gt;
  
  
  7. Our Sitemap Had Wrong Dates
&lt;/h2&gt;

&lt;p&gt;Our &lt;code&gt;sitemap.xml&lt;/code&gt; was generated at build time using &lt;code&gt;new Date()&lt;/code&gt; in JavaScript. The build ran on a CI server with a slightly off clock. Our sitemap was telling Google that pages were last modified at times that didn't match reality. We hardcoded the dates.&lt;/p&gt;




&lt;h2&gt;
  
  
  8. 78 Images With Zero Metadata
&lt;/h2&gt;

&lt;p&gt;Every image was optimized for file size and format. WebP, compressed, proper dimensions. But none had IPTC or XMP metadata. No titles, no descriptions, no keywords, no copyright.&lt;/p&gt;

&lt;p&gt;Google Image Search uses this metadata for indexing. Social platforms use it as fallback when sharing. It's also how you prove ownership if someone takes your images.&lt;/p&gt;

&lt;p&gt;We wrote &lt;code&gt;exiftool&lt;/code&gt; scripts to tag every image with contextual titles, descriptions, keywords, author, copyright, and source URLs. 78 images. Took an hour.&lt;/p&gt;




&lt;h2&gt;
  
  
  9. Our Old Brand Name Was Still in Five Places
&lt;/h2&gt;

&lt;p&gt;We built from a template and rebranded everything. Or so we thought. Five places still had the old brand name. Two had the old email. The hero section still had template copy. All buried in secondary pages, JSON-LD schema, and error states that we never checked.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lesson:&lt;/strong&gt; Find and replace across your entire codebase after a rebrand. Then do it again.&lt;/p&gt;




&lt;h2&gt;
  
  
  The 4 Things That Actually Matter
&lt;/h2&gt;

&lt;p&gt;After months of fixing SEO across three sites, it comes down to this:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Crawlability&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Can Google actually see your content? Suspense boundaries, client rendering, and JavaScript dependencies can make your pages invisible.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Rendered Content&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
What does the built HTML look like? Not your source code. The actual output. Check it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Accuracy&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Is your analytics collecting? Are your sitemaps correct? Are your meta tags matching reality? If your data is wrong, every decision you make from it is wrong.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Structure&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
One H1. Descriptive anchors. Proper heading hierarchy. Image metadata. These are boring. They also compound over every page on your site.&lt;/p&gt;

&lt;p&gt;Get these right first. Then worry about keyword strategy and backlinks.&lt;/p&gt;

&lt;p&gt;We got a 98 Lighthouse score on one of our sites and it generated zero leads. Performance is necessary but not sufficient. The basics have to work.&lt;/p&gt;




&lt;p&gt;Most of these issues are invisible unless you know where to look.&lt;br&gt;&lt;br&gt;
We built a free tool that checks 60+ of them in seconds.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Try DeepAudit AI Free →&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>nextjs</category>
      <category>seo</category>
      <category>javascript</category>
    </item>
    <item>
      <title>We Built a Free AI SEO Audit Tool — Here's What We Learned Scanning 500+ Sites</title>
      <dc:creator>Joshua Gutierrez</dc:creator>
      <pubDate>Mon, 16 Mar 2026 21:12:54 +0000</pubDate>
      <link>https://dev.to/joshua_gutierrez/we-built-a-free-ai-seo-audit-tool-heres-what-we-learned-scanning-500-sites-58ag</link>
      <guid>https://dev.to/joshua_gutierrez/we-built-a-free-ai-seo-audit-tool-heres-what-we-learned-scanning-500-sites-58ag</guid>
      <description>&lt;p&gt;We're Axion Deep Digital, a web development and SEO agency. Six months ago we built &lt;a href="https://www.axiondeepdigital.com/free-seo-audit" rel="noopener noreferrer"&gt;DeepAudit AI&lt;/a&gt; — a free tool that renders pages in a real Chromium browser, runs 60+ SEO checks, and uses AI to rewrite your meta tags.&lt;/p&gt;

&lt;p&gt;Since launch, hundreds of sites have been scanned through it. Here's what we found — and what we built to fix it.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why We Built It
&lt;/h2&gt;

&lt;p&gt;Every free SEO tool we tried did the same thing: parse raw HTML, give a vague score, and upsell a $99/month plan.&lt;/p&gt;

&lt;p&gt;The problems:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;They don't render JavaScript, so they miss content on React/Next.js/Vue sites&lt;/li&gt;
&lt;li&gt;They check 10-15 surface-level things and call it an "audit"&lt;/li&gt;
&lt;li&gt;They tell you &lt;em&gt;what's wrong&lt;/em&gt; but never &lt;em&gt;how to fix it&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We wanted a tool that works the way Google actually crawls — real browser rendering, real Lighthouse scores, and actual fix snippets you can copy-paste.&lt;/p&gt;




&lt;h2&gt;
  
  
  What We Found Scanning 500+ Sites
&lt;/h2&gt;

&lt;h3&gt;
  
  
  78% have broken or missing meta descriptions
&lt;/h3&gt;

&lt;p&gt;The most common issue. Either the meta description is missing entirely, or it's a generic CMS default like "Just another WordPress site." AI rewrites these in seconds.&lt;/p&gt;
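
&lt;p&gt;A trivial pre-deploy guard catches most of these. Here's a sketch against a Next.js-style metadata object; the placeholder patterns and the 160-character cap are our own rules of thumb, not gospel:&lt;/p&gt;

```javascript
// Sketch: sanity-check a metadata object before it ships.
// The placeholder list and length cap are illustrative rules of thumb.
const PLACEHOLDERS = [/just another wordpress site/i, /lorem ipsum/i];

function checkDescription(metadata) {
  const d = (metadata.description || '').trim();
  if (d.length === 0) return 'missing';
  if (PLACEHOLDERS.some(rx => rx.test(d))) return 'placeholder';
  if (d.length > 160) return 'too long';
  return 'ok';
}
```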

&lt;h3&gt;
  
  
  65% fail basic accessibility checks
&lt;/h3&gt;

&lt;p&gt;Missing alt text, no skip navigation, poor color contrast, unlabeled form inputs. These aren't just compliance issues — alt text feeds image search, and accessibility problems overlap with the page experience signals Google does use.&lt;/p&gt;
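
&lt;p&gt;Contrast, at least, is pure math. This is the relative-luminance and contrast-ratio formula from the WCAG spec; AA asks for 4.5:1 on body text:&lt;/p&gt;

```javascript
// WCAG 2.x contrast ratio between two sRGB colors, per the spec's formula.
function luminance([r, g, b]) {
  const lin = c => {
    const s = c / 255;
    return s > 0.03928 ? Math.pow((s + 0.055) / 1.055, 2.4) : s / 12.92;
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}
```

&lt;p&gt;White on black comes out at 21:1, the maximum; light gray on white is where most sites quietly fail.&lt;/p&gt;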

&lt;h3&gt;
  
  
  71% have no structured data
&lt;/h3&gt;

&lt;p&gt;No Schema.org markup at all. No FAQ schema, no LocalBusiness, no breadcrumbs. This is free real estate in search results that almost nobody claims.&lt;/p&gt;
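
&lt;p&gt;Claiming it takes minutes. A minimal &lt;code&gt;LocalBusiness&lt;/code&gt; sketch where every value is a placeholder:&lt;/p&gt;

```javascript
// Sketch: minimal LocalBusiness JSON-LD. All values are placeholders.
const localBusiness = {
  '@context': 'https://schema.org',
  '@type': 'LocalBusiness',
  name: 'Example Agency',
  url: 'https://www.example.com',
  telephone: '+1-555-0100',
  address: {
    '@type': 'PostalAddress',
    addressLocality: 'Austin',
    addressRegion: 'TX',
  },
};

// Serialize and place in the page head inside a script tag
// with type="application/ld+json":
const jsonLd = JSON.stringify(localBusiness);
```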

&lt;h3&gt;
  
  
  Average Lighthouse performance score: 44
&lt;/h3&gt;

&lt;p&gt;The average small business site scores below 50 on Lighthouse performance. Unoptimized images, no lazy loading, render-blocking scripts. Most don't even know their score.&lt;/p&gt;

&lt;h3&gt;
  
  
  83% missing at least one security header
&lt;/h3&gt;

&lt;p&gt;HSTS, CSP, X-Frame-Options — most sites ship with none of these. They cost a few lines of config, and shipping without them leaves sites open to clickjacking, script injection, and protocol downgrade attacks.&lt;/p&gt;
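
&lt;p&gt;On a Next.js site these are a few lines of config. A baseline sketch; the CSP value is deliberately strict and will need tuning per site:&lt;/p&gt;

```javascript
// Sketch: a baseline security-header set for next.config.js.
// The CSP below is a strict starting point, not production-ready everywhere.
const securityHeaders = [
  { key: 'Strict-Transport-Security', value: 'max-age=63072000; includeSubDomains; preload' },
  { key: 'Content-Security-Policy', value: "default-src 'self'" },
  { key: 'X-Frame-Options', value: 'DENY' },
  { key: 'X-Content-Type-Options', value: 'nosniff' },
  { key: 'Referrer-Policy', value: 'strict-origin-when-cross-origin' },
];

// In next.config.js:
// module.exports = {
//   async headers() {
//     return [{ source: '/:path*', headers: securityHeaders }];
//   },
// };
```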

&lt;h3&gt;
  
  
  Only 12% have proper internal linking
&lt;/h3&gt;

&lt;p&gt;Orphan pages, broken links, no logical hierarchy. Crawlers can't find what isn't linked.&lt;/p&gt;
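
&lt;p&gt;Orphan detection is simple set arithmetic once a crawl gives you a link map. A sketch, where the page list and link shape are assumed inputs:&lt;/p&gt;

```javascript
// Sketch: find orphan pages from a crawled link map.
// `pages` is every known URL path; `links` is every internal link found.
function findOrphans(pages, links) {
  const linked = new Set(links.map(l => l.to));
  // The homepage is reachable by definition; everything else needs an inbound link.
  return pages.filter(p => (p === '/' ? false : !linked.has(p)));
}
```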




&lt;h2&gt;
  
  
  The Tech Stack
&lt;/h2&gt;

&lt;p&gt;For anyone curious about how we built it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Frontend:&lt;/strong&gt; Next.js 16, Tailwind CSS, Framer Motion&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Audit Engine:&lt;/strong&gt; AWS Lambda with Puppeteer (real Chromium rendering)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI Layer:&lt;/strong&gt; DeepSeek for meta tag rewrites and action plan generation&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Performance Data:&lt;/strong&gt; Google PageSpeed Insights API&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Checks:&lt;/strong&gt; Mozilla Observatory methodology&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Report Generation:&lt;/strong&gt; Puppeteer PDF export, stored in S3 with presigned URLs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The Lambda function spins up a headless Chromium instance, navigates to the target URL, waits for JavaScript to execute, then extracts everything — DOM structure, meta tags, headings, images, links, structured data, security headers, and more.&lt;/p&gt;

&lt;p&gt;All 60+ checks are deterministic. AI only comes in at the end to analyze results and generate the rewrite suggestions and action plan.&lt;/p&gt;
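
&lt;p&gt;To show the shape of that (illustrative only, not the tool's actual code): each check is a pure function over the extracted page data, so the same input always produces the same verdict:&lt;/p&gt;

```javascript
// Sketch of a deterministic check registry; ids and page fields are
// illustrative, not DeepAudit's actual implementation.
const checks = [
  { id: 'title-present', run: p => Boolean(p.title) },
  { id: 'meta-description-present', run: p => Boolean(p.description) },
  { id: 'single-h1', run: p => p.h1Count === 1 },
];

function runChecks(page) {
  return checks.map(c => ({ id: c.id, pass: c.run(page) }));
}
```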




&lt;h2&gt;
  
  
  The 7 Categories We Score
&lt;/h2&gt;

&lt;p&gt;Each site gets a weighted score across:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Technical SEO&lt;/strong&gt; — HTTPS, canonical URLs, robots.txt, sitemap, structured data&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Content &amp;amp; Keywords&lt;/strong&gt; — word count, keyword density, heading hierarchy, readability&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Performance&lt;/strong&gt; — Lighthouse scores, Core Web Vitals, image optimization&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;On-Page SEO&lt;/strong&gt; — title tag, meta description, alt text, internal/external links, Open Graph&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Accessibility&lt;/strong&gt; — contrast, form labels, ARIA landmarks, focus indicators&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security&lt;/strong&gt; — HSTS, CSP, X-Frame-Options, referrer policy&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Site Health&lt;/strong&gt; — multi-page crawl, broken links, redirect chains&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Final score: 0-100 with a letter grade.&lt;/p&gt;
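
&lt;p&gt;Mechanically that's a weighted average. The weights below are illustrative and sum to 1.0; they are not the tool's real ones:&lt;/p&gt;

```javascript
// Sketch: fold seven category scores into one 0-100 grade.
// Weights are illustrative, not DeepAudit's actual weighting.
const weights = {
  technical: 0.20, content: 0.15, performance: 0.20,
  onPage: 0.15, accessibility: 0.10, security: 0.10, health: 0.10,
};

function overallScore(scores) {
  const total = Object.entries(weights)
    .reduce((sum, [k, w]) => sum + w * (scores[k] || 0), 0);
  const grade = total >= 90 ? 'A' : total >= 80 ? 'B' : total >= 70 ? 'C' : total >= 60 ? 'D' : 'F';
  return { total: Math.round(total), grade };
}
```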




&lt;h2&gt;
  
  
  What Makes It Different
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Real browser rendering.&lt;/strong&gt; Most tools fetch raw HTML with a simple HTTP request. We spin up Chromium. If your site uses React, Next.js, or any client-side rendering, we actually see your content.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Copy-paste fixes.&lt;/strong&gt; Every failing check includes the exact HTML, CSS, or server config snippet to fix it. Not "improve your meta description" — the actual rewritten tag you can paste into your &lt;code&gt;&amp;lt;head&amp;gt;&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AI-generated action plan.&lt;/strong&gt; After all checks run, AI reads every result, your Lighthouse scores, and your keyword data, then generates a prioritized list ranked by revenue impact.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Competitor comparison.&lt;/strong&gt; Run any two sites side-by-side across all 60+ checks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;No signup.&lt;/strong&gt; 3 audits per day, no account, no credit card, downloadable PDF report.&lt;/p&gt;




&lt;h2&gt;
  
  
  Try It
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://www.axiondeepdigital.com/free-seo-audit" rel="noopener noreferrer"&gt;DeepAudit AI — Free AI SEO Audit Tool&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Paste any URL and get a full report in under 60 seconds. If you find bugs or have feature requests, drop a comment — we're actively building.&lt;/p&gt;

</description>
      <category>seo</category>
      <category>webdev</category>
      <category>ai</category>
      <category>productivity</category>
    </item>
  </channel>
</rss>
