<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: john</title>
    <description>The latest articles on DEV Community by john (@john123ab).</description>
    <link>https://dev.to/john123ab</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3537311%2Fd20c91f1-d867-40fb-9910-f13b574ed081.png</url>
      <title>DEV Community: john</title>
      <link>https://dev.to/john123ab</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/john123ab"/>
    <language>en</language>
    <item>
      <title>Stop Guessing, Start Measuring: A Python Script to Automate Your SEO Data Collection</title>
      <dc:creator>john</dc:creator>
      <pubDate>Mon, 29 Sep 2025 12:26:54 +0000</pubDate>
      <link>https://dev.to/john123ab/stop-guessing-start-measuring-a-python-script-to-automate-your-seo-data-collection-5ena</link>
      <guid>https://dev.to/john123ab/stop-guessing-start-measuring-a-python-script-to-automate-your-seo-data-collection-5ena</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction (The Hook)&lt;/strong&gt;&lt;br&gt;
Hey developers! 👋 Ever feel like SEO is a dark art of best guesses and confusing metrics? What if you could use your coding skills to bring some data-driven clarity to the process?&lt;/p&gt;

&lt;p&gt;As a developer who also works in SEO, I often found myself manually checking rankings and technical stats—a huge time sink. So, I built a simple Python script to automate the boring parts and free me up for the actual analysis.&lt;/p&gt;

&lt;p&gt;In this tutorial, I’ll show you how to build a script that automatically pulls critical SEO data, giving you a custom dashboard of your site's health. Let's bridge the gap between code and SEO.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What We're Building: The SEO Data Fetcher&lt;/strong&gt;&lt;br&gt;
Our script will use free APIs and libraries to fetch:&lt;/p&gt;

&lt;p&gt;Google Search Console-style data (approximate keyword rankings via SerpApi).&lt;/p&gt;

&lt;p&gt;Core Web Vitals (The performance metrics Google actually uses for ranking).&lt;/p&gt;

&lt;p&gt;Basic Backlink Count (To monitor your site's authority).&lt;/p&gt;

&lt;p&gt;Prerequisites:&lt;/p&gt;

&lt;p&gt;Basic knowledge of Python.&lt;/p&gt;

&lt;p&gt;A free SerpApi account (they have a generous free tier).&lt;/p&gt;

&lt;p&gt;The following Python packages: requests, python-dotenv&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Code: Let's Get Technical&lt;/strong&gt;&lt;br&gt;
First, set up your environment. Create a .env file to keep your API keys safe:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;# .env&lt;br&gt;
SERP_API_KEY=your_serpapi_key_here&lt;br&gt;
CRUX_API_KEY=your_crux_api_key_here&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now, let's dive into the script (seo_dashboard.py):&lt;br&gt;
import requests&lt;br&gt;
import os&lt;br&gt;
from dotenv import load_dotenv&lt;/p&gt;

&lt;p&gt;load_dotenv()&lt;/p&gt;

&lt;p&gt;class SEODashboard:&lt;br&gt;
    def __init__(self, domain):&lt;br&gt;
        self.domain = domain&lt;br&gt;
        self.serpapi_key = os.getenv('SERP_API_KEY')&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def get_ranking_data(self, query):
    """Fetches approximate ranking data for a keyword using SerpApi."""
    print(f"🔍 Checking ranking for: '{query}'")
    params = {
        "engine": "google",
        "q": query,
        "api_key": self.serpapi_key
    }

    try:
        response = requests.get('https://serpapi.com/search', params=params)
        results = response.json()
        organic_results = results.get('organic_results', [])

        for index, result in enumerate(organic_results):
            if self.domain in result.get('link', ''):
                print(f"✅ Rank #{index + 1} for '{query}'")
                return index + 1
        print(f"❌ Not in top 100 for '{query}'")
        return None

    except Exception as e:
        print(f"⚠️  SerpApi Error: {e}")
        return None

def get_core_web_vitals(self, url):
    """Uses the CrUX API to get real-world Core Web Vitals data."""
    print(f"📊 Checking Core Web Vitals for: {url}")
    API_ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={os.getenv('CRUX_API_KEY', '')}"
    payload = {"url": url}

    try:
        response = requests.post(API_ENDPOINT, json=payload)
        data = response.json()

        if 'record' in data:
            metrics = data['record']['metrics']
            print("--- Core Web Vitals Results ---")
            print(f"LCP (Loading): {metrics.get('largest_contentful_paint', {}).get('percentile', {}).get('p75', 'N/A')}ms")
            print(f"FID (Interactivity): {metrics.get('first_input_delay', {}).get('percentile', {}).get('p75', 'N/A')}ms")
            print(f"CLS (Visual Stability): {metrics.get('cumulative_layout_shift', {}).get('percentile', {}).get('p75', 'N/A')}")
        else:
            print("❌ No CrUX data available for this URL.")

    except Exception as e:
        print(f"⚠️  CrUX API Error: {e}")

def get_backlink_count(self):
    """Gets an approximate backlink count using the Moz API (free tier)."""
    print(f"🔗 Checking backlink count for: {self.domain}")
    # Note: This is a placeholder. For a real implementation, you would use:
    # - Moz's API (requires free account)
    # - or the Ahrefs API (paid)
    # For this demo, we'll simulate the concept.
    print("💡 Pro Tip: Integrate with Moz's Links API for actual backlink data.")
    return "Simulated - Integrate with Moz/Abrefs API"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;if __name__ == "__main__":&lt;br&gt;
    # Initialize with your domain&lt;br&gt;
    my_domain = "theseolighthouse.blogspot.com"&lt;br&gt;
    dashboard = SEODashboard(my_domain)&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Check rankings for target keywords
keywords = ["SEO tips", "Python SEO automation", "technical SEO"]
for keyword in keywords:
    dashboard.get_ranking_data(keyword)

# Check Core Web Vitals
dashboard.get_core_web_vitals(f"https://{my_domain}")

# Check backlinks
dashboard.get_backlink_count()`
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Running the Script &amp;amp; Expected Output&lt;/strong&gt;&lt;br&gt;
Save the script and run it from your terminal:&lt;br&gt;
&lt;code&gt;python seo_dashboard.py&lt;/code&gt;&lt;br&gt;
You should see output like this:&lt;br&gt;
&lt;code&gt;🔍 Checking ranking for: 'SEO tips'&lt;br&gt;
✅ Rank #14 for 'SEO tips'&lt;br&gt;
🔍 Checking ranking for: 'Python SEO automation'&lt;br&gt;
❌ Not in top 100 for 'Python SEO automation'&lt;br&gt;
📊 Checking Core Web Vitals for: https://theseolighthouse.blogspot.com&lt;br&gt;
--- Core Web Vitals Results ---&lt;br&gt;
LCP (Loading): 2200ms&lt;br&gt;
FID (Interactivity): 105ms&lt;br&gt;
CLS (Visual Stability): 0.12&lt;br&gt;
🔗 Checking backlink count for: theseolighthouse.blogspot.com&lt;br&gt;
💡 Pro Tip: Integrate with Moz's Links API for actual backlink data.&lt;/code&gt;&lt;/p&gt;
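&lt;p&gt;Those sample vitals can be judged against Google's published p75 thresholds (2500ms for LCP, 100ms for FID, 0.1 for CLS, with a second bound marking "poor"). A small helper to classify a p75 value, a sketch that is not part of the script above:&lt;/p&gt;

```python
# Google's published Core Web Vitals thresholds at p75.
# Each entry is (good upper bound, needs-improvement upper bound);
# anything above the second bound is "poor".
THRESHOLDS = {
    "lcp_ms": (2500, 4000),
    "fid_ms": (100, 300),
    "cls": (0.1, 0.25),
}

def classify(metric, p75):
    """Classify a p75 value as good / needs improvement / poor."""
    good, needs_improvement = THRESHOLDS[metric]
    if p75 > needs_improvement:
        return "poor"
    if p75 > good:
        return "needs improvement"
    return "good"

print(classify("lcp_ms", 2200))  # good
print(classify("fid_ms", 105))   # needs improvement
print(classify("cls", 0.12))     # needs improvement
```

&lt;p&gt;By those bounds, the sample LCP is fine, while FID and CLS land in "needs improvement".&lt;/p&gt;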

&lt;p&gt;&lt;strong&gt;Taking It Further: Next Steps&lt;/strong&gt;&lt;br&gt;
This is just the beginning! Here's how you can extend this script:&lt;/p&gt;

&lt;p&gt;Add scheduling with cron to run daily and log results to a CSV.&lt;/p&gt;
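&lt;p&gt;For the CSV half of that idea, the standard library is enough. A minimal sketch (the filename and columns are illustrative; the rank value would come from get_ranking_data):&lt;/p&gt;

```python
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("rankings.csv")  # illustrative filename

def log_rank(keyword, rank):
    """Append one dated rank measurement to the CSV log."""
    write_header = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["date", "keyword", "rank"])
        writer.writerow([date.today().isoformat(), keyword, rank])

log_rank("SEO tips", 14)
log_rank("technical SEO", None)  # not ranked: logged as an empty cell
```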

&lt;p&gt;Integrate with Google Search Console API for your actual click/impression data.&lt;/p&gt;
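&lt;p&gt;The Search Console API's searchanalytics.query method takes a JSON body with a date range, dimensions, and a row limit. A sketch of building that body (authentication via google-api-python-client is omitted; the date range and dimensions here are illustrative):&lt;/p&gt;

```python
from datetime import date, timedelta

def build_gsc_query(days=28):
    """Build a searchanalytics.query request body covering the
    last `days` days, grouped by search query."""
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query"],
        "rowLimit": 100,
    }

body = build_gsc_query()
# pass `body` to searchanalytics().query(siteUrl=..., body=body)
# on an authorized Search Console service object
```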

&lt;p&gt;Build a simple Flask dashboard to visualize the trends over time.&lt;/p&gt;

&lt;p&gt;Add a broken link checker using the requests library to scan your site.&lt;/p&gt;
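&lt;p&gt;The link checker stays small if the HTTP call is injectable, which also makes it easy to test without touching the network. A sketch (the example URLs are made up):&lt;/p&gt;

```python
import requests

def find_broken_links(urls, fetch=None):
    """Return the URLs that respond with a 4xx/5xx status or fail entirely."""
    if fetch is None:
        # HEAD keeps the check cheap; some servers only answer GET properly.
        fetch = lambda u: requests.head(u, allow_redirects=True, timeout=10).status_code
    broken = []
    for url in urls:
        try:
            if fetch(url) >= 400:
                broken.append(url)
        except requests.RequestException:
            broken.append(url)
    return broken

# Example with a stubbed fetcher (no network involved):
statuses = {"https://example.com/ok": 200, "https://example.com/gone": 404}
print(find_broken_links(statuses, fetch=statuses.get))  # ['https://example.com/gone']
```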

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
By combining development skills with SEO, you can create powerful tools that provide real insights, not just guesses. This script is a starting point—something I've built upon and written about extensively on my own blog, &lt;a href="https://theseolighthouse.blogspot.com/" rel="noopener noreferrer"&gt;The SEO Lighthouse&lt;/a&gt;, where I explore the intersection of code and search marketing.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
