<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mouaz</title>
    <description>The latest articles on DEV Community by Mouaz (@movoid).</description>
    <link>https://dev.to/movoid</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1014964%2F6551f8c3-2f15-4c3e-8e3c-12bb1f8748b1.jpeg</url>
      <title>DEV Community: Mouaz</title>
      <link>https://dev.to/movoid</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/movoid"/>
    <language>en</language>
    <item>
      <title>How to Block AI Web Scrapers on Your WordPress Site with robots.txt (2025 Update)</title>
      <dc:creator>Mouaz</dc:creator>
      <pubDate>Thu, 10 Jul 2025 12:27:27 +0000</pubDate>
      <link>https://dev.to/movoid/how-to-block-ai-web-scrapers-on-your-wordpress-site-with-robotstxt-2025-update-41d</link>
      <guid>https://dev.to/movoid/how-to-block-ai-web-scrapers-on-your-wordpress-site-with-robotstxt-2025-update-41d</guid>
      <description>&lt;p&gt;AI web scrapers are increasingly crawling WordPress sites to gather content for training large language models and powering AI search results. As a site owner, you can use your robots.txt file to help protect your original content from being scraped and used without your permission.&lt;/p&gt;

&lt;p&gt;This guide explains:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What the robots.txt file does&lt;/li&gt;
&lt;li&gt;Why blocking AI bots matters&lt;/li&gt;
&lt;li&gt;How to add up-to-date blocking rules&lt;/li&gt;
&lt;li&gt;A current table of the main AI bots and the exact syntax you need&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What Is robots.txt?
&lt;/h3&gt;

&lt;p&gt;The robots.txt file is a simple text file in your website’s root directory (e.g., &lt;a href="https://yourdomain.com/robots.txt" rel="noopener noreferrer"&gt;https://yourdomain.com/robots.txt&lt;/a&gt;). It tells web crawlers which parts of your site they can or cannot access. Most reputable AI and search engine bots will respect these rules.&lt;/p&gt;
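&lt;p&gt;For example, a minimal robots.txt pairs a &lt;code&gt;User-agent&lt;/code&gt; line with one or more &lt;code&gt;Disallow&lt;/code&gt; (or &lt;code&gt;Allow&lt;/code&gt;) rules. The paths below are only placeholders:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Applies to every crawler
User-agent: *
# Block one directory, but allow one file inside it
Disallow: /private/
Allow: /private/public-page.html
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;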

&lt;h3&gt;
  
  
  Why Block AI Bots?
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Protect your original content from being used to train AI models without your consent&lt;/li&gt;
&lt;li&gt;Maintain control over your website’s data&lt;/li&gt;
&lt;li&gt;Limit how your content appears in AI-powered search and chatbots&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  How to Edit robots.txt in WordPress
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Access your website’s root directory (via FTP, file manager, or a WordPress SEO plugin).&lt;/li&gt;
&lt;li&gt;Open or create the robots.txt file.&lt;/li&gt;
&lt;li&gt;Add the blocking rules from the table below.&lt;/li&gt;
&lt;li&gt;Save and upload the file.&lt;/li&gt;
&lt;/ol&gt;
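&lt;p&gt;Note that when no physical robots.txt file exists, WordPress serves a virtual one roughly like the snippet below. A real file in the root directory replaces the virtual one entirely, so keep these default rules alongside any blocking rules you add:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;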

&lt;p&gt;&lt;strong&gt;Table: AI Bots and robots.txt Syntax (2025)&lt;/strong&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;AI Bot / Model&lt;/th&gt;
&lt;th&gt;Syntax to Block (add to robots.txt)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;GPTBot (OpenAI)&lt;/td&gt;
&lt;td&gt;User-agent: GPTBot&lt;br&gt;Disallow: /&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Google-Extended&lt;/td&gt;
&lt;td&gt;User-agent: Google-Extended&lt;br&gt;Disallow: /&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ClaudeBot (Anthropic)&lt;/td&gt;
&lt;td&gt;User-agent: ClaudeBot&lt;br&gt;Disallow: /&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;PerplexityBot&lt;/td&gt;
&lt;td&gt;User-agent: PerplexityBot&lt;br&gt;Disallow: /&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;CCBot (Common Crawl)&lt;/td&gt;
&lt;td&gt;User-agent: CCBot&lt;br&gt;Disallow: /&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Bytespider (ByteDance)&lt;/td&gt;
&lt;td&gt;User-agent: Bytespider&lt;br&gt;Disallow: /&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Amazonbot&lt;/td&gt;
&lt;td&gt;User-agent: Amazonbot&lt;br&gt;Disallow: /&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Applebot-Extended (Apple AI training)&lt;/td&gt;
&lt;td&gt;User-agent: Applebot-Extended&lt;br&gt;Disallow: /&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;em&gt;Copy and paste the lines for each bot you want to block.&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Example robots.txt
&lt;/h3&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Important Notes
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;robots.txt is not foolproof&lt;/strong&gt;. Only bots that respect the protocol will comply. &lt;strong&gt;Some scrapers may ignore it&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Update regularly&lt;/strong&gt;. New bots appear all the time—keep your file current.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Don’t block search engines&lt;/strong&gt; (like Googlebot or Bingbot) unless you want your site removed from search results.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
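&lt;p&gt;Rules are grouped per user agent, so blocking AI bots does not affect search crawlers. In a file like the following sketch (combining a couple of the rules above with the WordPress defaults), Googlebot and Bingbot simply fall under the &lt;code&gt;*&lt;/code&gt; group and keep crawling normally:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# AI bots: blocked everywhere
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else, including Googlebot and Bingbot: normal access
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;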

</description>
      <category>seo</category>
      <category>wordpress</category>
      <category>webscraping</category>
      <category>privacy</category>
    </item>
    <item>
      <title>Next.js Routing in 2025</title>
      <dc:creator>Mouaz</dc:creator>
      <pubDate>Tue, 25 Mar 2025 20:20:23 +0000</pubDate>
      <link>https://dev.to/movoid/nextjs-routing-in-2025-45mj</link>
      <guid>https://dev.to/movoid/nextjs-routing-in-2025-45mj</guid>
      <description>&lt;h2&gt;
  
  
  Overview of Next.js Routing
&lt;/h2&gt;

&lt;p&gt;Next.js provides two primary routing systems: the &lt;strong&gt;Pages Router&lt;/strong&gt; and the &lt;strong&gt;App Router&lt;/strong&gt;. Understanding these systems is crucial for building efficient and scalable applications.&lt;/p&gt;

&lt;h3&gt;
  
  
  Pages Router
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Introduction&lt;/strong&gt;: The Pages Router is the original routing system in Next.js and was the only option before version 13. It uses a file-system-based approach where each file in the &lt;code&gt;pages&lt;/code&gt; directory corresponds to a route.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Features&lt;/strong&gt;:

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Server-Side Rendering (SSR)&lt;/strong&gt;: Supports SSR, which can enhance SEO and initial page load performance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Static Site Generation (SSG)&lt;/strong&gt;: Allows for pre-rendering pages at build time, improving performance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Simple Setup&lt;/strong&gt;: Easy to set up and understand, especially for smaller applications.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Use Cases&lt;/strong&gt;: Ideal for applications where SEO and initial performance are critical, or for maintaining legacy projects.&lt;/li&gt;

&lt;/ul&gt;
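&lt;p&gt;As a quick illustration, a typical &lt;code&gt;pages&lt;/code&gt; directory (the file names here are just examples) maps directly onto URLs:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pages/
├── index.js          → /
├── about.js          → /about
└── blog/
    └── [slug].js     → /blog/:slug   (dynamic route)
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;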

&lt;h3&gt;
  
  
  App Router
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Introduction&lt;/strong&gt;: Introduced in Next.js 13, the App Router is designed to leverage modern React features like Server Components and Streaming.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Features&lt;/strong&gt;:

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Server Components&lt;/strong&gt;: Enables server-side rendering by default, with client-side rendering for interactive components.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dynamic, Nested, and Parallel Routes&lt;/strong&gt;: Supports complex routing scenarios, making it suitable for dynamic and interactive applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Performance Optimization&lt;/strong&gt;: Offers advanced caching and streaming capabilities to improve user experience.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexibility&lt;/strong&gt;: Allows for more flexible rendering and handling of UI scenarios.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Use Cases&lt;/strong&gt;: Recommended for single-page applications or projects requiring advanced routing features and performance optimizations.&lt;/li&gt;

&lt;/ul&gt;
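&lt;p&gt;The App Router instead uses folders for route segments and special file names such as &lt;code&gt;page.js&lt;/code&gt; and &lt;code&gt;layout.js&lt;/code&gt;. A sketch of an &lt;code&gt;app&lt;/code&gt; directory (example names only):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;app/
├── layout.js              shared root layout
├── page.js                → /
└── blog/
    └── [slug]/
        └── page.js        → /blog/:slug   (dynamic, nested route)
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;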

&lt;h2&gt;
  
  
  Key Differences
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;Pages Router&lt;/th&gt;
&lt;th&gt;App Router&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Routing Approach&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;File-system based&lt;/td&gt;
&lt;td&gt;More flexible, supports dynamic and parallel routes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Server-Side Rendering&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Optional, via &lt;code&gt;getServerSideProps&lt;/code&gt;
&lt;/td&gt;
&lt;td&gt;Default for most components&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;SEO &amp;amp; Performance&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Better SEO and initial performance due to SSR/SSG&lt;/td&gt;
&lt;td&gt;Advanced caching and streaming for improved performance&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Complexity&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Simpler setup&lt;/td&gt;
&lt;td&gt;More complex, but offers advanced features&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Recommended Use&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Legacy projects, SEO-focused applications&lt;/td&gt;
&lt;td&gt;New projects, single-page applications, dynamic routing&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
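&lt;p&gt;To make the server-rendering difference concrete, here is a rough sketch of the same server-rendered page in each router (the file paths and the &lt;code&gt;getPost&lt;/code&gt; helper are hypothetical):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// pages/posts/[id].js (Pages Router): data fetched via getServerSideProps
export async function getServerSideProps({ params }) {
  const post = await getPost(params.id); // getPost is a hypothetical data helper
  return { props: { post } };
}

export default function Post({ post }) {
  return &lt;h1&gt;{post.title}&lt;/h1&gt;;
}

// app/posts/[id]/page.js (App Router): an async Server Component
export default async function Post({ params }) {
  const post = await getPost(params.id); // same hypothetical helper
  return &lt;h1&gt;{post.title}&lt;/h1&gt;;
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;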

&lt;p&gt;To improve your skills in Next.js, focus on the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Familiarize yourself with the App Router&lt;/strong&gt;: It is the default and recommended router for new Next.js projects.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Practice with Real Projects&lt;/strong&gt;: Start building projects that utilize the App Router's advanced features like Server Components and parallel routes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Stay Updated with Documentation&lt;/strong&gt;: Regularly check the official Next.js documentation for updates and best practices.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Experiment with Different Routing Scenarios&lt;/strong&gt;: Try implementing dynamic, nested, and parallel routes to understand how they can enhance your application's user experience.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Happy Coding! 🤓&lt;/p&gt;

</description>
      <category>nextjs</category>
      <category>ssr</category>
      <category>webdev</category>
      <category>javascript</category>
    </item>
  </channel>
</rss>
