<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Nathan Skiles</title>
    <description>The latest articles on DEV Community by Nathan Skiles (@nate_serpapi).</description>
    <link>https://dev.to/nate_serpapi</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2099306%2Fdb1ac579-273a-4481-ba88-e3de63e81260.jpg</url>
      <title>DEV Community: Nathan Skiles</title>
      <link>https://dev.to/nate_serpapi</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/nate_serpapi"/>
    <language>en</language>
    <item>
      <title>Searching eBay with AI and Automation using n8n</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Mon, 04 Aug 2025 23:33:55 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/searching-ebay-with-ai-and-automation-using-n8n-3fh9</link>
      <guid>https://dev.to/nate_serpapi/searching-ebay-with-ai-and-automation-using-n8n-3fh9</guid>
      <description>&lt;p&gt;Welcome to Part 2 of our tutorial series. Previously (Part 1), we created an automated workflow that takes an image and returns potential item names using Google’s visual search.&lt;/p&gt;

&lt;p&gt;In case you missed it, you can find Part 1 here:&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
        &lt;div class="c-embed__cover"&gt;
          &lt;a href="https://serpapi.com/blog/uploading-images-and-searching-with-google-lens-via-serpapi/" class="c-link align-middle" rel="noopener noreferrer"&gt;
            &lt;img alt="" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fserpapi.com%2Fblog%2Fcontent%2Fimages%2F2025%2F05%2Fprice_blog.png" height="auto" class="m-0"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="c-embed__body"&gt;
        &lt;h2 class="fs-xl lh-tight"&gt;
          &lt;a href="https://serpapi.com/blog/uploading-images-and-searching-with-google-lens-via-serpapi/" rel="noopener noreferrer" class="c-link"&gt;
            Uploading Images and Searching with Google Lens via SerpApi
          &lt;/a&gt;
        &lt;/h2&gt;
          &lt;p class="truncate-at-3"&gt;
            In this tutorial, we’ll build an automation that identifies an item using visual search. By leveraging Google Lens (via SerpApi) for image recognition, you can quickly determine the exact name of a card or collectible.
          &lt;/p&gt;
        &lt;div class="color-secondary fs-s flex items-center"&gt;
            &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fserpapi.com%2Fblog%2Fcontent%2Fimages%2Fsize%2Fw256h256%2F2021%2F07%2Fserpapi-favicon.png"&gt;
          serpapi.com
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;Now, we’ll close the loop by integrating AI and price lookup. The goal for this second part is to automatically figure out which of those candidate titles is the correct one for our item, and then find how much it’s selling for on eBay.&lt;/p&gt;

&lt;p&gt;We will use AI to pick the best title, and then use SerpApi’s eBay Search API to fetch live pricing data. We’ll also touch on ways to log or store this info for future use.&lt;/p&gt;

&lt;h3&gt;
  
  
  What You’ll Need
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;n8n&lt;/strong&gt;: the same n8n setup from Part 1. The workflow will be extended, so have that ready.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SerpApi API key&lt;/strong&gt;: continue using your SerpApi account for the eBay search step. (No new setup if you have it from Part 1.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OpenAI API Key&lt;/strong&gt;: an OpenAI account (or another supported LLM in n8n). You’ll input this in n8n’s credentials so the AI Agent node can access the model.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;(optional) Google Sheet or Database&lt;/strong&gt;: If you want to store the results (item names and prices) for record-keeping or price tracking, you can prepare a Google Sheet or database. This is optional, but we’ll discuss how you could send data to Google Sheets as an example.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Workflow Continuation
&lt;/h2&gt;

&lt;p&gt;Now we will extend the workflow from Part 1. After extracting the list of possible titles, we’ll add steps for AI processing and eBay lookup. Referring to the workflow diagram (from Part 1) for context, our new nodes will come after the “Extract Visual Match Titles” step.&lt;/p&gt;

&lt;p&gt;Your workflow should currently look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu46krwskg3yeaey513xb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu46krwskg3yeaey513xb.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Using the AI Agent to Refine Results
&lt;/h3&gt;

&lt;p&gt;n8n offers a variety of AI nodes, with capabilities ranging from text summarization to classification. For our use case, we’ll use the &lt;strong&gt;AI Agent&lt;/strong&gt; node, which lets you attach multiple chat models and tools to the agent.&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
      &lt;div class="c-embed__body flex items-center justify-between"&gt;
        &lt;a href="https://docs.n8n.io/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/?utm_source=n8n_app&amp;amp;amp;utm_medium=node_settings_modal-credential_link&amp;amp;amp;utm_campaign=%40n8n%2Fn8n-nodes-langchain.agent" rel="noopener noreferrer" class="c-link fw-bold flex items-center"&gt;
          &lt;span class="mr-2"&gt;docs.n8n.io&lt;/span&gt;
          

        &lt;/a&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;We’ll use the AI Agent to determine the most likely product name based on the title list we generated in the previous blog post.&lt;/p&gt;

&lt;p&gt;Add the AI Agent node to your workflow and set the &lt;strong&gt;Source for Prompt&lt;/strong&gt; option to &lt;strong&gt;Define below&lt;/strong&gt;. This allows us to pass our title list to the chatbot and provide it with context.&lt;/p&gt;

&lt;p&gt;For the prompt (user message), I’ve used the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Here are the top visual matches from a Google Lens search for a product image: {{ $json.titlesList }}

Return the most likely product title from these matches that best describes the specific item in the image.

Do not include bundle listings, condition, or unrelated items.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
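&lt;p&gt;For reference, the &lt;code&gt;{{ $json.titlesList }}&lt;/code&gt; expression refers to a field produced by Part 1's extraction step. A minimal sketch of a Code node that joins the &lt;code&gt;title&lt;/code&gt; fields of SerpApi's Google Lens &lt;code&gt;visual_matches&lt;/code&gt; into that string could look like this (treat the helper name and exact field names as assumptions to check against your own workflow):&lt;/p&gt;

```javascript
// Hypothetical Code-node sketch: collapse the Google Lens visual matches
// from Part 1 into a single newline-separated string for the prompt.
const buildTitlesList = (visualMatches) =>
  (visualMatches ?? [])
    .map((match) => match.title)
    .filter(Boolean)
    .join("\n");

// In an n8n Code node this would typically be returned as:
// return [{ json: { titlesList: buildTitlesList($input.first().json.visual_matches) } }];
```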



&lt;p&gt;Next, add a System Message (not shown by default) to give the chatbot additional instructions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;You are an AI assistant that specializes in identifying and generating accurate, search-optimized product titles from visual search data.

Given a list of titles extracted from a reverse image search, return the most likely full product title for the specific item in the image.

Your output should be a clean, concise string that includes the key identifiers (brand, model, year, type, edition, etc.) — formatted like a product listing title someone might search for on a marketplace (e.g., eBay or Amazon).

Only return the title — no explanations or extra words.

Do not include specific model numbers unless they are relevant to the product. For example:

Sony PlayStation 5 Digital Edition Console (Model CFI-1218B)

Can just be:

Sony PlayStation 5 Digital Edition Console
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Your AI Agent parameters should now look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7qpgkodnsjuhzfl5egrd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7qpgkodnsjuhzfl5egrd.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We’re not quite done configuring the AI Agent. We still need to add a Chat Model, along with any tools that might help the AI Agent process the data.&lt;/p&gt;

&lt;p&gt;To add a Chat Model, click the plus (+) button shown at the bottom of the previous screenshot. I’ll be using an OpenAI chat model, but feel free to experiment with others to see which best suits your use case.&lt;/p&gt;

&lt;p&gt;Configuration will vary depending on the model you choose. In most cases, you’ll need to provide an API key. Refer to n8n’s documentation for setup instructions specific to your model.&lt;/p&gt;

&lt;p&gt;Once configured, execute the step to ensure the model returns a clear and relevant product title for the input provided:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fokw382zekhegqucbkhiq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fokw382zekhegqucbkhiq.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Searching eBay via SerpApi
&lt;/h3&gt;

&lt;p&gt;Next, we'll use eBay to get a list of recent sales for our product to determine a fair price.&lt;/p&gt;

&lt;p&gt;From the SerpApi node, add the Search eBay action to the workflow. From the parameters menu, we'll only need to set the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Search Query: &lt;code&gt;{{ $json.output }}&lt;/code&gt; - This will pass the output from our AI Agent step as the query parameter.&lt;/li&gt;
&lt;li&gt;Results filter (additional field): &lt;code&gt;Sold&lt;/code&gt; - It is important that this parameter is capitalized. This parameter ensures that we only get results back for recently sold items.&lt;/li&gt;
&lt;li&gt;Disable Caching (optional): &lt;code&gt;true&lt;/code&gt; - SerpApi caches queries for one hour, meaning a repeat search with the exact same parameters within that hour returns the cached result. I've disabled caching here to ensure the freshest data possible.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once configured, our node should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwd9w1ad7q0vrqpwvuu2m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwd9w1ad7q0vrqpwvuu2m.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, let's test the node by executing the step:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnqdbbw0qvwt5dwkncoha.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnqdbbw0qvwt5dwkncoha.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Saving the Data
&lt;/h3&gt;

&lt;p&gt;n8n offers a multitude of options for working with and saving data returned by the eBay API. You can write directly to a SQL database, push data to cloud-based data warehouses like Snowflake, or simply log it to a Google Sheet. From this point forward, I would recommend exploring n8n to best fit your specific use case, but in this example, we’ll be using Google Sheets.&lt;/p&gt;

&lt;p&gt;For each sold item returned, we want to create a row in our sheet to make it easy to filter and sort information such as the title, condition, and price.&lt;/p&gt;

&lt;p&gt;First, we need to extract the organic results from the eBay API response using a Code node. I’ve used JavaScript here, but you can also use Python:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9c5u465mfthitpk53y2l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9c5u465mfthitpk53y2l.png" alt=" "&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;$input&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nx"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;organic_results&lt;/span&gt;

&lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will output each result individually so we can pass them to the Google Sheets node.&lt;/p&gt;
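&lt;p&gt;If you only need a quick summary rather than every row, the same Code node could also compute an average sold price. This is a hypothetical helper, and the &lt;code&gt;price.extracted&lt;/code&gt; field name is an assumption about the eBay API response shape, so confirm it against your own output:&lt;/p&gt;

```javascript
// Hypothetical helper: average the extracted prices of the sold listings.
// Assumes each organic result may carry a numeric `price.extracted` field;
// results without a usable price are skipped.
const averagePrice = (results) => {
  const prices = (results ?? [])
    .map((r) => r.price?.extracted)
    .filter((p) => typeof p === "number");
  if (prices.length === 0) return null;
  return prices.reduce((sum, p) => sum + p, 0) / prices.length;
};
```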

&lt;p&gt;Next, configure the Google Sheets node by selecting the &lt;strong&gt;“Append row in sheet”&lt;/strong&gt; operation. Choose the document and sheet you want to write to, and make sure you’ve already added columns for each data point you plan to include. In this example, I’ve added Title, Link, Condition, and Price.&lt;/p&gt;

&lt;p&gt;Finally, map the appropriate attributes from n8n to the corresponding columns in your Google Sheet.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn615wunb60554vcs68qo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn615wunb60554vcs68qo.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If everything went well, all of our results should now be written to the Google Sheet:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwhyd17icawcfy64munw3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwhyd17icawcfy64munw3.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrap-Up
&lt;/h2&gt;

&lt;p&gt;In Part 2, we completed our journey from an image to actionable data. We used an AI agent to refine the search results and pinpoint the exact product name, then tapped into eBay’s data to find its market price. At this point, our “Visual Search to Price Tracking” workflow is fully functional: you can POST an image and get back the likely item name and its price on eBay within seconds.&lt;/p&gt;

&lt;p&gt;Ideas for Scaling Further: Now that the basic workflow is done, you can enhance it even more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Batch Processing: Have multiple images? You could modify the workflow to handle an array of images in one go, or simply trigger it multiple times. n8n could even watch an email or folder for new images and process them automatically.&lt;/li&gt;
&lt;li&gt;Scheduled Price Checks: As mentioned, you could use a Scheduler (Cron) node to periodically run the eBay search for saved item names, building a daily price tracker. This might help you notice trends (e.g., prices going up or down over time).&lt;/li&gt;
&lt;li&gt;Alerts for Price Drops or Increases: With a slight extension, you could compare the latest price to a previous one (store the last known price in a workflow variable or database). If the price drops by a certain percentage, have n8n send you an alert – useful for deciding when to sell or buy more.&lt;/li&gt;
&lt;li&gt;Integration into Listing Workflow: If you are a seller, you could integrate this into your listing creation process. For example, when you want to list a new item, run it through this workflow to get the suggested title and price, then automatically draft an eBay listing or a listing in your inventory system with those details.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We hope this two-part tutorial showed the power of combining AI, web scraping, and automation. By chaining these tools (image recognition, language understanding, and web data retrieval), you can automate monotonous manual tasks. Whether you’re a card collector pricing out your collection or an online seller researching products, this workflow can save you a ton of time.&lt;/p&gt;

</description>
      <category>automation</category>
      <category>n8n</category>
      <category>serpapi</category>
      <category>ebay</category>
    </item>
    <item>
      <title>Fetching AI Overviews with Node.js</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Fri, 11 Jul 2025 01:14:34 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/fetching-ai-overviews-with-nodejs-2bg2</link>
      <guid>https://dev.to/nate_serpapi/fetching-ai-overviews-with-nodejs-2bg2</guid>
      <description>&lt;h1&gt;
  
  
  Fetching AI Overviews with Node.js
&lt;/h1&gt;

&lt;p&gt;AI Overviews are quickly becoming a prominent feature in today’s SEO landscape, and with SerpApi, extracting this data is simple.&lt;/p&gt;

&lt;p&gt;While Google hasn’t shared many details about how sources are selected for AI Overviews, early signs suggest that they rely on many of the same SEO principles already in use. This makes tracking AI Overviews a valuable addition to your SEO strategy, offering insight into how users discover and interact with your content.&lt;/p&gt;

&lt;p&gt;By targeting the correct queries and optimizing your content accordingly, you may improve your chances of being cited in an AI Overview. And although SerpApi simplifies data collection, there are a few common pitfalls to be aware of. In this post, I’ll walk through what to expect and how to avoid issues during implementation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;For this guide, I'll be using SerpApi's Node.js package (&lt;a href="https://serpapi.com/integrations/javascript" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;) to retrieve AI Overviews. That said, the same principles apply no matter which programming language you prefer.&lt;/p&gt;

&lt;p&gt;I also recommend reviewing the SerpApi Google Search API and AI Overview API documentation to get more familiar with the endpoints and data structure.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/search-api" rel="noopener noreferrer"&gt;Google Search Engine Results API&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/google-ai-overview-api" rel="noopener noreferrer"&gt;Google AI Overview API&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Requirements
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SerpApi API Key&lt;/strong&gt; - If you don’t have an account yet, you can sign up for a free plan with 100 searches per month. If you already have an account, your API key is available in your dashboard.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Node.js 7.10.1 or newer&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SerpApi Node.js Package&lt;/strong&gt; - You can install it using &lt;code&gt;npm install serpapi&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Search Parameters
&lt;/h3&gt;

&lt;p&gt;Some queries to SerpApi's Google Search API return an AI Overview. To get started, let’s define the search parameters:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchParams&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;google&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;q&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;API_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;no_cache&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here’s what each parameter does:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;engine : "google"&lt;/code&gt; - Tells SerpApi to use the standard Google Search engine.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;q: query&lt;/code&gt; - The search query to submit.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;api_key: API_KEY&lt;/code&gt; - Your SerpApi key for authentication.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;no_cache: true&lt;/code&gt; - Forces SerpApi to fetch fresh data from Google instead of returning a cached result.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If &lt;code&gt;no_cache&lt;/code&gt; is not used, you may receive a cached response that includes an expired &lt;code&gt;page_token&lt;/code&gt;. Always set &lt;code&gt;no_cache: true&lt;/code&gt; when you intend to follow up with a second request using the &lt;code&gt;page_token&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Perform the Initial Search and Check for AI Overview
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchResponse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;getJson&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;searchParams&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;overviewData&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;searchResponse&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ai_overview&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This performs the initial search and attempts to extract the &lt;code&gt;ai_overview&lt;/code&gt; object from the response. If the query triggers an AI Overview, the &lt;code&gt;overviewData&lt;/code&gt; variable will contain its content; otherwise, it will be &lt;code&gt;undefined&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Not all searches return the full AI Overview here. Instead, some may only include a &lt;code&gt;page_token&lt;/code&gt;, a temporary reference used to fetch the actual AI Overview in a second request.&lt;/p&gt;

&lt;h3&gt;
  
  
  Check for page_token and Fetch the Full AI Overview
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;overviewData&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;page_token&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;overviewParams&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;google_ai_overview&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;API_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="nx"&gt;overviewParams&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;page_token&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;overviewData&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;page_token&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;ai_overview_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;getJson&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;overviewParams&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="nx"&gt;overviewResult&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;ai_overview_response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ai_overview&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;overviewResult&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;overviewData&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This checks whether the initial response includes a &lt;code&gt;page_token&lt;/code&gt;. If it does:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A second request is made using the &lt;code&gt;google_ai_overview&lt;/code&gt; engine and the &lt;code&gt;page_token&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;The full AI Overview is returned and stored in &lt;code&gt;overviewResult&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If no token is found, it assumes the AI Overview was included in the original response.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;page_token&lt;/code&gt; values are short-lived, expiring within ~4 minutes. You must make the follow-up request immediately.&lt;/p&gt;
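&lt;p&gt;Because of that short lifetime, it helps to wrap the follow-up in a small helper that runs immediately and fails loudly when the overview is missing. This is a sketch rather than an official SerpApi pattern; &lt;code&gt;getJson&lt;/code&gt; is passed in as a parameter to keep it testable, and the &lt;code&gt;error&lt;/code&gt; field name is an assumption:&lt;/p&gt;

```javascript
// Sketch: fetch the full AI Overview as soon as a page_token arrives.
// `getJson` is injected so the helper is easy to test; in practice it
// comes from the serpapi package. The `error` field name is an assumption.
const fetchOverviewByToken = async (pageToken, apiKey, getJson) => {
  const response = await getJson({
    engine: "google_ai_overview",
    api_key: apiKey,
    page_token: pageToken,
  });
  if (!response.ai_overview) {
    throw new Error(response.error ?? "AI Overview unavailable (token may have expired)");
  }
  return response.ai_overview;
};
```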

&lt;h3&gt;
  
  
  Full Example
&lt;/h3&gt;

&lt;p&gt;In the example below, we iterate over an array of queries, performing a Google Search for each one and checking if an AI Overview is returned.&lt;/p&gt;

&lt;p&gt;If an AI Overview is present, we then check whether it includes a &lt;code&gt;page_token&lt;/code&gt;. If it does, we make a follow-up request to the &lt;code&gt;google_ai_overview&lt;/code&gt; engine using that token to retrieve the full AI Overview.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;getJson&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;serpapi&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;dotenv&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;dotenv&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="nx"&gt;dotenv&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;config&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;API_KEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;SERPAPI_KEY&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;queries&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;what makes a good web hosting service&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;best crm platform&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;top programming languages to learn in 2025&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;];&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;fetchAIOverview&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchParams&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;google&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;q&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;API_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;no_cache&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchResponse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;getJson&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;searchParams&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;overviewData&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;searchResponse&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ai_overview&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;overviewResult&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;overviewData&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;No AI overview available.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;overviewData&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;page_token&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;overviewParams&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;google_ai_overview&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;API_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="p"&gt;};&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Page token found. Fetching AI overview page...&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="nx"&gt;overviewParams&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;page_token&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;overviewData&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;page_token&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;ai_overview_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;getJson&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;overviewParams&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;AI overview retrieved.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="nx"&gt;overviewResult&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;ai_overview_response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ai_overview&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;AI overview retrieved.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="nx"&gt;overviewResult&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;overviewData&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;AI Overview:&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;overviewResult&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Error fetching data from SerpAPI:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="nx"&gt;queries&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Fetching AI overview for query: "&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nf"&gt;fetchAIOverview&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Best Practices Summary
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Use &lt;code&gt;no_cache: true&lt;/code&gt; to avoid stale results and expired tokens.&lt;/li&gt;
&lt;li&gt;Make the follow-up request to the &lt;code&gt;google_ai_overview&lt;/code&gt; engine immediately if a &lt;code&gt;page_token&lt;/code&gt; is present.&lt;/li&gt;
&lt;li&gt;Process each query sequentially, not in parallel, to avoid expired &lt;code&gt;page_token&lt;/code&gt;s.&lt;/li&gt;
&lt;/ul&gt;
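The sequential-processing advice can be sketched as a small generic helper (`runSequentially` and `fetchFn` are hypothetical names for illustration, not part of the serpapi package):

```javascript
// Hypothetical sequential runner: await each query before starting the
// next one, so every page_token is used immediately after it is issued.
async function runSequentially(queries, fetchFn) {
  const results = [];
  for (const query of queries) {
    // Awaiting inside the loop enforces ordering; queries.forEach(async ...)
    // would fire all requests at once and risk expired tokens.
    results.push(await fetchFn(query));
  }
  return results;
}
```

`fetchFn` would be something like `fetchAIOverview`; any async function works.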

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;p&gt;For a high-level introduction to the Google AI Overviews API, I recommend checking out the following blog post:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/scrape-google-ai-overviews/" rel="noopener noreferrer"&gt;How to Scrape Google AI Overviews (AIO)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you have questions or run into any issues with AI Overviews, don’t hesitate to reach out to SerpApi’s support team at &lt;a href="mailto:contact@serpapi.com"&gt;contact@serpapi.com&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>serpapi</category>
      <category>ai</category>
      <category>google</category>
      <category>seo</category>
    </item>
    <item>
      <title>Unlocking SEO Insights: Leveraging 'People Also Ask' for Smarter Content Strategies</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Thu, 27 Feb 2025 22:09:27 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/unlocking-seo-insights-leveraging-people-also-ask-for-smarter-content-strategies-2ffo</link>
      <guid>https://dev.to/nate_serpapi/unlocking-seo-insights-leveraging-people-also-ask-for-smarter-content-strategies-2ffo</guid>
      <description>&lt;p&gt;Creating content that resonates with your target audience requires more than just keyword research and competitor analysis. Gaining insight into your audience’s questions, concerns, and information-seeking behaviors can provide a deeper understanding of their needs. This knowledge helps in crafting content that directly addresses their interests and challenges.&lt;/p&gt;

&lt;p&gt;The "People Also Ask" (PAA) boxes have become a prominent feature in Google's search engine results pages (SERPs), which are the pages displayed by search engines in response to a user's query. These dynamic elements offer valuable insights into user intent and reveal emerging content opportunities that can enhance any content strategy.&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding People Also Ask
&lt;/h2&gt;

&lt;p&gt;The People Also Ask feature represents Google's effort to provide users with additional, relevant questions related to their search queries. These expandable boxes appear within search results and contain a series of questions that users frequently ask about related topics. When a user clicks on a question, the box expands to reveal a brief answer, typically extracted from a high-ranking webpage.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fecoh2660vgajp0zzn49p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fecoh2660vgajp0zzn49p.png" width="800" height="559"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Expandable "People also ask" box&lt;/p&gt;

&lt;p&gt;What makes PAA boxes particularly valuable is their dynamic nature. For example, if a user searches for "content strategy" and clicks on a PAA question like "What are the steps to creating a content strategy?", new related questions will appear, such as "How do you measure the success of a content strategy?" or "What tools help with content strategy development?" This continuous expansion of queries provides deeper insights into user search intent. This behavior reflects real user interests and search patterns, providing authentic insights into an audience's thought processes and information needs.&lt;/p&gt;

&lt;p&gt;While PAA data is often associated with straightforward Q&amp;amp;A content, it also uncovers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  The various angles and perspectives from which users research a topic.&lt;/li&gt;
&lt;li&gt;  Common misconceptions or concerns that need addressing.&lt;/li&gt;
&lt;li&gt;  Related topics that users frequently explore together.&lt;/li&gt;
&lt;li&gt;  The level of expertise reflected in your target audience's questions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Analyzing PAA data enables content strategists to craft detailed plans that target key search terms, cover topics comprehensively, and guide user journeys. This approach aligns with Google's quality guidelines, which prioritize comprehensive topic coverage and demonstrable subject expertise. In doing so, it paves the way for improved search visibility and engagement.&lt;/p&gt;

&lt;h2&gt;
  
  
  Transforming PAA Insights into Content
&lt;/h2&gt;

&lt;p&gt;Converting PAA data into actionable content strategies requires a systematic approach. When analyzing PAA questions, start by identifying patterns in user inquiries. For instance, if you're creating content about digital marketing, you'll often notice a progression—from basic definitional queries to more complex implementation questions. This progression can help structure your content, guiding readers from foundational concepts to advanced applications.&lt;/p&gt;

&lt;p&gt;Consider a real-world example: a software company launching a new project management tool. By analyzing PAA data related to "project management software," they might discover that users frequently ask questions about integration capabilities, pricing models, and team collaboration features. This intelligence helps prioritize which aspects of the product to highlight in their content marketing efforts. By leveraging PAA data, content teams can take several concrete steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Build detailed FAQ sections that proactively address common audience queries.&lt;/li&gt;
&lt;li&gt;  Structure blog posts and articles to naturally answer related questions.&lt;/li&gt;
&lt;li&gt;  Create content that covers topics exhaustively.&lt;/li&gt;
&lt;li&gt;  Spot content gaps to uncover overlooked topics and opportunities for expansion.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These strategies help ensure your content directly addresses user needs and fills critical gaps in your existing offerings.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Role of PAA in SEO Strategy
&lt;/h2&gt;

&lt;p&gt;Incorporating PAA insights into an SEO strategy can improve search visibility and audience engagement. By targeting PAA questions, content creators can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Increase their chances of ranking for featured snippets.&lt;/li&gt;
&lt;li&gt;  Improve content relevance and user engagement.&lt;/li&gt;
&lt;li&gt;  Enhance topic authority and site credibility.&lt;/li&gt;
&lt;li&gt;  Drive organic traffic by addressing real search queries.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Optimizing content for PAA involves structuring answers concisely, using schema markup, and ensuring that content provides direct, valuable responses to user queries. Additionally, updating content based on newly emerging PAA questions can help maintain relevance and rankings over time.&lt;/p&gt;
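One concrete form of the schema markup mentioned above is FAQPage structured data. As a sketch, the following hypothetical helper (`faqJsonLd` is an illustrative name) builds the JSON-LD from question-and-answer pairs using schema.org field names:

```javascript
// Sketch of a FAQPage JSON-LD generator for PAA-style content.
// Field names ("@context", "@type", mainEntity, acceptedAnswer) follow schema.org.
function faqJsonLd(pairs) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: pairs.map(function (pair) {
      return {
        "@type": "Question",
        name: pair.question,
        acceptedAnswer: { "@type": "Answer", text: pair.answer },
      };
    }),
  });
}
```

The resulting string would be embedded in a page inside a `script type="application/ld+json"` tag.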

&lt;h2&gt;
  
  
  Implementing PAA Insights with SerpApi
&lt;/h2&gt;

&lt;p&gt;SerpApi is a powerful tool designed to scrape and extract structured data from live search engine results pages, including the "People Also Ask" feature. Using SerpApi, businesses and SEO professionals can automate the collection of PAA data, ensuring continuous insights into evolving user queries.&lt;/p&gt;

&lt;p&gt;With real-time search results and an easy-to-use API, SerpApi eliminates the need for manual research or time-intensive in-house solutions. This allows content teams to focus on strategy and execution rather than data collection.&lt;/p&gt;

&lt;h3&gt;
  
  
  Google Search API
&lt;/h3&gt;

&lt;p&gt;SerpApi’s &lt;strong&gt;Google Search API&lt;/strong&gt; enables users to extract various search result elements, including organic listings, ads, featured snippets, and the People Also Ask section. By integrating this API, businesses can track PAA questions dynamically, monitor changes, and identify emerging trends in user queries. This allows content creators to refine their SEO and content strategies with up-to-date information.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api" rel="noopener noreferrer"&gt;Google Search API Documentation&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The remainder of this post will focus specifically on the Google Related Questions API. To get familiar with SerpApi and the Google Search API, I highly recommend checking out the following blog post:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/blog/how-to-scrape-google-search-results-serps-2023-guide/" rel="noopener noreferrer"&gt;How to Scrape Google Search Results (SERPs) - 2025 Guide&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Additionally, it's worth performing test queries in our playground environment to get a better idea of the results and the structure of the data returned by both the Google Search and Related Questions APIs:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/playground" rel="noopener noreferrer"&gt;SerpApi Playground - SerpApi&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Google Related Questions API&lt;/strong&gt; specifically focuses on retrieving PAA data. This API provides structured data on related questions that users frequently ask, along with their respective answers. By leveraging this API, content strategists can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Identify trending PAA questions for their niche.&lt;/li&gt;
&lt;li&gt;  Monitor how PAA questions evolve over time.&lt;/li&gt;
&lt;li&gt;  Optimize content to align with high-impact queries.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Using SerpApi’s capabilities, businesses can efficiently gather, analyze, and implement PAA insights into their SEO and content marketing efforts.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/google-related-questions-api" rel="noopener noreferrer"&gt;Google Related Questions API - SerpApi&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Google Related Questions API Documentation&lt;/p&gt;

&lt;h3&gt;
  
  
  Fetching PAA Questions Using SerpApi
&lt;/h3&gt;

&lt;p&gt;In this section, we will walk through using Node.js to fetch PAA questions using SerpApi's &lt;strong&gt;Google Search API&lt;/strong&gt; and &lt;strong&gt;Google Related Questions API&lt;/strong&gt;. I will use JavaScript, but the examples can be easily translated into other programming languages. You can find a complete list of SerpApi integrations here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/integrations" rel="noopener noreferrer"&gt;SerpApi: Integrations&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Install SerpApi’s JavaScript Package
&lt;/h3&gt;

&lt;p&gt;While you can use any of your favorite packages to make API calls, we will use the SerpApi Node.js package (&lt;a href="https://github.com/serpapi/serpapi-javascript?tab=readme-ov-file#serpapi-for-javascripttypescript" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;) to keep things clean and consistent.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install &lt;/span&gt;serpapi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 2: Fetch Data Using Google Search API
&lt;/h3&gt;

&lt;p&gt;PAA questions appear organically in Google Search results, so to start with, we can query the Google Search API to return a typical SERP.&lt;/p&gt;

&lt;p&gt;Below, we perform a search using the &lt;code&gt;google&lt;/code&gt; engine, and our query is "content strategy." We then store the related questions returned in an array that we can add to later.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;getJson&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;serpapi&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;getJson&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;google&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;YOUR_API_KEY&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;q&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;content strategy&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="c1"&gt;// Store PAA questions&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;relatedQuestionsArray&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;related_questions&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt; 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;While related questions are returned for &lt;strong&gt;most&lt;/strong&gt; queries, I recommend adding some more robust error handling for edge cases where no related questions are returned.&lt;/p&gt;
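A minimal sketch of such a guard might look like this (`extractQuestions` is a hypothetical helper name; the response shape follows the example response shown in this step):

```javascript
// Defensive extraction: some queries return no "related_questions" field,
// so guard before mapping over it.
function extractQuestions(response) {
  const related = response.related_questions;
  if (!Array.isArray(related) || related.length === 0) {
    return [];
  }
  return related.map(function (item) {
    return item.question;
  });
}
```

Returning an empty array keeps downstream code simple: callers can always iterate the result without a null check.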

&lt;p&gt;Example Response:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"related_questions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"question"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"What are the 7 steps in creating a content strategy?"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"snippet"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"7-step content development process: step-by-step guide"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"date"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Jun 17, 2024"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://contentsnare.com/content-development-process/"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"list"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Step 1: Do your research. ... "&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Step 2: Analyze the information. ... "&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Step 3: Plan your strategy. ... "&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Step 4: Write. ... "&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Step 5: Editing, SEO and publishing. ... "&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Step 6: Take to social media. ... "&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Step 7: Analyze and begin again."&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"displayed_link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://contentsnare.com › content-development-process"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"source_logo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://serpapi.com/searches/67bf85dac71a675e242edf6b/images/b63f19fbb9fff39987251b02abdc8c2d341978702805ce28c8d754c35cf0eb70.png"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"next_page_token"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"eyJvbnMiOiIxMDA0MSIsImZjIjoiRXFFQkNtSkJRUzFMVkdoa1..."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"serpapi_link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://serpapi.com/search.json?device=desktop&amp;amp;engine=google_related_questions&amp;amp;google_domain=google.com&amp;amp;next_page_token=eyJvbnMiOiIxMDA0MSIsImZ..."&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Typically, 3-4 PAA questions are returned on any given Google SERP. However, as mentioned previously, expanding these questions returns additional related questions.&lt;/p&gt;

&lt;p&gt;Each question returned in Step 2 includes a &lt;code&gt;next_page_token&lt;/code&gt;, which we can pass to the Google Related Questions API to retrieve additional questions stemming from that question.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;relatedResponse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;getJson&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;google_related_questions&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;YOUR_API_KEY&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;next_page_token&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;NEXT_PAGE_TOKEN&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="c1"&gt;// Save additional question data&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;additionalQuestions&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;relatedResponse&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;related_questions&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The response to this query will be similar to the one seen in Step 2, returning &lt;code&gt;next_page_tokens&lt;/code&gt; for each additional question.&lt;/p&gt;

&lt;p&gt;By automating the extraction of PAA insights, businesses can refine their SEO approach, ensuring content aligns with what users are actively searching for.&lt;/p&gt;

&lt;h3&gt;
  
  
  Example Node.js CLI App
&lt;/h3&gt;

&lt;p&gt;To get you started, I've created a sample CLI app that allows you to enter a query and specify the depth at which you would like to retrieve PAA questions.&lt;/p&gt;

&lt;p&gt;The tool takes your search query, sends it to the Google Search API, and extracts the "People Also Ask" questions from the search results. For each question, it recursively fetches additional related questions up to the specified depth.&lt;/p&gt;
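The recursive, depth-limited part of that flow can be sketched as follows (`buildQuestionTree` and `fetchRelated` are hypothetical names; `fetchRelated` stands in for a call to the `google_related_questions` engine with a question's `next_page_token`):

```javascript
// Depth-limited recursion (sketch). Each question may carry a
// next_page_token; when present and depth remains, fetch its children
// and recurse with a decremented depth.
async function buildQuestionTree(questions, fetchRelated, depth) {
  const tree = [];
  for (const item of questions) {
    let children = [];
    if (depth !== 0) {
      if (item.next_page_token) {
        const fetched = await fetchRelated(item.next_page_token);
        children = await buildQuestionTree(fetched, fetchRelated, depth - 1);
      }
    }
    tree.push({ question: item.question, children: children });
  }
  return tree;
}
```

Fetching sequentially inside the loop also avoids the expired-token problem discussed earlier.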

&lt;p&gt;The output displays the relationship between questions in a tree structure:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;1 → First level question
1.2 → Second question that stems from the first question
2.1.3 → Third question that stems from the first child of the second question
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
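The numbering scheme above can be produced with a small recursive formatter (a sketch; `numberTree` is a hypothetical helper, and nodes are assumed to have `question` and `children` fields):

```javascript
// Hierarchical numbering: each node's label is its parent's label plus
// its 1-based position, e.g. "2.1.3".
function numberTree(nodes, prefix) {
  const lines = [];
  nodes.forEach(function (node, i) {
    const label = prefix ? prefix + "." + (i + 1) : String(i + 1);
    lines.push(label + " " + node.question);
    // Recurse into children, carrying the current label as the new prefix
    lines.push.apply(lines, numberTree(node.children, label));
  });
  return lines;
}
```

Calling `numberTree(tree, "")` on the fetched question tree yields one labeled line per question.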



&lt;p&gt;&lt;a href="https://github.com/NateSkiles/paa-example" rel="noopener noreferrer"&gt;GitHub - NateSkiles/paa-example&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Measuring Success
&lt;/h2&gt;

&lt;p&gt;After implementing PAA insights into your content and SEO strategy, measuring success is crucial. Key performance indicators (KPIs) to track include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Organic traffic growth&lt;/strong&gt;: Increased visits from search engines.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;SERP rankings&lt;/strong&gt;: Improved positions for targeted queries.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Click-through rates (CTR)&lt;/strong&gt;: Higher engagement on PAA-optimized content.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;User engagement metrics&lt;/strong&gt;: Time on page, bounce rate, and session duration.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Conversions&lt;/strong&gt;: Lead generation and customer inquiries driven by content.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Regularly updating content based on fresh PAA insights ensures that your strategy remains effective and aligned with user search behavior.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The "People Also Ask" feature is a powerful tool for understanding user intent and enhancing content strategies. By leveraging SerpApi’s capabilities, marketers and SEO professionals can efficiently extract and analyze PAA data to create highly relevant, user-focused content.&lt;/p&gt;

&lt;p&gt;By systematically integrating PAA insights, businesses can improve their search visibility, attract more organic traffic, and establish authority in their niche. Leveraging these insights allows content creators to address user queries effectively, optimize for featured snippets, and provide comprehensive information that enhances engagement. Regularly refining content based on PAA data ensures continued relevance and strengthens a brand’s presence in search results. Whether you're developing blog posts, FAQs, or in-depth guides, PAA-driven content strategies can set your brand apart in an increasingly competitive digital landscape.&lt;/p&gt;

</description>
      <category>seo</category>
      <category>serpapi</category>
      <category>contentwriting</category>
      <category>api</category>
    </item>
    <item>
      <title>No-code Solutions for Turning Search Results Into Markdown for LLMs</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Thu, 27 Feb 2025 21:56:35 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/no-code-solutions-for-turning-search-results-into-markdown-for-llms-4pi5</link>
      <guid>https://dev.to/nate_serpapi/no-code-solutions-for-turning-search-results-into-markdown-for-llms-4pi5</guid>
      <description>&lt;p&gt;Intro&lt;br&gt;
In my last post, I walked through a coding solution in which we used Node.js to scrape webpages returned in Google Search results and parse them to Markdown for use in LLMs or other use cases. Today, we will do something similar, but we will utilize no-code solutions to return results from SerpApi's Google Search API and parse to markdown using the Reader API by Jina.ai.&lt;/p&gt;

&lt;p&gt;Previous post: Turning Search Results Into Markdown for LLMs&lt;/p&gt;

&lt;h2&gt;Reader by Jina&lt;/h2&gt;

&lt;p&gt;Jina AI’s Reader API (&lt;a href="https://jina.ai/reader/" rel="noopener noreferrer"&gt;https://jina.ai/reader/&lt;/a&gt;) offers an alternative for extracting structured content from web pages, maintaining document structure, and handling diverse content types.&lt;/p&gt;

&lt;p&gt;💡 Jina AI provides 1 million test credits to get started, but keep in mind that tokens are used based on the amount of content parsed, not the number of requests made.&lt;/p&gt;

&lt;h2&gt;Scraping Search Results&lt;/h2&gt;

&lt;p&gt;Several options integrate well with SerpApi for those who prefer no-code solutions. This section will walk through configuring Make and Zapier to query search results through SerpApi.&lt;/p&gt;

&lt;h2&gt;Make&lt;/h2&gt;

&lt;p&gt;Starting with Make, we will create a Google Doc from the data returned by SerpApi and the Reader API. This is a relatively simple example, but Make provides many integrations with AI providers such as OpenAI, Anthropic, and more. Instead of sending the data to a Google Doc, you can replace the last module and send it wherever you need it!&lt;/p&gt;

&lt;p&gt;SerpApi has an official Make app, making it much more straightforward to configure than Zapier. I will only walk through configuring the SerpApi Google Search module here; for more information on our Make app, check out this blog post:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/blog/announcing-serpapis-make-app/" rel="noopener noreferrer"&gt;Announcing SerpApi’s Make App&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create a new scenario and search for the SerpApi app to get started. Select your preferred search engine module; we will use Google for this example.&lt;/p&gt;

&lt;p&gt;After selecting your search engine, we can start configuring our call to SerpApi. You'll add your SerpApi API key by creating a connection. Click "Create a connection."&lt;/p&gt;

&lt;p&gt;Enter your API key and name your connection:&lt;/p&gt;

&lt;p&gt;From here, we can configure our query:&lt;/p&gt;

&lt;p&gt;You can fetch additional pages by increasing the “Pagination Limit” setting, but note that each page will cost 1 SerpApi search credit.&lt;/p&gt;

&lt;p&gt;Click “Run once” and verify the data coming back from SerpApi:&lt;/p&gt;

&lt;p&gt;Now that we have results we can work with, let’s add an “Iterator” module to allow us to use the data returned in the organic_results array. The Iterator module (documentation) converts arrays into a series of bundles. Each element in the array will output as a separate bundle, allowing us to query the Reader API for each link returned in our Google results.&lt;/p&gt;
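Conceptually, the Iterator is just fanning an array out into individual items. Here is a minimal plain-JavaScript sketch of the idea (the `organic_results` shape is abbreviated and the links are placeholders; this is not Make's actual runtime):

```javascript
// Sketch of what Make's Iterator module does conceptually: each element
// of an array becomes its own "bundle" that downstream modules run on once.
const serpApiResponse = {
  organic_results: [
    { position: 1, title: "Result one", link: "https://example.com/a" },
    { position: 2, title: "Result two", link: "https://example.com/b" }
  ]
};

// One bundle per array element.
const bundles = serpApiResponse.organic_results.map((result) => ({ ...result }));

for (const bundle of bundles) {
  // Downstream modules see one bundle at a time and can reference its
  // attributes, e.g. the "link" we pass to the Reader API next.
  console.log(bundle.position, bundle.link);
}
```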

&lt;p&gt;To query the Reader API for each link returned, let’s add an HTTP “Make a request” module that allows us to send HTTP requests.&lt;/p&gt;

&lt;p&gt;Set the URL parameter to &lt;a href="https://r.jina.ai/" rel="noopener noreferrer"&gt;https://r.jina.ai/&lt;/a&gt;, then click the box again to pull up a list of available data we can pass to this parameter. From our Iterator module, select the “link” attribute. This passes the link from our organic Google search results to the Reader URL. Here’s what that should look like:&lt;/p&gt;
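The HTTP module is simply appending each result link to the Reader base URL, since the Reader API takes the target page as a path suffix. A minimal sketch (the article link is a placeholder):

```javascript
// Reader API request URLs have the form https://r.jina.ai/ followed by the page URL.
const readerBase = "https://r.jina.ai/";
const link = "https://example.com/some-article"; // the Iterator's "link" attribute

const requestUrl = readerBase + link;
// requestUrl is "https://r.jina.ai/https://example.com/some-article"
```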

&lt;p&gt;Now, we need to set our header for authorization by passing our Bearer token provided by Jina AI:&lt;/p&gt;

&lt;p&gt;Before running again, let’s add our final step to create a Google Doc to store the results. Again, this final step can be changed depending on your use case and where you wish to send the data.&lt;/p&gt;

&lt;p&gt;Add the Google Docs “Create a Document” module and add a connection to your Google account. Now you can set a name for the document as well as pass data from the Reader API as the content of the doc:&lt;/p&gt;

&lt;p&gt;Now let’s run our scenario once more to ensure that everything is working correctly.&lt;/p&gt;

&lt;p&gt;💡 While testing, we can set a limiter filter after the Iterator module to prevent the flow from continuing more than once.&lt;/p&gt;

&lt;p&gt;Click the wrench icon under the link between the Iterator and HTTP modules. Create a condition to continue only if the Iterator’s “position” attribute is numerically equal to “1”:&lt;/p&gt;

&lt;p&gt;Make sure to remove the limiter before publishing if you wish to query for all the results. Alternatively, you can set the num parameter in the SerpApi Google Search module to limit the results returned by SerpApi.&lt;/p&gt;
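If you limit results with num instead, the module is just adding one more query parameter to the underlying SerpApi request. A rough sketch of the resulting request URL (the query and key are placeholders):

```javascript
// Limiting results at the source with SerpApi's "num" parameter
// instead of filtering after the Iterator.
const params = new URLSearchParams({
  engine: "google",
  q: "coffee",             // placeholder query
  num: "5",                // ask for at most 5 organic results
  api_key: "YOUR_API_KEY"  // placeholder key
});

const url = `https://serpapi.com/search.json?${params.toString()}`;
```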

&lt;h2&gt;Zapier&lt;/h2&gt;

&lt;p&gt;While this article is focused on no-code solutions, SerpApi currently does not have a Zapier integration, meaning we must use a little code to make our API requests to SerpApi. I have provided code below that you can drop into Zapier Code steps without issue, though some adjustments may be needed to better fit your use case. I recommend starting with the code I've provided and then adjusting once you get results from SerpApi and the Reader API.&lt;/p&gt;

&lt;p&gt;In this section, we will walk through:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use the Schedule Trigger to initiate the Zap&lt;/li&gt;
&lt;li&gt;Retrieve search result URLs from SerpApi&lt;/li&gt;
&lt;li&gt;Process webpages from those URLs through the Reader API into Markdown&lt;/li&gt;
&lt;li&gt;Send the formatted content to your desired apps&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As this Zap requires API calls to both SerpApi and the Reader API, you will likely need a Trial or Professional/Team plan. Zapier limits code runtime to 1 second on the free plan, and both API requests will likely take longer (documentation).&lt;/p&gt;

&lt;p&gt;To get started, add a trigger to your Zap. For this post, we will use the Schedule by Zapier trigger to run our Zap every day at 12:00 a.m. You can select a trigger that best fits your needs.&lt;/p&gt;

&lt;p&gt;💡 Make sure to test each step before continuing to ensure you've created test data for future steps. Zapier will typically force you to test a step after changes before publishing.&lt;/p&gt;

&lt;p&gt;Next, we add a Code by Zapier action to make our request to SerpApi. This example uses JavaScript, but you can choose Python if you prefer, using the code below as a guideline.&lt;/p&gt;

&lt;p&gt;I've included the JavaScript code for this action below:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const { SERPAPI_KEY, QUERY, LOCATION } = inputData;

async function fetchGoogleSearchResults(query, location) {
  try {
    const response = await fetch(
      `https://serpapi.com/search.json?engine=google&amp;amp;q=${encodeURIComponent(
        query
      )}&amp;amp;location=${encodeURIComponent(
        location
      )}&amp;amp;gl=us&amp;amp;hl=en&amp;amp;api_key=${SERPAPI_KEY}`
    );

    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }

    const data = await response.json();

    if (!data) {
      throw new Error("No data returned from SerpAPI");
    }

    return data;
  } catch (error) {
    // Instead of just logging, throw the error
    throw new Error(`SerpAPI request failed: ${error.message}`);
  }
}

try {
  const searchResults = await fetchGoogleSearchResults(QUERY, LOCATION);

  output = {
    organicResults: searchResults?.organic_results || "No Organic Result Found"
  };

  return output;
} catch (error) {
  console.error("Error:", error.message);
  // Return a default response or throw error based on your needs
  return { organic_results: [] };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After adding the above code to the action, add your input data:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;SERPAPI_KEY&lt;/code&gt; - your SerpApi key goes here. You can find your API key in your SerpApi dashboard.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;QUERY&lt;/code&gt; - the query parameter for the search you wish to perform.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;LOCATION&lt;/code&gt; - the location parameter for the search you wish to perform.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you wish to add additional search parameters, you can set them as Input Data or update the URL in the code directly. Setting them as Input Data allows you to update them more easily in the future.&lt;/p&gt;
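For instance, if you wanted to pass extra parameters such as num through Input Data rather than editing the URL, one way to sketch it is below. EXTRA_PARAMS is a hypothetical input key holding JSON text, not part of the original steps, and the key and query are placeholders:

```javascript
// Sketch: merge optional extra search parameters from a hypothetical
// EXTRA_PARAMS input field (JSON text) into the SerpApi query string.
const inputData = {
  SERPAPI_KEY: "YOUR_API_KEY",  // placeholder
  QUERY: "coffee",              // placeholder
  EXTRA_PARAMS: '{"num":"5"}'   // e.g. set in the Zapier Input Data UI
};
const { SERPAPI_KEY, QUERY, EXTRA_PARAMS } = inputData;

const params = new URLSearchParams({
  engine: "google",
  q: QUERY,
  api_key: SERPAPI_KEY,
  ...(EXTRA_PARAMS ? JSON.parse(EXTRA_PARAMS) : {})
});

const url = `https://serpapi.com/search.json?${params.toString()}`;
```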

&lt;p&gt;Your first code step should now look something like this:&lt;/p&gt;

&lt;p&gt;Our request from SerpApi will return an array of links for our organic results. To send each link to the Reader API, we must iterate through the links using the Looping by Zapier action.&lt;/p&gt;

&lt;p&gt;This action takes an array of values and runs the actions within the loop for each value. For our "Values to Loop," we can pass the "Organic Results Link" values from our second step to a key called "links".&lt;/p&gt;
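In plain JavaScript terms, the "Values to Loop" mapping just pulls the link field out of each organic result. A sketch with an abbreviated result shape and placeholder links:

```javascript
// Sketch of the "links" key that Looping by Zapier iterates over:
// the link from each organic result returned by the first code step.
const organicResults = [
  { position: 1, link: "https://example.com/a" },
  { position: 2, link: "https://example.com/b" }
];

const links = organicResults.map((result) => result.link);
// Looping by Zapier then runs the inner actions once per value in "links".
```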

&lt;p&gt;Let's also set the maximum number of loop iterations to 1 for testing to ensure we aren't spending Reader API credits needlessly.&lt;/p&gt;

&lt;p&gt;Inside the Looping by Zapier action, we can add another Code by Zapier action to make our request to the Reader API. Paste the JavaScript below in the code section of this action:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const { link, READERAPI_KEY } = inputData;

async function fetchReaderData(link, apiKey) {
  try {
    // Encode URL for path parameter
    const encodedUrl = encodeURIComponent(link);

    const response = await fetch(`https://r.jina.ai/${encodedUrl}`, {
      method: 'GET',
      headers: {
        'Authorization': `Bearer ${apiKey}`,
        'Accept': 'application/json'
      }
    });

    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }

    const data = await response.json();

    if (!data) {
      throw new Error('No data returned from Reader API');
    }

    return data;
  } catch (error) {
    console.error('Reader API request failed:', error.message);
    throw error;
  }
}

try {
  const readerResults = await fetchReaderData(link, READERAPI_KEY);

  output = {
    readerResults
  };

  return output;
} catch (error) {
  return {
    error: error.message,
    content: null,
    title: null,
    metadata: null
  };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Again, set your Input Data for this action:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;link&lt;/code&gt; - the value of the link from our Looping action.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;READERAPI_KEY&lt;/code&gt; - your Jina AI Reader API key, found on the Reader API website.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once complete, your second Code by Zapier action should look like this:&lt;/p&gt;

&lt;p&gt;Finally, we can send our data from the Reader API to an app. For this example, we will use Google Docs, but feel free to explore Zapier's integrations and find the action that best fits your use case. I recommend trying with Google Docs first to better understand the output data.&lt;/p&gt;

&lt;p&gt;Add the "Create Document From Text in Google Docs" action to your Loop. This will create a new Google Doc for each URL the Loop processes. Feel free to adjust as needed.&lt;/p&gt;

&lt;p&gt;If you have not done so previously, you may need to authorize Zapier to access your Google account. Once complete, your Setup tab should look something like this:&lt;/p&gt;

&lt;p&gt;Click "Continue" or navigate to the "Configure" tab. Here, we can set a name for the document and the Google Drive folder in which you would like to save the document and pass the content returned from the Reader API to the document's content.&lt;/p&gt;

&lt;p&gt;And you're all set! Make sure to test the Zap thoroughly to ensure there are no issues, and check your Google Drive to ensure the document was created.&lt;/p&gt;

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;No-code tools such as Make and Zapier offer powerful options for users who might not be comfortable writing or maintaining the scripts necessary to scrape valuable data found in search results. While SerpApi isn't necessarily a no-code solution, it abstracts away many of the complex concepts required to scrape search engines. When paired with no-code tools, even non-technical users can leverage this data.&lt;/p&gt;

</description>
      <category>nocode</category>
      <category>webscraping</category>
      <category>zapier</category>
      <category>serpapi</category>
    </item>
    <item>
      <title>No-code Solutions for Turning Search Results Into Markdown for LLMs</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Wed, 08 Jan 2025 22:59:36 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/no-code-solutions-for-turning-search-results-into-markdown-for-llms-2d01</link>
      <guid>https://dev.to/nate_serpapi/no-code-solutions-for-turning-search-results-into-markdown-for-llms-2d01</guid>
      <description>&lt;h2&gt;
  
  
  Intro
&lt;/h2&gt;

&lt;p&gt;In my last post, I walked through a coding solution in which we used Node.js to scrape webpages returned in Google Search results and parse them to Markdown for use in LLMs or other use cases. Today, we will do something similar, but we will use no-code solutions to return results from SerpApi’s Google Search API and parse them to Markdown using the Reader API by Jina.ai.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/blog/turning-search-results-into-markdown-for-llms-2/" rel="noopener noreferrer"&gt;Turning Search Results Into Markdown for LLMs ↗&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Reader by Jina
&lt;/h2&gt;

&lt;p&gt;Jina AI’s Reader API (&lt;a href="https://jina.ai/reader/" rel="noopener noreferrer"&gt;https://jina.ai/reader/&lt;/a&gt;) offers an alternative for extracting structured content from web pages, maintaining document structure, and handling diverse content types.&lt;/p&gt;

&lt;p&gt;💡 Jina AI provides 1 million test credits to get started, but keep in mind that tokens are used based on the amount of content parsed, not the number of requests made.&lt;/p&gt;

&lt;h2&gt;
  
  
  Scraping Search Results
&lt;/h2&gt;

&lt;p&gt;Several options integrate well with SerpApi for those who prefer no-code solutions. This section will walk through configuring Make and Zapier to query search results through SerpApi.&lt;/p&gt;

&lt;h2&gt;
  
  
  Make
&lt;/h2&gt;

&lt;p&gt;Starting with &lt;a href="https://make.com/" rel="noopener noreferrer"&gt;Make&lt;/a&gt;, we will create a Google Doc from the data returned by SerpApi and the Reader API. This is a relatively simple example, but Make provides many integrations with AI providers such as OpenAI, Anthropic, and more. Instead of sending the data to a Google Doc, you can replace the last module and send it wherever you need it!&lt;/p&gt;

&lt;p&gt;SerpApi has an official Make app, making it much more straightforward to configure than Zapier. I will only walk through configuring the SerpApi Google Search module here; for more information on our Make app, check out this blog post:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/blog/announcing-serpapis-make-app/" rel="noopener noreferrer"&gt;Announcing SerpApi’s Make App ↗&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create a new scenario and search for the SerpApi app to get started. Select your preferred search engine module; we will use Google for this example.&lt;/p&gt;

&lt;p&gt;After selecting your search engine, we can start configuring our call to SerpApi. You’ll add your SerpApi API key by creating a connection. Click “Create a connection.”&lt;/p&gt;

&lt;p&gt;Enter your API key and name your connection:&lt;/p&gt;

&lt;p&gt;From here, we can configure our query:&lt;/p&gt;

&lt;p&gt;You can fetch additional pages by increasing the “Pagination Limit” setting, but note that each page will cost 1 SerpApi search credit.&lt;/p&gt;

&lt;p&gt;Click “Run once” and verify the data coming back from SerpApi:&lt;/p&gt;

&lt;p&gt;Now that we have results we can work with, let’s add an “Iterator” module to allow us to use the data returned in the &lt;code&gt;organic_results&lt;/code&gt; array. The Iterator module (&lt;a href="https://www.make.com/en/help/tools/flow-control#iterator-935250" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;) converts arrays into a series of bundles. Each element in the array will output as a separate bundle, allowing us to query the Reader API for each link returned in our Google results.&lt;/p&gt;

&lt;p&gt;To query the Reader API for each link returned, let’s add an HTTP “Make a request” module that allows us to send HTTP requests.&lt;/p&gt;

&lt;p&gt;Set the URL parameter to &lt;code&gt;https://r.jina.ai/&lt;/code&gt;, then click the box again to pull up a list of available data we can pass to this parameter. From our Iterator module, select the “link” attribute. This passes the link from our organic Google search results to the Reader URL. Here’s what that should look like:&lt;/p&gt;

&lt;p&gt;Now, we need to set our header for authorization by passing our Bearer token provided by Jina AI:&lt;/p&gt;

&lt;p&gt;Before running again, let’s add our final step to create a Google Doc to store the results. Again, this final step can be changed depending on your use case and where you wish to send the data.&lt;/p&gt;

&lt;p&gt;Add the Google Docs “Create a Document” module and add a connection to your Google account. Now you can set a name for the document as well as pass data from the Reader API as the content of the doc:&lt;/p&gt;

&lt;p&gt;Now let’s run our scenario once more to ensure that everything is working correctly.&lt;/p&gt;

&lt;p&gt;While testing, we can set a limiter filter after the Iterator module to prevent the flow from continuing more than once.&lt;/p&gt;

&lt;p&gt;Click the wrench icon under the link between the Iterator and HTTP modules. Create a condition to continue only if the Iterator’s “position” attribute is numerically equal to “1”:&lt;/p&gt;

&lt;p&gt;Make sure to remove the limiter before publishing if you wish to query for all the results. Alternatively, you can set the &lt;code&gt;num&lt;/code&gt; parameter in the SerpApi Google Search module to limit the results returned by SerpApi.&lt;/p&gt;

&lt;h2&gt;
  
  
  Zapier
&lt;/h2&gt;

&lt;p&gt;While this article is focused on no-code solutions, SerpApi currently does not have a Zapier integration, meaning we must use a little code to make our API requests to SerpApi. I have provided code below that you can drop into Zapier Code steps without issue, though some adjustments may be needed to better fit your use case. I recommend starting with the code I’ve provided and then adjusting once you get results from SerpApi and the Reader API.&lt;/p&gt;

&lt;p&gt;In this section, we will walk through:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Use the Schedule Trigger to initiate the Zap&lt;/li&gt;
&lt;li&gt;  Retrieve search result URLs from SerpApi&lt;/li&gt;
&lt;li&gt;  Process webpages from those URLs through the Reader API into Markdown&lt;/li&gt;
&lt;li&gt;  Send the formatted content to your desired apps&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As this Zap requires API calls to both SerpApi and the Reader API, you will likely need a Trial or Professional/Team plan. Zapier limits code runtime to 1 second on the free plan, and both API requests will likely take longer (&lt;a href="https://help.zapier.com/hc/en-us/articles/29971850476173-Code-by-Zapier-rate-limits" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;To get started, add a trigger to your Zap. For this post, we will use the Schedule by Zapier trigger to run our Zap every day at 12:00 a.m. You can select a trigger that best fits your needs.&lt;/p&gt;

&lt;p&gt;💡 Make sure to test each step before continuing to ensure you’ve created test data for future steps. Zapier will typically force you to test a step after changes before publishing.&lt;/p&gt;

&lt;p&gt;Next, we add a Code by Zapier action to make our request to SerpApi. This example uses JavaScript, but you can choose Python if you prefer, using the code below as a guideline.&lt;/p&gt;

&lt;p&gt;I’ve included the JavaScript code for this action below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;SERPAPI_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;QUERY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;LOCATION&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;inputData&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;fetchGoogleSearchResults&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;location&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
      &lt;span class="s2"&gt;`https://serpapi.com/search.json?engine=google&amp;amp;q=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nf"&gt;encodeURIComponent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nx"&gt;query&lt;/span&gt;
      &lt;span class="p"&gt;)}&lt;/span&gt;&lt;span class="s2"&gt;&amp;amp;location=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nf"&gt;encodeURIComponent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nx"&gt;location&lt;/span&gt;
      &lt;span class="p"&gt;)}&lt;/span&gt;&lt;span class="s2"&gt;&amp;amp;gl=us&amp;amp;hl=en&amp;amp;api_key=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;SERPAPI_KEY&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;
    &lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`HTTP error! status: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;No data returned from SerpAPI&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Instead of just logging, throw the error&lt;/span&gt;
    &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`SerpAPI request failed: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchResults&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetchGoogleSearchResults&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;QUERY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;LOCATION&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;output&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;organicResults&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;searchResults&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;organic_results&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;No Organic Result Found&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;output&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Error:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="c1"&gt;// Return a default response or throw error based on your needs&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;organic_results&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After adding the above code to the action, add your input data:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;SERPAPI_KEY&lt;/code&gt; - your SerpApi key goes here. You can find your API key in your SerpApi &lt;a href="https://serpapi.com/manage-api-key" rel="noopener noreferrer"&gt;dashboard&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;QUERY&lt;/code&gt; - query parameter for the search you wish to perform.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;LOCATION&lt;/code&gt; - location parameter for the search you wish to perform.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you wish to add additional search parameters, you can set them as Input Data or update the URL in the code directly. Setting them as Input Data allows you to update them more easily in the future.&lt;/p&gt;

&lt;p&gt;Your first code step should now look something like this:&lt;/p&gt;

&lt;p&gt;Our request from SerpApi will return an array of links for our organic results. To send each link to the Reader API, we must iterate through the links using the Looping by Zapier action.&lt;/p&gt;

&lt;p&gt;This action takes an array of values and runs the actions within the loop for each value. For our “Values to Loop,” we can pass the “Organic Results Link” values from our second step to a key called “links”.&lt;/p&gt;

&lt;p&gt;Let’s also set the maximum number of loop iterations to &lt;code&gt;1&lt;/code&gt; for testing to ensure we aren't spending Reader API credits needlessly.&lt;/p&gt;

&lt;p&gt;Inside the Looping by Zapier action, we can add another Code by Zapier action to make our request to the Reader API. Paste the JavaScript below in the code section of this action:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;READERAPI_KEY&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;inputData&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;fetchReaderData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Encode URL for path parameter&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;encodedUrl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;encodeURIComponent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`https://r.jina.ai/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;encodedUrl&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;GET&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Authorization&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`Bearer &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Accept&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`HTTP error! status: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;No data returned from Reader API&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Reader API request failed:&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;readerResults&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetchReaderData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;READERAPI_KEY&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;output&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;readerResults&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;output&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Again, set your Input Data for this action:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;link&lt;/code&gt; - the value of the link from our Looping action.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;READERAPI_KEY&lt;/code&gt; - your Jina AI Reader API key, found on the Reader API &lt;a href="https://jina.ai/reader/" rel="noopener noreferrer"&gt;website&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once complete, your second Code by Zapier action should look like this:&lt;/p&gt;

&lt;p&gt;Finally, we can send our data from the Reader API to an app. For this example, we will use Google Docs, but feel free to explore Zapier’s integrations and find the action that best fits your use case. I recommend trying with Google Docs first to better understand the output data.&lt;/p&gt;

&lt;p&gt;Add the “Create Document From Text in Google Docs” action to your Loop. This will create a new Google Doc for each URL the Loop processes. Feel free to adjust as needed.&lt;/p&gt;

&lt;p&gt;If you have not done so previously, you may need to authorize Zapier to access your Google account. Once complete, your Setup tab should look something like this:&lt;/p&gt;

&lt;p&gt;Click “Continue” or navigate to the “Configure” tab. Here, you can name the document, choose the Google Drive folder where it should be saved, and pass the content returned from the Reader API as the document’s content.&lt;/p&gt;

&lt;p&gt;And you’re all set! Make sure to test the Zap thoroughly to ensure there are no issues, and check your Google Drive to ensure the document was created.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;No-code platforms such as Make and Zapier are powerful options for users who might not be comfortable writing or maintaining the scripts needed to scrape valuable data from search results. While SerpApi isn’t strictly a no-code solution, it abstracts away much of the complexity of scraping search engines. Paired with no-code tools, even non-technical users can leverage this data.&lt;/p&gt;

</description>
      <category>nocode</category>
      <category>webscraping</category>
      <category>zapier</category>
      <category>serpapi</category>
    </item>
    <item>
      <title>No-code Solutions for Turning Search Results Into Markdown for LLMs</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Wed, 08 Jan 2025 22:59:36 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/no-code-solutions-for-turning-search-results-into-markdown-for-llms-4jmb</link>
      <guid>https://dev.to/nate_serpapi/no-code-solutions-for-turning-search-results-into-markdown-for-llms-4jmb</guid>
      <description>&lt;h2&gt;
  
  
  Intro
&lt;/h2&gt;

&lt;p&gt;In my last post, I walked through a coding solution in which we used Node.js to scrape webpages returned in Google Search results and parse them to Markdown for use in LLMs or other use cases. Today, we will do something similar, but this time we will use no-code solutions to return results from SerpApi’s Google Search API and parse them to Markdown using Jina AI’s Reader API.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/blog/turning-search-results-into-markdown-for-llms-2/" rel="noopener noreferrer"&gt;Turning Search Results Into Markdown for LLMs ↗&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Reader by Jina
&lt;/h2&gt;

&lt;p&gt;Jina AI’s Reader API (&lt;a href="https://jina.ai/reader/" rel="noopener noreferrer"&gt;https://jina.ai/reader/&lt;/a&gt;) offers an alternative for extracting structured content from web pages, maintaining document structure, and handling diverse content types.&lt;/p&gt;

&lt;p&gt;💡 Jina AI provides 1 million test credits to get started, but keep in mind that tokens are used based on the amount of content parsed, not the number of requests made.&lt;/p&gt;
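&lt;p&gt;The request shape is simple: a GET to &lt;code&gt;https://r.jina.ai/&lt;/code&gt; followed by the target URL, authorized with a Bearer token. As a rough sketch (the helper name and return shape below are this post’s own illustration, not an official SDK):&lt;/p&gt;

```javascript
// Illustrative helper showing the Reader API request shape: a GET to
// https://r.jina.ai/<encoded target URL> with a Bearer token.
// The function name and returned object are this sketch's own invention.
function buildReaderRequest(targetUrl, apiKey) {
  return {
    url: `https://r.jina.ai/${encodeURIComponent(targetUrl)}`,
    options: {
      method: "GET",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        // Ask for a structured JSON response, matching the code later in this post
        Accept: "application/json",
      },
    },
  };
}

const { url, options } = buildReaderRequest("https://example.com/post", "YOUR_KEY");
console.log(url); // https://r.jina.ai/https%3A%2F%2Fexample.com%2Fpost
```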

&lt;h2&gt;
  
  
  Scraping Search Results
&lt;/h2&gt;

&lt;p&gt;Several options integrate well with SerpApi for those who prefer no-code solutions. This section will walk through configuring Make and Zapier to query search results through SerpApi.&lt;/p&gt;

&lt;h2&gt;
  
  
  Make
&lt;/h2&gt;

&lt;p&gt;Starting with &lt;a href="https://make.com/" rel="noopener noreferrer"&gt;Make&lt;/a&gt;, we will create a Google Doc from the data returned by SerpApi and the Reader API. This is a relatively simple example, but Make provides many integrations with AI providers such as OpenAI, Anthropic, and more. Instead of sending the data to a Google Doc, you can replace the last module and send it wherever you need it!&lt;/p&gt;

&lt;p&gt;SerpApi has an official Make app, making it much more straightforward to configure than Zapier. While I will only walk through configuring the SerpApi Google Search module, you can find more information on our Make app in this blog post:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/blog/announcing-serpapis-make-app/" rel="noopener noreferrer"&gt;Announcing SerpApi’s Make App ↗&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create a new scenario and search for the SerpApi app to get started. Select your preferred search engine module; we will use Google for this example.&lt;/p&gt;

&lt;p&gt;After selecting your search engine, we can start configuring our call to SerpApi. You’ll add your SerpApi API key by creating a connection. Click “Create a connection.”&lt;/p&gt;

&lt;p&gt;Enter your API key and name your connection:&lt;/p&gt;

&lt;p&gt;From here, we can configure our query:&lt;/p&gt;

&lt;p&gt;You can fetch additional pages by increasing the “Pagination Limit” setting, but note that each page costs 1 SerpApi search credit.&lt;/p&gt;

&lt;p&gt;Click “Run once” and verify the data coming back from SerpApi:&lt;/p&gt;

&lt;p&gt;Now that we have results we can work with, let’s add an “Iterator” module to allow us to use the data returned in the &lt;code&gt;organic_results&lt;/code&gt; array. The Iterator module (&lt;a href="https://www.make.com/en/help/tools/flow-control#iterator-935250" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;) converts arrays into a series of bundles. Each element in the array will output as a separate bundle, allowing us to query the Reader API for each link returned in our Google results.&lt;/p&gt;

&lt;p&gt;To query the Reader API for each link returned, let’s add the HTTP “Make a request” module.&lt;/p&gt;

&lt;p&gt;Set the URL parameter to &lt;code&gt;https://r.jina.ai/&lt;/code&gt;, then click the box again to pull up a list of available data we can pass to this parameter. From our Iterator module, select the “link” attribute. This passes the link from our organic Google search results to the Reader URL. Here’s what that should look like:&lt;/p&gt;

&lt;p&gt;Now, we need to set our header for authorization by passing our Bearer token provided by Jina AI:&lt;/p&gt;

&lt;p&gt;Before running again, let’s add our final step to create a Google Doc to store the results. Again, this final step can be changed depending on your use case and where you wish to send the data.&lt;/p&gt;

&lt;p&gt;Add the Google Docs “Create a Document” module and add a connection to your Google account. Now you can set a name for the document as well as pass data from the Reader API as the content of the doc:&lt;/p&gt;

&lt;p&gt;Now, let’s run our scenario once more to ensure that everything is working correctly.&lt;/p&gt;

&lt;p&gt;While testing, we can add a filter after the Iterator module to prevent the flow from continuing more than once.&lt;/p&gt;

&lt;p&gt;Click the wrench icon under the link between the Iterator and HTTP modules. Create a condition to continue only if the Iterator’s “position” attribute is numerically equal to “1”:&lt;/p&gt;

&lt;p&gt;Make sure to remove the filter before publishing if you wish to query all the results. Alternatively, you can set the &lt;code&gt;num&lt;/code&gt; parameter in the SerpApi Search Google module to limit the results returned by SerpApi.&lt;/p&gt;

&lt;h2&gt;
  
  
  Zapier
&lt;/h2&gt;

&lt;p&gt;While this article focuses on no-code solutions, SerpApi does not currently have a Zapier integration, so we must use a little code to make our API requests to SerpApi. I have provided code below that you can drop into Code by Zapier steps without issue, though some adjustments may be needed to better fit your use case. I recommend starting with the code I’ve provided and then adjusting once you get results from SerpApi and the Reader API.&lt;/p&gt;

&lt;p&gt;In this section, we will walk through:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Using the Schedule by Zapier trigger to initiate the Zap&lt;/li&gt;
&lt;li&gt;  Retrieving search result URLs from SerpApi&lt;/li&gt;
&lt;li&gt;  Processing the webpages at those URLs into Markdown with the Reader API&lt;/li&gt;
&lt;li&gt;  Sending the formatted content to your desired apps&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As this Zap requires API calls to both SerpApi and the Reader API, you will likely need a Trial or Professional/Team plan. Zapier limits code runtime to 1 second on the free plan, and both API requests will likely take longer (&lt;a href="https://help.zapier.com/hc/en-us/articles/29971850476173-Code-by-Zapier-rate-limits" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;To get started, add a trigger to your Zap. For this post, we will use the Schedule by Zapier trigger to run our Zap every day at 12:00 a.m. You can select a trigger that best fits your needs.&lt;/p&gt;

&lt;p&gt;💡 Make sure to test each step before continuing to ensure you’ve created test data for future steps. Zapier will typically force you to test a step after changes before publishing.&lt;/p&gt;

&lt;p&gt;Next, we add a Code by Zapier action to make our request to SerpApi. This example uses JavaScript, but you can choose Python if you prefer, using the code below as a guideline.&lt;/p&gt;

&lt;p&gt;I’ve included the JavaScript code for this action below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;SERPAPI_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;QUERY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;LOCATION&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;inputData&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;fetchGoogleSearchResults&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;location&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
      &lt;span class="s2"&gt;`https://serpapi.com/search.json?engine=google&amp;amp;q=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nf"&gt;encodeURIComponent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nx"&gt;query&lt;/span&gt;
      &lt;span class="p"&gt;)}&lt;/span&gt;&lt;span class="s2"&gt;&amp;amp;location=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nf"&gt;encodeURIComponent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nx"&gt;location&lt;/span&gt;
      &lt;span class="p"&gt;)}&lt;/span&gt;&lt;span class="s2"&gt;&amp;amp;gl=us&amp;amp;hl=en&amp;amp;api_key=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;SERPAPI_KEY&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;
    &lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`HTTP error! status: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;No data returned from SerpAPI&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Instead of just logging, throw the error&lt;/span&gt;
    &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`SerpAPI request failed: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchResults&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetchGoogleSearchResults&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;QUERY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;LOCATION&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;output&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;organicResults&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;searchResults&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;organic_results&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;No Organic Result Found&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;output&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Error:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="c1"&gt;// Return a default response or throw error based on your needs&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;organic_results&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After adding the above code to the action, add your input data:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;SERPAPI_KEY&lt;/code&gt; - your SerpApi key goes here. You can find your API key in your SerpApi &lt;a href="https://serpapi.com/manage-api-key" rel="noopener noreferrer"&gt;dashboard&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;QUERY&lt;/code&gt; - query parameter for the search you wish to perform.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;LOCATION&lt;/code&gt; - location parameter for the search you wish to perform.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you wish to add additional search parameters, you can set them as Input Data or update the URL in the code directly. Setting them as Input Data allows you to update them more easily in the future.&lt;/p&gt;
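&lt;p&gt;For example, building the request URL with &lt;code&gt;URLSearchParams&lt;/code&gt; keeps extra parameters tidy and handles the encoding for you. A sketch (the helper name is my own; the parameters mirror SerpApi’s Google Search API):&lt;/p&gt;

```javascript
// Sketch of building the SerpApi request URL with URLSearchParams instead of
// manual string concatenation. The helper name is illustrative; the query
// parameters (engine, q, location, gl, hl, num, api_key) come from SerpApi's
// Google Search API.
function buildSerpApiUrl({ query, location, apiKey, extra = {} }) {
  const params = new URLSearchParams({
    engine: "google",
    q: query,
    location: location,
    gl: "us",
    hl: "en",
    api_key: apiKey,
    ...extra, // e.g. { num: "10" } to limit results
  });
  return `https://serpapi.com/search.json?${params.toString()}`;
}
```

&lt;p&gt;Any extra Input Data you add in Zapier can then be passed through the &lt;code&gt;extra&lt;/code&gt; object without touching the URL string again.&lt;/p&gt;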

&lt;p&gt;Your first code step should now look something like this:&lt;/p&gt;

&lt;p&gt;Our request from SerpApi will return an array of links for our organic results. To send each link to the Reader API, we must iterate through the links using the Looping by Zapier action.&lt;/p&gt;

&lt;p&gt;This action takes an array of values and runs the actions within the loop for each value. For our “Values to Loop,” we can pass the “Organic Results Link” values from our second step to a key called “links”.&lt;/p&gt;
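&lt;p&gt;In other words, Looping by Zapier wants a flat array of link strings, one per organic result. A quick sketch of that mapping (the sample data is made up, shaped like SerpApi’s &lt;code&gt;organic_results&lt;/code&gt;):&lt;/p&gt;

```javascript
// Illustrative sample shaped like the first Code step's output; the values
// are hypothetical, but the structure mirrors SerpApi's organic_results.
const output = {
  organicResults: [
    { position: 1, title: "Example A", link: "https://example.com/a" },
    { position: 2, title: "Example B", link: "https://example.com/b" },
  ],
};

// Looping by Zapier iterates over a flat array of values, so map each
// result object down to just its link.
const links = output.organicResults.map((result) => result.link);
// links is ["https://example.com/a", "https://example.com/b"]
```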

&lt;p&gt;Let’s also set the maximum number of loop iterations to &lt;code&gt;1&lt;/code&gt; for testing to ensure we aren't spending Reader API credits needlessly.&lt;/p&gt;

&lt;p&gt;Inside the Looping by Zapier action, we can add another Code by Zapier action to make our request to the Reader API. Paste the JavaScript below in the code section of this action:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;READERAPI_KEY&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;inputData&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;fetchReaderData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Encode URL for path parameter&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;encodedUrl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;encodeURIComponent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`https://r.jina.ai/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;encodedUrl&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;GET&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Authorization&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`Bearer &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Accept&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`HTTP error! status: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;No data returned from Reader API&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Reader API request failed:&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;readerResults&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetchReaderData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;READERAPI_KEY&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;output&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;readerResults&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;output&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Again, set your Input Data for this action:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;link&lt;/code&gt; - the value of the link from our Looping action.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;READERAPI_KEY&lt;/code&gt; - your Jina AI Reader API key, found on the Reader API &lt;a href="https://jina.ai/reader/" rel="noopener noreferrer"&gt;website&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once complete, your second Code by Zapier action should look like this:&lt;/p&gt;

&lt;p&gt;Finally, we can send our data from the Reader API to an app. For this example, we will use Google Docs, but feel free to explore Zapier’s integrations and find the action that best fits your use case. I recommend trying with Google Docs first to better understand the output data.&lt;/p&gt;

&lt;p&gt;Add the “Create Document From Text in Google Docs” action to your Loop. This will create a new Google Doc for each URL the Loop processes. Feel free to adjust as needed.&lt;/p&gt;

&lt;p&gt;If you have not done so previously, you may need to authorize Zapier to access your Google account. Once complete, your Setup tab should look something like this:&lt;/p&gt;

&lt;p&gt;Click “Continue” or navigate to the “Configure” tab. Here, you can name the document, choose the Google Drive folder it will be saved in, and pass the content returned from the Reader API into the document body.&lt;/p&gt;

&lt;p&gt;And you’re all set! Make sure to test the Zap thoroughly to ensure there are no issues, and check your Google Drive to ensure the document was created.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;No-code platforms such as Make and Zapier are powerful options for users who might not be comfortable writing or maintaining the scripts needed to scrape valuable data from search results. While SerpApi isn’t strictly a no-code solution, it abstracts away many of the complexities of scraping search engines. When paired with no-code tools, even non-technical users can leverage this data.&lt;/p&gt;

</description>
      <category>nocode</category>
      <category>webscraping</category>
      <category>zapier</category>
      <category>serpapi</category>
    </item>
    <item>
      <title>Turning search results into Markdown for LLMs</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Fri, 06 Dec 2024 00:24:02 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/turning-search-results-into-markdown-for-llms-1jc5</link>
      <guid>https://dev.to/nate_serpapi/turning-search-results-into-markdown-for-llms-1jc5</guid>
      <description>&lt;h2&gt;
  
  
  Intro
&lt;/h2&gt;

&lt;p&gt;This article will walk through converting search results into Markdown format, suitable for use in large language models (LLMs) and other applications.&lt;/p&gt;

&lt;p&gt;Markdown is a lightweight markup language that provides a simple, readable way to format text with plain-text syntax. Check out the Markdown Guide for more information:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.markdownguide.org/" rel="noopener noreferrer"&gt;Markdown Guide&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Use Case
&lt;/h2&gt;

&lt;p&gt;Markdown's simple, readable format allows for the transformation of raw webpage data into clean, actionable information across different use cases:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;LLM Training:&lt;/strong&gt; Generate Q&amp;amp;A datasets or custom knowledge bases.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Content Aggregation:&lt;/strong&gt; Create training datasets or compile research.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Market Research:&lt;/strong&gt; Monitor competitors or gather product information.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  SerpApi
&lt;/h2&gt;

&lt;p&gt;SerpApi is a web scraping service that allows developers to extract search engine results and data from various search engines, including Google, Bing, Yahoo, Baidu, Yandex, and others. It provides a simple way to access search engine data programmatically without dealing directly with the complexities of web scraping.&lt;/p&gt;

&lt;p&gt;This guide focuses on the Google Search API, but the concepts and techniques discussed can be adapted for use with SerpApi’s other APIs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Google Search API
&lt;/h2&gt;

&lt;p&gt;The Google Search API lets developers programmatically retrieve structured JSON data from live Google searches. Key benefits include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;CAPTCHA and browser handling:&lt;/strong&gt; Avoid manual intervention and IP blocks.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Structured data:&lt;/strong&gt; Output is clean and easy to parse.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Global and multilingual support:&lt;/strong&gt; Search in specific languages or regions.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Scalability:&lt;/strong&gt; Perform high-volume searches without disruptions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api" rel="noopener noreferrer"&gt;Google Search Engine Results API&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;This section provides a complete code example for fetching Google search results using SerpApi, parsing the webpage content, and converting it to Markdown. While this example uses Node.js (JavaScript), the same principles apply in other languages.&lt;/p&gt;

&lt;h3&gt;
  
  
  Required Packages
&lt;/h3&gt;

&lt;p&gt;Make sure to install the following packages in your Node.js project.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SerpApi JavaScript:&lt;/strong&gt; Scrape and parse search engine results using SerpApi. Get search results from Google, Bing, Baidu, Yandex, Yahoo, Home Depot, eBay and more.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/serpapi/serpapi-javascript" rel="noopener noreferrer"&gt;SerpApi JavaScript&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cheerio:&lt;/strong&gt; A fast, flexible, and elegant library for parsing and manipulating HTML and XML.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/cheeriojs/cheerio" rel="noopener noreferrer"&gt;Cheerio&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Turndown:&lt;/strong&gt; Convert HTML into Markdown with JavaScript.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/mixmark-io/turndown" rel="noopener noreferrer"&gt;Turndown&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Importing Packages
&lt;/h3&gt;

&lt;p&gt;First, we must import all of our required packages:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;dotenv&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;dotenv&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;fetch&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node-fetch&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fs/promises&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;path&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;path&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;getJson&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;serpapi&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nx"&gt;cheerio&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;cheerio&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;TurndownService&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;turndown&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Fetching Search Results
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;fetchSearchResults&lt;/code&gt; function retrieves search results using SerpApi’s Google Search API:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;fetchSearchResults&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;getJson&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;google&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;SERPAPI_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;q&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;num&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
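&lt;p&gt;The resolved value is a plain JSON object. Abridged and illustrative (the field values here are made up), the part of the response the rest of this script relies on looks like this:&lt;/p&gt;

```javascript
// Abridged, illustrative shape of a Google Search API response.
// Only the fields this script reads later are shown; real responses
// contain many more fields (search_metadata, pagination, etc.).
const exampleResponse = {
  organic_results: [
    {
      position: 1,
      title: "An example result",
      link: "https://example.com/page",
      snippet: "A short description of the page...",
    },
  ],
};

// The main loop iterates organic_results and reads each result's link:
console.log(exampleResponse.organic_results[0].link);
```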



&lt;p&gt;Create a &lt;strong&gt;.env&lt;/strong&gt; file, add your SerpApi key to it, and install the dotenv package. Or, if you are simply running the script locally, replace &lt;code&gt;process.env.SERPAPI_KEY&lt;/code&gt; with your API key directly.&lt;/p&gt;
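&lt;p&gt;The &lt;strong&gt;.env&lt;/strong&gt; file is a one-line key/value fragment; the variable name must match what the code reads from &lt;code&gt;process.env&lt;/code&gt;, and the value shown here is a placeholder. Note that &lt;code&gt;dotenv.config()&lt;/code&gt; must be called before the first &lt;code&gt;process.env&lt;/code&gt; read for the file to be loaded:&lt;/p&gt;

```
# .env
SERPAPI_KEY=your_serpapi_api_key_here
```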

&lt;h3&gt;
  
  
  Parsing Webpage Content
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;parseUrl&lt;/code&gt; function fetches the HTML of a given URL, cleans it, and converts it to Markdown:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;parseUrl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Configure fetch request with browser-like headers&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;User-Agent&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;Accept&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`HTTP error! status: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;html&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;text&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="c1"&gt;// Initialize HTML parser and markdown converter&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;$&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;cheerio&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;load&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;html&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;turndown&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;TurndownService&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="na"&gt;headingStyle&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;atx&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;codeBlockStyle&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fenced&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;

    &lt;span class="c1"&gt;// Clean up HTML by removing unnecessary elements&lt;/span&gt;
    &lt;span class="nf"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;script, style, nav, footer, iframe, .ads&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;remove&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="c1"&gt;// Extract title and main content&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;title&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;text&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;trim&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nf"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;h1&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;text&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;trim&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;mainContent&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;
      &lt;span class="nf"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;article, main, .content, #content, .post&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;html&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt;
      &lt;span class="nf"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;body&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;html&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;content&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;turndown&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;turndown&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;mainContent&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;content&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Failed to parse &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;:`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function produces clean, readable Markdown by removing non-essential elements like scripts and ads.&lt;/p&gt;

&lt;h3&gt;
  
  
  Sanitizing Keywords
&lt;/h3&gt;

&lt;p&gt;To prevent filename issues, we can sanitize keywords before using them in filenames:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;sanitizeKeyword&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;keyword&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;keyword&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sr"&gt;/&lt;/span&gt;&lt;span class="se"&gt;\\&lt;/span&gt;&lt;span class="sr"&gt;s+/g&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;_&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;// Replace spaces with underscores&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;substring&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;// Truncate to 15 characters&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;toLowerCase&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt; &lt;span class="c1"&gt;// Convert to lowercase&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
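&lt;p&gt;Depending on your keywords, you may want a stricter variant that also drops characters that are invalid or awkward in filenames (slashes, colons, and so on). This is a hypothetical sketch, not part of the original script:&lt;/p&gt;

```javascript
// Hypothetical stricter sanitizer: strips filename-unsafe characters
// before collapsing whitespace, truncating, and lowercasing.
const sanitizeKeywordStrict = (keyword) => {
  return keyword
    .replace(/[^a-z0-9\s_-]/gi, "") // keep letters, digits, whitespace, _ and -
    .trim()                         // drop leading/trailing whitespace
    .replace(/\s+/g, "_")           // collapse runs of whitespace to one underscore
    .substring(0, 15)               // truncate to 15 characters
    .toLowerCase();                 // normalize case
};

console.log(sanitizeKeywordStrict("PlayStation 5: Pro/Slim")); // "playstation_5_p"
```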



&lt;h3&gt;
  
  
  Writing to Markdown
&lt;/h3&gt;

&lt;p&gt;This function writes the parsed content to a Markdown file, using the sanitize function to set the file's name:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;writeToMarkdown&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;keyword&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;sanitizedKeyword&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;sanitizeKeyword&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;keyword&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;filename&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;output&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;toISOString&lt;/span&gt;&lt;span class="p"&gt;()}&lt;/span&gt;&lt;span class="s2"&gt;_&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;sanitizedKeyword&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;_&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.md`&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;content&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`[//]: # (Source: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;)&lt;/span&gt;&lt;span class="se"&gt;\\&lt;/span&gt;&lt;span class="s2"&gt;n&lt;/span&gt;&lt;span class="se"&gt;\\&lt;/span&gt;&lt;span class="s2"&gt;n# &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="se"&gt;\\&lt;/span&gt;&lt;span class="s2"&gt;n&lt;/span&gt;&lt;span class="se"&gt;\\&lt;/span&gt;&lt;span class="s2"&gt;n&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;writeFile&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;filename&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;utf-8&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;filename&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
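&lt;p&gt;The &lt;code&gt;[//]: #&lt;/code&gt; line is a common Markdown "comment" trick: a link-reference definition that most renderers hide, which keeps the source URL in the file without displaying it. Pulled out of the function above, the header construction looks like this (a small sketch; &lt;code&gt;buildMarkdown&lt;/code&gt; is a hypothetical helper name):&lt;/p&gt;

```javascript
// Sketch of the Markdown content writeToMarkdown assembles: a hidden
// comment with the source URL, the page title as an H1, then the body.
const buildMarkdown = (title, body, url) => {
  return `[//]: # (Source: ${url})\n\n# ${title}\n\n${body}`;
};

console.log(buildMarkdown("Example Page", "Some content.", "https://example.com"));
```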



&lt;h3&gt;
  
  
  Main Execution
&lt;/h3&gt;

&lt;p&gt;The main script invokes the process. Update the &lt;code&gt;keywords&lt;/code&gt; array with keywords relevant to your use case:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Example Keyword array&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;keywords&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;coffee&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;playstation 5&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;web scraping&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;

&lt;span class="c1"&gt;// Main execution block&lt;/span&gt;
&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Create output directory if it doesn't exist&lt;/span&gt;
    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mkdir&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;output&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;recursive&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

    &lt;span class="c1"&gt;// Process each keyword&lt;/span&gt;
    &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;keyword&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;keywords&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetchSearchResults&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;keyword&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

      &lt;span class="c1"&gt;// Process search results if available&lt;/span&gt;
      &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;organic_results&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;organic_results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;result&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;organic_results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;parseUrl&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;filename&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;writeToMarkdown&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
              &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
              &lt;span class="nx"&gt;keyword&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
              &lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
              &lt;span class="nx"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;
            &lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Written to: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;filename&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Failed to process &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;link&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;:`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="k"&gt;continue&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`No organic results found for keyword: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;keyword&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;})();&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To summarize the above, we:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Set up the output directory:&lt;/strong&gt; Ensures files are saved to an appropriate location.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Fetch and parse results:&lt;/strong&gt; Processes each search result URL for relevant content.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Error handling:&lt;/strong&gt; Prevents the entire process from failing due to individual errors.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Next Steps
&lt;/h3&gt;

&lt;p&gt;While the above should get you started, you may need to configure Cheerio or Turndown further to dial in the sections you're scraping.&lt;/p&gt;

&lt;p&gt;You can find a repository for the above code here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/NateSkiles/search-results-to-markdown" rel="noopener noreferrer"&gt;NateSkiles/search-results-to-markdown&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;SerpApi simplifies accessing structured search engine data through programmatic methods. By leveraging code-based solutions, developers can efficiently extract and transform web pages from search results into usable formats, enabling data collection and analysis.&lt;/p&gt;

&lt;h2&gt;
  
  
  Related Blogs
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/how-to-scrape-google-search-results-with-python/" rel="noopener noreferrer"&gt;How to scrape Google search results with Python&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/how-to-scrape-google-search-results-serps-2023-guide/" rel="noopener noreferrer"&gt;How to Scrape Google Search Results (SERPs) - 2024 Guide&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/google-reviews-analyzer/" rel="noopener noreferrer"&gt;Unlocking Insights: Analyzing Google Reviews with LLMs&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/scrape-google-ai-overviews/" rel="noopener noreferrer"&gt;How to Scrape Google AI Overviews (AIO)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>webscraping</category>
      <category>markdown</category>
      <category>serpapi</category>
      <category>llm</category>
    </item>
    <item>
      <title>Bulk Image Search with Google Lens</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Tue, 12 Nov 2024 20:36:13 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/bulk-image-search-with-google-lens-53ia</link>
      <guid>https://dev.to/nate_serpapi/bulk-image-search-with-google-lens-53ia</guid>
      <description>&lt;h2&gt;
  
  
  Bulk Image Search: Getting Started
&lt;/h2&gt;

&lt;p&gt;Introducing the Bulk Image Search tool - a simple, no-code web app to utilize SerpApi's Google Lens API. Whether you're a researcher, marketer, or someone who performs many visual searches, the Bulk Image Search tool streamlines visual searches and helps you get the insights you need faster. Keep reading to learn how to get started.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Does It Do?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Upload images to Imgur in bulk&lt;/li&gt;
&lt;li&gt;Scrape multiple Google Lens search results simultaneously&lt;/li&gt;
&lt;li&gt;Download search results in your preferred format (CSV, JSON, or Excel)&lt;/li&gt;
&lt;li&gt;Access images from your Imgur account&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  SerpApi Introduction
&lt;/h2&gt;

&lt;p&gt;SerpApi is a web scraping company that allows developers to extract search engine results and data from various search engines, including Google, Bing, Yahoo, Baidu, Yandex, and others. It provides a simple way to access search engine data programmatically without dealing directly with the complexities of web scraping.&lt;/p&gt;

&lt;h3&gt;
  
  
  Google Lens API
&lt;/h3&gt;

&lt;p&gt;This tool uses SerpApi's Google Lens API to scrape results from Google Lens (&lt;a href="https://lens.google/" rel="noopener noreferrer"&gt;https://lens.google/&lt;/a&gt;) when performing an image search. The results related to the image can contain visual matches, videos, text, and other data. Search results can be returned for specific countries or languages.&lt;/p&gt;

&lt;p&gt;The Google Lens API currently only supports images with URLs, meaning the image must be hosted publicly to be searched. SerpApi has an open feature request to enable file uploads, and you can follow its progress here:&lt;/p&gt;


&lt;div class="ltag_github-liquid-tag"&gt;
  &lt;h1&gt;
    &lt;a href="https://github.com/serpapi/public-roadmap/issues/948" rel="noopener noreferrer"&gt;
      &lt;img class="github-logo" alt="GitHub logo" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.dev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg"&gt;
      &lt;span class="issue-title"&gt;
        [Google Lens API] Support image search via file upload
      &lt;/span&gt;
      &lt;span class="issue-number"&gt;#948&lt;/span&gt;
    &lt;/a&gt;
  &lt;/h1&gt;
  &lt;div class="github-thread"&gt;
    &lt;div class="timeline-comment-header"&gt;
      &lt;a href="https://github.com/ilyazub" rel="noopener noreferrer"&gt;
        &lt;img class="github-liquid-tag-img" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Favatars.githubusercontent.com%2Fu%2F282605%3Fv%3D4" alt="ilyazub avatar"&gt;
      &lt;/a&gt;
      &lt;div class="timeline-comment-header-text"&gt;
        &lt;strong&gt;
          &lt;a href="https://github.com/ilyazub" rel="noopener noreferrer"&gt;ilyazub&lt;/a&gt;
        &lt;/strong&gt; posted on &lt;a href="https://github.com/serpapi/public-roadmap/issues/948" rel="noopener noreferrer"&gt;&lt;time&gt;Jun 21, 2023&lt;/time&gt;&lt;/a&gt;
      &lt;/div&gt;
    &lt;/div&gt;
    &lt;div class="ltag-github-body"&gt;
      &lt;p&gt;To-do&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;[ ] Implement Google Lens and Reverse Image search via file upload&lt;/li&gt;
&lt;li&gt;[ ] Document that &lt;code&gt;async=1&lt;/code&gt; is not available for these requests&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Experimental implementation: &lt;a href="https://github.com/serpapi/SerpApi/pull/2412" rel="noopener noreferrer"&gt;https://github.com/serpapi/SerpApi/pull/2412&lt;/a&gt;&lt;/p&gt;

    &lt;/div&gt;
    &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/serpapi/public-roadmap/issues/948" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;


&lt;h3&gt;
  
  
  Use Cases
&lt;/h3&gt;

&lt;p&gt;The Google Lens API can be leveraged for various use cases across different industries. Here are a few examples:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Visual Search for E-commerce&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Identify similar products or alternatives based on an image of a product&lt;/li&gt;
&lt;li&gt;Monitor competitor pricing and product offerings by searching for their images&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Academic and Scientific Research&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Identify plant and animal species from images&lt;/li&gt;
&lt;li&gt;Reverse image search to find the original source of an image&lt;/li&gt;
&lt;li&gt;Find similar images for training AI models&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Trend Analysis&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Detect emerging visual styles, designs, or themes across industries&lt;/li&gt;
&lt;li&gt;Monitor changes in product packaging, advertising, or website designs over time&lt;/li&gt;
&lt;li&gt;Gather insights on popular visual motifs, objects, or scenes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The Google Lens API is a versatile and powerful tool for developers, researchers, marketers, and anyone who needs to extract insights from visual data at scale. By leveraging programmatic access to this visual search functionality, users can streamline their workflows.&lt;/p&gt;

&lt;p&gt;You can test the Google Lens API in SerpApi's playground environment found here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/playground?engine=google_lens" rel="noopener noreferrer"&gt;SerpApi Playground&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Quick Setup&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;1 - Sign up for free accounts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An &lt;a href="https://imgur.com/" rel="noopener noreferrer"&gt;Imgur account&lt;/a&gt;, for storing your images.&lt;/li&gt;
&lt;li&gt;A &lt;a href="https://serpapi.com/users/sign_up" rel="noopener noreferrer"&gt;SerpApi account&lt;/a&gt;, which is needed to perform searches. SerpApi's free plan includes 100 successful searches every month.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;2 - Log in to Bulk Image Search with your Imgur account&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fua0ucyjunuscowqwoypy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fua0ucyjunuscowqwoypy.png" alt="Log in" width="800" height="522"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3 - Add your &lt;a href="https://serpapi.com/dashboard" rel="noopener noreferrer"&gt;SerpApi API key&lt;/a&gt; to the &lt;a href="https://bulkimagesearch.com/settings" rel="noopener noreferrer"&gt;settings page&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F24gqau6ximz13b3h4n5i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F24gqau6ximz13b3h4n5i.png" alt="Save SerpApi Key" width="800" height="521"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Using the Tool
&lt;/h2&gt;

&lt;p&gt;In this section, we will walk through performing a bulk image search. The tool has three simple steps: uploading your images, configuring your search parameters, and reviewing your results.&lt;/p&gt;

&lt;h3&gt;
  
  
  Upload Images
&lt;/h3&gt;

&lt;p&gt;Head to the upload page to get started. You can upload up to 50 images with a maximum single image size of 4.5MB:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://bulkimagesearch.com/upload" rel="noopener noreferrer"&gt;https://bulkimagesearch.com/upload&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7lpz9fs1zboejxnfij3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7lpz9fs1zboejxnfij3.png" alt="Upload image" width="800" height="522"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Search Images
&lt;/h3&gt;

&lt;p&gt;After adding images to your Imgur account, navigate to the &lt;a href="https://bulkimagesearch.com/search" rel="noopener noreferrer"&gt;search page&lt;/a&gt; to select all the images you would like to search for, set your search parameters, and choose the format you want the results returned in.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcc9bujepvzhyqaz5o586.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcc9bujepvzhyqaz5o586.png" alt="Search page" width="800" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Get Your Results&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Click the "Search" button to start your search. Before your searches are processed, you'll be asked to confirm the maximum number of credits that will be used.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fejwe5l8dke0xcpgd10xh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fejwe5l8dke0xcpgd10xh.png" alt="Search Confirmation" width="800" height="519"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note that if you perform a search with the same parameters within an hour, SerpApi will serve cached results that do not count against your monthly search quota. To bypass search caching, select the "No Cache" search option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fov6n5osabtq9u83nmsbh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fov6n5osabtq9u83nmsbh.png" alt="Search Results" width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Manual Method
&lt;/h2&gt;

&lt;p&gt;Now that I've shown you how easy it is to use the Bulk Image Search tool, let's discuss what it would take to accomplish this manually with scripts (or as close as possible).&lt;/p&gt;

&lt;h3&gt;
  
  
  Image Hosting
&lt;/h3&gt;

&lt;p&gt;One common issue I see when trying to use the Google Lens API is searching for images hosted in AWS S3 buckets. For whatever reason, Google Lens has trouble processing AWS URLs, which makes choosing an image hosting option a bit tricky.&lt;/p&gt;

&lt;p&gt;For this reason, I typically suggest users host their images on Imgur (&lt;a href="https://imgur.com/" rel="noopener noreferrer"&gt;&lt;strong&gt;https://imgur.com/&lt;/strong&gt;&lt;/a&gt;), as the issues mentioned above do not exist for Imgur URLs.&lt;/p&gt;

&lt;p&gt;Imgur uses OAuth 2.0 to authorize requests. While I won't walk through the complete configuration, below are some resources for getting up and running with the Imgur API:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://apidocs.imgur.com/#authorization-and-oauth" rel="noopener noreferrer"&gt;Imgur Authorization and OAuth Docs&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/Imgur/imgurpython" rel="noopener noreferrer"&gt;Official Imgur Python library&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/eirikb/gifie" rel="noopener noreferrer"&gt;Example HTML5/JavaScript app&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Image Uploading
&lt;/h3&gt;

&lt;p&gt;Once authorization is set up, uploading images to Imgur is as easy as POSTing form data to the following endpoint:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;https://api.imgur.com/3/image&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Below is an example from &lt;a href="https://apidocs.imgur.com/#c85c9dfc-7487-4de2-9ecd-66f727cf3139" rel="noopener noreferrer"&gt;Imgur's docs&lt;/a&gt; for uploading an image:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;request&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;request&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;fs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;options&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;method&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;POST&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;url&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;https://api.imgur.com/3/image&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;headers&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Authorization&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Client-ID {{clientId}}&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="na"&gt;formData&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;image&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;value&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createReadStream&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/home/flakrim/Downloads/GHJQTpX.jpeg&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;options&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;filename&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;GHJQTpX.jpeg&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;contentType&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;image&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;title&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Simple upload&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;This is a simple image upload in Imgur&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="nf"&gt;request&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Image Search
&lt;/h3&gt;

&lt;p&gt;After uploading your images to Imgur and storing the image's URL, you can start fetching search results from the Google Lens API. I recommend using one of SerpApi's integrations for many popular programming languages:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/integrations" rel="noopener noreferrer"&gt;SerpApi: Integrations&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using these integrations will allow you to export code directly from SerpApi's playground:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj0q7x17efcdvh3sxvogc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj0q7x17efcdvh3sxvogc.png" alt="Export to code" width="800" height="775"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/playground?engine=google_lens&amp;amp;url=https%3A%2F%2Fi.imgur.com%2FHBrB8p0.png" rel="noopener noreferrer"&gt;SerpApi Playground - SerpApi&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To get a better sense of how to write a script that performs image searches, check out these other blog posts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/how-to-scrape-google-lens-results/" rel="noopener noreferrer"&gt;How to Scrape Google Lens Results&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/web-scraping-google-lens-results-with-nodejs/" rel="noopener noreferrer"&gt;Web scraping Google Lens Results with Nodejs&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/scrape-google-lens/" rel="noopener noreferrer"&gt;Scrape Google Lens with Python&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/get-image-sources-through-google-lens/" rel="noopener noreferrer"&gt;Get Image Sources through Google Lens&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Ready to Try It?&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;The Bulk Image Search tool solves all of these problems for you. It handles authentication with Imgur, uploading images to Imgur, batching requests to SerpApi, and returning results in more useful, spreadsheet-ready formats (CSV and XLSX).&lt;/p&gt;

&lt;p&gt;Visit &lt;a href="https://bulkimagesearch.com/" rel="noopener noreferrer"&gt;https://bulkimagesearch.com/&lt;/a&gt; to start searching your images in bulk.&lt;/p&gt;

&lt;h2&gt;
  
  
  Frequently Asked Questions
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Is my API key safe?
&lt;/h3&gt;

&lt;p&gt;Yes. Your API key is encrypted and never stored in plaintext, and it is only used to perform searches on your behalf. API keys are not persisted in a database or on a server.&lt;/p&gt;

&lt;h3&gt;
  
  
  How do I handle "suspicious download" warnings?
&lt;/h3&gt;

&lt;p&gt;Chrome may flag the ZIP file as suspicious when downloading bulk search results. You can safely proceed by clicking "download suspicious file" in Chrome's download bar.&lt;/p&gt;

&lt;h3&gt;
  
  
  I have a question or issue not answered here
&lt;/h3&gt;

&lt;p&gt;For any additional questions or concerns, please get in touch with me at &lt;a href="mailto:nathan@serpapi.com"&gt;nathan@serpapi.com&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>webscraping</category>
      <category>googlelens</category>
      <category>serpapi</category>
    </item>
    <item>
      <title>Search Engine Scraping: A Comprehensive Look at Global and Niche Engines</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Fri, 18 Oct 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/search-engine-scraping-a-comprehensive-look-at-global-and-niche-engines-46ih</link>
      <guid>https://dev.to/nate_serpapi/search-engine-scraping-a-comprehensive-look-at-global-and-niche-engines-46ih</guid>
      <description>&lt;p&gt;While Google dominates the conversation, a diverse ecosystem of search engines exists, each with unique strengths and specialties. Whether you're an SEO professional devising strategies, a data scientist training AI models, or a market researcher monitoring trends, utilizing data from various search engines is crucial for a holistic view.&lt;/p&gt;

&lt;p&gt;In this article, we'll explore various search engines, highlighting standout features and helping you understand when and how to leverage each for your web scraping needs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Global Search Leaders
&lt;/h2&gt;

&lt;p&gt;These global search leaders have developed sophisticated algorithms, extensive indexing capabilities, and a wide array of features that cater to diverse user needs. While each has unique strengths, they all play a crucial role in organizing the vast expanse of online content.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.google.com/" rel="noopener noreferrer"&gt;Google&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Google remains the undisputed leader in the search engine world. Its industry-leading algorithm delivers precise results with numerous features, such as Knowledge Graphs, Featured Snippets, and AI-powered overviews.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Google:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Access the world's most extensive index of web pages&lt;/li&gt;
&lt;li&gt;  Utilize advanced features like Knowledge Graphs and Featured Snippets for structured data extraction&lt;/li&gt;
&lt;li&gt;  Leverage Google's AI-powered overviews for summarized information on various topics&lt;/li&gt;
&lt;li&gt;  Extract data from specialized services like Google Maps, Images, Shopping, and News for focused analysis&lt;/li&gt;
&lt;li&gt;  Analyze localized search results to understand regional habits and preferences&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.bing.com/" rel="noopener noreferrer"&gt;Bing&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Microsoft's Bing has established itself as a significant player in the search engine market. While it may not have Google's market share, Bing offers unique features and advantages that make it valuable for web scraping and data analysis.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Bing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Explore a less competitive SEO environment, potentially uncovering opportunities missed on Google&lt;/li&gt;
&lt;li&gt;  Analyze different demographics, as Bing tends to reach user groups that differ from Google's primary audience&lt;/li&gt;
&lt;li&gt;  Examine how &lt;a href="https://www.searchenginejournal.com/seo-bing-vs-google/223363/#:~:text=relevant%20anchor%20text.-,3.%20Social%20Signals,-Google%20has%20long" rel="noopener noreferrer"&gt;social signals&lt;/a&gt; impact search results, as Bing emphasizes data from platforms like Facebook and X&lt;/li&gt;
&lt;li&gt;  Investigate local search trends, as Bing often provides strong visibility for local businesses&lt;/li&gt;
&lt;li&gt;  Compare ad performance and costs, taking advantage of Bing's typically lower Cost-Per-Click (CPC)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://duckduckgo.com/" rel="noopener noreferrer"&gt;DuckDuckGo&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;DuckDuckGo has carved out a unique niche in the search engine market by prioritizing user privacy. Unlike many competitors, DuckDuckGo doesn't track user searches or create profiles, making it an attractive option for privacy-conscious users.&lt;/p&gt;

&lt;p&gt;Use cases for scraping DuckDuckGo:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Due to its privacy-focused approach, DuckDuckGo doesn't personalize search results. This means that scraped data is more likely to be consistent across different users and locations, providing a more uniform dataset for analysis.&lt;/li&gt;
&lt;li&gt;  With fewer ads and a cleaner interface, DuckDuckGo's search results pages are often easier to parse and scrape than those of more complex search engines.&lt;/li&gt;
&lt;li&gt;  DuckDuckGo aggregates results from various sources, including its web crawler (DuckDuckBot), providing a broader range of data points for comprehensive analysis.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.yahoo.com/" rel="noopener noreferrer"&gt;Yahoo&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;As one of the pioneers of the internet era, Yahoo has evolved considerably since its founding in 1994. While no longer the dominant force it once was, Yahoo maintains a significant user base and offers unique features that make it valuable for web scraping.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Yahoo:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Yahoo excels in aggregating news, finance, and sports content, providing a rich data source for researchers or businesses focused on these areas.&lt;/li&gt;
&lt;li&gt;  Yahoo Finance remains one of the most comprehensive sources of financial data, making it a practical scraping target for financial analysis and stock market research.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Regional Search Powerhouses
&lt;/h2&gt;

&lt;p&gt;While global search engines dominate much of the online landscape, regional search engines play a central role in many parts of the world. These engines are often better tailored to local languages, cultures, and user preferences, making them essential for businesses and researchers focused on specific markets.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.naver.com/" rel="noopener noreferrer"&gt;Naver (Korea)&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Naver leads the South Korean market. It's known for its comprehensive approach to organizing information, functioning more as a portal than a plain search engine, with services like its "Knowledge iN" Q&amp;amp;A platform (&lt;a href="https://kin.naver.com/" rel="noopener noreferrer"&gt;https://kin.naver.com/&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;Use cases for scraping Naver:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Access to user-generated content through Naver's knowledge-sharing platform&lt;/li&gt;
&lt;li&gt;  Real-time trending topics specific to the Korean market&lt;/li&gt;
&lt;li&gt;  Insights into Korean consumer behavior and preferences&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://yandex.com/" rel="noopener noreferrer"&gt;Yandex (Russia)&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Yandex is a prominent search engine in Russia and several other Russian-speaking countries. It is recognized for its advanced natural language processing capabilities, especially for Cyrillic languages.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Yandex:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Comprehensive coverage of Russian-language content&lt;/li&gt;
&lt;li&gt;  Access to Yandex's suite of services, including maps, news, and marketplace data&lt;/li&gt;
&lt;li&gt;  Insights into Eastern European market trends and consumer behavior&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.baidu.com/" rel="noopener noreferrer"&gt;Baidu (China)&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Baidu is the dominant search engine in China. It's tailored specifically to Chinese-language search and China's unique digital ecosystem.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Baidu:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Access to the vast Chinese Internet market&lt;/li&gt;
&lt;li&gt;  Insights into Chinese mobile search trends and app usage&lt;/li&gt;
&lt;li&gt;  Data on China-specific services and platforms not commonly used in other markets&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  E-commerce Search Engines
&lt;/h2&gt;

&lt;p&gt;E-commerce search engines play a vital role in the online shopping ecosystem, providing insights into consumer behavior and product trends. These platforms offer a wealth of data on pricing strategies, product popularity, and customer preferences. Scraping e-commerce search engines can give businesses and researchers actionable insights for inventory management, competitive analysis, and marketing strategies.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.amazon.com/" rel="noopener noreferrer"&gt;Amazon&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;While primarily known as an e-commerce platform, Amazon also functions as a powerful product search engine. Amazon has become the go-to platform for millions of shoppers, making its search data indispensable for market research and e-commerce strategy.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Amazon:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Track product rankings and bestseller lists across various categories&lt;/li&gt;
&lt;li&gt;  Monitor pricing trends and competitive pricing strategies&lt;/li&gt;
&lt;li&gt;  Analyze customer reviews and ratings for sentiment analysis and product improvement insights&lt;/li&gt;
&lt;li&gt;  Study product descriptions and features to optimize listings&lt;/li&gt;
&lt;li&gt;  Investigate sponsored product placements and advertising strategies&lt;/li&gt;
&lt;li&gt;  Analyze seasonal trends and promotional impacts on product visibility and sales&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.ebay.com/" rel="noopener noreferrer"&gt;eBay&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;eBay's search engine specializes in auction-style and fixed-price marketplace listings. It provides real-time data on pricing and availability, making it a handy tool for market research and competitive analysis.&lt;/p&gt;

&lt;p&gt;Use cases for scraping eBay:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Track product trends&lt;/li&gt;
&lt;li&gt;  Monitor competitive pricing strategies&lt;/li&gt;
&lt;li&gt;  Analyze seasonal demand&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.walmart.com/" rel="noopener noreferrer"&gt;Walmart&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Walmart's search engine focuses on retail products, with a strong emphasis on in-store availability across its vast inventory.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Walmart:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Monitor product availability across Walmart's ecosystem&lt;/li&gt;
&lt;li&gt;  Analyze patterns in consumer preferences&lt;/li&gt;
&lt;li&gt;  Track pricing for both online and in-store offerings&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.homedepot.com/" rel="noopener noreferrer"&gt;The Home Depot&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Specializing in home improvement and construction products, The Home Depot's search engine offers detailed product information, including specifications, pricing, availability, and customer reviews.&lt;/p&gt;

&lt;p&gt;Use cases for scraping The Home Depot:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Gather data on home improvement trends&lt;/li&gt;
&lt;li&gt;  Compare product specifications and reviews across brands&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.apple.com/app-store/" rel="noopener noreferrer"&gt;Apple App Store&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;As the exclusive source for iOS apps, the Apple App Store's search engine provides crucial insights into the mobile app ecosystem.&lt;/p&gt;

&lt;p&gt;Use cases for scraping the Apple App Store:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Track app rankings, ratings, and user reviews&lt;/li&gt;
&lt;li&gt;  Monitor shifts in app categories and features&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://play.google.com/" rel="noopener noreferrer"&gt;Google Play Store&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;The primary source for Android apps, Google Play Store's search engine offers extensive categorization and recommendation features.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Google Play Store:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Monitor app trends and performance in the Android market&lt;/li&gt;
&lt;li&gt;  Access to the Play Store's extensive collection of Books, Games, Movies, and Apps&lt;/li&gt;
&lt;li&gt;  Collect and review user sentiment through reviews&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Video Search Engines
&lt;/h2&gt;

&lt;p&gt;As video content continues to dominate the internet, video search engines have become valuable sources of information on trends, user preferences, and content creation strategies. These platforms offer unique insights into viewer behavior, content popularity, and emerging topics across various demographics. Scraping data from video search engines can provide useful information for content creators, marketers, and researchers seeking to understand and leverage video content effectively.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/" rel="noopener noreferrer"&gt;Youtube&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;YouTube, owned by Google, is the world's largest video-sharing platform. As a search engine for video content, it offers extensive data on viewer preferences, content trends, and creator performance.&lt;/p&gt;

&lt;p&gt;Use cases for scraping YouTube:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Analyze video metadata (titles, descriptions, tags) to understand SEO strategies in video content&lt;/li&gt;
&lt;li&gt;  Track trending videos and topics across different regions and categories&lt;/li&gt;
&lt;li&gt;  Study comment sentiment and engagement patterns for various types of content&lt;/li&gt;
&lt;li&gt;  Monitor channel growth and subscriber acquisition strategies&lt;/li&gt;
&lt;li&gt;  Track advertising trends and monetization strategies across different content niches&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.tiktok.com/" rel="noopener noreferrer"&gt;TikTok&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;TikTok has rapidly become one of the world's most popular social media platforms, known for its short-form video content. Its unique algorithm and content discovery features make it a rich source of data on trends, user behavior, and viral content.&lt;/p&gt;

&lt;p&gt;Use cases for scraping TikTok:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Track trending hashtags, sounds, and challenges to identify emerging cultural phenomena&lt;/li&gt;
&lt;li&gt;  Analyze user engagement patterns across different content types and creator categories&lt;/li&gt;
&lt;li&gt;  Monitor the performance of branded content and influencer marketing campaigns&lt;/li&gt;
&lt;li&gt;  Track the spread and evolution of memes and viral content&lt;/li&gt;
&lt;li&gt;  Analyze the effectiveness of TikTok's recommendation algorithm in content discovery&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Travel Search Engines
&lt;/h2&gt;

&lt;p&gt;While general search engines like Google provide extensive travel information, specialized travel search platforms offer unique insights into travel patterns, accommodation preferences, and user behavior in the tourism industry.&lt;/p&gt;

&lt;p&gt;These platforms often have more detailed and up-to-date information about specific travel-related topics, making them valuable sources for data scraping for travel and hospitality industries.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.yelp.com/" rel="noopener noreferrer"&gt;Yelp&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Yelp is a popular crowd-sourced review platform covering many businesses, from restaurants to hotels and local services. While not exclusively a travel platform, Yelp's extensive database of local business information makes it a valuable resource for travel-related data.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Yelp:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Analyze review trends and sentiment for restaurants, hotels, and attractions in specific locations&lt;/li&gt;
&lt;li&gt;  Track the popularity of different cuisines or business types across various cities or neighborhoods&lt;/li&gt;
&lt;li&gt;  Analyze user check-ins to understand peak business hours and seasonal patterns&lt;/li&gt;
&lt;li&gt;  Investigate the impact of business owner responses on user ratings and sentiment&lt;/li&gt;
&lt;li&gt;  Track promotional offers and their impact on customer engagement and reviews&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.airbnb.com/" rel="noopener noreferrer"&gt;Airbnb&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Airbnb transformed the lodging industry by allowing individuals to rent out their spaces to travelers. It provides a search engine for lodging and a wealth of data on travel preferences, pricing shifts, and property management.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Airbnb:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Analyze pricing strategies across different locations and seasons&lt;/li&gt;
&lt;li&gt;  Track the popularity of various amenities and their impact on booking rates&lt;/li&gt;
&lt;li&gt;  Study user reviews to gauge traveler satisfaction and preferences&lt;/li&gt;
&lt;li&gt;  Monitor the growth of short-term rentals in specific geographic areas&lt;/li&gt;
&lt;li&gt;  Investigate the impact of local events on accommodation demand and pricing&lt;/li&gt;
&lt;li&gt;  Analyze host performance metrics and their correlation with booking success&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.tripadvisor.com/" rel="noopener noreferrer"&gt;Tripadvisor&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Offering user-generated content about accommodations, restaurants, and attractions worldwide, Tripadvisor has become one of the world's largest travel platforms. Its search engine provides comprehensive data on traveler opinions and behaviors.&lt;/p&gt;

&lt;p&gt;Use cases for scraping Tripadvisor:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Gather and analyze user reviews for sentiment analysis on hotels, restaurants, and attractions&lt;/li&gt;
&lt;li&gt;  Track ranking changes of businesses within specific categories and locations&lt;/li&gt;
&lt;li&gt;  Analyze travel trends and destination popularity over time&lt;/li&gt;
&lt;li&gt;  Monitor pricing changes for hotels and compare them across different booking platforms&lt;/li&gt;
&lt;li&gt;  Analyze user-generated photos to understand what aspects of a business travelers find most noteworthy&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How SerpApi Enhances Your Search Data Collection
&lt;/h2&gt;

&lt;p&gt;SerpApi's mission is to scrape publicly available data from various search engines and provide logically structured data. SerpApi supports multiple search engines and offers a range of benefits that can significantly enhance your data collection and analysis capabilities:&lt;/p&gt;

&lt;p&gt;Efficiency:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Comprehensive data collection, accessible from a single API provider&lt;/li&gt;
&lt;li&gt;  Time-saving by eliminating the need for separate scrapers&lt;/li&gt;
&lt;li&gt;  Scalability to handle high-volume requests across multiple engines&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Data Quality:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Regular updates to adapt to search engine changes&lt;/li&gt;
&lt;li&gt;  Customized scraping for specific SERP features&lt;/li&gt;
&lt;li&gt;  Standardized data format for easy integration and analysis&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Versatility:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Access to market-specific insights from international search engines&lt;/li&gt;
&lt;li&gt;  Compare results across different search platforms&lt;/li&gt;
&lt;li&gt;  Support for various use cases from SEO to market research&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Getting Started with SerpApi
&lt;/h3&gt;

&lt;p&gt;To get hands-on with SerpApi, head to the &lt;a href="https://serpapi.com/playground" rel="noopener noreferrer"&gt;interactive playground&lt;/a&gt; and test the various search engine offerings.&lt;/p&gt;

&lt;p&gt;You can find the documentation for all of our APIs &lt;a href="https://serpapi.com/search-api" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Sign up for a free account to receive 100 successful searches a month:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/users/sign_up" rel="noopener noreferrer"&gt;https://serpapi.com/users/sign_up&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
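
&lt;p&gt;Once you have an API key, each request is a simple GET against SerpApi's &lt;code&gt;search.json&lt;/code&gt; endpoint, with an &lt;code&gt;engine&lt;/code&gt; parameter selecting the search engine. Here's a minimal Python sketch using only the standard library; &lt;code&gt;YOUR_API_KEY&lt;/code&gt; is a placeholder, and the commented-out fetch requires a valid key:&lt;/p&gt;

```python
from urllib.parse import urlencode

SERPAPI_ENDPOINT = "https://serpapi.com/search.json"

def build_serpapi_url(engine: str, query: str, api_key: str, **extra) -> str:
    """Assemble a SerpApi GET request URL for the given engine and query."""
    params = {"engine": engine, "q": query, "api_key": api_key, **extra}
    return f"{SERPAPI_ENDPOINT}?{urlencode(params)}"

# The same query against two engines: only the `engine` parameter changes.
google_url = build_serpapi_url("google", "coffee", "YOUR_API_KEY")
bing_url = build_serpapi_url("bing", "coffee", "YOUR_API_KEY")
print(google_url)

# Fetching the structured JSON response (requires a valid API key):
# import json, urllib.request
# with urllib.request.urlopen(google_url) as resp:
#     results = json.load(resp)
```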


&lt;p&gt;While most of the engines included in this article are already available on SerpApi, we are actively working on additional search engines. You can follow the progress of these new APIs on our public roadmap for future updates:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://github.com/serpapi/public-roadmap/issues/486" rel="noopener noreferrer"&gt;TikTok&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://github.com/serpapi/public-roadmap/issues/811" rel="noopener noreferrer"&gt;Amazon&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://github.com/serpapi/public-roadmap/issues/1071" rel="noopener noreferrer"&gt;Tripadvisor&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If there is a search engine you would like to see added to SerpApi, feel free to open a feature request on our public roadmap:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/serpapi/public-roadmap" rel="noopener noreferrer"&gt;https://github.com/serpapi/public-roadmap&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Relying on a single search engine for data collection can severely limit your insights and opportunities. By leveraging SerpApi's multi-engine support or your own custom web scraper, you can access valuable search data from various sources, each offering unique strengths and market focus.&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://serpapi.com/blog/how-to-build-a-web-scraper-a-beginners-guide/" rel="noopener noreferrer"&gt;How to build a web scraper: A beginner's guide&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://serpapi.com/blog/web-scraping-in-javascript-complete-tutorial-for-beginner/" rel="noopener noreferrer"&gt;Web Scraping with Javascript and Nodejs (2024 Guide)&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://serpapi.com/blog/how-to-extract-bing-images-data-with-serpapi-and-python/" rel="noopener noreferrer"&gt;How to Extract Bing Images Data with SerpApi and Python&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
    </item>
    <item>
      <title>What is a SERP?</title>
      <dc:creator>Nathan Skiles</dc:creator>
      <pubDate>Fri, 20 Sep 2024 16:37:53 +0000</pubDate>
      <link>https://dev.to/nate_serpapi/what-is-a-serp-g86</link>
      <guid>https://dev.to/nate_serpapi/what-is-a-serp-g86</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Search engines have become the primary gateway to information on the internet. When you type a query into Google, Bing, or any other search engine, the page displaying the results is known as a Search Engine Results Page, or SERP. Understanding SERPs and their components is crucial for anyone involved in digital marketing, SEO (Search Engine Optimization), or online business. This post will explore SERPs, their importance, and how SerpApi can help you efficiently collect SERP data.&lt;/p&gt;

&lt;h2&gt;
  
  
  Definition of SERP (Search Engine Results Page)
&lt;/h2&gt;

&lt;p&gt;A Search Engine Results Page (SERP) is a web page that appears in response to a search engine query. It typically contains a list of web pages, images, videos, and other types of content that the search engine deems relevant to the user's query.&lt;/p&gt;

&lt;p&gt;SERPs are dynamic and can vary based on numerous factors, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  The search query&lt;/li&gt;
&lt;li&gt;  The user's location&lt;/li&gt;
&lt;li&gt;  Search history and preferences&lt;/li&gt;
&lt;li&gt;  Device (desktop, mobile, etc.)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;How SERPs Vary by Search Engine&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;While &lt;a href="https://serpapi.com/search-api" rel="noopener noreferrer"&gt;Google&lt;/a&gt; dominates the search engine market, it's important to note that SERPs can vary significantly between different search engines like &lt;a href="https://serpapi.com/bing-search-api" rel="noopener noreferrer"&gt;Bing&lt;/a&gt;, &lt;a href="https://serpapi.com/yahoo-search-api" rel="noopener noreferrer"&gt;Yahoo&lt;/a&gt;, or &lt;a href="https://serpapi.com/duckduckgo-search-api" rel="noopener noreferrer"&gt;DuckDuckGo&lt;/a&gt;. These differences can include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Layout and design of the SERP&lt;/li&gt;
&lt;li&gt;  Number and placement of ads&lt;/li&gt;
&lt;li&gt;  Types of featured snippets or knowledge panels&lt;/li&gt;
&lt;li&gt;  Emphasis on certain types of content (e.g., images, videos, news)&lt;/li&gt;
&lt;li&gt;  Local search features&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;While these are generally the most common search engines worldwide, search engines like &lt;a href="https://serpapi.com/baidu-search-api" rel="noopener noreferrer"&gt;Baidu&lt;/a&gt; in China and &lt;a href="https://serpapi.com/yandex-search-api" rel="noopener noreferrer"&gt;Yandex&lt;/a&gt; in Russia also provide insight into their respective markets.&lt;/p&gt;

&lt;h2&gt;
  
  
  Anatomy of a SERP
&lt;/h2&gt;

&lt;p&gt;To fully understand SERPs, being familiar with their various components is essential.&lt;/p&gt;

&lt;h3&gt;
  
  
  Organic Results
&lt;/h3&gt;

&lt;p&gt;These are the non-paid listings ranked by the search engine's algorithm. Users typically trust them the most, and they are the primary focus of SEO efforts.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflf3mlqtpeqd7k7n5p9i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflf3mlqtpeqd7k7n5p9i.png" alt="Organic results" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/organic-results" rel="noopener noreferrer"&gt;SerpApi Google Organic Results Documentation&lt;/a&gt;&lt;/p&gt;
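
&lt;p&gt;As a quick sketch of what working with organic results looks like: SerpApi returns them as an &lt;code&gt;organic_results&lt;/code&gt; array of objects with fields such as &lt;code&gt;position&lt;/code&gt;, &lt;code&gt;title&lt;/code&gt;, &lt;code&gt;link&lt;/code&gt;, and &lt;code&gt;snippet&lt;/code&gt;. The &lt;code&gt;sample_response&lt;/code&gt; below is an illustrative stand-in for a real API payload:&lt;/p&gt;

```python
# Illustrative payload shaped like SerpApi's documented organic results;
# a real payload comes from the search.json endpoint.
sample_response = {
    "organic_results": [
        {"position": 1, "title": "Example Domain",
         "link": "https://example.com", "snippet": "Illustrative entry."},
        {"position": 2, "title": "Another Result",
         "link": "https://example.org", "snippet": "Second entry."},
    ]
}

def summarize_organic(response: dict) -> list:
    """Keep only position, title, and link for each organic result."""
    return [
        {"position": r["position"], "title": r["title"], "link": r["link"]}
        for r in response.get("organic_results", [])
    ]

for row in summarize_organic(sample_response):
    print(row["position"], row["title"], "->", row["link"])
```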

&lt;h3&gt;
  
  
  Paid Advertisements
&lt;/h3&gt;

&lt;p&gt;Also known as pay-per-click (PPC) ads, these are sponsored listings that appear at the top or bottom of the SERP. They're labeled as ads, and their placement is determined by advertisers bidding on specific keywords.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43ddaraiksg41jdnej3h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43ddaraiksg41jdnej3h.png" alt="Paid advertisements" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/google-ads" rel="noopener noreferrer"&gt;SerpApi Google Ad Results Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Featured Snippets
&lt;/h3&gt;

&lt;p&gt;These selected search results appear in a box at the top of the SERP, providing a quick answer to the user's query. They often include a summary of the answer, the source website, and a link to the full page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fntkgdjmjlsdls5qgdk35.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fntkgdjmjlsdls5qgdk35.png" alt="Featured snippets" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/direct-answer-box-api" rel="noopener noreferrer"&gt;SerpApi Google Answer Box Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Knowledge Panels
&lt;/h3&gt;

&lt;p&gt;For some queries, knowledge panels (powered by Google's Knowledge Graph) appear in a box on the right side of the SERP and provide quick facts about people, places, organizations, or concepts.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nm3uhqw2je934k90pbg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nm3uhqw2je934k90pbg.png" alt="Knowledge panels" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/knowledge-graph" rel="noopener noreferrer"&gt;SerpApi Google Knowledge Graph Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Local Pack Results
&lt;/h3&gt;

&lt;p&gt;For queries with local intent, search engines often display a map with nearby businesses or locations relevant to the search.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz9dquqfilrin9m2km1vj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz9dquqfilrin9m2km1vj.png" alt="Local pack results" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/local-pack" rel="noopener noreferrer"&gt;SerpApi Google Local Pack Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  AI Overview
&lt;/h3&gt;

&lt;p&gt;AI Overviews are a component currently specific to Google. However, other search engines, such as Bing, are incorporating AI answers into their search engine—in Bing's case, through their Copilot offering.&lt;/p&gt;

&lt;p&gt;Similar to featured snippets, AI overviews typically respond to a question along with references and important information. However, they can be challenging to work with due to the inconsistent manner in which Google currently presents them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F61v8ci5hcljpvd8kvr9t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F61v8ci5hcljpvd8kvr9t.png" alt="AI Overview" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/ai-overview" rel="noopener noreferrer"&gt;SerpApi Google AI Overview Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  More Components
&lt;/h3&gt;

&lt;p&gt;SerpApi provides a comprehensive list of various search engine's SERP components in their &lt;a href="https://serpapi.com/search-api" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  SERP Data Uses
&lt;/h2&gt;

&lt;p&gt;There are numerous applications for SERP data, ranging from SEO to academic research:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Competitor Analysis: Monitor competitors' keyword rankings and analyze their strategies for gaining visibility.&lt;/li&gt;
&lt;li&gt;  Ad Performance: Analyze the paid results in SERPs to identify trends in bidding strategies and ad performance, optimizing pay-per-click (PPC) campaigns.&lt;/li&gt;
&lt;li&gt;  Market Research and Trends: Tracking shifts in SERP rankings can provide valuable insights into emerging trends and changes in consumer behavior.&lt;/li&gt;
&lt;li&gt;  Algorithm Impact on Content Discoverability: Study the impact of search engine algorithms on the visibility of academic content, research articles, and scholarly publications.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Challenges in SERP Data Collection
&lt;/h2&gt;

&lt;p&gt;While SERP data is valuable for digital marketing and SEO, collecting this data can be challenging:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  SERPs change frequently based on various factors, making it challenging to capture consistent data.&lt;/li&gt;
&lt;li&gt;  Search results can vary based on user location, search history, and preferences.&lt;/li&gt;
&lt;li&gt;  Search engines implement measures to prevent automated data collection, which can hamper large-scale SERP analysis. While it is possible to build an in-house solution, it's often a time-sink for engineering teams as frequent maintenance and upkeep are needed.&lt;/li&gt;
&lt;li&gt;  Manually collecting SERP data is time-consuming and often impractical for large-scale analysis.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Introduction to SerpApi
&lt;/h2&gt;

&lt;p&gt;SerpApi is a solution designed to address the challenges of SERP data collection. As a leading search engine results provider, SerpApi offers a range of services to help businesses, marketers, and researchers efficiently gather and analyze SERP information:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Up-to-date search results as they appear to users.&lt;/li&gt;
&lt;li&gt;  Collect data from Google, Bing, Baidu, and more.&lt;/li&gt;
&lt;li&gt;  Tailor searches based on location, device, language, and other factors.&lt;/li&gt;
&lt;li&gt;  Structured data output (JSON, CSV).&lt;/li&gt;
&lt;li&gt;  Handle large volumes of requests with high uptime and fast response times.&lt;/li&gt;
&lt;li&gt;  Easy to integrate with existing data streams.&lt;/li&gt;
&lt;/ul&gt;
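
&lt;p&gt;For example, tailoring a search by location, device, and language comes down to request parameters. The sketch below builds two variants of the same Google query using SerpApi's documented &lt;code&gt;location&lt;/code&gt;, &lt;code&gt;device&lt;/code&gt;, &lt;code&gt;hl&lt;/code&gt; (language), and &lt;code&gt;gl&lt;/code&gt; (country) parameters; &lt;code&gt;YOUR_API_KEY&lt;/code&gt; is a placeholder:&lt;/p&gt;

```python
from urllib.parse import urlencode

# One base query, two localizations: the same search as seen by a mobile
# user in New York versus a desktop user in Berlin.
base = {"engine": "google", "q": "pizza", "api_key": "YOUR_API_KEY"}
ny_mobile = {**base, "location": "New York, New York, United States",
             "device": "mobile", "hl": "en", "gl": "us"}
berlin_desktop = {**base, "location": "Berlin, Germany",
                  "device": "desktop", "hl": "de", "gl": "de"}

for params in (ny_mobile, berlin_desktop):
    print("https://serpapi.com/search.json?" + urlencode(params))
```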

&lt;h2&gt;
  
  
  Use Cases for SerpApi
&lt;/h2&gt;

&lt;p&gt;SerpApi's versatility makes it useful for a wide range of applications:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Digital marketing agencies: Track campaign performance and competitor rankings.&lt;/li&gt;
&lt;li&gt;  E-commerce businesses: Monitor product listings and pricing across search engines.&lt;/li&gt;
&lt;li&gt;  SEO professionals: Track keyword rankings and analyze organic search results.&lt;/li&gt;
&lt;li&gt;  Academic research: Analyze search trends and user behavior patterns.&lt;/li&gt;
&lt;li&gt;  AI models: Build data sets of publicly available images, text results, and scientific articles.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;As search engines continue to evolve, the importance of SERP analysis in digital strategy will only grow. SerpApi provides a powerful solution to the challenges of SERP data collection, offering businesses and researchers the tools needed to stay ahead in the competitive online environment.&lt;/p&gt;

&lt;p&gt;By leveraging SerpApi's capabilities, organizations can gain valuable insights, inform their SEO strategies, and make data-driven decisions to improve their online visibility and performance. As we look to the future, efficiently collecting and analyzing SERP data will become increasingly necessary for businesses.&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://serpapi.com/blog/scraping-public-pages-legality/" rel="noopener noreferrer"&gt;Scraping public pages is legal in the US (2024)&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://serpapi.com/blog/monitoring-competitors-google-ads/" rel="noopener noreferrer"&gt;Monitoring Competitors' Google Ads&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://serpapi.com/blog/how-to-scrape-google-search-results-with-python/" rel="noopener noreferrer"&gt;How to scrape Google search results with Python&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://serpapi.com/blog/analyzing-the-distribution-of-problems-in-competitions-reviews/" rel="noopener noreferrer"&gt;Analyzing the Distribution of Problems in Competition Reviews&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
    </item>
  </channel>
</rss>
