<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Emirhan Akdeniz</title>
    <description>The latest articles on DEV Community by Emirhan Akdeniz (@kagermanov27).</description>
    <link>https://dev.to/kagermanov27</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F813326%2F405de2d9-c60f-4c83-994b-08c57ad7364b.jpeg</url>
      <title>DEV Community: Emirhan Akdeniz</title>
      <link>https://dev.to/kagermanov27</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/kagermanov27"/>
    <language>en</language>
    <item>
      <title>Introducing SerpApi's MCP Server</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Fri, 05 Dec 2025 17:26:27 +0000</pubDate>
      <link>https://dev.to/serpapi/introducing-serpapis-mcp-server-2i61</link>
      <guid>https://dev.to/serpapi/introducing-serpapis-mcp-server-2i61</guid>
      <description>&lt;p&gt;As the AI Agents ecosystem and usage grow, we at SerpApi are looking to support that effort and simplify how AI developers can get fast and reliable access to the latest Google and other Search web data sources. Today, we are releasing an open-source Model Context Protocol (MCP) server that exposes our web search API to AI agents and developers through a familiar and common interface.&lt;/p&gt;

&lt;p&gt;This server lets any MCP-compatible client call SerpApi’s search tools without writing custom code or dealing with complicated SDKs. The agent can ask the server what tools it offers and then invoke them as needed to achieve the desired outcome. SerpApi’s MCP server includes a unified search tool, so LLM (large language model) agents can retrieve real-time search results through simple function calls. The search tool is flexible and supports all schemas and parameters supported by our REST API.&lt;/p&gt;
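&lt;p&gt;To make the function-call flow concrete, here is a hypothetical &lt;code&gt;tools/call&lt;/code&gt; request for the unified search tool, sketched in Python. The argument names ("engine", "q") mirror SerpApi’s REST API parameters; the exact tool schema is defined by the server itself:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json

# Hypothetical MCP "tools/call" request invoking the unified search tool.
# The argument names ("engine", "q") mirror SerpApi's REST API parameters.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {"engine": "google", "q": "latest AI news"},
    },
}

print(json.dumps(request, indent=2))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;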

&lt;p&gt;The server supports multiple engines (Google, Bing, Yahoo, DuckDuckGo, Yandex, Baidu, YouTube, eBay, Walmart, and more) and specialised queries such as weather or stock lookups, returning clean JSON results optimised for AI application consumption. You may &lt;a href="https://serpapi.com/users/sign_up" rel="noopener noreferrer"&gt;register for SerpApi to claim free credits&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;
  
  
  What is the Model Context Protocol (MCP)?
&lt;/h2&gt;

&lt;p&gt;The Model Context Protocol (MCP) is an open protocol created by Anthropic to let AI models operate in richer, more context-aware ways inside real-world development environments and applications. Instead of relying solely on static training data, MCP lets models access live information, such as local files in a file system, project metadata, or connected external systems, all through a unified, standardised protocol. It works over simple transports such as stdio (process pipes) and HTTP with server-sent events (SSE), making it easy to embed into many platforms. MCP is defined through a clear protocol specification built on JSON-RPC 2.0, ensuring that tools and clients can communicate reliably without vendor lock-in. As adoption grows, MCP aims to function as a truly standardised protocol that any AI system can use, whether it is ChatGPT from OpenAI, Claude from Anthropic, Gemini, or any other model, so long as it connects through a compatible endpoint. These foundations also help address security risks by enabling structured access control, sandboxing, and predictable interactions between models and tools.&lt;/p&gt;
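&lt;p&gt;As a minimal sketch of that JSON-RPC foundation, this is roughly what a tool-discovery exchange looks like. The response shape here is illustrative; real servers return richer tool descriptions with full input schemas:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# JSON-RPC 2.0 request an MCP client sends to discover a server's tools.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Illustrative response; a real server returns full inputSchema objects.
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [{"name": "search", "description": "Run a web search"}]},
}

# Responses are matched to their requests by id.
assert list_response["id"] == list_request["id"]
print([tool["name"] for tool in list_response["result"]["tools"]])
# prints ['search']
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;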

&lt;p&gt;With MCP, developers can build tools that expose datasets, APIs, services, or an entire workspace to AI assistants and chatbots. This enables models to read docs, run development tools, inspect code, or even debug problems interactively using natural language, all within strict structured messaging. Because everything follows the same protocol, tools become reusable across editors, terminals, and other apps, and models from any provider can participate in shared workflows. MCP also supports emerging agentic AI patterns, where models autonomously sequence many small tool calls to complete complex tasks, while still respecting boundaries around sensitive data and scaling as environments grow. In short, MCP turns AI from a passive text generator into an active participant that can interact with your environment in a safe, structured way, unlocking smarter automation, deeper integrations, and more capable AI-assisted workflows.&lt;/p&gt;
&lt;h3&gt;
  
  
  Key Features
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Multi-Engine Search: A single &lt;code&gt;search&lt;/code&gt; tool covers dozens of external data sources: Google (full and light), Bing, Yahoo, DuckDuckGo, Yandex, Baidu, YouTube, eBay, Walmart, and others.&lt;/li&gt;
&lt;li&gt;Structured Results: The server returns structured JSON blocks (answer boxes, knowledge graph data, calculations, and more) along with organic, news, image, and shopping results. AI agents get exactly what they need without manual parsing.&lt;/li&gt;
&lt;li&gt;Raw and Clean Output: Developers can request raw JSON or a clean, normalised version with fields like title, snippet, link, etc.&lt;/li&gt;
&lt;li&gt;Robust Reliability: Error handling and retry logic are included by default. Authentication failures, rate limits, and network timeouts return helpful messages that agents can use to respond gracefully.&lt;/li&gt;
&lt;li&gt;Open Source: We at SerpApi believe that open source is an integral part of the development of AI Tools. Anyone with some Python knowledge should be able to contribute to Artificial Intelligence tooling and general AI Systems. You may head to &lt;a href="https://github.com/serpapi/serpapi-mcp" rel="noopener noreferrer"&gt;our GitHub Repository&lt;/a&gt; to access an open-sourced version of the MCP Server.&lt;/li&gt;
&lt;/ul&gt;
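&lt;p&gt;As a minimal sketch of the raw-versus-clean idea above, normalising a raw engine result down to the clean fields might look like this in Python. The field names here are illustrative, not the server’s exact output schema:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Reduce raw engine results to the normalised fields (title, snippet, link).
# Field names are illustrative; the server defines the actual clean schema.
def normalise(raw_results):
    keys = ("title", "snippet", "link")
    return [{key: item.get(key) for key in keys} for item in raw_results]

raw = [{
    "position": 1,
    "title": "SerpApi",
    "snippet": "Real-time search API",
    "link": "https://serpapi.com",
    "displayed_link": "serpapi.com",
}]

print(normalise(raw))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;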
&lt;h3&gt;
  
  
  Getting Started
&lt;/h3&gt;

&lt;p&gt;You can start by using our remotely hosted MCP server at &lt;a href="https://mcp.serpapi.com/SERPAPI_API_KEY/mcp" rel="noopener noreferrer"&gt;https://mcp.serpapi.com/SERPAPI_API_KEY/mcp&lt;/a&gt;, replacing SERPAPI_API_KEY with your own API key. The MCP server supports requests from any origin.&lt;/p&gt;

&lt;p&gt;You can also self-host the MCP server locally to have full control of the deployment. You can start with it by cloning the repository and installing the dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone https://github.com/serpapi/serpapi-mcp.git
cd serpapi-mcp/
uv sync
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Set your SerpApi API key in a .env file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SERPAPI_API_KEY=your_key_here

Then start the server:

uv run src/server.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Configure your MCP client to connect to the server. Once connected, the agent will automatically discover SerpApi’s search tool and any additional resources the server provides.&lt;/p&gt;

&lt;p&gt;You can also use the provided Docker configuration to deploy the service as a container.&lt;/p&gt;

&lt;h3&gt;
  
  
  Interacting with the Server
&lt;/h3&gt;

&lt;p&gt;You can use the MCP Inspector to explore and interact with the MCP service. Start by installing the package:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install -g  @modelcontextprotocol/inspector
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then start the inspector:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx @modelcontextprotocol/inspector
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The inspector will become available at localhost:6274. There you can list the available tools and view their descriptions and parameters.&lt;/p&gt;

&lt;p&gt;You can also test the MCP server inside Microsoft’s VS Code. Add the MCP server config to the .vscode/mcp.json file and test it with VS Code Copilot.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "servers": {
    "serpapi-mcp": {
      "type": "http",
      "url": "https://mcp.serpapi.com/&amp;lt;API_KEY&amp;gt;/mcp"
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Accelerating AI Integration
&lt;/h3&gt;

&lt;p&gt;By adopting the MCP standard, SerpApi makes web search a native capability for AI agents. Instead of bespoke integration code with complicated connectors, agents discover and use SerpApi’s tools automatically. This reduces hallucinations and accelerates the development of agentic applications.&lt;/p&gt;

&lt;p&gt;SerpApi releases this server to give AI developers fast, reliable access to live search data. We’re excited to support the growing ecosystem of AI agents and provide the infrastructure that helps them understand the real world with open standards and with minimal permissions needed. The MCP server is open-source, and we invite developers to explore it, contribute, and integrate it into their own AI workflows.&lt;/p&gt;

&lt;p&gt;Try out the &lt;a href="https://serpapi.com/" rel="noopener noreferrer"&gt;SerpApi&lt;/a&gt; MCP server and see how it fits into your agent stack.&lt;/p&gt;

</description>
      <category>mcp</category>
      <category>ai</category>
      <category>programming</category>
      <category>modelcontextprotocol</category>
    </item>
    <item>
      <title>Spiking Neural Networks</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Wed, 12 Jun 2024 09:40:00 +0000</pubDate>
      <link>https://dev.to/serpapi/spiking-neural-networks-2k5p</link>
      <guid>https://dev.to/serpapi/spiking-neural-networks-2k5p</guid>
      <description>&lt;p&gt;In this blog post, we will discuss the differences between spiking neural networks, and non-spiking neural networks, potential use cases of these algorithms, and open source a simple example script to compare a simple SNN model to an ANN model.&lt;/p&gt;

&lt;p&gt;At the &lt;a href="https://serpapi.com/blog"&gt;SerpApi Blog&lt;/a&gt;, we discuss different topics around web scraping.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/users/sign_up"&gt;Register to Claim Free Credits&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is a spiking neural network?
&lt;/h2&gt;

&lt;p&gt;A spiking neural network (SNN) is a type of artificial neural network that more closely mimics the behavior of biological neurons. Unlike traditional artificial neural networks (ANNs), which use continuous activation functions, SNNs use discrete events called spikes. These spikes represent the times at which neurons fire and let the network process information in a way that is closer to how neurons and synapses in the brain operate. This event-driven mechanism is what can give SNNs an edge over ANNs in inference time on temporal (time-step) data.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is the difference between spiking and non-spiking neural networks?
&lt;/h2&gt;

&lt;p&gt;The primary difference between spiking and non-spiking neural networks lies in how they handle information processing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Spiking neural networks: Use spikes to represent information, with neurons firing only when their membrane potential reaches a certain threshold. These neuron models are often associated with neuromorphic computing and use learning rules such as spike-timing-dependent plasticity (STDP) instead of backpropagation.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Non-spiking neural networks (e.g. ANNs): Use continuous activation functions like ReLU or sigmoid to process information and typically use backpropagation for learning.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How Does Spiking Neural Network Work?
&lt;/h2&gt;

&lt;p&gt;SNNs work by simulating the behavior of biological neurons. When a neuron’s membrane potential exceeds a certain threshold, it generates a spike that propagates to other spiking neurons. These spikes can modify synaptic weights through learning rules such as spike-timing-dependent plasticity (STDP), enabling the network to learn from temporal patterns in the data.&lt;/p&gt;
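&lt;p&gt;The threshold-and-reset behaviour can be sketched in a few lines of plain Python. This is a toy leaky integrate-and-fire neuron (not snntorch): the membrane potential decays by a factor beta each step, integrates the input current, and emits a spike and resets once it crosses the threshold. The beta and threshold values are illustrative:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def lif_run(inputs, beta=0.9, threshold=1.0):
    # Toy leaky integrate-and-fire neuron: decay, integrate, fire, reset.
    v = 0.0
    spikes = []
    for current in inputs:
        v = beta * v + current       # leak, then integrate the input
        if v &amp;gt;= threshold:           # fire when the threshold is crossed
            spikes.append(1)
            v = 0.0                  # reset the membrane potential
        else:
            spikes.append(0)
    return spikes

print(lif_run([0.6, 0.6, 0.1, 0.9, 0.9]))
# prints [0, 1, 0, 0, 1]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;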

&lt;h2&gt;
  
  
  What are the most effective ways to use neural networks for pattern recognition?
&lt;/h2&gt;

&lt;p&gt;For pattern recognition, deep learning models like convolutional neural networks (CNNs) are highly effective. However, SNNs are gaining attention for their ability to recognize spatiotemporal patterns (patterns spanning both space and time) with high precision and low power consumption.&lt;/p&gt;

&lt;h2&gt;
  
  
  What could it mean for the future of scraping?
&lt;/h2&gt;

&lt;p&gt;In my humble opinion, SNNs could hold a place in finding patterns within changing and evolving HTML structures. Instead of classifying items and parsing them, SNNs may be useful for identifying where specific parts of the HTML are within the overall body. This could reduce human interaction and pave the way for the future of fully automated parsers with higher precision and lower inference times.&lt;/p&gt;

&lt;h2&gt;
  
  
  SNN vs ANN Comparison
&lt;/h2&gt;

&lt;p&gt;The following is a simple demonstration script comparing SNN and ANN models under the same conditions. A disclaimer: this is for demonstration purposes only, not a rigorous proof or a definitive benchmark. As I repeat time and again, I am not an expert in machine learning, just an enthusiast.&lt;/p&gt;

&lt;p&gt;Let's import the libraries. We will be using PyTorch for the framework, sklearn for simple dataset-splitting tasks, and snntorch for creating SNN models in PyTorch:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;numpy&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;torch.nn&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;torch.optim&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;optim&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;snntorch&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;snn&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;sklearn.model_selection&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;train_test_split&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's create a function that simulates motion data (a kind of temporal data):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;generate_motion_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_samples&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;num_events&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;noise_level&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;X&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_samples&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;motion_indices&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;randint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;num_events&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;event_data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zeros&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;event_data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;motion_indices&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
        &lt;span class="n"&gt;noise&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;normal&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;noise_level&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;event_data&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;noise&lt;/span&gt;

        &lt;span class="c1"&gt;# Introduce variability in the patterns
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;rand&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;event_data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;roll&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event_data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;randint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

        &lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event_data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event_data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output of this encoding is a binary sequence, where 1 represents motion and 0 represents no motion. We also add Gaussian noise (with standard deviation noise_level) to make the data more consistent with real-world measurements. Alongside the noise, we introduce random shifts in the patterns to make the task harder. The model should take all of these factors into account and predict the motion output within the series.&lt;/p&gt;

&lt;p&gt;Let's create our data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;
&lt;span class="c1"&gt;# Parameters
&lt;/span&gt;
&lt;span class="n"&gt;num_samples&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;

&lt;span class="n"&gt;event_length&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;

&lt;span class="n"&gt;num_events&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;

&lt;span class="n"&gt;noise_level&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.1&lt;/span&gt;

&lt;span class="c1"&gt;# Generate data
&lt;/span&gt;
&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;generate_motion_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_samples&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;num_events&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;noise_level&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Convert to PyTorch tensors
&lt;/span&gt;
&lt;span class="n"&gt;X&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dtype&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;float32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dtype&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;float32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Split into training and validation sets
&lt;/span&gt;
&lt;span class="n"&gt;X_train&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;train_test_split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;test_size&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;random_state&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;42&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's define an SNN model and train it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;
&lt;span class="c1"&gt;# Define SNN model
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;SpikingNN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Module&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;SpikingNN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fc1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Linear&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;lif1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;snn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Leaky&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;beta&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.9&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fc2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Linear&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;forward&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fc1&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;mem1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;spk1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;lif1&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fc2&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;spk1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;

&lt;span class="c1"&gt;# Model, loss function, and optimizer for SNN
&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;
&lt;span class="n"&gt;hidden_dim&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;64&lt;/span&gt;
&lt;span class="n"&gt;output_dim&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;  &lt;span class="c1"&gt;# Binary classification
&lt;/span&gt;
&lt;span class="n"&gt;snn_model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SpikingNN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;criterion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;BCEWithLogitsLoss&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  &lt;span class="c1"&gt;# Binary cross-entropy loss
&lt;/span&gt;&lt;span class="n"&gt;optimizer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;optim&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Adam&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parameters&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;lr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.001&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Training loop for SNN
&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;
&lt;span class="n"&gt;snn_training_start&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;epoch&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;train&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zero_grad&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;criterion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;outputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;backward&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;step&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Calculate training loss
&lt;/span&gt;    &lt;span class="n"&gt;train_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Validation
&lt;/span&gt;    &lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="n"&gt;val_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;val_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;criterion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val_outputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;val_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;val_loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;SNN Epoch &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;epoch&lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Loss: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;train_loss&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Validation Loss: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;val_loss&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;snn_training_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;snn_training_start&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Training Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_training_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's create an ANN model and train it for comparison:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;
&lt;span class="c1"&gt;# Define ANN model
&lt;/span&gt;
&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ANN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Module&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ANN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fc1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Linear&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;relu&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;ReLU&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fc2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Linear&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;forward&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fc1&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;relu&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fc2&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;

&lt;span class="c1"&gt;# Model, loss function, and optimizer for ANN
&lt;/span&gt;&lt;span class="n"&gt;ann_model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ANN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;criterion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;BCEWithLogitsLoss&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  &lt;span class="c1"&gt;# Binary cross-entropy loss
&lt;/span&gt;&lt;span class="n"&gt;optimizer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;optim&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Adam&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parameters&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;lr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.001&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Training loop for ANN
&lt;/span&gt;&lt;span class="n"&gt;ann_training_start&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;epoch&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;train&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zero_grad&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;criterion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;outputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;backward&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;step&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Calculate training loss
&lt;/span&gt;    &lt;span class="n"&gt;train_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Validation
&lt;/span&gt;    &lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="n"&gt;val_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;val_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;criterion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val_outputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;val_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;val_loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ANN Epoch &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;epoch&lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Loss: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;train_loss&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Validation Loss: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;val_loss&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;ann_training_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;ann_training_start&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Training Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_training_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's define a function to run predictions, calculate the inference time, and compare the two models:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;
&lt;span class="c1"&gt;# Function to predict and measure inference time
&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;predict_and_measure_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;new_data&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;start_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="n"&gt;new_data_tensor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;new_data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dtype&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;float32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;new_data_tensor&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;inference_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;start_time&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;outputs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;inference_time&lt;/span&gt;

&lt;span class="c1"&gt;# Generate new test data
&lt;/span&gt;&lt;span class="n"&gt;X_test&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_test&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;generate_motion_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;num_events&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;noise_level&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Predictions with SNN
&lt;/span&gt;&lt;span class="n"&gt;snn_outputs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;snn_inference_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;predict_and_measure_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;X_test&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;snn_predictions&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sigmoid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;snn_outputs&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;numpy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Predictions:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;snn_predictions&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Inference Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_inference_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Predictions with ANN
&lt;/span&gt;&lt;span class="n"&gt;ann_outputs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ann_inference_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;predict_and_measure_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;X_test&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;ann_predictions&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sigmoid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ann_outputs&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;numpy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Predictions:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ann_predictions&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Inference Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_inference_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Comparison Summary
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Comparison Summary:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Training Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_training_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Training Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_training_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Inference Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_inference_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Inference Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_inference_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Final validation accuracies (from the last epoch)
&lt;/span&gt;&lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;snn_val_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;snn_val_accuracy&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sigmoid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;snn_val_outputs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;mean&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;ann_val_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;ann_val_accuracy&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sigmoid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ann_val_outputs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;mean&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Final SNN Validation Accuracy: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_val_accuracy&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Final ANN Validation Accuracy: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_val_accuracy&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following numbers are not definitive, since the task is simple, but they give a clear sense of where an SNN can beat an ANN:&lt;/p&gt;

&lt;p&gt;Comparison Summary:&lt;/p&gt;

&lt;p&gt;SNN Training Time: 0.6785 seconds&lt;/p&gt;

&lt;p&gt;ANN Training Time: 0.3952 seconds&lt;/p&gt;

&lt;p&gt;SNN Inference Time: 0.0007 seconds&lt;/p&gt;

&lt;p&gt;ANN Inference Time: 0.0017 seconds&lt;/p&gt;

&lt;p&gt;Final SNN Validation Accuracy: 1.0000&lt;/p&gt;

&lt;p&gt;Final ANN Validation Accuracy: 1.0000&lt;/p&gt;

&lt;p&gt;Training the SNN took longer, since neither the framework nor the spiking architecture favors gradient-based training; future approaches may well optimize SNN training time. At inference time, however, the SNN was faster than the ANN. This is where the energy efficiency comes in: because the SNN is cheaper to execute at the same accuracy, it consumes less CPU power.&lt;/p&gt;
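One caveat on the measurements above: a single `time.time()` reading at sub-millisecond scales is noisy. Averaging over repeated runs with `time.perf_counter` gives a steadier comparison. A small helper sketch, independent of the article's code; any PyTorch `nn.Module` works:

```python
import time

import torch

def mean_inference_time(model, data, repeats=100):
    """Average forward-pass wall time over several repeated runs."""
    model.eval()
    with torch.no_grad():
        tensor = torch.as_tensor(data, dtype=torch.float32)
        model(tensor)  # warm-up run, excluded from timing
        start = time.perf_counter()
        for _ in range(repeats):
            model(tensor)
        return (time.perf_counter() - start) / repeats
```

For example, comparing `mean_inference_time(snn_model, X_test)` against `mean_inference_time(ann_model, X_test)` reduces the chance that a one-off scheduler hiccup decides the winner.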

&lt;h2&gt;
  
  
  Are spiking neural networks the future?
&lt;/h2&gt;

&lt;p&gt;Spiking neural networks (SNNs) are increasingly seen as a promising frontier for the future of artificial intelligence, particularly because they more closely resemble the neural computation of biological brains. Leveraging principles from neuroscience, SNNs process information through spikes, or action potentials, which supports both temporal coding and rate coding.&lt;/p&gt;
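As a concrete illustration of rate coding, a real-valued input in [0, 1] can be encoded as a spike train whose per-step firing probability equals the input, so the mean firing rate over many steps approximates the value. A minimal sketch, not part of the article's model:

```python
import torch

def rate_encode(values, num_steps=100, seed=0):
    """Encode values in [0, 1] as Poisson-like spike trains (1.0 = spike)."""
    torch.manual_seed(seed)
    values = values.clamp(0.0, 1.0)
    # Each time step fires independently with probability equal to the value
    return (torch.rand(num_steps, *values.shape) < values).float()

spikes = rate_encode(torch.tensor([0.1, 0.5, 0.9]), num_steps=1000)
print(spikes.mean(dim=0))  # mean firing rates land close to the encoded inputs
```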

&lt;p&gt;In contrast to traditional deep neural networks, which rely on gradient descent and backpropagation, SNNs can utilize spike-based learning algorithms and synaptic plasticity, making them more efficient for certain kinds of neural computation. Initializing these spiking networks involves setting up multi-layer architectures capable of handling the temporal dynamics and correlations within spike trains.&lt;/p&gt;
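Spike-timing-dependent plasticity (STDP) is one such spike-based rule: a synapse is strengthened when the presynaptic spike precedes the postsynaptic one, and weakened otherwise. A minimal pair-based sketch; the time constant and learning rates here are illustrative values, not taken from any particular model:

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: potentiate causal pairings, depress acausal ones."""
    dt = t_post - t_pre  # spike-timing difference in milliseconds
    if dt > 0:
        w += a_plus * math.exp(-dt / tau)   # pre before post: potentiation
    else:
        w -= a_minus * math.exp(dt / tau)   # post before pre: depression
    return max(0.0, min(1.0, w))  # clamp the weight to [0, 1]

print(stdp_update(0.5, t_pre=10.0, t_post=15.0))  # causal pairing: weight grows
print(stdp_update(0.5, t_pre=15.0, t_post=10.0))  # acausal pairing: weight shrinks
```

Note that the update depends only on relative spike timing, not on a global error signal, which is what makes rules like this local and hardware-friendly.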

&lt;p&gt;One of the significant advantages of SNNs is their potential for lower energy consumption, especially when implemented on neuromorphic hardware. These processors mimic the brain’s architecture and function, enabling real-time processing with minimal latency. This is particularly beneficial in applications like robotics, computer vision, and large-scale network models, where real-time and efficient computations are crucial.&lt;/p&gt;

&lt;p&gt;SNNs also offer improved interpretability compared to traditional deep-learning models. Each single neuron in an SNN can be examined for its specific role in the network, which aids in understanding how neural computations propagate through the system. Feedforward and recurrent neural networks can both be implemented within the SNN framework, providing versatility in handling different types of data and tasks.&lt;/p&gt;

&lt;p&gt;Despite these advantages, SNNs face challenges in their learning algorithms and network models. The nonlinear nature of spike-based communication and the need for precise temporal synchronization complicate the development of effective supervised learning techniques. In addition, the number of spikes and their timing (latency) play a crucial role in the biological plausibility and performance of SNNs.&lt;/p&gt;
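&lt;p&gt;The role of spike timing can be illustrated with latency coding, where stronger inputs fire earlier. The helper below is a minimal sketch using a linear mapping of our own (real encoders often use a logarithmic one): it maps an intensity in [0, 1] to the time step of its single spike.&lt;/p&gt;

```python
import numpy as np

def latency_encode(values, num_steps):
    # Latency coding: intensities near 1.0 spike at step 0,
    # intensities near 0.0 spike at the last step.
    values = np.clip(np.asarray(values, dtype=float), 0.0, 1.0)
    return np.rint((1.0 - values) * (num_steps - 1)).astype(int)
```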

&lt;p&gt;Recent advances in state-of-the-art neuromorphic processors and spiking neuron models show promise for overcoming these hurdles. As research in neuroscience and artificial intelligence continues to converge, SNNs may become more viable for practical applications, enhancing the capabilities of both AI and computational neuroscience.&lt;/p&gt;

&lt;p&gt;In summary, spiking neural networks hold significant potential for the future of AI, particularly in areas requiring efficient, real-time processing with low energy consumption. Their biologically inspired approach offers a plausible and powerful alternative to traditional deep learning, potentially revolutionizing fields such as robotics, computer vision, and beyond.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Thank you for reading. I hope this blog post helps people understand the potential of spiking neural networks and their use cases. You can find the full comparison script below:&lt;/p&gt;

&lt;p&gt;Full script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;
&lt;span class="err"&gt;!&lt;/span&gt;&lt;span class="n"&gt;pip&lt;/span&gt; &lt;span class="n"&gt;install&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt; &lt;span class="n"&gt;snntorch&lt;/span&gt; &lt;span class="n"&gt;scikit&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="n"&gt;learn&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;numpy&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;torch.nn&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;torch.optim&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;optim&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;snntorch&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;snn&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;sklearn.model_selection&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;train_test_split&lt;/span&gt;  
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;

&lt;span class="c1"&gt;# Generate synthetic event-based motion data
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;generate_motion_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_samples&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;num_events&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;noise_level&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;X&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_samples&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;motion_indices&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;randint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;num_events&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;event_data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zeros&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;event_data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;motion_indices&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
        &lt;span class="n"&gt;noise&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;normal&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;noise_level&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;event_data&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;noise&lt;/span&gt;

        &lt;span class="c1"&gt;# Introduce variability in the patterns
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;rand&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;event_data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;roll&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event_data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;randint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

        &lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event_data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event_data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Parameters
&lt;/span&gt;&lt;span class="n"&gt;num_samples&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;
&lt;span class="n"&gt;event_length&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;
&lt;span class="n"&gt;num_events&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;
&lt;span class="n"&gt;noise_level&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.1&lt;/span&gt;

&lt;span class="c1"&gt;# Generate data
&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;generate_motion_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_samples&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;num_events&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;noise_level&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Convert to PyTorch tensors
&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dtype&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;float32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dtype&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;float32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Split into training and validation sets
&lt;/span&gt;&lt;span class="n"&gt;X_train&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;train_test_split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;test_size&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;random_state&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;42&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Define SNN model
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;SpikingNN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Module&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;SpikingNN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fc1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Linear&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;lif1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;snn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Leaky&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;beta&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.9&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fc2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Linear&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;forward&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fc1&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;mem1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;spk1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;lif1&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fc2&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;spk1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;

&lt;span class="c1"&gt;# Model, loss function, and optimizer for SNN
&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;
&lt;span class="n"&gt;hidden_dim&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;64&lt;/span&gt;
&lt;span class="n"&gt;output_dim&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;  &lt;span class="c1"&gt;# Binary classification
&lt;/span&gt;
&lt;span class="n"&gt;snn_model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SpikingNN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;criterion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;BCEWithLogitsLoss&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  &lt;span class="c1"&gt;# Binary cross-entropy loss
&lt;/span&gt;&lt;span class="n"&gt;optimizer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;optim&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Adam&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parameters&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;lr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.001&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Training loop for SNN
&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;
&lt;span class="n"&gt;snn_training_start&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;epoch&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;train&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zero_grad&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;criterion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;outputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;backward&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;step&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Calculate training loss
&lt;/span&gt;    &lt;span class="n"&gt;train_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Validation
&lt;/span&gt;    &lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="n"&gt;val_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;val_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;criterion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val_outputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;val_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;val_loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;SNN Epoch &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;epoch&lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Loss: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;train_loss&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Validation Loss: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;val_loss&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;snn_training_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;snn_training_start&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Training Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_training_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Define ANN model
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ANN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Module&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ANN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fc1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Linear&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;relu&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;ReLU&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fc2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Linear&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;forward&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fc1&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;relu&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fc2&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;

&lt;span class="c1"&gt;# Model, loss function, and optimizer for ANN
&lt;/span&gt;&lt;span class="n"&gt;ann_model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ANN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hidden_dim&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;output_dim&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;criterion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;BCEWithLogitsLoss&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  &lt;span class="c1"&gt;# Binary cross-entropy loss
&lt;/span&gt;&lt;span class="n"&gt;optimizer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;optim&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Adam&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parameters&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;lr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.001&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Training loop for ANN
&lt;/span&gt;&lt;span class="n"&gt;ann_training_start&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;epoch&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;train&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zero_grad&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;criterion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;outputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;backward&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;step&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Calculate training loss
&lt;/span&gt;    &lt;span class="n"&gt;train_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Validation
&lt;/span&gt;    &lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="n"&gt;val_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;val_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;criterion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val_outputs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;val_loss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;val_loss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ANN Epoch &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;epoch&lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;num_epochs&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Loss: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;train_loss&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Validation Loss: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;val_loss&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;ann_training_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;ann_training_start&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Training Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_training_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Function to predict and measure inference time
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;predict_and_measure_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;new_data&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;start_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="n"&gt;new_data_tensor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;new_data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dtype&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;float32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;new_data_tensor&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;inference_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;time&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;start_time&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;outputs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;inference_time&lt;/span&gt;

&lt;span class="c1"&gt;# Generate new test data
&lt;/span&gt;&lt;span class="n"&gt;X_test&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_test&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;generate_motion_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event_length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;num_events&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;noise_level&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Predictions with SNN
&lt;/span&gt;&lt;span class="n"&gt;snn_outputs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;snn_inference_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;predict_and_measure_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;X_test&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;snn_predictions&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sigmoid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;snn_outputs&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;numpy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Predictions:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;snn_predictions&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Inference Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_inference_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Predictions with ANN
&lt;/span&gt;&lt;span class="n"&gt;ann_outputs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ann_inference_time&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;predict_and_measure_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;X_test&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;ann_predictions&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sigmoid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ann_outputs&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;numpy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Predictions:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ann_predictions&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Inference Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_inference_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Comparison Summary
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Comparison Summary:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Training Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_training_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Training Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_training_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SNN Inference Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_inference_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ANN Inference Time: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_inference_time&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; seconds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Final validation accuracies (from the last epoch)
&lt;/span&gt;&lt;span class="n"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;snn_val_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;snn_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;snn_val_accuracy&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sigmoid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;snn_val_outputs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;mean&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="n"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;ann_val_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;ann_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;ann_val_accuracy&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sigmoid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ann_val_outputs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;mean&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;item&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Final SNN Validation Accuracy: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;snn_val_accuracy&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Final ANN Validation Accuracy: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ann_val_accuracy&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>webdev</category>
      <category>ai</category>
      <category>machinelearning</category>
      <category>spikingneuralnetworks</category>
    </item>
    <item>
      <title>Google Search Parameters (2024 Guide)</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Tue, 23 Apr 2024 02:25:21 +0000</pubDate>
      <link>https://dev.to/serpapi/google-search-parameters-2024-guide-1k0g</link>
      <guid>https://dev.to/serpapi/google-search-parameters-2024-guide-1k0g</guid>
      <description>&lt;p&gt;The Google Search URL parameters are important to understand whether you are maximizing the conversion rate in your ad groups and optimizing your cost per click(CPC) rates in Google Analytics for your ad campaigns, improving your SEO(Search Engine Optimization) metrics for your e-commerce business, or collecting data for your social media project. Using custom parameters for your search will affect the Search Engine Result Page (SERP) data significantly in your favor. &lt;a href="https://serpapi.com"&gt;SerpApi&lt;/a&gt; unlocks the power of SERP data to you with minimal effort, the fastest response times, and in the most complete form.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/users/sign_up"&gt;Register to claim free credits&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this blog post, we will explore some of the Google search parameters with their corresponding attributions in &lt;a href="https://serpapi.com/search-api"&gt;SerpApi Documentation&lt;/a&gt;. You can head to &lt;a href="https://serpapi.com/playground"&gt;SerpApi Playground&lt;/a&gt; to test all of the information you will read below.&lt;/p&gt;

&lt;h3&gt;
  
  
  What are Google Search parameters?
&lt;/h3&gt;

&lt;p&gt;Google Search parameters are like special codes you can add to the web address (URL) when you search on Google. They help you control and customize your search results. For example, you can use them to specify the language or location for your search.&lt;/p&gt;

&lt;p&gt;Here's how they work: When you type a search query into Google, it creates a URL with your search terms. You can then add parameters to this URL using an ampersand (&amp;amp;) followed by the parameter name and value. These parameters, often called "params" for short, tell Google how you want your search results to be customized.&lt;/p&gt;

&lt;p&gt;For instance, if you want to search for "cats" in Spanish, you could add "hl=es" to the end of the URL. This tells Google to show results in Spanish. Once you've added all the parameters you want, you have your final URL ready for tailoring your tracking template. When you open this URL in your web browser, Google will show you search results based on the parameters you've specified.&lt;/p&gt;
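To make this concrete, here is a minimal standard-library sketch (plain Python, not SerpApi code) that assembles such a URL; urlencode also takes care of escaping spaces and special characters:

```python
from urllib.parse import urlencode

# Search for "cats" with the interface language set to Spanish (hl=es)
params = {"q": "cats", "hl": "es"}
url = "https://www.google.com/search?" + urlencode(params)
print(url)
```

Opening the printed URL in a browser shows Spanish-language results for the query.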

&lt;h2&gt;
  
  
  The q parameter
&lt;/h2&gt;

&lt;p&gt;It represents the search query submitted to Google. For example, in "q=Coffee", "Coffee" is the query you're searching for. It can also be combined with advanced search operators for filtering specific file types, matching keywords in URLs and page bodies, and more, giving you more exact matches. These operators include filetype, inurl, intitle, intext, inanchor, allinurl, allinanchor, allintitle, and allintext. You can check out &lt;a href="https://serpapi.com/blog/ultimate-guide-to-google-search-operators-2023-guide/"&gt;Ultimate Guide to Google Search Operators&lt;/a&gt; for detailed, up-to-date information on how to use Google search operators.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8ervrkrU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/34c3df2b-9e12-4d94-96d6-28701471e845" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8ervrkrU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/34c3df2b-9e12-4d94-96d6-28701471e845" alt="q" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-search-query-q"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The 'oq' parameter represents the original query as it was typed into the Google search box.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BF5xrkhN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/10b8ebdb-66ed-49c7-98de-634df44a951a" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BF5xrkhN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/10b8ebdb-66ed-49c7-98de-634df44a951a" alt="oq" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The hl parameter
&lt;/h2&gt;

&lt;p&gt;It stands for host language and determines the language used for the search. You can pass different forms of the 'hl' parameter to a Google search. The two-letter form covers all languages supported by Google; for example, "hl=en" searches in English. There are also regional forms consisting of two letters followed by a hyphen and either two more letters or three digits, such as "hl=en-gb" for British English or "hl=es-419" for Latin American Spanish.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--it3PhrUt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/d7184c71-30ef-4c1d-b1c8-54c5265f4a4e" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--it3PhrUt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/d7184c71-30ef-4c1d-b1c8-54c5265f4a4e" alt="hl" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-localization-hl"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The gl parameter
&lt;/h2&gt;

&lt;p&gt;It represents geolocation and can be used to personalize search results based on region. It takes a two-letter country code; for example, "gl=us" will serve results for the United States.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IcYr-fsQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/6fb764ff-f65e-41a4-b581-e24463e02a95" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IcYr-fsQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/6fb764ff-f65e-41a4-b581-e24463e02a95" alt="gl" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-localization-gl"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The lr parameter
&lt;/h2&gt;

&lt;p&gt;It stands for language restriction and is used to limit your search results to webpages in one or more specific languages. The value is a two-letter language code preceded by the word lang and an underscore. Multiple languages can be combined with a bar symbol (|). For example, "lr=lang_en|lang_fr" will restrict the results to pages in English or French.&lt;/p&gt;
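One practical detail worth noting: when the value is placed in a URL, the bar symbol itself must be percent-encoded as %7C. A minimal standard-library sketch (plain Python, not SerpApi code):

```python
from urllib.parse import urlencode, parse_qs

# Restrict results to English- and French-language pages;
# urlencode percent-encodes the bar symbol automatically
query = urlencode({"lr": "lang_en|lang_fr"})
print(query)  # lr=lang_en%7Clang_fr

# Decoding the query string recovers the original value
assert parse_qs(query)["lr"] == ["lang_en|lang_fr"]
```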

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gwIksZUS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/3363f66b-360c-40e8-b0cc-20f1276e94a6" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gwIksZUS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/3363f66b-360c-40e8-b0cc-20f1276e94a6" alt="lr" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-localization-lr"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The cr parameter
&lt;/h2&gt;

&lt;p&gt;It represents country restriction and refines results based on preferred regions. The value is a two-letter uppercase country code preceded by the word country. Multiple countries can be combined with a bar symbol (|). For example, "cr=countryUS|countryFR" will restrict the result pages to the United States and France.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--20ZvlMxc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/0544b1b7-360e-4e30-abf7-c8e068952df7" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--20ZvlMxc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/0544b1b7-360e-4e30-abf7-c8e068952df7" alt="cr" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-localization-cr"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The tbs parameter
&lt;/h2&gt;

&lt;p&gt;It stands for "to be searched" and can be used for various filtering purposes, for example, limiting results to a specific date or date range (e.g., past hour, past week, past month). For example, "tbs=qdr:h" will serve only the results from the last hour.&lt;/p&gt;
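The common time filters follow a small vocabulary ('qdr' is usually read as "query date range"; the letter values below are the ones observed in Google's own "Tools" menu). A short sketch using only the Python standard library:

```python
from urllib.parse import urlencode

# Observed qdr values: h = past hour, d = past day, w = past week,
# m = past month, y = past year
time_filters = {"hour": "qdr:h", "day": "qdr:d", "week": "qdr:w",
                "month": "qdr:m", "year": "qdr:y"}

# The colon is percent-encoded as %3A when placed in a URL
query = urlencode({"tbs": time_filters["hour"]})
print(query)  # tbs=qdr%3Ah
```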

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dsiwsIEr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/5f58a75f-5938-4cc9-8cc4-752aa66152a3" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dsiwsIEr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/5f58a75f-5938-4cc9-8cc4-752aa66152a3" alt="tbs" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-advanced-filters-tbs"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The source parameter
&lt;/h2&gt;

&lt;p&gt;It indicates the source of your search, such as the Google homepage or a referral from another website. For example "source=hp" means that the page was accessed through the homepage.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ce_TnUp3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/300520e7-ac3a-4207-87a4-e264773b648f" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ce_TnUp3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/300520e7-ac3a-4207-87a4-e264773b648f" alt="source" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The ei parameter
&lt;/h2&gt;

&lt;p&gt;It possibly stands for engine identifier: a unique identifier assigned to each search, likely used for internal tracking by Google. Its decoded form consists of four numbers. The first two are a Unix timestamp and the microseconds of that timestamp, respectively; the last two are undisclosed and unknown.&lt;/p&gt;
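Community reverse engineering (not official documentation) suggests the base64url-decoded 'ei' value begins with a four-byte little-endian Unix timestamp. The sketch below illustrates that assumed layout on a synthetic value rather than a real 'ei':

```python
import base64
import datetime

# Assumed layout: 4-byte little-endian Unix timestamp, then opaque bytes
timestamp = 1713838400
payload = timestamp.to_bytes(4, "little") + bytes(8)  # stand-in for the undisclosed tail
fake_ei = base64.urlsafe_b64encode(payload).rstrip(b"=")

# Decode: restore base64 padding, read the first 4 bytes back as the timestamp
raw = base64.urlsafe_b64decode(fake_ei + b"=" * (-len(fake_ei) % 4))
decoded = int.from_bytes(raw[:4], "little")
print(datetime.datetime.fromtimestamp(decoded, datetime.timezone.utc))
```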

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ajlRX-Hb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/bbfb31e1-7852-41e0-97c5-11b37eea04ad" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ajlRX-Hb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/bbfb31e1-7852-41e0-97c5-11b37eea04ad" alt="ei" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The ved parameter
&lt;/h2&gt;

&lt;p&gt;It is used for internal tracking of links and elements. Apart from the link itself, the element index and type can be gathered by decoding the ved value.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yH5k6h5i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/f21c446d-649c-4dd1-91b6-8acf8a81a01b" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yH5k6h5i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/f21c446d-649c-4dd1-91b6-8acf8a81a01b" alt="ved" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/serpapi/ved_decoder"&gt;Open Source Repository for Decoding the ved values&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The gs_lp parameter
&lt;/h2&gt;

&lt;p&gt;Not much is disclosed about this parameter. It contains the search query and search client.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dci5EYVx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/6dce671b-1bc4-4d60-9523-afd6575e7c30" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dci5EYVx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/6dce671b-1bc4-4d60-9523-afd6575e7c30" alt="gs_lp" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The iflsig parameter
&lt;/h2&gt;

&lt;p&gt;This parameter isn't publicly documented. It is speculated to be related to internal redirects; other guesses connect it to parsing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--LBqN6bEK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/af689b09-5075-4b93-b5b7-5e26d56c2ed7" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--LBqN6bEK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/af689b09-5075-4b93-b5b7-5e26d56c2ed7" alt="iflsig" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The ibp parameter
&lt;/h2&gt;

&lt;p&gt;It is responsible for rendering elements differently than their usual format. For example, when used with the ludocid parameter (unique id for a Google Business listing) in a mobile view, "ibp=gwp;0,7&amp;amp;ludocid=1663467585083384531", the 'ibp' parameter can give full page views of the listing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xlAvvK-B--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/dabf949d-a7c5-4a70-981d-531a782ac731" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xlAvvK-B--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/dabf949d-a7c5-4a70-981d-531a782ac731" alt="ibp" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-advanced-google-parameters-ibp"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The kgmid parameter
&lt;/h2&gt;

&lt;p&gt;The parameter is responsible for serving elements saved by Google, including the knowledge graph, and it can override the 'q' (query) parameter. For example, "kgmid=/m/07gyp7" will get results for McDonald's, complete with its knowledge graph panel, regardless of the query parameter. The parameter likely acts as a cached reference to a knowledge graph entity.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JWUWhpOX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/39be971b-46a0-46ca-a48c-c7af4d23d4b7" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JWUWhpOX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/39be971b-46a0-46ca-a48c-c7af4d23d4b7" alt="kgmid" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-advanced-google-parameters-kgmid"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The si parameter
&lt;/h2&gt;

&lt;p&gt;It is used to serve knowledge graph tabs alongside other tasks. In the example below, the 'si' parameter is responsible for serving the Google search results with the 'Music Groups' tab of the knowledge graph for Jimi Hendrix.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SXn2Z4FK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/e4618533-895c-4c3d-addf-c1babec54970" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SXn2Z4FK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/e4618533-895c-4c3d-addf-c1babec54970" alt="si" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-advanced-google-parameters-si"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The sxsrf parameter
&lt;/h2&gt;

&lt;p&gt;It is most likely an internal security token. The 'sxsrf' value appears to consist of an encoded token and a Unix timestamp.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zsYjZ6El--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/917a0c22-62bd-40e5-8d14-9d74530db415" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zsYjZ6El--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/917a0c22-62bd-40e5-8d14-9d74530db415" alt="sxsrf" width="800" height="459"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The uule parameter
&lt;/h2&gt;

&lt;p&gt;It is used to specify a location for a search. The 'uule' parameter has two types. The first type is an encoded form of the canonical name of a place in &lt;a href="https://developers.google.com/google-ads/api/data/geotargets"&gt;Google Ads API Geo Targets&lt;/a&gt;. The second type is an encoded form of the GPS coordinates alongside other data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Z1bYz8gs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/c555c0a6-d3a9-4de4-a480-58871d432d14" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Z1bYz8gs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/c555c0a6-d3a9-4de4-a480-58871d432d14" alt="uule" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-geographic-location-uule"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/blog/gps-targeted-local-serp/"&gt;A Blog Post about Different Types of UULE&lt;/a&gt;&lt;/p&gt;
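
&lt;p&gt;As an illustration, the widely circulated, reverse-engineered recipe for the first (canonical name) type builds a small protobuf payload and Base64-encodes it. The sketch below assumes the field values reported in community write-ups; the names and numbers are not official Google documentation:&lt;/p&gt;

```ruby
require 'base64'

# Sketch of the commonly reverse-engineered "canonical name" uule encoding.
# The protobuf field values (role = 2, producer = 32) come from community
# write-ups and are assumptions, not official Google documentation.
def uule_for(canonical_name)
  # "\x08\x02"      field 1 (role) = 2
  # "\x10\x20"      field 2 (producer) = 32
  # "\x22" + length field 4: length-prefixed canonical name
  #                 (single length byte, so names must stay under 128 bytes)
  payload = "\x08\x02\x10\x20\x22".b +
            [canonical_name.bytesize].pack('C') +
            canonical_name.b
  'w+' + Base64.strict_encode64(payload)
end

puts uule_for('Austin,Texas,United States')
# => "w+CAIQICIaQXVzdGluLFRleGFzLFVuaXRlZCBTdGF0ZXM="
```

&lt;p&gt;The familiar "w+CAIQICI" prefix falls out of this math: the first five payload bytes are fixed, so the character right after the prefix encodes the name's length through the Base64 alphabet (for names shorter than 64 bytes).&lt;/p&gt;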

&lt;h2&gt;
  
  
  The sclient parameter
&lt;/h2&gt;

&lt;p&gt;It stands for search client. For example "sclient=gws-wiz" means Google Web Search Wizard is used for the search.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KN8XOiBR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/f2d589b4-5438-42ab-aae5-de3c749e096f" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KN8XOiBR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/f2d589b4-5438-42ab-aae5-de3c749e096f" alt="sclient" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The tbm parameter
&lt;/h2&gt;

&lt;p&gt;It reportedly stands for "to be matched". This parameter specifies the type of content the user is searching for, allowing Google to tailor the results accordingly. For instance, "tbm=isch" requests an image search, while "tbm=nws" restricts results to Google News. Using this parameter, a search can be narrowed to a specific content vertical without changing the query itself.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--W2owTNv9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/c1ceb6e5-3fb4-4cbb-accb-8510ddd2f4e1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--W2owTNv9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://gist.github.com/assets/73674035/c1ceb6e5-3fb4-4cbb-accb-8510ddd2f4e1" alt="tbm" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/search-api#api-parameters-search-type-tbm"&gt;Reference in SerpApi Documentation&lt;/a&gt;&lt;/p&gt;
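
&lt;p&gt;As a quick sketch, a hypothetical helper like the one below can append a 'tbm' value to a plain search URL; the symbolic names are my own labels for the commonly observed values:&lt;/p&gt;

```ruby
require 'uri'

# Hypothetical helper assuming the commonly observed tbm values; the
# symbolic names (images, news, ...) are my own labels, not Google's.
TBM_VALUES = { images: 'isch', news: 'nws', videos: 'vid', books: 'bks' }.freeze

def google_search_url(query, vertical = nil)
  params = { q: query }
  params[:tbm] = TBM_VALUES.fetch(vertical) if vertical
  'https://www.google.com/search?' + URI.encode_www_form(params)
end

puts google_search_url('jimi hendrix', :images)
```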

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Thank you for reading. I hope this article sheds some light on how Google search parameters work. For more up-to-date information about web scraping, SERP data, and related topics, you may visit the &lt;a href="https://serpapi.com/blog"&gt;SerpApi Blog&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>google</category>
      <category>parameters</category>
      <category>serp</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Comparing Types of Databases: A Real-World Benchmark Analysis</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Thu, 09 Nov 2023 19:48:01 +0000</pubDate>
      <link>https://dev.to/serpapi/comparing-types-of-databases-a-real-world-benchmark-analysis-4fjg</link>
      <guid>https://dev.to/serpapi/comparing-types-of-databases-a-real-world-benchmark-analysis-4fjg</guid>
      <description>&lt;p&gt;&lt;strong&gt;Intro&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Choosing between common database solutions can be tough. In this article, I will share how to choose a database for a specific real-world case, give introductory information on how different databases store data, open-source a benchmarking script, discuss important factors in the decision-making, and share benchmark reports as charts.&lt;/p&gt;

&lt;h1&gt;
  
  
  The Problem
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://serpapi.com"&gt;SerpApi&lt;/a&gt; is an API for scraping Google and other search engines with fast, easy, and complete solutions. Our team is tackling a challenge within our operational database where we house a Locations collection featuring predefined data structures. We aim to offer these locations to clients reliably and allow them to utilize these in their searches with &lt;a href="https://serpapi.com/search-api"&gt;SerpApi's Google Search API&lt;/a&gt;, demonstrating the interconnected functionality of various types of databases. You may &lt;a href="https://serpapi.com/users/sign_up"&gt;register to claim free credits&lt;/a&gt; to try out our products.&lt;/p&gt;

&lt;h1&gt;
  
  
  The Present Solution and Its Shortcomings
&lt;/h1&gt;

&lt;p&gt;We're operating a robust MongoDB server—an example of one of the types of NoSQL database, more specifically a document-oriented database—which stores the Locations collection objects. This enables us to efficiently store data, manage complex data structures, and enhance the Location model's scalability. Its adeptness at performing complete searches, fuzzy matches, and geospatial evaluations between nodes stands out. The Locations collection is one of the primary keys of the search object and is crucial for a big portion of our searches.&lt;/p&gt;

&lt;p&gt;However, this approach has a scalability issue concerning portability. The firm coupling with a centralized database, alongside other collections with parent-child relations, limits our agility in customizing the schema and reduces the room for maneuver across diverse operational settings. Altering a small collection by adding or deleting entries in such a hierarchical database carries inherent risks due to its monolithic nature.&lt;/p&gt;

&lt;p&gt;Additionally, constants loaded during server initialization, which could reside in a cloud database with superior indexing, instead consume significant amounts of memory. For this analysis, we'll focus on the Location model within this DBMS.&lt;/p&gt;

&lt;h1&gt;
  
  
  Proposed Solutions
&lt;/h1&gt;

&lt;p&gt;I have devised three prototypes to surmount the portability hurdle. These databases store and serve data in different ways. I will graph the benchmark results in order to provide comparable data for each prototype.&lt;/p&gt;

&lt;h2&gt;
  
  
  Using a Static Sqlite3 File
&lt;/h2&gt;

&lt;p&gt;Deploying a static Sqlite3 file, akin to a personal database on each server, offers remarkable portability, allows for server-level data management, and provides the advantages of a relational database management system (RDBMS). This is reflective of the distributed database paradigm, where local updates can be methodically managed and validated through unit testing. Integration with our MongoDB documents is seamless, thanks to the BSON-serialization of Ruby hashes. SQLite is a popular database software choice for utilizing the advantages of relational models.&lt;/p&gt;

&lt;h2&gt;
  
  
  Streaming from a Static JSON File
&lt;/h2&gt;

&lt;p&gt;Streaming from a static JSON file, which is a form of semi-structured data, is another solution, maintaining control at the server level for safe, localized development. Even updating key-value structures is relatively easy. This represents a move towards a microservices architecture, often hosted on cloud computing platforms. This acts as a kind of database for information management.&lt;/p&gt;
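
&lt;p&gt;A minimal sketch of this idea, assuming the collection is exported as newline-delimited JSON so that lookups can scan the file lazily instead of loading it whole:&lt;/p&gt;

```ruby
require 'json'
require 'tempfile'

# Sketch of the streaming prototype: one JSON object per line, scanned
# lazily so the whole collection never sits in memory at once.
def search_locations(path, query, limit = 5)
  File.foreach(path).lazy
      .map { |line| JSON.parse(line) }
      .select { |loc| loc['name'].start_with?(query) }
      .first(limit)
end

# Illustrative data; the field names are assumptions, not the real schema.
file = Tempfile.new('locations')
file.puts({ name: 'Austin', gps: [-97.7431, 30.2672] }.to_json)
file.puts({ name: 'Ankara', gps: [32.8597, 39.9334] }.to_json)
file.flush

p search_locations(file.path, 'An').map { |l| l['name'] }
# => ["Ankara"]
```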

&lt;h2&gt;
  
  
  External Standalone MongoDB Server
&lt;/h2&gt;

&lt;p&gt;A standalone MongoDB server represents a non-relational database alternative for storing child records of the search object, more specifically the Location collection. This option demands a deep understanding of database technology, offering real-time adjustments through a secured public IP, maintaining data security, and harnessing MongoDB 2dsphere for indexing data models for spherical approximations. It is a popular database to store information with wide community support.&lt;/p&gt;
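
&lt;p&gt;To illustrate the 2dsphere approach, the hypothetical helpers below build the index specification and a '$near' query document; the field names are illustrative, and no live server is needed to construct them:&lt;/p&gt;

```ruby
# Sketch of the standalone-server option: a 2dsphere index over a GeoJSON
# point field, and a $near filter for the closest location. The 'gps'
# field name is illustrative, not SerpApi's actual schema.
def geo_index_spec(field = 'gps')
  { field => '2dsphere' }
end

def nearest_location_filter(longitude, latitude, max_meters = 50_000, field = 'gps')
  {
    field => {
      '$near' => {
        # GeoJSON coordinates are ordered longitude first, then latitude.
        '$geometry' => { 'type' => 'Point',
                         'coordinates' => [longitude, latitude] },
        '$maxDistance' => max_meters
      }
    }
  }
end

p nearest_location_filter(-97.7431, 30.2672)
```

&lt;p&gt;With the official mongo gem, the filter would be passed to a find call on the locations collection, and MongoDB would return documents ordered by spherical distance from the point.&lt;/p&gt;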

&lt;h1&gt;
  
  
  Performance Benchmarks
&lt;/h1&gt;

&lt;p&gt;The benchmarks involve various queries and GPS coordinates, providing an average of real-time performance over multiple iterations. These parameters act as a data set to compare different performance metrics of the prototypes.&lt;/p&gt;

&lt;p&gt;I have applied the benchmark to the JSON Streaming prototype only up to a concurrency level of 5. It wasn't efficient enough to benchmark all the way through in my local environment and was hitting difficult bottlenecks.&lt;/p&gt;

&lt;p&gt;There are important details to consider in this benchmark. Both the current solution with the monolithic MongoDB database and the standalone MongoDB server execute commands on a remote server, so their numbers depend on connection latency. Unlike those two, the Sqlite3 and JSON Streaming prototypes are contained within an individual server and are only affected by system load and hardware throughput. The benchmark records several time measurements; to keep things simple, we will use the average real time taken as our indicator.&lt;/p&gt;

&lt;p&gt;You may find the complete benchmark below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'benchmark'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'thread'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'csv'&lt;/span&gt;

&lt;span class="no"&gt;SAMPLE_QUERIES&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'USA'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'FalseInput'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Canada'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Austin'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Ankara'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="no"&gt;SAMPLE_COORDINATES&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="mf"&gt;37.0902&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mf"&gt;95.7129&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;56.1304&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mf"&gt;106.3468&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;30.2672&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mf"&gt;97.7431&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;39.9334&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;32.8597&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt;
&lt;span class="no"&gt;TIMES&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
&lt;span class="no"&gt;CONCURRENCY_LEVELS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;to_a&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;perform_searches&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;latitude&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="nb"&gt;method&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="ss"&gt;:find_nearest_location&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;else&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;concurrent_benchmark&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;concurrency_level&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;latitude&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;threads&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="n"&gt;concurrency_level&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;times&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
    &lt;span class="n"&gt;threads&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="no"&gt;Thread&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
      &lt;span class="n"&gt;perform_searches&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="n"&gt;threads&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:join&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;benchmark_methods&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;

  &lt;span class="no"&gt;CONCURRENCY_LEVELS&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
    &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
    &lt;span class="no"&gt;SAMPLE_QUERIES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="ss"&gt;search: &lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;search_for_completion: &lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;find_nearest_location: &lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;

      &lt;span class="n"&gt;times&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Array&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
        &lt;span class="no"&gt;Benchmark&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;measure&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
          &lt;span class="n"&gt;concurrent_benchmark&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;:search&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="ss"&gt;:search&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="ss"&gt;real: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:real&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;cpu: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:total&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;system: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:stime&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;user: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:utime&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;

      &lt;span class="n"&gt;times&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Array&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
        &lt;span class="no"&gt;Benchmark&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;measure&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
          &lt;span class="n"&gt;concurrent_benchmark&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;:search_for_completion&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="ss"&gt;:search_for_completion&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="ss"&gt;real: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:real&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;cpu: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:total&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;system: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:stime&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;user: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:utime&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;

      &lt;span class="n"&gt;coord_index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;SAMPLE_QUERIES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;SAMPLE_COORDINATES&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;coord_index&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
      &lt;span class="n"&gt;times&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Array&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
        &lt;span class="no"&gt;Benchmark&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;measure&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
          &lt;span class="n"&gt;concurrent_benchmark&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;:find_nearest_location&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="ss"&gt;:find_nearest_location&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="ss"&gt;real: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:real&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;cpu: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:total&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;system: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:stime&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="ss"&gt;user: &lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:utime&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="no"&gt;TIMES&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;

  &lt;span class="n"&gt;results&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;print_results&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;concurrency_results&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
    &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Concurrency Level: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="n"&gt;concurrency_results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;methods&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Query: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
      &lt;span class="nb"&gt;methods&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
        &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"  &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;:"&lt;/span&gt;
        &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"    Average real time over &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; runs: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:real&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;s"&lt;/span&gt;
        &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"    Average CPU time over &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; runs: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:cpu&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;s"&lt;/span&gt;
        &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"    Average system time over &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; runs: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:system&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;s"&lt;/span&gt;
        &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"    Average user time over &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="no"&gt;TIMES&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; runs: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:user&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;s"&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
    &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"-"&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;benchmark_methods&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;Location&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;print_results&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;export_to_csv&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;filename&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"benchmark_results.csv"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="no"&gt;CSV&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;filename&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"wb"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;csv&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
    &lt;span class="n"&gt;csv&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Concurrency Level"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"Query"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"Method"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"Average Real Time"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"Average CPU Time"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"Average System Time"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"Average User Time"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;concurrency_results&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="n"&gt;concurrency_results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;methods&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
        &lt;span class="nb"&gt;methods&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
          &lt;span class="n"&gt;csv&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;concurrency&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:real&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:cpu&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:system&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;times&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="ss"&gt;:user&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt;
        &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="n"&gt;export_to_csv&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;filename&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"benchmark_results.csv"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Overall Benchmark Results
&lt;/h2&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/kagermanov/embed/MWLmjBP?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;br&gt;
The overarching benchmarks, visualized in the graph above, reveal that the JSON Streaming model is not viable for our high-performance standards, primarily because of the heavy optimization it would require to handle big data. Excluding its data also makes it easier to read minute details for the remaining data points in the charts. It is not a fit prototype for industry-standard methodologies.&lt;/p&gt;
&lt;h2&gt;
  
  
  Benchmark Results for Proximity Estimation
&lt;/h2&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/kagermanov/embed/LYqyRMV?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;br&gt;
As depicted above, our prototypes, including the current SQL database solution and the standalone MongoDB server, demonstrate diverse performance levels in geospatial proximity estimation. The MongoDB model showcases the strength of its geospatial indexing capabilities.&lt;/p&gt;

&lt;p&gt;While MongoDB has its 2dsphere index for geospatial queries, for Sqlite3 I used the simplification below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt; &lt;span class="n"&gt;where&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;Sequel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;~&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;gps_lat: &lt;/span&gt;&lt;span class="kp"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;where&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;Sequel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;~&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;gps_lng: &lt;/span&gt;&lt;span class="kp"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;order&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;Sequel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;lit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"ABS(gps_lat - ?) + ABS(gps_lng - ?)"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Haversine formula is much more effective at giving accurate approximations for dense maps, but for the purposes of this task, using the absolute difference as an indicator is enough. Notably, MongoDB still excels at speed even though it uses the Haversine formula.&lt;/p&gt;
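&lt;p&gt;For reference, the Haversine computation can be sketched in plain Ruby. This is a minimal illustration of the great-circle distance (in kilometers) that underlies MongoDB's geospatial queries, as opposed to the absolute-difference simplification above; it is not MongoDB's actual implementation:&lt;/p&gt;

```ruby
EARTH_RADIUS_KM = 6371.0 # mean Earth radius

# Great-circle distance between two coordinates via the Haversine formula.
def haversine_km(lat1, lng1, lat2, lng2)
  rad = Math::PI / 180 # degrees to radians
  dlat = (lat2 - lat1) * rad
  dlng = (lng2 - lng1) * rad
  a = Math.sin(dlat / 2)**2 +
      Math.cos(lat1 * rad) * Math.cos(lat2 * rad) * Math.sin(dlng / 2)**2
  2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a))
end
```

&lt;p&gt;One degree of longitude at the equator comes out to roughly 111 km, which is a quick sanity check for the implementation.&lt;/p&gt;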

&lt;p&gt;Also, the current monolithic solution and the standalone prototype give roughly the same numbers under different concurrencies. It is worth pointing out that the standalone server's hardware can easily be upgraded.&lt;/p&gt;

&lt;h2&gt;
  
  
  Benchmark Results for Fuzzy Search Capability
&lt;/h2&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/kagermanov/embed/NWojRey?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;br&gt;
The above chart shows the results for fuzzy search, a feature critical in web applications for finding a result from the common words the user provides. Although the difference is not that big, the Sqlite3 prototype is clearly ahead in terms of speed, and the margin between the standalone prototype and the current monolithic server is growing. One point needs attention, though: the Sqlite3 prototype uses a LIKE clause in SQL to perform the fuzzy search. An equivalent solution that gives MongoDB similar SQL-like query constructs would likely result in faster search completions for a standalone MongoDB prototype.&lt;/p&gt;
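&lt;p&gt;For reference, the matching behavior of such a LIKE clause can be sketched in pure Ruby by translating the pattern into a regular expression. This is a simplified, case-insensitive approximation for illustration, not Sqlite3's actual implementation:&lt;/p&gt;

```ruby
# Translate a SQL LIKE pattern into a Ruby Regexp:
# "%" matches any run of characters, "_" matches exactly one character.
# Case-insensitive, mirroring Sqlite3's default LIKE behavior for ASCII.
def like_to_regexp(pattern)
  escaped = Regexp.escape(pattern).gsub("%", ".*").gsub("_", ".")
  /\A#{escaped}\z/i
end

# A condition like `name LIKE '%coffee%'` then behaves as:
like_to_regexp("%coffee%").match?("Best Coffee Shop in Austin") # true
```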

&lt;h2&gt;
  
  
  Direct Search Method Benchmark Results
&lt;/h2&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/kagermanov/embed/BaMRLMm?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;br&gt;
This is the method responsible for the overwhelming majority of the searches &lt;a href="https://serpapi.com/"&gt;SerpApi&lt;/a&gt; processes, so any signal here is crucial to the entire comparison. Although the Sqlite3 prototype is superior to the standalone MongoDB prototype at lower concurrencies, the latter keeps its consistency at high concurrency and wins against the former, which is pivotal for large amounts of data and complex data-management scenarios. The difference between the current monolithic MongoDB solution and the standalone MongoDB prototype is not that big in terms of milliseconds, and the standalone database can also be configured with more resources to compete.&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;Discarding the JSON Streaming model due to its concurrency shortcomings in big-data storage, we are left to decide between the Sqlite3 and MongoDB standalone prototypes. The decision comes down to several factors across the different types of databases and the architecture we use them in.&lt;/p&gt;

&lt;p&gt;Any lack of efficiency in the Sqlite3 prototype would have to be addressed by upgrading the resources of each individual server in the cluster. This is a bad sign for a high-load task, threatening atomicity and data integrity. The need to dig deep into SQL is also a downside, although it is not as tough as learning a new programming language. On top of these, the inefficiency of the direct search method alone is reason enough not to go forward with this database model. However, any way of bringing the efficiency of the Sqlite3 prototype's search-for-completion method to a standalone MongoDB server would be beneficial: anything that works with the same logic as SQL's LIKE clause in MongoDB could benefit &lt;a href="https://serpapi.com/playground"&gt;SerpApi's Playground&lt;/a&gt; for the end user. Also, the standalone MongoDB server should be optimized with additional resources to get as close to the current solution as possible.&lt;/p&gt;

&lt;p&gt;In conclusion, the standalone MongoDB database structure should be used for storing secondary constants and collections in our use case. I would like to thank the reader for their attention, and I hope this writing sheds some light on how to benchmark different prototype database solutions for tackling a single problem.&lt;/p&gt;

&lt;p&gt;Originally published at &lt;a href="https://serpapi.com/blog/comparing-types-of-databases-a-real-world-benchmark-analysis/"&gt;https://serpapi.com&lt;/a&gt; on November 8, 2023.&lt;/p&gt;

</description>
      <category>database</category>
      <category>mongodb</category>
      <category>sql</category>
      <category>json</category>
    </item>
    <item>
      <title>Cold Email Marketing with Open-Source Email Extractor</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Fri, 07 Jul 2023 19:38:51 +0000</pubDate>
      <link>https://dev.to/serpapi/cold-email-marketing-with-open-source-email-extractor-3j54</link>
      <guid>https://dev.to/serpapi/cold-email-marketing-with-open-source-email-extractor-3j54</guid>
      <description>&lt;p&gt;In this blog post, we will deep-dive into the fascinating world of Cold Email Marketing, exploring its multifaceted strategies, unraveling the do's and don'ts, and introducing you to an exceptional Open-Source tool that simplifies the process of collecting mass-targeted emails.&lt;/p&gt;

&lt;p&gt;Related Sources&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://github.com/serpapi/clauneck"&gt;Clauneck&lt;/a&gt;: An Open-Source Command Line Tool to Scrape and Extract Emails and Social Network Handles with Targeted SERP Data&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://serpapi.com/search-api"&gt;SerpApi's Google Search API&lt;/a&gt;: An Easy-to-use API that provides Real-Time Google Search Results in JSON format&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What Does Cold Email Mean?
&lt;/h2&gt;

&lt;p&gt;Cold emailing, a pivotal strategy in digital marketing, stands as the lifeline of many B2B sales campaigns. It's a technique that involves sending unsolicited but well-crafted emails to potential clients or potential customers, primarily for sales, promotional, or networking purposes. Unlike its infamous cousin spam, a good cold email isn't an unsolicited barrage but a carefully personalized note sent to specific individuals or companies. The targets are meticulously chosen based on certain criteria, such as their industry, job title, or interest level in your product/service.&lt;/p&gt;

&lt;p&gt;As with crafting a compelling sales pitch, building an effective cold email requires meticulous planning. From devising engaging subject lines, and curating appealing email body content, to ensuring that your email successfully avoids the recipient's spam folder, each step is essential. The best cold email practices are all about addressing a pain point, providing a solution, and adding value to the recipient's life or business, rather than being a pesky interruption.&lt;/p&gt;

&lt;h2&gt;
  
  
  Is Cold Emailing Illegal?
&lt;/h2&gt;

&lt;p&gt;One of the biggest myths surrounding cold email outreach is the idea that it's illegal. However, in reality, the legality of cold emails hinges on certain key factors and how you execute your outreach email strategy. In most jurisdictions, it's entirely legal to send cold emails for B2B purposes, as long as the email includes an accurate email signature with contact information, including the sender's physical address and phone number. Moreover, the email must offer a clear and conspicuous unsubscribe option, enabling recipients to opt out of future communications with just a click. It is also recommended to use an email checker to verify addresses and reduce your bounce rate, preventing issues with your ESP (email service provider).&lt;/p&gt;

&lt;p&gt;However, things may get a little murky when it comes to scraping emails from social media accounts like LinkedIn or websites. Although it's generally considered legal if these addresses are publicly listed and meant for communication purposes, it's crucial to tread carefully. Every email marketer must vigilantly avoid violating privacy laws or regulations such as the CAN-SPAM Act in the U.S. or GDPR in Europe. Thankfully, CRM systems offer an integrated solution for managing this legal maze, helping to ensure compliance and improve email deliverability.&lt;/p&gt;

&lt;h2&gt;
  
  
  Does Cold Email Actually Work?
&lt;/h2&gt;

&lt;p&gt;The efficacy of cold outreach campaigns is a hot topic in digital marketing circles, and the simple answer is yes: with the right kind of automation, it can work phenomenally, albeit with a few caveats. The success of a cold email campaign depends heavily on multiple factors: the quality of your bulk email list, addressing the recipient by their name or their company's name, a well-crafted opening line introducing your service, the relevance of your message to the recipient, your cold email subject lines, a call to action for a better conversion rate, and the robustness of your follow-up email strategy. Metrics such as open rate, reply rate, and conversion rate stand as the cardinal indicators for measuring the effectiveness of a cold email campaign.&lt;/p&gt;
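&lt;p&gt;As a small illustration of these indicators, each rate is simply the corresponding count divided by the number of emails sent. The helper below is hypothetical and only shows the arithmetic:&lt;/p&gt;

```ruby
# Hypothetical helper: compute cold email campaign metrics from raw counts.
def campaign_metrics(sent:, opened:, replied:, converted:)
  {
    open_rate: opened.fdiv(sent),
    reply_rate: replied.fdiv(sent),
    conversion_rate: converted.fdiv(sent)
  }
end

campaign_metrics(sent: 200, opened: 90, replied: 24, converted: 6)
# => {:open_rate=>0.45, :reply_rate=>0.12, :conversion_rate=>0.03}
```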

&lt;p&gt;When implemented right, cold emails can transform into a potent tool for lead generation, enabling salespeople to directly reach the decision-makers, bypassing the usual gatekeepers. To optimize your approach, consider A/B testing different cold email templates, subject lines, and CTAs. For instance, integrating a case study in the email body demonstrating how your product/service resolved a similar problem for another company can bolster your value proposition and amplify your response rate. Also, later in this blog post, we will talk about open-source email extractor software that can increase the precision of your email list to reach a more nuanced audience.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Email Scraping?
&lt;/h2&gt;

&lt;p&gt;Email scraping or email extraction is a high-tech process of deploying an email scraper tool or bot to find email addresses and mine them from various online sources. These sources can span from social media platforms, LinkedIn profiles, and websites, to even text files. An efficient email scraper dives deep into HTML and JavaScript codes, scans web pages, and leverages APIs to collate email addresses.&lt;/p&gt;

&lt;p&gt;The harvested data is usually stored in a structured format like a CSV or Excel file for effortless access and analysis. However, it's vital to note that while scraping public email addresses for communication or lead-generation purposes can be legal, there are ethical considerations and regulations to be mindful of. Upholding respect for user privacy and adherence to data protection laws should always be paramount when engaging in email scraping activities.&lt;/p&gt;
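&lt;p&gt;As a minimal sketch of that flow, the snippet below scans a page's text for email-looking strings with a deliberately simplified regular expression (not a full RFC-compliant validator), deduplicates them, and stores them in a CSV file:&lt;/p&gt;

```ruby
require "csv"

# Simplified email pattern for illustration only.
EMAIL_PATTERN = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/

def extract_emails(text)
  text.scan(EMAIL_PATTERN).uniq
end

page = "Reach us at contact@serpapi.com or contact@serpapi.com"
CSV.open("emails.csv", "w") do |csv|
  extract_emails(page).each { |email| csv.add_row([email]) }
end
```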

&lt;h3&gt;
  
  
  Is it Legal to Scrape Emails?
&lt;/h3&gt;

&lt;p&gt;The legality of email scraping may seem dubious, but it primarily depends on the intent behind the act and the method of extraction. If the email addresses are publicly accessible and are collected for legitimate purposes like business prospecting, then it's typically considered legal. Email addresses displayed on websites or public social media accounts are often meant for communication purposes. Hence, employing an email scraper tool to compile these addresses aligns with legal practices. However, adhering to the respective data protection regulations of each country and obtaining consent where necessary is crucial to avoid any potential legal complications.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why Is Targeting Important in Scraping Emails?
&lt;/h3&gt;

&lt;p&gt;The importance of targeting in email scraping lies in its power to deliver both efficiency and relevance. Efficiency stems from the need to minimize the number of irrelevant or uninterested recipients who are unlikely to engage with your content. By concentrating on a specific target audience or demographic, the volume of emails you scrape may decrease, but the quality of those emails, in terms of potential engagement and response, improves dramatically.&lt;/p&gt;

&lt;p&gt;Moreover, targeting aids in reducing the likelihood of being classified as spammers. When emails are scraped indiscriminately, there's a higher chance that your sales emails will be dispatched to individuals who have no interest in your content or offerings. These individuals are more likely to mark your emails as spam, activating spam filters which can negatively impact your email deliverability rates and sender reputation. By ensuring that the emails you scrape belong to people likely interested in your offerings, you reduce the risk of your emails being labeled as spam, thereby securing your sender's reputation. This leads to higher open rates, better engagement, and overall improved effectiveness of your email marketing efforts.&lt;/p&gt;

&lt;h2&gt;
  
  
  How can SerpApi's Google Search API help with targeting?
&lt;/h2&gt;

&lt;p&gt;Although SerpApi's Google Search API is well known for SEO purposes, it can be an influential tool that enhances the process of email scraping by delivering targeted and refined search results. This API allows you to programmatically interact with the Google Search engine, extracting invaluable data from the search results page that can be used to inform and refine your business email scraping strategy.&lt;/p&gt;

&lt;p&gt;One way SerpApi's Google Search API can aid with targeted email scraping is by creating a more refined list of websites or companies that align with your business or campaign objectives. For example, if you're running a cold email campaign targeting tech startups, you can use the Google Search API to generate a list of such companies. This list can then be input into your email scraper tool, which will extract the email addresses from these specific websites, ensuring your extracted email list is relevant and targeted.&lt;/p&gt;

&lt;h2&gt;
  
  
  An Open-Source Email Extractor Tool
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/serpapi/clauneck"&gt;&lt;code&gt;Clauneck&lt;/code&gt;&lt;/a&gt; is a Ruby gem, and a user-friendly command line tool designed to scrape specific information from a series of URLs, either directly provided or fetched from Google search results via &lt;a href="https://serpapi.com/search-api"&gt;SerpApi's Google Search API&lt;/a&gt;. Its key feature is to extract email addresses and much more information like social media handles from web pages and to store the results in a CSV file ordered by their domain name.&lt;/p&gt;

&lt;h3&gt;
  
  
  What Does Clauneck Provide?
&lt;/h3&gt;

&lt;p&gt;The tool looks for the targeted information it can find on each website, such as email addresses and social media handles, and extracts the data to a local file in CSV format. If it cannot find a given piece of information on a website, it labels the field as null. For unexpected errors encountered along the way (connection errors, encoding errors, etc.), the fields are filled with error.&lt;/p&gt;

&lt;p&gt;Unlike Google Chrome extensions that need you to visit webpages one by one, Clauneck excels in bringing the list of websites to you by leveraging &lt;a href="//serpapi.com/search-api"&gt;SerpApi’s Google Search API&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Below is a part of the landing page of &lt;a href="https://www.serpapi.com"&gt;SerpApi&lt;/a&gt;. You can observe the targeted information in red boxes:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--LKVi7NhE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/251824347-e64bd564-460e-4322-a470-70682014db7a.png" class="article-body-image-wrapper"&gt;&lt;img alt="the footer at serpapi" src="https://res.cloudinary.com/practicaldev/image/fetch/s--LKVi7NhE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/251824347-e64bd564-460e-4322-a470-70682014db7a.png" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here's the end result:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Website&lt;/th&gt;
&lt;th&gt;Information&lt;/th&gt;
&lt;th&gt;Type of Information&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;serpapi.com&lt;/td&gt;
&lt;td&gt;&lt;a href="mailto:contact@serpapi.com"&gt;contact@serpapi.com&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Email&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;serpapi.com&lt;/td&gt;
&lt;td&gt;serpapicom&lt;/td&gt;
&lt;td&gt;Instagram&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;serpapi.com&lt;/td&gt;
&lt;td&gt;serpapicom&lt;/td&gt;
&lt;td&gt;Facebook&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;serpapi.com&lt;/td&gt;
&lt;td&gt;serp_api&lt;/td&gt;
&lt;td&gt;Twitter&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;serpapi.com&lt;/td&gt;
&lt;td&gt;null&lt;/td&gt;
&lt;td&gt;Tiktok&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;serpapi.com&lt;/td&gt;
&lt;td&gt;channel/UCUgIHlYBOD3yA3yDIRhg_mg&lt;/td&gt;
&lt;td&gt;Youtube&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;serpapi.com&lt;/td&gt;
&lt;td&gt;serpapi&lt;/td&gt;
&lt;td&gt;Github&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;serpapi.com&lt;/td&gt;
&lt;td&gt;serpapi&lt;/td&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;p&gt;Since &lt;a href="https://serpapi.com"&gt;SerpApi&lt;/a&gt; offers free credits that renew every month, and the user can access a list of free public proxies online, in terms of pricing, this tool is technically a free email finder tool. You may extract data from approximately 10,000 pages (100 results per search page, and up to 100 pages of results) with a free account from &lt;a href="https://serpapi.com"&gt;SerpApi&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;For collecting URLs to scrape, one of the following is required:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;SerpApi API Key: You may &lt;a href="https://serpapi.com/users/sign_up"&gt;Register to Claim Free Credits&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;List of URLs in a text document (The URLs should be Google web cache links that start with &lt;a href="https://webcache.googleusercontent.com"&gt;https://webcache.googleusercontent.com&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For scraping URLs, one of the following is required:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;List of Proxies in a text document (You may use public proxies. Only HTTP proxies are accepted.)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Rotating Proxy IP&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Installation
&lt;/h3&gt;

&lt;p&gt;You will need to install Ruby on your device. You may take a look at &lt;a href="https://www.ruby-lang.org/en/documentation/installation/"&gt;Installing Ruby Documentation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Then you can install the gem from your command line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
$ gem install clauneck

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Basic Usage
&lt;/h3&gt;

&lt;p&gt;You can use Clauneck as a command line tool or within your Ruby scripts. Here's the command line usage below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
clauneck --api_key YOUR_SERPAPI_KEY --output "results.csv" --q "site:*.ai AND inurl:/contact OR inurl:/contact-us" --proxy "proxies.txt"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The search query in the tutorial, "site:*.ai AND inurl:/contact OR inurl:/contact-us", will search for websites ending with .ai whose URLs contain the /contact or /contact-us paths. You may use more intricate search commands to target more specific groups in alignment with your needs.&lt;/p&gt;
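&lt;p&gt;To compose such queries programmatically, a small helper can join the site restriction with the URL-path operators. The helper below is purely illustrative and not part of Clauneck:&lt;/p&gt;

```ruby
# Hypothetical helper: build a Google search query from a TLD restriction
# and a list of URL-path operators.
def contact_query(tld, paths)
  "site:*.#{tld} AND " + paths.map { |path| "inurl:#{path}" }.join(" OR ")
end

contact_query("ai", ["/contact", "/contact-us"])
# => "site:*.ai AND inurl:/contact OR inurl:/contact-us"
```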

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I am grateful to the reader for their attention. I hope this blog post and the open-source email address extractor tool can bring ease to your marketing campaigns. Feel free to use it in your integrations, or plugins. If you want to know more about the tool's capabilities or would like to participate in its improvement, you may visit &lt;a href="https://github.com/serpapi/clauneck"&gt;https://github.com/serpapi/clauneck&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>marketing</category>
      <category>programming</category>
      <category>webscraping</category>
      <category>email</category>
    </item>
    <item>
      <title>Real World Example of AI Powered Parsing</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Thu, 22 Jun 2023 18:41:09 +0000</pubDate>
      <link>https://dev.to/serpapi/real-world-example-of-ai-powered-parsing-212</link>
      <guid>https://dev.to/serpapi/real-world-example-of-ai-powered-parsing-212</guid>
      <description>&lt;p&gt;At SerpApi, we're constantly exploring innovative advancements to enhance our services and provide our users with the most cutting-edge solutions. That's why we're excited to introduce a potential new AI feature called AI Powered Parsing, designed to revolutionize how we extract valuable insights from Real Time Search Engine Results using AI Tools.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Disclaimer&lt;/p&gt;

&lt;p&gt;The inclusion of this AI feature in SerpApi's offerings is contingent upon official confirmation. The plans outlined here are based on the assumption of approval.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;For starters, we plan to utilize AI Powered Parsing for Local Results in a Google Search served by &lt;a href="https://serpapi.com/local-pack"&gt;SerpApi's Google Local Pack API&lt;/a&gt;, and &lt;a href="https://serpapi.com/google-local-api"&gt;SerpApi's Google Local API&lt;/a&gt;. AI Powered Parsers take advantage of an open-source model called &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;bert-base-local-results&lt;/a&gt;, and an open-source ruby gem called &lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;google-local-results-ai-parser&lt;/a&gt;. We believe in transparency, which is why both the model and the gem composing the core of this AI technology are publicly available for anyone to examine.&lt;/p&gt;

&lt;p&gt;To be at the forefront of gathering data from Search Engine Results Pages, &lt;a href="https://serpapi.com/users/sign_up"&gt;you may register to claim free credits&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Open Sourced Materials in this Blog Post:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;Google Local Results AI Parser&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://github.com/serpapi/google-local-results-ai-server"&gt;Google Local Results AI Server&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;BERT-Based Classification Model for Google Local Listings&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What is AI Powered Parsing?
&lt;/h3&gt;

&lt;p&gt;AI-powered parsing for HTML refers to the use of artificial intelligence techniques to extract and understand the content and structure of HTML documents. HTML (Hypertext Markup Language) is the standard language for creating web pages, and parsing refers to the process of analyzing the HTML code to extract meaningful information from it.&lt;/p&gt;

&lt;p&gt;Traditionally, HTML parsing has been performed using rule-based algorithms or template-based approaches, where specific patterns or rules are defined to identify and extract desired elements from the HTML. However, these methods can be limited in their ability to handle complex or inconsistent HTML structures: they rely on predefined rules that may not adapt well to variations in coding styles or new web page designs, which can disrupt the customer experience.&lt;/p&gt;

&lt;p&gt;AI-powered parsing offers several advantages over traditional approaches. It can handle a wider range of HTML structures, adapt to different coding styles and variations, and improve accuracy and robustness. Furthermore, AI models can learn from large datasets, so their parsing capabilities improve continuously as they are exposed to more diverse HTML documents.&lt;/p&gt;
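&lt;p&gt;As a toy illustration of why fixed rules break, consider a rule-based extractor pinned to one exact text layout (the layouts below are hypothetical examples, not actual Google markup):&lt;/p&gt;

```ruby
# A rule-based extractor that only understands the layout "4.0(418)".
def extract_rating_and_reviews(text)
  m = text.match(/\A(\d\.\d)\((\d+)\)/)
  return nil unless m
  { rating: m[1], reviews: m[2] }
end

extract_rating_and_reviews('4.0(418)')
# => { rating: "4.0", reviews: "418" }

# After a layout change, the same rule silently returns nothing:
extract_rating_and_reviews('4.0 stars - 418 reviews')
# => nil
```

&lt;p&gt;A classification model, by contrast, labels "4.0" as a rating based on its meaning rather than its exact position in the markup, so it keeps working across such variations.&lt;/p&gt;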

&lt;h3&gt;
  
  
  How is it useful to the user?
&lt;/h3&gt;

&lt;p&gt;One of the key advantages of AI-Powered Parsing is the ability to trade a little extra processing time for greater precision in extracting data from search results. This means you can obtain more accurate and reliable information, resulting in improved decision-making and insights.&lt;/p&gt;

&lt;p&gt;We understand that search engines frequently evolve, and keeping up with these changes can be challenging. That's why we continuously update our standard parsers to ensure they remain compatible. However, in the rare event that a deprecation goes unnoticed, our cutting-edge AI Powered Parsers serve as a reliable backup solution, ensuring that you're always covered and can access the data you need.&lt;/p&gt;

&lt;p&gt;Accessing this game-changing feature is as simple as including a single parameter in your API requests. We've designed it to be user-friendly and hassle-free, so you can start benefiting from AI-Powered Parsing without any complex setup or configuration.&lt;/p&gt;
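&lt;p&gt;As a sketch, such a request is an ordinary SerpApi call with one extra parameter. The ai_parser parameter is the proposed one described in this post, and search.json is SerpApi's standard endpoint:&lt;/p&gt;

```ruby
require 'uri'

# Build a SerpApi request URI that includes the proposed ai_parser parameter.
def serpapi_uri(query:, api_key:, ai_parser: true)
  params = { q: query, api_key: api_key, ai_parser: ai_parser }
  URI::HTTPS.build(host: 'serpapi.com', path: '/search.json',
                   query: URI.encode_www_form(params))
end

uri = serpapi_uri(query: 'Coffee', api_key: 'YOUR_SERPAPI_KEY')
# Net::HTTP.get(uri) would then fetch the JSON results.
```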

&lt;p&gt;At SerpApi, we value our users' needs and strive to provide the most reliable and up-to-date solutions to improve user experience. While we continually update our parsers, we understand the importance of offering a backup solution using the power of AI systems for those rare cases when changes occur unexpectedly.&lt;/p&gt;

&lt;h3&gt;
  
  
  How does it work?
&lt;/h3&gt;

&lt;p&gt;Below, you'll find a general flowchart illustrating the basic functioning. Further sections will provide more specific information.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ny0_a-oE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248002212-2ef526a8-2790-4a84-9f29-229223238d92.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ny0_a-oE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248002212-2ef526a8-2790-4a84-9f29-229223238d92.jpg" alt="flowchart-of-ai-powered-parser" width="621" height="991"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  BERT-Based Classification Model for Google Local Listings
&lt;/h4&gt;

&lt;p&gt;We are excited to present the &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;BERT-Based Classification Model for Google Local Listings&lt;/a&gt;, an open-source model available on Hugging Face. This powerful model, developed using the Hugging Face library and a dataset gathered by our own APIs, is a key component of our AI-Powered Parsing feature.&lt;/p&gt;

&lt;p&gt;The BERT-based model excels at differentiating difficult semantic similarities, handling partial texts that can be combined later, and coping effortlessly with vocabulary from diverse areas. It also returns an Assurance Score for after-correction and demonstrates strong performance against grammatical mistakes.&lt;/p&gt;

&lt;p&gt;Additionally, we want to emphasize that the model's flaws and limitations are systematically documented in the model card. While the model offers robust performance, it does have certain constraints, and potential misclassifications in specific scenarios. These limitations are diligently addressed to provide users with a transparent understanding of the model's capabilities.&lt;/p&gt;

&lt;p&gt;You can play with the model using &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;Free Inference API at the Repository&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Q38Ov_vP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248012148-4d29a0f5-3d90-4694-b341-3c21acc197c5.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Q38Ov_vP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248012148-4d29a0f5-3d90-4694-b341-3c21acc197c5.jpg" alt="bert-base-local-results-inference-api" width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Google Local Results AI Server
&lt;/h4&gt;

&lt;p&gt;We're thrilled to announce that we have also open-sourced a simple server code for deploying the &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;bert-base-local-results model&lt;/a&gt;. The server code can be found in &lt;a href="https://github.com/serpapi/google-local-results-ai-server"&gt;SerpApi's Github Repository&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The repository contains the code for a server that mimics the Inference API endpoints provided by Hugging Face. The server offers a straightforward interface to perform text classification using BERT-based models. It has been specifically designed by SerpApi to cater to heavy-load prototyping and production tasks, enabling the implementation of the &lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;google-local-results-ai-parser&lt;/a&gt; gem, which utilizes the &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;serpapi/bert-base-local-results&lt;/a&gt; model.&lt;/p&gt;

&lt;p&gt;By open-sourcing this server code, we aim to provide developers with a convenient and efficient way to deploy the &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;bert-base-local-results&lt;/a&gt; model in their own environments. It offers flexibility and control, allowing you to customize the deployment according to your specific requirements.&lt;/p&gt;

&lt;p&gt;Feel free to explore the repository, leverage the server code, and adapt it to suit your needs. We're excited to see how you integrate the bert-base-local-results model into your projects using this server code.&lt;/p&gt;

&lt;h4&gt;
  
  
  Google Local Results AI Parser
&lt;/h4&gt;

&lt;p&gt;The &lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;google-local-results-ai-parser&lt;/a&gt; is a gem developed by SerpApi that allows you to extract and categorize structured data from Google Local Search Results using natural language techniques. It uses the &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;serpapi/bert-base-local-results&lt;/a&gt; transformer model to parse Google Local Results Listings in English and extract important information, categorizing it into different sections such as ratings, reviews, descriptions, etc.&lt;/p&gt;

&lt;p&gt;You may visit &lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;SerpApi's Github Repository&lt;/a&gt; for an in-depth look at its capabilities. I'll provide a basic explanation of the usage and functions of this next-generation parser.&lt;/p&gt;

&lt;p&gt;Let's say you want to parse the following Google Local Result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ytcHPcdK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/205724193-917feb92-3054-436d-93e9-552f8ec7ca9b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ytcHPcdK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/205724193-917feb92-3054-436d-93e9-552f8ec7ca9b.png" alt="Google Local Result" width="800" height="144"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can use the following simple code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'google-local-results-ai-parser'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'nokolexbor'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'http'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'parallel'&lt;/span&gt;

&lt;span class="n"&gt;html&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"HTML of an Individual Local Result"&lt;/span&gt;
&lt;span class="n"&gt;bearer_token&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Huggingface Token or Private Server Key'&lt;/span&gt;
&lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;GoogleLocalResultsAiParser&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;html: &lt;/span&gt;&lt;span class="n"&gt;html&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;bearer_token: &lt;/span&gt;&lt;span class="n"&gt;bearer_token&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output will be:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;
&lt;span class="p"&gt;{&lt;/span&gt;

  &lt;span class="s2"&gt;"address"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"Nicosia"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

  &lt;span class="s2"&gt;"description"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"Iconic Seattle-based coffeehouse chain"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

  &lt;span class="s2"&gt;"price"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"€€"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

  &lt;span class="s2"&gt;"reviews"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"418"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

  &lt;span class="s2"&gt;"rating"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"4.0"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

  &lt;span class="s2"&gt;"type"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"Coffee shop"&lt;/span&gt;

&lt;span class="p"&gt;}&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also utilize it for multiple results:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'google-local-results-ai-parser'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'nokolexbor'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'http'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'parallel'&lt;/span&gt;

&lt;span class="n"&gt;html_parts&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;

  &lt;span class="s2"&gt;"HTML of an Individual Local Result"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

  &lt;span class="s2"&gt;"HTML of another Individual Local Result"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

  &lt;span class="o"&gt;...&lt;/span&gt;

&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="n"&gt;bearer_token&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Huggingface Token or Private Server Key'&lt;/span&gt;
&lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;GoogleLocalResultsAiParser&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse_multiple&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;html_parts: &lt;/span&gt;&lt;span class="n"&gt;html_parts&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;bearer_token: &lt;/span&gt;&lt;span class="n"&gt;bearer_token&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this case, the output will be an array of hashes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;
&lt;span class="p"&gt;[&lt;/span&gt;

  &lt;span class="p"&gt;{&lt;/span&gt;

    &lt;span class="s2"&gt;"address"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"Nicosia"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="s2"&gt;"description"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"Iconic Seattle-based coffeehouse chain"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="s2"&gt;"price"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"€€"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="s2"&gt;"reviews"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"418"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="s2"&gt;"rating"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"4.0"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="s2"&gt;"type"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"Coffee shop"&lt;/span&gt;

  &lt;span class="p"&gt;},&lt;/span&gt;

  &lt;span class="o"&gt;...&lt;/span&gt;

&lt;span class="p"&gt;]&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The AI-Powered Parser achieves this by calling the model and then applying a rule-based after-correction to compensate for the model's known flaws, providing maximum precision. There are also other advanced parameters to cope with unexpected cases. You may refer to the &lt;a href="https://github.com/serpapi/google-local-results-ai-parser#advanced-usage"&gt;Documentation&lt;/a&gt; for these capabilities.&lt;/p&gt;
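&lt;p&gt;As a simplified sketch of what such an after-correction step can look like, here is one rule that strips a label residue fused to the front of an address. The "No Reviews" pattern is illustrative; the gem's actual correction rules are more extensive:&lt;/p&gt;

```ruby
# Strip a known label residue ("No Reviews") that can end up
# fused to the front of the address field.
def correct_address(parsed)
  addr = parsed['address']
  return parsed unless addr

  residue = 'No Reviews'
  if addr.start_with?(residue)
    parsed = parsed.merge('address' => addr.delete_prefix(residue))
  end
  parsed
end

correct_address('address' => 'No ReviewsBarnoldswick', 'reviews' => 0)
# => { "address" => "Barnoldswick", "reviews" => 0 }
```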

&lt;h4&gt;
  
  
  SerpApi's Potential New Parameter
&lt;/h4&gt;

&lt;p&gt;The AI Powered Parser Search operates by simply incorporating a single parameter named &lt;code&gt;ai_parser&lt;/code&gt;. By integrating this AI-powered solution, the search results are not only enhanced but also become more precise. While the conventional parsing methods will still function as usual, the AI Powered Parser supercharges the parsing process, effectively refining and streamlining the precision of the results.&lt;/p&gt;

&lt;p&gt;We favor solutions that are "no-brainers" - easy to implement, efficient, and notably improving outcomes. This AI parser fits right into that category, offering an innovative and straightforward way to enhance data extraction from Google Local Search Results or any other part that would be supported.&lt;/p&gt;

&lt;p&gt;Imagine the following search:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/playground?q=Coffee"&gt;https://serpapi.com/playground?q=Coffee&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This searches Google for the query Coffee. Imagine the &lt;a href="https://serpapi.com/local-pack"&gt;Google Local Pack&lt;/a&gt; parser breaks, returning the address in the price field, the price in the reviews field, and so on.&lt;/p&gt;

&lt;p&gt;You can open an issue at &lt;a href="https://github.com/serpapi/public-roadmap/"&gt;SerpApi's Public Roadmap&lt;/a&gt; or contact our Customer Success Engineers to notify us about the deprecation. While the issue is being handled by our engineers, you may use the following query:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/playground?q=Coffee&amp;amp;ai_parser=true"&gt;https://serpapi.com/playground?q=Coffee&amp;amp;ai_parser=true&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This searches for the query Coffee with our AI Powered Parsers, returning precise results until the issue is fixed. By trading a little extra time for superior precision, you can keep your production running as if the issue never happened. We aim to secure your data supply chain from us as much as we can.&lt;/p&gt;

&lt;p&gt;Here's an example image showcasing the use of the parameter in the playground:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iTeoGR_K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248009458-c0a36fd3-86d2-456c-a75c-d971d722631c.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iTeoGR_K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248009458-c0a36fd3-86d2-456c-a75c-d971d722631c.jpg" alt="ai-parser-parameter-in-playground" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Unit Tests for Maintenance
&lt;/h3&gt;

&lt;p&gt;At SerpApi, we value test-driven development. In order to ensure the quality and integrity of our AI Powered Parser Search feature, we have undertaken comprehensive unit testing.&lt;/p&gt;

&lt;p&gt;I sifted through all the previous unit test examples available in the stack and enriched the test cases with some recent examples as well. To facilitate the unit testing process, I crafted a Rake task designed to automatically save resulting JSONs as unit test examples in their respective parts.&lt;/p&gt;

&lt;p&gt;As a part of the validation, I manually examined these JSONs for any abnormalities or inconsistencies. Once this manual check was completed, I employed Large Language Model (LLM) prompts for a more in-depth verification.&lt;/p&gt;

&lt;p&gt;Here's an example context prompt:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
Do you think there are any keys that may contain falsely parsed information in the following array at the following possible keys:

"rating", "type", "years_in_business", "service_options", "hours", "reviews_original", "reviews", "address", "description", "price", "phone"

[

  {

    "position": 1,

    "title": "xxx xxxx Ltd",

    "place_id": "9149508700925112934",

    "place_id_search": "http://localhost:3000/search.json?ai_parser=true&amp;amp;device=desktop&amp;amp;engine=google_local&amp;amp;gl=uk&amp;amp;google_domain=google.com&amp;amp;hl=en&amp;amp;location=United+Kingdom&amp;amp;ludocid=9149508700925112934&amp;amp;q=xxx+xxxxx.Ltd",

    "lsig": "AB86z5W38iEx_9mjnRFzmp68DR6h",

    "gps_coordinates": {

      "latitude": 53.9177131,

      "longitude": -2.1785891

    },

    "links": {

      "website": "http://xxx-xxxxxxx.com/"

    },

    "reviews_original": "No reviews",

    "reviews": 0,

    "address": "No ReviewsBarnoldswick"

  }
]

The Answer: "address" key possibly contains a residue. The correct value is probably "Barnoldswick". The residue is probably "No Reviews"

Wait for my instructions.

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Followed up by the query on the actual JSON:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
Do you think there are any keys that may contain falsely parsed information in the following array at the following possible keys:

"rating", "type", "years_in_business", "service_options", "hours", "reviews_original", "reviews", "address", "description", "price", "phone"

[JSON I want to check out for oddities]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These models, given their generative capabilities, provide helpful feedback about the accuracy of parsed data in keys such as "rating", "type", "years_in_business", "service_options", "hours", "reviews_original", "reviews", "address", "description", "price", "phone", etc.&lt;/p&gt;

&lt;p&gt;Given their generative nature, language models like OpenAI’s GPT, chatbots like ChatGPT, or Google’s Bard (still in beta at the time of writing) might produce insufficient information in their answers, necessitating multiple follow-up prompts or some extra interpretation of their output. Even so, this proved to be an efficient alternative to manual inspection, facilitating the process of identifying potential errors or inconsistencies in our JSONs.&lt;/p&gt;

&lt;p&gt;After this preliminary check with the JSONs from various older examples that were already available in the stack, we leveraged these in the unit tests. Anytime there was a modification in the responsible parts, these vetted examples proved extremely helpful for comparison.&lt;/p&gt;

&lt;p&gt;The overarching objective of these unit tests, as with any other unit tests, was to scrutinize different behaviors in different fixes. The fixes could be localized within the &lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;google-local-results-ai-parser&lt;/a&gt; gem or be more broadly based within the stack. By implementing this rigorous testing procedure, we ensure our feature's robustness and reliability, enabling it to consistently deliver enhanced and precise results.&lt;/p&gt;
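&lt;p&gt;As a sketch of this style of regression test (Minitest with a trivial stand-in fixture; the real suite runs against vetted JSONs inside SerpApi's internal stack, and the names below are illustrative):&lt;/p&gt;

```ruby
require 'minitest/autorun'
require 'json'

# A vetted fixture: in the real suite this is a saved, manually
# inspected parser response.
VETTED = JSON.parse('{"rating":"4.0","reviews":"418","type":"Coffee shop"}')

class AiParserRegressionTest < Minitest::Test
  # Stand-in for running the AI-powered parser on stored HTML.
  def fresh_parse
    { 'rating' => '4.0', 'reviews' => '418', 'type' => 'Coffee shop' }
  end

  # Any change in the responsible parts must still reproduce the
  # vetted example exactly.
  def test_matches_vetted_example
    assert_equal VETTED, fresh_parse
  end
end
```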

&lt;h3&gt;
  
  
  Dedicated Pages and Documentation Examples
&lt;/h3&gt;

&lt;p&gt;At SerpApi, we firmly believe in the importance of clear and effective communication with our users. We recognize that the ease of understanding our features and APIs is fundamental to their optimal utilization. Therefore, we've dedicated ourselves to improving our user-facing documentation to facilitate a seamless integration experience.&lt;/p&gt;

&lt;p&gt;We have designed two new dedicated pages to serve this purpose. The first page provides a comprehensive overview of our new feature – the AI Powered Parser Search. It outlines the feature's capabilities, details its operation, and presents potential use cases. We believe this will allow our users to grasp the feature's value and envision how it can best fit into their unique workflows. Here are some visual examples:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Disclaimer&lt;/p&gt;

&lt;p&gt;These pages are a work in progress. The final version of these pages might differ from what is showcased here.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--owpPEGZm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010100-81a94d0f-b962-44af-8407-bb746d475d49.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--owpPEGZm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010100-81a94d0f-b962-44af-8407-bb746d475d49.jpg" alt="ai-powered-parser-feature-page-1" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NGDv8_cx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010444-b3e17ba7-ebaf-454b-a3b4-83dd32b5a88d.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NGDv8_cx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010444-b3e17ba7-ebaf-454b-a3b4-83dd32b5a88d.jpg" alt="ai-powered-parser-feature-page-2" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ebdqsLo---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010488-db961876-8f0d-47b9-ac8b-581cb085ce7d.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ebdqsLo---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010488-db961876-8f0d-47b9-ac8b-581cb085ce7d.jpg" alt="ai-powered-parser-feature-page-3" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The second page is a complete table of all AI Powered Parsers. It serves as a handy reference for users to understand the different parsers available and the unique functionalities they offer. With this table, users can quickly locate the most suitable AI Powered Parser for their specific needs and applications.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1CKCUygd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010526-db47f30e-e358-418b-855f-6828963af3b7.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1CKCUygd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010526-db47f30e-e358-418b-855f-6828963af3b7.jpg" alt="ai-powered-parser-table" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Furthermore, we have augmented our documentation with new examples in two of our key APIs: &lt;a href="https://serpapi.com/local-pack"&gt;SerpApi's Google Local Pack Results API&lt;/a&gt;, and &lt;a href="https://serpapi.com/google-local-api"&gt;SerpApi's Google Local Results API&lt;/a&gt; to help our users understand how to leverage these APIs effectively.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rXomLs2q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010751-a6413e53-307b-4321-9f02-f9aa0c9797e6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rXomLs2q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/248010751-a6413e53-307b-4321-9f02-f9aa0c9797e6.jpg" alt="ai-powered-parser-documentation" width="800" height="507"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In conclusion, we continually strive to keep our documentation up-to-date and user-friendly, knowing that it empowers our users to leverage our offerings to their full potential. Whether you are new to SerpApi or an experienced user, these resources will provide valuable guidance in your journey of leveraging AI Powered Parser Search or any of our extensive array of APIs in general.&lt;/p&gt;

&lt;h2&gt;
  
  
  Brief Overview of the Previous Attempt
&lt;/h2&gt;

&lt;p&gt;In my previous attempt to tackle this problem, I tried to develop a local model in Rails for text classification, following a method similar to the one used here. However, I encountered some unexpected challenges along the way. Here are the key lessons I learned from that experience.&lt;/p&gt;

&lt;p&gt;You can find some of the earlier blog posts in this series on the problem at:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://serpapi.com/blog/real-world-example-of-machine-learning-on-rails/"&gt;Real World Example of Machine Learning on Rails&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://serpapi.com/blog/how-ml-hybrid-parser-beats-tradition/"&gt;How ML Hybrid Parser Beats Traditional Parser&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Key Lessons from Past Mistakes
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Externalize the Model
&lt;/h3&gt;

&lt;p&gt;One of the main problems with my previous approach was training the model locally, which introduced a host of optimization problems to resolve. At first sight, containing a situation within the stack might seem like an excellent way to keep the problem at bay, promising better benchmarking and debugging capabilities. In practice, it turned out to be the wrong call.&lt;/p&gt;

&lt;p&gt;Because the model was local, I had to strive for a more time- and performance-efficient model. This limitation alone was enough to complicate the model's development. For example, since it used an RNN architecture, it became slower as I increased the dataset size; and when I tried to cover the cases in the expanded dataset with less compute, I lost precision. It was a vicious cycle, not to mention the lack of documentation on machine learning in Rails.&lt;/p&gt;

&lt;p&gt;In this solution, I externalized the model to a server, giving me more room to play with different models at different capacities. This was an important detail that turned this approach into a working solution.&lt;/p&gt;

&lt;h3&gt;
  
  
  Don't Reinvent the Wheel
&lt;/h3&gt;

&lt;p&gt;In the previous approach, I tried to train the model from scratch, playing with different structures, different hyperparameters, different datasets, and so on. On the surface, this seemed to give me more tools to work with.&lt;/p&gt;

&lt;p&gt;In reality, it gave me more ways to complicate the situation. I had to run many grid-search-style experiments with different structures and hyperparameters, which might or might not bear good results. I was often lost in details and limited by my computing power when trying to make subtle observations and draw conclusions about model types.&lt;/p&gt;

&lt;p&gt;In this approach, I used a transformer model called &lt;a href="https://huggingface.co/bert-base-uncased"&gt;bert-base-uncased&lt;/a&gt;. It was already trained on a relatively large English corpus, and all I had to do was fine-tune it to our needs. This significantly simplified training and also boosted precision. The resulting model, &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;bert-base-local-results&lt;/a&gt;, was more capable, easier to implement, and easier to understand.&lt;/p&gt;
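&lt;p&gt;To give a feel for how light the integration becomes, here is a minimal Python sketch of querying the fine-tuned model through Hugging Face's &lt;code&gt;transformers&lt;/code&gt; pipeline. The helper itself is generic; the commented usage assumes the &lt;code&gt;transformers&lt;/code&gt; package is installed and downloads the model on first use, and the example inputs are hypothetical scraped lines rather than entries from the actual dataset.&lt;/p&gt;

```python
def classify_lines(lines, classify):
    """Map each scraped text line to the label the classifier predicts.

    `classify` is any callable with the Hugging Face text-classification
    pipeline interface: a string in, a list of {"label", "score"} dicts out.
    """
    return {line: classify(line)[0]["label"] for line in lines}

# With the real model (requires `pip install transformers torch`;
# downloads the weights on first use):
#   from transformers import pipeline
#   classifier = pipeline("text-classification",
#                         model="serpapi/bert-base-local-results")
#   classify_lines(["5.0", "Coffee shop", "Open 24 hours"], classifier)
```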

&lt;p&gt;In conclusion, building on solutions that already handle part of your problem well, and that are easily distributable and applicable, is the better choice in most cases.&lt;/p&gt;

&lt;h3&gt;
  
  
  Anticipate Conflicts with Other Parts
&lt;/h3&gt;

&lt;p&gt;Another problem I faced with the previous solution was the amount of parallel progress happening during the reviewing process. These moving parts ended up creating more trouble than they solved.&lt;/p&gt;

&lt;p&gt;For example, I tried to apply the Rails principle of DRY (Don't Repeat Yourself) and extracted the parts shared with the old parsers into a separate function used by both. By the time I revisited the previous solution, even the location of the files had changed, which blocked any attempt to show how it was performing.&lt;/p&gt;

&lt;p&gt;In this solution, I decoupled the necessary parts, even if that means repeating the same code in two files. This way, I can ensure that changes made elsewhere won't affect the reviewing process or the integration of the code.&lt;/p&gt;

&lt;p&gt;In conclusion, sometimes repeating yourself is actually good for maintenance, and it keeps maintenance problems at bay. Knowing that an AI Parser problem is most likely caused by one definite part of the code is an absolute time saver.&lt;/p&gt;

&lt;h3&gt;
  
  
  Improve the Product Instead of Replacing It
&lt;/h3&gt;

&lt;p&gt;I made the mistake of thinking I could replace the whole parsing process with an AI solution. At first sight, it sounds great. But the unknown is always fearsome: it meant more reviewing time, more details to be careful about in review, and concepts that were harder to explain because they were replacing a known method.&lt;/p&gt;

&lt;p&gt;In this solution, I used the AI parser as a parallel backup. Being able to switch from a traditional parser to an AI Powered Parser meant less complexity with better control. I safeguarded the code so that if any unknown problem occurs, the AI Powered Parsers are bypassed and the API resorts to the traditional parsers.&lt;/p&gt;

&lt;p&gt;This way, I was able to provide the same feature without giving up on the old method. Instead of having to justify how much performance or precision the feature added, I was able to serve it as a definite bonus.&lt;/p&gt;
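&lt;p&gt;The safeguard described above boils down to a simple pattern. Here is a hedged Python sketch of it; the actual implementation lives in our Rails stack, and the parser callables here are stand-ins:&lt;/p&gt;

```python
def parse_with_fallback(html, ai_parse, traditional_parse):
    """Prefer the AI-powered parser, but bypass it on any error or
    empty result so the API always falls back to the known method."""
    try:
        result = ai_parse(html)
        if result:  # treat an empty result as a failure, too
            return result
    except Exception:
        pass  # in production, this is where the failure would be logged
    return traditional_parse(html)
```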

&lt;h3&gt;
  
  
  Refrain from Automating Every Aspect
&lt;/h3&gt;

&lt;p&gt;In the previous solution, I attempted to create a completely automatic pipeline from the ground up: gathering data with SerpApi from within the Rails stack, feeding it directly into the training dataset loader, running the training process, creating a table for training diagnostics such as loss values and success rates, and so on.&lt;/p&gt;

&lt;p&gt;At first sight, it might seem like an all-encompassing solution. But it brought a huge amount of complexity for anyone not familiar with the process, and I had to take unconventional approaches in many parts.&lt;/p&gt;

&lt;p&gt;To exemplify, I didn't want us to install the Torch library on every server. So I devised a system in which you could develop a model in Torch in PT format and then convert it to an ONNX file. Rails had some lightweight options for running ONNX models. But since I wasn't well versed in writing the PT-to-ONNX conversion code, I had to resort to a Python script within the Rails stack.&lt;/p&gt;
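&lt;p&gt;For the curious, the PT-to-ONNX conversion step can be sketched in a few lines of Python. The checkpoint name and the dummy input shape below are placeholders rather than the actual values from the project:&lt;/p&gt;

```python
def onnx_path(pt_path):
    """Derive the .onnx output filename from a .pt checkpoint path."""
    if not pt_path.endswith(".pt"):
        raise ValueError("expected a .pt checkpoint path")
    return pt_path[:-3] + ".onnx"

# The conversion itself (requires `pip install torch`):
#   import torch
#   model = torch.load("classifier.pt")
#   model.eval()
#   dummy_input = torch.zeros(1, 128, dtype=torch.long)
#   torch.onnx.export(model, dummy_input, onnx_path("classifier.pt"))
```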

&lt;p&gt;In this solution, I only attempted to fully automate the parsing process, with the &lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;google-local-results-ai-parser&lt;/a&gt; gem.&lt;/p&gt;

&lt;p&gt;I have also offered a solution to automate serving the model with &lt;a href="https://github.com/serpapi/google-local-results-ai-server"&gt;google-local-results-ai-server&lt;/a&gt;, but this is optional. People can still deploy their own solutions on Microsoft Azure, Amazon SageMaker, or any newer provider they think is better for them.&lt;/p&gt;

&lt;p&gt;For training the model, people are free to choose which dataset to use or which framework to train if they wish to improve or replicate &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;bert-base-local-results&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The essential takeaway is to give people more flexibility in how they want to resolve the problem while providing at least one working path. Automating every aspect with limited built-in options will likely cause confusion in the long run.&lt;/p&gt;

&lt;h3&gt;
  
  
  Handle Flaws of the Model Systematically
&lt;/h3&gt;

&lt;p&gt;In my previous approach, because of other complexities, I wasn't able to do a full systematic breakdown of flaws. I only provided ad-hoc fixes for the flaws I found. Those fixes created secondary flaws in the results, and things got out of control.&lt;/p&gt;

&lt;p&gt;In addition, I employed some hardcoded regexes for after-corrections. While this improved overall precision, it blocked me from implementing better solutions that would cover a wider range of flaws.&lt;/p&gt;

&lt;p&gt;I also didn't separate the information produced by the model from the after-corrections. This made it harder to tell whether a flaw came from the model's behavior or from the after-corrections.&lt;/p&gt;

&lt;p&gt;In this solution, I made a fully systematic breakdown of the model's known flaws. And instead of using dense regexes, I handled after-corrections mostly based on the position of the elements in the HTML.&lt;/p&gt;

&lt;p&gt;Furthermore, since the model was hosted on a separate server, I could easily check small entries and the classifications the model gave them. Even using Hugging Face's free Inference API on the &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;model page&lt;/a&gt; was extremely helpful in debugging different flaws.&lt;/p&gt;

&lt;p&gt;Ensuring that the solution could be easily transferred to other languages and other search results was a crucial part of this attempt. Until we create a new model for a new language or a new part, we cannot know for sure whether it was effective. However, a few details here are likely to improve the transferability of the solution.&lt;/p&gt;

&lt;h3&gt;
  
  
  Compartmentalize Different Parts and Open Source Most of Them
&lt;/h3&gt;

&lt;p&gt;This is closely related to refraining from automating every aspect, but it is a subtly different topic. In my previous attempt, I tried to solve the problem in one place because I thought it would be easier to employ.&lt;/p&gt;

&lt;p&gt;In this solution, I have broken it into four parts, as briefly mentioned above:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The code to handle the gem, and serve enhanced results in Rails&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The AI Powered Parser Gem at &lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;google-local-results-ai-parser&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The server solution that serves the model, for which I have provided an alternative at &lt;a href="https://github.com/serpapi/google-local-results-ai-server"&gt;google-local-results-ai-server&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Hosted Model &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;bert-base-local-results&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I could've easily gone for a server solution that also handles the parsing remotely, or used a custom hosting solution to combine multiple parts, and thereby had fewer steps to achieve the same result. But that approach would ultimately fail in the long run.&lt;/p&gt;

&lt;p&gt;To exemplify this with a relatively similar experience from the old project: I wasn't able to open-source most of the work because it was intended for our codebase, and I avoided the risk of sharing bits of code to explain the solution. This led me to write more blog posts to explain the idea.&lt;/p&gt;

&lt;p&gt;Although I am grateful to all the readers of my previous work, this situation caused a secondary problem: balancing the effort of propagating the solution against doing the actual work. Many times I had to rephrase myself in other blog posts, which kept me from developing.&lt;/p&gt;

&lt;p&gt;In this approach, the different components are open-sourced, and only a small amount of code is intended for our stack. This way, I can express the intricate details of how to use &lt;a href="https://github.com/serpapi/google-local-results-ai-parser"&gt;google-local-results-ai-parser&lt;/a&gt; in a more advanced way inside its documentation, document the detailed flaws of &lt;a href="https://huggingface.co/serpapi/bert-base-local-results"&gt;bert-base-local-results&lt;/a&gt; in the model card, and so on. This still creates a bigger corpus for the audience, but it also helps me keep the risks at bay while retaining better control over the different parts.&lt;/p&gt;

&lt;p&gt;Also, in the event that this work is not integrated, most of the materials will still be out there for public use, rather than bits and pieces that interested people must compile themselves.&lt;/p&gt;

&lt;p&gt;In conclusion, open-sourcing parts of your projects, especially the ones not certain to be accepted, is a good way to balance propagating the idea against doing the actual work. Good compartmentalization of the open-source parts also reduces the risk of exposing delicate parts of your codebase to the public.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I'm grateful for the reader's time and attention, and for any attention given to the open-sourced materials that made this work possible. I hope this solution communicates the idea of AI Powered Parsing to people who are interested, and that the key takeaways from my previous mistakes benefit them. I would be even more grateful if the open-sourced materials prove useful in other projects.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>machinelearning</category>
      <category>ruby</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Open Source Review Analyzer Chrome Extension for ChatGPT</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Fri, 28 Apr 2023 10:38:19 +0000</pubDate>
      <link>https://dev.to/serpapi/open-source-review-analyzer-chrome-extension-for-chatgpt-16he</link>
      <guid>https://dev.to/serpapi/open-source-review-analyzer-chrome-extension-for-chatgpt-16he</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Online reviews are instrumental in influencing consumer decisions. As a result, effective and efficient review analysis is essential for businesses. SerpApi has developed a practical tool that streamlines this process by harnessing the capabilities of both ChatGPT and SerpApi: &lt;a href="https://github.com/serpapi/review-analyzer"&gt;SerpApi's ChatGPT Review Analyzer Chrome Extension&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  How SerpApi's ChatGPT Review Analyzer Works
&lt;/h2&gt;

&lt;h3&gt;
  
  
  The role of ChatGPT in review analysis
&lt;/h3&gt;

&lt;p&gt;SerpApi's ChatGPT Review Analyzer employs OpenAI's ChatGPT, an advanced language model chatbot, and SerpApi's Review Results from Search Engines to process and analyze vast amounts of review data swiftly. By using artificial intelligence, natural language processing (NLP) techniques, and SERP data, the extension can extract valuable insights from reviews, generating concise summaries, sentiment analysis, and keyword extraction.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/use-cases/machine-learning-and-artificial-intelligence"&gt;https://serpapi.com/use-cases/machine-learning-and-artificial-intelligence&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://openai.com/blog/chatgpt"&gt;https://openai.com/blog/chatgpt&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  The ChatGPT Chrome Extension
&lt;/h3&gt;

&lt;p&gt;This powerful tool is designed as a Chrome extension, allowing for smooth integration with your web browser. The extension is ready to use with ChatGPT on Chrome and Brave browsers, offering valuable insights directly within your browsing experience.&lt;/p&gt;

&lt;p&gt;You need two accounts for this extension to work: a ChatGPT account and a SerpApi account. You may &lt;a href="https://serpapi.com/users/sign_up"&gt;register to claim free credits recurring each month&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;In addition to Google search results in an easy-to-understand JSON format, with no need to worry about proxies and parsers, SerpApi provides results from many other engines, such as Bing and DuckDuckGo, and shopping engines like Walmart and eBay.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/serpapi/review-analyzer"&gt;https://github.com/serpapi/review-analyzer&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/search-api"&gt;https://serpapi.com/search-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/bing-search-api"&gt;https://serpapi.com/bing-search-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/duckduckgo-search-api"&gt;https://serpapi.com/duckduckgo-search-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/walmart-search-api"&gt;https://serpapi.com/walmart-search-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/ebay-search-api"&gt;https://serpapi.com/ebay-search-api&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
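&lt;p&gt;As a hedged sketch of what fetching those JSON results looks like, the snippet below builds a request URL against SerpApi's JSON endpoint. The &lt;code&gt;data_id&lt;/code&gt; value in the usage comment is a placeholder, not a real place identifier:&lt;/p&gt;

```python
import urllib.parse

SERPAPI_ENDPOINT = "https://serpapi.com/search.json"

def build_search_url(engine, api_key, **params):
    """Assemble a SerpApi request URL for the given engine."""
    query = {"engine": engine, "api_key": api_key, **params}
    return SERPAPI_ENDPOINT + "?" + urllib.parse.urlencode(query)

# Example usage:
#   import json, urllib.request
#   url = build_search_url("google_maps_reviews", "YOUR_API_KEY", data_id="...")
#   with urllib.request.urlopen(url) as response:
#       reviews = json.load(response)
```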

&lt;h2&gt;
  
  
  Features of SerpApi's ChatGPT Review Analyzer
&lt;/h2&gt;

&lt;h3&gt;
  
  
  An example Output of the extension
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;    Quality of coffee &lt;span class="o"&gt;(&lt;/span&gt;70% positive&lt;span class="o"&gt;)&lt;/span&gt;
    Service &lt;span class="o"&gt;(&lt;/span&gt;40% positive, 30% negative&lt;span class="o"&gt;)&lt;/span&gt;
    Ambiance/decor &lt;span class="o"&gt;(&lt;/span&gt;60% positive&lt;span class="o"&gt;)&lt;/span&gt;
    Availability of snacks/food options &lt;span class="o"&gt;(&lt;/span&gt;40% positive, 20% negative&lt;span class="o"&gt;)&lt;/span&gt;
    Price &lt;span class="o"&gt;(&lt;/span&gt;20% negative&lt;span class="o"&gt;)&lt;/span&gt;
    Seating/WiFi &lt;span class="o"&gt;(&lt;/span&gt;60% positive&lt;span class="o"&gt;)&lt;/span&gt;
    Milk options &lt;span class="o"&gt;(&lt;/span&gt;30% positive, 10% negative&lt;span class="o"&gt;)&lt;/span&gt;
    Accessibility &lt;span class="o"&gt;(&lt;/span&gt;10% positive&lt;span class="o"&gt;)&lt;/span&gt;
    Cleanliness &lt;span class="o"&gt;(&lt;/span&gt;20% positive&lt;span class="o"&gt;)&lt;/span&gt;
    Location &lt;span class="o"&gt;(&lt;/span&gt;10% positive&lt;span class="o"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Real-time review analysis
&lt;/h3&gt;

&lt;p&gt;SerpApi offers a variety of engines that provide review results, and most of their templates are covered by this extension. You can get real-time review results using the vast ecosystem of SERP engines, such as SerpApi's Google Reviews Results API, Google Maps Reviews API, Walmart Product Reviews API, Apple App Store Reviews API, and many more. You can find links to some of their documentation and tutorials in the references.&lt;/p&gt;

&lt;p&gt;The Google Chrome extension examines reviews instantly, providing valuable insights with the ChatGPT response. This real-time capability enables you to make informed decisions quickly without manually sorting through numerous reviews.&lt;/p&gt;

&lt;p&gt;SerpApi also provides an add-on called Ludicrous Speed, which delivers up to 2.5x speeds. If you would like to improve the speed at which you gather your results, take a look at the &lt;a href="https://serpapi.com/ludicrous-speed"&gt;Ludicrous Speed page&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/ludicrous-speed"&gt;https://serpapi.com/ludicrous-speed&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/reviews-results"&gt;https://serpapi.com/reviews-results&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/google-maps-reviews-api"&gt;https://serpapi.com/google-maps-reviews-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/walmart-reviews-results"&gt;https://serpapi.com/walmart-reviews-results&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/walmart-product-reviews-api"&gt;https://serpapi.com/walmart-product-reviews-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/apple-reviews"&gt;https://serpapi.com/apple-reviews&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/google-play-product-reviews"&gt;https://serpapi.com/google-play-product-reviews&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Sentiment analysis
&lt;/h3&gt;

&lt;p&gt;SerpApi's ChatGPT Review Analyzer can ascertain the overall sentiment of a review, classifying it as positive, negative, or neutral. This feature allows users to gauge general opinions about a product or service without reading every review in depth.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/what-is/sentiment-analysis/"&gt;https://aws.amazon.com/what-is/sentiment-analysis/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Review summarization
&lt;/h3&gt;

&lt;p&gt;The extension employs ChatGPT to produce concise summaries of extensive reviews, emphasizing the most critical points. This feature lets users grasp the crux of a review without dedicating time to reading the entire text.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://huggingface.co/tasks/summarization"&gt;https://huggingface.co/tasks/summarization&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Keyword extraction and Topic Modeling
&lt;/h3&gt;

&lt;p&gt;The tool can identify and extract relevant keywords from reviews, offering users a snapshot of the most frequently mentioned topics. This feature assists users in recognizing patterns and trends in customer feedback and opinions, allowing them to concentrate on specific areas of interest or concern.&lt;/p&gt;

&lt;p&gt;This is also a personal area of interest for me. We have a variety of transformer models for classification with predefined labels, but not many approaches for generating those labels in the first place. In one of my previous blog posts, I covered the Latent Dirichlet Allocation (LDA) algorithm for label generation. LLMs prove useful for doing the same thing in batches: chaining batches and selecting the most common labels across the results could provide label generation. Such a function could be used to classify a larger corpus of text for a variety of purposes beyond reviews.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/Latent_Dirichlet_allocation"&gt;https://en.wikipedia.org/wiki/Latent_Dirichlet_allocation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.quora.com/What-is-the-difference-between-keyword-extraction-and-Topic-modeling-such-as-Latent-Dirichlet-Allocation"&gt;https://www.quora.com/What-is-the-difference-between-keyword-extraction-and-Topic-modeling-such-as-Latent-Dirichlet-Allocation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/topic-analysis-of-app-store-reviews-a-heatmap-approach-with-the-app-store-reviews-scraper-api/"&gt;https://serpapi.com/blog/topic-analysis-of-app-store-reviews-a-heatmap-approach-with-the-app-store-reviews-scraper-api/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
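&lt;p&gt;The chained-batches idea above can be sketched in a few lines. Assuming each LLM batch returns a list of generated labels, merging them and keeping the labels the batches agree on most often might look like this (a sketch of the idea, not the extension's actual implementation):&lt;/p&gt;

```python
from collections import Counter

def most_common_labels(batched_labels, top_k=10):
    """Merge label lists from separate LLM batches and keep the top_k
    labels that occur most often across all batches."""
    counts = Counter(label for batch in batched_labels for label in batch)
    return [label for label, _ in counts.most_common(top_k)]
```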

&lt;h2&gt;
  
  
  Benefits of SerpApi's ChatGPT Review Analyzer
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Time-saving
&lt;/h3&gt;

&lt;p&gt;By automating the review analysis process, SerpApi's ChatGPT Review Analyzer saves users considerable time. This frees businesses and individuals to focus on other essential tasks, enhancing overall productivity without worrying about how to collect their reviews, or how to get insights from them.&lt;/p&gt;

&lt;h3&gt;
  
  
  Improved decision-making
&lt;/h3&gt;

&lt;p&gt;With SerpApi's capacity to supply real-time insights, users can make better-informed decisions based on customer feedback. This can result in more effective marketing strategies, product improvements, and overall business growth.&lt;/p&gt;

&lt;h3&gt;
  
  
  Enhanced user experience
&lt;/h3&gt;

&lt;p&gt;SerpApi's ChatGPT Review Analyzer Chrome extension streamlines the review analysis process, creating a more enjoyable and efficient browsing experience. Users can quickly access valuable insights by pasting the results SerpApi has already scraped for them, without opening a new tab to scrape each review separately.&lt;/p&gt;

&lt;h3&gt;
  
  
  Competitive edge
&lt;/h3&gt;

&lt;p&gt;By leveraging ChatGPT and SerpApi, businesses can gain a competitive advantage by staying abreast of trends and comprehending customer feedback better than their competitors. This can lead to improved customer satisfaction and loyalty, as well as increased market share.&lt;/p&gt;

&lt;h2&gt;
  
  
  How do I use ChatGPT extension in Chrome?
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Installation process
&lt;/h3&gt;

&lt;p&gt;The extension isn't published on the Chrome Web Store yet, so the GitHub repository is the only place to get it for the moment. It doesn't support Firefox yet; you will need a Chrome or Brave browser to use it.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Visit the GitHub repository at &lt;a href="https://github.com/serpapi/review-analyzer"&gt;https://github.com/serpapi/review-analyzer&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Download the repository as a ZIP file and extract it, or just clone it.&lt;/li&gt;
&lt;li&gt;In your Chrome or Brave browser, open the Extensions page (chrome://extensions for Chrome Browser, brave://extensions for Brave Browser).&lt;/li&gt;
&lt;li&gt;Enable "Developer mode" in the top right corner of the page.&lt;/li&gt;
&lt;li&gt;Click "Load unpacked" and select the extracted folder from your computer.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/serpapi/review-analyzer"&gt;https://github.com/serpapi/review-analyzer&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.reddit.com/r/chrome_extensions/comments/zh4ave/how_to_manually_add_extension_to_chrome_when_i/"&gt;https://www.reddit.com/r/chrome_extensions/comments/zh4ave/how_to_manually_add_extension_to_chrome_when_i/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Usage
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Navigate to the ChatGPT webpage.&lt;/li&gt;
&lt;li&gt;Navigate to the SerpApi Playground to filter for the review results you desire.&lt;/li&gt;
&lt;li&gt;Click on the SerpApi's ChatGPT Review Analyzer icon in your browser.&lt;/li&gt;
&lt;li&gt;A popup window will open. Copy the review results you gathered from SerpApi in JSON format. (Do not try to trim the unnecessary parts.)&lt;/li&gt;
&lt;li&gt;Click on the Analyze Button.&lt;/li&gt;
&lt;li&gt;The extension will automatically process and display insights from the reviews, providing you with the 10 most common labels and their percentages based on occurrence rate. It will also indicate whether a label is positive or negative.&lt;/li&gt;
&lt;/ol&gt;
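&lt;p&gt;The percentages in step 6 can be thought of as occurrence rates over the pasted reviews. Here is a rough Python sketch of that computation; the extension itself derives these figures through the ChatGPT prompt, so this is only an illustration of the idea:&lt;/p&gt;

```python
from collections import Counter

def label_percentages(reviews_labels):
    """Percentage of reviews that mention each label at least once."""
    total = len(reviews_labels)
    counts = Counter(label for labels in reviews_labels for label in set(labels))
    return {label: round(100 * count / total) for label, count in counts.items()}
```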

&lt;p&gt;Here is an example of Google Maps Reviews Results for a Cafe in New York using SerpApi’s Google Maps Reviews API and GPT-4:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Coffee Quality &lt;span class="o"&gt;(&lt;/span&gt;70% positive&lt;span class="o"&gt;)&lt;/span&gt;: 7 out of 10 reviews mentioned the quality of the coffee as a positive aspect.
Staff Attitude &lt;span class="o"&gt;(&lt;/span&gt;30% positive, 60% negative&lt;span class="o"&gt;)&lt;/span&gt;: 6 out of 10 reviews mentioned staff attitude, with 4 of them being negative and 2 being positive.
Pastries and Baked Goods &lt;span class="o"&gt;(&lt;/span&gt;50% positive, 30% negative&lt;span class="o"&gt;)&lt;/span&gt;: 5 out of 10 reviews mentioned the pastries and baked goods, with 3 being positive and 2 being negative.
Ambiance and Decor &lt;span class="o"&gt;(&lt;/span&gt;70% positive&lt;span class="o"&gt;)&lt;/span&gt;: 7 out of 10 reviews mentioned the ambiance and decor positively.
Cleanliness &lt;span class="o"&gt;(&lt;/span&gt;50% negative&lt;span class="o"&gt;)&lt;/span&gt;: 2 out of 10 reviews mentioned cleanliness negatively.
Seating Options &lt;span class="o"&gt;(&lt;/span&gt;50% positive&lt;span class="o"&gt;)&lt;/span&gt;: 3 out of 10 reviews mentioned seating options as a positive aspect.
Service Speed &lt;span class="o"&gt;(&lt;/span&gt;50% positive&lt;span class="o"&gt;)&lt;/span&gt;: 2 out of 10 reviews mentioned the speed of service positively.
Location &lt;span class="o"&gt;(&lt;/span&gt;80% positive&lt;span class="o"&gt;)&lt;/span&gt;: 4 out of 10 reviews mentioned the location as a positive aspect.
Crockery &lt;span class="o"&gt;(&lt;/span&gt;50% negative&lt;span class="o"&gt;)&lt;/span&gt;: 1 out of 10 reviews mentioned the lack of crockery negatively.
Restroom Availability &lt;span class="o"&gt;(&lt;/span&gt;100% negative&lt;span class="o"&gt;)&lt;/span&gt;: 1 out of 10 reviews mentioned the lack of restroom negatively.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/playground"&gt;https://serpapi.com/playground&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/playground?engine=google_maps_reviews"&gt;https://serpapi.com/playground?engine=google_maps_reviews&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/google-maps-reviews-api"&gt;https://serpapi.com/google-maps-reviews-api&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Alternative Input Method
&lt;/h2&gt;

&lt;p&gt;Aside from using the extension with SerpApi's engines like the Google Maps Reviews API or Apple Reviews API, users can alternatively input reviews in an array format. This option offers flexibility in handling reviews from various sources, such as tweets, social media posts, and YouTube video comments, ensuring that valuable insights are not missed. SerpApi also serves these results through its engines, but they are not yet supported by this extension.&lt;/p&gt;

&lt;p&gt;Here is an example of reviews from different sources about SerpApi:&lt;br&gt;
Input:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="o"&gt;[&lt;/span&gt;
&lt;span class="s2"&gt;"I wanted to try out events search API and found SERP API to be more manageable and versatile with plenty of other things to integrate! Pretty straightforward integration and support are on time with less than 15 mins TAT."&lt;/span&gt;,
&lt;span class="s2"&gt;"As the CEO of ReviewsJet, I have witnessed firsthand the immense value Serp API has brought to our platform. By integrating this powerful real-time API into our review software, we have revolutionized how our users collect and analyze customer feedback from sources such as Google, Yelp, Bing, Yahoo, eBay, and many more.

Before incorporating Serp API, we faced numerous challenges in aggregating reviews from various platforms. Our users often had to spend hours manually searching and compiling data, which led to inefficiencies and inaccuracies. Serp API has completely changed the game for us by automating this process and providing a seamless, comprehensive, and up-to-date source of information.

The real-time capabilities of Serp API have greatly benefited our clients. They can now access the most relevant and current data without worrying about outdated information. This has allowed them to make timely and well-informed decisions, ultimately leading to better products and services for their customers.

At ReviewsJet, we pride ourselves on delivering an exceptional user experience, and Serp API's seamless integration within our platform has been a significant contributing factor to this. Our clients can now easily access and analyze data from multiple sources in a unified interface, resulting in a more efficient and enjoyable workflow.

Furthermore, Serp API's extensive coverage across search engines and platforms has given our users a holistic view of their customer's experiences and opinions. This comprehensive understanding has empowered them to identify areas for improvement and capitalize on their strengths, leading to happier customers and more successful businesses."&lt;/span&gt;,
&lt;span class="s2"&gt;"I used the google scholar service, it was pretty smooth and nice. I spend couple of day to figure a way to do it myself. Then I decided to use SerpApi and it took like 10 minutes to implement and run my good."&lt;/span&gt;
...
&lt;span class="o"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Ease of Use &lt;span class="o"&gt;(&lt;/span&gt;Positive&lt;span class="o"&gt;)&lt;/span&gt; - 23%
Fast and Responsive API &lt;span class="o"&gt;(&lt;/span&gt;Positive&lt;span class="o"&gt;)&lt;/span&gt; - 20%
Excellent Customer Support &lt;span class="o"&gt;(&lt;/span&gt;Positive&lt;span class="o"&gt;)&lt;/span&gt; - 18%
Comprehensive and Versatile &lt;span class="o"&gt;(&lt;/span&gt;Positive&lt;span class="o"&gt;)&lt;/span&gt; - 15%
Pricing Concerns &lt;span class="o"&gt;(&lt;/span&gt;Negative&lt;span class="o"&gt;)&lt;/span&gt; - 10%
Documentation Quality &lt;span class="o"&gt;(&lt;/span&gt;Mixed&lt;span class="o"&gt;)&lt;/span&gt; - 7% &lt;span class="o"&gt;(&lt;/span&gt;Positive: 5%, Negative: 2%&lt;span class="o"&gt;)&lt;/span&gt;
Integration Issues &lt;span class="o"&gt;(&lt;/span&gt;Negative&lt;span class="o"&gt;)&lt;/span&gt; - 3%
Wide Range of Features &lt;span class="o"&gt;(&lt;/span&gt;Positive&lt;span class="o"&gt;)&lt;/span&gt; - 2%
Real-time Results &lt;span class="o"&gt;(&lt;/span&gt;Positive&lt;span class="o"&gt;)&lt;/span&gt; - 1%
Wish &lt;span class="k"&gt;for &lt;/span&gt;More Free Credits &lt;span class="o"&gt;(&lt;/span&gt;Negative&lt;span class="o"&gt;)&lt;/span&gt; - 1%
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.trustpilot.com/review/serpapi.com"&gt;https://www.trustpilot.com/review/serpapi.com&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.g2.com/products/serpapi-serpapi/reviews"&gt;https://www.g2.com/products/serpapi-serpapi/reviews&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://crozdesk.com/software/serpapi"&gt;https://crozdesk.com/software/serpapi&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.capterra.co.il/reviews/1030762/serpapi"&gt;https://www.capterra.co.il/reviews/1030762/serpapi&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Thank you for your time and attention. SerpApi's ChatGPT Review Analyzer Chrome extension offers users a streamlined and efficient way to analyze reviews using SerpApi results. By harnessing the power of ChatGPT, the extension provides valuable insights such as sentiment analysis, review summarization, and label extraction. Although it is neither an officially supported SerpApi tool nor an official OpenAI plugin, the extension is a valuable resource for businesses and individuals seeking to make informed decisions based on customer feedback. I hope it demonstrates that SerpApi is a great source for building ChatGPT-powered AI tools.&lt;/p&gt;

</description>
      <category>chatgpt</category>
      <category>webdev</category>
      <category>programming</category>
      <category>webscraping</category>
    </item>
    <item>
      <title>Global SEO: Strategies for International Success</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Thu, 20 Apr 2023 18:46:24 +0000</pubDate>
      <link>https://dev.to/serpapi/global-seo-strategies-for-international-success-7p3</link>
      <guid>https://dev.to/serpapi/global-seo-strategies-for-international-success-7p3</guid>
      <description>&lt;h2&gt;
  
  
  Introduction to Global SEO
&lt;/h2&gt;

&lt;p&gt;In today's interconnected world, businesses must expand their online presence beyond local borders to reach a global audience and establish themselves in international markets, especially in sectors like e-commerce. Global SEO is the key to success in this endeavor. It involves optimizing your website for search engines across multiple languages, cultures, and target countries, along with the technical elements needed to improve your visibility in international search results. &lt;/p&gt;

&lt;p&gt;One way to achieve better international search results and expand your market share is by utilizing SerpApi, an API for scraping SERPs (Search Engine Results Pages) that can help you gain insights and make data-driven decisions. I will discuss the different tools SerpApi provides to support your marketing strategy, whether you are an individual or an international SEO agency, and cover the details you need to consider along the way.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://developers.google.com/search/docs/specialty/international/localized-versions?hl=en&amp;amp;visit_id=638175998064162971-1820852155&amp;amp;rd=1"&gt;https://developers.google.com/search/docs/specialty/international/localized-versions?hl=en&amp;amp;visit_id=638175998064162971-1820852155&amp;amp;rd=1&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/use-cases/seo"&gt;https://serpapi.com/use-cases/seo&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Importance of Global SEO
&lt;/h2&gt;

&lt;p&gt;Global SEO is essential for businesses aiming to tap into international markets. It helps you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Gain visibility in international search terms&lt;/li&gt;
&lt;li&gt;Increase organic traffic from target regions&lt;/li&gt;
&lt;li&gt;Build brand recognition in foreign markets&lt;/li&gt;
&lt;li&gt;Improve user experience for global audiences&lt;/li&gt;
&lt;li&gt;Boost sales and revenue from international customers&lt;/li&gt;
&lt;li&gt;Enhance conversions by catering to a diverse audience&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What is the difference between global and local SEO?
&lt;/h3&gt;

&lt;p&gt;Local SEO focuses on optimizing a website to rank higher for a specific local audience, while global SEO targets a broader, international audience. While both share similarities, global SEO requires additional considerations such as language, culture, and regional search engine preferences whereas details such as an address, phone number, or local rating are more important aspects for local SEO.&lt;/p&gt;

&lt;h3&gt;
  
  
  How do you create a global SEO strategy?
&lt;/h3&gt;

&lt;p&gt;To develop a successful global SEO strategy, consider the following components:&lt;/p&gt;

&lt;h4&gt;
  
  
  Language and Localization
&lt;/h4&gt;

&lt;p&gt;Tailor your content to your target audience by translating and localizing it into different languages, such as English, Spanish, and Japanese, so that native speakers have a better experience; this is multilingual SEO with language targeting. Localization means adapting your content to the cultural nuances and preferences of your target market, for example adjusting pricing for specific regions (the Middle East, North America) or specific countries (Australia). Work with native translators, cultural consultants, or native SEO experts or agencies in the target market to ensure accuracy and relevance. SerpApi can help you gather language and localization data by providing insights into the search results of different countries and languages.&lt;/p&gt;

&lt;p&gt;Here's an example of a search from Australia, with the English Language, in a Desktop Device from SerpApi's Google Search Results API:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5yjCgFMh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233420901-1fd1160b-24e3-489d-8860-92a58c5e7026.png" class="article-body-image-wrapper"&gt;&lt;img alt="language-and-localization" src="https://res.cloudinary.com/practicaldev/image/fetch/s--5yjCgFMh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233420901-1fd1160b-24e3-489d-8860-92a58c5e7026.png" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://support.google.com/business/answer/6270107?hl=en&amp;amp;sjid=11440402720055813938-EU"&gt;https://support.google.com/business/answer/6270107?hl=en&amp;amp;sjid=11440402720055813938-EU&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/search-api"&gt;https://serpapi.com/search-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/locations-api"&gt;https://serpapi.com/locations-api&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Technical SEO
&lt;/h4&gt;

&lt;p&gt;Implement technical SEO measures like an international site structure, hreflang tags, and canonical tags to ensure that search engines correctly index your multilingual content. Also improve your site's loading speed, add alt descriptions for images, and ensure mobile-friendliness for a seamless user experience. Build case studies from metrics such as the PPC (pay-per-click) cost of your ads, click-through rate, bounce rate, and conversion rate, and keep your web pages up to date with current on-page SEO practices. Perform competitor analysis before and after your optimizations. SerpApi can assist in tracking technical SEO factors by monitoring how your site appears in SERPs across different regions. It reduces the need for proxies, HTML parsers, calculating PageRank, or checking which page content has been cached by the search engine's crawlers, and acts as an easy-to-understand API for your SEO needs.&lt;/p&gt;

&lt;p&gt;Here's an example with SerpApi's Google About This Results API from &lt;code&gt;google.com.tr&lt;/code&gt; domain:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EcpLLO7b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233434589-e69a954b-5f3f-4d5f-964f-18508bdabd48.png" class="article-body-image-wrapper"&gt;&lt;img alt="technical-seo" src="https://res.cloudinary.com/practicaldev/image/fetch/s--EcpLLO7b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233434589-e69a954b-5f3f-4d5f-964f-18508bdabd48.png" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/organic-results#api-examples-organic-results-overview"&gt;https://serpapi.com/organic-results#api-examples-organic-results-overview&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/google-about-this-result"&gt;https://serpapi.com/google-about-this-result&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Content Strategy and Content Marketing
&lt;/h4&gt;

&lt;p&gt;Create content that resonates with your target audience. Research their interests, pain points, and preferences to develop a content plan that addresses their needs. Utilize content marketing tactics to promote your brand in new markets, such as Brazil and Spain, and optimize your content with relevant keywords for each region. SerpApi can aid keyword research by providing indicators of search volume, which can inform your content strategy. You can then combine this with your SEO audit to craft your SEO marketing strategy. It is also vital to stay in line with the guidelines of the search engine you want to optimize your ranking for.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://support.google.com/business/answer/3038177?hl=en"&gt;https://support.google.com/business/answer/3038177?hl=en&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://ahrefs.com/blog/seo-content-strategy/"&gt;https://ahrefs.com/blog/seo-content-strategy/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Link Building
&lt;/h4&gt;

&lt;p&gt;Build high-quality backlinks from authoritative websites in your target regions. This will help improve your site's credibility and visibility in local search results. SerpApi can support your link-building efforts by helping you identify potential link opportunities in the target markets by giving you the tools to calculate the best opportunities from organic search data.&lt;/p&gt;

&lt;p&gt;Here is another example from SerpApi's Google Search API that checks for different media outlets in the United Kingdom, for the English Language Results, in Mobile Devices:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Lp1yx14L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233425486-c7b5ba27-260b-4dde-99ae-0e3c22670d79.png" class="article-body-image-wrapper"&gt;&lt;img alt="link-building" src="https://res.cloudinary.com/practicaldev/image/fetch/s--Lp1yx14L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233425486-c7b5ba27-260b-4dde-99ae-0e3c22670d79.png" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Social Media and International Outreach
&lt;/h4&gt;

&lt;p&gt;Leverage the social and web platforms popular in your target markets, such as Yahoo! Japan in Japan or Naver in South Korea. Engage with influencers and local communities to increase brand awareness and drive traffic to your website. SerpApi can provide data on social media trends and popular platforms in different countries, assisting your international outreach efforts.&lt;/p&gt;

&lt;p&gt;Here's an example from SerpApi's Google Trends API that compares interest between different social media outlets on a worldwide scale:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mil7hjf6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233426211-d18a2693-e441-4213-83ec-3f5c4fce0516.png" class="article-body-image-wrapper"&gt;&lt;img alt="social-media" src="https://res.cloudinary.com/practicaldev/image/fetch/s--mil7hjf6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233426211-d18a2693-e441-4213-83ec-3f5c4fce0516.png" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/google-trends-api"&gt;https://serpapi.com/google-trends-api&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What are global SEO strategies?
&lt;/h3&gt;

&lt;p&gt;Global SEO strategies involve a combination of the following components:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Language and localization&lt;/li&gt;
&lt;li&gt;Technical SEO&lt;/li&gt;
&lt;li&gt;Content strategy and content marketing&lt;/li&gt;
&lt;li&gt;Link building&lt;/li&gt;
&lt;li&gt;Social media and international outreach&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What makes strong international SEO?
&lt;/h3&gt;

&lt;p&gt;Strong international SEO involves understanding your target audience, optimizing content for local search engines, and implementing effective technical SEO strategies. Additionally, creating a content strategy that resonates with your audience, building high-quality backlinks, and leveraging social media in your target markets contribute to a robust international SEO strategy.&lt;/p&gt;

&lt;h2&gt;
  
  
  Different Privacy Rules throughout the World
&lt;/h2&gt;

&lt;p&gt;As you expand your global SEO efforts, it's crucial to be aware of the different privacy rules around the world. For instance, the European Union's General Data Protection Regulation (GDPR) sets strict requirements for data protection and privacy, while the California Consumer Privacy Act (CCPA) governs data privacy in California in the United States. Brazil has the Lei Geral de Proteção de Dados (LGPD), and the Personal Information Protection and Electronic Documents Act (PIPEDA) is in place in Canada. Ensuring compliance with these privacy laws is essential to avoid potential fines and maintain the trust of your international audience. It helps to work with a professional who specializes in this field, has the know-how of these laws, and can communicate the requirements clearly to the SEO team.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://gdpr-info.eu/"&gt;https://gdpr-info.eu/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://oag.ca.gov/privacy/ccpa"&gt;https://oag.ca.gov/privacy/ccpa&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://iapp.org/resources/article/brazilian-data-protection-law-lgpd-english-translation/"&gt;https://iapp.org/resources/article/brazilian-data-protection-law-lgpd-english-translation/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/pipeda_brief/"&gt;https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/pipeda_brief/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  International SEO Best Practices
&lt;/h2&gt;

&lt;p&gt;Follow these international SEO best practices for optimal results:&lt;/p&gt;

&lt;h3&gt;
  
  
  Understanding Your Target Market
&lt;/h3&gt;

&lt;p&gt;Conduct thorough market research to understand your target audience's demographics, preferences, and online behavior. This information will guide your global SEO strategy. Utilize digital marketing techniques and tools to reach your target market effectively.&lt;/p&gt;

&lt;h3&gt;
  
  
  Domain Structure and Targeting
&lt;/h3&gt;

&lt;p&gt;Choose a domain structure that best suits your global SEO needs. Options include country code top-level domains (ccTLDs), subdomains, and subdirectories. Consider factors like user experience, brand perception, and search engine preferences when making your decision. You may also consider generic top-level domains (gTLDs) for a more global approach. Pay attention to the URL structure and ensure it is easily navigable and understandable by both searchers and search engines.&lt;/p&gt;
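&lt;p&gt;A tiny illustration of the three structures, using the hypothetical brand &lt;code&gt;example&lt;/code&gt; and the country code &lt;code&gt;de&lt;/code&gt; for Germany:&lt;/p&gt;

```ruby
# Sketch of the three common domain structures for country targeting.
# Brand and country code are hypothetical placeholders.
def candidate_urls(brand, country)
  {
    cctld:        "https://#{brand}.#{country}/",     # country code top-level domain
    subdomain:    "https://#{country}.#{brand}.com/", # country-specific subdomain
    subdirectory: "https://#{brand}.com/#{country}/"  # country-specific subdirectory
  }
end

candidate_urls("example", "de")
# => {:cctld=>"https://example.de/", :subdomain=>"https://de.example.com/", :subdirectory=>"https://example.com/de/"}
```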

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://developer.mozilla.org/en-US/docs/Learn/Common_questions/Web_mechanics/What_is_a_URL"&gt;https://developer.mozilla.org/en-US/docs/Learn/Common_questions/Web_mechanics/What_is_a_URL&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Implementing Hreflang Tags
&lt;/h3&gt;

&lt;p&gt;Use hreflang tags to indicate the language and regional targeting of your pages. This helps search engines serve the correct version of your content to users based on their location and language preferences.&lt;/p&gt;
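&lt;p&gt;A minimal sketch of what that markup looks like, generated in Ruby with a hypothetical domain and paths (&lt;code&gt;x-default&lt;/code&gt; marks the fallback page for unmatched locales):&lt;/p&gt;

```ruby
# Emit hreflang link tags for a page's localized versions.
# The domain and paths below are hypothetical examples.
def hreflang_tags(base_url, locales)
  locales.map do |lang, path|
    %(<link rel="alternate" hreflang="#{lang}" href="#{base_url}#{path}" />)
  end
end

tags = hreflang_tags("https://example.com",
                     "en-au" => "/au/", "en-gb" => "/uk/", "x-default" => "/")
```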

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://developer.mozilla.org/en-US/docs/Web/HTML/Element/a"&gt;https://developer.mozilla.org/en-US/docs/Web/HTML/Element/a&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Optimizing Content for Local Search Engines
&lt;/h3&gt;

&lt;p&gt;While Google and Bing dominate the global search market, some regions have their preferred search engines, such as Baidu in China or Yandex in Russia. Optimize your content for local search engines by understanding their unique algorithms and ranking factors. Ensure that your content is also optimized for the local language, as it plays a significant role in search engine optimization. SerpApi allows different APIs so you can track your website's rank through different search engines.&lt;/p&gt;

&lt;p&gt;Here's an example from SerpApi's Baidu Search API:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1qBfFwcK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233439040-a5077cb4-f3da-44c8-b326-9c401e1f9037.png" class="article-body-image-wrapper"&gt;&lt;img alt="local-search-engines" src="https://res.cloudinary.com/practicaldev/image/fetch/s--1qBfFwcK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/233439040-a5077cb4-f3da-44c8-b326-9c401e1f9037.png" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/baidu-search-api"&gt;https://serpapi.com/baidu-search-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/yandex-search-api"&gt;https://serpapi.com/yandex-search-api&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/naver-search-api"&gt;https://serpapi.com/naver-search-api&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Tracking and Analyzing Your Global SEO Efforts
&lt;/h3&gt;

&lt;p&gt;Monitor your global SEO performance using analytics tools like Google Analytics, Google Search Console, and other region-specific tools. Analyze your data to identify areas for improvement and refine your strategy accordingly. SerpApi can complement these tools by providing additional insights into SERP data across different countries and languages.&lt;/p&gt;
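&lt;p&gt;As a rough sketch, region-aware rank checks can be scripted against SerpApi's HTTP endpoint by varying the &lt;code&gt;gl&lt;/code&gt; (country) and &lt;code&gt;hl&lt;/code&gt; (language) parameters; the helper names below are illustrative:&lt;/p&gt;

```ruby
require 'uri'

# Build a SerpApi query URL for a given country (gl) and language (hl).
# Replace the api_key with your own.
def serpapi_url(query, gl:, hl:, api_key:)
  params = { engine: "google", q: query, gl: gl, hl: hl, api_key: api_key }
  "https://serpapi.com/search.json?" + URI.encode_www_form(params)
end

# Find a domain's position in an already-parsed SerpApi response (a Hash).
def organic_rank(response, domain)
  hit = (response["organic_results"] || []).find { |r| r["link"].to_s.include?(domain) }
  hit && hit["position"]
end
```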

&lt;h2&gt;
  
  
  Common Global SEO Challenges
&lt;/h2&gt;

&lt;p&gt;Some challenges you may face during your global SEO journey include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Language barriers and translation inaccuracies&lt;/li&gt;
&lt;li&gt;Adapting to cultural differences and preferences&lt;/li&gt;
&lt;li&gt;Managing multiple domain structures&lt;/li&gt;
&lt;li&gt;Navigating local search engine algorithms&lt;/li&gt;
&lt;li&gt;Balancing global brand consistency with localization&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By understanding and addressing these challenges, you can create a more effective global SEO strategy.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Global SEO is crucial for businesses aiming to expand their reach beyond local markets. By focusing on key components like language and localization, technical SEO, content strategy, link building, and international outreach, you can create a successful global SEO strategy. Additionally, follow international SEO best practices and be prepared to overcome common challenges along the way. SerpApi can be a valuable resource in your global SEO efforts, providing data-driven insights to help you make informed decisions and improve your international search engine visibility. You can &lt;a href="https://serpapi.com/users/sign_up"&gt;register to claim free credits&lt;/a&gt;. Thank you for reading.&lt;/p&gt;

</description>
      <category>seo</category>
      <category>globalseo</category>
      <category>serp</category>
      <category>international</category>
    </item>
    <item>
      <title>Real World Example of GPS Targeted Local SERP</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Wed, 12 Apr 2023 13:31:05 +0000</pubDate>
      <link>https://dev.to/serpapi/real-world-example-of-gps-targeted-local-serp-2k1i</link>
      <guid>https://dev.to/serpapi/real-world-example-of-gps-targeted-local-serp-2k1i</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Local SEO is crucial for businesses looking to improve their visibility and conversions in their specific locations. In this blog post, we will delve into the concept of the local SERP, the importance of local SEO, and the use of the UULE parameter for accurate local search results. We will also explore an example grid-search script that improves search result precision using the GPS coordinates of a location. The example search term will be &lt;code&gt;Cafe&lt;/code&gt;, and we will gather only the results on the landing page of the search. You may scroll to the bottom of the page for the full code.&lt;/p&gt;

&lt;p&gt;We will use the following to scale the search volume with precision:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/google-local-api"&gt;SerpApi’s Google Local Results Scraper API&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://rubygems.org/gems/uule_converter"&gt;Ruby Gem called &lt;code&gt;uule_converter&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  What is Local SERP?
&lt;/h1&gt;

&lt;p&gt;A Local Search Engine Results Page (SERP) refers to the search results displayed by search engines when a user conducts a search query with local intent. These results are tailored to the user's geographical location and often include local business listings, maps, and reviews. Local SERPs help users find relevant, nearby businesses or services while enabling businesses to reach potential customers effectively.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FARzX9-b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/231441775-9ba02e02-f07f-4b99-837a-2cd42d40d6b2.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FARzX9-b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/231441775-9ba02e02-f07f-4b99-837a-2cd42d40d6b2.jpg" alt="local-result-example" width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is meant by Local SEO?
&lt;/h2&gt;

&lt;p&gt;Local SEO, or Local Search Engine Optimization, is the practice of optimizing a website, its content, and its online presence to improve visibility in local search results. Local SEO targets potential customers in a specific geographical area by focusing on local search signals such as business listings, reviews, and citations. The goal of local SEO is to increase organic traffic, conversions, and online visibility for businesses with a physical location or service area. SerpApi offers a variety of Local SEO tools including but not limited to &lt;a href="https://serpapi.com/local-results"&gt;Google Local Results API&lt;/a&gt;, &lt;a href="https://serpapi.com/local-pack"&gt;Google Local Pack Results API&lt;/a&gt;, &lt;a href="https://serpapi.com/google-maps-api"&gt;Google Maps API&lt;/a&gt;, &lt;a href="https://serpapi.com/google-local-services-api"&gt;Google Local Services API&lt;/a&gt;, etc.&lt;/p&gt;

&lt;p&gt;References:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://serpapi.com/use-cases/local-seo"&gt;https://serpapi.com/use-cases/local-seo&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://support.google.com/business/answer/7091?hl=en"&gt;https://support.google.com/business/answer/7091?hl=en&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What are the benefits of Local SERP?
&lt;/h2&gt;

&lt;p&gt;Local SERPs offer numerous benefits for businesses, including:&lt;/p&gt;

&lt;p&gt;1) Improved Visibility: Local SERPs provide targeted exposure for businesses by displaying their information to users in the relevant geographical area.&lt;/p&gt;

&lt;p&gt;2) Increased Conversions: By targeting users with location-specific intent, local SERPs can drive higher conversion rates.&lt;/p&gt;

&lt;p&gt;3) Enhanced Reputation: Local search results often include reviews and ratings, which can help businesses build trust and credibility with potential customers.&lt;/p&gt;

&lt;p&gt;4) Competitive Edge: Optimizing for local SERPs can give businesses a competitive advantage over competitors who neglect local SEO.&lt;/p&gt;

&lt;p&gt;5) Better User Experience: Local SERPs provide users with relevant, location-specific information, resulting in a more satisfying search experience.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is the definition of the word "uule"?
&lt;/h2&gt;

&lt;p&gt;UULE (Universal URL Encoded Location) is a URL parameter utilized by Google to pinpoint a user's specific location during a search. This parameter is encoded with a geolocation identifier, known as a canonical name, which represents a particular geographical area, or by GPS coordinates. By incorporating the UULE parameter in search queries, businesses can obtain precise local search results tailored to their target audience's location.&lt;/p&gt;

&lt;p&gt;There are two different UULE parameter versions:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Version 1:&lt;/strong&gt; w+CAIQICI…&lt;/p&gt;

&lt;p&gt;This version uses a "canonical name," typically used by Google AdWords to geotarget ads. An example of a canonical name is "West New York, New Jersey, United States," which includes the name of the place, region, and country. Decoding this version requires handling the entire string as a URI component, decoding it, splitting the string by the "plus," and base64-decoding the second part of the string (starting with "CAIQICI").&lt;/p&gt;
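&lt;p&gt;The decoding steps above can be sketched in Ruby. Note that the 5-byte header and length byte reflect the community-documented layout of this payload (see the valentin.app reference below), not an official Google specification:&lt;/p&gt;

```ruby
require 'base64'

# Community-documented layout of the Version 1 payload: a fixed 5-byte
# header, one byte holding the name length, then the canonical name itself.
# Base64-encoding this payload yields the familiar "CAIQICI" prefix.
HEADER = "\x08\x02\x10\x20\x22".b

def encode_uule_v1(canonical_name)
  payload = HEADER + canonical_name.bytesize.chr + canonical_name.b
  "w+" + Base64.urlsafe_encode64(payload).delete("=")
end

def decode_uule_v1(uule)
  # Split on the plus, then base64-decode the part starting with "CAIQICI".
  # (In a real URL the value may additionally be percent-encoded.)
  encoded = uule.split("+", 2).last
  raw = Base64.urlsafe_decode64(encoded + "=" * ((4 - encoded.length % 4) % 4))
  raw.byteslice(6, raw.getbyte(5))
end

encode_uule_v1("West New York,New Jersey,United States")
# starts with "w+CAIQICI"
```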

&lt;p&gt;&lt;strong&gt;Version 2:&lt;/strong&gt; a+cm9…&lt;/p&gt;

&lt;p&gt;This version can be found in the cookie written by Google after obtaining the location via the Geolocation API on the website. This version requires no guessing of fields and uses text instead of protocol buffers. After URL-decoding the string and base64-decoding it, you will get ASCII text containing information such as role, producer, provenance, timestamp, latitude, longitude, and radius.&lt;/p&gt;
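&lt;p&gt;Here is an illustrative sketch of building such a payload in Ruby. The field names match the decoded text described above, but the exact values and formatting are assumptions based on commonly observed payloads rather than an official specification; the &lt;code&gt;uule_converter&lt;/code&gt; gem handles this conversion for you:&lt;/p&gt;

```ruby
require 'base64'

# Sketch of a Version 2 payload: plain ASCII text (no protocol buffers)
# carrying role, producer, provenance, timestamp, coordinates, and radius.
# The numeric field values and formatting are assumptions drawn from
# commonly observed payloads, not an official specification. Coordinates
# are given in "e7" form, i.e. degrees multiplied by 10^7.
def encode_uule_v2(latitude, longitude, radius: -1)
  payload = <<~TEXT
    role: 1
    producer: 12
    provenance: 6
    timestamp: #{Time.now.to_i * 1_000_000}
    latlng {
      latitude_e7: #{(latitude * 1e7).round}
      longitude_e7: #{(longitude * 1e7).round}
    }
    radius: #{radius}
  TEXT
  "a+" + Base64.urlsafe_encode64(payload).delete("=")
end
```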

&lt;p&gt;You can employ the uule parameter with &lt;a href="https://serpapi.com/search-api"&gt;most of SerpApi’s Google Serp Engines&lt;/a&gt; to get precise, real-time organic results. We have developed a Ruby gem called &lt;a href="https://rubygems.org/gems/uule_converter"&gt;uule_converter&lt;/a&gt; to convert GPS coordinates into uule Version 2 as described above. The advantage of this version is that you don’t have to know the canonical names of the locations you are gathering data from. You can run a grid search with a checker for your local rankings and filter the results by their SERP features, such as the address or GPS coordinates of a local result, or by metrics such as rating, number of reviews, etc.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wbakyGdP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/231441750-e0182a35-1510-4257-b83f-a33c8a843346.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wbakyGdP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://user-images.githubusercontent.com/73674035/231441750-e0182a35-1510-4257-b83f-a33c8a843346.jpg" alt="playground-example" width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/serpapi/uule_converter"&gt;https://github.com/serpapi/uule_converter&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://valentin.app/uule.html"&gt;https://valentin.app/uule.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://developers.google.com/google-ads/api/data/geotargets"&gt;https://developers.google.com/google-ads/api/data/geotargets&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Example Serp Checker Code for a GPS Grid
&lt;/h2&gt;

&lt;h4&gt;
  
  
  Possible Integrations
&lt;/h4&gt;

&lt;p&gt;The example code given here uses SerpApi’s Google Search Results Ruby Gem. You may use this gem, or the method therein, to gather UULE parameters for a grid search, and then run the searches through SerpApi’s integrations in different languages. You may also make real-time organic searches for mobile devices to expand your keyword research in a specified location and enrich your rank tracker with pagination.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/serpapi/"&gt;https://github.com/serpapi/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/integrations"&gt;https://serpapi.com/integrations&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/serpapi-pagination/"&gt;https://serpapi.com/blog/serpapi-pagination/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Required Gems and Variables
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'uule_converter'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'geocoder'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'json'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'fileutils'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'concurrent'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'rest-client'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'google_search_results'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will need the above gems to construct our searcher.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Set your API key&lt;/span&gt;
&lt;span class="n"&gt;api_key&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Your API Key'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We’ll also need the SerpApi API Key. &lt;a href="https://serpapi.com/users/sign_up"&gt;You may register to claim free credits.&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Set the Country of the Place you want to search&lt;/span&gt;
&lt;span class="n"&gt;address&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Liechtenstein"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s set the address to Liechtenstein, a small country in Europe. I chose this country because of its size: it is surrounded by Austria and Switzerland. Later, we will exclude results from those countries and keep only the cafes in Liechtenstein in our results.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Set the Search Language&lt;/span&gt;
&lt;span class="n"&gt;search_language&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"en"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The search language has no effect on the ordering of results in this case. We will set it to English for better display.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Set the preference for grid search&lt;/span&gt;
&lt;span class="n"&gt;grid_search&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;false&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I added a Boolean variable here in case you only want to filter previously collected results with the example code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Get geocoded address object&lt;/span&gt;
&lt;span class="n"&gt;address_geocoded&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Geocoder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;address&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;

&lt;span class="c1"&gt;# Define the country code based on the geocoded address&lt;/span&gt;
&lt;span class="n"&gt;desired_country&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;address_geocoded&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;country_code&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will use the Geocoder gem to get relevant data from the address we have given. Distinguishing the country will be important for filtering out neighboring countries and making the search more precise later on.&lt;/p&gt;

&lt;h4&gt;
  
  
  Grid Search
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;grid_search&lt;/span&gt;
  &lt;span class="c1"&gt;# Define address's latitude and longitude boundaries&lt;/span&gt;
  &lt;span class="n"&gt;bounds&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;address_geocoded&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'boundingbox'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:to_f&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;south&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;north&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;west&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;east&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;bounds&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;bounds&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;bounds&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;bounds&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="c1"&gt;# Grid step size (approximately 1 km)&lt;/span&gt;
  &lt;span class="n"&gt;step_size&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.009&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Define the GPS boundaries of a location to create a grid layout. In our case, these will be the farthest points in each cardinal direction of Liechtenstein. The &lt;code&gt;step_size&lt;/code&gt; variable defines how far we step at each interval in order to cover the area. Each UULE we use will return the cafes within an approximately 1 km radius circle around the GPS coordinates we locked onto.&lt;br&gt;
&lt;/p&gt;
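As a quick sanity check on the numbers above: one degree of latitude is roughly 111 km, so a step of 0.009 degrees is about 1 km. The sketch below uses illustrative bounding-box values for Liechtenstein (not taken from the Geocoder response) to estimate how many searches the grid will issue.

```ruby
# One degree of latitude is roughly 111 km, so ~1 km is about 0.009 degrees.
KM_PER_DEGREE = 111.0
step_size = (1.0 / KM_PER_DEGREE).round(3) # ~0.009

# Illustrative bounding box for Liechtenstein (not from Geocoder).
south, north = 47.048, 47.271
west, east   = 9.472, 9.636

# Number of grid points along each axis, plus one for the starting edge.
lat_points = ((north - south) / step_size).floor + 1
lon_points = ((east - west) / step_size).floor + 1
total_searches = lat_points * lon_points

puts "grid: #{lat_points} x #{lon_points} = #{total_searches} searches"
```

This kind of estimate is worth running first, since the number of searches (and therefore API credits used) grows quadratically as the step size shrinks.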

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;  &lt;span class="c1"&gt;# Create 'searches' directory if it doesn't exist&lt;/span&gt;
  &lt;span class="no"&gt;FileUtils&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mkdir_p&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'searches'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will save each search inside the &lt;code&gt;searches&lt;/code&gt; directory to be combined later. This way, long grid searches can be recovered in case of interruptions on your system.&lt;br&gt;
&lt;/p&gt;
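Building on that recovery idea, here is a hedged sketch (not part of the original script) of a resume-safe writer: skip any response file that already exists, so an interrupted grid search does not repeat completed requests.

```ruby
require 'fileutils'
require 'json'
require 'tmpdir'

# Write a response file only if it does not already exist, so that a
# restarted run skips grid points that were already completed.
def save_unless_done(dir, request_number, data)
  path = File.join(dir, "response_#{request_number}.json")
  return :skipped if File.exist?(path)
  File.write(path, JSON.pretty_generate(data))
  :saved
end

# Demonstration in a temporary directory: the second call is a no-op.
results = Dir.mktmpdir do |dir|
  [
    save_unless_done(dir, 1, { 'status' => 'ok' }),
    save_unless_done(dir, 1, { 'status' => 'ok' })
  ]
end
```

In the grid-search loop, you would derive `request_number` deterministically from the grid coordinates (rather than from an atomic counter) so that the same grid point always maps to the same file.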

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;  &lt;span class="c1"&gt;# Create a thread pool&lt;/span&gt;
  &lt;span class="n"&gt;thread_pool&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Concurrent&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;FixedThreadPool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="c1"&gt;# Create a concurrent counter for request numbering&lt;/span&gt;
  &lt;span class="n"&gt;request_counter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Concurrent&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;AtomicFixnum&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="c1"&gt;# Grid search&lt;/span&gt;
  &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Starting grid search..."&lt;/span&gt;
  &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;south&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="n"&gt;north&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;step&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;step_size&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;west&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="n"&gt;east&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;step&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;step_size&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;longitude&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
        &lt;span class="n"&gt;uule_encoded&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;UuleConverter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;encode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;#gl is set to the country code based on the geocoded address&lt;/span&gt;
        &lt;span class="n"&gt;gl_parameter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;desired_country&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;downcase&lt;/span&gt;
        &lt;span class="c1"&gt;#hl=en, Language parameter fixed for English&lt;/span&gt;
        &lt;span class="n"&gt;hl_parameter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;search_language&lt;/span&gt;

        &lt;span class="c1"&gt;#no_cache=true to get live results&lt;/span&gt;

        &lt;span class="c1"&gt;# Here, I have added a domain for extra precision&lt;/span&gt;
        &lt;span class="c1"&gt;# It is optional for the most part.&lt;/span&gt;
        &lt;span class="n"&gt;google_domain&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"google.li"&lt;/span&gt;

        &lt;span class="c1"&gt;# SerpApi request&lt;/span&gt;
        &lt;span class="n"&gt;search&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;GoogleSearch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;engine: &lt;/span&gt;&lt;span class="s2"&gt;"google_local"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;google_domain: &lt;/span&gt;&lt;span class="n"&gt;google_domain&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;q: &lt;/span&gt;&lt;span class="s2"&gt;"Cafe"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;gl: &lt;/span&gt;&lt;span class="n"&gt;gl_parameter&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;hl: &lt;/span&gt;&lt;span class="n"&gt;hl_parameter&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;uule: &lt;/span&gt;&lt;span class="n"&gt;uule_encoded&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;serp_api_key: &lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;hash_results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;search&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_hash&lt;/span&gt;

        &lt;span class="c1"&gt;# You can get deeper results by following the `serpapi_pagination` -&amp;gt; `next_page`&lt;/span&gt;
        &lt;span class="c1"&gt;# To paginate into deeper results. Some results will correlate with the others.&lt;/span&gt;
        &lt;span class="c1"&gt;# It is later filtered out.&lt;/span&gt;

        &lt;span class="c1"&gt;# Save response to a JSON file&lt;/span&gt;
        &lt;span class="n"&gt;request_number&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;request_counter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;increment&lt;/span&gt;
        &lt;span class="no"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"searches/response_&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;request_number&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.json"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'w'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;pretty_generate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hash_results&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;

  &lt;span class="c1"&gt;# Shutdown the thread pool and wait for the tasks to complete&lt;/span&gt;
  &lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;shutdown&lt;/span&gt;
  &lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;wait_for_termination&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will iterate through each GPS coordinate, make a search using the &lt;code&gt;google_search_results&lt;/code&gt; gem, and then save the response into a local file inside the &lt;code&gt;searches&lt;/code&gt; folder.&lt;/p&gt;

&lt;p&gt;One thing to note here is that I have used only the first-page results for demonstration purposes. Some of the searches have a second page and more. You may refactor this code to paginate through those results as well.&lt;/p&gt;
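A hedged sketch of how that pagination could be layered on: loop, raising the `start` offset by the page size until a page comes back empty or short. The fetcher is injected so the loop is testable offline; in the real script it would wrap a `GoogleSearch` call with a `start` parameter (check SerpApi's pagination documentation for the exact parameter names for each engine).

```ruby
# Collect local results page by page until a page is empty or short.
# The fetcher block stands in for GoogleSearch.new(..., start: offset).get_hash.
def collect_all_pages(page_size: 20, max_pages: 5)
  all_results = []
  (0...max_pages).each do |page|
    response = yield(page * page_size)       # fetcher receives the 'start' offset
    results = response['local_results'] || []
    break if results.empty?
    all_results.concat(results)
    break if results.length < page_size      # short page means last page
  end
  all_results
end

# Example with a stubbed fetcher: one full page, then a short final page.
pages = {
  0  => { 'local_results' => Array.new(20) { |i| { 'place_id' => "p#{i}" } } },
  20 => { 'local_results' => Array.new(5)  { |i| { 'place_id' => "q#{i}" } } }
}
results = collect_all_pages { |offset| pages.fetch(offset, {}) }
```

Because the deduplication step later filters by `place_id`, any overlap between pages is harmless.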

&lt;p&gt;Another detail is that I have set the &lt;code&gt;gl&lt;/code&gt; parameter to Liechtenstein’s country code, &lt;code&gt;li&lt;/code&gt;. This keeps the locale of the search consistent. For extra precision, I have also set the &lt;code&gt;google_domain&lt;/code&gt; parameter to the country’s domain.&lt;/p&gt;

&lt;p&gt;Moreover, you can see that I have made the searches in parallel. This is a fast way to gather responses from SerpApi. There is an even faster way to make searches: async searches.&lt;/p&gt;

&lt;p&gt;You can find detailed instructions on how to use them in the references below. Although the examples are in Python, they are easy to adapt to any language.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/making-async-requests-with-serpapi/"&gt;https://serpapi.com/blog/making-async-requests-with-serpapi/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://serpapi.com/blog/serpapi-pagination/"&gt;https://serpapi.com/blog/serpapi-pagination/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Filtering Results
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# If you want to skip the grid search and filter results directly&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;grid_search&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="kp"&gt;false&lt;/span&gt;
  &lt;span class="n"&gt;folder_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"searches"&lt;/span&gt;
  &lt;span class="n"&gt;file_pattern&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;folder_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;# get all files in folder&lt;/span&gt;
  &lt;span class="n"&gt;file_count&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Dir&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;glob&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;file_pattern&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;length&lt;/span&gt;
  &lt;span class="n"&gt;request_counter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;file_count&lt;/span&gt;
&lt;span class="k"&gt;else&lt;/span&gt;
  &lt;span class="n"&gt;request_counter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;request_counter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;value&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s define the number of files we want to go through to filter results.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Read searches from local storage&lt;/span&gt;
&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Reading searches from local storage..."&lt;/span&gt;
&lt;span class="n"&gt;cafes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="n"&gt;collected_place_ids&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="n"&gt;request_counter&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
  &lt;span class="n"&gt;search_data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;read&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"searches/response_&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.json"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;rescue&lt;/span&gt; &lt;span class="kp"&gt;nil&lt;/span&gt;
  &lt;span class="k"&gt;next&lt;/span&gt; &lt;span class="k"&gt;unless&lt;/span&gt; &lt;span class="n"&gt;search_data&lt;/span&gt;

  &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;search_data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'local_results'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;search_data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'local_results'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="k"&gt;unless&lt;/span&gt; &lt;span class="n"&gt;collected_place_ids&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;include?&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'place_id'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
        &lt;span class="n"&gt;cafes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;collected_place_ids&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'place_id'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The next step is to collect the unique local results. Since each search targets a circle roughly 1 km in radius, the grid must overlap so that the areas between circles are also scanned. For that reason, neighboring searches will always share some common results, which we deduplicate by &lt;code&gt;place_id&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;
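The deduplication step above can also be expressed as a small pure function, shown here as a sketch with made-up sample data: merge the result pages while keeping only the first occurrence of each `place_id`.

```ruby
# Merge pages of local results, keeping the first occurrence of each place_id.
def unique_by_place_id(result_pages)
  seen = {}
  result_pages.flatten.each_with_object([]) do |result, unique|
    next if seen[result['place_id']]
    seen[result['place_id']] = true
    unique << result
  end
end

# Two overlapping pages, as neighboring grid searches would produce.
pages = [
  [{ 'place_id' => 'a', 'title' => 'Cafe A' }, { 'place_id' => 'b', 'title' => 'Cafe B' }],
  [{ 'place_id' => 'b', 'title' => 'Cafe B' }, { 'place_id' => 'c', 'title' => 'Cafe C' }]
]
unique = unique_by_place_id(pages)
```

Keeping the seen-ID lookup in a hash rather than an array avoids the linear `include?` scan the loop above performs, which matters once thousands of results accumulate.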

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Find the neighbouring countries&lt;/span&gt;
&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;CountryNeighbours&lt;/span&gt;
  &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nc"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_neighbours&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;country_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;country_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;country_name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;" "&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"%20"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;RestClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"https://restcountries.com/v3.1/name/&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;country_name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;?fullText=true"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;parsed_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;parsed_response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;empty?&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s2"&gt;"Could not find country: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;country_name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;

    &lt;span class="n"&gt;neighbours&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;parsed_response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'borders'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;neighbours&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;empty?&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;parsed_response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'name'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'official'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; has no neighbors."&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;

    &lt;span class="n"&gt;neighbour_names&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;

    &lt;span class="n"&gt;neighbours&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;neighbour_code&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="n"&gt;neighbour_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;RestClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"https://restcountries.com/v3.1/alpha/&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;neighbour_code&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="n"&gt;parsed_neighbour_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;neighbour_response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="n"&gt;neighbour_names&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;parsed_neighbour_response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'name'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'common'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;

    &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;parsed_response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'name'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'official'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;'s neighbors: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;neighbour_names&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;", "&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="n"&gt;neighbour_names&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's define a class to find neighbouring countries.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Create a thread pool&lt;/span&gt;
&lt;span class="n"&gt;thread_pool&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Concurrent&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;FixedThreadPool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Create an empty array to store filtered cafes&lt;/span&gt;
&lt;span class="n"&gt;filtered_cafes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Concurrent&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;Array&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;

&lt;span class="c1"&gt;# Neigbouring countries&lt;/span&gt;
&lt;span class="n"&gt;neighbors&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;CountryNeighbours&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_neighbours&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;address_geocoded&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"address"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s2"&gt;"country"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# Filter cafes in neighbouring countries in their address&lt;/span&gt;
&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Filtering cafes in neighbouring countries in their address..."&lt;/span&gt;
&lt;span class="n"&gt;cafes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
  &lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'address'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
      &lt;span class="n"&gt;neighbouring_country&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;false&lt;/span&gt;
      &lt;span class="n"&gt;neighbors&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;country&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
        &lt;span class="n"&gt;neighbouring_country&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;true&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'address'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;include?&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;", &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;country&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="k"&gt;next&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;neighbouring_country&lt;/span&gt;

      &lt;span class="n"&gt;filtered_cafes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Added cafe: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'title'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; at &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'address'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
      &lt;span class="c1"&gt;# You may use this part to further filter your data using GPS Coordinates&lt;/span&gt;
      &lt;span class="c1"&gt;#if cafe.key?('gps_coordinates') &amp;amp;&amp;amp; cafe['gps_coordinates'].key?('latitude') &amp;amp;&amp;amp; cafe['gps_coordinates'].key?('longitude')&lt;/span&gt;
      &lt;span class="c1"&gt;#  coordinates = [cafe['gps_coordinates']['latitude'], cafe['gps_coordinates']['longitude']]&lt;/span&gt;
      &lt;span class="c1"&gt;#  country = Geocoder.search(coordinates).first.country_code&lt;/span&gt;
      &lt;span class="c1"&gt;#  if country == desired_country&lt;/span&gt;
      &lt;span class="c1"&gt;#    filtered_cafes.push(cafe)&lt;/span&gt;
      &lt;span class="c1"&gt;#    puts "Added cafe: #{cafe['title']} at #{cafe['address']}"&lt;/span&gt;
      &lt;span class="c1"&gt;#  end&lt;/span&gt;
      &lt;span class="c1"&gt;#end&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="c1"&gt;# Shutdown the thread pool and wait for the tasks to complete&lt;/span&gt;
&lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;shutdown&lt;/span&gt;
&lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;wait_for_termination&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Since Liechtenstein is small and bordered by Austria and Switzerland, the results will include many places from those countries as well. We can filter them out using their &lt;code&gt;address&lt;/code&gt; field. The script could also be improved by reverse-geocoding each place's GPS coordinates to verify its country individually.&lt;br&gt;
&lt;/p&gt;
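The address-based filter described above can be sketched as a small self-contained snippet. The cafe entries here are hypothetical placeholders; in the actual script they come from the saved SerpApi responses, and `neighbors` comes from the `CountryNeighbours` lookup:

```ruby
# Hypothetical sample results standing in for parsed SerpApi local results.
cafes = [
  { 'title' => 'Cafe A', 'address' => 'Vaduz' },
  { 'title' => 'Cafe B', 'address' => 'Hauptstrasse 1, Feldkirch, Austria' },
  { 'title' => 'Cafe C', 'address' => 'Bahnhofstrasse 2, Buchs, Switzerland' }
]
neighbors = ['Austria', 'Switzerland']

# Drop any cafe whose address ends in a neighbouring country's name.
filtered = cafes.reject do |cafe|
  neighbors.any? { |country| cafe['address'].include?(", #{country}") }
end

puts filtered.map { |c| c['title'] }.inspect  # => ["Cafe A"]
```

Matching on `", #{country}"` (with the leading comma) avoids false positives where a country name happens to appear inside a street or place name.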

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Reindex positions&lt;/span&gt;
&lt;span class="n"&gt;filtered_cafes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;filtered_cafes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each_with_index&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;  &lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"position"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;# Save filtered cafes to a JSON file&lt;/span&gt;
&lt;span class="no"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'filtered_results.json'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'w'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;pretty_generate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;filtered_cafes&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, the filtered results are saved to a JSON file.&lt;/p&gt;

&lt;h4&gt;
  
  
  Results
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"position"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Georg Brot &amp;amp; Kaffee"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"rating"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;4.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"reviews_original"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"(86)"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"reviews"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;86&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"$$"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Cafe"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"address"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Eschen"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"place_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"15376372137427994719"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"place_id_search"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://serpapi.com/search.json?device=desktop&amp;amp;engine=google_local&amp;amp;gl=li&amp;amp;google_domain=google.li&amp;amp;hl=en&amp;amp;ludocid=15376372137427994719&amp;amp;q=Cafe&amp;amp;uule=a%2Bcm9sZToxCnByb2R1Y2VyOjEyCnByb3ZlbmFuY2U6MAp0aW1lc3RhbXA6MTY4MTIxNjI0MTIwNjExMgpsYXRsbmd7CmxhdGl0dWRlX2U3OjQ3MDQ4NDI5MApsb25naXR1ZGVfZTc6OTQ5ODY3NDAKfQpyYWRpdXM6LTEK"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"lsig"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"AB86z5UWdQN7B3xNWKgYU4wVHFey"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"thumbnail"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://serpapi.com/searches/643552f21e803f1d09648e64/images/f98bc385e043920a6d2147a9e986a289c30fda633fea7e587c4423723afebe06.jpeg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"gps_coordinates"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"latitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;47.2124433&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"longitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;9.523155500000001&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"service_options"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"dine_in"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"takeaway"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With a simple grid search, we can gather every cafe that appears on the first page of Google Local Results across a small country. Each result carries detailed fields you can use to infer what makes one place rank higher than another. Moreover, if you offer a local service that cafes could use, you can gather all the relevant data about the cafes around you. As mentioned above, there are many use cases for local SEO.&lt;/p&gt;
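As one illustrative sketch of making such ranking inferences (the scoring formula and sample entries below are assumptions for demonstration, not part of the original script), you could load the saved results and rank cafes by a review-weighted rating:

```ruby
require 'json'

# Illustrative entries mirroring the structure of filtered_results.json;
# in practice, load them with: JSON.parse(File.read('filtered_results.json'))
filtered_cafes = [
  { 'title' => 'Georg Brot & Kaffee', 'rating' => 4.5, 'reviews' => 86 },
  { 'title' => 'Hypothetical Cafe',   'rating' => 4.8, 'reviews' => 3 }
]

# Weight each rating by the log of its review count so a high rating
# backed by only a handful of reviews does not dominate the ranking.
ranked = filtered_cafes.sort_by { |c| -(c['rating'] * Math.log(c['reviews'] + 1)) }
puts ranked.first['title']  # => Georg Brot & Kaffee
```

The log weighting is just one reasonable choice; any of the detailed fields (price level, service options, type) could feed a more elaborate score.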

&lt;p&gt;Thank you for reading. I hope this blog post gives you a good idea of how to utilize Local SERP data effectively.&lt;/p&gt;

&lt;h4&gt;
  
  
  Full Code
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'uule_converter'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'geocoder'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'json'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'fileutils'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'concurrent'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'rest-client'&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s1"&gt;'google_search_results'&lt;/span&gt;

&lt;span class="c1"&gt;# Set your API key&lt;/span&gt;
&lt;span class="n"&gt;api_key&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Your API Key'&lt;/span&gt;

&lt;span class="c1"&gt;# Set the Country of the Place you want to search&lt;/span&gt;
&lt;span class="n"&gt;address&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Liechtenstein"&lt;/span&gt;

&lt;span class="c1"&gt;# Set the Search Language&lt;/span&gt;
&lt;span class="n"&gt;search_language&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"en"&lt;/span&gt;

&lt;span class="c1"&gt;# Set the preference for grid search&lt;/span&gt;
&lt;span class="n"&gt;grid_search&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;false&lt;/span&gt;

&lt;span class="c1"&gt;# Get geocoded address object&lt;/span&gt;
&lt;span class="n"&gt;address_geocoded&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Geocoder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;address&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;

&lt;span class="c1"&gt;# Define the country code based on the geocoded address&lt;/span&gt;
&lt;span class="n"&gt;desired_country&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;address_geocoded&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;country_code&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;grid_search&lt;/span&gt;
  &lt;span class="c1"&gt;# Define address's latitude and longitude boundaries&lt;/span&gt;
  &lt;span class="n"&gt;bounds&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;address_geocoded&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'boundingbox'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="ss"&gt;:to_f&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;south&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;north&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;west&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;east&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;bounds&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;bounds&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;bounds&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;bounds&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="c1"&gt;# Grid step size (approximately 1 km)&lt;/span&gt;
  &lt;span class="n"&gt;step_size&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.009&lt;/span&gt;

  &lt;span class="c1"&gt;# Create 'searches' directory if it doesn't exist&lt;/span&gt;
  &lt;span class="no"&gt;FileUtils&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mkdir_p&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'searches'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="c1"&gt;# Create a thread pool&lt;/span&gt;
  &lt;span class="n"&gt;thread_pool&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Concurrent&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;FixedThreadPool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="c1"&gt;# Create a concurrent counter for request numbering&lt;/span&gt;
  &lt;span class="n"&gt;request_counter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Concurrent&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;AtomicFixnum&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="c1"&gt;# Grid search&lt;/span&gt;
  &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Starting grid search..."&lt;/span&gt;
  &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;south&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="n"&gt;north&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;step&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;step_size&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;west&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="n"&gt;east&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;step&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;step_size&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;longitude&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
        &lt;span class="n"&gt;uule_encoded&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;UuleConverter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;encode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;latitude&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;longitude&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;#gl is set to the country code based on the geocoded address&lt;/span&gt;
        &lt;span class="n"&gt;gl_parameter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;desired_country&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;downcase&lt;/span&gt;
        &lt;span class="c1"&gt;#hl=en, Language parameter fixed for English&lt;/span&gt;
        &lt;span class="n"&gt;hl_parameter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;search_language&lt;/span&gt;

        &lt;span class="c1"&gt;#no_cache=true to get live results&lt;/span&gt;

        &lt;span class="c1"&gt;# Here, I have added a domain for extra precision&lt;/span&gt;
        &lt;span class="c1"&gt;# It is optional for the most part.&lt;/span&gt;
        &lt;span class="n"&gt;google_domain&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"google.li"&lt;/span&gt;

        &lt;span class="c1"&gt;# SerpApi request&lt;/span&gt;
        &lt;span class="n"&gt;search&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;GoogleSearch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;engine: &lt;/span&gt;&lt;span class="s2"&gt;"google_local"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;google_domain: &lt;/span&gt;&lt;span class="n"&gt;google_domain&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;q: &lt;/span&gt;&lt;span class="s2"&gt;"Cafe"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;gl: &lt;/span&gt;&lt;span class="n"&gt;gl_parameter&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;hl: &lt;/span&gt;&lt;span class="n"&gt;hl_parameter&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;uule: &lt;/span&gt;&lt;span class="n"&gt;uule_encoded&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="ss"&gt;serp_api_key: &lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;hash_results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;search&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_hash&lt;/span&gt;

        &lt;span class="c1"&gt;# You can get deeper results by following the `serpapi_pagination` -&amp;gt; `next_page`&lt;/span&gt;
        &lt;span class="c1"&gt;# To paginate into deeper results. Some results will correlate with the others.&lt;/span&gt;
        &lt;span class="c1"&gt;# It is later filtered out.&lt;/span&gt;

        &lt;span class="c1"&gt;# Save response to a JSON file&lt;/span&gt;
        &lt;span class="n"&gt;request_number&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;request_counter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;increment&lt;/span&gt;
        &lt;span class="no"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"searches/response_&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;request_number&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.json"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'w'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;pretty_generate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hash_results&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;

  &lt;span class="c1"&gt;# Shutdown the thread pool and wait for the tasks to complete&lt;/span&gt;
  &lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;shutdown&lt;/span&gt;
  &lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;wait_for_termination&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="c1"&gt;# Find the neighbouring countries&lt;/span&gt;
&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;CountryNeighbours&lt;/span&gt;
  &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nc"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_neighbours&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;country_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;country_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;country_name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;" "&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"%20"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;RestClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"https://restcountries.com/v3.1/name/&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;country_name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;?fullText=true"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;parsed_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;parsed_response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;empty?&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s2"&gt;"Could not find country: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;country_name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;

    &lt;span class="n"&gt;neighbours&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;parsed_response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'borders'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;neighbours&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;empty?&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;parsed_response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'name'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'official'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; has no neighbors."&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;

    &lt;span class="n"&gt;neighbour_names&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;

    &lt;span class="n"&gt;neighbours&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;neighbour_code&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="n"&gt;neighbour_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;RestClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"https://restcountries.com/v3.1/alpha/&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;neighbour_code&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="n"&gt;parsed_neighbour_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;neighbour_response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="n"&gt;neighbour_names&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;parsed_neighbour_response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'name'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'common'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;

    &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;parsed_response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'name'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'official'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;'s neighbors: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;neighbour_names&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;", "&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="n"&gt;neighbour_names&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="c1"&gt;# If you want to skip the grid search and filter results directly&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;grid_search&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="kp"&gt;false&lt;/span&gt;
  &lt;span class="n"&gt;folder_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"searches"&lt;/span&gt;
  &lt;span class="n"&gt;file_pattern&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;folder_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;# get all files in folder&lt;/span&gt;
  &lt;span class="n"&gt;file_count&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Dir&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;glob&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;file_pattern&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;length&lt;/span&gt;
  &lt;span class="n"&gt;request_counter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;file_count&lt;/span&gt;
&lt;span class="k"&gt;else&lt;/span&gt;
  &lt;span class="n"&gt;request_counter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;request_counter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;value&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;


&lt;span class="c1"&gt;# Read searches from local storage&lt;/span&gt;
&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Reading searches from local storage..."&lt;/span&gt;
&lt;span class="n"&gt;cafes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="n"&gt;collected_place_ids&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="n"&gt;request_counter&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
  &lt;span class="n"&gt;search_data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;read&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"searches/response_&lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.json"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;rescue&lt;/span&gt; &lt;span class="kp"&gt;nil&lt;/span&gt;
  &lt;span class="k"&gt;next&lt;/span&gt; &lt;span class="k"&gt;unless&lt;/span&gt; &lt;span class="n"&gt;search_data&lt;/span&gt;

  &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;search_data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'local_results'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;search_data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'local_results'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
      &lt;span class="k"&gt;unless&lt;/span&gt; &lt;span class="n"&gt;collected_place_ids&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;include?&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'place_id'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
        &lt;span class="n"&gt;cafes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;collected_place_ids&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'place_id'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="c1"&gt;# Create a thread pool&lt;/span&gt;
&lt;span class="n"&gt;thread_pool&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Concurrent&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;FixedThreadPool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Create an empty array to store filtered cafes&lt;/span&gt;
&lt;span class="n"&gt;filtered_cafes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Concurrent&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;Array&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;

&lt;span class="c1"&gt;# Neigbouring countries&lt;/span&gt;
&lt;span class="n"&gt;neighbors&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;CountryNeighbours&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_neighbours&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;address_geocoded&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"address"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s2"&gt;"country"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# Filter cafes in neighbouring countries in their address&lt;/span&gt;
&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Filtering cafes in neighbouring countries in their address..."&lt;/span&gt;
&lt;span class="n"&gt;cafes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
  &lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'address'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
      &lt;span class="n"&gt;neighbouring_country&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;false&lt;/span&gt;
      &lt;span class="n"&gt;neighbors&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;country&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;
        &lt;span class="n"&gt;neighbouring_country&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kp"&gt;true&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'address'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;include?&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;", &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;country&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;
      &lt;span class="k"&gt;next&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;neighbouring_country&lt;/span&gt;

      &lt;span class="n"&gt;filtered_cafes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Added cafe: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'title'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; at &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'address'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
      &lt;span class="c1"&gt;# You may use this part to further filter your data using GPS Coordinates&lt;/span&gt;
      &lt;span class="c1"&gt;#if cafe.key?('gps_coordinates') &amp;amp;&amp;amp; cafe['gps_coordinates'].key?('latitude') &amp;amp;&amp;amp; cafe['gps_coordinates'].key?('longitude')&lt;/span&gt;
      &lt;span class="c1"&gt;#  coordinates = [cafe['gps_coordinates']['latitude'], cafe['gps_coordinates']['longitude']]&lt;/span&gt;
      &lt;span class="c1"&gt;#  country = Geocoder.search(coordinates).first.country_code&lt;/span&gt;
      &lt;span class="c1"&gt;#  if country == desired_country&lt;/span&gt;
      &lt;span class="c1"&gt;#    filtered_cafes.push(cafe)&lt;/span&gt;
      &lt;span class="c1"&gt;#    puts "Added cafe: #{cafe['title']} at #{cafe['address']}"&lt;/span&gt;
      &lt;span class="c1"&gt;#  end&lt;/span&gt;
      &lt;span class="c1"&gt;#end&lt;/span&gt;
    &lt;span class="k"&gt;end&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="c1"&gt;# Shutdown the thread pool and wait for the tasks to complete&lt;/span&gt;
&lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;shutdown&lt;/span&gt;
&lt;span class="n"&gt;thread_pool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;wait_for_termination&lt;/span&gt;

&lt;span class="c1"&gt;# Reindex positions&lt;/span&gt;
&lt;span class="n"&gt;filtered_cafes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;filtered_cafes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;each_with_index&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;  &lt;span class="n"&gt;cafe&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"position"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;# Save filtered cafes to a JSON file&lt;/span&gt;
&lt;span class="no"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'filtered_results.json'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'w'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;pretty_generate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;filtered_cafes&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>seo</category>
      <category>localseo</category>
      <category>uule</category>
      <category>localserp</category>
    </item>
    <item>
      <title>LLMs vs SerpApi: A Comparative Analysis of the Webscraping Capabilities</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Thu, 26 Jan 2023 18:25:56 +0000</pubDate>
      <link>https://dev.to/serpapi/llms-vs-serpapi-a-comparative-analysis-of-the-webscraping-capabilities-36n9</link>
      <guid>https://dev.to/serpapi/llms-vs-serpapi-a-comparative-analysis-of-the-webscraping-capabilities-36n9</guid>
      <description>&lt;p&gt;In this article, we compare current open-source and paid Large Language Models, ChatGPT, and SerpApi on a technical task that requires a precise and fast solution. We make the comparison by extracting different parts of the visible text from Google Local Results.&lt;/p&gt;

&lt;p&gt;For more in-depth information on what Google Local Results look like, and how they might be served in a structured order, you can visit &lt;a href="https://serpapi.com/local-results" rel="noopener noreferrer"&gt;SerpApi's Google Local Results API Documentation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;For visual confirmation, here is how the feature looks:&lt;/p&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214835569-574f6d3d-a61d-4824-bc00-96d5ea3e00d1.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214835569-574f6d3d-a61d-4824-bc00-96d5ea3e00d1.png" width="800" height="276"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  Prompt to Test out
&lt;/h2&gt;

&lt;p&gt;Here is the structure of the prompt we will be testing throughout different models:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Example 1:
Lineified text of Google Local Result

Classifications of different parts of the text.

Example 2:
...

Example in Question:
Lineified text of Google Local Result

Follow the rules:
- Rule 1
- Rule 2

classification:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This prompt consists of examples to constrain and guide the classification process, some rules to keep the LLM from generating unwanted words and to point it at the example in question, and the example in question itself. The prompt ends with a classification to complete, which will hopefully be followed by the other classifications.&lt;/p&gt;

&lt;p&gt;Below is the example prompt engineered to do text classification of Google Local Results. I have redacted the phone numbers for privacy purposes. Note, however, that I used the actual phone numbers when prompting, to avoid the bias of a phone number being picked up only because it consists of zeros.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Example 1:
Houndstooth Coffee 4.6(824) · $$ · Coffee shop 401 Congress Ave. #100c · In Frost Bank Tower Closed ⋅ Opens 7AM Cozy hangout for carefully sourced brews

title: Houndstooth Coffee
rating: 4.6
number of reviews: 824
expensiveness: $$
type: Coffee shop
address: 401 Congress Ave. #100c  · In Frost Bank Tower
hours information: Opens 7AM
description: Cozy hangout for carefully sourced brews
service options: -
phone number: -
years in business: -

Example 2:
A.D.A. Auto Repair Center 4.8(26) · Auto repair shop 30+ years in business · Nicosia · 00 000000 Closes soon ⋅ 3PM "I strongly recommend this repair shop."

title: A.D.A. Auto Repair Center
rating: 4.8
number of reviews: 26
expensiveness: -
type: Auto repair shop
address: Nicosia
hours information: Closes soon ⋅ 3PM
years in business: 30+ years in business
description: "I strongly recommend this repair shop."
phone number: 00 000000
years in business: -

Example 3:
A to M MARKET 5.0(2) · General store Nicosia · Near Macro Süpermarket Open ⋅ Closes 2AM In-store shopping

title: A to M MARKET
rating:  5.0
number of reviews: 2
expensiveness: -
type: General store
address: Nicosia · Near Macro Süpermarket
hours information: Open ⋅ Closes 2AM
description: -
service options: In-store shopping
phone number: -
years in business: -

Example 4:
Expeditionary 4x4 5.0(1) · Auto parts store (000) 000-0000 Open 24 hours

title: Expeditionary 4x4
rating:  5.0
number of reviews: 1
expensiveness: -
type: Auto parts store
address: -
hours information: Open 24 hours
description: -
service options: -
phone number: (000) 000-0000
years in business: -

Example 5:

Hibbett Sports 4.2(51) · Sporting goods store 4.2 (51) Independence, KS Closed ⋅ Opens 10 AM · (000) 000-0000 Closed ⋅ Opens 10 AM Closed Athletic shoes &amp;amp; activewear

Follow the rules:
    - Give the answer from only after the text `Example 5:`.
    - Do not manipulate the text from Example 5.
    - Make sure to check before and after text for addendum texts.
    - Do not give any reason for your answer.
    - Do not give any regex example.
    - Do not give any code example.
    - Give only classifications.


title:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Open Source LLM Models
&lt;/h2&gt;

&lt;h3&gt;
  
  
  GPT-2
&lt;/h3&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214687431-f8abd1c9-46c0-4ef6-ba0f-76c0666879c2.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214687431-f8abd1c9-46c0-4ef6-ba0f-76c0666879c2.png" width="800" height="801"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;Context: GPT-2 has 124M trainable parameters.&lt;/p&gt;

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title: Hibbett Sports 6.4(817) · Fitness Club 1.0 (5) 911-4
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: The answer is flawed and contains extra, made-up information. It cannot be used out of the box for parsing that requires multiple classifications.&lt;/p&gt;

&lt;h3&gt;
  
  
  GPT-2 Large
&lt;/h3&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214689929-c76d136c-8cec-4ba3-9466-8f61e3cd9a20.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214689929-c76d136c-8cec-4ba3-9466-8f61e3cd9a20.png" width="800" height="801"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;Context: GPT-2 Large has 774M trainable parameters.&lt;/p&gt;

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title: Hibbett Sports 4.3
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: The answer is flawed and contains extra, made-up information. It cannot be used out of the box for parsing that requires multiple classifications.&lt;/p&gt;

&lt;h3&gt;
  
  
  GPT-2 XL
&lt;/h3&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214691067-7acd933f-ed95-4a82-86c2-28d2173c1508.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214691067-7acd933f-ed95-4a82-86c2-28d2173c1508.png" width="800" height="801"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;Context: GPT-2 XL has 1.5B trainable parameters.&lt;/p&gt;

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title: Hibbett Sports 4.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: The answer is flawed and contains extra, made-up information. It cannot be used out of the box for parsing that requires multiple classifications.&lt;/p&gt;

&lt;h3&gt;
  
  
  GPT-J 6B
&lt;/h3&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214691259-a231407b-8df2-4cb9-b543-40cc7c0a5e38.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214691259-a231407b-8df2-4cb9-b543-40cc7c0a5e38.png" width="800" height="879"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;Context: GPT-J 6B has 6B trainable parameters.&lt;/p&gt;

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title: Hibbett Sports
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;GPT-J 6B did not produce anything other than the requested classification, so I had to ask for multiple classes to be sure:&lt;/p&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214691633-d6515ea0-d80f-42f4-b3d6-9b0bdf2bb6a8.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214691633-d6515ea0-d80f-42f4-b3d6-9b0bdf2bb6a8.png" width="800" height="852"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;Second Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;address: 4.2 (51) Independence, KS
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: Although the model provided the correct answer for the title, it wasn't able to classify the address. It is not suitable for multiple classifications out of the box.&lt;/p&gt;

&lt;p&gt;I must confess I was surprised by the speed at which the answer was returned. With proper finetuning, this model has the potential to be used for technical purposes at a relatively cheap price and with fast response times.&lt;/p&gt;

&lt;h3&gt;
  
  
  BLOOM
&lt;/h3&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214693287-a10c5fd8-6d9f-446d-ab06-3a271ffedd7d.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214693287-a10c5fd8-6d9f-446d-ab06-3a271ffedd7d.png" width="800" height="715"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;Context: BLOOM has 176B trainable parameters. This model is a behemoth among open-source models. It supports 46 natural languages and 13 programming languages, and offers a switch between a greedy mode for technical purposes and a sampling mode for creative purposes.&lt;/p&gt;

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title: Hibbett Sports
rating: 4.2
number of reviews: 51
expensiveness: -
type: Sporting goods store
address: Independence, KS
hours information: Closed ⋅ Opens 10 AM
description: Athletic shoes &amp;amp; activewear
service options: -
phone number: (000) 000-0000
years in business: Closed ⋅ Opens 10 AM
years in business: -
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: All the answers are correct. This model is definitely suitable for technical engineering tasks.&lt;br&gt;
I must add my further observations about BLOOM. The downside is its size: as of now, around 330 GB. You also need at least 16 GB of RAM and some real free time on your hands to test this behemoth.&lt;br&gt;
They were kind enough to set up a managed endpoint for testing purposes. But testing it for specific technical purposes will most likely require laborious manual prompt creation and a big commitment to set it up.&lt;br&gt;
Another thing I have noticed about BLOOM is that if it creates noisy data at the end, you can always use multiple prompts. Somehow this prompt led the model to always show the right answer on the first line of the generated text.&lt;br&gt;
I have also tested cross-language prompts to see how it behaves. Sometimes it gives good results for English examples with Arabic classifications. But it would be wise to use same-language prompts for same-language classifications, just to be on the safe side.&lt;br&gt;
I have to admit, though, I am still surprised this is an open-source model. It is truly a behemoth by the standards of the industry.&lt;/p&gt;
&lt;h2&gt;
  
  
  Paid Models
&lt;/h2&gt;
&lt;h3&gt;
  
  
  GPT-3 Text-Ada-001
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx961s94u06rqnvqrp186.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx961s94u06rqnvqrp186.png" alt="image" width="800" height="285"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Context: This model is OpenAI's lowest-fee text model, and costs $0.0004 per 1000 tokens. Tokens are the chunks of strings and characters that models recognize. This model accepts up to 2048 tokens. The prompt we provided is 784 tokens (the cost is affected by the response token size as well).&lt;/p&gt;
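&lt;p&gt;The cost arithmetic above can be sketched in a few lines of Ruby. The helper below is our own illustration; the rate and the 784-token prompt size are the figures quoted in this section, while the ~100-token completion size is an assumption:&lt;/p&gt;

```ruby
# Estimate the cost of a single completion call at a flat per-1k-token rate.
# Both prompt and completion tokens are billed.
def estimated_cost(prompt_tokens:, completion_tokens:, rate_per_1k:)
  ((prompt_tokens + completion_tokens) / 1000.0) * rate_per_1k
end

# The 784-token prompt from this article plus an assumed ~100-token answer
# on Text-Ada-001 ($0.0004 per 1000 tokens):
cost = estimated_cost(prompt_tokens: 784, completion_tokens: 100, rate_per_1k: 0.0004)
puts format('$%.6f', cost) # a small fraction of a cent
```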

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title:  Hibbett Sports
rating: 4.2
number of reviews: 51
expensiveness: Sporting goods store 4.2 (51)
type: Sporting goods store
address: 4.2 (51) Independence, KS
hours information: Open 10am-9pm
hours information: (000) 000-0000
description: (000) 000-0000
service options: -
phone number: (000) 000-0000
years in business: -
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: This model, even though its generation temperature is quite low (good for consistent classification and minimal made-up text), failed to produce a healthy classification. I am not sure finetuning would put it in shape.&lt;/p&gt;

&lt;h3&gt;
  
  
  GPT-3 Text-Babbage-001
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ba6t7vniyevz3nk4rq6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ba6t7vniyevz3nk4rq6.png" alt="image" width="800" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Context: This model is a low-fee text model of OpenAI, better than Text-Ada-001, and costs $0.0005 per 1000 tokens. This model also accepts up to 2048 tokens.&lt;/p&gt;

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title:  Hibbett Sports
rating: 4.2
number of reviews: 51
expensiveness: $$
type: Sporting goods store
address: Independence, KS
hours information: Open 10 AM
description: (000) 000-0000 Closed ⋅ Opens 10 AM Closed
service options: -
phone number: -
years in business: -
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: This model missed many crucial parts of the text. A finetune of this model has the potential to be a good candidate for multiple classifications. However, finetuning costs $0.0006 per 1000 tokens, and using the finetuned version costs $0.0024 per 1000 tokens. This is not really feasible considering that the next paid model's base cost is lower.&lt;/p&gt;

&lt;h3&gt;
  
  
  GPT-3 Curie
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo12lsx0q6bmq1ban15u0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo12lsx0q6bmq1ban15u0.png" alt="image" width="800" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Context: This model is a text model of OpenAI, better than Text-Babbage-001, and costs $0.0020 per 1000 tokens. This model also accepts up to 2048 tokens. It carries the name of one of the most bada** scientists in history, the undisputed star of the Flying Universities.&lt;/p&gt;

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title:  Hibbett Sports
rating: 4.2
number of reviews: 51
expensiveness: -
type: Sporting goods store
address: Independence, KS
hours information: Closed ⋅ Opens 10 AM
description: Athletic shoes &amp;amp; activewear
service options: -
phone number: (000) 000-0000
years in business: -
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: All correct. Without a doubt, this is one of the good candidates for industrial scale. The downside is the token size: you have to carry the context and rules in each call, costing you more. The price of $0.0020 per 1000 tokens is also somewhat limiting at big scales. In addition, OpenAI's concurrency limit is 3000 requests per minute for paying users. This is another limitation, since the token limit also increases the number of requests you must make concurrently.&lt;br&gt;
I would suggest using this model to test what you already have in place. Its precision could help improve your existing solution and provide feedback on its state.&lt;/p&gt;
&lt;h3&gt;
  
  
  ChatGPT
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpkbtfz9mlnv29j2g0bvs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpkbtfz9mlnv29j2g0bvs.png" alt="image" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Context: I don't think this model even needs an introduction. The possibilities of this model seem quite endless. However, it is not open for commercial use yet, and we don't know for certain what the pricing or concurrency limits will be.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title:
Hibbett Sports
rating: 4.2
number of reviews: 51
expensiveness: -
type: Sporting goods store
address: Independence, KS
hours information: Closed ⋅ Opens 10 AM
description: Closed Athletic shoes &amp;amp; activewear
service options: -
phone number: (000) 000-000
years in business: -
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: It can handle basic classification tasks with ease. Moreover, its ability to carry a conversation makes it easy to reuse the contextual prompt. I have only one concern about this model, though, and that is persistence; it would be good to have a temperature option for ChatGPT as well. Since definite pricing and limitations haven't been announced, I cannot say anything certain about its industrial use.&lt;/p&gt;

&lt;h3&gt;
  
  
  GPT-3 Text-Da-Vinci-003
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv28ieslkez8g80729e1r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv28ieslkez8g80729e1r.png" alt="image" width="800" height="312"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Context: This is OpenAI's best text model so far, and it costs $0.0200 per 1000 tokens. Time will tell if naming it DaVinci was a good choice or if they should've saved that name for a superior model. But for sure this is the most advanced text model out there, and its precision in understanding the prompt more than makes up for the 4000-token limit. For many technical tasks, you don't even need finetuning.&lt;/p&gt;

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;title: Hibbett Sports
rating: 4.2
number of reviews: 51
expensiveness: -
type: Sporting goods store
address: Independence, KS
hours information: Closed ⋅ Opens 10 AM
phone number: (000) 000-0000
years in business: -
description: Athletic shoes &amp;amp; activewear
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conclusion: This is without a doubt a killer product. However, when you need to correlate different parts of the HTML to extract deeper information about a feature, the 4,000-token limit becomes a major blocker.&lt;br&gt;
Price is another blocker. Google shows 20 local results per page, so the estimated cost of parsing one page of visible elements is around $0.16. That is just the cost of parsing the features; proxies and server maintenance come on top. At scale, an in-house solution using Text-Da-Vinci-003 only becomes viable if you have a derivative product built on the Google Local Results feature that is worth more than the raw data, which in turn requires the solution to cut maintenance costs and keep error rates low enough that you don't have to hire new staff to look after it.&lt;br&gt;
That said, this is a very suitable model for benchmarking existing parsers at this time. Its semantic grasp of rules and ordering during generation makes it well suited to testing the efficiency of technical tasks.&lt;/p&gt;
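&lt;p&gt;For scale, the per-page estimate above can be reproduced with simple arithmetic. This is a sketch: the 400-token figure per listing is an assumption for illustration, and only the $0.02 per 1,000 tokens price comes from the observations above.&lt;/p&gt;

```python
# Back-of-the-envelope cost estimate for parsing Google Local results with
# text-davinci-003. TOKENS_PER_LISTING is an assumed prompt + completion size.
PRICE_PER_1K_TOKENS = 0.02
TOKENS_PER_LISTING = 400   # assumption for illustration
RESULTS_PER_PAGE = 20      # Google shows 20 local results per page

cost_per_listing = TOKENS_PER_LISTING / 1000 * PRICE_PER_1K_TOKENS
cost_per_page = cost_per_listing * RESULTS_PER_PAGE
print(f"${cost_per_page:.2f} per page")  # prints "$0.16 per page"
```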
&lt;h2&gt;
  
  
  SerpApi Google Local Results Parser
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214835238-5da57b65-b352-4085-803b-5a6f3b05065c.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F214835238-5da57b65-b352-4085-803b-5a6f3b05065c.png" width="800" height="609"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;Context: SerpApi is a real-time API for accessing Google search results. It handles proxies, solves captchas, and parses all rich structured data for you, so any preprocessing and parsing you would otherwise do with an LLM is already handled by SerpApi. Apart from that, SerpApi can easily parse data invisible to the eye, such as Place IDs or GPS coordinates.&lt;/p&gt;
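&lt;p&gt;A request of this kind can be sketched as a simple URL construction. The query values here are hypothetical, and the engine and tbm parameters mirror those visible in the place_id_search URL of the answer below:&lt;/p&gt;

```python
# Sketch of building a SerpApi Google Local results request URL.
# YOUR_API_KEY is a placeholder; replace it before sending a real request.
from urllib.parse import urlencode

params = {
    "engine": "google",
    "q": "Sports Shop",
    "tbm": "lcl",   # Google Local results tab
    "hl": "en",
    "gl": "us",
    "api_key": "YOUR_API_KEY",
}
url = "https://serpapi.com/search.json?" + urlencode(params)
```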

&lt;p&gt;Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"position"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Hibbett Sports"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"place_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"5028461961943251508"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"place_id_search"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://serpapi.com/search.json?device=desktop&amp;amp;engine=google&amp;amp;gl=us&amp;amp;google_domain=google.com&amp;amp;hl=en&amp;amp;ludocid=5028461961943251508&amp;amp;q=Sports+Shop&amp;amp;tbm=lcl"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"lsig"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"AB86z5USlyxNPnHhPC2QbT2VYbMc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"rating"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;4.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"reviews_original"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"(51)"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"reviews"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;51&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Sporting goods store"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"address"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Independence, KS"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"phone"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"(000) 000-0000"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"hours"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Open ⋅ Closes 8 PM"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Athletic shoes &amp;amp; activewear"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"thumbnail"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://serpapi.com/searches/63d036577bea270d77944e2d/images/581525d33cb22a3571e8ea251d12f5f818ab21466a5be2cfc24aa6e2784524e7.jpeg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"gps_coordinates"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"latitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;37.2243156&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"longitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;-95.7405395&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
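&lt;p&gt;Because the response arrives as plain JSON, pulling out the interesting fields is just dictionary access. A sketch over an abridged copy of the result above:&lt;/p&gt;

```python
# Sketch: extracting fields from one SerpApi local result. The dict below is
# an abridged copy of the example response shown above.
result = {
    "position": 1,
    "title": "Hibbett Sports",
    "rating": 4.2,
    "reviews": 51,
    "type": "Sporting goods store",
    "address": "Independence, KS",
    "gps_coordinates": {"latitude": 37.2243156, "longitude": -95.7405395},
}

summary = f"{result['title']} ({result['type']}) - {result['rating']} stars over {result['reviews']} reviews"
lat = result["gps_coordinates"]["latitude"]
```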



&lt;p&gt;Conclusion: SerpApi's biggest strength comes from its dedication to its mission. Because a team is dedicated to fixing errors and adding new features, you get the most out of what you expect. Not having to worry about proxies, server maintenance, and many other things is another bonus. It beats the LLM approach on price by a good margin, and it is fast. See the &lt;a href="https://serpapi.com/pricing" rel="noopener noreferrer"&gt;Pricing Page&lt;/a&gt; to compare prices, and the &lt;a href="https://serpapi.com/status" rel="noopener noreferrer"&gt;API Status Page&lt;/a&gt; for more information on speed. It also guarantees a throughput of up to 20% of your plan's searches per hour. You can also &lt;a href="https://serpapi.com/users/sign_up" rel="noopener noreferrer"&gt;Register to Claim Free Credits&lt;/a&gt;.&lt;br&gt;
All of this is enough to see that some technical tasks are still not feasible for LLMs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;I can see that companies with a tradition of implementing AI will have an advantage in the future. The capabilities of the tools shown above are truly impressive. Considering these are just the baby steps of AI, one has to be ready for the point at which AI can beat traditional methods, and that readiness is only possible through an existing culture around AI.&lt;/p&gt;

&lt;p&gt;To take an example: in my previous blog posts, I parsed the inline version of Google Local Results with a CNN-structured hybrid AI and compared the results. In the long run, that was a definite win for AI. But since there is a dedicated page for local results, and it is completely parsable via traditional methods, the advantages of AI are nullified until it becomes faster and cheaper. I expect prices to become more accessible for the industry and speed to improve, which will make the entire process more interesting and efficient.&lt;/p&gt;

&lt;p&gt;I am grateful to the reader for their time and attention. This writing contains only a portion of my observations. In my opinion, some fields, especially those requiring technical precision, are not viable for LLMs at this point.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Analyzing the Distribution of Problems in Competition Reviews</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Mon, 23 Jan 2023 12:23:48 +0000</pubDate>
      <link>https://dev.to/serpapi/analyzing-the-distribution-of-problems-in-competition-reviews-3hbd</link>
      <guid>https://dev.to/serpapi/analyzing-the-distribution-of-problems-in-competition-reviews-3hbd</guid>
      <description>&lt;h1&gt;
  
  
  Analyzing the Distribution of Problems in Competition's Reviews
&lt;/h1&gt;

&lt;p&gt;In the highly competitive world of mobile app development, it is essential to stay ahead of the curve in optimization and marketing by using the best app store optimization (ASO) tools. One key area of focus is understanding the common problems and complaints in competitors' app reviews, in order to improve upon and address those issues in your own app. With ASO tools like &lt;a href="https://serpapi.com/apple-app-store"&gt;SerpApi's Apple App Store Search Scraper API&lt;/a&gt; and &lt;a href="https://serpapi.com/apple-reviews"&gt;Apple App Store Reviews Scraper API&lt;/a&gt;, as well as Natural Language Processing models such as &lt;a href="https://huggingface.co/facebook/bart-large-mnli"&gt;facebook/bart-large-mnli&lt;/a&gt; and &lt;a href="https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad"&gt;bert-large-uncased-whole-word-masking-finetuned-squad&lt;/a&gt;, we can analyze the distribution of problems in the competition's reviews and gain valuable insights for our own app optimization. These insights can increase download and install rates, win conversions from your competition, inform your app marketing, boost your app's visibility, and support many other KPI objectives achievable with app analytics. You can find the full code at the end of this page.&lt;/p&gt;

&lt;h2&gt;
  
  
  Collecting Product IDs from the Apple App Store Search Scraper API
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_0NeSBYE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/73674035/214034067-cf3f84da-e421-4929-8acb-8de15d95178f.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://res.cloudinary.com/practicaldev/image/fetch/s--_0NeSBYE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/73674035/214034067-cf3f84da-e421-4929-8acb-8de15d95178f.png" width="880" height="453"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serpapi.com/apple-app-store"&gt;SerpApi’s Apple App Store Search Scraper API&lt;/a&gt; is a tool for gathering in-depth metrics, keyword rankings, app download counts, and many more details about an app in the iOS App Store or Mac App Store. It is an essential tool for keyword tracking and gives you good insight into how the App Store’s ranking algorithm works. The first step is to gather the product IDs of apps related to our search term, using the Apple App Store Search Scraper API. This can be done with the following code snippet:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;call_app_store_api&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;search_term&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;app_store_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"https://serpapi.com/search.json?api_key=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;&amp;amp;engine=apple_app_store&amp;amp;term=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;search_term&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;
  &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;app_store_url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;product_ids&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;organic_result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;organic_result&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()[&lt;/span&gt;&lt;span class="s"&gt;"organic_results"&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;product_ids&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function takes in a search term and API key and returns a list of product IDs for apps related to that search term. You may &lt;a href="https://serpapi.com/users/sign_up"&gt;register to claim free credits&lt;/a&gt; and use your API Key in the tool.&lt;/p&gt;

&lt;h2&gt;
  
  
  Collecting Reviews from the Apple App Store Reviews Scraper API
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QKmVChHZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/73674035/214034342-e1fcc86e-411a-45b9-b85e-58a1c11c1da1.png" class="article-body-image-wrapper"&gt;&lt;img alt="image" src="https://res.cloudinary.com/practicaldev/image/fetch/s--QKmVChHZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/73674035/214034342-e1fcc86e-411a-45b9-b85e-58a1c11c1da1.png" width="880" height="453"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once we have the product IDs of the apps related to our search term, we can use the &lt;a href="https://serpapi.com/apple-reviews"&gt;Apple App Store Reviews Scraper API&lt;/a&gt; to gather the reviews of each app with automation. This can be done with the following code snippet:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;call_apple_reviews_api&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;product_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;apple_reviews_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"https://serpapi.com/search.json?api_key=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;&amp;amp;engine=apple_reviews&amp;amp;product_id=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;product_id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;&amp;amp;sort=mostcritical"&lt;/span&gt;
  &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;apple_reviews_url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;review&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"text"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;review&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()[&lt;/span&gt;&lt;span class="s"&gt;"reviews"&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function takes in the product ID of an App Store listing and a SerpApi API key, and returns a list of reviews for that app, sorted from most critical to least critical with the help of the &lt;code&gt;sort=mostcritical&lt;/code&gt; parameter.&lt;/p&gt;

&lt;h2&gt;
  
  
  Collecting All Reviews for a Search Term
&lt;/h2&gt;

&lt;p&gt;To gather all reviews for apps related to our search term, we can use the following function in the code snippet:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_product_reviews&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;product_ids&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;asyncio&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;gather&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;call_apple_reviews_api&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;product_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;product_id&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;product_ids&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
  &lt;span class="n"&gt;reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;list&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chain&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;reviews&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function takes in a list of product IDs and a SerpApi API key, and returns a list of all reviews for those apps. Notice that it doesn’t keep track of which reviews belong to which app; it serves all reviews of the top apps ranked for a keyword relevant to your app, combined into a single list.&lt;/p&gt;
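&lt;p&gt;If you do need to keep the attribution, a small variation can zip the gathered results back to their product IDs. This is a sketch: fetch_reviews below is a hypothetical stand-in for the real API call so the example runs without network access.&lt;/p&gt;

```python
import asyncio

# Sketch: keep track of which reviews belong to which app by zipping the
# gathered results back to their product IDs. fetch_reviews is a hypothetical
# stand-in for call_apple_reviews_api, returning canned data.
async def fetch_reviews(product_id, api_key):
    return [f"review of {product_id}"]

async def get_reviews_by_product(product_ids, api_key):
    results = await asyncio.gather(
        *[fetch_reviews(pid, api_key) for pid in product_ids]
    )
    # asyncio.gather preserves input order, so zip restores the pairing
    return dict(zip(product_ids, results))

reviews_by_app = asyncio.run(get_reviews_by_product(["123", "456"], "demo-key"))
```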

&lt;h2&gt;
  
  
  Classifying Reviews as Positive or Negative
&lt;/h2&gt;

&lt;p&gt;Sometimes an app has only a small number of reviews, or a user gives a 1-star rating but writes a positive review. Such cases might prevent you from seeing the genuine negative reviews of your competition. With a natural language processing model, we can classify each review as positive or negative and return only the absolutely negative reviews. The following code snippet demonstrates this process:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_absolutely_negative_product_reviews&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="n"&gt;candidate_labels&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;'negative'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;'positive'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;review&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;review&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;candidate_labels&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
    &lt;span class="n"&gt;classification&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"labels"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;classification&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="s"&gt;"negative"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;review&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function takes in a classifier and a list of reviews and returns a list of only the absolutely negative reviews.&lt;/p&gt;
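&lt;p&gt;The classifier is expected to return a dict with parallel "labels" and "scores" lists, as the Hugging Face zero-shot classification pipeline does. The label-picking logic can be exercised in isolation with a stub (a sketch; the scores below are made up):&lt;/p&gt;

```python
# Stub mimicking the output shape of a zero-shot classifier, so the
# label-selection logic runs without downloading a model.
def stub_classifier(text, candidate_labels):
    # made-up scores: pretend anything mentioning "crash" is negative
    if "crash" in text.lower():
        scores = [0.95, 0.05]
    else:
        scores = [0.10, 0.90]
    return {"labels": candidate_labels, "scores": scores}

def pick_label(classifier, text, candidate_labels):
    # same selection logic as in the article's function
    result = classifier(text, candidate_labels)
    index = result["scores"].index(max(result["scores"]))
    return result["labels"][index]

label = pick_label(stub_classifier, "App crashes on launch", ["negative", "positive"])
```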

&lt;h2&gt;
  
  
  Summarizing Absolutely Negative Reviews
&lt;/h2&gt;

&lt;p&gt;To gain a clearer understanding of the specific problems mentioned in the absolutely negative reviews, we can use a question-answering pipeline to summarize the reviews. This is important to find the main concern of the user in a long review, or in a mixed review with positive and negative aspects. Imagine the following review:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Screenshots are nice. But they don’t represent the app.&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;What we want from this review is the concern about the developer's honesty, which is the latter sentence. So we ask the question-answering pipeline to suggest keywords for what the main problem in the review is. The following code snippet demonstrates this extraction process:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_summaries&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;qa_pipeline&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;summaries&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;qa_pipeline&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="s"&gt;'context'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="s"&gt;'question'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;'What is the problem?'&lt;/span&gt;
    &lt;span class="p"&gt;})[&lt;/span&gt;&lt;span class="s"&gt;"answer"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;summaries&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;summaries&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function takes in a question-answering pipeline and a list of absolutely negative reviews and returns a list of summaries for those reviews.&lt;/p&gt;

&lt;h2&gt;
  
  
  Classifying Summaries and Returning Distribution of Problems
&lt;/h2&gt;

&lt;p&gt;Understanding the ratio of problems in your competition’s reviews is a crucial part of the mechanics of app ranking, user acquisition, and overall store intelligence. To understand the distribution of problems mentioned in the summaries, we can classify them against a preset list of potential problems and return the distribution. The following code snippet demonstrates this process:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_problems_distribution&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;possible_problems&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;summaries&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;summary_dicts&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;summaries&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;possible_problems&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
    &lt;span class="n"&gt;classification&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"labels"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;summmary_dict&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="s"&gt;"summary"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"classification"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;classification&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="n"&gt;summary_dicts&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;summmary_dict&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="n"&gt;classifications&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;d&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;'classification'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;d&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;summary_dicts&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
  &lt;span class="n"&gt;counter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Counter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classifications&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;total&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;counter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;values&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
  &lt;span class="n"&gt;ratios&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;problem&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;counter&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;problem&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;total&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;problem&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;possible_problems&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="n"&gt;ratios&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;sorted&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ratios&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;items&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;lambda&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;reverse&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;ratios&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function takes in a classifier, a list of possible problems, and a list of summaries, and returns the distribution of problems mentioned in the summaries.&lt;/p&gt;
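&lt;p&gt;The counting and sorting at the end of the function boil down to a few lines of Counter arithmetic, which can be checked in isolation (the classification labels below are made up for illustration):&lt;/p&gt;

```python
from collections import Counter

# Made-up classification labels to exercise the ratio computation.
classifications = ["ads", "crashes", "ads", "pricing", "ads"]
possible_problems = ["ads", "crashes", "pricing", "login"]

counter = Counter(classifications)
total = sum(counter.values())
# each problem's share of all classified summaries, most common first
ratios = {problem: counter[problem] / total for problem in possible_problems}
ratios = dict(sorted(ratios.items(), key=lambda item: item[1], reverse=True))
```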

&lt;h2&gt;
  
  
  Showing Results
&lt;/h2&gt;

&lt;p&gt;Finally, the following snippet prints the resulting problem distribution:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"-----"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Summary of problems for search term: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;term&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;ratio&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;ratios&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
  &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ratio&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ratios&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;ratio&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;%"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here is an example output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Summary of problems &lt;span class="k"&gt;for &lt;/span&gt;search term: Coffee
unenjoyable: 29.28%
incompleteness: 27.03%
expensiveness: 12.61%
dishonesty: 9.01%
too many advertisements: 8.11%
inefficiency: 8.11%
bugs: 5.86%
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  How is App Store Optimization done?
&lt;/h2&gt;

&lt;p&gt;Let me walk through the different aspects of a possible ASO strategy using this tool. Our analysis shows that the top problems in the competition's negative reviews for the search term "Coffee" are unenjoyable experiences, incompleteness, and expensiveness. Here are some specific strategies to address these issues:&lt;/p&gt;

&lt;h3&gt;
  
  
  Unenjoyable experiences
&lt;/h3&gt;

&lt;p&gt;To address this problem, it is crucial to gather customer feedback and build a deep understanding of what is causing the negative experiences. This could involve conducting surveys, focus groups, or user testing to collect qualitative data. Running sentiment analysis on customer reviews can also reveal whether the root causes are technical, such as poor app performance, or stem from design issues. Once the root causes have been identified, develop and implement solutions, supported by app intelligence, to improve the app's overall enjoyment.&lt;/p&gt;
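&lt;p&gt;As a quick first pass before running a full model, even a crude keyword heuristic can flag obviously negative reviews for triage. The marker list and helper below are hypothetical stand-ins for illustration only, not a replacement for the zero-shot classifier used in this article:&lt;/p&gt;

```python
# Hypothetical keyword markers; a real pipeline should use a trained sentiment model
NEGATIVE_MARKERS = ("crash", "bug", "expensive", "too many ads", "refund", "slow")

def looks_negative(review: str) -> bool:
    """Flag a review if it contains any crude negative marker."""
    text = review.lower()
    return any(marker in text for marker in NEGATIVE_MARKERS)

reviews = [
    "The app keeps crashing on startup.",
    "Lovely interface and smooth brewing timers.",
    "Way too expensive for what it offers.",
]

flagged = [review for review in reviews if looks_negative(review)]
print(flagged)  # the first and third reviews
```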

&lt;h3&gt;
  
  
  Incompleteness
&lt;/h3&gt;

&lt;p&gt;To address this problem, it is essential to have a clear understanding of the features and functionality that customers expect from the app. This could involve conducting market research, surveying customers, tracking localization demands and bug reports, or gathering data on the features of competitors' apps. Once the desired features have been identified, prioritize and implement them, while continuously gathering feedback to ensure they meet customers' needs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Expensiveness
&lt;/h3&gt;

&lt;p&gt;To address this issue, it is important to review and optimize pricing strategies regularly. This could involve conducting market research to understand the prices of similar apps and identifying opportunities to offer competitive pricing. Additionally, implementing in-app purchase options or subscription models can provide alternative revenue streams and make the app more accessible to a wider range of customers.&lt;/p&gt;

&lt;p&gt;It is also essential to address the remaining problems, such as dishonesty, too many advertisements, inefficiency, and bugs, as these can likewise degrade the customer experience. A robust testing and quality assurance process, combined with regular monitoring of customer feedback, helps identify and resolve such issues in a timely manner.&lt;/p&gt;

&lt;p&gt;Overall, by gathering and analyzing customer feedback, understanding the competition, and implementing solutions to address the identified problems, it is possible to improve the overall performance of the app and outrank the competition in the search results.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In conclusion, outranking the competition in search results requires a comprehensive approach: gathering and analyzing customer feedback, understanding the competition, and implementing solutions to address the identified problems. Natural language processing models, combined with tools such as &lt;a href="https://serpapi.com/apple-app-store"&gt;SerpApi's Apple Search Scraper API&lt;/a&gt; and &lt;a href="https://serpapi.com/apple-reviews"&gt;SerpApi's Apple App Store Reviews Scraper API&lt;/a&gt;, sentiment analysis, question-answering pipelines, and A/B testing, make it possible to gain valuable insights into customer needs and preferences and to make data-driven decisions that fit your App Store Optimization strategy.&lt;/p&gt;

&lt;p&gt;It is equally important to stay up to date with the latest trends and technologies, and to continuously gather and analyze customer feedback to identify new areas for improvement: keyword research to find the best keywords to compete for, keyword optimization of your app description's metadata, Search Engine Optimization (SEO) for your app page on the App Store or Google Play Store (for Android apps), effective social media campaigns, and overall market intelligence.&lt;/p&gt;

&lt;p&gt;By following these strategies, you can improve your app's overall performance and outrank the competition in search results: find the best app title and keywords for the app description, optimize for organic downloads and installs, and gather crucial intelligence on the weak points of the top apps for a keyword from the outset, before entering the competition as a startup. I am grateful to the reader for their time and attention. You can find the full code below.&lt;/p&gt;

&lt;h2&gt;
  
  
  Full Code
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;collections&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Counter&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;transformers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;pipeline&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;itertools&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;chain&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;requests&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;asyncio&lt;/span&gt;

&lt;span class="c1"&gt;# Collect product_ids from Apple App Store Search Scraper API
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;call_app_store_api&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;search_term&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"-----"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Gathering product ids from SerpApi's Apple App Store Scraper API for search term: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;search_term&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;app_store_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"https://serpapi.com/search.json?api_key=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;&amp;amp;engine=apple_app_store&amp;amp;term=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;search_term&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;
  &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;app_store_url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;product_ids&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;organic_result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;organic_result&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()[&lt;/span&gt;&lt;span class="s"&gt;"organic_results"&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;product_ids&lt;/span&gt;

&lt;span class="c1"&gt;# Collect reviews of an individual app from most negative to least negative from Apple App Store Reviews Scraper API
&lt;/span&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;call_apple_reviews_api&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;product_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"-----"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Gathering Reviews from Apple App Store Reviews Scraper API for product id: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;product_id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;apple_reviews_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"https://serpapi.com/search.json?api_key=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;&amp;amp;engine=apple_reviews&amp;amp;product_id=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;product_id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;&amp;amp;sort=mostcritical"&lt;/span&gt;
  &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;apple_reviews_url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;review&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"text"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;review&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()[&lt;/span&gt;&lt;span class="s"&gt;"reviews"&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;

&lt;span class="c1"&gt;# Collect all reviews of apps in a search term from Apple App Store Reviews Scraper API
&lt;/span&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_product_reviews&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;product_ids&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;asyncio&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;gather&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;call_apple_reviews_api&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;product_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;product_id&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;product_ids&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
  &lt;span class="n"&gt;reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;list&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chain&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;reviews&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;

&lt;span class="c1"&gt;# Classify reviews as positive or negative and return absolutely negative reviews
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_absolutely_negative_product_reviews&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="n"&gt;candidate_labels&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;'negative'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;'positive'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;review&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"-----"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Classifying review: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;review&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;review&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;candidate_labels&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
    &lt;span class="n"&gt;classification&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"labels"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Classification: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;classification&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;classification&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="s"&gt;"negative"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;review&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;

&lt;span class="c1"&gt;# Summarize absolutely negative reviews
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_summaries&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;qa_pipeline&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;summaries&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"-----"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Summarizing review: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;qa_pipeline&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="s"&gt;'context'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="s"&gt;'question'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;'What is the problem?'&lt;/span&gt;
    &lt;span class="p"&gt;})[&lt;/span&gt;&lt;span class="s"&gt;"answer"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Summary: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;summaries&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;summaries&lt;/span&gt;

&lt;span class="c1"&gt;# Classify summaries and return distribution of problems
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_problems_distribution&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;possible_problems&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;summaries&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;summary_dicts&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;summaries&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"-----"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Classifying summary: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;possible_problems&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"scores"&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
    &lt;span class="n"&gt;classification&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"labels"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Classification: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;classification&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;summmary_dict&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="s"&gt;"summary"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"classification"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;classification&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="n"&gt;summary_dicts&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;summmary_dict&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="n"&gt;classifications&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;d&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;'classification'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;d&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;summary_dicts&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
  &lt;span class="n"&gt;counter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Counter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classifications&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;total&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;counter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;values&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
  &lt;span class="n"&gt;ratios&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;problem&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;counter&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;problem&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;total&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;problem&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;possible_problems&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="n"&gt;ratios&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;sorted&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ratios&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;items&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;lambda&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;reverse&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;ratios&lt;/span&gt;

&lt;span class="n"&gt;api_key&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"&amp;lt;SerpApi Key&amp;gt;"&lt;/span&gt;
&lt;span class="n"&gt;term&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"Coffee"&lt;/span&gt;
&lt;span class="n"&gt;possible_problems&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
  &lt;span class="s"&gt;"bugs"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="s"&gt;"too many advertisements"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="s"&gt;"inefficiency"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="s"&gt;"dishonesty"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="s"&gt;"unenjoyable"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="s"&gt;"incompleteness"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="s"&gt;"expensiveness"&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="n"&gt;classifier&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pipeline&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="s"&gt;"zero-shot-classification"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"facebook/bart-large-mnli"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;qa_pipeline&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pipeline&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="s"&gt;"question-answering"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"bert-large-uncased-whole-word-masking-finetuned-squad"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;product_ids&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;call_app_store_api&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;term&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;asyncio&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;get_product_reviews&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;product_ids&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;get_absolutely_negative_product_reviews&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;summaries&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;get_summaries&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;qa_pipeline&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;absolutely_negative_reviews&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;ratios&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;get_problems_distribution&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classifier&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;possible_problems&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;summaries&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Print results
&lt;/span&gt;&lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"-----"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Summary of problems for search term: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;term&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;ratio&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;ratios&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
  &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ratio&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;ratios&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;ratio&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;%"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>mobile</category>
      <category>machinelearning</category>
      <category>programming</category>
      <category>tooling</category>
    </item>
    <item>
      <title>Open Source AI Image Classifier with Automatic Dataset Creator</title>
      <dc:creator>Emirhan Akdeniz</dc:creator>
      <pubDate>Fri, 06 Jan 2023 19:08:18 +0000</pubDate>
      <link>https://dev.to/serpapi/open-source-ai-image-classifier-with-automatic-dataset-creator-30j8</link>
      <guid>https://dev.to/serpapi/open-source-ai-image-classifier-with-automatic-dataset-creator-30j8</guid>
      <description>&lt;p&gt;
  &lt;a href="https://github.com/serpapi/serapis-ai-image-classifier" rel="noopener noreferrer"&gt;Serapis AI Image Classifier&lt;/a&gt; is a program that allows you to automatically create image datasets using &lt;a href="https://serpapi.com/images-results" rel="noopener noreferrer"&gt;SerpApi's Google Images Scraper API&lt;/a&gt;, finetune a ResNet50 model, and classify images using the trained model.
&lt;br&gt;
&lt;/p&gt;




&lt;h3&gt;Installation&lt;/h3&gt;

&lt;p&gt;You can install these dependencies using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;Usage&lt;/h3&gt;

&lt;p&gt;You can use Serapis AI Image Classifier in one of the following three modes, or navigate it interactively in dialogue mode:&lt;/p&gt;




&lt;h3&gt;Create a dataset and train a new model from scratch&lt;/h3&gt;

&lt;p&gt;To create a dataset and train a new model from scratch, you will need to provide a list of labels and an image to use as a reference for the scraping process.&lt;br&gt;
A &lt;a href="https://serpapi.com/manage-api-key" rel="noopener noreferrer"&gt;SerpApi API Key&lt;/a&gt; is required in this mode so that the program can automatically scrape the images for your dataset using &lt;a href="https://serpapi.com/images-results" rel="noopener noreferrer"&gt;SerpApi's Google Images Scraper API&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You can &lt;a href="https://serpapi.com/users/sign_up" rel="noopener noreferrer"&gt;register to SerpApi to claim free credits&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python serapis.py &lt;span class="nt"&gt;--train&lt;/span&gt; &lt;span class="nt"&gt;--labels&lt;/span&gt; eagle, bull, lion, man &lt;span class="nt"&gt;--image-path&lt;/span&gt; lionimage.jpg &lt;span class="nt"&gt;--api-key&lt;/span&gt; &amp;lt;SerpApi-API-KEY&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
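Under the hood, dataset creation calls SerpApi's Google Images API once per label and downloads the returned images. A minimal sketch of such a request against SerpApi's public REST endpoint (the helper names and the use of plain urllib here are illustrative, not this repository's exact code):

```python
import json
import urllib.parse
import urllib.request

SEARCH_ENDPOINT = "https://serpapi.com/search.json"

def build_image_search_url(label, api_key):
    """Build a SerpApi Google Images request URL for one dataset label."""
    params = {"engine": "google_images", "q": label, "api_key": api_key}
    return SEARCH_ENDPOINT + "?" + urllib.parse.urlencode(params)

def fetch_image_links(label, api_key):
    """Return the original-image URLs from the 'images_results' array."""
    with urllib.request.urlopen(build_image_search_url(label, api_key)) as resp:
        data = json.load(resp)
    return [result["original"] for result in data.get("images_results", [])]
```

Each item in `images_results` carries an `original` field pointing at the full-size image, which is what a dataset builder would download and label.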






&lt;h3&gt;Use old scraped images and train a new model&lt;/h3&gt;

&lt;p&gt;To use old scraped images and train a new model, you will need to provide a list of labels and specify that you want to reuse old images with the &lt;code&gt;--use-old-images&lt;/code&gt; flag.&lt;br&gt;
You can also place your own images in the &lt;code&gt;images/&lt;/code&gt; folder and list them in &lt;code&gt;images/catalogue.csv&lt;/code&gt; to manually train models on your own dataset.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python serapis.py &lt;span class="nt"&gt;--train&lt;/span&gt; &lt;span class="nt"&gt;--labels&lt;/span&gt; eagle, bull, lion, man &lt;span class="nt"&gt;--use-old-images&lt;/span&gt; &lt;span class="nt"&gt;--image-path&lt;/span&gt; lionimage.jpg
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
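If you add your own images manually, they need to be registered in `images/catalogue.csv`. The exact column layout is defined by the repository; as a purely hypothetical illustration, a small helper that appends local files to such a catalogue could look like this (the two-column path/label layout and the function name are assumptions):

```python
import csv
import pathlib

def register_images(image_dir, catalogue_path, label):
    """Append every .jpg in image_dir to the catalogue under one label.

    The (path, label) two-column layout here is a hypothetical
    illustration; check the repository for the actual catalogue format.
    """
    paths = sorted(pathlib.Path(image_dir).glob("*.jpg"))
    with open(catalogue_path, "a", newline="") as handle:
        writer = csv.writer(handle)
        for path in paths:
            writer.writerow([str(path), label])
    return len(paths)
```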






&lt;h3&gt;Use a previously trained model&lt;/h3&gt;

&lt;p&gt;To use a previously trained model, you will need to provide the path to the trained model and an image to classify.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python serapis.py &lt;span class="nt"&gt;--model-path&lt;/span&gt; models/1.pth &lt;span class="nt"&gt;--image-path&lt;/span&gt; lionimage.jpg
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;Dialogue Mode&lt;/h3&gt;

&lt;p&gt;You can also navigate the program interactively by providing no arguments, which launches dialogue mode:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python serapis.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;The Output&lt;/h3&gt;

&lt;p&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F211052320-f53cc530-6047-4ac3-8177-53f3daa6371a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F73674035%2F211052320-f53cc530-6047-4ac3-8177-53f3daa6371a.png" alt="Classified Image of a Lion" width="683" height="400"&gt;&lt;/a&gt;
&lt;/p&gt;

&lt;p&gt;The output will give you the answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;The image contains Lion
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;p&gt;Optional Arguments:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-h, --help                    Help to Nagigate
--train                       Whether to train a new model
--model-path MODEL_PATH       Pretrained Model path you want to use
--dialogue                    Whether to use dialogue to navigate through the program
--use-old-images              Whether to use old images you have downloaded to train a new model
--api-key API_KEY             SerpApi API Key
--limit LIMIT                 Number of images you want to scrape at most for each label
--labels LABELS [LABELS ...]  Labels you want to use to train a new model
--image-path IMAGE_PATH       Path to the image you want to classify
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
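This flag set maps naturally onto Python's standard argparse module. A sketch of how such an interface is typically declared (the option names mirror the list above; the real serapis.py may differ in details):

```python
import argparse

def build_parser():
    """Declare a CLI mirroring the optional arguments listed above."""
    parser = argparse.ArgumentParser(description="Serapis AI Image Classifier")
    parser.add_argument("--train", action="store_true",
                        help="Whether to train a new model")
    parser.add_argument("--model-path",
                        help="Pretrained model path you want to use")
    parser.add_argument("--dialogue", action="store_true",
                        help="Navigate the program through dialogue")
    parser.add_argument("--use-old-images", action="store_true",
                        help="Reuse previously downloaded images for training")
    parser.add_argument("--api-key", help="SerpApi API Key")
    parser.add_argument("--limit", type=int,
                        help="Maximum number of images to scrape per label")
    parser.add_argument("--labels", nargs="+",
                        help="Labels used to train a new model")
    parser.add_argument("--image-path",
                        help="Path to the image you want to classify")
    return parser
```

With `nargs="+"`, the labels are passed as space-separated values and collected into a list, which is why `--labels` accepts multiple arguments in the usage examples.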

&lt;p&gt;Thank you for your time and attention. I hope this repository helps with your tasks, or gives you a sense of what is possible when you have SERP data at your disposal.&lt;/p&gt;

</description>
      <category>community</category>
    </item>
  </channel>
</rss>
