<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Emmanuel Uchenna</title>
    <description>The latest articles on DEV Community by Emmanuel Uchenna (@eunit).</description>
    <link>https://dev.to/eunit</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F118821%2F53eb618a-63d9-4d26-90a4-c431f2fb8a95.jpg</url>
      <title>DEV Community: Emmanuel Uchenna</title>
      <link>https://dev.to/eunit</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/eunit"/>
    <language>en</language>
    <item>
      <title>How to get Twitter (X) data for sentiment analysis</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Fri, 03 Apr 2026 14:40:48 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-get-twitter-x-data-for-sentiment-analysis-4c2h</link>
      <guid>https://dev.to/eunit/how-to-get-twitter-x-data-for-sentiment-analysis-4c2h</guid>
      <description>&lt;p&gt;When a product fails publicly, Twitter (X) is usually the first place it shows up. Not the press. Not review sites. Twitter. Within hours, a single product flaw can spawn thousands of tweets, drive a hashtag into the top ten trends in three countries, and permanently shift how a brand is perceived.&lt;/p&gt;

&lt;p&gt;That's why researchers, brand managers, journalists, and quantitative analysts all want the same thing: reliable Twitter data, fast, and at scale.&lt;/p&gt;

&lt;p&gt;The problem is that getting that data has never been harder or more expensive. Twitter's own Application Programming Interface (API) now costs anywhere from $100 to $5,000 per month, and even then, it doesn't give you clean, structured trend data. Do-it-yourself scrapers require proxies, constant maintenance, and a tolerance for early-morning debugging sessions when Twitter changes its UI without warning.&lt;/p&gt;

&lt;p&gt;This article is about a smarter path. You'll learn what kinds of Twitter data are most useful for sentiment analysis, how the main collection methods compare, and how to get structured, real-time Twitter trend data using the &lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper (PPE)&lt;/a&gt; on Apify - in under five minutes, with no setup.&lt;/p&gt;




&lt;h2&gt;
  
  
  What is Twitter sentiment analysis?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://huggingface.co/blog/sentiment-analysis-twitter" rel="noopener noreferrer"&gt;Sentiment analysis&lt;/a&gt; is the process of using natural language processing (NLP) or large language models (LLMs) to automatically classify the emotional tone behind a piece of text - typically as positive, negative, or neutral. When applied to Twitter, it turns raw social data into measurable public opinion signals.&lt;/p&gt;
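&lt;p&gt;As a toy illustration of that labeling step, here's a minimal lexicon-based classifier in Python. This is only a sketch - the word lists are invented for the example, and a real pipeline would use a trained NLP model or an LLM - but it shows the positive/negative/neutral output that every downstream step consumes:&lt;/p&gt;

```python
# Minimal lexicon-based sentiment labeling. The word lists are invented
# for illustration; production pipelines use trained models or LLMs.
POSITIVE = {"love", "great", "amazing", "works", "fixed"}
NEGATIVE = {"broken", "hate", "terrible", "refund", "fail"}

def label_sentiment(text: str) -> str:
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(label_sentiment("I love this, it works great"))  # positive
print(label_sentiment("still broken, want a refund"))  # negative
```

&lt;p&gt;However the label is produced, the contract is the same: text in, one of three labels out.&lt;/p&gt;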

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faty7738uyf9my2llqnpd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faty7738uyf9my2llqnpd.png" alt="Twitter (X) data for sentiment analysis" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Twitter (X) is uniquely valuable for this because people say what they actually think there. Unlike LinkedIn, where professional image management dominates, or product review sites, where feedback is structured and often incentivized, Twitter is where unfiltered reactions happen in real time. A product launch, a policy announcement, a celebrity controversy - they all land on Twitter first.&lt;/p&gt;

&lt;p&gt;Two distinct types of Twitter data matter for sentiment analysis:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Tweet content&lt;/strong&gt; - the actual text of individual tweets, replies, quote-tweets, and threads posted about a topic, keyword, or brand. This is what most sentiment analysis tutorials focus on. You scrape tweets, run them through a classifier, and obtain a sentiment label for each tweet.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Trending topic data&lt;/strong&gt; - the ranked list of hashtags and topics that are dominating conversation in a specific location at a specific point in time. This includes the trend rank, name, tweet count, and hourly history. I wrote an article on how to get &lt;a href="https://www.eunit.me/blog/how-to-scrape-twitter-x-trends-without-breaking-the-bank" rel="noopener noreferrer"&gt;Twitter (X) Trends Without Breaking the Bank&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Why trend data is an underrated input for sentiment analysis
&lt;/h2&gt;

&lt;p&gt;Here's a question most sentiment analysis tutorials skip: what if you don't know which topic to analyze yet?&lt;/p&gt;

&lt;p&gt;Tweet-level sentiment analysis works well when you already have a clear target - a brand name, a product, a specific hashtag. But trend data answers a different and often more valuable question: &lt;em&gt;What are people talking about right now, and how intense is that conversation?&lt;/em&gt; That intensity is itself a sentiment signal.&lt;/p&gt;

&lt;p&gt;Consider this: &lt;code&gt;#ProductX&lt;/code&gt; is trending in São Paulo with 45k tweets and rising. You don't need to read a single tweet to know something significant is happening. The trend data alone tell you that a large number of people are discussing the topic at the same time, which almost always reflects a strong emotional reaction - positive or negative.&lt;/p&gt;

&lt;p&gt;Here's what trend data actually reveals:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tweet count as a proxy for emotional intensity.&lt;/strong&gt; High tweet counts on a trending topic consistently correlate with strong emotional reactions. A hashtag with 80k tweets is generating far more sentiment "energy" than one with 2k. That count becomes your first filter for what's worth analyzing in depth.&lt;/p&gt;
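&lt;p&gt;One practical wrinkle: tweet counts arrive as display strings like &lt;code&gt;82K&lt;/code&gt; rather than integers. A small helper - a sketch, assuming only the K/M suffix formats shown in the sample output later in this article - converts them to numbers so you can filter by intensity:&lt;/p&gt;

```python
def parse_tweet_count(count: str) -> int:
    """Convert a display count like '82K' or '1.2M' to an integer."""
    count = count.strip().upper()
    multipliers = {"K": 1_000, "M": 1_000_000}
    if count and count[-1] in multipliers:
        return int(float(count[:-1]) * multipliers[count[-1]])
    return int(count.replace(",", ""))

# Keep only trends with enough volume to justify tweet-level analysis.
trends = [{"name": "#ProductX", "tweet_count": "82K"},
          {"name": "#QuietTopic", "tweet_count": "2K"}]
hot = [t["name"] for t in trends if parse_tweet_count(t["tweet_count"]) >= 10_000]
print(hot)  # ['#ProductX']
```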

&lt;p&gt;&lt;strong&gt;Hourly trend history reveals sentiment momentum.&lt;/strong&gt; The &lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper (PPE)&lt;/a&gt; captures trends across hourly snapshots, not just the current moment. A trend that appears at rank 40 with 5k tweets and climbs to rank 3 with 50k tweets in two hours signals something going viral. A trend that hits rank 1 and then rapidly falls often signals a controversy that burned hot and fast. These patterns are distinct and actionable.&lt;/p&gt;
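&lt;p&gt;That momentum check is straightforward once you have hourly snapshots. A sketch, assuming the &lt;code&gt;timeline&lt;/code&gt; shape shown in the sample output later in this article (snapshots ordered oldest to newest, with invented example data):&lt;/p&gt;

```python
def rank_history(timeline, name):
    """Pull one trend's rank from each hourly snapshot (None when absent)."""
    history = []
    for snapshot in timeline:
        by_name = {t["name"]: t["rank"] for t in snapshot["trends"]}
        history.append(by_name.get(name))
    return history

# Illustrative data: rank 40 -> 12 -> 3 across three hourly snapshots.
timeline = [
    {"trends": [{"name": "#ProductX", "rank": 40}]},
    {"trends": [{"name": "#ProductX", "rank": 12}]},
    {"trends": [{"name": "#ProductX", "rank": 3}]},
]
ranks = rank_history(timeline, "#ProductX")
climbing = all(a > b for a, b in zip(ranks, ranks[1:]))  # falling rank = rising trend
print(ranks, "climbing" if climbing else "not climbing")  # [40, 12, 3] climbing
```

&lt;p&gt;A real check would also handle snapshots where the trend hasn't entered the rankings yet (the &lt;code&gt;None&lt;/code&gt; values), but the pattern is the same: compare ranks across snapshots, not within one.&lt;/p&gt;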

&lt;p&gt;&lt;strong&gt;Location-based trends reveal geographic sentiment clusters.&lt;/strong&gt; A consumer product campaign trending in New York but not in London, or a political hashtag dominating Lagos but absent in Nairobi, gives you geographic precision that tweet-level analysis rarely provides without significant pre-processing.&lt;/p&gt;
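&lt;p&gt;Comparing trend lists across locations is then just set arithmetic - a sketch with invented trend names, not live data:&lt;/p&gt;

```python
# Trend-name sets for two markets (invented names for illustration).
new_york = {"#TechLayoffs", "#ProductX", "New York Giants"}
london = {"#ProductX", "#PremierLeague"}

shared = new_york & london   # topics trending in both markets
ny_only = new_york - london  # conversation concentrated in New York
print(shared)   # {'#ProductX'}
print(ny_only)
```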

&lt;p&gt;&lt;strong&gt;Practical use cases where trend data drives sentiment decisions:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Brand monitoring:&lt;/strong&gt; Catch a PR crisis developing before it peaks, while you still have time to respond.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Product launch tracking:&lt;/strong&gt; See whether your campaign is gaining traction geographically, not just globally.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Competitor intelligence:&lt;/strong&gt; Know the moment your competitor's brand starts trending - and with what intensity.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Academic research:&lt;/strong&gt; Track regional public opinion on political events, elections, or policy announcements.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Content strategy:&lt;/strong&gt; Identify topics with proven high engagement before committing creative resources to them.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So what does an effective data collection strategy actually look like?&lt;/p&gt;




&lt;h2&gt;
  
  
  Methods for getting Twitter (X) data
&lt;/h2&gt;

&lt;p&gt;There are three realistic options for collecting Twitter data at scale. Here's an honest comparison.&lt;/p&gt;

&lt;h3&gt;
  
  
  The official X API
&lt;/h3&gt;

&lt;p&gt;The X API is Twitter's own data access layer. It gives you structured, compliant access to tweets, user profiles, and some trend data via REST and streaming endpoints.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.eunit.me/blog/how-to-scrape-twitter-x-trends-without-breaking-the-bank" rel="noopener noreferrer"&gt;The problem is the pricing model.&lt;/a&gt; The free tier allows only 1,500 tweets per month, write-only - useless for sentiment analysis, which requires reading data. The Basic plan costs $100 per month for limited read access. The Pro plan, which provides meaningful volume, costs $5,000 per month. Enterprise pricing isn't even listed publicly.&lt;/p&gt;

&lt;p&gt;For trend data specifically, the API provides access to the &lt;code&gt;trends/place&lt;/code&gt; endpoint, but it's been progressively restricted alongside the broader API changes since 2023.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Best for:&lt;/strong&gt; Enterprise teams with compliance requirements who need guaranteed terms-of-service alignment and have the budget to justify it.&lt;/p&gt;

&lt;h3&gt;
  
  
  DIY web scraping
&lt;/h3&gt;

&lt;p&gt;You could write your own scraper using tools like Selenium or Playwright to automate a browser, log in to Twitter, and extract trend data from the web interface.&lt;/p&gt;

&lt;p&gt;The appeal is obvious: potentially free and fully flexible. The reality is less appealing. &lt;a href="https://www.eunit.me/blog/how-to-scrape-twitter-x-trends-without-breaking-the-bank#:~:text=Obfuscated%20and%20Dynamic%20DOM:%20X,flagged%20and%20throttled%20into%20oblivion." rel="noopener noreferrer"&gt;Twitter's front-end changes frequently, and every UI update risks breaking your scraper&lt;/a&gt;. To avoid IP blocks, you'll need rotating residential proxies, which can cost hundreds of dollars per month on their own. And all of that assumes you have the time and technical skill to build and maintain the scraper indefinitely.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Best for:&lt;/strong&gt; Developers who treat scraper maintenance as a learning exercise rather than a cost.&lt;/p&gt;

&lt;h3&gt;
  
  
  Ready-made Apify Actors
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://apify.com/store?fpr=eunit" rel="noopener noreferrer"&gt;Apify Actors&lt;/a&gt; are pre-built, cloud-hosted scrapers maintained by their developers and hosted on the &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify platform&lt;/a&gt;. They handle proxy management, authentication avoidance, output formatting, and cloud execution - so you don't have to.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper (PPE)&lt;/a&gt; is exactly this kind of tool. It was built specifically to extract real-time trending topic data from Twitter, supports granular location targeting across hundreds of countries and cities, and requires zero configuration beyond selecting your location.&lt;/p&gt;

&lt;p&gt;Because it operates on a &lt;strong&gt;pay-per-event&lt;/strong&gt; model (starting from $1.00 per 1,000 results), it costs a fraction of the Pro API plan - and delivers trend data the API doesn't even expose cleanly. You pay only for the successful scrapes you actually perform.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Best for:&lt;/strong&gt; Researchers, marketers, and developers who need reliable data without infrastructure overhead or expensive monthly subscriptions.&lt;/p&gt;

&lt;h3&gt;
  
  
  At a glance: method comparison
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;/th&gt;
&lt;th&gt;Official X API&lt;/th&gt;
&lt;th&gt;DIY scraping&lt;/th&gt;
&lt;th&gt;Apify Actor (PPE)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Cost&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$100–$5,000+/month&lt;/td&gt;
&lt;td&gt;"Free" + proxy costs&lt;/td&gt;
&lt;td&gt;From $1.00 per 1,000 results&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Trend data quality&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Limited, restricted since 2023&lt;/td&gt;
&lt;td&gt;Variable (breaks on UI changes)&lt;/td&gt;
&lt;td&gt;Structured, real-time, hourly history&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Setup complexity&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;High (API keys, OAuth, rate limits)&lt;/td&gt;
&lt;td&gt;Very high (proxies, parsing, maintenance)&lt;/td&gt;
&lt;td&gt;None (click to run)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Maintenance burden&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Low (API is stable until deprecated)&lt;/td&gt;
&lt;td&gt;High (breaks on any UI update)&lt;/td&gt;
&lt;td&gt;None (maintained by developer)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Location targeting&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Country-level only&lt;/td&gt;
&lt;td&gt;Depends on implementation&lt;/td&gt;
&lt;td&gt;300+ countries and cities&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Structured output&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Yes (JSON)&lt;/td&gt;
&lt;td&gt;Manual (requires parsing)&lt;/td&gt;
&lt;td&gt;Yes (JSON, CSV, Excel, Sheets)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;ToS compliance&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;✅ Fully compliant&lt;/td&gt;
&lt;td&gt;⚠️ Gray area&lt;/td&gt;
&lt;td&gt;✅ Public data only&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Best for&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Enterprise compliance-first teams&lt;/td&gt;
&lt;td&gt;Developers who enjoy scraper maintenance&lt;/td&gt;
&lt;td&gt;Researchers, marketers, and developers&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  How to use the Twitter (X) Trends Scraper (PPE) to get data for sentiment analysis
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper (PPE)&lt;/a&gt; extracts real-time trending topics from Twitter (X) for any country or city in its location list. It captures trend rank, hashtag names, tweet counts, direct search links, and a rolling hourly history - making it directly useful as a sentiment intensity tracker rather than just a static snapshot tool.&lt;/p&gt;

&lt;p&gt;Here's how to get your first dataset.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Access the Actor
&lt;/h3&gt;

&lt;p&gt;Go to &lt;a href="https://apify.com/store" rel="noopener noreferrer"&gt;Apify Store&lt;/a&gt; and search for "Twitter (X) Trends Scraper (PPE)", or navigate directly to &lt;code&gt;apify.com/eunit/x-twitter-trends-ppe&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;If you don't already have an Apify account, sign up for free. No credit card is required to start the trial. Once on the Actor page, click &lt;strong&gt;Try for free&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwfjh59yg8j3t3gi4wiqi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwfjh59yg8j3t3gi4wiqi.png" alt="Twitter (X) Trends Scraper (PPE) on Apify" width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Configure your input
&lt;/h3&gt;

&lt;p&gt;The input form has a single field: &lt;strong&gt;Country / City&lt;/strong&gt;. This simplicity is intentional - the Actor is designed to return complete trend data for your selected location in one run.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nev13l7tl04wzq3v1y8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nev13l7tl04wzq3v1y8.png" alt="Configure input for Twitter (X) Trends Scraper (PPE)" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The dropdown supports a wide range of locations, from top-level country selections like &lt;code&gt;Worldwide&lt;/code&gt; or &lt;code&gt;united-kingdom&lt;/code&gt;, to city-level targeting such as &lt;code&gt;united-states/new-york&lt;/code&gt;, &lt;code&gt;nigeria/lagos&lt;/code&gt;, or &lt;code&gt;india/mumbai&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;For sentiment analysis purposes, location selection strategy matters:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If you're monitoring a brand or product, start with &lt;code&gt;Worldwide&lt;/code&gt; to establish whether the topic is trending globally.&lt;/li&gt;
&lt;li&gt;Then re-run with specific market locations (e.g., &lt;code&gt;united-states&lt;/code&gt;, &lt;code&gt;united-kingdom&lt;/code&gt;, &lt;code&gt;australia&lt;/code&gt;) to understand where the conversation is concentrated.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The equivalent JSON input looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"country"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"united-states/new-york"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also pass this input directly via the API or Apify client libraries, which is covered in the automation section below.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Run and inspect the output
&lt;/h3&gt;

&lt;p&gt;Click &lt;strong&gt;Run&lt;/strong&gt;. The Actor typically completes in seconds and returns a structured JSON dataset. Here's what each key field contains:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;scraped_at&lt;/code&gt;&lt;/strong&gt; - the UTC timestamp of the run. Critical for tracking data freshness and building time-series datasets.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;country_input&lt;/code&gt;&lt;/strong&gt; - the location string you submitted, useful for multi-location runs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;timeline&lt;/code&gt;&lt;/strong&gt; - an array of hourly snapshots, each containing up to 50 ranked trends. Each trend includes its &lt;code&gt;rank&lt;/code&gt;, &lt;code&gt;name&lt;/code&gt; (the hashtag or topic), &lt;code&gt;tweet_count&lt;/code&gt;, and a direct search &lt;code&gt;link&lt;/code&gt;. This is the richest output for sentiment analysis.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;tag_cloud&lt;/code&gt;&lt;/strong&gt; - a broader set of trending terms for that location, useful for topic discovery.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;table_data&lt;/code&gt;&lt;/strong&gt; - the current top 50 ranked trends in a flat table format, ideal for quick dashboards.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A sample of the &lt;code&gt;timeline&lt;/code&gt; output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"scraped_at"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-04-03T10:30:00.000Z"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"country_input"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"united-states/new-york"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"timeline"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"datetime"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2 minutes ago"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"timestamp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1743672600"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"trends"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"#TechLayoffs"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=%23TechLayoffs"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"82K"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"New York Giants"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=%22New+York+Giants%22"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"14K"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 4: Export and use the data
&lt;/h3&gt;

&lt;p&gt;Once the run completes, Apify Console gives you several export options directly from the dataset view: JSON, CSV, Excel, XML, and a direct Google Sheets integration.&lt;/p&gt;

&lt;p&gt;For sentiment analysis pipelines, JSON and CSV are the most practical formats. JSON preserves the nested &lt;code&gt;timeline&lt;/code&gt; structure (essential if you're tracking momentum across hourly snapshots), while CSV flattens the data for easy import into tools like Excel, Google Sheets, or pandas.&lt;/p&gt;
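&lt;p&gt;If you start from the JSON export and need flat rows later, a small flattening pass does the same job as the CSV export - a sketch using the field names from the sample output above:&lt;/p&gt;

```python
import csv
import io

def flatten_timeline(record):
    """Produce one flat row per (snapshot, trend) pair."""
    rows = []
    for snapshot in record["timeline"]:
        for trend in snapshot["trends"]:
            rows.append({
                "scraped_at": record["scraped_at"],
                "location": record["country_input"],
                "snapshot_ts": snapshot["timestamp"],
                "rank": trend["rank"],
                "name": trend["name"],
                "tweet_count": trend["tweet_count"],
            })
    return rows

# Example record shaped like the sample output shown earlier.
record = {
    "scraped_at": "2026-04-03T10:30:00.000Z",
    "country_input": "united-states/new-york",
    "timeline": [{"timestamp": "1743672600",
                  "trends": [{"rank": 1, "name": "#TechLayoffs",
                              "tweet_count": "82K"}]}],
}
rows = flatten_timeline(record)
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```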

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Tip:&lt;/strong&gt; For a fully automated pipeline - where you scrape trends, then instantly analyze the sentiment of trending topics without writing any code - connect this Actor with &lt;a href="https://apify.com/dusan.vystrcil/llm-dataset-processor" rel="noopener noreferrer"&gt;LLM Dataset Processor&lt;/a&gt; directly in Apify Console. Point the LLM processor at your trend dataset, define a prompt like &lt;code&gt;"Classify the likely sentiment associated with this trending hashtag as Positive, Negative, or Neutral based on context"&lt;/code&gt;, and export the enriched results to Google Sheets automatically.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  For developers: automating the pipeline
&lt;/h2&gt;

&lt;p&gt;If you want to integrate live Twitter trend data into your own application or run it on a schedule, Apify provides client libraries for Python and JavaScript that make this straightforward.&lt;/p&gt;

&lt;p&gt;First, install the Python client:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;apify-client
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then run the Actor programmatically:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;apify_client&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ApifyClient&lt;/span&gt;

&lt;span class="c1"&gt;# Initialize the client with your Apify API token
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ApifyClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;YOUR_API_TOKEN&amp;gt;&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Define your input
&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;country&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;united-states/new-york&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;# Run the Actor and wait for completion
&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;eunit/x-twitter-trends-ppe&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;call&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Fetch and process the results
&lt;/span&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nf"&gt;iterate_items&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Trends for: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;country_input&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;trend&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;timeline&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;trends&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][:&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;  #&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rank&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; - &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;tweet_count&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; tweets&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For JavaScript/Node.js projects, the pattern is identical using the &lt;code&gt;apify-client&lt;/code&gt; npm package:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;ApifyClient&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;apify-client&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;ApifyClient&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;token&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;&amp;lt;YOUR_API_TOKEN&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;run&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;eunit/x-twitter-trends-ppe&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;call&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;country&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;united-states/new-york&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;run&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;listItems&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;item&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;timeline&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nx"&gt;trends&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;trend&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`#&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;rank&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; - &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;tweet_count&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Scheduling for longitudinal datasets.&lt;/strong&gt; The real power for sentiment analysis is in running the Actor on a regular cadence - say, every hour - and storing the results in a database or data warehouse. This builds a time-series dataset of trend rank and tweet count, which lets you track sentiment momentum over hours, days, or weeks. Apify's built-in scheduler handles this without external cron jobs or infrastructure.&lt;/p&gt;
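&lt;p&gt;As a minimal sketch of that storage step, here is how the Actor's output can be flattened into time-series rows (the SQLite schema and the &lt;code&gt;flatten_trends&lt;/code&gt; helper are illustrative assumptions; the &lt;code&gt;timeline[0].trends&lt;/code&gt; shape matches the dataset items used in the code examples above):&lt;/p&gt;

```python
import sqlite3

def flatten_trends(items, captured_at):
    """Flatten Actor output into (captured_at, rank, name, tweet_count) rows."""
    rows = []
    for item in items:
        for trend in item["timeline"][0]["trends"]:
            rows.append(
                (captured_at, trend["rank"], trend["name"], trend["tweet_count"])
            )
    return rows

# Sample payload shaped like the dataset items shown earlier in this article.
sample = [{"timeline": [{"trends": [
    {"rank": 1, "name": "#WorldCup", "tweet_count": 120000},
    {"rank": 2, "name": "#AI", "tweet_count": 85000},
]}]}]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trends (captured_at TEXT, rank INTEGER, name TEXT, tweet_count INTEGER)"
)
conn.executemany(
    "INSERT INTO trends VALUES (?, ?, ?, ?)",
    flatten_trends(sample, "2026-04-03T14:00:00Z"),
)
print(conn.execute("SELECT COUNT(*) FROM trends").fetchone()[0])  # 2
```

&lt;p&gt;In a real pipeline, you would replace the sample payload with the items returned by the Apify client and let the scheduler run the script (or the Actor itself) on your chosen cadence.&lt;/p&gt;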

&lt;p&gt;&lt;strong&gt;AI agent integration.&lt;/strong&gt; If you're building an AI agent that needs to be aware of current social trends, the Actor supports Model Context Protocol (MCP) server integration. Your agent can query live trend data and use it as context for downstream reasoning tasks.&lt;/p&gt;




&lt;h2&gt;
  
  
  Is it legal to scrape Twitter (X) trends?
&lt;/h2&gt;

&lt;p&gt;This is a fair question, and it deserves a direct answer.&lt;/p&gt;

&lt;p&gt;The Twitter (X) Trends Scraper (PPE) collects publicly available, factual data - specifically the list of trending hashtags and topics that Twitter displays to any unauthenticated visitor. It does not scrape private profiles, direct messages, content behind a login wall, or any personally identifiable information.&lt;/p&gt;

&lt;p&gt;Scraping publicly available factual data is generally considered legal in most jurisdictions. In the United States, the Ninth Circuit's ruling in the landmark &lt;em&gt;hiQ Labs v. LinkedIn&lt;/em&gt; case held that scraping publicly accessible data does not violate the Computer Fraud and Abuse Act (CFAA), though the parties ultimately settled. Trend data - hashtag names, rankings, and aggregate tweet counts - is about as factual and public as data gets.&lt;/p&gt;

&lt;p&gt;That said, you should always review the Terms of Service of any platform you interact with, and ensure your use case complies with applicable regulations. The General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) govern &lt;em&gt;personal data&lt;/em&gt;, not aggregated metadata like trend rankings and hashtag names. As long as you're collecting and using trend data responsibly, you're on solid ground.&lt;/p&gt;




&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;The hardest part of any Twitter sentiment analysis project isn't the analysis. It's getting clean, reliable, structured data to analyze. The official API is prohibitively expensive for most teams. DIY scrapers are fragile and require time-consuming maintenance. And most sentiment analysis tutorials skip past the data collection problem entirely.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper (PPE)&lt;/a&gt; fills that gap directly. It gives you real-time Twitter trend data - ranked hashtags, tweet counts, hourly history, and location-level granularity - in a structured format that plugs directly into your sentiment pipeline, whether that's a Python NLP script, an LLM-powered classifier, or a no-code Apify workflow.&lt;/p&gt;

&lt;p&gt;The free trial requires no credit card. A single run takes about ten seconds. And the data you get back can be the difference between catching a PR crisis before it peaks and reading about it in the Monday morning report.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Try the Twitter (X) Trends Scraper (PPE) on Apify&lt;/a&gt; and start turning Twitter's real-time noise into a clear, actionable signal.&lt;/p&gt;




&lt;h2&gt;
  
  
  Frequently asked questions
&lt;/h2&gt;

&lt;h3&gt;
  
  
  How do you get data from Twitter (X) for sentiment analysis?
&lt;/h3&gt;

&lt;p&gt;The three main options are the official X API (expensive, limited trend data), custom web scrapers (fragile, requires proxy infrastructure), and ready-made tools like Apify Actors. The &lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper (PPE)&lt;/a&gt; is the most practical option for sentiment-focused use cases, as it returns structured trend data, including tweet counts and hourly history with zero configuration.&lt;/p&gt;

&lt;h3&gt;
  
  
  What is the best tool for scraping Twitter trends in 2026?
&lt;/h3&gt;

&lt;p&gt;The &lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper (PPE)&lt;/a&gt; on Apify is the most practical option for most use cases. It supports granular location data for hundreds of countries and cities, requires no proxy setup, and uses a fair &lt;strong&gt;pay-per-event&lt;/strong&gt; pricing model (from $1.00 / 1,000 results) - a fraction of the cost of the official API.&lt;/p&gt;

&lt;h3&gt;
  
  
  Is it legal to scrape Twitter (X) for sentiment analysis data?
&lt;/h3&gt;

&lt;p&gt;Scraping publicly available trend data is generally legal. Trend rankings and hashtag names are factual, public-facing data, not personal information subject to GDPR or CCPA. However, you should always review the platform's Terms of Service and ensure your use case complies with local regulations.&lt;/p&gt;

&lt;h3&gt;
  
  
  How much does it cost to get Twitter data for sentiment analysis?
&lt;/h3&gt;

&lt;p&gt;The official X API starts at $100 per month for basic access and $5,000 per month for professional volume. The Twitter (X) Trends Scraper (PPE) on Apify operates on a pay-per-event basis, meaning you only pay for the data you successfully scrape. If you need tweet-level data in addition to trends, additional Actors on the Apify Store are available, each with its own pricing.&lt;/p&gt;

&lt;h3&gt;
  
  
  Can I get historical Twitter trend data?
&lt;/h3&gt;

&lt;p&gt;Twitter does not provide historical trend data through its official API. The Twitter (X) Trends Scraper (PPE) captures trends at the time of each run, including the hourly timeline snapshot that Twitter maintains for the current period. To build a historical dataset, schedule the Actor to run at regular intervals (e.g., hourly) and store each run's output in your own database. Apify Console also retains your run history and datasets for later access.&lt;/p&gt;

</description>
      <category>sentimentanalysis</category>
      <category>twitter</category>
      <category>datascraper</category>
    </item>
    <item>
      <title>How to scrape TikTok search results: A complete guide for 2026</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Sat, 21 Mar 2026 14:58:52 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-scrape-tiktok-search-results-a-complete-guide-for-2026-17nm</link>
      <guid>https://dev.to/eunit/how-to-scrape-tiktok-search-results-a-complete-guide-for-2026-17nm</guid>
      <description>&lt;p&gt;&lt;a href="https://www.tiktok.com/" rel="noopener noreferrer"&gt;TikTok&lt;/a&gt; &lt;a href="https://blog.alandotchin.com/tiktoks-global-influence-redefining-social-media-and-culture/#:~:text=virality%20and%20discoverability.-,Cultural%20and%20Generational%20Influence,find%20the%20same%20platform%20elsewhere." rel="noopener noreferrer"&gt;has transformed from a simple video-sharing app into a global cultural engine&lt;/a&gt;. With over 1 billion active users, the platform dictates what we listen to, what we buy, and how we communicate. For businesses, marketers, and researchers, TikTok is not just entertainment; it is a repository of real-time consumer sentiment and trend data.&lt;/p&gt;

&lt;p&gt;However, getting that data is a top challenge as &lt;a href="https://www.tiktok.com/business/en/blog/how-tiktok-protects-users-from-spam-and-inauthentic-behavior" rel="noopener noreferrer"&gt;TikTok employs some of the most sophisticated anti-scraping technologies in the world&lt;/a&gt;. In this article, we will explore why TikTok data is essential, the technical barriers to getting it, and how you can use the &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;TikTok Search Scraper&lt;/a&gt; to automate your data collection at scale.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq6ns2jq8uivj6cg69prk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq6ns2jq8uivj6cg69prk.png" alt="How to scrape TikTok search results: A complete guide for 2026 " width="600" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The importance of TikTok data in modern marketing
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.qualtrics.com/articles/strategy-research/market-research-guide/" rel="noopener noreferrer"&gt;In the past, brands relied on slow-moving market research surveys&lt;/a&gt;. Today, they rely on &lt;a href="https://oxgital.com/how-nigerian-brands-can-use-tiktok-for-viral-growth/" rel="noopener noreferrer"&gt;TikTok in 2026 as a primary growth engine&lt;/a&gt;, utilizing authentic, low-polish content, behind-the-scenes storytelling, and trending sounds to drive viral engagement. With over 5 billion daily users, it is crucial for building brand loyalty and reaching audiences who spend a significant amount of time on social media. The speed at which a trend moves from a single video to a global phenomenon is unprecedented.&lt;/p&gt;

&lt;h3&gt;
  
  
  Trend tracking and cultural shifts
&lt;/h3&gt;

&lt;p&gt;By scraping TikTok search results, you can identify emerging trends before they hit the mainstream. For example, a search for "sustainable fashion" might reveal specific local hashtags or challenges that are gaining momentum. Tracking these shifts allows you to adjust your content strategy in real-time, ensuring your brand remains relevant.&lt;/p&gt;

&lt;h3&gt;
  
  
  Audience behavior and sentiment analysis
&lt;/h3&gt;

&lt;p&gt;Comments, view counts, and video descriptions provide insight into how users feel about specific topics. Analyzing these data points helps you understand the language your target audience uses, their pain points, and what kind of content they find engaging.&lt;/p&gt;

&lt;h3&gt;
  
  
  Competitor and influencer benchmarking
&lt;/h3&gt;

&lt;p&gt;Who are the top creators in your niche? What music are they using? How often do they post? You can answer all these questions by scraping search results for relevant keywords. This data helps you benchmark your performance against competitors and identify potential influencers for future collaborations.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key aspects of brand reliance on TikTok
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Authenticity over production&lt;/strong&gt;: Brands, particularly in fashion and beauty, succeed by using smartphone cameras to show raw "before and after" transformations rather than polished ads.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Driving growth&lt;/strong&gt;: Small businesses in sectors like fashion, fintech, and beauty use TikTok as their primary marketing channel to reach new audiences.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Immersion &amp;amp; trends&lt;/strong&gt;: Successful brands like Sandro Paris and Yahoo create immersive, trend-driven content to stay relevant in pop culture.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Massive reach &amp;amp; user-generated content (UGC)&lt;/strong&gt;: Major brands like ESPN hold millions of followers, while companies increasingly rely on UGC to build trust.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Strategic outsourcing&lt;/strong&gt;: To keep up with the pace of trends, many companies outsource their TikTok creative to specialists.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Why traditional scraping methods fail
&lt;/h2&gt;

&lt;p&gt;If you have tried to build a simple Python script to scrape TikTok, you have likely encountered immediate blocks or endless CAPTCHAs. TikTok's security infrastructure is designed to distinguish between automated bots and real human users.&lt;/p&gt;

&lt;h3&gt;
  
  
  Dynamic content and JavaScript-heavy rendering
&lt;/h3&gt;

&lt;p&gt;TikTok is a single-page application (SPA) that heavily relies on JavaScript to load content. A basic HTTP request will return an empty shell of a page. To get actual video data, you need a browser-based scraper that can render the DOM and handle infinite scrolling.&lt;/p&gt;

&lt;h3&gt;
  
  
  Advanced bot detection
&lt;/h3&gt;

&lt;p&gt;TikTok &lt;a href="https://www.tiktok.com/privacy/blog/how-we-combat-scraping/en" rel="noopener noreferrer"&gt;checks for several technical signals&lt;/a&gt; to identify automation, among which include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;navigator.webdriver&lt;/code&gt;&lt;/strong&gt;: Many automated browsers leave this flag set to &lt;code&gt;true&lt;/code&gt;, making them easy to spot.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fingerprinting&lt;/strong&gt;: Systems analyze your browser's font list, canvas rendering, and hardware identifiers to see if they match a standard device.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Client Hints&lt;/strong&gt;: Modern anti-bot systems check &lt;code&gt;sec-ch-ua&lt;/code&gt; headers. If your User-Agent indicates that you are on Chrome 123 but your headers say something else, you will be blocked.&lt;/li&gt;
&lt;/ul&gt;
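&lt;p&gt;The Client Hints check can be illustrated with a toy consistency function: given a User-Agent string, derive a &lt;code&gt;sec-ch-ua&lt;/code&gt; value whose Chrome major version agrees with it. This is a simplified sketch (real Chrome also sends a greased placeholder brand), not a reproduction of any actual anti-bot system:&lt;/p&gt;

```python
import re

def chrome_client_hints(user_agent):
    """Derive a sec-ch-ua value consistent with a Chrome User-Agent string.

    Illustrative only: real Chrome additionally sends a randomized
    "greased" brand entry alongside these two.
    """
    match = re.search(r"Chrome/(\d+)", user_agent)
    if match is None:
        return None  # not a Chrome UA; no sec-ch-ua to derive
    major = match.group(1)
    return '"Chromium";v="{0}", "Google Chrome";v="{0}"'.format(major)

ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
      "(KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36")
print(chrome_client_hints(ua))
```

&lt;p&gt;An anti-bot system runs the comparison in the other direction: if the version baked into your headers disagrees with your User-Agent, the request is flagged.&lt;/p&gt;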

&lt;h2&gt;
  
  
  The solution: TikTok Search Scraper
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;TikTok Search Scraper&lt;/a&gt; is a professional-grade &lt;a href="https://www.apify.com?fpr=eunit" rel="noopener noreferrer"&gt;Apify Actor&lt;/a&gt; designed to bypass these barriers. It does not just "read" the page; it interacts with it as a human would.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key features for maximum resilience
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Human-like behavior&lt;/strong&gt;: The scraper mimics real user interaction. It does not just jump to the bottom of the page; it performs smooth, variable-speed scrolling. It even "pauses" to simulate a user watching a video, which significantly reduces the risk of triggering behavioral blocks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Browser profile rotation&lt;/strong&gt;: Every run can use a different browser profile. These profiles include curated User-Agents, realistic viewports, and synchronized Client Hints. This makes each request appear to be coming from a unique, legitimate device.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Stealth technology&lt;/strong&gt;: By leveraging the &lt;strong&gt;stealth plugin&lt;/strong&gt;, &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;the Actor&lt;/a&gt; masks its automated nature, overriding the &lt;code&gt;navigator.webdriver&lt;/code&gt; property and other common detection vectors.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rehydration script fallback&lt;/strong&gt;: TikTok stores its initial data in a JSON blob called a rehydration script. If the DOM structure changes or is temporarily hidden, &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;the Actor&lt;/a&gt; can pull data directly from this script, ensuring high reliability even during platform updates.&lt;/li&gt;
&lt;/ul&gt;
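&lt;p&gt;To make the rehydration fallback concrete, here is a toy version of the idea: parse the embedded JSON state straight out of the raw page source instead of the rendered DOM. The &lt;code&gt;APP_STATE&lt;/code&gt; tag id and payload shape below are simplified stand-ins, not TikTok's actual (undocumented) schema:&lt;/p&gt;

```python
import json
import re

def extract_rehydration_state(html):
    """Pull the embedded JSON state out of a page's rehydration script tag.

    The tag id "APP_STATE" and the payload shape are illustrative
    assumptions, not TikTok's real structure.
    """
    match = re.search(r'id="APP_STATE"[^>]*>(.*?)</script>', html, re.DOTALL)
    if match is None:
        return None  # no rehydration blob found; fall back to the DOM
    return json.loads(match.group(1))

page = ('<html><body><script id="APP_STATE" type="application/json">'
        '{"videos": [{"id": "123", "views": "11K"}]}'
        '</script></body></html>')
state = extract_rehydration_state(page)
print(state["videos"][0]["views"])  # 11K
```

&lt;p&gt;Because the blob is plain JSON in the initial HTML, this path keeps working even when class names or the visible DOM layout change.&lt;/p&gt;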

&lt;h2&gt;
  
  
  What data can you extract?
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;TikTok Search Scraper&lt;/a&gt; provides a comprehensive set of data points for every video result:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Field&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Video ID&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The unique identifier for the video (&lt;code&gt;id&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Video URL&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The direct link to the TikTok video (&lt;code&gt;url&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Description&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The full caption, including hashtags and mentions (&lt;code&gt;description&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;View Count&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The number of times the video has been played (&lt;code&gt;views&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Likes&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The number of likes the video has received (&lt;code&gt;likes&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Comments&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The number of comments on the video (&lt;code&gt;comments&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Shares&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The number of times the video has been shared (&lt;code&gt;shares&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Author Info&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Username (&lt;code&gt;uniqueId&lt;/code&gt;), nickname, avatar URL, verification status (&lt;code&gt;verified&lt;/code&gt;), and &lt;code&gt;followers&lt;/code&gt; count (&lt;code&gt;author&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Music Title&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The song or sound used in the video (&lt;code&gt;music&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Music Author&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The creator of the music or sound used in the video (&lt;code&gt;musicAuthor&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Hashtags&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;A list of hashtags included in the video (&lt;code&gt;hashtags&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Duration&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The length or duration of the video (&lt;code&gt;duration&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Timestamp&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The date and time the video was published (&lt;code&gt;timestamp&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Scraped At&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The exact date and time the data was extracted (&lt;code&gt;scrapedAt&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Source&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The method or source used to extract the data (&lt;code&gt;source&lt;/code&gt;).&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Here is a sample of the extracted JSON output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"7580436439.........."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.tiktok.com/@techuser112/video/7580436439.........."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Comment &lt;/span&gt;&lt;span class="se"&gt;\"&lt;/span&gt;&lt;span class="s2"&gt;tech&lt;/span&gt;&lt;span class="se"&gt;\"&lt;/span&gt;&lt;span class="s2"&gt; if you want a road map on how to get into any of these careers"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"views"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"11K"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"likes"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"comments"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"shares"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"author"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"uniqueId"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"techuser112"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"nickname"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"tech with Lesedi’s profile"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"avatar"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://p16-sign-sg.tiktokcdn.com/tos-alisg-avt-0068/9f813384bac2a......."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"verified"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"followers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="w"&gt;
 &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"music"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Lesedi with Adrian’s 7AM - Slowed + Reverb"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"musicAuthor"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"hashtags"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"technology"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"techjobs"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"techreview"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"techroadmap"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"duration"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2025-12-5"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"timestamp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"scrapedAt"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-03-21T10:47:10.773Z"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"source"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"dom"&lt;/span&gt;&lt;span class="w"&gt;
 &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  How to use the scraper on the Apify platform
&lt;/h2&gt;

&lt;p&gt;Running the &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;TikTok Search Scraper&lt;/a&gt; on the &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify platform&lt;/a&gt; is a simple four-step process. You do not need any coding knowledge to get started.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Set your queries
&lt;/h3&gt;

&lt;p&gt;Once you are on the &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;&lt;strong&gt;TikTok Search Scraper&lt;/strong&gt;&lt;/a&gt; page, navigate to the &lt;strong&gt;Input&lt;/strong&gt; tab. In the &lt;strong&gt;Search Queries&lt;/strong&gt; field, enter an array of search terms you want to analyze. For example: &lt;code&gt;["tech review", "laptop unboxing", "coding tips"]&lt;/code&gt;. The screenshot below shows what you need to get started:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fukvdqozoe4gd0dgztujr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fukvdqozoe4gd0dgztujr.png" alt="Tiktok Search Scraper" width="800" height="708"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Define your limits
&lt;/h3&gt;

&lt;p&gt;Use the &lt;strong&gt;maxItems&lt;/strong&gt; field to specify how many videos you want to collect per query. If you are conducting a deep dive, consider setting this to 500. For a quick pulse check, 20 results might be enough.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Configure proxy settings
&lt;/h3&gt;

&lt;p&gt;TikTok has strict regional and IP-based limits. To avoid being blocked, always use &lt;strong&gt;Apify Proxy&lt;/strong&gt;. We highly recommend using &lt;strong&gt;Residential&lt;/strong&gt; proxies, as they use IP addresses assigned to real households, making your scraping activity indistinguishable from regular traffic. Learn more about Apify Proxy in the &lt;a href="https://docs.apify.com/proxy" rel="noopener noreferrer"&gt;Apify documentation&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Run and export
&lt;/h3&gt;

&lt;p&gt;Click the &lt;strong&gt;Start&lt;/strong&gt; button at the bottom of the screen. You can monitor the progress in the &lt;strong&gt;Log&lt;/strong&gt; tab. Once the run is finished, go to the &lt;strong&gt;Storage&lt;/strong&gt; tab to export your data. You can choose from several formats, e.g., JSON, CSV, Excel, or XML.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foz1icv2jyftytqaj77me.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foz1icv2jyftytqaj77me.png" alt="Save and Start the TickTok Search Scraper" width="800" height="141"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Optimization and cost efficiency with Pay-Per-Event
&lt;/h2&gt;

&lt;p&gt;Many scrapers charge based on the computing power they consume, which can be unpredictable and vary significantly. The &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;TikTok Search Scraper&lt;/a&gt; uses a more transparent model: &lt;a href="https://docs.apify.com/platform/actors/publishing/monetize/pay-per-event" rel="noopener noreferrer"&gt;Pay-Per-Event (PPE)&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;With PPE, you are charged a fixed price of $0.29 for every 1,000 video results. This means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You only pay for the data you actually get.&lt;/li&gt;
&lt;li&gt;Your costs are perfectly predictable.&lt;/li&gt;
&lt;li&gt;You do not pay for the platform's overhead or idle time.&lt;/li&gt;
&lt;/ul&gt;
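&lt;p&gt;To see what this pricing means in practice, here is a small back-of-the-envelope calculation. The $0.29 per 1,000 results rate is the only given; the query list and per-query limit are just example values:&lt;/p&gt;

```javascript
// Rough worst-case cost estimate for the Pay-Per-Event model:
// $0.29 per 1,000 video results.
const PRICE_PER_THOUSAND_RESULTS = 0.29;

function estimateCost(queries, maxItemsPerQuery) {
  // Worst case: every query returns its full maxItems quota.
  const totalResults = queries.length * maxItemsPerQuery;
  return (totalResults / 1000) * PRICE_PER_THOUSAND_RESULTS;
}

// Three queries at up to 500 results each is 1,500 results in total.
console.log(estimateCost(["tech review", "laptop unboxing", "coding tips"], 500));
```

&lt;p&gt;Because you pay per result rather than per compute unit, you can budget a run before you start it.&lt;/p&gt;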

&lt;h2&gt;
  
  
  Use cases: Putting the data to work
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Influencer marketing
&lt;/h3&gt;

&lt;p&gt;Agencies use the scraper to find rising stars. By searching for niche keywords, they can identify creators who have high engagement but have not yet been "discovered" by major brands.&lt;/p&gt;

&lt;h3&gt;
  
  
  Market research
&lt;/h3&gt;

&lt;p&gt;Pharma and retail companies scrape TikTok to see how people are using their products. Are they finding new use cases? Are they complaining about a specific feature? This qualitative data is often more honest than what you get in a focus group.&lt;/p&gt;

&lt;h3&gt;
  
  
  Trend capitalization
&lt;/h3&gt;

&lt;p&gt;Media companies track trending hashtags to decide what topics to cover next. By seeing which videos are gaining the most views in a short period, they can jump on trends while they are still hot.&lt;/p&gt;

&lt;h2&gt;
  
  
  For developers: Automate with JavaScript
&lt;/h2&gt;

&lt;p&gt;If you are a developer looking to integrate the &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;TikTok Search Scraper&lt;/a&gt; directly into your projects, you can use the official &lt;a href="https://docs.apify.com/api/client/js.md" rel="noopener noreferrer"&gt;Apify API JavaScript/TypeScript client&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Installation
&lt;/h3&gt;

&lt;p&gt;First, install the &lt;code&gt;apify-client&lt;/code&gt; package in your project:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install &lt;/span&gt;apify-client
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Usage example
&lt;/h3&gt;

&lt;p&gt;To get started, initialize the client with your Apify API token and pass your search queries. Here is an example script showing how to scrape data for the "AI coding" query:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;ApifyClient&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;apify-client&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// Initialize the ApifyClient with your Apify API token&lt;/span&gt;
&lt;span class="c1"&gt;// Replace the '&amp;lt;YOUR_API_TOKEN&amp;gt;' with your token&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;ApifyClient&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
 &lt;span class="na"&gt;token&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;&amp;lt;YOUR_API_TOKEN&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="c1"&gt;// Prepare Actor input&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;queries&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;AI coding&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
 &lt;span class="p"&gt;],&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;maxItems&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;proxyConfiguration&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;useApifyProxy&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;apifyProxyGroups&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;RESIDENTIAL&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
 &lt;span class="p"&gt;]&lt;/span&gt;
 &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="c1"&gt;// Run the Actor and wait for it to finish&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;run&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;eunit/tiktok-search-scraper&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;call&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;input&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Fetch and print Actor results from the run's dataset (if any)&lt;/span&gt;
&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Results from dataset&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`💾 Check your data here: https://console.apify.com/storage/datasets/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;run&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;run&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;listItems&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dir&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="c1"&gt;// 📚 Want to learn more 📖? Go to → https://docs.apify.com/api/client/js/docs&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
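&lt;p&gt;Once the items are in memory, you will usually want to rank or filter them before analysis. The helper below is a minimal sketch: the &lt;code&gt;playCount&lt;/code&gt; field is a hypothetical name used for illustration, so inspect your own dataset and swap in the engagement field it actually contains:&lt;/p&gt;

```javascript
// Minimal post-processing sketch for scraped items.
// NOTE: "playCount" is a hypothetical field name used for illustration;
// check your dataset for the real keys before relying on it.
function topVideos(items, limit) {
  // Copy first so the original dataset array is left untouched,
  // then sort by play count in descending order.
  const sorted = items.slice().sort(function (a, b) {
    return (b.playCount || 0) - (a.playCount || 0);
  });
  return sorted.slice(0, limit);
}
```

&lt;p&gt;For example, &lt;code&gt;topVideos(items, 10)&lt;/code&gt; would give you the ten most-played results from a run.&lt;/p&gt;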



&lt;h2&gt;
  
  
  Legal and ethical considerations
&lt;/h2&gt;

&lt;p&gt;When scraping TikTok, it is important to act responsibly.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Respect privacy&lt;/strong&gt;: Only scrape publicly available data. Never attempt to access private accounts or unauthorized sections of the platform.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Avoid overloading&lt;/strong&gt;: Use the scraper's built-in delays and human-like interactions to avoid putting unnecessary strain on TikTok's servers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Comply with regulations&lt;/strong&gt;: Ensure your use of the data complies with local regulations, such as GDPR in the EU or CCPA in California.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;TikTok is a powerful source of real-time intelligence, but accessing it requires a professional approach. The &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;TikTok Search Scraper&lt;/a&gt; on the Apify platform provides the stealth, speed, and reliability you need to turn raw social media activity into actionable business insights.&lt;/p&gt;

&lt;p&gt;Start &lt;a href="https://apify.com/eunit/tiktok-search-scraper" rel="noopener noreferrer"&gt;your first run today&lt;/a&gt; and stop guessing about the latest trends.&lt;/p&gt;

&lt;p&gt;Happy automating!&lt;/p&gt;

</description>
      <category>tiktok</category>
      <category>datascraping</category>
      <category>trendanalysis</category>
      <category>webscraping</category>
    </item>
    <item>
      <title>How to actually start your fitness journey and stick to it (with the FitJourney.app platform)</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Sat, 07 Mar 2026 12:46:36 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-actually-start-your-fitness-journey-and-stick-to-it-with-the-fitjourney-platform-1oj4</link>
      <guid>https://dev.to/eunit/how-to-actually-start-your-fitness-journey-and-stick-to-it-with-the-fitjourney-platform-1oj4</guid>
      <description>&lt;p&gt;You have been thinking about it for months. Maybe even years. That expensive gym membership is gathering dust, your workout clothes still have the tags on them, and you have a growing, endless list of reasons why “tomorrow” will be the perfect day to begin. Here is the honest, unfiltered truth about health and wellness: starting a fitness journey is absolutely the hardest part, but it is also the most transformative decision you will ever make for your body and your mind.&lt;/p&gt;

&lt;p&gt;The first workout is always harder mentally than it is physically. We often build up seemingly insurmountable barriers in our minds. When you finally take that first step, you are not just beginning a workout routine - you are rewiring decades of self-doubt and breaking down established negative habits. However, taking that step can feel daunting because the modern fitness landscape is undeniably confusing. You have fitness influencers screaming conflicting advice, intimidating big box gyms full of confusing machines, and a dozen different apps you supposedly need to track your life.&lt;/p&gt;

&lt;p&gt;This article is designed to cut through the noise. We are going to deeply explore why starting your health and fitness journey feels so overwhelming, the science-backed benefits of committing to a routine, and how to find the lasting fitness motivation you need. Most importantly, we will introduce the &lt;a href="https://www.fitjourney.app/" rel="noopener noreferrer"&gt;FitJourney platform&lt;/a&gt;, an all-in-one ecosystem that drastically simplifies progress tracking, fitness challenges, and educational content so you can finally stop planning and start doing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7jw59kkk7xr5zw5acp83.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7jw59kkk7xr5zw5acp83.png" alt="FitJourney.app platform on various devices" width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why starting your fitness journey feels so overwhelming
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Obstacle 1: Analysis paralysis
&lt;/h3&gt;

&lt;p&gt;You have probably researched every workout program, compared various gym memberships in your local area, and read countless articles about the “best” way to get started. You spend hours watching videos on the optimal grip for a barbell deadlift, the exact macronutrient ratio for fat loss, or the perfect supplements to buy before you have even picked up a pair of dumbbells. Meanwhile, weeks turn into months of intense planning rather than taking any real action.&lt;/p&gt;

&lt;p&gt;The truth about how to start a fitness journey is simple: the best program is the one you will actually follow consistently. Analysis paralysis keeps you stuck on the starting line, waiting for perfect conditions that will never arrive. You don't need to know everything to begin. You need to take the first step and learn as you go.&lt;/p&gt;

&lt;h3&gt;
  
  
  Obstacle 2: Fragmentation
&lt;/h3&gt;

&lt;p&gt;Relying on scattered, disconnected tools completely kills your momentum. Right now, you might be using a basic notes app for &lt;a href="https://www.fitjourney.app/tracking-science" rel="noopener noreferrer"&gt;tracking your sets and reps&lt;/a&gt;, a separate application for logging your meals and counting calories, another website for learning how to perform exercises safely, and a group chat for accountability. This fragmentation makes the simple act of working out feel like a stressful, overwhelming part-time job.&lt;/p&gt;

&lt;p&gt;When your fitness tools are scattered across the wind, it is incredibly easy to drop the ball on one of them, which often leads to dropping the whole routine out of pure frustration. &lt;a href="https://www.fitjourney.app/" rel="noopener noreferrer"&gt;A unified, dedicated fitness tracking platform is essential&lt;/a&gt; for keeping everything organized and keeping you focused on the actual physical work.&lt;/p&gt;

&lt;h3&gt;
  
  
  Obstacle 3: Fear of judgment and unrealistic expectations
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://www.fitjourney.app/articles/overcoming-gym-anxiety" rel="noopener noreferrer"&gt;Walking into a new gym feels intimidating&lt;/a&gt;, especially when you are convinced that everyone else there has it perfectly figured out. Big box gyms can dramatically amplify this fear of judgment. You might worry that people are watching you struggle with a machine, or you might feel deeply embarrassed because you do not look like the fitness models prominently featured on your social media feed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;It is vital to remember that social media only shows the highlight reel, not the messy, sweaty middle of the journey&lt;/strong&gt;. Furthermore, unrealistic expectations set you up for immediate failure. You might expect to drop 10 pounds in your first week and get discouraged when the scale barely moves. Sustainable fitness is about showing up consistently, embracing the learning curve, and rejecting the pressure to be perfect from day one.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;It is vital to remember that social media only shows the highlight reel, not the messy, sweaty middle of the journey&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  The core benefits of committing to your fitness
&lt;/h2&gt;

&lt;p&gt;Drawing inspiration from real science and established medical research, it becomes clear that &lt;a href="https://www.mayoclinic.org/healthy-lifestyle/fitness/in-depth/exercise/art-20048389" rel="noopener noreferrer"&gt;committing to regular exercise is the ultimate game-changer for your entire life&lt;/a&gt;, far beyond just looking good in the mirror.&lt;/p&gt;

&lt;h3&gt;
  
  
  Physical health
&lt;/h3&gt;

&lt;p&gt;The most obvious benefits of a health and fitness journey are physical. Exercise helps control weight by helping you burn calories and build active, lean muscle mass. But beyond the aesthetics, &lt;a href="https://www.cdc.gov/physical-activity-basics/benefits/index.html" rel="noopener noreferrer"&gt;regular physical activity is a powerhouse for your internal, functional health&lt;/a&gt;. It vigorously combats chronic health conditions and diseases. Whether you are worried about hereditary heart disease or hoping to naturally prevent high blood pressure, being active boosts your high-density lipoprotein (HDL) cholesterol—often known as the “good” cholesterol—and decreases unhealthy triglycerides.&lt;/p&gt;

&lt;p&gt;This one-two punch keeps your blood flowing smoothly, which significantly lowers your risk of cardiovascular diseases, type 2 diabetes, metabolic syndrome, and even certain types of cancer. Additionally, as you build muscle strength and greatly boost your endurance, you will find you have significantly more physical energy to tackle your daily chores, play with your children, and live life without getting easily winded.&lt;/p&gt;

&lt;h3&gt;
  
  
  Mental well-being
&lt;/h3&gt;

&lt;p&gt;If you currently need a significant emotional lift or a reliable way to lower stress after a difficult, demanding day at work, a solid workout session can help tremendously. Physical activity stimulates various powerful brain chemicals, including endorphins and dopamine, that leave you feeling happier, significantly more relaxed, and noticeably less anxious. Exercise is a potent, natural antidepressant that helps you regulate cortisol levels in your body.&lt;/p&gt;

&lt;h3&gt;
  
  
  Longevity
&lt;/h3&gt;

&lt;p&gt;Fitness is a lifelong marathon, not a frantic 30-day sprint. It is the single best investment you can make in your future self and your long-term independence. By maintaining your muscle mass and increasing your bone density through dedicated resistance training, you actively prevent the risk of debilitating falls and arthritis as you age. Getting fit is not just about extending the total years of your lifespan; it is about drastically improving your health span. This ensures that your later years are vibrant, active, and fulfilling, free from the extreme physical limitations that plague so many older adults.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to define your "why" and set realistic goals
&lt;/h2&gt;

&lt;p&gt;Before nervously lacing up your running shoes or paying for an expensive gym membership, you absolutely need to understand the psychology of goal setting and intrinsic motivation.&lt;/p&gt;

&lt;h3&gt;
  
  
  Finding your motivation
&lt;/h3&gt;

&lt;p&gt;To continually sustain your health and fitness journey through the tough days, you must clearly define your underlying "why". Why do you genuinely want to get fit? Your motivation must be unique to you. If your initial goal is "to look good in a swimsuit," your motivation will likely plummet to zero the moment you put on a cozy sweater in the winter.&lt;/p&gt;

&lt;p&gt;You need to look beyond surface-level aesthetics to understand your deepest, most personal reasons for getting fit. Do you deeply want to have the boundless energy to play with your children without losing your breath? Do you desperately want to reverse a frightening pre-diabetes diagnosis so you can live a long, full life? Do you want to build the physical confidence to hike a steep mountain on your next vacation? Write down your deeply rooted reasons on paper, and revisit them whenever your motivation wavers. Knowing your true "why" gives your fitness journey unshakeable, relentless purpose.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;To continually sustain your health and fitness journey through the tough days, you must clearly define your underlying "why". Why do you genuinely want to get fit?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Goal setting
&lt;/h3&gt;

&lt;p&gt;Establishing clear, intelligent, and achievable goals is crucial for maintaining constant motivation. However, there is a massive conceptual difference between outcome goals and behavioral goals.&lt;/p&gt;

&lt;p&gt;An outcome goal might be something like "I want to lose 20 pounds by summer." While this is a perfectly fine desire, many complex biological factors influence the actual, literal outcome, which is outside of your direct control, such as your shifting metabolism, hormonal water weight fluctuations, and daily stress levels.&lt;/p&gt;

&lt;p&gt;A behavioral goal is an actionable step that is 100% within your complete control, such as "I will work out three times a week for exactly 45 minutes" or "I will eat a serving of green vegetables with every single dinner." Focus your mental energy entirely on mastering behavioral goals. Celebrate each small behavioral victory, because these repeatable actions serve as tangible evidence of your commitment and will inevitably lead to the desired long-term outcome.&lt;/p&gt;

&lt;h3&gt;
  
  
  Consistency over perfection
&lt;/h3&gt;

&lt;p&gt;Consistency beats perfection every single time. It is the golden, unbreakable rule of fitness. You don't need to execute perfectly to start a fitness journey, and you absolutely do not need to have a flawless diet to see amazing results. Life happens to all of us. Work gets crazy, the kids get sick, and natural motivation fluctuates from day to day.&lt;/p&gt;

&lt;p&gt;If you miss a scheduled workout, it does not mean you have failed. If you eat a piece of cake at a birthday party, your progress is not magically ruined forever. Remind yourself constantly that taking a brisk 15-minute walk on a bad, stressful day is infinitely better than doing nothing at all. The people who ultimately succeed in fitness are not those who never make mistakes; they are the ones who refuse ever to quit.&lt;/p&gt;

&lt;h2&gt;
  
  
  Enter the FitJourney platform: your all-in-one fitness ecosystem
&lt;/h2&gt;

&lt;p&gt;Now that you understand the mental challenges, the medical benefits, and the resilient mindset required, you need the right tools to execute your plan. This is where the &lt;a href="https://www.fitjourney.app/" rel="noopener noreferrer"&gt;FitJourney platform&lt;/a&gt; comes in.&lt;/p&gt;

&lt;h3&gt;
  
  
  Introducing the FitJourney.app platform
&lt;/h3&gt;

&lt;p&gt;We built the &lt;a href="https://www.fitjourney.app/" rel="noopener noreferrer"&gt;FitJourney platform&lt;/a&gt; to eliminate the guesswork from your health and wellness goals. Remember the obstacle of fragmentation we discussed earlier? The FitJourney platform solves it by providing a comprehensive, all-in-one ecosystem. You no longer need five different applications to manage your health. &lt;a href="https://www.fitjourney.app/about" rel="noopener noreferrer"&gt;Our integrated fitness tracking platform&lt;/a&gt; brings your scheduled workouts, &lt;a href="https://www.fitjourney.app/articles" rel="noopener noreferrer"&gt;your ongoing education&lt;/a&gt;, &lt;a href="https://www.fitjourney.app/tracking-science" rel="noopener noreferrer"&gt;your nutrition guidelines&lt;/a&gt;, and your progress data into a single, intuitive interface.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fchhv3py34mirvsq9daje.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fchhv3py34mirvsq9daje.png" alt="FitJourney.app dashboard" width="609" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Structured challenges
&lt;/h3&gt;

&lt;p&gt;One of the biggest hurdles for nervous beginners is bravely walking into a gym and thinking, "What in the world do I do now?" The FitJourney platform removes this paralyzing anxiety entirely through our structured, expertly designed programs. By navigating to the &lt;a href="https://www.fitjourney.app/app/challenges" rel="noopener noreferrer"&gt;&lt;strong&gt;Challenges&lt;/strong&gt;&lt;/a&gt; section of the platform, you can quickly enroll in fitness challenges purposefully tailored to your specific experience level.&lt;/p&gt;

&lt;p&gt;Whether you are stepping into a gym for the very first time or you are an intermediate lifter looking to break through a plateau, our challenges provide step-by-step guidance. You will know exactly which exercises to do, how many sets and reps to complete, and how to progress safely week after week.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0b98mhqcbntlftcfxk2m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0b98mhqcbntlftcfxk2m.png" alt="Challenges screen" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Education and practicals
&lt;/h3&gt;

&lt;p&gt;We firmly believe that education is the ultimate form of empowerment. We do not just want you to follow a generic workout routine; we want you to fully understand how your body works. The FitJourney platform features built-in, highly detailed educational articles and interactive practical exercises that teach you exactly how to train effectively and eat right for your goals.&lt;/p&gt;

&lt;p&gt;A practical is a hands-on learning module that tests your knowledge in a real-world setting. You will learn about the scientific principles of progressive overload, how to properly fuel your body before a workout, and how to prevent common injuries. This expanding well of knowledge ensures that you are building sustainable, intelligent habits that last a lifetime, rather than chasing a 30-day quick fix.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3vm8j8q8e66wp3y2z6s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3vm8j8q8e66wp3y2z6s.png" alt="Education and practicals on FitJourney.app" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  All-in-one tracking
&lt;/h3&gt;

&lt;p&gt;Seeing clear, tangible data is unequivocally the most reliable source of endless fitness motivation. When you can objectively see on a chart that you are getting stronger, running faster, and becoming healthier, everything falls into place wonderfully.&lt;/p&gt;

&lt;p&gt;The FitJourney platform makes tracking effortless and rewarding. Your personalized &lt;strong&gt;Dashboard&lt;/strong&gt; provides a bird's-eye view of your active streaks, upcoming workouts, and recent achievements. Meanwhile, &lt;strong&gt;Your Progress Tracker&lt;/strong&gt; lets you log the granular details of your journey, securely storing your daily workout data, body measurements, and performance milestones over time. Watching your line graph trend in the right direction provides the psychological reinforcement you need to keep showing up every week.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feg2gl9lu9ugjqaamgvd9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feg2gl9lu9ugjqaamgvd9.png" alt="Progress tracking on FitJourney.app" width="800" height="441"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  3 simple steps to kickstart your journey today
&lt;/h2&gt;

&lt;p&gt;Your fitness journey is waiting for you. Not next Monday. Not after the holidays. Not when life gets less busy. Today. &lt;a href="https://www.fitjourney.app/articles/motivation-vs-discipline" rel="noopener noreferrer"&gt;The hardest part is simply deciding that you are worth the time and investment&lt;/a&gt;. To help you overcome analysis paralysis, here are three simple, actionable steps you can take right now.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Get set up
&lt;/h3&gt;

&lt;p&gt;Your first action is to &lt;a href="https://www.fitjourney.app/sign-up" rel="noopener noreferrer"&gt;create your account on the FitJourney platform&lt;/a&gt;. The sign-up process is quick by design. Once you are logged in, take a few minutes to explore your personalized &lt;strong&gt;Dashboard&lt;/strong&gt;. Familiarize yourself with the interface, set your initial preferences, and complete your basic profile. Setting up your digital environment signals to your brain that you are beginning a new chapter.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F227zme3t16jwywndhkj8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F227zme3t16jwywndhkj8.png" alt="FitJourney Dashboard" width="777" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Join a challenge
&lt;/h3&gt;

&lt;p&gt;Once your account is set up, navigate to the &lt;a href="https://www.fitjourney.app/app/challenges" rel="noopener noreferrer"&gt;&lt;strong&gt;Challenges&lt;/strong&gt;&lt;/a&gt; tab in the main menu. Browse the available structured programs and select a beginner-friendly challenge that aligns with your current fitness level and goals. Don't overthink it; the goal right now is to build the habit of working out consistently.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0b98mhqcbntlftcfxk2m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0b98mhqcbntlftcfxk2m.png" alt="FitJourney Challenges" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Track and learn
&lt;/h3&gt;

&lt;p&gt;When it's time for your first workout, open the platform, follow the instructions in your chosen challenge, and give it your best effort. After finishing, jump into &lt;strong&gt;Your Progress Tracker&lt;/strong&gt; to log your completed workout. Celebrate this foundational win! Then take 10 minutes to read your first educational article or complete a practical on the platform.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;Starting your health and fitness journey is the hardest part of the entire process. It demands that you overcome self-doubt, push past analysis paralysis, and step into the unknown. But the rewards of daily energy, mental clarity, and long-term health are worth far more than the brief initial discomfort.&lt;/p&gt;

&lt;p&gt;To succeed, define your personal "why", prioritize consistency over perfection, and set behavioral goals that you can directly control. Leveraging the right tools also matters: by using the FitJourney platform, you eliminate the confusion of fragmented apps and replace it with a focused, all-in-one fitness ecosystem that tracks your progress and educates your mind.&lt;/p&gt;

&lt;p&gt;Ready to stop planning and start doing? &lt;a href="https://www.fitjourney.app/sign-up" rel="noopener noreferrer"&gt;Create your free account on the FitJourney platform&lt;/a&gt; and join your first challenge today. Your future self will thank you.&lt;/p&gt;

</description>
      <category>fitness</category>
      <category>motivation</category>
      <category>digital</category>
      <category>health</category>
    </item>
    <item>
      <title>How to scrape Telegram channels: The complete guide to extracting posts, media, and metadata</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Sat, 07 Feb 2026 12:28:27 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-scrape-telegram-channels-the-complete-guide-to-extracting-posts-media-and-metadata-2ff</link>
      <guid>https://dev.to/eunit/how-to-scrape-telegram-channels-the-complete-guide-to-extracting-posts-media-and-metadata-2ff</guid>
      <description>&lt;p&gt;Telegram has evolved into a global powerhouse for real-time information, &lt;a href="https://techcrunch.com/2025/03/19/telegram-founder-pavel-durov-says-app-now-has-1b-users-calls-whatsapp-a-cheap-watered-down-imitation/" rel="noopener noreferrer"&gt;hosting over 1 billion active users&lt;/a&gt; in 2026. From niche specialized groups to massive news broadcasts with millions of subscribers, the platform is a goldmine for market researchers, AI developers, and competitive intelligence analysts. However, the platform is notoriously difficult to scrape. Its data is locked behind a mobile-first interface and strictly protected by anti-bot measures.&lt;/p&gt;

&lt;p&gt;Manual data collection doesn't scale in today's era of automation. If you are trying to monitor trends across a hundred channels or build a dataset for a large language model, you need an automated solution. This is where &lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;Telegram Channels Scraper&lt;/a&gt; comes in.&lt;/p&gt;

&lt;p&gt;In this walkthrough, we will show you how to use this advanced &lt;a href="https://www.apify.com?fpr=eunit" rel="noopener noreferrer"&gt;Apify&lt;/a&gt; Actor to extract everything from message text and timestamps to direct media links and engagement metrics.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdjmpaetbbiablzwmy349.png" alt="Telegram Scraper" width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why scrape Telegram data in 2026?
&lt;/h2&gt;

&lt;p&gt;Telegram is more than just a messaging app. It is a news agency, a marketplace, and a community hub; &lt;a href="http://oreateai.com/blog/telegram-more-than-just-a-chat-app-its-a-global-digital-hub/95dc6ddff2c39a2aa5c3242dcb302897" rel="noopener noreferrer"&gt;it has even been called a "Global Digital Hub"&lt;/a&gt;. As of 2026, &lt;a href="https://magnetto.com/blog/telegram-marketing-ultimate-guide" rel="noopener noreferrer"&gt;Telegram has evolved far beyond a simple instant messaging tool&lt;/a&gt; into a multifaceted digital ecosystem: a "super-app" combining social networking, information, and commerce, popular above all for its algorithm-free broadcast capabilities.&lt;/p&gt;

&lt;p&gt;Organizations and individual developers are increasingly looking for ways to "unlock" this siloed data for several key reasons:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Market research and competitive intelligence
&lt;/h3&gt;

&lt;p&gt;In many industries, specifically crypto, finance, and tech, Telegram is where news breaks first. By scraping competitor channels, you can monitor their announcements, track which content resonates with their audience based on reaction counts, and analyze their growth through subscriber metrics.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Dataset building for AI and NLP
&lt;/h3&gt;

&lt;p&gt;Generative AI models are only as good as the data on which they are trained. Telegram provides high-quality, real-time human conversation in dozens of languages. It is an ideal source for training sentiment analysis models or building niche-specific knowledge bases for AI agents.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Content monitoring and automation
&lt;/h3&gt;

&lt;p&gt;Many businesses use Telegram scrapers to feed external dashboards or notification systems. For example, a financial analyst might scrape various "trading signals" channels to consolidate all data into a single, searchable database or a Slack notification bot.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Audience behavior and sentiment analysis
&lt;/h3&gt;

&lt;p&gt;Understanding which emojis (reactions) users use in response to specific news can provide deep qualitative insights into brand sentiment. Tracking views per post relative to the total subscriber count helps calculate actual engagement rates.&lt;/p&gt;
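&lt;p&gt;As a concrete sketch, the engagement calculations above can be done in a few lines of Python from the scraped fields (views, the reactions object, and the subscriber count). The numbers below are made up for illustration:&lt;/p&gt;

```python
def engagement_rate(views: int, subscribers: int) -> float:
    """Views-based reach: what fraction of subscribers saw the post."""
    if subscribers <= 0:
        raise ValueError("subscriber count must be positive")
    return views / subscribers

def reaction_rate(reactions: dict, views: int) -> float:
    """Reactions per view, summed across all emoji counts."""
    if views <= 0:
        return 0.0
    return sum(reactions.values()) / views

# Illustrative post: seen by 12,000 of 40,000 subscribers, 360 total reactions
post = {"views": 12_000, "reactions": {"👍": 250, "🔥": 90, "❤️": 20}}
print(engagement_rate(post["views"], 40_000))           # 0.3
print(reaction_rate(post["reactions"], post["views"]))  # 0.03
</p>```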

&lt;h2&gt;
  
  
  What is the Telegram Channels Scraper?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;Telegram Channels Scraper&lt;/a&gt; is a powerful, production-ready &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify Actor&lt;/a&gt; designed to extract structured content from public Telegram channels. Unlike many other tools that require you to manage your own proxies or deal with complex API authorizations, this Actor handles the technical heavy lifting for you.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key features of the tool
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;QR code self-authentication&lt;/strong&gt;: No developer API keys needed. Users authenticate with their own Telegram account by scanning a QR code directly in the console — just like linking a desktop device.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Session string persistence&lt;/strong&gt;: After scanning the QR code once, the Actor outputs a reusable session string. Paste it into the input on future runs to skip the login step entirely.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Full engagement metrics&lt;/strong&gt;: Extract views, reaction counts, and post links.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Media extraction&lt;/strong&gt;: Get direct download links for images, videos, and documents.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Developer-friendly&lt;/strong&gt;: Includes a built-in FastAPI server for real-time requests and support for Model Context Protocol (MCP) to integrate with AI agents like ChatGPT or Claude.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pay-per-result pricing&lt;/strong&gt;: The Actor uses Apify's Pay-Per-Event model — you only pay for what you actually get, making it highly cost-effective for both small and large-scale projects.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What data can you extract?
&lt;/h2&gt;

&lt;p&gt;The Actor produces a clean, structured JSON output (also exportable to CSV, Excel, or XML). Here is the data you can expect for every post:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Field&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Channel Name&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The display name of the Telegram channel.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Username&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The @handle of the channel.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Bio &amp;amp; Subs&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The channel's description and current subscriber count.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Post ID &amp;amp; Link&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The unique identifier and direct URL to the message.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Date&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;ISO-formatted timestamp of the post.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Text&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The full text content of the message.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Views&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Number of views at the time of scraping.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Reactions&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;An object containing counts for every emoji used (e.g., 👍, ❤️, 🔥).&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Media Type&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Whether the post is text-only, a photo, a video, or a document.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Media URLs&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Direct links to the media files.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
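&lt;p&gt;Put together, a single scraped post might look like the record below. All values are illustrative, and the exact key names may differ between Actor versions, so treat this as a sketch rather than the canonical schema:&lt;/p&gt;

```python
# Illustrative record only -- key names and values are assumptions,
# check your own run's dataset for the exact schema.
sample_post = {
    "channelName": "Example News",
    "username": "@example_channel",
    "subscribers": 40_000,
    "postId": 1024,
    "postLink": "https://t.me/example_channel/1024",
    "date": "2026-02-07T12:00:00+00:00",   # ISO-formatted timestamp
    "text": "Breaking: example headline",
    "views": 12_000,
    "reactions": {"👍": 250, "🔥": 90},     # emoji -> count
    "mediaType": "photo",
    "mediaUrls": ["https://example.com/media/1024.jpg"],
}

# Simple sanity checks before loading records into a downstream pipeline
assert sample_post["postLink"].endswith(str(sample_post["postId"]))
assert sample_post["views"] >= sum(sample_post["reactions"].values())
```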

&lt;h2&gt;
  
  
  Step-by-step walkthrough: Extracting your first dataset
&lt;/h2&gt;

&lt;p&gt;Getting started with Apify Actors is easy. You don't need to write a single line of code unless you want to integrate the results into your own application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbin8i1ipaxpwmjc4ucb0.png" alt="Telegram Channels Scraper" width="800" height="304"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Find the Actor on Apify Store
&lt;/h3&gt;

&lt;p&gt;Navigate to the &lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;Telegram Channels Scraper&lt;/a&gt; page. If you don't have an Apify account yet, you can &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;sign up for free&lt;/a&gt; in just a few seconds.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Configure the input
&lt;/h3&gt;

&lt;p&gt;Once you are on the Actor page, click on the &lt;strong&gt;Input&lt;/strong&gt; tab. You will see these fields:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Channels&lt;/strong&gt;: Enter a comma-separated list of the channel usernames or links you want to target. For example: &lt;code&gt;@svtvnews, https://t.me/cryptninjas&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Limit&lt;/strong&gt;: Decide how many posts you want to extract from each channel. If you need the most recent news, &lt;code&gt;10&lt;/code&gt; or &lt;code&gt;20&lt;/code&gt; is usually enough. For historical research, you can set this higher.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Days back&lt;/strong&gt;: Specify how far back in time you want to go. This ensures you don't waste resources on outdated content if you only care about the last 3 days.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Saved Session String&lt;/strong&gt; &lt;em&gt;(optional)&lt;/em&gt;: Paste a previously generated session token here to skip the QR login on repeat runs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;2FA Password&lt;/strong&gt; &lt;em&gt;(optional)&lt;/em&gt;: If your Telegram account has Two-Step Verification enabled, provide it here so the Actor can complete the login automatically.&lt;/li&gt;
&lt;/ol&gt;
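&lt;p&gt;If you'd rather start runs from code, the same input can be passed programmatically, for example with Apify's Python client. The key names below mirror the form fields above but are assumptions; confirm them against the Actor's input schema:&lt;/p&gt;

```python
# Input mirroring the form fields above; key names are assumptions --
# verify them against the Actor's input schema before relying on them.
run_input = {
    "channels": "@svtvnews, https://t.me/cryptninjas",
    "limit": 20,      # posts to extract per channel
    "daysBack": 3,    # skip anything older than 3 days
    "session": "",    # paste your saved session string here on repeat runs
}

# With the apify-client package installed, a run could then be started like this:
# from apify_client import ApifyClient
# client = ApifyClient("YOUR_APIFY_TOKEN")
# run = client.actor("eunit/telegram-channels-scraper").call(run_input=run_input)
# items = client.dataset(run["defaultDatasetId"]).list_items().items
```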

&lt;p&gt;&lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9f6qpfyu656anujrn758.png" alt="Telegram Channels Scraper configuration" width="800" height="598"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Authenticate with QR code (first run only)
&lt;/h3&gt;

&lt;p&gt;On your very first run — or when no session string is provided — the Actor will display a QR code directly in the &lt;strong&gt;Log&lt;/strong&gt; tab of the Apify Console. This is how you authenticate:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open &lt;strong&gt;Telegram&lt;/strong&gt; on your mobile device.&lt;/li&gt;
&lt;li&gt;Go to &lt;strong&gt;Settings → Devices → Link Desktop Device&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Scan the QR code shown in the log.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Once scanned, the Actor logs your session string prominently in the console and also saves it to the Key-Value Store under the key &lt;code&gt;SESSION_STRING&lt;/code&gt;. Copy it and paste it into the &lt;strong&gt;Saved Session String&lt;/strong&gt; input field — all subsequent runs will skip the QR prompt entirely.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2pijg4scky3gvbkojdpr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2pijg4scky3gvbkojdpr.png" alt="Telegram QR code from the Apify console" width="800" height="473"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; The QR code automatically refreshes every 30 seconds if not scanned. The Actor will regenerate it up to 10 times, so you have plenty of time.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Step 4: Run the Actor
&lt;/h3&gt;

&lt;p&gt;Click the &lt;strong&gt;Start&lt;/strong&gt; button at the bottom of the page. You will be redirected to the &lt;strong&gt;Runs&lt;/strong&gt; tab, where you can see the log in real-time. The Actor will connect to Telegram, navigate to the specified channels, and push the data into your default dataset.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5: Download your results
&lt;/h3&gt;

&lt;p&gt;When the run finishes, navigate to the &lt;strong&gt;Storage&lt;/strong&gt; tab. You can preview the data in a table format or download it in your preferred format. Most users find CSV or Excel to be the best formats for analysis, while developers typically prefer JSON.&lt;/p&gt;
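&lt;p&gt;Beyond the console UI, a finished run's dataset can also be downloaded over Apify's REST API ("Get dataset items", API v2). The helper below only assembles that download URL; the dataset ID is a placeholder you'd replace with the ID shown on your run's Storage tab:&lt;/p&gt;

```python
from urllib.parse import urlencode

def dataset_items_url(dataset_id: str, fmt: str = "csv", token: str = "") -> str:
    """Builds the Apify API v2 'Get dataset items' download URL."""
    query = {"format": fmt}
    if token:
        query["token"] = token  # required for private datasets
    return f"https://api.apify.com/v2/datasets/{dataset_id}/items?{urlencode(query)}"

# "abc123" is a placeholder dataset ID
print(dataset_items_url("abc123", fmt="json"))
# https://api.apify.com/v2/datasets/abc123/items?format=json
```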

&lt;p&gt;&lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7bmuy5ns0n5cdmaojr2.png" alt="Telegram Channels Scraper results" width="800" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Advanced features for power users
&lt;/h2&gt;

&lt;p&gt;This Actor is built for scale. If you are a developer or a business with high-volume needs, you can leverage its advanced capabilities.&lt;/p&gt;

&lt;h3&gt;
  
  
  Real-time scraping via API (Standby mode)
&lt;/h3&gt;

&lt;p&gt;The Actor supports &lt;strong&gt;Standby mode&lt;/strong&gt;. This means you can keep the Actor running in the background and send it HTTP requests whenever you need data. Instead of waiting for a container to boot up, you get an immediate response.&lt;/p&gt;

&lt;p&gt;You can trigger a scrape by sending a &lt;code&gt;POST&lt;/code&gt; request to the Actor's standard URL with the following JSON body:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"channels"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"@example_channel"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"limit"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"daysBack"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"session"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"YOUR_SESSION_STRING_HERE"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
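&lt;p&gt;From Python, that request could be assembled with nothing but the standard library. The standby URL below is a placeholder; copy the real one from your Actor's API tab, and note that the Bearer-token header follows the usual Apify API convention:&lt;/p&gt;

```python
import json
from urllib.request import Request, urlopen

# Placeholder -- copy the real standby URL from your Actor's API tab.
STANDBY_URL = "https://<your-standby-url>.apify.actor"

payload = {
    "channels": "@example_channel",
    "limit": 5,
    "daysBack": 1,
    "session": "YOUR_SESSION_STRING_HERE",
}

def build_request(url: str, body: dict, token: str) -> Request:
    """Builds the POST request with a JSON body and Bearer-token auth header."""
    return Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_request(STANDBY_URL, payload, "YOUR_APIFY_TOKEN")
# urlopen(req)  # uncomment to actually send the request
```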



&lt;h3&gt;
  
  
  AI integration with MCP
&lt;/h3&gt;

&lt;p&gt;In the era of AI agents, data extraction needs to be "AI-readable". This Actor supports the &lt;strong&gt;Model Context Protocol (MCP)&lt;/strong&gt;. This allows AI tools like Claude Desktop or custom GPTs to "call" the scraper as a tool. You can ask your AI, "What are the top 5 news posts from the last 24 hours on &lt;code&gt;@svtvnews&lt;/code&gt;?" and it will use the Actor to fetch the data and summarize it for you.&lt;/p&gt;

&lt;h3&gt;
  
  
  Pay-Per-Event pricing: you only pay for what you get
&lt;/h3&gt;

&lt;p&gt;The Actor uses Apify's &lt;strong&gt;Pay-Per-Event (PPE)&lt;/strong&gt; monetization model, meaning the cost is tied directly to your results:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Actor start&lt;/strong&gt;: A small flat fee is charged once per run to cover Telegram connection and authentication overhead.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Per result&lt;/strong&gt;: A charge is applied for each post successfully scraped and pushed to your dataset.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Critically, the Actor is built to &lt;strong&gt;respect your budget limit&lt;/strong&gt;. The moment you reach the maximum charge you have configured, the Actor stops scraping immediately — it will never do work beyond what you have agreed to pay for. This makes it far more cost-predictable than time-based or flat-rate pricing models.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why use Apify Actors instead of building your own scraper?
&lt;/h2&gt;

&lt;p&gt;You might be tempted to use a library like &lt;code&gt;Telethon&lt;/code&gt; or &lt;code&gt;Pyrogram&lt;/code&gt; to build your own script. While this works for hobby projects, it quickly breaks down in production:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Anti-blocking&lt;/strong&gt;: Telegram is very aggressive with "Too Many Requests" (FloodWait) errors. Apify Actors include logic to handle these delays gracefully.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;No infrastructure to manage&lt;/strong&gt;: You don't need to worry about Docker containers, server costs, or task scheduling. Apify Cloud manages everything.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Easy scaling&lt;/strong&gt;: Whether you want to scrape one channel or ten thousand, the platform scales the resources automatically.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Standardized data&lt;/strong&gt;: No need to spend hours cleaning messy HTML or inconsistent JSON responses.&lt;/li&gt;
&lt;/ul&gt;
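&lt;p&gt;To make the first point concrete, here is the kind of backoff logic a DIY script has to implement itself. The &lt;code&gt;FloodWait&lt;/code&gt; class below is a stand-in for a real library exception such as Telethon's &lt;code&gt;FloodWaitError&lt;/code&gt;, where Telegram tells you exactly how long to wait:&lt;/p&gt;

```python
import time

class FloodWait(Exception):
    """Stand-in for Telethon's FloodWaitError: carries the mandated delay."""
    def __init__(self, seconds: int):
        super().__init__(f"wait {seconds}s")
        self.seconds = seconds

def call_with_backoff(fn, max_retries: int = 3, sleep=time.sleep):
    """Retries fn, honouring the server-mandated delay on each FloodWait."""
    for attempt in range(max_retries):
        try:
            return fn()
        except FloodWait as e:
            if attempt == max_retries - 1:
                raise  # give up after the last retry
            sleep(e.seconds)

# Simulate a fetch that gets throttled twice before succeeding
calls = {"n": 0}
def fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise FloodWait(1)
    return "ok"

waits = []
result = call_with_backoff(fetch, sleep=waits.append)
print(result, waits)  # ok [1, 1]
```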

&lt;h2&gt;
  
  
  Use cases in action
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Scenario A: The crypto trader
&lt;/h3&gt;

&lt;p&gt;A crypto trader wants to monitor sentiment on the "CryptoNinjas" channel. By scraping the reaction counts for every post, they can determine if a specific announcement is being received positively (lots of 🔥 and 👍) or negatively (lots of 🤡 or 🤬). They utilize the Apify API to fetch this data hourly and feed it into a sentiment score dashboard.&lt;/p&gt;
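&lt;p&gt;A naive version of that sentiment score might weight each scraped reaction emoji and normalize by the total reaction count. The weights below are arbitrary illustrative choices, not anything the Actor itself computes:&lt;/p&gt;

```python
# Arbitrary illustrative weights: +1 positive, -1 negative, 0 for unknown emojis
WEIGHTS = {"🔥": 1, "👍": 1, "❤️": 1, "🤡": -1, "🤬": -1}

def sentiment_score(reactions: dict) -> float:
    """Weighted average in [-1, 1]; returns 0.0 for a post with no reactions."""
    total = sum(reactions.values())
    if total == 0:
        return 0.0
    weighted = sum(WEIGHTS.get(emoji, 0) * count for emoji, count in reactions.items())
    return weighted / total

print(sentiment_score({"🔥": 60, "👍": 20, "🤡": 20}))  # (60 + 20 - 20) / 100 = 0.6
```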

&lt;h3&gt;
  
  
  Scenario B: The news aggregator
&lt;/h3&gt;

&lt;p&gt;A media startup wants to create a consolidated feed of information from various independent news channels. They use the &lt;strong&gt;Media URLs&lt;/strong&gt; provided by the Actor to automatically download photos and videos, which they then republish on their own platform (respecting copyright and platform terms, of course).&lt;/p&gt;

&lt;h3&gt;
  
  
  Scenario C: The academic researcher
&lt;/h3&gt;

&lt;p&gt;A sociology student is studying the spread of specific political narratives. They use the scraper to download 5,000 posts across various community channels. Because the data is delivered in a clean CSV format, it can be uploaded directly into a data analysis tool such as NVivo or R for qualitative coding.&lt;/p&gt;

&lt;h2&gt;
  
  
  Best practices for responsible scraping
&lt;/h2&gt;

&lt;p&gt;When using any web scraping tool, it is important to be a good "web citizen":&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Respect rate limits&lt;/strong&gt;: Don't try to scrape millions of posts in a single second. Use the Actor's built-in limits to stay under the radar.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Public data only&lt;/strong&gt;: Only scrape data from public channels. This Actor is not designed to violate privacy or access private messages.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Adhere to GDPR and CCPA&lt;/strong&gt;: If you are collecting data that contains usernames, ensure you are handling this information in compliance with local data protection laws.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Keep your session string private&lt;/strong&gt;: Your session string grants full access to your Telegram account. Never share it publicly or commit it to source control.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;The days of manually scrolling through Telegram channels to find the information you need are now behind us. With &lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;Telegram Channels Scraper&lt;/a&gt;, you can turn the world's most popular messaging app into your own structured database — all authenticated with nothing more than a QR code scan.&lt;/p&gt;

&lt;p&gt;Whether you are building the next big AI model, monitoring your competitors, or just trying to stay organized, our tool provides the reliability and depth of data you need to succeed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://apify.com/eunit/telegram-channels-scraper" rel="noopener noreferrer"&gt;Try Telegram Channels Scraper on Apify today&lt;/a&gt; and get ahead of your competitors!&lt;/strong&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  FAQ
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Do I need a Telegram account to use this?&lt;/strong&gt;&lt;br&gt;
Yes, a Telegram account is required for authentication. However, the process is frictionless — you simply scan a QR code on your first run, just like linking Telegram Web. No API keys, no developer credentials.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Do I need to create a Telegram Developer App?&lt;/strong&gt;&lt;br&gt;
No. The Actor uses built-in Telegram credentials by default, so you don't need to register at &lt;code&gt;my.telegram.org&lt;/code&gt; or generate an &lt;code&gt;api_id&lt;/code&gt; / &lt;code&gt;api_hash&lt;/code&gt;. If you prefer to use your own credentials for extra reliability, you can optionally set &lt;code&gt;TELEGRAM_API_ID&lt;/code&gt; and &lt;code&gt;TELEGRAM_API_HASH&lt;/code&gt; as environment variables.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is the session string and how do I save it?&lt;/strong&gt;&lt;br&gt;
After your first QR code login, the Actor prints a session string in the log. This string lets you re-authenticate on future runs without scanning the QR code again. Copy it from the log (or retrieve it from the &lt;code&gt;SESSION_STRING&lt;/code&gt; key in your run's Key-Value Store) and paste it into the &lt;strong&gt;Saved Session String&lt;/strong&gt; input field.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Is it legal to scrape Telegram?&lt;/strong&gt;&lt;br&gt;
Extracting publicly available factual data is generally considered legal. However, you should always check Telegram's Terms of Service and consult with legal counsel if you are unsure about your specific use case.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Can I export the data to my CRM?&lt;/strong&gt;&lt;br&gt;
Yes. Since Apify provides a robust API and integrations with platforms like Zapier and Make, you can automatically send your scraped Telegram data to Salesforce, HubSpot, or even a simple Google Sheet.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What formats do you support?&lt;/strong&gt;&lt;br&gt;
The Actor supports JSON, CSV, Excel (XLSX), XML, and HTML table formats.&lt;/p&gt;

</description>
      <category>telegram</category>
      <category>scraping</category>
      <category>intelligence</category>
      <category>ai</category>
    </item>
    <item>
      <title>How to Build a Global Internationalization (I18n) App with Next.js and AI</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Sat, 10 Jan 2026 13:16:17 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-build-a-global-internationalization-i18n-app-with-nextjs-and-ai-1ejl</link>
      <guid>https://dev.to/eunit/how-to-build-a-global-internationalization-i18n-app-with-nextjs-and-ai-1ejl</guid>
      <description>&lt;p&gt;Before the proliferation of AI, localization was often a nightmare. You had to hire translators, manage extensive JSON files for every language, and pray that your context didn't get lost in translation. If you wanted to translate an &lt;em&gt;entire website&lt;/em&gt; on the fly? Forget about it. That was a job for enterprise giants with bottomless budgets.&lt;/p&gt;

&lt;p&gt;In this article, we will build &lt;a href="https://lingo-app-azure.vercel.app/" rel="noopener noreferrer"&gt;&lt;strong&gt;Lingo.app&lt;/strong&gt;&lt;/a&gt; from scratch. It is a stunning, production-ready web application that can scrape any website and instantly localize its content into 83+ languages. We will do this using the power of &lt;strong&gt;Next.js&lt;/strong&gt;, &lt;strong&gt;Tailwind CSS&lt;/strong&gt;, and the &lt;a href="https://apify.com/eunit/ai-website-content-localizer-scraper" rel="noopener noreferrer"&gt;&lt;strong&gt;AI Website Content Localizer &amp;amp; Scraper&lt;/strong&gt;&lt;/a&gt; Actor on the &lt;a href="https://apify.com/store?fpr=eunit" rel="noopener noreferrer"&gt;Apify platform&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;We wrote this article for developers who want to ship global apps without the headache. Intermediate developers can likely speed through the setup, but we will explain every step clearly.&lt;/p&gt;

&lt;p&gt;By the end of this article, you will have a fully functional app that handles website scraping, context-aware AI translation, and UI state—all in a scalable Next.js architecture.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: This article was originally published in &lt;a href="https://www.eunit.me/blog/build-a-global-internationalization-i18n-app-with-nextjs-and-ai" rel="noopener noreferrer"&gt;Eunit.me&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;If you are excited, then let’s dive in.&lt;/p&gt;

&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Check out the complete repository on GitHub: &lt;a href="https://github.com/Eunit99/lingo.app" rel="noopener noreferrer"&gt;https://github.com/Eunit99/lingo.app&lt;/a&gt; 📖&lt;/li&gt;
&lt;li&gt;Check out the deployed version of Lingo.app live: &lt;a href="https://lingo-app-azure.vercel.app/" rel="noopener noreferrer"&gt;https://lingo-app-azure.vercel.app/&lt;/a&gt; 🚀&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdkbjzirewz318vflm3zj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdkbjzirewz318vflm3zj.png" alt="lingo.app screenshot" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Architecture
&lt;/h2&gt;

&lt;p&gt;We are keeping it modern and simple:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Frontend&lt;/strong&gt;: Next.js 15 (App Router) for the framework.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Styling&lt;/strong&gt;: Tailwind CSS v4 for that premium, dark-mode aesthetic.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Backend&lt;/strong&gt;: A Next.js API Route acting as a secure proxy.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Intelligence&lt;/strong&gt;: The &lt;strong&gt;AI Website Content Localizer &amp;amp; Scraper&lt;/strong&gt; Actor on Apify, which handles the headless browser scraping (Playwright) and AI translation (Lingo.dev).&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  File tree
&lt;/h3&gt;

&lt;p&gt;Below is the file structure of the app:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;|   .gitignore
|   next-env.d.ts
|   eslint.config.mjs
|   next.config.ts
|   postcss.config.mjs
|   tsconfig.json
|   package.json
|   package-lock.json
|   env.example
|   .env.local
|   tsconfig.tsbuildinfo
|   README.md
|   
+---app
|   |   page.tsx
|   |   globals.css
|   |   icon.png
|   |   layout.tsx
|   |   
|   \---api
|       \---localize
|               route.ts
|               
+---public
|       grid.svg
|       
+---components
|       Features.tsx
|       Hero.tsx
|       LocalizationForm.tsx
|       
+---lib
|       apify.ts
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;To follow along, you will need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Node.js&lt;/strong&gt; installed (v18 or higher).&lt;/li&gt;
&lt;li&gt;An &lt;strong&gt;Apify account&lt;/strong&gt;. You can &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;sign up for free&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;A text editor like VS Code.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 1: Initialize the project
&lt;/h2&gt;

&lt;p&gt;First, we create a new Next.js application. We use the &lt;code&gt;--no-src-dir&lt;/code&gt; flag to keep our project root clean and flat.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npx create-next-app@latest lingo.app &lt;span class="nt"&gt;--typescript&lt;/span&gt; &lt;span class="nt"&gt;--eslint&lt;/span&gt; &lt;span class="nt"&gt;--tailwind&lt;/span&gt; &lt;span class="nt"&gt;--no-src-dir&lt;/span&gt; &lt;span class="nt"&gt;--app&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Navigate into your new directory and install the necessary dependencies: the Apify client, plus the explicit React type definitions to avoid TypeScript errors.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;lingo.app
npm &lt;span class="nb"&gt;install &lt;/span&gt;apify-client
npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;--save-dev&lt;/span&gt; @types/react @types/react-dom
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Setting up the Global Design System
&lt;/h3&gt;

&lt;p&gt;We want a premium feel right from the start. Open &lt;code&gt;app/globals.css&lt;/code&gt; and replace the default styles with this modern Tailwind v4 setup. We are defining a deep slate color palette for a developer-focused dark mode.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight css"&gt;&lt;code&gt;&lt;span class="k"&gt;@import&lt;/span&gt; &lt;span class="s1"&gt;"tailwindcss"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="nd"&gt;:root&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="py"&gt;--background&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;#0f172a&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="c"&gt;/* Slate 900 */&lt;/span&gt;
  &lt;span class="py"&gt;--foreground&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;#f8fafc&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="c"&gt;/* Slate 50 */&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nt"&gt;body&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;color&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;var&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;--foreground&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nl"&gt;background&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;var&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;--background&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nl"&gt;font-family&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Arial&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Helvetica&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;sans-serif&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nl"&gt;overflow-x&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;hidden&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;@layer&lt;/span&gt; &lt;span class="n"&gt;components&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nc"&gt;.btn-primary&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="err"&gt;@apply&lt;/span&gt; &lt;span class="err"&gt;px-6&lt;/span&gt; &lt;span class="err"&gt;py-3&lt;/span&gt; &lt;span class="err"&gt;bg-indigo-600&lt;/span&gt; &lt;span class="py"&gt;hover&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;bg-indigo-500&lt;/span&gt; &lt;span class="n"&gt;text-white&lt;/span&gt; &lt;span class="n"&gt;rounded-lg&lt;/span&gt; &lt;span class="n"&gt;font-medium&lt;/span&gt; &lt;span class="n"&gt;transition-all&lt;/span&gt; &lt;span class="n"&gt;duration-200&lt;/span&gt; &lt;span class="n"&gt;shadow-lg&lt;/span&gt; &lt;span class="n"&gt;hover&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;shadow-indigo-500&lt;/span&gt;&lt;span class="p"&gt;/&lt;/span&gt;&lt;span class="m"&gt;30&lt;/span&gt; &lt;span class="n"&gt;active&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;scale-95&lt;/span&gt; &lt;span class="n"&gt;disabled&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;opacity-50&lt;/span&gt; &lt;span class="n"&gt;disabled&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;cursor-not-allowed&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="nc"&gt;.input-field&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="err"&gt;@apply&lt;/span&gt; &lt;span class="err"&gt;w-full&lt;/span&gt; &lt;span class="err"&gt;bg-slate-800&lt;/span&gt; &lt;span class="err"&gt;border&lt;/span&gt; &lt;span class="err"&gt;border-slate-700&lt;/span&gt; &lt;span class="err"&gt;rounded-lg&lt;/span&gt; &lt;span class="err"&gt;px-4&lt;/span&gt; &lt;span class="err"&gt;py-3&lt;/span&gt; &lt;span class="err"&gt;text-white&lt;/span&gt; &lt;span class="err"&gt;placeholder-slate-400&lt;/span&gt; &lt;span class="py"&gt;focus&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;outline-none&lt;/span&gt; &lt;span class="n"&gt;focus&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;ring-2&lt;/span&gt; &lt;span class="n"&gt;focus&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;ring-indigo-500&lt;/span&gt; &lt;span class="n"&gt;focus&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;border-transparent&lt;/span&gt; &lt;span class="n"&gt;transition-all&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="nc"&gt;.card&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="err"&gt;@apply&lt;/span&gt; &lt;span class="err"&gt;bg-slate-800/50&lt;/span&gt; &lt;span class="err"&gt;backdrop-blur-md&lt;/span&gt; &lt;span class="err"&gt;border&lt;/span&gt; &lt;span class="err"&gt;border-slate-700/50&lt;/span&gt; &lt;span class="err"&gt;rounded-xl&lt;/span&gt; &lt;span class="err"&gt;p-6&lt;/span&gt; &lt;span class="err"&gt;shadow-xl;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, let's configure the root layout to apply this background globally. Open &lt;code&gt;app/layout.tsx&lt;/code&gt;. We are adding a subtle grid background pattern for texture.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="kd"&gt;type&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;Metadata&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;next&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;Inter&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;next/font/google&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./globals.css&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;inter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Inter&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;subsets&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;latin&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;Metadata&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Lingo.app - Globalize Your Web&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Instantly localize any website or text into 83+ languages using AI.&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;RootLayout&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="nx"&gt;children&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;children&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ReactNode&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;html&lt;/span&gt; &lt;span class="na"&gt;lang&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"en"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;body&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;inter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;className&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; min-h-screen bg-gradient-to-br from-slate-900 via-slate-900 to-indigo-950 selection:bg-indigo-500/30`&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Subtle Grid Background */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"fixed inset-0 bg-[url('/grid.svg')] bg-center [mask-image:linear-gradient(180deg,white,rgba(255,255,255,0))]"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;

        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;main&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"relative z-10 w-full max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-8"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
             &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Simple Header */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;header&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"flex justify-between items-center py-6 mb-12"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"flex items-center space-x-2"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"text-2xl font-bold bg-clip-text text-transparent bg-gradient-to-r from-white to-slate-400"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;lingo.app&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;header&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;children&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;main&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;body&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;html&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;(Note: You will need a simple &lt;code&gt;public/grid.svg&lt;/code&gt; file for the background. You can find free SVG patterns online or generate one.)&lt;/strong&gt; See the one used for the lingo.app project on &lt;a href="https://github.com/Eunit99/lingo.app/blob/main/public/grid.svg" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
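
&lt;p&gt;If you just want something that works while following along, here is an assumed minimal tiling pattern you could save as &lt;code&gt;public/grid.svg&lt;/code&gt; (the actual file in the repository may differ):&lt;/p&gt;

```xml
&lt;svg xmlns="http://www.w3.org/2000/svg" width="32" height="32" viewBox="0 0 32 32"&gt;
  &lt;!-- A single 32x32 tile: one faint horizontal and one vertical line,
       which the browser repeats via background tiling --&gt;
  &lt;path d="M0 .5H32M.5 0V32" fill="none" stroke="rgba(148,163,184,0.1)"/&gt;
&lt;/svg&gt;
```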

&lt;h2&gt;
  
  
  Step 2: Integrate the Apify Client
&lt;/h2&gt;

&lt;p&gt;Now for the engine room. We need a way to communicate with the Apify platform securely.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Client Wrapper (&lt;code&gt;lib/apify.ts&lt;/code&gt;)
&lt;/h3&gt;

&lt;p&gt;This client will enable communication with the &lt;a href="https://apify.com/eunit/ai-website-content-localizer-scraper" rel="noopener noreferrer"&gt;AI Website Content Localizer &amp;amp; Scraper Actor&lt;/a&gt; on Apify.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuzwxyenelvsv443d4d2c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuzwxyenelvsv443d4d2c.png" alt="AI Website Content Localizer &amp;amp; Scraper" width="800" height="453"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create a folder &lt;code&gt;lib&lt;/code&gt; and a file &lt;code&gt;apify.ts&lt;/code&gt; inside it. This acts as our lightweight SDK: it initializes the client and defines the types for our input.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;ApifyClient&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;apify-client&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kr"&gt;interface&lt;/span&gt; &lt;span class="nx"&gt;LocalizerInput&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;token&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nl"&gt;lingoApiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="c1"&gt;// The secret key for Lingo.dev&lt;/span&gt;
  &lt;span class="nl"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nl"&gt;targetLanguages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;[];&lt;/span&gt;
  &lt;span class="nl"&gt;startUrls&lt;/span&gt;&lt;span class="p"&gt;?:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="p"&gt;}[];&lt;/span&gt;
  &lt;span class="nl"&gt;text&lt;/span&gt;&lt;span class="p"&gt;?:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;runLocalizerActor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;params&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;LocalizerInput&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;token&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;lingoApiKey&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;targetLanguages&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;startUrls&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;text&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;params&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;ApifyClient&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;token&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;token&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;lingoApiKey&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;targetLanguages&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;startUrls&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;startUrls&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;undefined&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;text&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;undefined&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Run the Actor and wait for it to finish&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;run&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;eunit/ai-website-content-localizer-scraper&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;call&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;input&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;run&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Actor run failed to start or return.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="c1"&gt;// Fetch results from the default dataset&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;run&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;listItems&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Apify Actor Run Error:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
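
&lt;p&gt;To see how the &lt;code&gt;mode&lt;/code&gt; flag shapes the Actor input, here is a standalone sketch of the same ternary logic. The helper name &lt;code&gt;buildActorInput&lt;/code&gt; is ours for illustration, not part of the Actor's API:&lt;/p&gt;

```typescript
interface SketchInput {
  mode: 'WEB' | 'TEXT';
  targetLanguages: string[];
  startUrls?: { url: string }[];
  text?: string;
}

// Mirrors the input-shaping in runLocalizerActor: only the field that is
// relevant to the chosen mode is forwarded; the other stays undefined,
// so the Actor never receives conflicting inputs.
function buildActorInput(p: SketchInput) {
  return {
    mode: p.mode,
    targetLanguages: p.targetLanguages,
    startUrls: p.mode === 'WEB' ? p.startUrls : undefined,
    text: p.mode === 'TEXT' ? p.text : undefined,
  };
}

const webInput = buildActorInput({
  mode: 'WEB',
  targetLanguages: ['fr', 'es'],
  startUrls: [{ url: 'https://example.com' }],
  text: 'this is dropped in WEB mode',
});
console.log(webInput.text); // undefined
```

&lt;p&gt;This guards against sending a stale &lt;code&gt;text&lt;/code&gt; value along with a &lt;code&gt;WEB&lt;/code&gt; run, or vice versa.&lt;/p&gt;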



&lt;h3&gt;
  
  
  The API Proxy (&lt;code&gt;app/api/localize/route.ts&lt;/code&gt;)
&lt;/h3&gt;

&lt;p&gt;We must not expose our Apify API token in client-side browser code. To solve this, we create an API Route that acts as a proxy: the client sends the URL or text, and the server attaches the token and calls Apify.&lt;/p&gt;
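
&lt;p&gt;The route reads two secrets from the environment. Create a &lt;code&gt;.env.local&lt;/code&gt; file in the project root (mirroring &lt;code&gt;env.example&lt;/code&gt; from the file tree). The variable names below match what the route reads; the values are placeholders:&lt;/p&gt;

```shell
# .env.local -- git-ignored, never commit real tokens
APIFY_API_TOKEN=your_apify_api_token
LINGO_API_KEY=your_lingo_dev_api_key
```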

&lt;p&gt;Create &lt;code&gt;app/api/localize/route.ts&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;NextResponse&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;next/server&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;runLocalizerActor&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@/lib/apify&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;POST&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;Request&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;body&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;targetLanguages&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;lingoApiKey&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;apifyToken&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;APIFY_API_TOKEN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;apifyToken&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;NextResponse&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Server misconfigured: APIFY_API_TOKEN missing. Please set it in .env.local&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;finalLingoKey&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;lingoApiKey&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;LINGO_API_KEY&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;finalLingoKey&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;NextResponse&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Lingo API Key is required. Please provide it in the form or set LINGO_API_KEY env.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;400&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;text&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;text&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;NextResponse&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Text content exceeds the 500 character limit.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;400&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;startUrls&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="p"&gt;}]&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;undefined&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="c1"&gt;// Default languages if not provided&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;languages&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;targetLanguages&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;targetLanguages&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;targetLanguages&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;es&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;fr&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;de&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;runLocalizerActor&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="na"&gt;token&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;apifyToken&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;lingoApiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;finalLingoKey&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;targetLanguages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;languages&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="nx"&gt;startUrls&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="nx"&gt;text&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;NextResponse&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;success&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;any&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;API Route Error:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;NextResponse&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;success&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Unknown error&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To prevent abuse and keep API costs under control, we enforce a strict 500-character limit on both the frontend and the backend.&lt;/p&gt;

&lt;p&gt;In the user interface in &lt;code&gt;components/LocalizationForm.tsx&lt;/code&gt;, we keep the localizer input within bounds by adding a &lt;code&gt;maxLength&lt;/code&gt; attribute and a real-time character counter (e.g., "0/500") that visually alerts the user as they type.&lt;/p&gt;
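&lt;p&gt;A minimal sketch of that client-side guard (the constant and helper names here are illustrative, not the project's exact code):&lt;/p&gt;

```typescript
// Illustrative sketch of the client-side limit logic that backs the
// maxLength attribute. MAX_CHARS and these helper names are assumptions,
// not the actual identifiers in components/LocalizationForm.tsx.
const MAX_CHARS = 500;

// Label for the real-time counter, e.g. "0/500".
function counterLabel(text: string): string {
  return `${text.length}/${MAX_CHARS}`;
}

// True when the input should be flagged; the server rejects such
// requests with a 400 even if this client check is bypassed.
function isOverLimit(text: string): boolean {
  return text.length > MAX_CHARS;
}
```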

&lt;p&gt;Crucially, we backed this up with server-side validation in &lt;code&gt;app/api/localize/route.ts&lt;/code&gt;. Before sending any data to Apify, the API route checks the text length and immediately rejects requests exceeding 500 characters with a &lt;code&gt;400 Bad Request&lt;/code&gt;. This ensures the application remains secure and cost-efficient, even if the client-side restrictions are bypassed.&lt;/p&gt;

&lt;h2&gt;
  Step 3: Building the UI Components
&lt;/h2&gt;

&lt;p&gt;Now let's build the interactive frontend.&lt;/p&gt;

&lt;h3&gt;
  The Hero Component
&lt;/h3&gt;

&lt;p&gt;In &lt;code&gt;components/Hero.tsx&lt;/code&gt;, we create a bold introduction.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;Hero&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"text-center max-w-3xl mx-auto space-y-6"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"inline-flex items-center space-x-2 px-3 py-1 rounded-full bg-indigo-500/10 border border-indigo-500/20 text-indigo-400 text-sm font-medium mb-4"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Powered by Lingo.dev &lt;span class="err"&gt;&amp;amp;&lt;/span&gt; Apify&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;h1&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"text-5xl md:text-7xl font-extrabold tracking-tight text-white leading-tight"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        Globalize your &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"text-indigo-500"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;content&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; instantly.
      &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;h1&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;p&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"text-lg md:text-xl text-slate-400 max-w-2xl mx-auto"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        Scrape any website and translate it into 83+ languages with context-aware AI.
        The ultimate tool for shipping global apps fast.
      &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;p&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://lingo-app-azure.vercel.app/" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fajxv3qw7hn6r5dswkx3e.png" alt="Lingo.app UI" width="597" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  The Main Logic: LocalizationForm
&lt;/h3&gt;

&lt;p&gt;This is where the user interacts with the app. We need to handle:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Toggling between &lt;strong&gt;WEB&lt;/strong&gt; (URL) and &lt;strong&gt;TEXT&lt;/strong&gt; mode.&lt;/li&gt;
&lt;li&gt;Selecting target languages.&lt;/li&gt;
&lt;li&gt;Calling our API route.&lt;/li&gt;
&lt;li&gt;Displaying the results in a tabbed interface.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Create &lt;code&gt;components/LocalizationForm.tsx&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;use client&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useState&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;LocalizationForm&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setMode&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;useState&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setUrl&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;textInput&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setTextInput&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;selectedLangs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setSelectedLangs&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;useState&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;es&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;fr&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;de&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;loading&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setLoading&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setResults&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;useState&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;any&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;([]);&lt;/span&gt;

  &lt;span class="c1"&gt;// ... (Toggle logic helper functions)&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;popularLangs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;es&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Spanish&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;fr&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;French&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;de&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;German&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ja&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Japanese&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;zh&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Chinese&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;pt&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Portuguese&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;it&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Italian&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ko&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Korean&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;];&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;toggleLang&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;code&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;selectedLangs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;includes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;code&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nf"&gt;setSelectedLangs&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;selectedLangs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;c&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;c&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="nx"&gt;code&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nf"&gt;setSelectedLangs&lt;/span&gt;&lt;span class="p"&gt;([...&lt;/span&gt;&lt;span class="nx"&gt;selectedLangs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;code&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;handleSubmit&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;FormEvent&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;preventDefault&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="nf"&gt;setLoading&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="nf"&gt;setResults&lt;/span&gt;&lt;span class="p"&gt;([]);&lt;/span&gt;

    &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/api/localize&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;POST&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
          &lt;span class="nx"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;undefined&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="na"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;textInput&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;undefined&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="na"&gt;targetLanguages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;selectedLangs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;}),&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;

      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
      &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;success&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="nf"&gt;setResults&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;finally&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nf"&gt;setLoading&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"w-full max-w-4xl mx-auto space-y-8"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"card"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Mode Toggles */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"flex space-x-4 mb-6 border-b border-slate-700/50 pb-4"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
          &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt; &lt;span class="na"&gt;onClick&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;setMode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;`... &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;text-indigo-400&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;text-slate-400&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            Web Scraper
          &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
          &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt; &lt;span class="na"&gt;onClick&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;setMode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;`... &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;text-indigo-400&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;text-slate-400&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            Text Localizer
          &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;

        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;form&lt;/span&gt; &lt;span class="na"&gt;onSubmit&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;handleSubmit&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"space-y-6"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Input Fields specific to mode */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
            &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;WEB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;input&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"url"&lt;/span&gt; &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;onChange&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;setUrl&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;target&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"input-field"&lt;/span&gt; &lt;span class="na"&gt;placeholder&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"https://example.com"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                  &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"flex justify-between items-center mb-2"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;label&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"block text-sm font-medium text-slate-300"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Text Content&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;label&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;`text-xs &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;textInput&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;text-red-400&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;text-slate-500&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                      &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;textInput&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;/500
                    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                  &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                  &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;textarea&lt;/span&gt;
                    &lt;span class="na"&gt;placeholder&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"Paste your text here to localize..."&lt;/span&gt;
                    &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"input-field min-h-[150px]"&lt;/span&gt;
                    &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;textInput&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                    &lt;span class="na"&gt;onChange&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;setTextInput&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;target&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                    &lt;span class="na"&gt;maxLength&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                    &lt;span class="na"&gt;required&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;mode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;TEXT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                  &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;

            &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Language Selector Logic Here */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
            &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* ... */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;

            &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"flex justify-end"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"submit"&lt;/span&gt; &lt;span class="na"&gt;disabled&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;loading&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"btn-primary"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                    &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;loading&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Processing...&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Localize Content&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;button&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;form&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;

      &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Results Rendering */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
      &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;idx&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
          &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;ResultCard&lt;/span&gt; &lt;span class="na"&gt;key&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;idx&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;item&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;ResultCard&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt; &lt;span class="p"&gt;}:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nl"&gt;item&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;any&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Shows original vs localized tabs&lt;/span&gt;
    &lt;span class="c1"&gt;// ...&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 4: Page Assembly
&lt;/h2&gt;

&lt;p&gt;Finally, we string it all together in &lt;code&gt;app/page.tsx&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;Hero&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@/components/Hero&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;LocalizationForm&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@/components/LocalizationForm&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;Features&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@/components/Features&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;Home&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"flex flex-col space-y-24"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"flex flex-col items-center justify-start space-y-12"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Hero&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;LocalizationForm&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Features&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Why the Apify Actor Integration Wins
&lt;/h2&gt;

&lt;p&gt;You might be wondering: &lt;em&gt;"Why can't I just use &lt;code&gt;fetch&lt;/code&gt; to get the HTML and send it to ChatGPT?"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;You could try, but you would likely fail for three reasons. Integrating the &lt;strong&gt;AI Website Content Localizer &amp;amp; Scraper&lt;/strong&gt; Actor via Apify solves these specific engineering headaches:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. The "Context" Problem
&lt;/h3&gt;

&lt;p&gt;Standard LLMs see text as a flat string. They don't know that "Home" is a navigation button or that "Get Started" is a primary CTA. If you strip the HTML, you lose the context. If you keep the HTML, you confuse the model with thousands of lines of markup.&lt;/p&gt;

&lt;p&gt;This &lt;a href="https://apify.com/eunit/ai-website-content-localizer-scraper" rel="noopener noreferrer"&gt;AI Website Content Localizer &amp;amp; Scraper&lt;/a&gt; Actor is smart. It extracts the semantics of the page, identifies the translatable nodes, and sends them to Lingo.dev, which is specifically trained to understand UI and content context. The result? Buttons stay short, and articles sound natural.&lt;/p&gt;
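&lt;p&gt;As a rough mental model (the record shape and function below are illustrative, not the Actor's actual schema), context-aware extraction means keeping each string's UI role next to the string itself:&lt;/p&gt;

```typescript
// Hypothetical shape of the records a context-aware extractor emits.
// Keeping the UI role alongside the text is what lets the translator
// know, for example, that a button label must stay short.
interface TranslatableNode {
  selector: string; // where the text lives in the DOM
  role: "button" | "heading" | "paragraph";
  text: string; // the string to localize
}

// Flatten nodes into a prompt-friendly summary that preserves role context.
export function summarizeNodes(nodes: TranslatableNode[]): string {
  return nodes.map((n) => n.role + ": " + n.text).join("\n");
}
```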

&lt;h3&gt;
  
  
  2. The Scraping Battlefield
&lt;/h3&gt;

&lt;p&gt;Modern websites are hostile. They use hydration (client-side rendering), anti-bot protections (Cloudflare, CAPTCHAs), and complex DOM structures.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Without Apify&lt;/strong&gt;: You have to manage a fleet of headless browsers, rotate proxies, handle retries, and parse dynamic JavaScript.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;With Apify&lt;/strong&gt;: You make one API call. The Actor handles the headless browser infrastructure, proxy rotation, and Playwright execution for you.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  3. Serverless Scalability
&lt;/h3&gt;

&lt;p&gt;Translating a massive documentation site or a large e-commerce catalog requires significant computing power. If you run this logic in your own Next.js API route, you risk timeouts and server crashes.&lt;/p&gt;

&lt;p&gt;By offloading this to &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify&lt;/a&gt;, you get a serverless queue system. You can submit up to 100 URLs at once, and the Apify platform scales up the necessary Actors to process them in parallel. Your Next.js app remains lightweight and responsive.&lt;/p&gt;
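&lt;p&gt;That 100-URL cap suggests a simple batching step on our side before handing work to the platform. A minimal sketch (the helper is my own, not part of the Actor's API):&lt;/p&gt;

```typescript
// Split a large URL list into batches of at most 100,
// matching the per-run limit mentioned above.
export function batchUrls(urls: string[], size: number = 100): string[][] {
  const batches: string[][] = [];
  let rest = urls.slice();
  while (rest.length > 0) {
    batches.push(rest.slice(0, size));
    rest = rest.slice(size);
  }
  return batches;
}
```

&lt;p&gt;Each batch then becomes one Actor run, and Apify fans the runs out in parallel while your app just polls for results.&lt;/p&gt;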

&lt;h2&gt;
  
  
  Step 5: Setup &amp;amp; Deploy
&lt;/h2&gt;

&lt;p&gt;We are ready to launch. Create a &lt;code&gt;.env.local&lt;/code&gt; file in your root folder:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;APIFY_API_TOKEN&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;your_apify_token
&lt;span class="nv"&gt;LINGO_API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;your_lingo_key
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Get your &lt;code&gt;APIFY_API_TOKEN&lt;/code&gt; by signing up at &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify&lt;/a&gt;. Apify gives you $5 of free credit every month, which is more than enough to build your first app.&lt;/p&gt;

&lt;p&gt;To get your &lt;code&gt;LINGO_API_KEY&lt;/code&gt;, sign up at &lt;a href="https://lingo.dev" rel="noopener noreferrer"&gt;Lingo.dev&lt;/a&gt;. Lingo.dev gives you 10,000 free words per month to experiment with.&lt;/p&gt;
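&lt;p&gt;In code, a small guard can fail fast if either key is missing (a generic helper of my own, not from Apify or Lingo.dev; the variable names match the &lt;code&gt;.env.local&lt;/code&gt; file above):&lt;/p&gt;

```typescript
// Read a required environment variable, throwing a clear error if it is unset.
export function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error("Missing required environment variable: " + name);
  }
  return value;
}

// In an API route this would look like:
// const apifyToken = requireEnv("APIFY_API_TOKEN");
// const lingoKey = requireEnv("LINGO_API_KEY");
```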

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8d5pgidkqr6tte3pf77.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8d5pgidkqr6tte3pf77.png" alt="Lingo.dev API usage" width="800" height="558"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once both keys are set, it is time to run your app.&lt;/p&gt;

&lt;h3&gt;
  
  
  Running your App
&lt;/h3&gt;

&lt;p&gt;Run your development server:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm run dev
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Open &lt;code&gt;http://localhost:3000&lt;/code&gt;, and you will see your global localization app, ready to scrape and translate the web.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;We didn't just build a demo. We built a tool that addresses a real business problem: accessing global information. By using the &lt;strong&gt;&lt;a href="https://apify.com/eunit/ai-website-content-localizer-scraper" rel="noopener noreferrer"&gt;AI Website Content Localizer &amp;amp; Scraper on Apify&lt;/a&gt;&lt;/strong&gt;, we offloaded the most challenging aspects of scraping and AI context management, allowing us to focus on building a premium user experience.&lt;/p&gt;

&lt;p&gt;If you are building global applications, you need tools that scale with you. Check out the &lt;strong&gt;&lt;a href="https://apify.com/eunit/ai-website-content-localizer-scraper" rel="noopener noreferrer"&gt;AI Website Content Localizer &amp;amp; Scraper on Apify&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Happy building 🚀&lt;/p&gt;

</description>
      <category>ai</category>
      <category>globalization</category>
      <category>localization</category>
      <category>nextjs</category>
    </item>
    <item>
      <title>Building a Modern Digital Garden with Google AI: My New Year, New You Portfolio</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Sat, 03 Jan 2026 22:55:23 +0000</pubDate>
      <link>https://dev.to/eunit/building-a-modern-digital-garden-with-google-ai-my-new-year-new-you-portfolio-18l0</link>
      <guid>https://dev.to/eunit/building-a-modern-digital-garden-with-google-ai-my-new-year-new-you-portfolio-18l0</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/new-year-new-you-google-ai-2025-12-31"&gt;New Year, New You Portfolio Challenge Presented by Google AI&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  About me
&lt;/h2&gt;

&lt;p&gt;Hi, I’m &lt;a href="https://www.eunit.me/about" rel="noopener noreferrer"&gt;Emmanuel Uchenna&lt;/a&gt; - a software engineer, technical writer, and digital health advocate who is passionate about building technology that empowers people.&lt;/p&gt;

&lt;p&gt;My journey in tech began over five years ago, fueled by a curiosity about how lines of code could translate into meaningful human experiences. Today, I specialize in crafting clean, scalable user interfaces using React, Next.js, and the broader modern web ecosystem. But for me, code is just one part of the equation. I also love translating complex technical ideas into clear, engaging content through articles, documentation, and whitepapers.&lt;/p&gt;

&lt;p&gt;Beyond the terminal, I am deeply invested in the intersection of technology and healthcare. As a digital health advocate, I explore how software can be used to enhance patient outcomes and increase the accessibility of health information.&lt;/p&gt;

&lt;p&gt;My goal with this portfolio was simple yet ambitious: to create a platform that is fast, accessible, and truly reflects my current &lt;a href="https://www.eunit.me/about#journey" rel="noopener noreferrer"&gt;skillset&lt;/a&gt; and &lt;a href="https://www.eunit.me/design" rel="noopener noreferrer"&gt;style&lt;/a&gt;. I wanted a space that didn't just list my projects but demonstrated my philosophy of "premium simplicity" - a belief that the most effective digital experiences are those that get out of the user's way while providing delight through subtle interactions and solid performance.&lt;/p&gt;

&lt;h2&gt;
  
  
  Portfolio
&lt;/h2&gt;

&lt;p&gt;

&lt;/p&gt;
&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
        &lt;div class="c-embed__cover"&gt;
          &lt;a href="https://www.eunit.me/" class="c-link align-middle" rel="noopener noreferrer"&gt;
            &lt;img alt="" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Feunit.me%2Fopengraph-image%3Fd9f16d1a3759f919" height="auto" class="m-0"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="c-embed__body"&gt;
        &lt;h2 class="fs-xl lh-tight"&gt;
          &lt;a href="https://www.eunit.me/" rel="noopener noreferrer" class="c-link"&gt;
            Emmanuel Uchenna | Software developer and technical writer
          &lt;/a&gt;
        &lt;/h2&gt;
          &lt;p class="truncate-at-3"&gt;
            Emmanuel Uchenna is a software engineer, technical writer, and digital health advocate passionate about building technology that empowers people. With over five years of experience, he specializes in crafting clean, scalable web and mobile applications with React, Next.js, and modern web tooling, while also translating complex technical ideas into clear, engaging content through articles, documentation, and whitepapers.
          &lt;/p&gt;
        &lt;div class="color-secondary fs-s flex items-center"&gt;
            &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.eunit.me%2Ffavicon.ico%3Ffavicon.d40e508f.ico"&gt;
          eunit.me
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;




&lt;p&gt;&lt;em&gt;(Note: The above embed links to my portfolio deployed on Google Cloud Run. You can also visit the live site at &lt;a href="https://www.eunit.me/" rel="noopener noreferrer"&gt;eunit.me&lt;/a&gt;)&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How I built it
&lt;/h2&gt;

&lt;p&gt;Building a portfolio in 2025 is an interesting challenge. The tools available to us have undergone significant evolution. For this project, I decided to lean heavily into the Google AI ecosystem to see how much it could accelerate my workflow without compromising on quality. The results were genuinely surprising, transforming what could have been a week-long slog into an inspired sprint.&lt;/p&gt;

&lt;h3&gt;
  
  
  The tech stack
&lt;/h3&gt;

&lt;p&gt;I chose a stack that emphasizes performance, scalability, and developer experience:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Core Framework&lt;/strong&gt;: Next.js (App Router) for a robust, server-side rendered foundation. The file-based routing and React Server Components allowed me to keep the application fast and SEO-friendly.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Styling&lt;/strong&gt;: Tailwind CSS. I adopted a utility-first approach to implement a strict &lt;a href="https://www.eunit.me/design" rel="noopener noreferrer"&gt;design system&lt;/a&gt;. This made handling responsive layouts and adopting dark mode incredibly seamless.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Database&lt;/strong&gt;: PostgreSQL. I needed a reliable relational database to manage dynamic content, such as tracking blog post views and managing newsletter subscriptions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment&lt;/strong&gt;: Google Cloud Run. I wanted a serverless, containerized deployment that could scale to zero when not in use but handle traffic spikes if a post went viral.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  The Deployment Pipeline: From Local to Global
&lt;/h3&gt;

&lt;p&gt;One of the requirements for this challenge was deploying to Google Cloud Run, and honestly, it was one of the smoothest parts of the process. I didn't want to "throw code over the wall"; I wanted a reproducible, containerized pipeline.&lt;/p&gt;

&lt;p&gt;I created a &lt;code&gt;Dockerfile&lt;/code&gt; that uses a multi-stage build tailored for Next.js. This keeps the final image lightweight (around 80 MB) by stripping out development dependencies and unused files.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="c"&gt;# ... standard Next.js multi-stage build ...&lt;/span&gt;
&lt;span class="k"&gt;CMD&lt;/span&gt;&lt;span class="s"&gt; ["node", "server.js"]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Deploying was as simple as running a single command with the Gemini CLI helper to push the container to Google Artifact Registry and create a Cloud Run service. Now, whenever I push to my &lt;code&gt;main&lt;/code&gt; branch, a seamless CI/CD pipeline builds the container and updates the revision. This "git-push-to-deploy" workflow gives me the confidence to ship small, iterative updates frequently.&lt;/p&gt;
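&lt;p&gt;Under the hood, that pipeline boils down to two commands; the project, repository, service, and region names below are placeholders for your own:&lt;/p&gt;

```shell
# Build the image and push it to Artifact Registry (placeholder project/repo names)
gcloud builds submit --tag europe-west1-docker.pkg.dev/my-project/portfolio/web:latest

# Deploy the pushed image as a Cloud Run service that scales to zero when idle
gcloud run deploy portfolio \
  --image europe-west1-docker.pkg.dev/my-project/portfolio/web:latest \
  --region europe-west1 \
  --allow-unauthenticated
```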

&lt;h3&gt;
  
  
  Leveraging Google AI tools
&lt;/h3&gt;

&lt;p&gt;This wasn't just a standard build; it was an AI-assisted development process. Here is a deep dive into how I used Google's tools to bring this vision to life:&lt;/p&gt;

&lt;h4&gt;
  
  
  1. Google's Antigravity (The AI-first IDE)
&lt;/h4&gt;

&lt;p&gt;Antigravity was the centerpiece of my development environment. It acted as my pair programmer throughout the entire process, fundamentally changing how I wrote code. Instead of constantly context-switching between my editor and a browser to look up documentation or debug errors, I stayed in the flow.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Refactoring with confidence&lt;/strong&gt;&lt;br&gt;
One of the most complex tasks was refactoring my custom UI components to be more reusable. For instance, I had multiple ad-hoc modal implementations scattered across the codebase—&lt;code&gt;ReviewModal&lt;/code&gt;, &lt;code&gt;DriverProfileModal&lt;/code&gt;, and others. I used Antigravity to analyze these disparate files. It suggested a unified &lt;code&gt;Modal&lt;/code&gt; architecture using &lt;code&gt;framer-motion&lt;/code&gt; for smooth entry and exit animations. It didn't just give me a snippet; it walked me through the implementation plan, ensuring I correctly handled prop drilling and accessibility (ARIA labels).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Intelligent debugging&lt;/strong&gt;&lt;br&gt;
During the build, I encountered a tricky TypeScript error with a Chakra UI &lt;code&gt;Text&lt;/code&gt; component regarding polymorphic props (&lt;code&gt;as&lt;/code&gt; prop conflicts). Usually, this would send me down a generic Stack Overflow rabbit hole. Instead, I asked Antigravity. It analyzed my specific component usage, explained &lt;em&gt;why&lt;/em&gt; the type inference was failing, and offered a precise fix that satisfied the TypeScript compiler without using &lt;code&gt;any&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Proactive optimization&lt;/strong&gt;&lt;br&gt;
Antigravity was like having a senior engineer looking over my shoulder. It proactively suggested checks for image optimization, reminding me to use the &lt;code&gt;sizes&lt;/code&gt; prop on my &lt;code&gt;next/image&lt;/code&gt; components to prevent layout shifts (CLS) and improve LCP scores.&lt;/p&gt;

&lt;h4&gt;
  
  
  2. AI Studio &amp;amp; Gemini Models
&lt;/h4&gt;

&lt;p&gt;While Antigravity handled the code, AI Studio dealt with the creative direction. I used it as my "Editor-in-Chief" to brainstorm the content strategy and structure of the site.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Content generation and refinement&lt;/strong&gt;&lt;br&gt;
I fed the model my previous bio and project descriptions and asked it to help me refine the tone to be more professional yet approachable. The "New Year, New You" theme resonated with me, so I used Gemini to draft sections of the site that needed fresh text, particularly for my new "Health" section. It helped me articulate my dual passion for engineering and digital health in a way that felt cohesive rather than disjointed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Idea generation&lt;/strong&gt;&lt;br&gt;
When I was stuck on how to present my "Research" papers, I used AI Studio to generate layout ideas. It suggested a card-based layout with abstract summaries, which I then implemented in code. It essentially bridged the gap between a vague idea and a concrete design specification.&lt;/p&gt;

&lt;h4&gt;
  
  
  3. Gemini CLI
&lt;/h4&gt;

&lt;p&gt;For repetitive tasks and automation, the Gemini CLI was a lifesaver. I used it to quickly scaffold standard boilerplate code for new pages and components. It sped up the "boring" parts of development so I could focus on the creative aspects, like the micro-interactions and layout adjustments.&lt;/p&gt;

&lt;h3&gt;
  
  
  Design decisions
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;The "Premium Simplicity" aesthetic&lt;/strong&gt;&lt;br&gt;
I opted for a strict &lt;strong&gt;black-and-white (B&amp;amp;W) aesthetic&lt;/strong&gt;. I believe that constraints breed creativity. By removing color as a primary tool for hierarchy, I was forced to rely on spacing, typography, and contrast to guide the user's eye. This resulted in a design that feels premium, modern, and uncluttered. It helps the actual content, my articles and projects, stand out without visual noise.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Component-driven architecture&lt;/strong&gt;&lt;br&gt;
I prioritized a component-driven architecture to ensure maintainability and flexibility. Every piece of the UI, from the &lt;code&gt;ProjectList&lt;/code&gt; to the &lt;code&gt;BlogCard&lt;/code&gt;, is a self-contained component. This modularity means that if I want to update the design of my cards next year, I only have to change it in one place.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Typography as interface&lt;/strong&gt;&lt;br&gt;
In a black-and-white design, typography does the heavy lifting. I chose a variable font that allows for subtle weight transitions on hover, creating a tactile feel without the need for color shifts. It’s accessible, performant, and sharp on high-DPI displays. I also ensured that all interactive elements have sufficient contrast ratios (meeting WCAG AAA standards), proving that "stylish" doesn't have to mean "unreadable."&lt;/p&gt;

&lt;h2&gt;
  
  
  What I'm most proud of
&lt;/h2&gt;

&lt;p&gt;There are a few aspects of this project that I am particularly proud of, as they represent significant personal and technical growth.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. The "Health" section&lt;/strong&gt;&lt;br&gt;
Integrating my work in &lt;a href="https://www.eunit.me/health" rel="noopener noreferrer"&gt;digital health&lt;/a&gt; into my developer portfolio was a big step. It’s a dedicated space for articles covering health and well-being topics, bridging the gap between my technical skills and my advocacy work. For a long time, I kept these two worlds separate; my GitHub profile showed one person, while my &lt;a href="https://www.eunit.me/blog/2025-year-in-review-the-wins-the-losses-and-the-work-no-one-saw" rel="noopener noreferrer"&gt;community work&lt;/a&gt; showed another.&lt;/p&gt;

&lt;p&gt;Merging them here makes &lt;a href="https://www.eunit.me/" rel="noopener noreferrer"&gt;the portfolio&lt;/a&gt; feel uniquely &lt;em&gt;mine&lt;/em&gt; and tells a more complete story of who I am.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Performance &amp;amp; accessibility&lt;/strong&gt;&lt;br&gt;
I didn't want just a pretty site; I wanted a fast one. With the help of Antigravity's suggestions, I optimized my images, refined my script loading strategies (for things like analytics), and minimized layout shifts. Achieving &lt;a href="https://pagespeed.web.dev/analysis/https-www-eunit-me/2bhikvi6k4?form_factor=desktop" rel="noopener noreferrer"&gt;high Lighthouse scores&lt;/a&gt; was a key milestone. It proves that you don't have to sacrifice performance for a modern aesthetic.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. The seamless AI workflow&lt;/strong&gt;&lt;br&gt;
Honestly, I am most proud of how effectively I integrated AI into the loop. Using Antigravity and Gemini felt less like using a tool and more like working with a collaborator who knows my codebase inside out. It allowed me to be more ambitious with my features, such as adding dynamic view counters and complex animations, and more confident in my code quality. It shifted my role from "writer of code" to "architect of solutions."&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;This challenge was the perfect excuse to finally ship the &lt;a href="https://www.eunit.me/blog/hello-world" rel="noopener noreferrer"&gt;v3 of my portfolio&lt;/a&gt;. It represents not just a visual refresh, but a "backend" refresh of my own skills and workflows. By embracing AI-first development with Google's tools, I've built something that sets the tone for my work in 2026.&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>googleaichallenge</category>
      <category>portfolio</category>
      <category>gemini</category>
    </item>
    <item>
      <title>How to scrape YouTube trends and popular channels</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Sat, 03 Jan 2026 17:10:49 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-scrape-youtube-trends-and-popular-channels-4m07</link>
      <guid>https://dev.to/eunit/how-to-scrape-youtube-trends-and-popular-channels-4m07</guid>
      <description>&lt;p&gt;In today's world, where data is the new gold, it is important to have access to the data you need to make informed decisions. YouTube, as the world's second-largest search engine, is a repository of consumer insights, viral trends, and competitor strategies. But accessing this data at scale can be challenging.&lt;/p&gt;

&lt;p&gt;If you have ever tried to build a dashboard for viral content, conduct market research, or keep a pulse on what is hot around the globe, you have likely hit a wall. &lt;a href="https://developers.google.com/youtube/v3" rel="noopener noreferrer"&gt;The YouTube Data API&lt;/a&gt; comes with strict usage quotas that stifle your growth, while building your own scrapers often leads to a maintenance headache of broken selectors and IP bans.&lt;/p&gt;

&lt;p&gt;It doesn't have to be this way; there must surely be a better way to obtain this YouTube data. In this article, we will explore how you can use production-grade &lt;a href="https://www.apify.com?fpr=eunit" rel="noopener noreferrer"&gt;Apify Actors&lt;/a&gt; to scrape real-time YouTube trending videos and popular channel data without managing infrastructure, worrying about API quotas, or writing a single line of complex code.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why scrape YouTube trends?
&lt;/h2&gt;

&lt;p&gt;Before we dive into the "how," let's explore the "why." Scraping YouTube trends allows you to make data-driven decisions that can significantly impact your content strategy and marketing ROI. Here are the key reasons why top brands and creators are extracting this data:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Trend analysis and content strategy
&lt;/h3&gt;

&lt;p&gt;Knowing what is popular right now allows you to ride the wave of viral content. By identifying trending topics, video formats, and relevant keywords, you can create timely, relevant, and engaging content that aligns with current audience interests. Instead of guessing what might work, you can base your creative decisions on hard data from millions of views.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Competitive intelligence
&lt;/h3&gt;

&lt;p&gt;Your competitors are likely already experimenting with new formats and strategies. By monitoring their most successful videos and analyzing their metadata (titles, tags, descriptions), you can &lt;a href="https://www.marketingexamined.com/blog/reverse-engineering-content-strategy" rel="noopener noreferrer"&gt;reverse-engineer their success&lt;/a&gt;. This will enable you to find gaps in the market that they are missing, or refine your own approach to outperform them in the algorithm.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Deep market research
&lt;/h3&gt;

&lt;p&gt;YouTube comments and engagement metrics offer an unfiltered view of consumer sentiment. Scraping this data enables you to analyze user behavior, understand pain points, and discover emerging needs before they become mainstream. This is crucial for product development and tailoring your messaging to resonate with your target audience.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Global lead generation
&lt;/h3&gt;

&lt;p&gt;Trends vary drastically by region, so country-level scraping lets you discover new opportunities and potential customers by analyzing what different audiences are searching for and engaging with across more than 25 countries. Whether you are expanding into Brazil, India, or Japan, local trend data is invaluable.&lt;/p&gt;

&lt;h2&gt;
  
  
  The challenge with the official API and DIY scraping
&lt;/h2&gt;

&lt;p&gt;You might be asking, "Doesn't &lt;a href="https://developers.google.com/youtube/v3" rel="noopener noreferrer"&gt;YouTube have an API for this&lt;/a&gt;?" Yes, but it comes with significant limitations for serious data analysis:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Strict quotas:&lt;/strong&gt; The API has a daily quota that is easily exhausted by even modest data collection efforts. Scaling requires expensive enterprise agreements or complex quota management.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Limited availability:&lt;/strong&gt; Some granular data points, like specific trending categories in certain regions, can be difficult to access or require convoluted workarounds.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Maintenance:&lt;/strong&gt; If you build your own scraper using tools like Selenium or Playwright, you face a constant game of cat and mouse: YouTube frequently updates its HTML structure, breaking custom scripts without warning. You also need to manage rotating proxies and handle CAPTCHAs to avoid IP bans.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The solution: Serverless YouTube scrapers
&lt;/h2&gt;

&lt;p&gt;This is where &lt;a href="https://apify.com/store?fpr=eunit" rel="noopener noreferrer"&gt;&lt;strong&gt;Apify Actors&lt;/strong&gt;&lt;/a&gt; come in. Actors are cloud-based programs that can perform any action in a web browser, from scraping data to automating workflows. They handle the heavy lifting, such as infrastructure, proxies, scaling, and anti-scraping protections, so that you can focus on the data.&lt;/p&gt;

&lt;p&gt;We will focus on two specific Actors designed to solve the YouTube data puzzle:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://apify.com/eunit/youtube-trending-videos-by-categories" rel="noopener noreferrer"&gt;YouTube Trending Videos by Categories Scraper&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://apify.com/eunit/youtube-popular-channels-scraper" rel="noopener noreferrer"&gt;YouTube Popular Channels Scraper&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;These tools operate on a &lt;strong&gt;Pay-Per-Event&lt;/strong&gt; model, meaning you only pay for the successful results you get, not for the time the server runs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tool 1: YouTube Trending Videos by Categories Scraper
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkx2fpyj5wbe8nfd96id0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkx2fpyj5wbe8nfd96id0.png" alt="YouTube Trending Videos by Categories Scraper" width="800" height="122"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://apify.com/eunit/youtube-trending-videos-by-categories" rel="noopener noreferrer"&gt;The YouTube Trending Videos by Categories Scraper&lt;/a&gt; is your go-to tool for tracking viral content with precision. While standard tools may provide a generic "trending" list, this scraper enables granular filtering.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key features:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Category-specific scraping:&lt;/strong&gt; Don't just see what is trending overall. Drill down into specific niches like &lt;strong&gt;Music&lt;/strong&gt;, &lt;strong&gt;Gaming&lt;/strong&gt;, &lt;strong&gt;News &amp;amp; Politics&lt;/strong&gt;, &lt;strong&gt;Sports&lt;/strong&gt;, &lt;strong&gt;Film &amp;amp; Animation&lt;/strong&gt;, and more.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Global coverage:&lt;/strong&gt; Access real-time trends from over 25 countries, including the US, UK, Brazil, India, Japan, France, and Germany.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rich data extraction:&lt;/strong&gt; Get vastly more than just a video title. This Actor extracts the rank, full video metadata, channel name, view counts, like counts, comment counts, and thumbnail URLs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This level of detail is perfect for social media managers who need to spot viral gaming clips in the UK or music trends in South Korea instantly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tool 2: YouTube Popular Channels Scraper
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4i5zqnj7knkrr0jux4m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4i5zqnj7knkrr0jux4m.png" alt="YouTube Popular Channels Scraper" width="800" height="116"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://apify.com/eunit/youtube-popular-channels-scraper" rel="noopener noreferrer"&gt;The YouTube Popular Channels Scraper&lt;/a&gt; is the specialized Actor you need if your goal is influencer marketing or channel discovery. It focuses on the creators behind the content.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key features:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Influencer discovery:&lt;/strong&gt; Find out which channels are currently surging in popularity within a specific country or category.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Recent hits:&lt;/strong&gt; Instead of just channel stats, get a snapshot of their most recent trending videos. This helps you verify if a channel's popularity is current or fading.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Keyword intelligence:&lt;/strong&gt; Extract the "Popular Keywords" of the day. This is a goldmine for SEO specialists looking to optimize video titles and descriptions for search traffic.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Zero-to-data walkthrough: How to scrape YouTube trends in 3 minutes
&lt;/h2&gt;

&lt;p&gt;You don't need to be a Python wizard to get this data. Here is a step-by-step guide to downloading your first dataset using these scrapers on the &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify platform&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Access the YouTube Trending Videos by Categories Actor
&lt;/h3&gt;

&lt;p&gt;Navigate to the &lt;a href="https://apify.com/eunit/youtube-trending-videos-by-categories" rel="noopener noreferrer"&gt;YouTube Trending Videos by Categories&lt;/a&gt; page on the &lt;a href="https://apify.com/store?fpr=eunit" rel="noopener noreferrer"&gt;Apify Store&lt;/a&gt;. Click the &lt;strong&gt;Try for free&lt;/strong&gt; button. If you don't have an account, you can sign up using your email or GitHub credentials.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Configure the input
&lt;/h3&gt;

&lt;p&gt;Once you are in the Apify Console, you will see a user-friendly input form. This is where you define exactly what you want to scrape.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr6d2hmun09jcp6gsbw7m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr6d2hmun09jcp6gsbw7m.png" alt="YouTube Trending Videos by Categories Actor input form" width="800" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Country:&lt;/strong&gt; Select your target region from the dropdown menu (e.g., &lt;code&gt;United Kingdom&lt;/code&gt; or &lt;code&gt;United States&lt;/code&gt;). Leaving this blank or selecting &lt;code&gt;World&lt;/code&gt; will retrieve global trends.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Category:&lt;/strong&gt; Choose a specific vertical. For example, if you are analyzing the gaming market, select &lt;code&gt;Gaming&lt;/code&gt;. For a broad overview, you can select &lt;code&gt;All&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
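The two form fields above map directly to the Actor's JSON input. A minimal sketch in Python, using the field names and value slugs from the API example later in this article (check the Actor's input schema for the full list of accepted values):

```python
import json

# Minimal Actor input mirroring the form fields above. The keys
# ("country", "category") and slugs follow this article's API example;
# the exact accepted values come from the Actor's input schema.
run_input = {
    "country": "united-kingdom",  # omit or use a "World" option for global trends
    "category": "gaming",         # e.g. gaming, music, sports
}

# The same object can be pasted into the Console's JSON input tab.
print(json.dumps(run_input, indent=2))
```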

&lt;h3&gt;
  
  
  Step 3: Run the YouTube Trending Videos by Categories Actor
&lt;/h3&gt;

&lt;p&gt;Click the green &lt;strong&gt;Start&lt;/strong&gt; button at the bottom of the page.&lt;/p&gt;

&lt;p&gt;The Actor will now spin up a cloud server, navigate to YouTube, extract the data using high-quality residential proxies to avoid detection, and process the results. This usually takes just a few seconds to a minute, depending on the amount of data.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Export the data
&lt;/h3&gt;

&lt;p&gt;When the run finishes, you will see a status of "Succeeded." Click on the &lt;strong&gt;Export&lt;/strong&gt; button to view your data. You can download it in various formats depending on your needs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Excel / CSV:&lt;/strong&gt; Perfect for opening in spreadsheets to perform quick pivot tables or charts.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JSON:&lt;/strong&gt; Ideal if you are feeding this data into another application or database.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;HTML / XML:&lt;/strong&gt; Available for legacy integrations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You now have a clean, structured dataset of the top trending videos, ready for analysis.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1ltlsixip5uza1n7b7ky.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1ltlsixip5uza1n7b7ky.png" alt="YouTube Trending Videos by Categories Actor output" width="800" height="467"&gt;&lt;/a&gt;&lt;/p&gt;
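If you export as JSON, the dataset drops straight into a few lines of Python. A sketch using a tiny inline sample shaped like the fields this article's own script reads (`rank`, `video_title`, `views`, `channel_name`); your real export may contain more fields:

```python
import json

# Tiny inline sample shaped like the exported JSON. The field names are
# the ones read by this article's Python example; the rows themselves
# are made up for illustration.
exported = json.loads("""
[
  {"rank": 2, "video_title": "Speedrun world record", "views": "1.2M", "channel_name": "GameDoc"},
  {"rank": 1, "video_title": "New console teardown", "views": "3.4M", "channel_name": "TechLab"}
]
""")

# Sort by trending rank and print a quick leaderboard.
for item in sorted(exported, key=lambda v: v["rank"]):
    print(f"#{item['rank']} | {item['video_title']} by {item['channel_name']} ({item['views']} views)")
```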

&lt;h2&gt;
  
  
  Zero-to-data walkthrough: How to scrape Popular Channels
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Step 1: Access the YouTube Popular Channels Scraper Actor
&lt;/h3&gt;

&lt;p&gt;Navigate to the &lt;a href="https://apify.com/eunit/youtube-popular-channels-scraper" rel="noopener noreferrer"&gt;YouTube Popular Channels Scraper&lt;/a&gt; page on the &lt;a href="https://apify.com/store?fpr=eunit" rel="noopener noreferrer"&gt;Apify Store&lt;/a&gt;. Click the &lt;strong&gt;Try for free&lt;/strong&gt; button. If you don't have an account, you can sign up using your email or GitHub credentials.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Configure the channels scraper input
&lt;/h3&gt;

&lt;p&gt;Once you are in the Apify Console, the input form allows you to specify your target audience.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Country:&lt;/strong&gt; Select your target region from the dropdown menu (e.g., &lt;code&gt;United Kingdom&lt;/code&gt; or &lt;code&gt;India&lt;/code&gt;). Leaving this blank or selecting &lt;code&gt;World&lt;/code&gt; will fetch global popular channels.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Category:&lt;/strong&gt; Choose a specific vertical. For example, if you are looking for gaming influencers, select &lt;code&gt;Gaming&lt;/code&gt;. For a broad overview, you can select &lt;code&gt;All&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F61qm25jz3fsbhix1cche.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F61qm25jz3fsbhix1cche.png" alt="YouTube Popular Channels Scraper input form" width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Run the YouTube Popular Channels Scraper
&lt;/h3&gt;

&lt;p&gt;Click the green &lt;strong&gt;Start&lt;/strong&gt; button at the bottom of the page.&lt;/p&gt;

&lt;p&gt;The Actor will now spin up a cloud server, navigate to YouTube, extract the popular channel data using high-quality residential proxies to avoid detection, and process the results. This usually takes just a few seconds to a minute, depending on the amount of data being processed.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Export the channels data
&lt;/h3&gt;

&lt;p&gt;When the run finishes, you will see a status of "Succeeded." Click on the &lt;strong&gt;Export&lt;/strong&gt; button to view your data. You can download it in various formats depending on your needs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Excel / CSV:&lt;/strong&gt; Perfect for opening in spreadsheets to perform quick pivot tables or charts.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JSON:&lt;/strong&gt; Ideal if you are feeding this data into another application or database.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;HTML / XML:&lt;/strong&gt; Available for legacy integrations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You now have a clean, structured dataset of the top popular channels and keywords, ready for analysis.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjfzptnjxyg8l07jdn4dx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjfzptnjxyg8l07jdn4dx.png" alt="YouTube Popular Channels Scraper output" width="800" height="458"&gt;&lt;/a&gt;&lt;/p&gt;
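One quick use for the channels export is tallying which keywords recur across popular channels. A sketch with hypothetical rows; treat the field names `channel_name` and `keywords` as placeholders for whatever the Actor's output schema actually calls them:

```python
from collections import Counter

# Hypothetical exported rows: the real field names depend on the Actor's
# output schema, so "channel_name" and "keywords" are placeholders here.
rows = [
    {"channel_name": "TechLab", "keywords": ["teardown", "console"]},
    {"channel_name": "GameDoc", "keywords": ["speedrun", "console"]},
]

# Tally how often each keyword appears across popular channels.
counts = Counter(kw for row in rows for kw in row["keywords"])
print(counts.most_common(1))  # -> [('console', 2)]
```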

&lt;h2&gt;
  
  
  For developers: Automate with Python
&lt;/h2&gt;

&lt;p&gt;If you are building a custom dashboard or feeding data into a machine learning model, you will want to automate this process. &lt;a href="https://docs.apify.com/api/client/python/" rel="noopener noreferrer"&gt;The &lt;code&gt;apify-client&lt;/code&gt; for Python&lt;/a&gt; makes this incredibly simple.&lt;/p&gt;

&lt;p&gt;First, install the client:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;apify-client
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, you can write a simple script to fetch trending gaming videos in the UK or any country of your choosing. You will find your API token in the &lt;strong&gt;Settings&lt;/strong&gt; &amp;gt; &lt;strong&gt;API &amp;amp; Integrations&lt;/strong&gt; section of the Apify Console.&lt;br&gt;
&lt;/p&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;apify_client&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ApifyClient&lt;/span&gt;

&lt;span class="c1"&gt;# 1. Initialize the ApifyClient with your API token
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ApifyClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_API_TOKEN&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# 2. Prepare the Actor input
# We want trending Gaming videos in the UK
&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;country&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;united-kingdom&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;category&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gaming&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🎥 Fetching YouTube gaming trends for UK...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# 3. Run the Actor
# We use the actor ID: eunit/youtube-trending-videos-by-categories
&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;eunit/youtube-trending-videos-by-categories&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;call&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# 4. Fetch and print Actor results from the run's dataset
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Stats for run &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Iterate through the items in the dataset
&lt;/span&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nf"&gt;iterate_items&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="c1"&gt;# item is a dictionary containing all the scraped data
&lt;/span&gt;    &lt;span class="n"&gt;rank&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rank&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;N/A&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;video_title&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Unknown&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;views&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;views&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;0&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;channel&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;channel_name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Unknown&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;rank&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; | &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;title&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; by &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; (&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;views&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; views)&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This script can be set to run on a schedule (using Cron or Apify Schedules), ensuring your database is always up to date with the latest trends without manual intervention.&lt;/p&gt;
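For the plain-cron route, the idea is to wrap the fetch logic in an entry point that cron can trigger unattended. A sketch, assuming a hypothetical `fetch_trends.py` that wraps the client code above:

```python
# Sketch of a schedulable entry point. fetch_trending() is a hypothetical
# stub standing in for the ApifyClient code shown earlier.
# Hypothetical crontab line to run this hourly:
#   0 * * * * /usr/bin/python3 /opt/trends/fetch_trends.py

def fetch_trending() -> list:
    # Stub so the sketch runs standalone; replace its body with the
    # client.actor(...).call(...) / dataset iteration from the script above.
    return []

def main() -> int:
    items = fetch_trending()
    # Return the item count so a wrapper can log or alert on empty runs.
    return len(items)

if __name__ == "__main__":
    main()
```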

&lt;h2&gt;
  
  
  Pricing: The pay-per-event advantage
&lt;/h2&gt;

&lt;p&gt;One of the biggest advantages of using these specific Apify Actors is &lt;a href="https://help.apify.com/en/articles/10700066-what-is-pay-per-event" rel="noopener noreferrer"&gt;the &lt;strong&gt;Pay-Per-Event&lt;/strong&gt; pricing model&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Traditional SaaS tools often charge hefty monthly subscriptions regardless of how much you use them. With Pay-Per-Event, you are charged a small, fixed amount for each result you successfully scrape.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cost-effective:&lt;/strong&gt; If you only need to scrape trends once a week, you pay pennies. You don't waste money on idle server time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Predictable:&lt;/strong&gt; You can easily calculate costs. If scraping 1,000 videos costs $X, then scraping 10,000 videos costs $10X. There are no hidden tiers or surprise overage fees.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scale effortlessly:&lt;/strong&gt; Need to scrape data for 20 different countries simultaneously? You can trigger 20 concurrent runs via the API. The cost per result remains the same, and Apify's infrastructure handles the load.&lt;/li&gt;
&lt;/ul&gt;
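The linear-cost claim above is easy to sanity-check in a few lines of Python. The per-result price here is a made-up placeholder; check the Actor's pricing page for the real figure:

```python
# Back-of-the-envelope estimate for pay-per-event pricing.
# COST_PER_RESULT is an illustrative placeholder, not the Actor's actual price.
COST_PER_RESULT = 0.002  # dollars per scraped result (assumed)

def estimated_cost(n_results: int, cost_per_result: float = COST_PER_RESULT) -> float:
    """Cost scales linearly with results: no tiers, no idle-time charges."""
    return n_results * cost_per_result

print(f"1,000 videos:  ${estimated_cost(1_000):.2f}")
print(f"10,000 videos: ${estimated_cost(10_000):.2f}")  # exactly 10x the line above
```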

&lt;h2&gt;
  
  
  Legal and ethical considerations
&lt;/h2&gt;

&lt;p&gt;When scraping public data, it is important to operate ethically. These Actors are designed to extract publicly available information—data that any user could see by visiting the website without logging in. They do not access private videos, user passwords, or personal emails.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;YouTube data is too valuable to be locked behind restrictive APIs or complex coding barriers. Whether you are a solo content creator looking to optimize your next thumbnail or a marketing agency analyzing global trends for a Fortune 500 client, having reliable, scalable access to this data is a game-changer.&lt;/p&gt;

&lt;p&gt;By using the &lt;strong&gt;&lt;a href="https://apify.com/eunit/youtube-trending-videos-by-categories" rel="noopener noreferrer"&gt;YouTube Trending Videos by Categories&lt;/a&gt;&lt;/strong&gt; and &lt;strong&gt;&lt;a href="https://apify.com/eunit/youtube-popular-channels-scraper" rel="noopener noreferrer"&gt;YouTube Popular Channels Scraper&lt;/a&gt;&lt;/strong&gt; Actors, you can bypass the technical hurdles and go straight to the insights.&lt;/p&gt;

&lt;p&gt;Happy scraping!&lt;/p&gt;

</description>
      <category>youtube</category>
      <category>apify</category>
      <category>trends</category>
      <category>scraping</category>
    </item>
    <item>
      <title>How to Scrape Twitter (X) Trends Without Breaking the Bank</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Fri, 02 Jan 2026 09:26:00 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-scrape-twitter-x-trends-without-breaking-the-bank-bli</link>
      <guid>https://dev.to/eunit/how-to-scrape-twitter-x-trends-without-breaking-the-bank-bli</guid>
      <description>&lt;p&gt;If you want to know what the world is thinking right now, you look at Twitter (X). With &lt;a href="https://www.internetlivestats.com/twitter-statistics/" rel="noopener noreferrer"&gt;over 500 million tweets sent every single day&lt;/a&gt;, Twitter (X) is the undisputed pulse of the internet. &lt;a href="https://www.niemanlab.org/2020/03/twitter-the-most-news-friendly-social-platform-is-getting-a-little-bit-less-so-with-stories-like-fleets/" rel="noopener noreferrer"&gt;It is described as the place where news breaks&lt;/a&gt;, memes go viral, and public opinion shifts in real-time. In fact, nearly &lt;a href="https://www.bbc.com/news/articles/c93lzyxkklpo" rel="noopener noreferrer"&gt;half of all US social media users rely on X specifically to get their latest news&lt;/a&gt;. For marketers, researchers, and developers, this real-time data is a goldmine, but lately, it’s been buried behind an incredibly expensive paywall.&lt;/p&gt;

&lt;p&gt;Since the platform's overhaul, the days of easy, free API access are gone. Official enterprise tiers now command &lt;a href="https://mashable.com/article/twitter-elon-musk-paid-enterprise-api-access-pricing?utm_campaign=mash-com-tw-main-link&amp;amp;utm_contnet=tech&amp;amp;utm_medium=twitter&amp;amp;utm_source=social" rel="noopener noreferrer"&gt;prices that start at a staggering $42,000 a month&lt;/a&gt;. Practically overnight, individual developers and small teams were priced out of the conversation. And if you’ve tried building your own "DIY" scraper lately, you know it’s a constant battle against IP bans and fragile selectors that break every time the site updates its layout.&lt;/p&gt;

&lt;p&gt;If you’re done fighting with broken scripts or staring at impossible API invoices, this article will guide you towards scraping Twitter (X) trends without breaking the bank. We’re going to explore how you can scrape Twitter Trend data reliably and affordably in 2026 and beyond using the &lt;a href="https://apify.com/eunit/x-twitter-trends-scraper-scraper" rel="noopener noreferrer"&gt;&lt;strong&gt;Twitter (X) Trends Scraper&lt;/strong&gt;&lt;/a&gt; on Apify.&lt;/p&gt;

&lt;h2&gt;
  
  
  The "Now" Economy: Why You Can't Afford to Ignore Twitter Trends
&lt;/h2&gt;

&lt;p&gt;Before we dive into the &lt;em&gt;how&lt;/em&gt;, let's talk about the &lt;em&gt;why&lt;/em&gt;. Why are brands, hedge funds, and news agencies so obsessed with "Trending Topics"?&lt;/p&gt;

&lt;p&gt;Because social media moves so fast that if you blink, you're already behind. We’re currently living in a world where the rules are being rewritten every few weeks. &lt;a href="https://www.mirror.co.uk/3am/ai-influencers-new-normal-social-36004809" rel="noopener noreferrer"&gt;AI influencers with millions of followers are the new norm&lt;/a&gt;, raw audio-first spaces are replacing video curation, and one-click social shopping has fundamentally changed how we spend money.&lt;/p&gt;

&lt;p&gt;The old ways of measuring success are dying, too. We're seeing a pivot away from "likes" toward meaningful shares and saves, &lt;a href="https://onemanandhisblog.com/2017/09/finstas-instagram-identity/" rel="noopener noreferrer"&gt;while the rise of "finstas" (fake Instagram accounts)&lt;/a&gt; shows just how much people are craving raw, unfiltered connection over curated perfection. In a landscape marked by ephemeral content and highly personalized niche communities, you either have real-time data or you’re invisible.&lt;/p&gt;

&lt;p&gt;This is why &lt;strong&gt;X (Twitter) is the global town square&lt;/strong&gt;. While other platforms are for highlights and professional posturing, X is where news breaks, memes are born, and brand reputations are either made or broken in real time. If you aren't tracking the pulse of X, you're navigating the digital landscape with a ten-year-old map, and that is a fast route to obsolescence.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Scrape Twitter?
&lt;/h2&gt;

&lt;p&gt;Monitoring trends isn't just about watching hashtags; it's about decoding the world’s most active conversation in real time. Here is how businesses, researchers, and creators are turning raw X data into a competitive advantage:&lt;/p&gt;

&lt;h3&gt;
  
  
  Master the Art of "Newsjacking"
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://www.newsjacking.com/what-is-newsjacking-newsjacking" rel="noopener noreferrer"&gt;Newsjacking&lt;/a&gt; is the skill of injecting your brand into a breaking story at the perfect moment. Think back to &lt;a href="https://www.forbes.com/sites/jenniferrooney/2013/02/04/behind-the-scenes-of-oreos-real-time-super-bowl-slam-dunk/" rel="noopener noreferrer"&gt;Oreo’s "You can still dunk in the dark" tweet&lt;/a&gt; during a Super Bowl blackout. That tweet wasn't luck; it was real-time awareness. To pull this off, you need to know what is trending the moment it happens. Speed is the only metric that matters here.&lt;/p&gt;

&lt;h3&gt;
  
  
  Hyper-Local Market Intelligence
&lt;/h3&gt;

&lt;p&gt;Global trends are interesting, but local trends are profitable. If you’re a retailer, knowing that &lt;code&gt;#WinterCoats&lt;/code&gt; is trending in New York while &lt;code&gt;#BeachVibes&lt;/code&gt; is dominant in Sydney allows for precision targeting. No matter where you are, whether in Lagos, London, or Los Angeles, having access to granular location data ensures your message hits the right people at the right time.&lt;/p&gt;
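As a sketch of that comparison, and assuming you have already scraped trend lists for two locations, a simple set difference surfaces what is unique to each market. The trend names below are invented for illustration; in practice each list would come from a separate scraper run targeting that location:

```python
# Illustrative only: these trend lists are made up. In a real pipeline,
# each list would come from a location-targeted scraper run.
new_york = ["#WinterCoats", "#NFL", "#Broadway"]
sydney = ["#BeachVibes", "#NFL", "#Cricket"]

only_ny = set(new_york) - set(sydney)   # trends unique to New York
only_syd = set(sydney) - set(new_york)  # trends unique to Sydney
shared = set(new_york) & set(sydney)    # trending in both markets

print("Unique to New York:", sorted(only_ny))
print("Unique to Sydney:", sorted(only_syd))
print("Trending in both:", sorted(shared))  # → ['#NFL']
```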

&lt;h3&gt;
  
  
  Unmasking Customer Behavior &amp;amp; Sentiment
&lt;/h3&gt;

&lt;p&gt;What are people &lt;em&gt;actually&lt;/em&gt; feeling? By scraping interactions and performing &lt;a href="https://www.ibm.com/think/topics/sentiment-analysis" rel="noopener noreferrer"&gt;sentiment analysis&lt;/a&gt;, you can move past raw numbers to understand customer pain points and expectations. This emotional intelligence helps you tailor your products and services to what your audience actually wants, rather than what you think they want.&lt;/p&gt;
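As a toy illustration of the idea (a real project would reach for a proper model such as VADER or a transformer-based classifier), even a crude keyword count can turn raw tweet text into a sentiment signal. The word lists and tweets here are invented examples, not a real lexicon:

```python
# A deliberately naive sentiment scorer: counts positive vs. negative
# keywords. The word lists and sample tweets are illustrative only.
POSITIVE = {"love", "great", "amazing", "fast"}
NEGATIVE = {"hate", "broken", "slow", "refund"}

def score(tweet: str) -> int:
    """Positive minus negative keyword hits; >0 positive, <0 negative."""
    words = set(tweet.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

tweets = [
    "I love how fast the new update is",
    "App is broken again, I want a refund",
]
for t in tweets:
    label = "positive" if score(t) > 0 else "negative" if score(t) < 0 else "neutral"
    print(f"{label:>8}: {t}")
```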

&lt;h3&gt;
  
  
  Competitive Intelligence on Autopilot
&lt;/h3&gt;

&lt;p&gt;Monitoring your competitors’ activities reveals their marketing playbook in real-time. From spotting their latest product launches to seeing how their customers are reacting to a new policy, this data provides you with the tactical insights you need to stay one step ahead.&lt;/p&gt;

&lt;h3&gt;
  
  
  A Goldmine for Social &amp;amp; Academic Research
&lt;/h3&gt;

&lt;p&gt;For researchers studying everything from political movements to public health trends, X is the ultimate dataset. Automated scraping provides the scale and historical context needed to uncover patterns, track the spread of information, and gauge public opinion on a global scale—all backed by hard data.&lt;/p&gt;

&lt;h3&gt;
  
  
  Content Inspiration on Tap
&lt;/h3&gt;

&lt;p&gt;Writer's block is a productivity killer. By examining trending "tag clouds" and emerging discussions, content creators can instantly see what topics are currently resonating. This ensures your next post or video rides an existing wave of interest rather than struggling for attention in a vacuum.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Problem: The Great Data Wall of X
&lt;/h2&gt;

&lt;p&gt;The business case for this data is ironclad, but actually accessing it has become a significant challenge.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Death of the Free API
&lt;/h3&gt;

&lt;p&gt;For a decade, the Twitter API was the playground of the internet. You could write a Python script in 10 minutes and stream tweets. Today, those gates are locked tight.&lt;/p&gt;

&lt;p&gt;X has replaced the once-open ecosystem with a tiered &lt;a href="https://docs.x.com/x-api/introduction" rel="noopener noreferrer"&gt;pricing structure&lt;/a&gt; that makes data access either trivial or prohibitively expensive:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Free ($0/mo):&lt;/strong&gt; Strictly for testing and bots that post content. You get a measly 100 post reads per month. That’s barely enough to refresh a single feed once.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Basic ($200/mo):&lt;/strong&gt; Aimed at hobbyists and prototypes. It bumps you up to 15,000 reads per month, which disappears in a flash if you’re trying to track shifting trends across different regions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pro ($5,000/mo):&lt;/strong&gt; This is where you get 1 million reads and full-archive search. It’s for scaling businesses, but the price tag is a massive barrier for most developers and small teams.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enterprise:&lt;/strong&gt; For anything larger, you’re looking at custom solutions. With enterprise pricing rumored to start at $42,000 a month, it’s clear that a "Pay-to-Play" reality has replaced the "Golden Age" of data access.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you’re an indie developer or a small marketing team, these invoices aren't just an expense; they can be a death sentence for your project.&lt;/p&gt;

&lt;h3&gt;
  
  
  The "DIY Scraper" Trap
&lt;/h3&gt;

&lt;p&gt;"Fine," you think, "I'll just whip up a Python script with Selenium or Playwright and do it myself."&lt;/p&gt;

&lt;p&gt;Before you start writing that first line of code, you need to understand the technical gauntlet X has thrown down. Modern Twitter isn't just a website; it’s a system designed to keep automated traffic out. Here is why your "quick weekend project" will likely turn into a full-time maintenance nightmare:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Obfuscated and Dynamic DOM:&lt;/strong&gt; X doesn't use semantic classes like &lt;code&gt;trend-item&lt;/code&gt;. Instead, they use auto-generated, randomized strings such as &lt;code&gt;css-175oi2r r-18u37iz&lt;/code&gt;. These can change during a deployment or even based on your session. Your selectors will break frequently, requiring you to rewrite your parsing logic every week.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Aggressive Fingerprinting:&lt;/strong&gt; It’s not just about your IP address anymore. X analyzes your browser fingerprint, Canvas rendering, WebGL info, and even how you handle headers. If your headless browser looks like a bot, you’ll be met with a "Something went wrong" screen or a permanent block before you even fetch the trends.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Resource Drain of SPAs:&lt;/strong&gt; X is a heavy Single Page Application (SPA). To scrape it, you must run a full headless browser to execute JavaScript and handle infinite scrolling. This consumes massive amounts of CPU and RAM, especially when scaling. What starts "free" on your laptop quickly becomes a triple-digit server bill.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Evolving Bot Detection:&lt;/strong&gt; Between &lt;a href="https://www.cloudflare.com/learning/bots/what-is-bot-detection/" rel="noopener noreferrer"&gt;Cloudflare protection&lt;/a&gt; and internal behavioral analysis, X is constantly looking for patterns. If you don't vary your mouse movements, scroll speeds, and request intervals perfectly, your scraper will be flagged and throttled into oblivion.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Suddenly, that "free" DIY project is costing you 20+ hours a week in troubleshooting to keep the data flowing.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Solution: The Twitter (X) Trends Scraper on Apify
&lt;/h2&gt;

&lt;p&gt;Enter the &lt;strong&gt;&lt;a href="https://apify.com/eunit/x-twitter-trends-scraper" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper&lt;/a&gt;&lt;/strong&gt;. This isn't just another script; it's a production-grade Actor hosted on the Apify platform. It serves as a bridge between the chaos of the open web and the structured data needs of your business. It bypasses the complexity of the official API and the fragility of DIY scrapers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvrhjvh0rgcezm7yi3h8r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvrhjvh0rgcezm7yi3h8r.png" alt="X (Twitter) Trends Scraper" width="800" height="460"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is why it’s the superior choice for 2026 and beyond.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Granular Location Targeting (The Secret Sauce)
&lt;/h3&gt;

&lt;p&gt;Most generic scrapers only provide options for "Worldwide" or "USA". This Actor goes deeper. Much deeper. It uses &lt;strong&gt;Twitter (X)&lt;/strong&gt; data to let you &lt;a href="https://apify.com/eunit/x-twitter-trends-scraper/input-schema" rel="noopener noreferrer"&gt;select specific &lt;strong&gt;Cities&lt;/strong&gt; and &lt;strong&gt;Countries&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Want to know what's buzzing in &lt;strong&gt;Kano, Nigeria&lt;/strong&gt;? You can.&lt;/li&gt;
&lt;li&gt;Need to compare trends in &lt;strong&gt;Birmingham, UK&lt;/strong&gt; vs &lt;strong&gt;Birmingham, Alabama&lt;/strong&gt;? Done.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This level of detail is critical for localized marketing campaigns and sociological research.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. No Proxies? No Problem
&lt;/h3&gt;

&lt;p&gt;If you were running a scraper locally, you'd need to buy a pool of Rotating Residential Proxies to avoid getting blocked. These can cost hundreds of dollars a month. The &lt;a href="https://apify.com/eunit/x-twitter-trends-scraper" rel="noopener noreferrer"&gt;Twitter Trend Scraper&lt;/a&gt; handles all of this infrastructure for you. When you run the &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Actor&lt;/a&gt; on &lt;a href="https://www.apify.com?fpr=eunit" rel="noopener noreferrer"&gt;Apify&lt;/a&gt;, it utilizes their vast pool of IP addresses. You don't need to configure anything; you click "Run".&lt;/p&gt;
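If you'd rather trigger it from code than from the Console, the official `apify-client` Python package can start a run and pull the results. A minimal sketch follows; note that the input field name (`country`) is an assumption, so check the Actor's input schema for the exact field names:

```python
# Sketch: run the Twitter (X) Trends Scraper via the Apify API.
# Requires `pip install apify-client` and an API token from your Apify
# account. The input key below ('country') is an assumption; consult the
# Actor's input schema for the real field names.

def build_run_input(country: str = "Worldwide") -> dict:
    """Assemble the Actor input (the 'country' field name is assumed)."""
    return {"country": country}

def fetch_trends(token: str, country: str = "Worldwide") -> list:
    from apify_client import ApifyClient  # imported lazily; needs apify-client installed
    client = ApifyClient(token)
    # .call() starts the Actor and blocks until the run finishes.
    run = client.actor("eunit/x-twitter-trends-scraper").call(
        run_input=build_run_input(country)
    )
    # Scraped items land in the run's default dataset.
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())

# Usage (needs a real token):
#   items = fetch_trends("apify_api_XXXX", country="Worldwide")
#   print(items[0]["country_input"])
```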

&lt;h3&gt;
  
  
  3. Pay-Per-Event Pricing (The Fair Model)
&lt;/h3&gt;

&lt;p&gt;Subscription models are annoying. Why pay $99/month if you only need to scrape data once a week? The &lt;a href="https://apify.com/eunit/x-twitter-trends-ppe" rel="noopener noreferrer"&gt;Twitter Trend Scraper&lt;/a&gt; operates on a &lt;strong&gt;Pay-Per-Event&lt;/strong&gt; model. You are charged a tiny fee only when you successfully scrape a scraping run. If you run it 10 times, you pay for 10 runs. If you don't use it for a month, you pay $0. It scales perfectly with your needs, from a hobbyist project to an enterprise dashboard.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Rich, Structured Data Output
&lt;/h3&gt;

&lt;p&gt;It doesn't just give you a list of hashtags. You get a comprehensive JSON dataset including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Rank:&lt;/strong&gt; Is it #1 or #49?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Name:&lt;/strong&gt; The hashtag or keyword.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tweet Volume:&lt;/strong&gt; "10K Tweets" vs "2M Tweets".&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Context:&lt;/strong&gt; The direct link to the search page.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;History:&lt;/strong&gt; Hourly timeline data.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Example Output from the Actor
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"scraped_at"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2025-12-31T19:00:09.948042"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"country_input"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Worldwide"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"timeline"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"datetime"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Wed Dec 31 2025 18:08:19 GMT+0000 (Coordinated Universal Time)"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"timestamp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1767204499.747"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"trends"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Happy New Year"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=Happy%20New%20Year"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2141151"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"#CDTVライブライブ"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=%23CDTV%E3%83%A9%E3%82%A4%E3%83%96%E3%83%A9%E3%82%A4%E3%83%96"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"269605"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="err"&gt;//&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"datetime"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Wed Dec 31 2025 17:16:15 GMT+0000 (Coordinated Universal Time)"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"timestamp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1767201375.892"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"trends"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Happy New Year"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=Happy%20New%20Year"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1708901"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"新年早々"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=%E6%96%B0%E5%B9%B4%E6%97%A9%E3%80%85"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"80971"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="err"&gt;//&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"datetime"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Wed Dec 31 2025 16:24:31 GMT+0000 (Coordinated Universal Time)"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"timestamp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1767198271.186"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"trends"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Happy New Year"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=Happy%20New%20Year"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1270217"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"#STARTOtoMOVE生配信"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=%23STARTOtoMOVE%E7%94%9F%E9%85%8D%E4%BF%A1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"282088"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="err"&gt;//&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"datetime"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Wed Dec 31 2025 16:24:18 GMT+0000 (Coordinated Universal Time)"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"timestamp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1767198258.898"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"trends"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Happy New Year"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=Happy%20New%20Year"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1270217"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"rank"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"#STARTOtoMOVE生配信"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"link"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://twitter.com/search?q=%23STARTOtoMOVE%E7%94%9F%E9%85%8D%E4%BF%A1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"tweet_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"282088"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="err"&gt;//&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;//&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"tag_cloud"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"table_data"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  How to Get Started (In Under 5 Minutes)
&lt;/h2&gt;

&lt;p&gt;You don't need to be a coding wizard to use this Actor. Here is your Zero-to-Data walkthrough.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Access the Actor
&lt;/h3&gt;

&lt;p&gt;Head over to the &lt;a href="https://apify.com/eunit/x-twitter-trends-scraper" rel="noopener noreferrer"&gt;Twitter (X) Trends Scraper page on Apify&lt;/a&gt;. Click the &lt;strong&gt;"Try for free"&lt;/strong&gt; button to create your account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzfj6fbnfwktgck5mujtk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzfj6fbnfwktgck5mujtk.png" alt="Twitter (X) Trends Scraper page on Apify" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Configure Your Input
&lt;/h3&gt;

&lt;p&gt;You will see a user-friendly interface.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Look for the &lt;strong&gt;Country&lt;/strong&gt; dropdown.&lt;/li&gt;
&lt;li&gt;Select your target. Let's say... &lt;code&gt;United Kingdom - London&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;(Optional) You can leave everything else as the default.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step 3: Run &amp;amp; Export
&lt;/h3&gt;

&lt;p&gt;Hit the green &lt;strong&gt;Save &amp;amp; Start&lt;/strong&gt; button at the bottom.&lt;br&gt;
Boom. The Actor will spin up, navigate the web, extract the data, and shut down.&lt;br&gt;
In a few seconds, you will see your results in the &lt;strong&gt;Output&lt;/strong&gt; tab. You can view them as a table or download them in &lt;strong&gt;JSON, CSV, Excel, or XML&lt;/strong&gt; formats.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nev13l7tl04wzq3v1y8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nev13l7tl04wzq3v1y8.png" alt="Twitter (X) Trends Scraper page on Apify" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;
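&lt;p&gt;If you prefer to pull the export programmatically rather than clicking download, the Apify API exposes each run's dataset items with a &lt;code&gt;format&lt;/code&gt; query parameter. Here is a minimal sketch for building that download URL (the dataset ID and token below are placeholders):&lt;/p&gt;

```python
# Sketch: build an Apify dataset export URL. The dataset ID and
# token are placeholders; use the values from your own run.
SUPPORTED_FORMATS = {"json", "csv", "xlsx", "xml"}

def export_url(dataset_id: str, fmt: str, token: str) -> str:
    """Direct download link for a run's dataset items."""
    if fmt not in SUPPORTED_FORMATS:
        raise ValueError(f"unsupported format: {fmt}")
    return (
        f"https://api.apify.com/v2/datasets/{dataset_id}/items"
        f"?format={fmt}&token={token}"
    )

print(export_url("abc123", "csv", "MY_TOKEN"))
```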
&lt;h2&gt;
  
  
  For the Developers: Automating the Pipeline
&lt;/h2&gt;

&lt;p&gt;If you &lt;em&gt;are&lt;/em&gt; a developer, you probably want to integrate this Actor into your own app. Maybe you're building a dashboard that alerts you when crypto coins start trending.&lt;/p&gt;

&lt;p&gt;You can control this Actor programmatically using the &lt;a href="https://docs.apify.com/api/client/python/" rel="noopener noreferrer"&gt;Apify Client&lt;/a&gt; for Python (or Node.js).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;apify_client&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ApifyClient&lt;/span&gt;

&lt;span class="c1"&gt;# 1. Initialize the client
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ApifyClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_API_TOKEN&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# 2. Define your Input
&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;country&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;united-states/new-york&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;# 3. Call the Actor
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Fetching trends...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;run&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;eunit/x-twitter-trends-scraper&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;call&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# 4. Fetch the results
&lt;/span&gt;&lt;span class="n"&gt;dataset_items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nf"&gt;iterate_items&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;dataset_items&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Stats for &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;country_input&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;trend&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;timeline&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;trends&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][:&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rank&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; - &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;trend&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;tweet_count&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This snippet does in roughly ten lines what 500 lines of Selenium code would struggle (and likely fail) to do.&lt;/p&gt;
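&lt;p&gt;Once the trends are in hand, turning the raw counts into something analyzable takes only a few more lines. As a quick example, here is a simple "share of conversation" metric, sketched against the output shape shown in the sample above:&lt;/p&gt;

```python
# Sketch: each trend's share of total tweet volume, using the
# dataset shape from the sample output above (tweet_count is a string).
trends = [
    {"rank": 1, "name": "Happy New Year", "tweet_count": "1270217"},
    {"rank": 2, "name": "#STARTOtoMOVE生配信", "tweet_count": "282088"},
]

total = sum(int(t["tweet_count"]) for t in trends)
shares = {t["name"]: round(int(t["tweet_count"]) / total * 100, 1) for t in trends}
print(shares)  # {'Happy New Year': 81.8, '#STARTOtoMOVE生配信': 18.2}
```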

&lt;h2&gt;
  
  
  Is Scraping X Legal?
&lt;/h2&gt;

&lt;p&gt;Ethical scraping is important. This Actor scrapes &lt;strong&gt;publicly available factual data&lt;/strong&gt; (trends). It does not scrape private profiles, log into accounts, or collect personal data behind a login wall.&lt;br&gt;
Scraping public information is generally considered legal, but it is essential to review the Terms of Service of any site you interact with and to ensure your usage complies with relevant regulations such as the GDPR.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;Let’s be real: trying to keep up with Twitter (X) without automated data is like trying to catch a waterfall with a spoon. You might see a few drops, but the wave you need to ride has usually passed you by.&lt;/p&gt;

&lt;p&gt;The $42,000-per-month "pay-to-play" era seemed designed to put social listening out of reach for everyone but the biggest corporations. But as we've seen, you don't need a massive enterprise budget to get enterprise-grade insights.&lt;/p&gt;

&lt;p&gt;Stop wasting your time fighting with broken Selenium scripts or staring at invoices you can't justify. Focus on what you actually do best: analyzing the trends and making informed business decisions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://apify.com/eunit/x-twitter-trends-scraper" rel="noopener noreferrer"&gt;Try the Twitter (X) Trends Scraper today&lt;/a&gt;&lt;/strong&gt; and start turning the chaos of social media into a clear signal for your success.&lt;/p&gt;

</description>
      <category>socialmedia</category>
      <category>twitter</category>
      <category>scraping</category>
      <category>apify</category>
    </item>
    <item>
      <title>How To Automatically Submit Sitemap To Google Programmatically</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Mon, 29 Dec 2025 12:32:21 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-automatically-submit-sitemap-to-google-programmatically-1gbn</link>
      <guid>https://dev.to/eunit/how-to-automatically-submit-sitemap-to-google-programmatically-1gbn</guid>
      <description>&lt;p&gt;Generating a sitemap is only the first half of the SEO battle. If you’ve read our previous article on &lt;a href="https://www.eunit.me/blog/how-to-generate-and-submit-an-xml-sitemap-the-ultimate-guide" rel="noopener noreferrer"&gt;how to create and submit an XML sitemap&lt;/a&gt;, you already know how to use the &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt; to crawl your site and create a clean, search-engine-ready file. However, here is the frustrating reality: even after you’ve uploaded that sitemap to Google Search Console, you might still be waiting days, weeks, or even months for Google to crawl and index those new pages.&lt;/p&gt;

&lt;p&gt;If you are running a news website, an e-commerce store with changing inventory, or a platform that publishes time-sensitive content, you don't have time to wait for Google’s "natural selection" process. You need your content indexed &lt;strong&gt;now&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;In this article, we’ll walk you through how to take control of your indexing by automatically submitting your sitemap URLs to Google programmatically. We will cover the manual way (the "hard" way) using Python and the &lt;a href="https://developers.google.com/search/apis/indexing-api/v3/quickstart" rel="noopener noreferrer"&gt;Google Indexing API&lt;/a&gt;, and then we’ll show you the professional way to automate the entire pipeline using the &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Google Indexer &amp;amp; Instant SEO Submitter&lt;/a&gt; &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify Actor&lt;/a&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: This post was originally published in &lt;a href="https://www.eunit.me/blog/how-to-automatically-submit-sitemap-to-google-programmatically" rel="noopener noreferrer"&gt;Eunit.me&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Why Just Submitting A Sitemap Isn't Enough
&lt;/h2&gt;

&lt;p&gt;When you submit a sitemap in Google Search Console, you are essentially leaving a note on Google’s doorstep saying, "Hey, I have some new stuff in here whenever you're ready." Google will address it when it wants to. For most sites, this is fine. But if you have:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Frequently updated content&lt;/strong&gt;: Price changes, stock availability, or breaking news.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;A massive site&lt;/strong&gt;: Large sites often struggle with "crawl budget." You want to tell Google exactly which pages are the most important at the moment.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;A brand new domain&lt;/strong&gt;: New sites have zero authority and might sit in the "discovered - currently not indexed" purgatory for a long time.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The Google Indexing API addresses this issue. While it was originally intended for job postings and livestream content, SEO professionals have found it incredibly effective at giving almost any type of content an "instant" crawl nudge.&lt;/p&gt;
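&lt;p&gt;Under the hood, each call to the Indexing API is a small JSON notification with one of exactly two types: &lt;code&gt;URL_UPDATED&lt;/code&gt; or &lt;code&gt;URL_DELETED&lt;/code&gt;. A minimal helper for building that payload:&lt;/p&gt;

```python
# The Indexing API accepts exactly two notification types per URL.
def notification(url: str, deleted: bool = False) -> dict:
    """Payload for POST https://indexing.googleapis.com/v3/urlNotifications:publish"""
    return {"url": url, "type": "URL_DELETED" if deleted else "URL_UPDATED"}

print(notification("https://example.com/new-post"))
# {'url': 'https://example.com/new-post', 'type': 'URL_UPDATED'}
```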

&lt;h2&gt;
  
  
  Part 1: Setting Up The Google Cloud Infrastructure
&lt;/h2&gt;

&lt;p&gt;Before you can write a single line of code or use any automation tools, you need to navigate the &lt;a href="https://console.cloud.google.com/" rel="noopener noreferrer"&gt;Google Cloud Console&lt;/a&gt;. This is where most people tend to get stuck, so let's walk through it step by step.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu4tc103vitcg242vl84s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu4tc103vitcg242vl84s.png" alt="Google Cloud Console" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Create a Google Cloud Project
&lt;/h3&gt;

&lt;p&gt;Go to the &lt;a href="https://console.cloud.google.com/" rel="noopener noreferrer"&gt;Google Cloud Console&lt;/a&gt;. If you don't have a project yet, click on the project dropdown in the top-left corner and select &lt;strong&gt;New Project&lt;/strong&gt;. Give it a descriptive name, such as "SEO Indexing Automator."&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Enable the Indexing API
&lt;/h3&gt;

&lt;p&gt;The Indexing API is not enabled by default; you need to tell Google you intend to use it.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to &lt;strong&gt;APIs &amp;amp; Services &amp;gt; Library&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Search for "Web Search Indexing API" (sometimes just called "Indexing API").&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Enable&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Far9p2a94i4o4gi7wgley.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Far9p2a94i4o4gi7wgley.png" alt="Enable Indexing API" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Create a Service Account
&lt;/h3&gt;

&lt;p&gt;This is the "person" who will be acting on your behalf.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now, go back to &lt;strong&gt;IAM &amp;amp; Admin &amp;gt; Service Accounts&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Create Service Account&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Give it a name like "indexing-bot" and a description.&lt;/li&gt;
&lt;li&gt;For the role, you can choose &lt;strong&gt;Owner&lt;/strong&gt; for simplicity during setup, or &lt;strong&gt;Project &amp;gt; Editor&lt;/strong&gt; for a more restricted approach.&lt;/li&gt;
&lt;li&gt;Once created, click on the email address of the service account. It should look something like &lt;code&gt;indexing-bot@your-project.iam.gserviceaccount.com&lt;/code&gt;. &lt;strong&gt;Copy this email; you need it for GSC later.&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7pr9v6is6ddlgxd2jja4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7pr9v6is6ddlgxd2jja4.png" alt="Create Service Account" width="800" height="413"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Create and Download the JSON Key File
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Inside your service account details, go to the &lt;strong&gt;Keys&lt;/strong&gt; tab.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Add Key &amp;gt; Create new key&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;JSON&lt;/strong&gt; and click &lt;strong&gt;Create&lt;/strong&gt;.
Your browser will download a file. Keep this safe! It contains the "password" to your indexing bot.&lt;/li&gt;
&lt;/ul&gt;
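&lt;p&gt;The downloaded key is plain JSON, and the field you will need in the next step is &lt;code&gt;client_email&lt;/code&gt;. A small sketch for pulling it out (the key path is a placeholder):&lt;/p&gt;

```python
import json

def service_account_email(key_path: str) -> str:
    """Extract the service account email from a downloaded JSON key file.

    This is the address you will add as a user in Google Search Console.
    """
    with open(key_path, encoding="utf-8") as f:
        return json.load(f)["client_email"]
```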

&lt;h2&gt;
  
  
  Part 2: Connect the bot to Google Search Console
&lt;/h2&gt;

&lt;p&gt;This is the step everyone misses. Even if you have the API enabled, Google won't let your service account submit URLs unless it has verified ownership of the site.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open &lt;a href="https://search.google.com/search-console" rel="noopener noreferrer"&gt;Google Search Console&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Select the property (website) you want to automate.&lt;/li&gt;
&lt;li&gt;Go to &lt;strong&gt;Settings &amp;gt; Users and permissions&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Add user&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Paste the service account email you copied earlier.&lt;/li&gt;
&lt;li&gt;Set the permission to &lt;strong&gt;Owner&lt;/strong&gt;. (Google requires owner-level permissions to use the Indexing API).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now the pipes are connected. Let's look at how to push the data.&lt;/p&gt;

&lt;h2&gt;
  
  
  Part 3: The Developer Way - Programmatically Submitting URLs
&lt;/h2&gt;

&lt;p&gt;If you want to build this yourself, you can write a script that parses your sitemap and sends each URL to the API. Here is a high-level overview of how you'd do it in Python.&lt;/p&gt;

&lt;h3&gt;
  
  
  The "DIY" Python script
&lt;/h3&gt;

&lt;p&gt;You’ll need a few libraries:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;requests google-auth google-auth-httplib2 lxml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The script works by reading your sitemap XML and iterating through the &lt;code&gt;&amp;lt;loc&amp;gt;&lt;/code&gt; tags. For each URL, it makes a POST request to Google.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;google.oauth2&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;service_account&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;google.auth.transport.requests&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;AuthorizedSession&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;lxml&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;etree&lt;/span&gt;

&lt;span class="c1"&gt;# Constants
&lt;/span&gt;&lt;span class="n"&gt;SERVICE_ACCOUNT_FILE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;./your-key.json&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="n"&gt;SITEMAP_URL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://example.com/sitemap.xml&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="n"&gt;API_URL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://indexing.googleapis.com/v3/urlNotifications:publish&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;submit_to_google&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;authed_session&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
 &lt;span class="n"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;url&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;URL_UPDATED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
 &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;authed_session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;API_URL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;status_code&lt;/span&gt;

&lt;span class="c1"&gt;# 1. Auth
&lt;/span&gt;&lt;span class="n"&gt;credentials&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;service_account&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Credentials&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_service_account_file&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="n"&gt;SERVICE_ACCOUNT_FILE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="n"&gt;scopes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://www.googleapis.com/auth/indexing&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;session&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;AuthorizedSession&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;credentials&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# 2. Parse Sitemap
&lt;/span&gt;&lt;span class="n"&gt;resp&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;SITEMAP_URL&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;root&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;etree&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fromstring&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;urls&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;loc&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;loc&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;root&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;findall&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;.//{http://www.sitemaps.org/schemas/sitemap/0.9}loc&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;

&lt;span class="c1"&gt;# 3. Submit
&lt;/span&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;urls&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
 &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;submit_to_google&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;session&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Submitted &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;status&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  The Limitations Of DIY Automation
&lt;/h3&gt;

&lt;p&gt;While the script works, it lacks several production features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Rate limiting&lt;/strong&gt;: Google has strict quotas (usually 200 URLs per day). You need to handle 429 errors gracefully.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Incremental updates&lt;/strong&gt;: You don't want to submit 10,000 URLs every day if only 5 are new.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reporting&lt;/strong&gt;: You need to know which URLs failed and why.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scheduling&lt;/strong&gt;: You have to host this script somewhere and set up a cron job.&lt;/li&gt;
&lt;/ul&gt;
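&lt;p&gt;The quota problem in particular is easy to underestimate. If your sitemap has more URLs than the daily allowance, you need to spread submissions over several days. Here is a sketch of day-sized batching (the 200-per-day figure matches the default quota noted above; your project's limit may differ):&lt;/p&gt;

```python
from itertools import islice

DAILY_QUOTA = 200  # default Indexing API publish quota; check your project's limit

def daily_batches(urls, quota=DAILY_QUOTA):
    """Yield batches of URLs sized to the daily quota."""
    it = iter(urls)
    while batch := list(islice(it, quota)):
        yield batch

batches = list(daily_batches([f"https://example.com/p/{i}" for i in range(450)]))
print([len(b) for b in batches])  # [200, 200, 50]
```

&lt;p&gt;Each batch would then be submitted on a separate day, for example by a scheduled job that tracks how far through the list it has gotten.&lt;/p&gt;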

&lt;h2&gt;
  
  
  Part 4: The Professional Way - Using The Apify Platform
&lt;/h2&gt;

&lt;p&gt;If you don't want to manage servers, handle complex authentication boilerplate, or write your own rate-limiting logic, you should use the &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Google Indexer &amp;amp; Instant SEO Submitter&lt;/a&gt; on the &lt;a href="https://apify.com/marketplace?fpr=eunit" rel="noopener noreferrer"&gt;Apify platform&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Actor&lt;/a&gt; is designed to handle all the heavy lifting for you. It can read your sitemap directly, handle authentication via your JSON key, and even integrate with other Actors to form a complete SEO pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxvtkwtx646g1qcabtedk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxvtkwtx646g1qcabtedk.png" alt="Google Indexer &amp;amp; Instant SEO Submitter" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Why Use The Google Indexing Actor?
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;No-Code configuration&lt;/strong&gt;: Paste your JSON key and your sitemap URL.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dataset integration&lt;/strong&gt;: It can take a &lt;code&gt;dataset_id&lt;/code&gt; from a previous crawl (like from the &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt;) and index those results immediately.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Error handling&lt;/strong&gt;: It automatically detects rate limits and waits or stops as needed to prevent your account from being flagged.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pay-Per-Event pricing&lt;/strong&gt;: You only pay for successful submissions. It is incredibly cost-effective, starting at just $0.01 per 1,000 results.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Test mode&lt;/strong&gt;: You can run a dry-run to see which URLs would be submitted without actually calling the API or incurring costs.&lt;/li&gt;
&lt;/ol&gt;
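&lt;p&gt;You can also trigger the Actor from your own scripts: Apify's REST API starts a run with a single POST to &lt;code&gt;/v2/acts/{actorId}/runs&lt;/code&gt;. The sketch below uses only the Python standard library; the input key &lt;code&gt;sitemapUrl&lt;/code&gt; is an assumption for illustration, so check the Actor's input schema for the real field name.&lt;/p&gt;

```python
import json
import urllib.request

APIFY_API = "https://api.apify.com/v2/acts"

def build_run_request(actor_id, token, run_input):
    """Build the POST request that starts an Apify Actor run."""
    url = f"{APIFY_API}/{actor_id}/runs?token={token}"
    data = json.dumps(run_input).encode("utf-8")
    return urllib.request.Request(
        url, data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # "sitemapUrl" is a hypothetical input key -- verify it in the console.
    req = build_run_request(
        "eunit~google-indexing", "YOUR_APIFY_TOKEN",
        {"sitemapUrl": "https://www.your-website.com/sitemap.xml"},
    )
    # urllib.request.urlopen(req)  # uncomment once a real token is in place
```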

&lt;h2&gt;
  
  
  Part 5: Building a Complete SEO Automation Pipeline
&lt;/h2&gt;

&lt;p&gt;The real power of &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify&lt;/a&gt; comes when you connect these tools. Instead of generating a sitemap by hand and then submitting it manually, you can automate the entire chain.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Workflow
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Step 1&lt;/strong&gt;: Use the &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt; to crawl your site. This Actor will find all active pages and generate a sitemap. Check out our &lt;a href="https://www.eunit.me/blog/how-to-generate-and-submit-an-xml-sitemap-the-ultimate-guide" rel="noopener noreferrer"&gt;previous article detailing&lt;/a&gt; how to generate your free sitemap using the &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Step 2&lt;/strong&gt;: The &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Google Indexer &amp;amp; Instant SEO Submitter&lt;/a&gt; takes the output of the first Actor.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Step 3&lt;/strong&gt;: It iterates through the discovered URLs and notifies Google.&lt;/li&gt;
&lt;/ol&gt;
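&lt;p&gt;In code, the hand-off between the two Actors is just a matter of mapping the first run's output dataset onto the second Actor's input. A minimal sketch: &lt;code&gt;defaultDatasetId&lt;/code&gt; is the standard field Apify run objects use for their output dataset, and &lt;code&gt;dataset_id&lt;/code&gt; is the indexer input field mentioned earlier.&lt;/p&gt;

```python
def chain_inputs(sitemap_run):
    """Turn a finished Fast Sitemap Generator run into input for the
    Google Indexer. Apify run objects point at their output dataset
    via 'defaultDatasetId'."""
    return {"dataset_id": sitemap_run["defaultDatasetId"]}
```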

&lt;h3&gt;
  
  
  How to set this up on Apify
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Run the &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt; with your &lt;strong&gt;Start URLs&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Copy the &lt;strong&gt;Dataset ID&lt;/strong&gt; from the run results.&lt;/li&gt;
&lt;li&gt;Open the &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Google Indexer &amp;amp; Instant SEO Submitter&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Dataset ID&lt;/strong&gt; field, paste the ID you just copied.&lt;/li&gt;
&lt;li&gt;Add your &lt;strong&gt;Google Service Account JSON&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Start&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffoknmtm90lpiagk8uxxo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffoknmtm90lpiagk8uxxo.png" alt="Configuring and running the Fast Sitemap Generator" width="442" height="768"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgrr4z52fpvfj9uppyy7f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgrr4z52fpvfj9uppyy7f.png" alt="Configuring the Google Indexer &amp;amp; Instant SEO Submitter" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Actor will now intelligently parse the XML generated by the first step and submit every unique URL it finds. Because it tracks sources, it ensures you aren't double-charged or submitting duplicates.&lt;/p&gt;

&lt;h2&gt;
  
  
  Best Practices For Programmatic Indexing
&lt;/h2&gt;

&lt;p&gt;To get the most out of these tools, keep these tips in mind:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Don't spam Google&lt;/strong&gt;: Use &lt;code&gt;URL_UPDATED&lt;/code&gt; only for truly new or significantly updated content. If you are testing, use the &lt;strong&gt;Test Mode&lt;/strong&gt; toggle in the Actor settings.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Match your canonicals&lt;/strong&gt;: Ensure the URLs in your sitemap are the exact canonical versions. If your site uses &lt;code&gt;https://www.&lt;/code&gt;, don't submit &lt;code&gt;https://&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Monitor the quotas&lt;/strong&gt;: Standard projects get 200 requests per day. If you have a massive site, you can request a quota increase from Google, but the &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Google Indexing Actor&lt;/a&gt; will help you manage this limit automatically by stopping when the limit is reached.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Check Search Console&lt;/strong&gt;: After running the automation, keep an eye on the "Indexing" report in GSC. You should see "Crawled - currently not indexed" change to "Indexed" much faster than usual.&lt;/li&gt;
&lt;/ul&gt;
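&lt;p&gt;If you ever script submissions yourself, the 200-requests-per-day default quota is easy to respect with a small batching helper, for example:&lt;/p&gt;

```python
DAILY_QUOTA = 200  # Google Indexing API default per-project limit

def plan_batches(urls, daily_quota=DAILY_QUOTA):
    """Split a URL list into per-day batches that stay inside the quota."""
    return [urls[i:i + daily_quota] for i in range(0, len(urls), daily_quota)]
```

A 450-URL site would be spread over three days: two full batches of 200 and a final batch of 50.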

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;The days of waiting for Google to notice your hard work are over. By combining the power of the &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt; and the &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Google Indexer &amp;amp; Instant SEO Submitter&lt;/a&gt;, you can build a professional-grade SEO engine that ensures your content is live in search results within hours, not weeks.&lt;/p&gt;

</description>
      <category>programming</category>
      <category>automation</category>
      <category>seo</category>
      <category>apify</category>
    </item>
    <item>
      <title>How to Generate and Submit an XML Sitemap: The Ultimate Guide</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Sun, 28 Dec 2025 23:59:42 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-generate-and-submit-an-xml-sitemap-the-ultimate-guide-4e7k</link>
      <guid>https://dev.to/eunit/how-to-generate-and-submit-an-xml-sitemap-the-ultimate-guide-4e7k</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Note&lt;/em&gt;: This blog was originally published in &lt;a href="https://www.eunit.me/blog/how-to-generate-and-submit-an-xml-sitemap-the-ultimate-guide" rel="noopener noreferrer"&gt;Eunit.me&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Nowadays, creating a website is only half the battle of getting a business online. The other, arguably more critical, half is ensuring that people can actually find your website. This is where &lt;a href="https://developers.google.com/search/docs/fundamentals/seo-starter-guide" rel="noopener noreferrer"&gt;&lt;strong&gt;Search Engine Optimization (SEO)&lt;/strong&gt;&lt;/a&gt; comes into play. While &lt;a href="https://www.conductor.com/academy/keyword-research/" rel="noopener noreferrer"&gt;keyword research&lt;/a&gt;, &lt;a href="https://mailchimp.com/resources/what-is-backlinking-and-why-is-it-important-for-seo/" rel="noopener noreferrer"&gt;backlink building&lt;/a&gt;, and content strategy are often the "glamorous" side of SEO, there is a fundamental technical element that serves as the bedrock of your site’s visibility: &lt;strong&gt;the XML Sitemap&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;In this article, we will walk you through everything you need to know about sitemaps: what they are, why they are important for modern SEO, and most importantly, &lt;strong&gt;how to generate a sitemap automatically without writing a single line of code&lt;/strong&gt; using the &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt; on &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify&lt;/a&gt;. Finally, we’ll show you how to &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;submit your sitemap&lt;/a&gt; to &lt;a href="https://search.google.com/search-console" rel="noopener noreferrer"&gt;Google Search Console&lt;/a&gt; to get your pages indexed faster than ever before.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is an XML Sitemap and Why Do You Need One?
&lt;/h2&gt;

&lt;p&gt;An &lt;strong&gt;XML sitemap&lt;/strong&gt; (XML stands for Extensible Markup Language) is a text file that lists all the URLs on your website that you want search engines to index. But it’s more than just a list; it provides crucial metadata about each URL, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Last Modified Date (&lt;code&gt;&amp;lt;lastmod&amp;gt;&lt;/code&gt;)&lt;/strong&gt;: Tells Google when the page was last updated, encouraging re-crawling of fresh content.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Change Frequency (&lt;code&gt;&amp;lt;changefreq&amp;gt;&lt;/code&gt;)&lt;/strong&gt;: A hint to crawlers about how often the page changes (e.g., "daily" for a news homepage vs. "yearly" for an "About Us" page).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Priority (&lt;code&gt;&amp;lt;priority&amp;gt;&lt;/code&gt;)&lt;/strong&gt;: A numerical value (0.0 to 1.0) indicating the relative importance of a page within your site.&lt;/li&gt;
&lt;/ul&gt;
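&lt;p&gt;To make that structure concrete, here is a minimal Python sketch that assembles a tiny, protocol-compliant sitemap from exactly those three pieces of metadata, using only the standard library:&lt;/p&gt;

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: iterable of (loc, lastmod, changefreq, priority) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")
```

Tools like the Fast Sitemap Generator produce the same structure automatically, but seeing it built by hand makes the three metadata tags above easy to recognize in any sitemap you open.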

&lt;h3&gt;
  
  
  The Benefits of XML Sitemaps
&lt;/h3&gt;

&lt;p&gt;You might ask, "Doesn't Google crawl my site anyway?" Yes, but relying solely on Google's crawler (&lt;a href="https://developers.google.com/search/docs/crawling-indexing/googlebot" rel="noopener noreferrer"&gt;Googlebot&lt;/a&gt;) following links has limitations:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Isolated Pages&lt;/strong&gt;: If a page isn't linked to from another page (an "orphan page"), Googlebot can't find it. A sitemap lists it explicitly.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;New Websites&lt;/strong&gt;: New sites have few backlinks, so Google allocates them only a small crawl budget. A sitemap puts every page in front of Google immediately.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Large Archives&lt;/strong&gt;: E-commerce sites or blogs with thousands of pages can confuse crawlers. A sitemap ensures deep pages aren't ignored.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rich Media&lt;/strong&gt;: Specialized sitemaps (Video, Image) help your multimedia assets appear in Google Images and Video Search.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  The Old Way vs. The Automated Way
&lt;/h2&gt;

&lt;p&gt;Historically, creating a sitemap was a tedious task.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;The Manual Method&lt;/strong&gt;: Opening a text editor and hand-coding &lt;code&gt;&amp;lt;url&amp;gt;&lt;/code&gt; tags. For a 5-page site, it’s fine. For a 100-page site, it’s a nightmare. One typo breaks the file.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CMS Plugins&lt;/strong&gt;: If you use WordPress, plugins like &lt;a href="https://yoast.com/" rel="noopener noreferrer"&gt;Yoast&lt;/a&gt; are great. But what if you have a custom React/Next.js &lt;a href="https://www.eunit.me/blog/hello-world" rel="noopener noreferrer"&gt;site like this one&lt;/a&gt;? Or a static HTML site? Or what if you want to audit a competitor's site structure? Plugins don't work there.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Desktop Crawlers&lt;/strong&gt;: Tools like &lt;a href="https://www.screamingfrog.co.uk/" rel="noopener noreferrer"&gt;Screaming Frog&lt;/a&gt; are powerful but resource-heavy. They tie up your computer, rely on your local internet connection (slow), and require manual exporting and uploading.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  The Solution: Cloud-Based Automation
&lt;/h3&gt;

&lt;p&gt;Enter &lt;a href="https://apify.com/store?fpr=eunit" rel="noopener noreferrer"&gt;&lt;strong&gt;Apify Actors&lt;/strong&gt;&lt;/a&gt;. Actors are serverless cloud programs that perform specific tasks. The Apify &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt; is a specialized Actor developed to solve the sitemap problem effortlessly. Unlike desktop tools, it runs in the cloud, works on &lt;em&gt;any&lt;/em&gt; website, and is completely automated and fast.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Generate an XML Sitemap with Apify
&lt;/h2&gt;

&lt;p&gt;Let’s get technical. We are going to use the Apify &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Fast Sitemap Generator&lt;/a&gt; Actor to crawl a website and produce a compliant XML sitemap, along with HTML and TXT versions for good measure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7g6dnboki6d2e4y643o7.png" alt="Fast Sitemap Generator on Apify" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Access the Tool
&lt;/h3&gt;

&lt;p&gt;Navigate to the &lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Sitemap Generator Actor on Apify&lt;/a&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: You will need an &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify account&lt;/a&gt;. The free tier is generous enough for testing and small crawls. Create one by signing up on &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Apify&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This specific Actor uses a &lt;strong&gt;Direct Connection&lt;/strong&gt;, meaning it crawls directly from the data center without proxies. This ensures high speed and lower costs, though it requires your target site to be accessible to standard web traffic.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Configure Your Input
&lt;/h3&gt;

&lt;p&gt;Once you click &lt;strong&gt;"Try for free"&lt;/strong&gt; or &lt;strong&gt;"Run"&lt;/strong&gt;, you’ll be taken to the Apify Console. Here is where you tell the crawler what to do.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu3vefg9syuxpd59lrhqt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu3vefg9syuxpd59lrhqt.png" alt="Fast Sitemap Generator - Configuration" width="410" height="768"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Start URLs&lt;/strong&gt;:&lt;br&gt;
In the &lt;code&gt;Start URLs&lt;/code&gt; field, enter the homepage of the site you want to map.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Example&lt;/em&gt;: &lt;code&gt;https://www.your-website.com&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2. Crawl Depth (&lt;code&gt;maxCrawlDepth&lt;/code&gt;)&lt;/strong&gt;:&lt;br&gt;
This determines how "deep" the crawler goes.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Depth 0&lt;/strong&gt;: Just the homepage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Depth 1&lt;/strong&gt;: Homepage + pages linked directly from it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Depth 3 (Default)&lt;/strong&gt;: Usually sufficient for most sites to find all content.&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Tip&lt;/em&gt;: Set this to &lt;code&gt;10+&lt;/code&gt; if your site has endless pagination or deep category structures.&lt;/p&gt;
&lt;/blockquote&gt;
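&lt;p&gt;A toy breadth-first walk over an in-memory link graph shows exactly what the depth limit bounds. This is an illustration of the concept only, not the Actor's actual crawler:&lt;/p&gt;

```python
from collections import deque

def crawl_depth_limited(links, start, max_depth):
    """Breadth-first walk over a dict-of-lists link graph,
    keeping every page reachable within max_depth clicks."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        if depth == max_depth:
            continue  # do not follow links beyond the depth limit
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return seen
```

With depth 0 only the start page is kept; depth 1 adds everything linked from it, and so on.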

&lt;p&gt;&lt;strong&gt;3. Filtering with Regex (&lt;code&gt;includePatterns&lt;/code&gt; / &lt;code&gt;excludePatterns&lt;/code&gt;)&lt;/strong&gt;:&lt;br&gt;
This is a superpower. You don't want to index your admin pages, cart pages, or user-specific accounts.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Exclude&lt;/strong&gt;: Add patterns like &lt;code&gt;.*/admin/.*&lt;/code&gt;, &lt;code&gt;.*/login.*&lt;/code&gt;, or &lt;code&gt;.*/cart.*&lt;/code&gt; to skip these.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Include&lt;/strong&gt;: Leave empty to crawl everything, or specify patterns to only map a blog section (e.g., &lt;code&gt;.*/blog/.*&lt;/code&gt;).&lt;/li&gt;
&lt;/ul&gt;
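&lt;p&gt;The same include/exclude logic is easy to reason about in plain Python. The sketch below assumes the patterns are matched against the full URL (keep a URL if it matches an include pattern, or if the include list is empty, and matches no exclude pattern); verify the exact semantics in the Actor's documentation:&lt;/p&gt;

```python
import re

EXCLUDE = [r".*/admin/.*", r".*/login.*", r".*/cart.*"]

def keep_url(url, include=None, exclude=EXCLUDE):
    """Assumed filter semantics: include patterns whitelist (empty
    means everything), exclude patterns always blacklist."""
    if include and not any(re.fullmatch(p, url) for p in include):
        return False
    return not any(re.fullmatch(p, url) for p in exclude)
```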

&lt;p&gt;&lt;strong&gt;4. Formats&lt;/strong&gt;:&lt;br&gt;
Select the outputs you need.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;xml&lt;/code&gt;: &lt;strong&gt;Essential&lt;/strong&gt; for search engines. Recommended.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;html&lt;/code&gt;: Great for a visible "Site Map" page for human visitors.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;txt&lt;/code&gt;: A simple list of URLs, often used for content audits or programmatic processing.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  Step 3: Run the Generator
&lt;/h3&gt;

&lt;p&gt;Click the green &lt;strong&gt;Start&lt;/strong&gt; button at the bottom. The Actor will now:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Launch&lt;/strong&gt;: Spin up a container in the cloud.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Crawl&lt;/strong&gt;: Systematically visit every link on your site, respecting your &lt;code&gt;robots.txt&lt;/code&gt; rules (unless you disabled that option).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Construct&lt;/strong&gt;: Build the XML structure with the correct &lt;code&gt;lastmod&lt;/code&gt; dates.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Save&lt;/strong&gt;: Store the files in a persistent Key-Value Store.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Why is this better?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pay-Per-Event Pricing&lt;/strong&gt;: You only pay for the pages successfully discovered. Efficiency is built in.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Speed&lt;/strong&gt;: It can process thousands of pages in minutes without using your computer's RAM.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  Step 4: Retrieve Your Sitemap
&lt;/h3&gt;

&lt;p&gt;Once the run shows "Succeeded", navigate to the &lt;strong&gt;Output&lt;/strong&gt; tab.&lt;br&gt;
You will see a Dataset containing the direct links to your generated files.&lt;/p&gt;

&lt;p&gt;It will look something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"format"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"xml"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt; &lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://api.apify.com/v2/key-value-stores/YOUR_STORE_ID/records/sitemap.xml"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg4vux095jlh8ynywk3qd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg4vux095jlh8ynywk3qd.png" alt="Fast Sitemap Generator - Log" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click the link to download your &lt;code&gt;sitemap.xml&lt;/code&gt;.&lt;/p&gt;
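&lt;p&gt;Since the record URL is a plain HTTPS endpoint, you can also fetch the generated file programmatically instead of clicking through the console:&lt;/p&gt;

```python
import urllib.request

def record_url(store_id, key):
    """Key-value store record URL, following the output shown above."""
    return f"https://api.apify.com/v2/key-value-stores/{store_id}/records/{key}"

if __name__ == "__main__":
    # Substitute the real store ID from your run's Output tab.
    url = record_url("YOUR_STORE_ID", "sitemap.xml")
    with urllib.request.urlopen(url) as resp:
        open("sitemap.xml", "wb").write(resp.read())
```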

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Pro Tip&lt;/strong&gt;: You can also download the &lt;code&gt;sitemap.html&lt;/code&gt; file and upload it to your site to instantly create a helpful navigation page for users!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  How to Submit Your Sitemap to Google Search Console
&lt;/h2&gt;

&lt;p&gt;Now that you have the file, you need to inform Google where it is located.&lt;/p&gt;

&lt;h3&gt;
  
  
  Phase A: Upload to Your Server
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Download the &lt;code&gt;sitemap.xml&lt;/code&gt; from Apify.&lt;/li&gt;
&lt;li&gt;Upload it to the &lt;strong&gt;root directory&lt;/strong&gt; of your website via FTP or your hosting file manager (e.g., &lt;code&gt;public_html&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;Your sitemap should be accessible at: &lt;code&gt;https://www.your-website.com/sitemap.xml&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Phase B: Submit to GSC
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Log in to &lt;a href="https://search.google.com/search-console" rel="noopener noreferrer"&gt;Google Search Console&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Select your property (website) from the dropdown.&lt;/li&gt;
&lt;li&gt;In the left sidebar, click on &lt;strong&gt;Sitemaps&lt;/strong&gt; (under the "Indexing" section).&lt;/li&gt;
&lt;li&gt;In the "Add a new sitemap" field, enter the filename (e.g., &lt;code&gt;sitemap.xml&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Submit&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Phase C: Verify
&lt;/h3&gt;

&lt;p&gt;Google typically processes the submission within moments. You should see a status of &lt;strong&gt;"Success"&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If you see "Could not fetch", wait a few hours or double-check that your URL is publicly accessible.&lt;/li&gt;
&lt;li&gt;Clicking on the submitted sitemap will show you the "Discovered URLs" count. Does this match the number of pages the Apify Actor found? If so, you’re golden!&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Advanced Automation: Set It and Forget It
&lt;/h2&gt;

&lt;p&gt;The beauty of the Apify Sitemap Generator lies in automation. Your website content changes constantly: you add blog posts, remove products, and update pages. Your sitemap &lt;em&gt;must&lt;/em&gt; reflect these changes, or Google will keep indexing "dead" content.&lt;/p&gt;

&lt;h3&gt;
  
  
  Use the &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Google Indexer &amp;amp; Instant SEO Submitter&lt;/a&gt; Actor
&lt;/h3&gt;

&lt;p&gt;The &lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;Google Indexer &amp;amp; Instant SEO Submitter&lt;/a&gt; Actor on Apify enables you to automatically submit your generated sitemap to Google programmatically, instead of having to go through the above-listed phases (A-C). We wrote a detailed guide on &lt;a href="https://eunit.dev/blog/how-to-automatically-submit-sitemap-to-google-programmatically" rel="noopener noreferrer"&gt;How to Submit Sitemap to Google Programmatically&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://apify.com/eunit/google-indexing" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv4jbp5gjnfki3ypmjzg9.png" alt="Google Indexer &amp;amp; Instant SEO Submitter" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;An XML sitemap is a small file with a massive impact. It is the bridge between your content and the search engines that deliver your audience. By moving away from manual creation and using automated, intelligent tools like the &lt;strong&gt;&lt;a href="https://apify.com/eunit/sitemap-generator" rel="noopener noreferrer"&gt;Sitemap Generator Actor&lt;/a&gt;&lt;/strong&gt;, you ensure that this bridge is always sturdy, accurate, and open for traffic.&lt;/p&gt;

&lt;p&gt;Happy Crawling!&lt;/p&gt;

</description>
      <category>seo</category>
      <category>sitemap</category>
      <category>actor</category>
      <category>apify</category>
    </item>
    <item>
      <title>How to Scrape Real Estate Data from Zillow in 2026 (Step-by-Step Guide)</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Fri, 19 Dec 2025 09:30:53 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-scrape-real-estate-data-from-zillow-in-2026-step-by-step-guide-j13</link>
      <guid>https://dev.to/eunit/how-to-scrape-real-estate-data-from-zillow-in-2026-step-by-step-guide-j13</guid>
      <description>&lt;h1&gt;
  
  
  How to Scrape Real Estate Data from Zillow in 2026 (Step-by-Step Guide)
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://businessday.ng/opinion/article/big-data-for-bigger-sales-how-real-estate-professionals-are-leveraging-information-to-drive-results/" rel="noopener noreferrer"&gt;Data is the oxygen of the modern real estate market&lt;/a&gt;. Whether you are an investor looking for the next big opportunity, a realtor analyzing market trends, or a developer building a property aggregator, a student, or a professional, having access to accurate, up-to-date listing data is a game-changer. And when it comes to US real estate, &lt;a href="https://www.zillow.com/" rel="noopener noreferrer"&gt;&lt;strong&gt;Zillow&lt;/strong&gt;&lt;/a&gt; is the undisputed king of data for the real estate market.&lt;/p&gt;

&lt;p&gt;However, extracting bulk data from Zillow is notoriously difficult. The platform relies on complex map-based interfaces and employs sophisticated anti-scraping measures to block automated access. Zillow currently employs several measures to prevent scraping, including IP blocking, &lt;a href="https://www.humansecurity.com/platform/solutions/scraping/" rel="noopener noreferrer"&gt;CAPTCHA challenges (HUMAN Security)&lt;/a&gt;, and dynamic content loading.&lt;/p&gt;

&lt;p&gt;In this article, we will walk you through how to bypass these challenges and scrape thousands of rental listings, including prices, addresses, and full photo galleries, using the &lt;strong&gt;&lt;a href="https://apify.com/eunit/zillow-scraper" rel="noopener noreferrer"&gt;Zillow Rental Data Scraper&lt;/a&gt;&lt;/strong&gt; on the Apify platform.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmf4latknvc84687ktczb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmf4latknvc84687ktczb.png" alt="Zillow Rental Website" width="800" height="389"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why scrape Zillow data?
&lt;/h2&gt;

&lt;p&gt;Web scraping transforms Zillow's vast data into structured datasets (Excel, CSV, JSON) for different downstream applications. It enables professionals to automate the collection of listings, price history, and market stats, unlocking insights that manual searching cannot match.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Use Cases
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Market Research&lt;/strong&gt;: Scraping real estate data allows you to monitor local trends, supply/demand balance, and regional pricing to gauge market health.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Investment Analysis&lt;/strong&gt;: Identify undervalued assets, calculate ROI, and track value appreciation to build profitable portfolios.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Competitor Intelligence&lt;/strong&gt;: Analyze how similar properties perform, track listings, and understand market positioning.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Location Analytics&lt;/strong&gt;: Assess neighborhood demographics, amenities, and property characteristics for development or relocation planning.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Proptech &amp;amp; App Development&lt;/strong&gt;: Feed real-time real estate data into new platforms, apps, or AI models for predictive analytics.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rental Management&lt;/strong&gt;: Analyze rental demand, set optimal pricing, and understand tenant preferences.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What Data Is Collected?
&lt;/h3&gt;

&lt;p&gt;The Zillow Rental Data Scraper extracts detailed information for every listing found in your search area:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Property Identity&lt;/strong&gt;: Full address, building name, Zillow Property ID (&lt;code&gt;zpid&lt;/code&gt;), and precise GPS coordinates.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rental Details&lt;/strong&gt;: Current pricing, available units, bedroom/bathroom counts, and square footage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rich Media&lt;/strong&gt;: High-resolution image URLs, including the complete photo gallery for each property.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Listing Metadata&lt;/strong&gt;: Listing status (e.g., FOR_RENT), availability counts, and the direct URL to the Zillow listing page.&lt;/li&gt;
&lt;/ul&gt;
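&lt;p&gt;For downstream processing, it helps to pin the record shape down in code. The dataclass below is an illustrative sketch based on the fields listed above; the Actor's actual output keys may differ, so inspect a real run's dataset before relying on them:&lt;/p&gt;

```python
from dataclasses import dataclass, field

@dataclass
class RentalListing:
    """Illustrative record shape -- names are ours, not necessarily
    the Actor's exact output keys."""
    zpid: str            # Zillow Property ID
    address: str
    latitude: float
    longitude: float
    price: int           # monthly rent, USD
    beds: float
    baths: float
    status: str          # e.g. "FOR_RENT"
    url: str             # direct link to the Zillow listing page
    photos: list = field(default_factory=list)  # full gallery URLs
```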

&lt;h2&gt;
  
  
  The challenge: Why is Zillow hard to scrape?
&lt;/h2&gt;

&lt;p&gt;Zillow does not offer a public API for bulk listing data. Additionally, it utilizes a map-based search interface that loads data dynamically as you pan and zoom the map. Standard web scrapers often fail because:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Map-Based Pagination&lt;/strong&gt;: You can't just page through results; you often need to "move" a virtual map to find all listings in a city.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sophisticated Anti-Scraping Security&lt;/strong&gt;: Zillow uses &lt;strong&gt;PerimeterX (HUMAN Security)&lt;/strong&gt;, an enterprise-grade solution that analyzes browser fingerprints, mouse movements, and keystrokes. It triggers a difficult CAPTCHA or blocks IPs immediately if it detects bot-like behavior.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;IP Blocking&lt;/strong&gt;: Standard data center IPs (like AWS or Google Cloud) are flagged and blocked almost instantly.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Complex Data Structure&lt;/strong&gt;: Vital information like high-resolution photos is often hidden behind dynamic user interactions (XHR requests).&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  The solution: Zillow Rental Data Scraper
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://apify.com/eunit/zillow-scraper" rel="noopener noreferrer"&gt;&lt;strong&gt;Zillow Rental Data Scraper&lt;/strong&gt;&lt;/a&gt; is a powerful &lt;a href="https://apify.com/store?fpr=eunit" rel="noopener noreferrer"&gt;Apify Actor&lt;/a&gt; designed specifically to overcome these hurdles. Unlike generic scrapers, the &lt;a href="https://apify.com/eunit/zillow-scraper" rel="noopener noreferrer"&gt;Zillow Scraper&lt;/a&gt; mimics the Zillow map-view search behavior. You define a geographic "box" (using latitude and longitude), and the Actor systematically sweeps that area to capture every listing provided by Zillow.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key capabilities:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Geographic Precision&lt;/strong&gt;: Scrape listings within a precise map area using a bounding box.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rich Media Extraction&lt;/strong&gt;: Captures &lt;strong&gt;all available photo URLs&lt;/strong&gt; for each property, not just the thumbnail.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Granular Filters&lt;/strong&gt;: Apply min/max price, bedroom/bathroom counts, and property type filters (e.g., houses vs. apartments).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Bypassing Blocks&lt;/strong&gt;: Built to work seamlessly with &lt;strong&gt;Residential Proxies&lt;/strong&gt; to avoid detection.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step-by-step guide: How to use the Zillow Scraper
&lt;/h2&gt;

&lt;p&gt;You don't need to write a single line of Python code to get this data. Follow these steps to start scraping in minutes.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Find your target area
&lt;/h3&gt;

&lt;p&gt;First, you need to define the geographic area you want to scrape.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to &lt;a href="https://www.zillow.com" rel="noopener noreferrer"&gt;Zillow.com&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Search for your target city (e.g., "Austin, TX").&lt;/li&gt;
&lt;li&gt;Zoom the map to cover the exact area you are interested in.&lt;/li&gt;
&lt;li&gt;You will use the coordinates of this area (North-East and South-West corners) to tell the scraper where to look. You can find these in the URL or use a simple online bounding box tool.&lt;/li&gt;
&lt;/ol&gt;
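&lt;p&gt;If you'd rather derive the corners programmatically, here is a rough sketch using the standard degrees-per-kilometre approximation. Note that only &lt;code&gt;ne_lat&lt;/code&gt; and &lt;code&gt;sw_long&lt;/code&gt; are named in the Actor's documented inputs below; the &lt;code&gt;ne_long&lt;/code&gt; and &lt;code&gt;sw_lat&lt;/code&gt; keys are assumed to mirror them, so verify them against the Actor's input schema:&lt;/p&gt;

```python
import math

def bounding_box(center_lat: float, center_lng: float, radius_km: float) -> dict:
    """Approximate NE/SW corners of a square box around a center point.

    One degree of latitude is ~111 km; a degree of longitude shrinks
    by the cosine of the latitude.
    """
    lat_delta = radius_km / 111.0
    lng_delta = radius_km / (111.0 * math.cos(math.radians(center_lat)))
    return {
        "ne_lat": center_lat + lat_delta,
        "ne_long": center_lng + lng_delta,
        "sw_lat": center_lat - lat_delta,
        "sw_long": center_lng - lng_delta,
    }

# ~10 km box around central Austin, TX
print(bounding_box(30.2672, -97.7431, 10))
```

&lt;p&gt;This is only a shortcut for estimating corners; reading them off the Zillow map URL works just as well.&lt;/p&gt;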

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftdy59ywn8pben4c339jv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftdy59ywn8pben4c339jv.png" alt="Zillow page showing Austin, TX map" width="800" height="382"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Set up the Actor on the Apify platform
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;Log in&lt;/a&gt; to your Apify account or &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;create a new one&lt;/a&gt; if you don't have one.&lt;/li&gt;
&lt;li&gt;Navigate to the &lt;strong&gt;&lt;a href="https://apify.com/eunit/zillow-scraper" rel="noopener noreferrer"&gt;Zillow Rental Data Scraper&lt;/a&gt;&lt;/strong&gt; page.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Try for free&lt;/strong&gt; to open the Actor console.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl41kxb8rygzsealqx36w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl41kxb8rygzsealqx36w.png" alt="Apify page showing Zillow Scraper" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Configure your input
&lt;/h3&gt;

&lt;p&gt;In the Input tab, you will specify exactly what the scraper should look for.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Search Parameters:&lt;/strong&gt;&lt;br&gt;
Set your filters to match your needs. For example, if you are looking for affordable 1-bedroom apartments:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Min Price&lt;/strong&gt;: $1,200&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Max Price&lt;/strong&gt;: $2,500&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Min Bed&lt;/strong&gt;: 1&lt;/li&gt;
&lt;li&gt;etc.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Geographic Boundary:&lt;/strong&gt;&lt;br&gt;
Enter the coordinates you found in Step 1. This ensures the scraper focuses only on your target market.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;ne_lat&lt;/code&gt; (North East Latitude)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;sw_long&lt;/code&gt; (South West Longitude)&lt;/li&gt;
&lt;li&gt;etc.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Proxy Configuration (CRITICAL):&lt;/strong&gt;&lt;br&gt;
Zillow has strict security. To ensure your run succeeds:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Enable &lt;strong&gt;Proxy configuration&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Residential proxies&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Note: Using Datacenter proxies usually results in empty datasets because Zillow blocks them immediately.&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;
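&lt;p&gt;Put together, the run input is just a JSON object. A hypothetical sketch as a Python dictionary follows; the exact key names come from the Actor's input schema, so treat these as assumptions and verify them in the Input tab:&lt;/p&gt;

```python
# Hypothetical input for the Zillow Rental Data Scraper.
# Verify exact field names against the Actor's Input tab before running.
run_input = {
    "minPrice": 1200,
    "maxPrice": 2500,
    "minBeds": 1,
    # Bounding box corners from Step 1 (illustrative Austin-area values)
    "ne_lat": 30.52,
    "ne_long": -97.56,
    "sw_lat": 30.41,
    "sw_long": -97.68,
    # Residential proxies are critical: datacenter IPs get blocked
    "proxyConfiguration": {
        "useApifyProxy": True,
        "apifyProxyGroups": ["RESIDENTIAL"],
    },
}
```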

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5w4qr69cnbhxwk7zjume.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5w4qr69cnbhxwk7zjume.png" alt="Apify console showing Zillow Scraper with proxy configuration" width="507" height="768"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Run the scraper
&lt;/h3&gt;

&lt;p&gt;Click the &lt;strong&gt;Start&lt;/strong&gt; button. The Actor will spin up, navigate to Zillow using the residential proxies, apply your filters, and start extracting data.&lt;/p&gt;

&lt;p&gt;You can watch the logs to see progress in real time. The scraper will output "Extracted details" for each property it finds.&lt;/p&gt;
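&lt;p&gt;The console is enough for most users, but the same run can also be started programmatically with Apify's official Python client (&lt;code&gt;pip install apify-client&lt;/code&gt;). A minimal sketch, with illustrative input values:&lt;/p&gt;

```python
def run_zillow_scraper(token: str, run_input: dict) -> list:
    """Start the Actor on Apify, wait for it to finish, and return the items."""
    from apify_client import ApifyClient  # pip install apify-client
    client = ApifyClient(token)
    run = client.actor("eunit/zillow-scraper").call(run_input=run_input)
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())

# Illustrative input; verify keys against the Actor's input schema
example_input = {
    "ne_lat": 30.52, "ne_long": -97.56,
    "sw_lat": 30.41, "sw_long": -97.68,
    "proxyConfiguration": {"useApifyProxy": True,
                           "apifyProxyGroups": ["RESIDENTIAL"]},
}
# items = run_zillow_scraper("<YOUR_APIFY_TOKEN>", example_input)
```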

&lt;h3&gt;
  
  
  5. Download your data
&lt;/h3&gt;

&lt;p&gt;Once the run is finished, go to the &lt;strong&gt;Storage&lt;/strong&gt; tab. You can preview your data or download it in your preferred format:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Excel&lt;/strong&gt;: Great for manual review and sorting.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CSV&lt;/strong&gt;: Ideal for importing into databases or CRMs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JSON&lt;/strong&gt;: Best for developers integrating the data into apps.&lt;/li&gt;
&lt;/ul&gt;
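&lt;p&gt;You can also fetch the finished dataset over Apify's HTTP API, which serves the same items in each of these formats via a &lt;code&gt;format&lt;/code&gt; query parameter. A sketch (the dataset ID below is a placeholder; the real one is shown in the Storage tab):&lt;/p&gt;

```python
# Build download URLs for a finished run's dataset.
# "<DATASET_ID>" is a placeholder for the ID shown in the Storage tab.
DATASET_ID = "<DATASET_ID>"

urls = {
    fmt: f"https://api.apify.com/v2/datasets/{DATASET_ID}/items?format={fmt}"
    for fmt in ("json", "csv", "xlsx")
}
for fmt, url in urls.items():
    print(fmt, "->", url)
```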

&lt;p&gt;&lt;strong&gt;Sample Output Data:&lt;/strong&gt;&lt;br&gt;
You will get a clean, structured dataset for every property:&lt;/p&gt;

&lt;h3&gt;
  
  
  Example Output
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"zpid"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"30.46529--97.60529"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"30.46529--97.60529"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"providerListingId"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"5c5nn5nycnc5q"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"imgSrc"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/7d1a61d3b011516c8a5204f1b398e474-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"hasImage"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"detailUrl"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.zillow.com/apartments/pflugerville-tx/the-beacon-at-pfluger-farm/CjjCGn/"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"statusType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"FOR_RENT"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"statusText"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"The Beacon at Pfluger Farm"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"address"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1300 Rauscher Rd, Pflugerville, TX"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"addressStreet"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1300 Rauscher Rd # f52cfdc49"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"addressCity"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Pflugerville"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"addressState"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"TX"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"addressZipcode"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"78660"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"units"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"$1,214+"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"beds"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"roomForRent"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"$1,782+"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"beds"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"roomForRent"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"lotId"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2748185226&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"latLong"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"latitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;30.46529&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"longitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;-97.60529&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"listCardRecommendation"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"flexFieldRecommendations"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"displayString"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"11 available units"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"contentType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"frUnitsAvailable"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"ctaRecommendations"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"displayString"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Request a tour"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"contentType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"REQUEST_A_TOUR"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"isSaved"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"buildingName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"The Beacon at Pfluger Farm"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"isBuilding"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"canSaveBuilding"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"has3DModel"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"isFeaturedListing"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"isShowcaseListing"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"list"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"relaxed"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"carouselPhotosComposable"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"baseUrl"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/{photoKey}-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"communityBaseUrl"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"photoData"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/7d1a61d3b011516c8a5204f1b398e474-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/6ef9281fdb0d7e13b9fb90a3cec20987-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/1069c481833fdd6fa45c27fe7296ece7-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/3ae52944a1839e27510901aa70f03bb6-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/538f0be96c53927b7a1e3f536d05fed0-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/ba77a295f04bc79a14d93ed34dcc1e5f-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/2f5c50213efb1e57356fc2d9eb332089-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/bc2f048dd1d895baabd564e6f733b07c-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/90d8a596e849d8a67d3b05dcad9a038c-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/c3db6fee321ab87e0aec5574e2b96798-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/66f87c6c89345569fa9969bc7752e69c-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/9b4d966520512aafe6cc452a9e3c03a2-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/b3e544dcadcd50e9909c13cda614409c-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/88b022e258a3689392794869bce2c8b5-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/04aa9848fd82a6af259b8ee4057ddada-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/64f3e72700fa174491609240a2772348-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/e2024d6f788006f6f646e908030db1f6-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/7b9ec973819dcde5d354fe5fb396fb35-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/c2f38c90c4628d1a472a445b1b79b87f-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/010b8274d97cb58d03f43f33511e1c16-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/c83aab0ab0194622a262e019044b40eb-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/8488b9de4f2ac78b1ab8d661495205e1-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/3758a24dc4ff6f2df99d24a10b948499-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/0198c3f273d9369db18ccb371870397d-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/6245e857a3d8f784f3cf8d7e85b8b037-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/340b5363f062cc09a7876d1d6772fa96-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/ddd360f0f1db438298ea438c35afdc16-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/8b2392852d76970f4e38bb444dde9118-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/7c93c4adaeff5e718db6a1878515300a-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/402ff64e1592cff104968d216228fd15-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/41fc7e98b78f44a1b098479003c15c0f-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/1e96a1b3214f06e3352aa6e9267f9a7a-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/729f2ba54acab8ef7454af2fa4435239-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/f17991597d432633505cd1bc16b19017-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/c9e726647b8d47aa0afe17b5e88e1cd1-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/214b658c9de15f14982974d07663111d-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/e7f14898a3770034f6a713bcea34f4c7-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/4819e0d2edc6d2d32badeba4fdea044b-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/68406fbf2c016fd38fdec02e411b155d-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/2111ff9c000d2d5f3c37b1d838f902d6-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/e12b2a8952536f45e55c6716de2ef273-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/a0e7c07753f64ebbdf9e2b186e3f368f-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/f8168301283fe4477153eac9c1320fc8-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/1971944837586c1bca34fa72f088ef85-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/7f0cd466eb612c36955aabd3346020ae-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/f2159903758d257c046e8c4db8b9630e-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/eb3f16643202f129ec81c8997138f392-p_e.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"https://photos.zillowstatic.com/fp/87fbce9721b2c588e3d27080e3deacfa-p_e.jpg"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"communityPhotoData"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"isStaticUrls"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"availabilityCount"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;11&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"marketingTreatments"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"trustedListing"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"paid"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"zillowRentalManager"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"multiFamilySalesListing"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"paidOrUnpaidMultifamily"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"paidMultifamily"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"multifamilyPremium"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"listPriceIncludesRequiredMonthlyFees"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"isInstantTourEnabled"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"isContactable"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"isPaidBuilderNewConstruction"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
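&lt;p&gt;Once downloaded as JSON, nested records like the one above flatten easily into one row per rental unit. A sketch using the fields shown in the sample output:&lt;/p&gt;

```python
# A trimmed record using fields from the sample output above
listing = {
    "buildingName": "The Beacon at Pfluger Farm",
    "addressCity": "Pflugerville",
    "units": [
        {"price": "$1,214+", "beds": "1"},
        {"price": "$1,782+", "beds": "2"},
    ],
}

def flatten_units(record: dict) -> list:
    """Turn one building record into one row per advertised unit."""
    rows = []
    for unit in record.get("units", []):
        # "$1,214+" -> 1214 (strip "$", trailing "+", and thousands commas)
        floor_price = int(unit["price"].strip("$+").replace(",", ""))
        rows.append({
            "building": record["buildingName"],
            "city": record["addressCity"],
            "beds": int(unit["beds"]),
            "min_price": floor_price,
        })
    return rows

print(flatten_units(listing))  # two rows, with min_price 1214 and 1782
```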



&lt;h2&gt;
  
  
  Is scraping Zillow legal?
&lt;/h2&gt;

&lt;p&gt;Scraping publicly available factual data (like addresses, prices, and facts about a property) is generally considered legal in the US, provided you do not breach login barriers or copyright protections (like re-publishing creative descriptions or photos without permission).&lt;/p&gt;

&lt;p&gt;However, you must always respect Zillow's &lt;strong&gt;Terms of Service&lt;/strong&gt; and specific laws like &lt;strong&gt;GDPR&lt;/strong&gt; or &lt;strong&gt;CCPA&lt;/strong&gt; if personal data is involved. If you plan to use the data commercially or at scale, we recommend consulting with a legal professional.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;We've covered why Zillow data is a goldmine for real estate professionals, and the significant technical hurdles, such as map-based pagination and sophisticated anti-scraping defenses (PerimeterX), that make extracting it so difficult.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Zillow Rental Data Scraper&lt;/strong&gt; on Apify offers a robust, no-code solution to bypass these barriers. By leveraging residential proxies and mimicking real user behavior, you can reliably harvest rich datasets, including listings, high-resolution photos, and pricing history.&lt;/p&gt;

&lt;p&gt;Ready to start scraping? &lt;strong&gt;&lt;a href="https://apify.com/eunit/zillow-scraper" rel="noopener noreferrer"&gt;Get the Zillow Rental Data Scraper&lt;/a&gt;&lt;/strong&gt;!&lt;/p&gt;

</description>
      <category>zillow</category>
      <category>realestate</category>
      <category>apify</category>
      <category>scraping</category>
    </item>
    <item>
      <title>How to Scrape LinkedIn Job Postings with Python: A Step-by-Step Guide</title>
      <dc:creator>Emmanuel Uchenna</dc:creator>
      <pubDate>Thu, 18 Dec 2025 19:17:45 +0000</pubDate>
      <link>https://dev.to/eunit/how-to-scrape-linkedin-job-postings-with-python-a-step-by-step-guide-5bi3</link>
      <guid>https://dev.to/eunit/how-to-scrape-linkedin-job-postings-with-python-a-step-by-step-guide-5bi3</guid>
      <description>&lt;p&gt;LinkedIn is the world's largest professional network, making it a goldmine for data on job market trends, company hiring patterns, and employment opportunities. Whether you are a recruiter, a market researcher, or a developer building a job board aggregator, automating the collection of this data can provide a significant competitive advantage.&lt;/p&gt;

&lt;p&gt;In this article, we will walk you through how to build a robust &lt;a href="https://apify.com/eunit/linkedin-jobs-scraper" rel="noopener noreferrer"&gt;&lt;strong&gt;LinkedIn Job Postings Scraper&lt;/strong&gt;&lt;/a&gt; using Python. We will cover how to handle common challenges, such as infinite scrolling and anti-bot protections, using the &lt;a href="https://apify.com/proxy" rel="noopener noreferrer"&gt;Apify Residential Proxies&lt;/a&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Tip:&lt;/strong&gt; Don't want to build it from scratch? Check out the ready-to-use, production-grade &lt;a href="https://apify.com/eunit/linkedin-job-postings-scraper" rel="noopener noreferrer"&gt;LinkedIn Job Postings Scraper&lt;/a&gt; Actor on the &lt;a href="https://apify.com/store?fpr=eunit" rel="noopener noreferrer"&gt;Apify Store&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Why Scrape LinkedIn Job Data?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwrer6yv4l7yc78c0g954.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwrer6yv4l7yc78c0g954.png" alt="LinkedIn Jobs page" width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Scraping LinkedIn data opens up a wide range of powerful use cases:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Lead Generation:&lt;/strong&gt; Create highly targeted lists based on job titles, industries, or specific technical skills to drive personalized and effective outreach campaigns.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Competitor Intelligence:&lt;/strong&gt; Gain insights into the competitive landscape by tracking your competitors' hiring patterns, growth trajectories, and organizational structures.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Recruitment &amp;amp; Sourcing:&lt;/strong&gt; Go beyond basic keyword matching to discover passive candidates and build deep talent pipelines filled with the exact skills you need.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Market Research:&lt;/strong&gt; Monitor emerging industry trends, skill demand shifts, and the geographical distribution of talent.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Academic Studies:&lt;/strong&gt; Gather data for analyzing labor market dynamics, professional migration patterns, and economic correlations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In short, if valuable data exists on a public LinkedIn page, &lt;a href="https://www.datacamp.com/blog/ethical-web-scraping" rel="noopener noreferrer"&gt;ethical web scraping&lt;/a&gt; is a scalable method for aggregating it for business or research insights.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Challenges of Scraping LinkedIn
&lt;/h2&gt;

&lt;p&gt;LinkedIn is known for its strict anti-scraping measures. If you try to scrape it with a simple script, you will likely face:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;IP Bans:&lt;/strong&gt; Frequent requests from the same IP address will trigger rate limits.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Infinite Scrolling:&lt;/strong&gt; Job lists load dynamically as you scroll, complicating pagination.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Login Walls:&lt;/strong&gt; Many pages require authentication, which risks flagging your personal account.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;To overcome these challenges, we will utilize the &lt;a href="https://docs.apify.com/sdk/python/" rel="noopener noreferrer"&gt;&lt;strong&gt;Apify SDK for Python&lt;/strong&gt;&lt;/a&gt; and &lt;a href="https://docs.apify.com/platform/proxy/residential-proxy" rel="noopener noreferrer"&gt;&lt;strong&gt;Residential Proxies&lt;/strong&gt;&lt;/a&gt;, which enable us to route requests through legitimate devices, making our traffic indistinguishable from real users.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before we start, ensure you have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Python 3.8+&lt;/strong&gt; installed on your machine.&lt;/li&gt;
&lt;li&gt;An &lt;strong&gt;Apify Account&lt;/strong&gt; (you can &lt;a href="https://console.apify.com/sign-up?fpr=eunit" rel="noopener noreferrer"&gt;sign up for free&lt;/a&gt;).&lt;/li&gt;
&lt;li&gt;Basic knowledge of &lt;a href="https://developer.mozilla.org/en-US/docs/Web/CSS/Guides/Selectors" rel="noopener noreferrer"&gt;&lt;strong&gt;CSS selectors&lt;/strong&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 1: Setting Up the Environment
&lt;/h2&gt;

&lt;p&gt;We will use the &lt;a href="https://docs.apify.com/sdk/python/" rel="noopener noreferrer"&gt;Apify Python SDK&lt;/a&gt; to manage our scraper's execution and storage. You can start by using the &lt;a href="https://docs.apify.com/cli" rel="noopener noreferrer"&gt;Apify CLI&lt;/a&gt; to create a new boilerplate project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-g&lt;/span&gt; apify-cli
apify create linkedin-scraper &lt;span class="nt"&gt;-t&lt;/span&gt; python-start
&lt;span class="nb"&gt;cd &lt;/span&gt;linkedin-scraper
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Install the necessary Python libraries:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;apify httpx beautifulsoup4 httpx-socks
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;apify&lt;/code&gt;&lt;/strong&gt;: For managing the Actor's lifecycle and storage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;httpx&lt;/code&gt;&lt;/strong&gt;: A modern, asynchronous HTTP client.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;beautifulsoup4&lt;/code&gt;&lt;/strong&gt;: For parsing HTML content.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 2: Handling Infinite Scrolling and Pagination
&lt;/h2&gt;

&lt;p&gt;LinkedIn's job search page uses infinite scrolling. Instead of trying to simulate scroll events (which is slow and flaky), we can reverse-engineer the hidden internal API used by the frontend.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# The base URL pattern
&lt;/span&gt;&lt;span class="n"&gt;list_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://www.linkedin.com/jobs-guest/jobs/api/seeMoreJobPostings/search&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This approach allows us to scrape thousands of jobs without ever needing to render the full page in a browser, significantly speeding up the process.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Implementing Residential Proxies
&lt;/h2&gt;

&lt;p&gt;This is the most critical part. To avoid getting blocked, you must use high-quality proxies. The &lt;a href="https://apify.com/eunit/linkedin-job-postings-scraper" rel="noopener noreferrer"&gt;LinkedIn Job Postings Scraper&lt;/a&gt; is robustly designed to use Apify Residential Proxies when available, with a fallback mechanism for local testing.&lt;/p&gt;

&lt;p&gt;Here is how we configure the &lt;code&gt;httpx.AsyncClient&lt;/code&gt; to use a specific proxy country (e.g., US) to ensure we see relevant job data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Handle Proxy Configuration
&lt;/span&gt;&lt;span class="n"&gt;proxy_country&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;actor_input&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;proxyCountry&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;proxy_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="c1"&gt;# Attempt to use specific Residential Proxies (e.g., US)
&lt;/span&gt;    &lt;span class="n"&gt;proxy_configuration&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Actor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create_proxy_configuration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;groups&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;RESIDENTIAL&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;country_code&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;proxy_country&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;proxy_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;proxy_configuration&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new_url&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;Actor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;log&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;warning&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Residential proxy unavailable: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;. Falling back...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="c1"&gt;# ... fallback logic handles local execution safely ...
&lt;/span&gt;
&lt;span class="c1"&gt;# Configure the HTTP client
&lt;/span&gt;&lt;span class="n"&gt;client_kwargs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;timeout&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;30.0&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;proxy_url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;client_kwargs&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;proxy&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;proxy_url&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;httpx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;AsyncClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;client_kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="c1"&gt;# ... scraping logic ...
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;By allowing users to specify a &lt;code&gt;proxyCountry&lt;/code&gt;, the scraper can view job listings exactly as they appear to a user in that region.&lt;/p&gt;
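&lt;p&gt;For local testing outside the platform, the same proxy can be reached by assembling Apify's documented proxy URL by hand: the username encodes the proxy group and country, and the password is the proxy password from your Apify Console (read from an environment variable in this sketch).&lt;/p&gt;

```python
import os

def local_proxy_url(country="US"):
    # Apify proxy credentials: the username encodes the group and country,
    # the password is your account's proxy password (kept out of the code).
    password = os.environ.get("APIFY_PROXY_PASSWORD", "")
    username = f"groups-RESIDENTIAL,country-{country}"
    return f"http://{username}:{password}@proxy.apify.com:8000"
```

&lt;p&gt;Passing this URL as the &lt;code&gt;proxy&lt;/code&gt; argument to &lt;code&gt;httpx.AsyncClient&lt;/code&gt; gives you the same routing locally that &lt;code&gt;Actor.create_proxy_configuration&lt;/code&gt; provides on the platform.&lt;/p&gt;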

&lt;h2&gt;
  
  
  Step 4: Scraping and Parsing Data
&lt;/h2&gt;

&lt;p&gt;Once we have the HTML, we use &lt;a href="https://pypi.org/project/beautifulsoup4/" rel="noopener noreferrer"&gt;BeautifulSoup&lt;/a&gt; to extract key details, such as the Job Title, Company Name, Location, and Posting Date.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_job_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;soup&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;title&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;soup&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;h2&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;top-card-layout__title&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;get_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;strip&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;company&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;soup&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;a&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;topcard__org-name-link&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;get_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;strip&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;location&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;soup&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;span&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;topcard__flavor--bullet&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;get_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;strip&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;job_url&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;soup&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;a&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;topcard__link&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;href&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="c1"&gt;# ... more fields ...
&lt;/span&gt;    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
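
&lt;p&gt;One caveat with the snippet above: &lt;code&gt;find()&lt;/code&gt; returns &lt;code&gt;None&lt;/code&gt; whenever LinkedIn renames a class, and calling &lt;code&gt;get_text()&lt;/code&gt; on &lt;code&gt;None&lt;/code&gt; raises an &lt;code&gt;AttributeError&lt;/code&gt;. A small guard (a sketch, using the same field selectors as above) keeps one missing field from crashing the whole record:&lt;/p&gt;

```python
def safe_text(node):
    # find() yields None when a selector misses; return None for that
    # one field instead of raising AttributeError for the whole record.
    return node.get_text(strip=True) if node is not None else None

def get_job_data(soup):
    return {
        "title": safe_text(soup.find("h2", "top-card-layout__title")),
        "company": safe_text(soup.find("a", "topcard__org-name-link")),
        "location": safe_text(soup.find("span", "topcard__flavor--bullet")),
    }
```

&lt;p&gt;Records with a &lt;code&gt;None&lt;/code&gt; field can then be logged and retried rather than lost, which matters when a run covers thousands of postings.&lt;/p&gt;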



&lt;h2&gt;
  
  
  Running the Actor
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmc8u6bazw8if988etm7g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmc8u6bazw8if988etm7g.png" alt="LinkedIn Job Postings Scraper on Apify" width="800" height="439"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can run this scraper directly on the &lt;a href="https://apify.com/eunit/linkedin-job-postings-scraper" rel="noopener noreferrer"&gt;Apify platform&lt;/a&gt; (&lt;em&gt;recommended&lt;/em&gt;):&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to the &lt;a href="https://apify.com/eunit/linkedin-job-postings-scraper" rel="noopener noreferrer"&gt;LinkedIn Job Postings Scraper&lt;/a&gt; page.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Try for free&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Fill in your input:

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Keywords&lt;/strong&gt;: &lt;code&gt;Software Engineer&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Location&lt;/strong&gt;: &lt;code&gt;United States&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Proxy Country&lt;/strong&gt;: &lt;code&gt;US&lt;/code&gt; (Recommended for US jobs)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Hit &lt;strong&gt;Start&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F60t918zq2op44bnv7dd5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F60t918zq2op44bnv7dd5.png" alt="Actor input page" width="800" height="443"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Output Example
&lt;/h3&gt;

&lt;p&gt;The data will be clean, structured, and ready for use.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"software Engineer I"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"company"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"The Walt Disney Company"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"job_url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https: //www.linkedin.com/jobs/view/software-engineer-i-at-the-walt-disney-company-3970118620?trk=public_jobs_topcard-title"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"logo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https: //media.licdn.com/dms/image/D560BAQGmPH1QqmCzkg/company-logo_100_100/0/1688494223337/fieldguide_inc_logo?e=2147483647&amp;amp;v=beta&amp;amp;t=KqO_GS4C7oH3nVoyvtDCXbbhAvj2OeLbfty0yJYZCbc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"date_posted"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"4 days ago"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"pay_range"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"$98,000.00/yr - $131,300.00/yr"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"seattle, WA"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"organization"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"The Walt Disney Company"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"number_of_applicants"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Be among the first 25 applicants"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"job_description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"On any given day at Disney Entertainment &amp;amp; ESPN Technology, we’re reimagining ways to create magical viewing experiences for the world’s most beloved stories while also transforming Disney’s media business for the future. Whether that’s evolving our streaming and digital products in new and immersive ways, powering worldwide advertising and distribution to maximize flexibility and efficiency, or delivering Disney’s unmatched entertainment and sports content, every day is a moment to make a difference to partners and to hundreds of millions of people around the world.&lt;/span&gt;&lt;span class="se"&gt;\n\n\n\n&lt;/span&gt;&lt;span class="s2"&gt;You will be working on the Developer Experience team under the Consumer Software Engineering organization to build out efficiency tools that bring benefits across developer, QA, product and UX teams. Your contribution will have a broad impact, and it will not only save engineers time and allow them to deliver product features faster, but also help the company save a lot of financial resources. At the same time, you will enjoy fast personal growth while solving exciting and ambitious technical problems!&lt;/span&gt;&lt;span class="se"&gt;\n\n\n\n&lt;/span&gt;&lt;span class="s2"&gt;You will have the opportunity to work closely with client developers to build industry leading solutions to enable engineers to remotely access and control streaming devices. You will also build an innovative tool for managing and automating streaming hardware and for creating device labs for Disney engineering teams. 
Your day to day involves gathering feature requests from users, developing and testing your solutions, and deploying them into production.&lt;/span&gt;&lt;span class="se"&gt;\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n&lt;/span&gt;&lt;span class="s2"&gt;Bachelor’s degree in Computer Science, Information Systems, Software, Electrical or Electronics Engineering, or comparable field of study, and/or equivalent work experience&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="s2"&gt;The hiring range for this position in Seattle, WA and NY, NY is $98,000 to $131,300 per year based on a 40 hour work week. The amount of hours scheduled per week may vary based on business needs. The base pay actually offered will take into account internal equity and also may vary depending on the candidate’s geographic region, job-related knowledge, skills, and experience among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"seniority_level"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Mid-Senior level"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"employment_type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Full-time"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"job_function"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Information Technology"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"industries"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Entertainment Providers"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"software Engineer"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"company"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Fieldguide"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"job_url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https: //www.linkedin.com/jobs/view/software-engineer-at-fieldguide-3961092714?trk=public_jobs_topcard-title"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"logo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https: //media.licdn.com/dms/image/D560BAQGmPH1QqmCzkg/company-logo_100_100/0/1688494223337/fieldguide_inc_logo?e=2147483647&amp;amp;v=beta&amp;amp;t=KqO_GS4C7oH3nVoyvtDCXbbhAvj2OeLbfty0yJYZCbc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"date_posted"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2 weeks ago"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"pay_range"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"$125,000.00/yr - $167,000.00/yr"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"san Francisco, CA"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"organization"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Fieldguide"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"number_of_applicants"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Be among the first 25 applicants"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"job_description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"About Us:&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="s2"&gt;Fieldguide is establishing a new state of trust for global commerce and capital markets through automating and streamlining the work of assurance and audit practitioners specifically within cybersecurity, privacy, and ESG (Environmental, Social, Governance). Put simply, we build software for the people who enable trust between businesses.&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="s2"&gt;We’re based in San Francisco, CA, but built as a remote-first company that enables you to do your best work from anywhere. We&lt;/span&gt;&lt;span class="se"&gt;\"&lt;/span&gt;&lt;span class="s2"&gt;re backed by top investors including Bessemer Venture Partners, 8VC, Floodgate, Y Combinator, DNX Ventures, Global Founders Capital, Justin Kan, Elad Gil, and more.&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="s2"&gt;We value diversity — in backgrounds and in experiences. We need people from all backgrounds and walks of life to help build the future of audit and advisory. Fieldguide’s team is inclusive, driven, humble and supportive. We are deliberate and self-reflective about the kind of team and culture that we are building, seeking teammates that are not only strong in their own aptitudes but care deeply about supporting each other&lt;/span&gt;&lt;span class="se"&gt;\"&lt;/span&gt;&lt;span class="s2"&gt;s growth.&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="s2"&gt;As an early stage start-up employee, you’ll have the opportunity to build out the future of business trust. We make audit practitioners’ lives easier by eliminating up to 50% of their work and giving them better work-life balance. 
If you share our values and enthusiasm for building a great culture and product, you will find a home at Fieldguide.&lt;/span&gt;&lt;span class="se"&gt;\n\n\n\n&lt;/span&gt;&lt;span class="s2"&gt;As a Software Engineer at Fieldguide, you’ll be an early member of the team, taking a front-row seat as we build both the company and the engineering organization to tackle the massive and archaic audit and advisory industry.&lt;/span&gt;&lt;span class="se"&gt;\n\n\n\n\n\n\n\n\n\n\n\n&lt;/span&gt;&lt;span class="s2"&gt;Fieldguide is a values-based company. Our values are:&lt;/span&gt;&lt;span class="se"&gt;\n\n\n\n\n\n\n\n&lt;/span&gt;&lt;span class="s2"&gt;Compensation Range: $125K - $167K"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"seniority_level"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Entry level"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"employment_type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Full-time"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"job_function"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Engineering and Information Technology"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"industries"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"software Development"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="err"&gt;//&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;more&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;jobs&lt;/span&gt;&lt;span class="w"&gt; 
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you need the data in formats other than JSON, Apify allows you to export your dataset to various formats, including CSV, Excel, XML, and more.&lt;/p&gt;
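
&lt;p&gt;Export can also be done programmatically against the Apify Dataset items endpoint, where the &lt;code&gt;format&lt;/code&gt; query parameter selects the output format. The dataset ID below is a placeholder; private datasets additionally require your API token.&lt;/p&gt;

```python
def export_url(dataset_id, fmt="csv"):
    # Apify Dataset items endpoint; 'format' selects the export format
    # (json, csv, xlsx, xml, rss, ...).
    return f"https://api.apify.com/v2/datasets/{dataset_id}/items?format={fmt}"
```

&lt;p&gt;Fetching that URL with any HTTP client downloads the run's results in the chosen format, ready to drop into a spreadsheet or pipeline.&lt;/p&gt;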

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fumtngqzg1dyzube1t1r0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fumtngqzg1dyzube1t1r0.png" alt="Export data to multiple formats" width="800" height="547"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;Building your own scraper is a great learning experience, but maintaining it against the constant changes to the LinkedIn website and anti-scraping measures can be a full-time job.&lt;/p&gt;

&lt;p&gt;If you need a reliable, maintenance-free solution that handles proxy rotation, scaling, and data parsing for you, try the &lt;strong&gt;&lt;a href="https://apify.com/eunit/linkedin-job-postings-scraper" rel="noopener noreferrer"&gt;LinkedIn Job Postings Scraper&lt;/a&gt;&lt;/strong&gt; on &lt;a href="https://www.apify.com?fpr=eunit" rel="noopener noreferrer"&gt;Apify&lt;/a&gt; today. It is designed to be fast, efficient, and easy to integrate into your existing workflows.&lt;/p&gt;

&lt;h3&gt;
  
  
  Resources
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://apify.com/eunit/linkedin-job-postings-scraper" rel="noopener noreferrer"&gt;LinkedIn Job Postings Scraper&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.apify.com/sdk/python" rel="noopener noreferrer"&gt;Apify Python SDK Documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://blog.apify.com/is-web-scraping-legal/" rel="noopener noreferrer"&gt;Is Web Scraping Legal?&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>scraping</category>
      <category>jobs</category>
      <category>linkedin</category>
      <category>apify</category>
    </item>
  </channel>
</rss>
