<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Boon</title>
    <description>The latest articles on DEV Community by Boon (@boo_n).</description>
    <link>https://dev.to/boo_n</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3838197%2F9e94f098-8d72-4591-b8d0-b4a5947b3f75.jpg</url>
      <title>DEV Community: Boon</title>
      <link>https://dev.to/boo_n</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/boo_n"/>
    <language>en</language>
    <item>
      <title>I built a Shopify scraper that detects apps + pulls products in one API call</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sat, 02 May 2026 11:50:01 +0000</pubDate>
      <link>https://dev.to/boo_n/i-built-a-shopify-scraper-that-detects-apps-pulls-products-in-one-api-call-5a8b</link>
      <guid>https://dev.to/boo_n/i-built-a-shopify-scraper-that-detects-apps-pulls-products-in-one-api-call-5a8b</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;TL;DR&lt;/strong&gt; — Existing Shopify app detectors (Koala Inspector, ShopScan, Fera, BuiltWith) are Chrome extensions or SaaS dashboards. None do batch. I had 1,200 stores to qualify and View Source + Cmd-F was killing my afternoons, so I shipped an Apify actor that takes a list of Shopify URLs and returns the full app stack (Klaviyo, Yotpo, Judge.me, Loox, ReCharge…) + product catalog + reviews in JSON. No headless browser, ~$0.005 per store, 1,000 stores in 25 minutes. Live here → &lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Shopify Scraper – Apps Spy + Reviews&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  The afternoon that broke me
&lt;/h2&gt;

&lt;p&gt;Six weeks ago I was prospecting for a B2B side-project. The hypothesis: "Shopify stores running Klaviyo + a paid reviews app are the right ICP — they spend money on retention tooling, so they will pay for ours."&lt;/p&gt;

&lt;p&gt;To validate, I needed a list of Shopify stores &lt;strong&gt;and&lt;/strong&gt; their installed apps.&lt;/p&gt;

&lt;p&gt;The Shopify App Store does not give you that. The "stores using X" databases do, but the public ones are stale and the good ones are paid SaaS at $99–499/month for filters I did not need.&lt;/p&gt;

&lt;p&gt;So I did what every founder does at 11 PM: I opened View Source on a competitor list, hit &lt;code&gt;Cmd-F&lt;/code&gt;, and started typing &lt;code&gt;klaviyo&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;It worked. Sort of. I did 40 stores in two hours, then stopped because I had a list of 1,200.&lt;/p&gt;

&lt;p&gt;That night I wrote the first version of what is now &lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Shopify Scraper – Apps Spy + Reviews&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I actually wanted
&lt;/h2&gt;

&lt;p&gt;Every "Shopify scraper" I found online did one of two things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Scraped a single store's products via &lt;code&gt;/products.json&lt;/code&gt; — table-stakes, dozens of free Apify actors do it.&lt;/li&gt;
&lt;li&gt;Spawned a headless browser to fingerprint a marketing site — slow, expensive, and brittle.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I wanted three things in one pass:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Full &lt;strong&gt;product catalog&lt;/strong&gt; (titles, prices, variants, images, vendor, tags) — nothing exotic.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;App detection&lt;/strong&gt;: which third-party Shopify apps are installed (email, reviews, subscriptions, popups, search).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reviews&lt;/strong&gt; when a reviews app is detected — pull them via the public API, not by parsing widgets.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And I wanted it to be &lt;strong&gt;cheap&lt;/strong&gt;, because I had ~1,200 stores in my first batch and I planned to run it monthly.&lt;/p&gt;

&lt;h2&gt;
  
  
  The "no headless browser" decision
&lt;/h2&gt;

&lt;p&gt;The thing nobody tells you about Shopify scraping is that you almost never need a headless browser. The signals you want for app detection live in three places, and all three are reachable with a plain HTTPS GET:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;The HTML of the homepage&lt;/strong&gt;. Shopify apps inject &lt;code&gt;&amp;lt;script&amp;gt;&lt;/code&gt; tags from their own CDN. &lt;code&gt;cdn.judge.me&lt;/code&gt;, &lt;code&gt;cdn.yotpo.com&lt;/code&gt;, &lt;code&gt;loox.io/widget&lt;/code&gt;, &lt;code&gt;klaviyo.com/onsite&lt;/code&gt; — you grep the HTML and you know.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;/products.json&lt;/code&gt;&lt;/strong&gt;. Shopify exposes the full catalog at this path on every store, paginated 250 items at a time. No auth, no headless. (You hit a soft rate limit around 2 req/s per IP, which is fine if you queue politely.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;App-specific public endpoints&lt;/strong&gt;. Judge.me has a JSON reviews endpoint. Yotpo too. Same for Loox, Stamped, Reviews.io. Once you know which app is installed, you go straight to its API — no DOM parsing.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The whole actor is built around that observation. No Puppeteer, no Playwright, no proxy farm. Just &lt;code&gt;got-scraping&lt;/code&gt;, &lt;code&gt;cheerio&lt;/code&gt;, and &lt;code&gt;p-queue&lt;/code&gt; to keep concurrency civilized.&lt;/p&gt;
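
&lt;p&gt;The products half of that is essentially one loop. Here is a minimal sketch (illustrative names, not the actor's real code; the JSON fetcher is injected so the pagination logic stays testable, and in production that wrapper sits on &lt;code&gt;got-scraping&lt;/code&gt;):&lt;/p&gt;

```javascript
// Sketch: paginate a store's public /products.json, 250 items per page.
// fetchJson is injected so the loop can be exercised without the network.
async function fetchCatalog(domain, fetchJson, maxPages) {
  const products = [];
  for (let page = 1; page !== maxPages + 1; page += 1) {
    const qs = new URLSearchParams({ limit: "250", page: String(page) });
    const body = await fetchJson("https://" + domain + "/products.json?" + qs);
    if (!body || !body.products || body.products.length === 0) break;
    products.push(...body.products);
    if (body.products.length !== 250) break; // short page: catalog exhausted
  }
  return products;
}
```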

&lt;p&gt;The result is that scanning a single store takes ~3–6 HTTPS requests and runs in &lt;strong&gt;2 to 8 seconds&lt;/strong&gt; depending on catalog size. Cost on Apify infra: about $0.005 per store for the "tech stack only" mode.&lt;/p&gt;

&lt;h2&gt;
  
  
  The architecture (it is small on purpose)
&lt;/h2&gt;

&lt;p&gt;I'll be honest — I almost over-engineered this. My first draft had Redis for de-dup, a queue, retry logic with exponential backoff, and a state machine. Then I deleted all of it.&lt;/p&gt;

&lt;p&gt;Here is what shipped:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;src/
├── main.js                   # orchestration (p-queue, per-store flow)
├── crawlers/
│   ├── products.js           # /products.json + sitemap fallback
│   ├── apps.js               # detect apps from homepage HTML
│   └── reviews.js            # per-app reviews fetchers
└── lib/
    ├── normalize.js          # canonicalize URLs, normalize product schema
    ├── schemas.js            # zod validation for outputs
    └── billing.js            # Apify pay-per-event charges
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A run goes:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Canonicalize the store URL (handles &lt;code&gt;www&lt;/code&gt;, custom domains, &lt;code&gt;*.myshopify.com&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;Fetch the homepage once. Confirm it is Shopify (the &lt;code&gt;x-shopify-stage&lt;/code&gt; header is a giveaway).&lt;/li&gt;
&lt;li&gt;From the same HTML, run the app detectors. Each detector is ~10 lines of regex matching against script tags + meta tags + inline JSON.&lt;/li&gt;
&lt;li&gt;Fetch &lt;code&gt;/products.json?page=N&lt;/code&gt; until you hit the cap or run out of products.&lt;/li&gt;
&lt;li&gt;If the user asked for reviews and an installed reviews app was detected, fan out to that app's public reviews API.&lt;/li&gt;
&lt;/ol&gt;
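
&lt;p&gt;Step 1 looks trivial and is not. A sketch of the normalization it has to do (my shorthand for the idea, not the actor's exact code):&lt;/p&gt;

```javascript
// Sketch: canonicalize a store URL (scheme, host case, www prefix).
// Custom domains and *.myshopify.com hosts both come out as bare https origins.
function canonicalizeStore(input) {
  let raw = input.trim();
  if (!/^https?:\/\//i.test(raw)) raw = "https://" + raw;
  const host = new URL(raw).hostname.toLowerCase();
  return "https://" + host.replace(/^www\./, "");
}
```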

&lt;p&gt;That is it. The whole thing is ~900 lines of JavaScript. I run it with &lt;code&gt;node --test&lt;/code&gt; for unit tests against snapshots and a &lt;code&gt;tests/smoke-products.mjs&lt;/code&gt; that hits 5 real stores end-to-end.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I learned about app detection
&lt;/h2&gt;

&lt;p&gt;The regex-against-HTML approach has one trap. Shopify themes minify, version, and CDN-rewrite their assets, so you cannot match on a single string. The Klaviyo loader, for example, ships under at least four URL patterns I have seen:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;static.klaviyo.com/onsite/js/klaviyo.js&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;static-tracking.klaviyo.com/onsite/js/...&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;a.klaviyo.com/media/...&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;inline &lt;code&gt;_learnq&lt;/code&gt; queue calls&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You match &lt;strong&gt;any&lt;/strong&gt; of those, and you call it Klaviyo. Same logic for every other app — every detector is an array of patterns, OR'd together, returning a single boolean. I wrote a snapshot test per app with a real store HTML page so a Klaviyo URL change does not silently break detection.&lt;/p&gt;
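
&lt;p&gt;In code, the shape of a detector is nothing more than this (patterns abbreviated; each shipped detector carries more variants than shown):&lt;/p&gt;

```javascript
// Sketch of the OR'd pattern-array detectors. Illustrative patterns only.
const DETECTORS = {
  Klaviyo: [
    /static\.klaviyo\.com\/onsite/,
    /static-tracking\.klaviyo\.com/,
    /a\.klaviyo\.com\/media/,
    /_learnq/,
  ],
  "Judge.me": [/cdn\.judge\.me/],
  Yotpo: [/cdn\.yotpo\.com/],
};

function detectApps(html) {
  const found = [];
  for (const [app, patterns] of Object.entries(DETECTORS)) {
    const hit = patterns.some(function (re) { return re.test(html); });
    if (hit) found.push(app);
  }
  return found;
}
```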

&lt;p&gt;The detectors I shipped on day one:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Email/SMS&lt;/strong&gt;: Klaviyo, Omnisend, Postscript, Mailchimp, Attentive&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reviews&lt;/strong&gt;: Yotpo, Judge.me, Loox, Stamped, Reviews.io, Okendo&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Subscriptions&lt;/strong&gt;: ReCharge, Bold, Loop, Skio&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Popups &amp;amp; SMS capture&lt;/strong&gt;: Privy, Justuno, Klaviyo Forms&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Search &amp;amp; discovery&lt;/strong&gt;: Searchanise, Boost, Algolia&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Loyalty&lt;/strong&gt;: Smile.io, Yotpo Loyalty, LoyaltyLion&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you tell me an app I missed, I add a detector. Each one is a 15-minute job.&lt;/p&gt;

&lt;h2&gt;
  
  
  The pay-per-event pricing problem
&lt;/h2&gt;

&lt;p&gt;Apify lets you charge per event instead of per compute minute. For a scraper that runs in seconds, this is the right model — your customer pays for the rows they get, not for compute time.&lt;/p&gt;

&lt;p&gt;The mistake I made on my first push was leaving Apify's default &lt;code&gt;dataset_item&lt;/code&gt; event on. Combined with my custom &lt;code&gt;product_extracted&lt;/code&gt; event, every product was being charged twice. I caught it in monetization review and removed the synthetic event.&lt;/p&gt;

&lt;p&gt;The pricing I landed on:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;store_analyzed&lt;/code&gt; — $0.003 per store (covers detection + products fetch)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;product_extracted&lt;/code&gt; — $0.0005 per product&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;apps_detected&lt;/code&gt; — $0.001 per store at standard+&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;review_extracted&lt;/code&gt; — $0.0003 per review&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A 500-product store with reviews costs roughly &lt;strong&gt;$0.30&lt;/strong&gt; end to end. For comparison, the SaaS competitors charge $99 or more per month for similar data, with hard caps on lookups.&lt;/p&gt;
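
&lt;p&gt;The arithmetic behind that number, with the review count as an assumption for illustration:&lt;/p&gt;

```javascript
// Back-of-envelope cost per store at the event prices listed above.
function storeCost(productCount, reviewCount) {
  const STORE_ANALYZED = 0.003;
  const APPS_DETECTED = 0.001;
  const PER_PRODUCT = 0.0005;
  const PER_REVIEW = 0.0003;
  return STORE_ANALYZED + APPS_DETECTED
    + productCount * PER_PRODUCT
    + reviewCount * PER_REVIEW;
}
// 500 products, ~150 reviews: 0.003 + 0.001 + 0.25 + 0.045, about $0.30
```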

&lt;h2&gt;
  
  
  What surprised me
&lt;/h2&gt;

&lt;p&gt;Three things, in order of how badly I underestimated them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. &lt;code&gt;/products.json&lt;/code&gt; is more honest than the storefront.&lt;/strong&gt; It exposes products that are unpublished from the theme but still live (out-of-stock holdovers, B2B-only SKUs). Useful for trend research. Sometimes shocking.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Reviews-app detection is a lead signal.&lt;/strong&gt; A store on Judge.me Free plan vs. Yotpo Premium tells you a lot about their stage. I ended up using this internally to prioritize cold outbound — different pitch for a $30/month stack vs. a $1,200/month stack.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. People want this as an MCP server.&lt;/strong&gt; Two of my first three users asked if they could query it from Claude / ChatGPT. I have it on the roadmap. (My &lt;a href="https://apify.com/kazkn/gpt-crawler-mcp" rel="noopener noreferrer"&gt;GPT Crawler MCP&lt;/a&gt; and &lt;a href="https://apify.com/kazkn/vinted-mcp-server" rel="noopener noreferrer"&gt;Vinted MCP Server&lt;/a&gt; are the two MCP actors I shipped first; the Shopify one is next.)&lt;/p&gt;

&lt;h2&gt;
  
  
  How to use it in one minute
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// On Apify, paste this in the actor input box&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;store_urls&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://allbirds.com&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://gymshark.com&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;extract_level&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;standard&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;// products + apps stack&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;max_products_per_store&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;250&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output (one record per product, with &lt;code&gt;apps_detected&lt;/code&gt; attached):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"store_domain"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"allbirds.com"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"product_title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Wool Runner"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;110&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"available"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"vendor"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allbirds"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"apps_detected"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"email"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Klaviyo"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"reviews"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Yotpo"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"subscriptions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"search"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Searchanise"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"product_url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://allbirds.com/products/mens-wool-runners"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you want reviews, set &lt;code&gt;extract_level: "full"&lt;/code&gt; and a &lt;code&gt;max_reviews_per_product&lt;/code&gt;. The actor will route to the correct reviews API based on what was detected.&lt;/p&gt;

&lt;p&gt;Direct link, free $5 credit on Apify, no account-creation drama: &lt;strong&gt;&lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Shopify Scraper – Apps Spy + Reviews&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  FAQ
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Is scraping &lt;code&gt;/products.json&lt;/code&gt; allowed?
&lt;/h3&gt;

&lt;p&gt;Shopify exposes &lt;code&gt;/products.json&lt;/code&gt; publicly on every store by default. Stores that disable it (rare) return 404 and the actor logs a skip. The actor never authenticates, never bypasses access controls, and respects standard rate limits.&lt;/p&gt;

&lt;h3&gt;
  
  
  What about reCAPTCHA or Cloudflare?
&lt;/h3&gt;

&lt;p&gt;Not an issue for the standard catalog and app-detection flow: &lt;code&gt;/products.json&lt;/code&gt; and the homepage HTML are not gated. Some reviews APIs do rate-limit very high request volumes — the actor backs off and retries with jitter.&lt;/p&gt;

&lt;h3&gt;
  
  
  How is this different from Koala Inspector, ShopScan or BuiltWith?
&lt;/h3&gt;

&lt;p&gt;Koala Inspector, ShopScan and Fera are excellent Chrome extensions for one-store lookups, but none of them do batch — you cannot paste 500 URLs and get a CSV back. BuiltWith is a generic tech-stack tool with broad coverage but its Shopify-app detection is shallow and you cannot pull products in the same call. This actor is purpose-built for Shopify and runs in batch via API: deeper app detection (subscriptions, reviews, popups, search, loyalty), full product catalog, and reviews — all in one pass, billed pay-per-event.&lt;/p&gt;

&lt;h3&gt;
  
  
  How long does a 1,000-store scan take?
&lt;/h3&gt;

&lt;p&gt;About 25 minutes at default concurrency, costing ~$3 of Apify credits at the &lt;code&gt;standard&lt;/code&gt; level. A &lt;code&gt;full&lt;/code&gt; run with reviews is closer to an hour and ~$15 depending on review volume.&lt;/p&gt;

&lt;h3&gt;
  
  
  Can I get one record per variant instead of per product?
&lt;/h3&gt;

&lt;p&gt;Yes. Set &lt;code&gt;include_variants: true&lt;/code&gt; in the input and the dataset returns one row per SKU with size/color/price/availability normalized.&lt;/p&gt;
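
&lt;p&gt;Under the hood this is a flatten over the &lt;code&gt;variants&lt;/code&gt; array that &lt;code&gt;/products.json&lt;/code&gt; already exposes. A sketch (the output columns here are illustrative, not the exact dataset schema):&lt;/p&gt;

```javascript
// Sketch: one row per SKU from a /products.json product record.
function variantRows(product) {
  return product.variants.map(function (v) {
    return {
      product_title: product.title,
      variant_title: v.title,   // e.g. "US 9 / Grey"
      sku: v.sku,
      price: Number(v.price),   // /products.json ships price as a string
      available: v.available,
    };
  });
}
```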

&lt;h2&gt;
  
  
  What is next
&lt;/h2&gt;

&lt;p&gt;I want to add three things, in order:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Revenue estimation&lt;/strong&gt; at the &lt;code&gt;pro&lt;/code&gt; tier — based on review velocity and product velocity, both of which are observable.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MCP server mode&lt;/strong&gt; so you can query it from Claude desktop / Cursor.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Theme detection&lt;/strong&gt; — useful for agency outbound, less useful for me, but I keep being asked.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you use it and something breaks, ping me — I am the only maintainer and I read every issue. The actor is on Apify Store at &lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;kazkn/shopify-scraper-apps-spy&lt;/a&gt;.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Tags: #shopify #ecommerce #api #indiehackers&lt;/em&gt;&lt;/p&gt;

</description>
      <category>shopify</category>
      <category>ecommerce</category>
      <category>api</category>
      <category>indiehackers</category>
    </item>
    <item>
      <title>How to Batch-Scrape 10 Vinted Search URLs in One Run: A Reseller's Workflow</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Fri, 24 Apr 2026 15:29:40 +0000</pubDate>
      <link>https://dev.to/boo_n/how-to-batch-scrape-10-vinted-search-urls-in-one-run-a-resellers-workflow-4ed1</link>
      <guid>https://dev.to/boo_n/how-to-batch-scrape-10-vinted-search-urls-in-one-run-a-resellers-workflow-4ed1</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgi37z1th3g2d3jrx6hlc.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgi37z1th3g2d3jrx6hlc.jpg" alt="Vinted Turbo Scraper Batch Workflow" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Two weeks ago, a reseller I work with asked me a surprisingly common question:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"I track 12 different brands across Vinted. That means 12 different search URLs, 12 different filter combinations, 12 different countries depending on where inventory is cheapest. Right now I run them one by one. How do I automate all of this without writing orchestration code?"&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The short answer: &lt;strong&gt;paste all 12 URLs at once&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;The tool: &lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This article breaks down the exact batch workflow I set up for him — and why it's the most underused feature of the Actor.&lt;/p&gt;

&lt;h2&gt;
  
  
  📺 Watch the Tutorial
&lt;/h2&gt;

&lt;p&gt;  &lt;iframe src="https://www.youtube.com/embed/rWtZVDMflbo"&gt;
  &lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Batch URLs Matter (More Than Speed)
&lt;/h2&gt;

&lt;p&gt;Speed is great. But for serious resellers and market researchers, &lt;strong&gt;throughput&lt;/strong&gt; is what actually moves the needle.&lt;/p&gt;

&lt;p&gt;Here's the reality of single-URL scraping:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Workflow&lt;/th&gt;
&lt;th&gt;Time per URL&lt;/th&gt;
&lt;th&gt;Total for 10 URLs&lt;/th&gt;
&lt;th&gt;Manual Steps&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Run manually, one by one&lt;/td&gt;
&lt;td&gt;3–5 min&lt;/td&gt;
&lt;td&gt;30–50 min&lt;/td&gt;
&lt;td&gt;10 separate runs, 10 separate exports&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Batch all URLs in one run&lt;/td&gt;
&lt;td&gt;3–5 min total&lt;/td&gt;
&lt;td&gt;3–5 min&lt;/td&gt;
&lt;td&gt;1 run, 1 export&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Most people never think about batching because:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Vinted's own site doesn't suggest it&lt;/li&gt;
&lt;li&gt;Most Python scrapers on GitHub are built for single requests&lt;/li&gt;
&lt;li&gt;Apify Actors often look like they only accept one input&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Vinted Turbo Scraper accepts an array of URLs.&lt;/strong&gt; Not one. An array. Paste 10, get the results of all 10 merged into a single export.&lt;/p&gt;

&lt;h2&gt;
  
  
  Real-World Use Cases
&lt;/h2&gt;

&lt;p&gt;Before diving into the how-to, here's who actually uses this feature:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Multi-brand resellers&lt;/strong&gt; — tracking Nike, Adidas, Jordan, New Balance, and vintage Patagonia simultaneously across the same country domain.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cross-country arbitrageurs&lt;/strong&gt; — monitoring the same keyword across Vinted.fr, Vinted.de, Vinted.nl, and Vinted.pl to spot price gaps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Category monitors&lt;/strong&gt; — running separate searches for Men's Sneakers, Women's Sneakers, and Kids' Sneakers with different size filters.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deal hunters&lt;/strong&gt; — setting up 15–20 hyper-specific filter combos (brand + price + size + condition + color) and running them all nightly via Apify's scheduler.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Batch-Scrape Vinted URLs: Step-by-Step
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Step 1: Prepare your search URLs
&lt;/h3&gt;

&lt;p&gt;This is identical to single-URL mode. For each segment you want to track:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to Vinted (any country domain)&lt;/li&gt;
&lt;li&gt;Apply filters: brand, size, price range, condition, color, category&lt;/li&gt;
&lt;li&gt;Copy the URL from your browser bar&lt;/li&gt;
&lt;li&gt;Repeat for every segment you want&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Example URL set for a sneaker reseller:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://www.vinted.fr/catalog?search_text=jordan&amp;amp;price_from=50&amp;amp;price_to=100&amp;amp;size_from=42&amp;amp;size_to=44&amp;amp;status_id=6
https://www.vinted.fr/catalog?search_text=nike&amp;amp;price_from=30&amp;amp;price_to=80&amp;amp;size_from=40&amp;amp;size_to=45&amp;amp;status_id=6
https://www.vinted.de/catalog?search_text=jordan&amp;amp;price_from=40&amp;amp;price_to=90&amp;amp;status_id=6
https://www.vinted.nl/catalog?search_text=nike&amp;amp;price_from=35&amp;amp;price_to=85&amp;amp;status_id=6
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Each URL already contains the exact filters you need. No rebuilding required.&lt;/p&gt;
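
&lt;p&gt;If you would rather generate the URLs than copy them from the browser bar, the query parameters compose cleanly. A sketch (parameter names taken from the example set above; other filters follow the same pattern):&lt;/p&gt;

```javascript
// Sketch: build a Vinted search URL from a filter object.
function vintedSearchUrl(countryTld, filters) {
  const qs = new URLSearchParams();
  for (const [key, value] of Object.entries(filters)) {
    qs.set(key, String(value));
  }
  return "https://www.vinted." + countryTld + "/catalog?" + qs.toString();
}
// vintedSearchUrl("fr", { search_text: "jordan", price_from: 50, price_to: 100, status_id: 6 })
```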

&lt;h3&gt;
  
  
  Step 2: Configure the Actor for batch mode
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt; on the Apify Store&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Try for free&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;In the Input tab, locate the &lt;code&gt;searchURLs&lt;/code&gt; field&lt;/li&gt;
&lt;li&gt;Paste your URLs — one per line, or as a JSON array:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Plain text input:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://www.vinted.fr/catalog?search_text=jordan...
https://www.vinted.fr/catalog?search_text=nike...
https://www.vinted.de/catalog?search_text=jordan...
https://www.vinted.nl/catalog?search_text=nike...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;JSON input (for API/programmatic runs):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"searchURLs"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/catalog?search_text=jordan&amp;amp;price_from=50&amp;amp;price_to=100"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.de/catalog?search_text=jordan&amp;amp;price_from=40&amp;amp;price_to=90"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.nl/catalog?search_text=nike&amp;amp;price_from=35&amp;amp;price_to=85"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 3: Run and export
&lt;/h3&gt;

&lt;p&gt;Click &lt;strong&gt;Start&lt;/strong&gt;. The Actor processes each URL sequentially, but without the overhead of restarting a browser between URLs. Results from all URLs are merged into a single output dataset.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Export options:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;CSV&lt;/strong&gt; — flat table, one listing per row, with an added &lt;code&gt;source_url&lt;/code&gt; column showing which search URL each listing came from&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JSON&lt;/strong&gt; — structured object with all fields + source attribution&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Excel&lt;/strong&gt; — same as CSV, formatted for Excel&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Google Sheets&lt;/strong&gt; — live push to a shared spreadsheet&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API&lt;/strong&gt; — consume via the Apify API, perfect for custom pipelines&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step 4: Automate with scheduling (optional but recommended)
&lt;/h3&gt;

&lt;p&gt;Apify has a built-in scheduler. I set mine to run every day at 6 AM CET:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to the Actor's &lt;strong&gt;Schedules&lt;/strong&gt; tab in Apify Console&lt;/li&gt;
&lt;li&gt;Create a new schedule: &lt;strong&gt;Daily at 06:00&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Paste your URL array into the schedule input&lt;/li&gt;
&lt;li&gt;Set the destination: Google Sheets, webhook, or just store in Apify's key-value store&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Every morning, I wake up to a fresh dataset in my Google Sheet. Zero manual work.&lt;/p&gt;

&lt;h2&gt;
  
  
  What the Output Actually Looks Like
&lt;/h2&gt;

&lt;p&gt;When you batch URLs, the Actor adds one critical field to every record:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/items/987654321-nike-air-max-90"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Nike Air Max 90"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;55.00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"currency"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"EUR"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"brand"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Nike"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"43"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"condition"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Good"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"seller_username"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"paris_sneakers"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Paris, France"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"thumbnail"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://images1.vinted.net/..."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"source_url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/catalog?search_text=nike&amp;amp;price_from=30&amp;amp;price_to=80"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"scraped_at"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-04-24T06:03:15.000Z"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Key field:&lt;/strong&gt; &lt;code&gt;source_url&lt;/code&gt; tells you exactly which search query produced each listing. This is essential when you're running 10+ URLs in a single run and need to segment results later.&lt;/p&gt;
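&lt;p&gt;Segmenting a merged export back into per-query buckets is then only a few lines of code. A minimal sketch (the &lt;code&gt;source_url&lt;/code&gt; field matches the sample output above; the rest of the record shape is illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;interface Listing {
  source_url: string;
  title: string;
  price: number;
}

// Group a merged batch export back into per-search-URL buckets.
function groupBySourceUrl(listings: Listing[]): Map&amp;lt;string, Listing[]&amp;gt; {
  const groups = new Map&amp;lt;string, Listing[]&amp;gt;();
  for (const item of listings) {
    const bucket = groups.get(item.source_url) ?? [];
    bucket.push(item);
    groups.set(item.source_url, bucket);
  }
  return groups;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;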

&lt;h2&gt;
  
  
  Cost Breakdown for Batch Workflows
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Scenario&lt;/th&gt;
&lt;th&gt;URLs per run&lt;/th&gt;
&lt;th&gt;Avg listings per URL&lt;/th&gt;
&lt;th&gt;Total results&lt;/th&gt;
&lt;th&gt;Cost per run&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Small batch (3 brands, 1 country)&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;400&lt;/td&gt;
&lt;td&gt;1,200&lt;/td&gt;
&lt;td&gt;$1.80&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Medium batch (5 brands, 2 countries)&lt;/td&gt;
&lt;td&gt;10&lt;/td&gt;
&lt;td&gt;350&lt;/td&gt;
&lt;td&gt;3,500&lt;/td&gt;
&lt;td&gt;$5.25&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Large batch (15+ segments)&lt;/td&gt;
&lt;td&gt;15&lt;/td&gt;
&lt;td&gt;300&lt;/td&gt;
&lt;td&gt;4,500&lt;/td&gt;
&lt;td&gt;$6.75&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;With Apify's $5 monthly free credits, a small batch workflow costs you &lt;strong&gt;nothing&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;For active resellers running daily large batches: $6.75 × 30 days = ~$200/month. The time saved is roughly 2–3 hours per day of manual scraping, monitoring, and data cleaning. At any consulting rate, the ROI is immediate.&lt;/p&gt;
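&lt;p&gt;The arithmetic behind the table is just results times the per-result rate. A throwaway helper for budgeting your own batches (a sketch; the $0.0015-per-result rate is the Actor's published pricing):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;// Per-result pricing: $0.0015 per listing, regardless of URL count.
const PRICE_PER_RESULT = 0.0015;

// Cost of one run: URLs x average listings per URL x per-result rate.
function estimateRunCost(urls: number, avgListingsPerUrl: number): number {
  return urls * avgListingsPerUrl * PRICE_PER_RESULT;
}

// Rough monthly cost for a daily schedule.
function estimateMonthlyCost(dailyRunCost: number, days = 30): number {
  return dailyRunCost * days;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;code&gt;estimateRunCost(15, 300)&lt;/code&gt; reproduces the large-batch row ($6.75); &lt;code&gt;estimateMonthlyCost&lt;/code&gt; gives the ~$200/month figure for daily runs.&lt;/p&gt;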

&lt;h2&gt;
  
  
  Common Batch Pitfalls (and How to Avoid Them)
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Pitfall 1: Mixing country domains in one run&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The Actor handles &lt;code&gt;.fr&lt;/code&gt;, &lt;code&gt;.de&lt;/code&gt;, &lt;code&gt;.nl&lt;/code&gt;, &lt;code&gt;.pl&lt;/code&gt;, &lt;code&gt;.es&lt;/code&gt;, &lt;code&gt;.it&lt;/code&gt;, &lt;code&gt;.be&lt;/code&gt;, etc. natively. No issue. But be aware that pricing will be in local currencies — EUR for Eurozone, GBP for UK, PLN for Poland. Normalize in your export pipeline or spreadsheet.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Pitfall 2: URLs with too few filters&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;An overly broad URL (e.g., just &lt;code&gt;search_text=vintage&lt;/code&gt; with no price/size filters) can return 10,000+ results per URL. The Actor caps per-URL extraction to prevent runaway costs, but it's better to filter aggressively on Vinted first.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Pitfall 3: Identical URLs&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Deduplication happens at the dataset level, but identical URLs waste compute. Check your URL list before running.&lt;/p&gt;
&lt;/blockquote&gt;
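&lt;p&gt;A pre-flight dedupe is one line and costs nothing (sketch):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;// Drop exact-duplicate URLs (after trimming whitespace) before starting a run.
function dedupeUrls(urls: string[]): string[] {
  return Array.from(new Set(urls.map((u) =&amp;gt; u.trim())));
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;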

&lt;p&gt;&lt;strong&gt;Pitfall 4: Expecting real-time inventory&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Vinted listings appear and disappear fast — especially underpriced items. A scheduled run every 6 hours catches most opportunities without hitting rate limits.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Integrating Batch Results Into Your Workflow
&lt;/h2&gt;

&lt;p&gt;Here's the exact stack I use with a reseller partner:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Step&lt;/th&gt;
&lt;th&gt;Tool&lt;/th&gt;
&lt;th&gt;Purpose&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1. Scrape&lt;/td&gt;
&lt;td&gt;Vinted Turbo Scraper (Apify)&lt;/td&gt;
&lt;td&gt;Pull 15 URLs nightly&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2. Normalize&lt;/td&gt;
&lt;td&gt;Google Sheets (Apify integration)&lt;/td&gt;
&lt;td&gt;Clean + currency unification&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3. Alert&lt;/td&gt;
&lt;td&gt;n8n&lt;/td&gt;
&lt;td&gt;Price-drop notifications via Telegram&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4. Act&lt;/td&gt;
&lt;td&gt;Manual&lt;/td&gt;
&lt;td&gt;Buy the listing, message seller, relist&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The entire pipeline from "run Actor" to "receive Telegram alert for a €45 Jordan 1" is under 5 minutes.&lt;/p&gt;

&lt;h2&gt;
  
  
  FAQ
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Q: What's the maximum number of URLs per run?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Tested comfortably up to 50 URLs per run. Beyond that, splitting into multiple runs is safer for stability. Each URL is processed sequentially, so it's more about total runtime than URL count.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: Does batch mode cost more?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;No. Pricing is $0.0015 per result, regardless of how many URLs produced that result. 1,000 listings from 1 URL = $1.50. 1,000 listings from 10 URLs = $1.50.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: Can I batch across different Vinted country domains?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Yes. Mix &lt;code&gt;.fr&lt;/code&gt;, &lt;code&gt;.de&lt;/code&gt;, &lt;code&gt;.nl&lt;/code&gt;, &lt;code&gt;.pl&lt;/code&gt;, &lt;code&gt;.es&lt;/code&gt;, &lt;code&gt;.it&lt;/code&gt;, &lt;code&gt;.be&lt;/code&gt;, etc. in the same input array. The Actor handles domain detection automatically.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: What if one URL fails?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Failed URLs are logged in the Actor's run log. Successful URLs still produce their dataset. No total-run failure from a single bad URL.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Vinted Turbo Scraper accepts &lt;strong&gt;multiple search URLs in one run&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Results merge into a single export with &lt;code&gt;source_url&lt;/code&gt; attribution&lt;/li&gt;
&lt;li&gt;Schedule daily runs, push to Google Sheets, build alert pipelines&lt;/li&gt;
&lt;li&gt;Pricing is per-result, not per-URL — batching is free in terms of cost structure&lt;/li&gt;
&lt;li&gt;Supports all 26 Vinted country domains in a single batch&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Launch Vinted Turbo Scraper&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Related tools:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://apify.com/kazkn/vinted-smart-scraper" rel="noopener noreferrer"&gt;Vinted Smart Scraper&lt;/a&gt; — Deep seller analytics + cross-country price comparison&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://apify.com/kazkn/vinted-mcp-server" rel="noopener noreferrer"&gt;Vinted MCP Server&lt;/a&gt; — Natural language Vinted queries via AI&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://dev.to/boo_n/how-i-scrape-1000-vinted-listings-in-under-2-minutes-without-writing-a-single-line-of-code-1lol"&gt;How I Scrape 1,000 Vinted Listings in Under 2 Minutes&lt;/a&gt; — Speed-focused guide with video tutorial&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>apify</category>
      <category>vinted</category>
      <category>webscraping</category>
      <category>automation</category>
    </item>
    <item>
      <title>How I Scrape 1,000 Vinted Listings in Under 2 Minutes (Without Writing a Single Line of Code)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Fri, 24 Apr 2026 15:14:21 +0000</pubDate>
      <link>https://dev.to/boo_n/how-i-scrape-1000-vinted-listings-in-under-2-minutes-without-writing-a-single-line-of-code-1lol</link>
      <guid>https://dev.to/boo_n/how-i-scrape-1000-vinted-listings-in-under-2-minutes-without-writing-a-single-line-of-code-1lol</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbqif5dmjabo326j8094v.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbqif5dmjabo326j8094v.jpg" alt="Vinted Turbo Scraper Speed Test" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Two months ago, a friend running a sneaker resale business called me with a very specific problem:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"I need to pull every Jordan 1 listing under €80 in France, Germany, and Spain. I don't know Python. I don't have proxies. And I don't have three days to fight Cloudflare. How do I do this in under five minutes?"&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I pointed him to a tool I'd built exactly for this: the &lt;strong&gt;Vinted Turbo Scraper&lt;/strong&gt; on Apify. He had his first CSV ready before he finished his coffee.&lt;/p&gt;

&lt;h2&gt;
  
  
  📺 Watch the Tutorial in Action
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=rWtZVDMflbo" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=rWtZVDMflbo&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Speed Benchmark Nobody Talks About
&lt;/h2&gt;


&lt;p&gt;Most Vinted scrapers on GitHub advertise "fast" extraction. What that usually means is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;40–120 seconds to authenticate + parse a single page&lt;/li&gt;
&lt;li&gt;Another 30–60 seconds per subsequent page&lt;/li&gt;
&lt;li&gt;A 403 ban after 200–400 requests because you're hammering Vinted's CDN with a single residential IP&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With &lt;strong&gt;Vinted Turbo Scraper&lt;/strong&gt;, I consistently hit &lt;strong&gt;~500 items per minute&lt;/strong&gt; on country-specific searches (confirmed across Vinted.fr, Vinted.de, Vinted.nl, and Vinted.pl). A filtered search URL returning ~1,000 listings completes in &lt;strong&gt;under 2 minutes&lt;/strong&gt; from URL paste to CSV download.&lt;/p&gt;

&lt;p&gt;Here's what a typical run looks like:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Search URL&lt;/th&gt;
&lt;th&gt;Filters Applied&lt;/th&gt;
&lt;th&gt;Listings Scraped&lt;/th&gt;
&lt;th&gt;Total Runtime&lt;/th&gt;
&lt;th&gt;Cost&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;vinted.fr/catalog?search_text=jordan&amp;amp;price_from=50&amp;amp;price_to=80&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Brand: Jordan, Price €50–€80&lt;/td&gt;
&lt;td&gt;1,024&lt;/td&gt;
&lt;td&gt;1 min 58 s&lt;/td&gt;
&lt;td&gt;$1.54&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;vinted.de/catalog?search_text=nike&amp;amp;size_from=43&amp;amp;size_to=44&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Brand: Nike, Size 43–44&lt;/td&gt;
&lt;td&gt;876&lt;/td&gt;
&lt;td&gt;1 min 42 s&lt;/td&gt;
&lt;td&gt;$1.31&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;vinted.nl/catalog?search_text=vintage&amp;amp;status_id=6&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Status: Available, Keyword: vintage&lt;/td&gt;
&lt;td&gt;2,341&lt;/td&gt;
&lt;td&gt;4 min 12 s&lt;/td&gt;
&lt;td&gt;$3.51&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Key insight:&lt;/strong&gt; The architecture skips the heavy DOM-rendering overhead. Instead of crawling each listing page with a full browser, Turbo extracts structured data directly from Vinted's internal API endpoints — the same endpoints the mobile app hits. That's where the speed comes from.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Who This Is For (and Who It's Not For)
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Use Turbo if you:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Need filtered Vinted data &lt;strong&gt;now&lt;/strong&gt;, not next week&lt;/li&gt;
&lt;li&gt;Don't want to manage proxy rotation, sessions, or CAPTCHA solving&lt;/li&gt;
&lt;li&gt;Are a reseller, researcher, or analyst who needs structured exports (JSON / CSV / Excel / Google Sheets)&lt;/li&gt;
&lt;li&gt;Want to &lt;strong&gt;batch multiple search URLs&lt;/strong&gt; in a single run (I'll cover this in detail in my next article)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Use something else if you:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Need deep seller analytics, trending product tracking, or cross-country price comparison → that's &lt;a href="https://apify.com/kazkn/vinted-smart-scraper" rel="noopener noreferrer"&gt;Vinted Smart Scraper&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Want to build and maintain your own scraping infrastructure from scratch (you genuinely enjoy fighting anti-bot teams)&lt;/li&gt;
&lt;li&gt;Need to scrape private messages or PII (we don't do that, and you shouldn't either)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Actual Setup. No Code Required.
&lt;/h2&gt;

&lt;p&gt;This is the exact workflow I showed my friend. It takes under three minutes:&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Build your search URL on Vinted
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Go to &lt;a href="https://www.vinted.com" rel="noopener noreferrer"&gt;vinted.com&lt;/a&gt; (or any country domain: .fr, .de, .nl, .pl, etc.)&lt;/li&gt;
&lt;li&gt;Apply any filters you want: brand, size, price range, item condition, category, color&lt;/li&gt;
&lt;li&gt;Copy the URL from your browser bar&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;That's it.&lt;/strong&gt; Every filter you applied is encoded directly in that URL. No need to rebuild anything in a form.&lt;/p&gt;
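&lt;p&gt;If you want to double-check what a copied URL actually encodes, the standard &lt;code&gt;URL&lt;/code&gt; API will list every filter parameter (a sketch; the parameter names are whatever Vinted put in your address bar):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;// List every filter parameter carried by a copied Vinted search URL.
function listFilters(searchUrl: string): Record&amp;lt;string, string[]&amp;gt; {
  const params = new URL(searchUrl).searchParams;
  const filters: Record&amp;lt;string, string[]&amp;gt; = {};
  for (const key of params.keys()) {
    filters[key] = params.getAll(key);
  }
  return filters;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;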

&lt;h3&gt;
  
  
  Step 2: Paste into the Actor on Apify
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open the &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper Actor&lt;/a&gt; on the Apify Store&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Try for free&lt;/strong&gt; (you get $5 Platform Credits every month)&lt;/li&gt;
&lt;li&gt;Paste your Vinted search URL into the input field&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Start&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fapify.com%2Fkazkn%2Fvinted-turbo-scraper" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fapify.com%2Fkazkn%2Fvinted-turbo-scraper" alt="Input screenshot" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Download your dataset
&lt;/h3&gt;

&lt;p&gt;Once the status shows &lt;strong&gt;Succeeded&lt;/strong&gt;, go to the &lt;strong&gt;Output&lt;/strong&gt; tab:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click &lt;strong&gt;Export&lt;/strong&gt; → &lt;strong&gt;CSV&lt;/strong&gt; or &lt;strong&gt;JSON&lt;/strong&gt; for direct download&lt;/li&gt;
&lt;li&gt;Or click &lt;strong&gt;Google Sheets&lt;/strong&gt; to push to a live spreadsheet&lt;/li&gt;
&lt;li&gt;Or grab the &lt;strong&gt;API URL&lt;/strong&gt; to consume results programmatically in Python, Node.js, or any automation tool&lt;/li&gt;
&lt;/ul&gt;
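&lt;p&gt;Pulling the dataset from code is a single GET. A sketch (the URL shape follows Apify's public dataset-items endpoint, but copy the exact API URL from the run's Output tab rather than trusting my parameters):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;// Build the dataset-items URL (Apify's /v2/datasets/{id}/items route).
function datasetItemsUrl(datasetId: string, format: 'json' | 'csv' = 'json'): string {
  return `https://api.apify.com/v2/datasets/${datasetId}/items?format=${format}`;
}

// Fetch all listings from a finished run's dataset.
async function fetchListings(datasetId: string): Promise&amp;lt;unknown[]&amp;gt; {
  const res = await fetch(datasetItemsUrl(datasetId));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return (await res.json()) as unknown[];
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;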

&lt;p&gt;The output structure is flat and predictable — no nested JSON hell:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/items/123456789-jordan-1-mid-chicago"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Jordan 1 Mid 'Chicago'"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;75.00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"currency"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"EUR"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"brand"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Jordan"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"44"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"condition"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Very good"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Barely worn. No creases."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"seller_username"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sneakerhead_paris"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"seller_url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/member/987654321-sneakerhead_paris"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Paris, France"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"thumbnail"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://images1.vinted.net/t/01_01234abc.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"images"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"https://images1.vinted.net/..."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://images2.vinted.net/..."&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"scraped_at"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-04-24T10:15:30.000Z"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Performance Under the Hood
&lt;/h2&gt;

&lt;p&gt;Here's why this outperforms hand-rolled Python scripts:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Factor&lt;/th&gt;
&lt;th&gt;Typical Python Scraper&lt;/th&gt;
&lt;th&gt;Vinted Turbo Scraper&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Authentication&lt;/td&gt;
&lt;td&gt;Manual cookie/session mgmt&lt;/td&gt;
&lt;td&gt;Auto bootstrap via Playwright (one-time)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Proxy rotation&lt;/td&gt;
&lt;td&gt;Manual 3rd-party proxy pools&lt;/td&gt;
&lt;td&gt;Built-in Apify residential proxies&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rate-limit protection&lt;/td&gt;
&lt;td&gt;None / naive sleep()&lt;/td&gt;
&lt;td&gt;Adaptive backoff + request shaping&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Data extraction&lt;/td&gt;
&lt;td&gt;Regex / BeautifulSoup on HTML&lt;/td&gt;
&lt;td&gt;Direct API endpoint parsing (JSON native)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Export&lt;/td&gt;
&lt;td&gt;Custom script required&lt;/td&gt;
&lt;td&gt;CSV, JSON, Excel, Sheets, or API out of the box&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Maintenance&lt;/td&gt;
&lt;td&gt;You own every breakage&lt;/td&gt;
&lt;td&gt;Managed by the Actor, updated when Vinted changes&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;
  
  
  The Economics
&lt;/h2&gt;

&lt;p&gt;Pricing is straightforward: &lt;strong&gt;$0.0015 per result&lt;/strong&gt; (so $1.50 per 1,000 listings).&lt;/p&gt;

&lt;p&gt;With the free $5 monthly Apify credits, you can scrape roughly &lt;strong&gt;3,300 listings per month at zero cost&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;For a small reseller running 20 searches per week at ~500 results each, that's $15/week = &lt;strong&gt;$60/month&lt;/strong&gt; for data that would take 10+ hours to collect manually.&lt;/p&gt;

&lt;h2&gt;
  
  
  FAQ
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Q: Is this legal? Are you scraping private data?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;No. We only extract publicly visible listing data — the same information any visitor sees on a Vinted search page. No private messages, no login-required data, no PII.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: Will I get IP-banned?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The Actor runs on Apify's infrastructure with residential proxy rotation. In 30 days of active usage, we maintained a &lt;strong&gt;90.8% success rate&lt;/strong&gt; across 121 runs. Individual bans are handled transparently — failed runs are retried automatically.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: Can I schedule this to run daily?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Yes. Apify's scheduler lets you set up recurring runs. I use it to monitor specific keywords ("vintage Patagonia", "Nike Dunk") and get fresh data pushed to a Google Sheet every morning.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: What countries are supported?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;All 26 Vinted country domains, including France, Germany, Netherlands, Poland, Spain, Italy, the UK, Belgium, Czech Republic, Austria, Portugal, Lithuania, Luxembourg, Slovakia, Hungary, Romania, Bulgaria, Greece, and Croatia.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Try It
&lt;/h2&gt;

&lt;p&gt;If you have a filtered Vinted search URL open right now, copy it and test the Actor:&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Launch Vinted Turbo Scraper&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For technical documentation, API examples in Python/JS, and integration guides, check the &lt;a href="https://github.com/Boo-n/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Actor's full README&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Related Tools
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-smart-scraper" rel="noopener noreferrer"&gt;Vinted Smart Scraper&lt;/a&gt;&lt;/strong&gt; — Cross-country price comparison, seller analysis, trending products&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-mcp-server" rel="noopener noreferrer"&gt;Vinted MCP Server&lt;/a&gt;&lt;/strong&gt; — Query Vinted data with natural language via Claude/Cursor&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;Built by &lt;a href="https://apify.com/kazkn" rel="noopener noreferrer"&gt;KazKN&lt;/a&gt;. Questions? Open an &lt;a href="https://github.com/Boo-n/vinted-turbo-scraper/issues" rel="noopener noreferrer"&gt;issue&lt;/a&gt; or DM on &lt;a href="https://twitter.com/DataKazKN" rel="noopener noreferrer"&gt;X&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>apify</category>
      <category>vinted</category>
      <category>webscraping</category>
      <category>automation</category>
    </item>
    <item>
      <title>The Hybrid Vinted Scraping Architecture That Outperforms Pure Browser Crawls</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Thu, 23 Apr 2026 11:30:16 +0000</pubDate>
      <link>https://dev.to/boo_n/the-hybrid-vinted-scraping-architecture-that-outperforms-pure-browser-crawls-593b</link>
      <guid>https://dev.to/boo_n/the-hybrid-vinted-scraping-architecture-that-outperforms-pure-browser-crawls-593b</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdq5vkqt7zlqp0ga64ctt.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdq5vkqt7zlqp0ga64ctt.jpg" alt="Cover" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  The Hybrid Vinted Scraping Architecture That Outperforms Pure Browser Crawls
&lt;/h1&gt;

&lt;p&gt;When you scrape Vinted at scale, you quickly hit a wall.&lt;/p&gt;

&lt;p&gt;Not a firewall metaphor. A literal one. DataDome. Cloudflare. Aggressive rate limits. Token rotation that invalidates your session mid-crawl. And if you are still running headless Chromium for every single request, you are burning proxy credits and clock cycles for no reason.&lt;/p&gt;

&lt;p&gt;After months of iteration — and enough failed runs to fill a datacenter — the architecture that actually works is &lt;strong&gt;hybrid&lt;/strong&gt;: use a real browser only where Vinted forces you to, then switch to lightweight HTTP for the actual data extraction.&lt;/p&gt;

&lt;p&gt;This is how Vinted Turbo Scraper implements that hybrid model, what makes it faster than pure-browser approaches, and why the architecture is the real product.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Pure Browser Crawling Is a Trap
&lt;/h2&gt;

&lt;p&gt;Most tutorials tell you to fire up Playwright or Puppeteer, navigate to a Vinted search page, scroll endlessly, and extract DOM nodes. This works for five items. It collapses at scale.&lt;/p&gt;

&lt;p&gt;Here is why:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Problem&lt;/th&gt;
&lt;th&gt;Browser-Only Impact&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Proxy cost&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Every image, font, and JS asset loads through your proxy. Bandwidth is not free.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Memory bloat&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Chromium instances chew 200-500MB each. At concurrency 5, you are eating gigabytes.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Fingerprint fatigue&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;DataDome profiles browser behavior. Repeating the same navigation pattern = flag.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Session decay&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Cookies and tokens expire. A pure browser crawl does not gracefully re-authenticate.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Speed ceiling&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Rendering a full React-powered catalog page takes 2-5 seconds. Per page.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;A pure browser crawl is not "robust." It is expensive, slow, and detectable.&lt;/p&gt;

&lt;p&gt;The insight is simple: &lt;strong&gt;Vinted serves catalog data via an internal JSON API.&lt;/strong&gt; Once you have a valid session cookie, you can query that API directly with HTTP requests. No rendering. No DOM traversal. No asset loading.&lt;/p&gt;

&lt;p&gt;The challenge is getting that cookie in the first place.&lt;/p&gt;
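&lt;p&gt;In practice, the extraction phase boils down to replaying the captured session headers on plain HTTP requests. A minimal sketch (the Actor uses &lt;code&gt;got-scraping&lt;/code&gt; for this; built-in &lt;code&gt;fetch&lt;/code&gt; is used here to keep the example dependency-free, and the helper names are mine):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;// Serialize cookies captured during the browser phase into a Cookie header.
function cookieHeader(cookies: { name: string; value: string }[]): string {
  return cookies.map((c) =&amp;gt; `${c.name}=${c.value}`).join('; ');
}

// Query the internal catalog API with the validated session (illustrative).
async function fetchCatalogPage(
  apiUrl: string,
  cookies: { name: string; value: string }[],
  userAgent: string,
): Promise&amp;lt;unknown&amp;gt; {
  const res = await fetch(apiUrl, {
    headers: { cookie: cookieHeader(cookies), 'user-agent': userAgent },
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;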




&lt;h2&gt;
  
  
  The Hybrid Model: Browser for Session, HTTP for Extraction
&lt;/h2&gt;

&lt;p&gt;Vinted Turbo Scraper uses a two-phase approach:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Phase One: Session initialization via Playwright&lt;/strong&gt; — Navigate to the target catalog page once, let DataDome validate the browser fingerprint, capture cookies, and grab the user-agent string.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Phase Two: HTTP API extraction via got-scraping&lt;/strong&gt; — Use the captured session to fire lightweight JSON API requests, paginating through results at ~200 items per minute.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This is not theoretical. Here is how the crawler initialization blocks media assets to keep proxy usage minimal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="nx"&gt;preNavigationHooks&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;route&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;**/*&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;route&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="kd"&gt;type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;route&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;request&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;resourceType&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
            &lt;span class="c1"&gt;// Block images, media, fonts to save proxy bandwidth&lt;/span&gt;
            &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;image&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;media&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;font&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;includes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;type&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nx"&gt;route&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;abort&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{});&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nx"&gt;route&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;continue&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{});&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;By aborting image, media, and font requests before they reach the proxy, we cut bandwidth consumption by roughly 70%. On metered residential proxies, that translates directly into cost savings.&lt;/p&gt;
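&lt;p&gt;A quick back-of-the-envelope check of what that means at batch scale. The page weights and per-GB rate below are illustrative assumptions, not measured values; only the ~70% reduction comes from the run data:&lt;/p&gt;

```typescript
// Illustrative estimate of proxy spend saved by blocking heavy resources.
// All numbers here are assumptions for the sketch, not measurements.
function proxyCost(pages: number, mbPerPage: number, usdPerGb: number): number {
    return (pages * mbPerPage / 1024) * usdPerGb;
}

const PAGES = 1_000;
const FULL_PAGE_MB = 3.0;         // assumed average weight with images/fonts loaded
const BLOCKED_PAGE_MB = 0.9;      // assumed weight with image/media/font aborted (~70% less)
const RESIDENTIAL_USD_PER_GB = 8; // assumed metered residential proxy rate

const before = proxyCost(PAGES, FULL_PAGE_MB, RESIDENTIAL_USD_PER_GB);
const after = proxyCost(PAGES, BLOCKED_PAGE_MB, RESIDENTIAL_USD_PER_GB);
console.log(`~$${(before - after).toFixed(2)} saved per 1,000 pages`);
```

&lt;p&gt;Swap in your own proxy rate and measured page weights; the shape of the math is the point, not the exact dollar figure.&lt;/p&gt;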




&lt;h2&gt;
  
  
  Translating Vinted Search URLs into API Calls
&lt;/h2&gt;

&lt;p&gt;Vinted search URLs encode filter parameters in query strings: &lt;code&gt;catalog[]&lt;/code&gt;, &lt;code&gt;brand_id[]&lt;/code&gt;, &lt;code&gt;size_id[]&lt;/code&gt;, &lt;code&gt;color_id[]&lt;/code&gt;, &lt;code&gt;status[]&lt;/code&gt;, and more.&lt;/p&gt;

&lt;p&gt;The internal API expects these same values but with slightly different parameter names and array bracket syntax. The Turbo Scraper extracts and rewrites these parameters automatically:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;translateToApiUrl&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;urlStr&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;domain&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;u&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;URL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;urlStr&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;params&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;URLSearchParams&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;u&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;searchParams&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;catalog[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;catalog_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;color_id[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;color_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;size_id[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;size_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;status[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;status_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;brand_id[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;brand_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;STRIP&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Set&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;search_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;time&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;search_by_image_uuid&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;search_by_image_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;currency&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;page&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;per_page&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
    &lt;span class="p"&gt;]);&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;apiParams&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;URLSearchParams&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{};&lt;/span&gt;

    &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;params&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;STRIP&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;has&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;continue&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]])&lt;/span&gt; &lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[];&lt;/span&gt;
            &lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]].&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="nx"&gt;apiParams&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="c1"&gt;// Critical fix: append brackets for multi-value arrays&lt;/span&gt;
    &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;vals&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nb"&gt;Object&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;vals&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;apiParams&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;[]`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s2"&gt;`https://www.&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;domain&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/api/v2/catalog/items?&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;apiParams&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;toString&lt;/span&gt;&lt;span class="p"&gt;()}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This translator is the bridge between the URL a user copies from their browser and the internal API endpoint that returns raw JSON. Without it, users would have to map catalog IDs by hand, which defeats the purpose of a "zero-config" scraper.&lt;/p&gt;
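&lt;p&gt;To see the rewrite in action, here is a condensed, standalone version of the translator with a worked example (the logic is simplified from the full version above; the catalog and brand IDs are just sample values):&lt;/p&gt;

```typescript
// Condensed sketch of the search-URL → API-URL translation.
function translateToApiUrl(urlStr: string, domain: string): string {
    const arrayMaps: Record<string, string> = {
        'catalog[]': 'catalog_ids',
        'brand_id[]': 'brand_ids',
        'size_id[]': 'size_ids',
        'status[]': 'status_ids',
        'color_id[]': 'color_ids',
    };
    const strip = new Set(['search_id', 'time', 'currency', 'page', 'per_page']);
    const out = new URLSearchParams();
    for (const [k, v] of new URL(urlStr).searchParams) {
        if (strip.has(k)) continue;
        // Array params get renamed and keep their brackets; scalars pass through.
        out.append(arrayMaps[k] ? `${arrayMaps[k]}[]` : k, v);
    }
    return `https://www.${domain}/api/v2/catalog/items?${out.toString()}`;
}

const api = translateToApiUrl(
    'https://www.vinted.fr/catalog?catalog[]=1844&brand_id[]=53&price_to=50',
    'vinted.fr',
);
console.log(api);
// catalog[] → catalog_ids[], brand_id[] → brand_ids[], price_to untouched
```

&lt;p&gt;Note that &lt;code&gt;URLSearchParams&lt;/code&gt; percent-encodes the brackets (&lt;code&gt;%5B%5D&lt;/code&gt;) in the output string; Vinted's API accepts both forms.&lt;/p&gt;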




&lt;h2&gt;
  
  
  The Human-Friendly Mapping Layer
&lt;/h2&gt;

&lt;p&gt;Vinted uses numeric IDs for filters. Users do not know that "Nike" maps to brand ID 53 or that "new with tags" maps to status ID 6.&lt;/p&gt;

&lt;p&gt;The actor maintains internal dictionaries that resolve plain text to these IDs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;BRAND_MAP&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;number&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;nike&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;53&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;zara&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;h&amp;amp;m&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;adidas&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;14&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;levis&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ralph lauren&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;88&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;calvin klein&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;33&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;guess&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;35&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;puma&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;vans&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;converse&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;17&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;tommy hilfiger&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;94&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;lacoste&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;93&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;the north face&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;114&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;asics&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;631&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;new balance&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;267&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;carhartt&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;362&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;dickies&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1007&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;CONDITION_MAP&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;number&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;neuf avec étiquette&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;new&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;new_with_tags&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;neuf sans étiquette&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;new_without_tags&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;très bon état&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;very_good&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;bon état&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;good&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;satisfaisant&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;satisfactory&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;SIZE_MAP&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;number&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;35&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;54&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;36&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;55&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;37&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;56&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;38&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;57&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;39&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;58&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;40&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;59&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;41&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;42&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;61&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;43&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;62&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;44&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;63&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;45&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;64&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;46&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;65&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;47&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;66&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;xxs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;205&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;xs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;206&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;s&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;207&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;m&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;208&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;l&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;209&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;xl&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;210&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;xxl&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;211&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This lets users pass intuitive inputs like &lt;code&gt;["Nike", "Adidas"]&lt;/code&gt; or &lt;code&gt;["new", "very_good"]&lt;/code&gt; instead of reverse-engineering Vinted's internal taxonomy. The actor falls back to raw numeric IDs for anything not in the map, so power users are not constrained either.&lt;/p&gt;
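&lt;p&gt;The resolution logic is a plain lookup with a numeric escape hatch. A minimal sketch of that fallback behavior (&lt;code&gt;resolveFilter&lt;/code&gt; is an illustrative name, not the actor's actual export, and the map is truncated):&lt;/p&gt;

```typescript
// Sketch of the name-to-ID lookup with raw-numeric fallback described above.
const BRAND_MAP: Record<string, number> = { 'nike': 53, 'adidas': 14, 'zara': 12 };

function resolveFilter(input: string, map: Record<string, number>): number | null {
    const key = input.trim().toLowerCase();
    if (key in map) return map[key];                 // human-friendly name
    const n = Number(key);
    return Number.isInteger(n) && n > 0 ? n : null;  // power-user raw numeric ID
}

console.log(resolveFilter('Nike', BRAND_MAP)); // → 53
console.log(resolveFilter('362', BRAND_MAP));  // → 362 (unknown name, treated as raw ID)
console.log(resolveFilter('??', BRAND_MAP));   // → null (unresolvable, dropped from the query)
```

&lt;p&gt;Lowercasing before the lookup is what makes &lt;code&gt;"Nike"&lt;/code&gt; and &lt;code&gt;"nike"&lt;/code&gt; equivalent inputs.&lt;/p&gt;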




&lt;h2&gt;
  
  
  HTTP Extraction Loop: Where the Speed Lives
&lt;/h2&gt;

&lt;p&gt;Once the session cookie is captured, the actor switches to &lt;code&gt;got-scraping&lt;/code&gt; for the heavy lifting:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;gotScraping&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;apiReqUrl&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;responseType&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;json&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;proxyUrl&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;User-Agent&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;userAgent&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Accept&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;application/json, text/plain, */*&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Accept-Language&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;fr-FR,fr;q=0.9,en-US;q=0.8,en;q=0.7&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Cookie&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;cookieStr&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Referer&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`https://www.&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;domain&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;X-Money-Object-Enabled&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;true&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sec-Fetch-Dest&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;empty&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sec-Fetch-Mode&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;cors&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sec-Fetch-Site&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;same-origin&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;request&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;15000&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;Sec-Fetch-*&lt;/code&gt; headers are not decoration. They signal to Vinted's edge that this is a same-origin AJAX request, not an external scraper. Combined with a matching &lt;code&gt;Referer&lt;/code&gt; and the validated &lt;code&gt;Cookie&lt;/code&gt; string, the request sails through.&lt;/p&gt;

&lt;p&gt;Each page returns 96 items. The loop paginates until &lt;code&gt;data.pagination.current_page &amp;gt;= data.pagination.total_pages&lt;/code&gt; or the &lt;code&gt;maxItems&lt;/code&gt; limit is hit.&lt;/p&gt;

&lt;p&gt;Result: &lt;strong&gt;~200 items per minute&lt;/strong&gt; sustained, with a memory footprint under 512MB per worker.&lt;/p&gt;
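&lt;p&gt;The loop structure itself is simple. Here is a self-contained sketch of the pagination and stopping conditions, with &lt;code&gt;fetchPage&lt;/code&gt; standing in for the &lt;code&gt;gotScraping&lt;/code&gt; call above (the 96-item page size matches the real API; the total-item count is a test fixture):&lt;/p&gt;

```typescript
// Sketch of the pagination loop: stop at maxItems or the last page.
interface Page { items: number[]; pagination: { current_page: number; total_pages: number } }

const PER_PAGE = 96;
const TOTAL_ITEMS = 250; // fixture standing in for the real catalog size

async function fetchPage(page: number): Promise<Page> {
    const total_pages = Math.ceil(TOTAL_ITEMS / PER_PAGE);
    const start = (page - 1) * PER_PAGE;
    const count = Math.min(PER_PAGE, TOTAL_ITEMS - start);
    return { items: Array.from({ length: count }, (_, i) => start + i),
             pagination: { current_page: page, total_pages } };
}

async function scrapeAll(maxItems: number): Promise<number[]> {
    const results: number[] = [];
    for (let page = 1; ; page++) {
        const data = await fetchPage(page);
        results.push(...data.items.slice(0, maxItems - results.length));
        if (results.length >= maxItems) break;                                   // cost cap hit
        if (data.pagination.current_page >= data.pagination.total_pages) break;  // last page
    }
    return results;
}

scrapeAll(150).then((r) => console.log(r.length)); // → 150
```

&lt;p&gt;Checking the cap &lt;em&gt;before&lt;/em&gt; the pagination condition means a tight &lt;code&gt;maxItems&lt;/code&gt; never fires a wasted extra request.&lt;/p&gt;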




&lt;h2&gt;
  
  
  Input Schema Deep Dive
&lt;/h2&gt;

&lt;p&gt;The actor accepts minimal but precise JSON input. Here is the exact schema:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"maxItems"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"proxyConfiguration"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"useApifyProxy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"apifyProxyGroups"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"RESIDENTIAL"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"startUrls"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.co.uk/catalog?catalog[]=1844&amp;amp;brand_ids[]=53&amp;amp;size_ids[]=207&amp;amp;status_ids[]=6&amp;amp;price_from=20&amp;amp;price_to=50&amp;amp;currency=GBP&amp;amp;order=price_low_to_high"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Field&lt;/th&gt;
&lt;th&gt;Type&lt;/th&gt;
&lt;th&gt;Required&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;startUrls&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;string or array&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Yes&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;One or more Vinted search URLs. Supports batch processing.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;maxItems&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;number&lt;/td&gt;
&lt;td&gt;No (default: 100)&lt;/td&gt;
&lt;td&gt;Cap on results per run. Use for cost control.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;proxyConfiguration&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;object&lt;/td&gt;
&lt;td&gt;No (recommended)&lt;/td&gt;
&lt;td&gt;Defaults to Apify residential proxies. Essential for Datadome evasion.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;You can pass multiple URLs as a comma-separated string or an array of objects with &lt;code&gt;url&lt;/code&gt; keys. The actor processes them sequentially in a single run, combining outputs into one unified dataset.&lt;/p&gt;
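&lt;p&gt;For batch runs, the array form looks like this (the URLs are shortened, illustrative variants of the example above):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;{
  "maxItems": 200,
  "startUrls": [
    { "url": "https://www.vinted.co.uk/catalog?brand_ids[]=53&amp;amp;order=price_low_to_high" },
    { "url": "https://www.vinted.fr/catalog?brand_ids[]=53&amp;amp;order=price_low_to_high" }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;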




&lt;h2&gt;
  
  
  Integration Patterns: From Scraper to Pipeline
&lt;/h2&gt;

&lt;p&gt;Raw data is worthless without a destination. The actor integrates with Apify's ecosystem for downstream automation:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Destination&lt;/th&gt;
&lt;th&gt;Trigger&lt;/th&gt;
&lt;th&gt;Use Case&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Google Sheets&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Apify integration&lt;/td&gt;
&lt;td&gt;Live inventory tracking&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Slack&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Webhook&lt;/td&gt;
&lt;td&gt;Alert team on new listings&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Airtable&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Zapier/Make bridge&lt;/td&gt;
&lt;td&gt;Visual database for resellers&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Custom API&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Dataset webhook&lt;/td&gt;
&lt;td&gt;Push to your own backend&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;CSV/Excel&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Manual download&lt;/td&gt;
&lt;td&gt;One-off market analysis&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;For recurring monitoring, pair the actor with Apify Scheduler. Set it to run every 15 minutes against a filtered search URL and pipe results to a Slack channel or Google Sheet. You catch new listings before anyone refreshing the page by hand ever sees them.&lt;/p&gt;
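&lt;p&gt;As a concrete sketch, an Apify webhook that fires after each successful scheduled run could be configured like this. &lt;code&gt;eventTypes&lt;/code&gt;, &lt;code&gt;requestUrl&lt;/code&gt; and &lt;code&gt;payloadTemplate&lt;/code&gt; are standard Apify webhook fields; the Slack URL is a placeholder:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;{
  "eventTypes": ["ACTOR.RUN.SUCCEEDED"],
  "requestUrl": "https://hooks.slack.com/services/XXX/YYY/ZZZ",
  "payloadTemplate": "{\"text\": \"Vinted run {{resource.id}} finished. Check the dataset for new listings.\"}"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;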




&lt;h2&gt;
  
  
  Real-World Performance Benchmarks
&lt;/h2&gt;

&lt;p&gt;Here are observed numbers from production runs across different proxy tiers:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Proxy Type&lt;/th&gt;
&lt;th&gt;Speed&lt;/th&gt;
&lt;th&gt;Reliability&lt;/th&gt;
&lt;th&gt;Cost per 1k Items&lt;/th&gt;
&lt;th&gt;Best For&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Apify Proxy (Datacenter)&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~300 items/min&lt;/td&gt;
&lt;td&gt;Low (blocks after ~500)&lt;/td&gt;
&lt;td&gt;~$0.30&lt;/td&gt;
&lt;td&gt;Quick tests&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Apify Proxy (Residential)&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~200 items/min&lt;/td&gt;
&lt;td&gt;High (rarely blocked)&lt;/td&gt;
&lt;td&gt;~$1.50&lt;/td&gt;
&lt;td&gt;Production runs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Custom Proxy&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Variable&lt;/td&gt;
&lt;td&gt;Depends on quality&lt;/td&gt;
&lt;td&gt;Variable&lt;/td&gt;
&lt;td&gt;Power users&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The residential proxy is the sweet spot: fast enough for real-time workflows, reliable enough for continuous monitoring, and priced predictably per result.&lt;/p&gt;




&lt;h2&gt;
  
  
  Architecture Comparison: Browser vs Hybrid vs Pure HTTP
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Approach&lt;/th&gt;
&lt;th&gt;Speed&lt;/th&gt;
&lt;th&gt;Cost&lt;/th&gt;
&lt;th&gt;Reliability&lt;/th&gt;
&lt;th&gt;Complexity&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Pure Browser&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~20-40 items/min&lt;/td&gt;
&lt;td&gt;High (full asset load)&lt;/td&gt;
&lt;td&gt;Medium (detectable patterns)&lt;/td&gt;
&lt;td&gt;Low&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Pure HTTP&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~300+ items/min&lt;/td&gt;
&lt;td&gt;Minimal&lt;/td&gt;
&lt;td&gt;Low (session requires bootstrapping)&lt;/td&gt;
&lt;td&gt;High&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Hybrid (Turbo)&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~200 items/min&lt;/td&gt;
&lt;td&gt;Low (blocked assets)&lt;/td&gt;
&lt;td&gt;High (session + retry logic)&lt;/td&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Pure HTTP is fastest on paper, but without a valid session cookie, every request returns a 403. The hybrid approach trades absolute speed for &lt;strong&gt;operational reliability&lt;/strong&gt; — the metric that actually matters when you are running automated workflows.&lt;/p&gt;




&lt;h2&gt;
  
  
  When to Use Turbo vs Smart Scraper
&lt;/h2&gt;

&lt;p&gt;Vinted Turbo Scraper is part of a two-tool ecosystem. Choose based on your use case:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;Turbo Scraper&lt;/th&gt;
&lt;th&gt;Smart Scraper&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;URL-based input&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No (form-based)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Batch URL processing&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Cross-country comparison&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Seller analysis&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Sold items tracking&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Trending discovery&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Price monitoring&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes (cross-border)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Speed&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Faster&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Slower (richer data)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Cost&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Lower&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Higher&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Use Turbo&lt;/strong&gt; when you have a Vinted search URL ready and need structured data fast. &lt;strong&gt;Use Smart&lt;/strong&gt; when you are doing deep market intelligence, seller profiling, or cross-country arbitrage.&lt;/p&gt;




&lt;h2&gt;
  
  
  Anti-Ban Mechanisms Beyond Proxies
&lt;/h2&gt;

&lt;p&gt;Proxy rotation is table stakes. The actor adds three additional layers:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Request fingerprint rotation via Crawlee&lt;/strong&gt; — Crawlee generates realistic, browser-like request fingerprints, and the built-in proxy configuration rotates IPs per session.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Aggressive retry with exponential backoff&lt;/strong&gt; — &lt;code&gt;maxRequestRetries: 5&lt;/code&gt; with a 30-second handler timeout.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Graceful session recycling&lt;/strong&gt; — If an HTTP request fails with a 403, the Playwright session is refreshed before retry.&lt;/li&gt;
&lt;/ol&gt;
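&lt;p&gt;The third layer can be sketched as a retry wrapper. This is a simplified illustration, not the actor's source: &lt;code&gt;refreshSession&lt;/code&gt; stands in for the Playwright session bootstrap, and &lt;code&gt;doFetch&lt;/code&gt; for the lightweight HTTP call:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;// Retry an HTTP call, recycling the browser-bootstrapped session
// whenever the response is a 403 instead of data.
async function fetchWithSessionRecycle(doFetch, refreshSession, maxRetries = 5) {
  let session = null;
  for (let attempt = 0; attempt &amp;lt;= maxRetries; attempt++) {
    if (!session) session = await refreshSession(); // bootstrap or recycle
    const res = await doFetch(session);
    if (res.status === 403) { session = null; continue; } // cookies burned
    return res;
  }
  throw new Error(`Still blocked after ${maxRetries} retries`);
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;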

&lt;p&gt;The output is a clean JSON schema with optional lightweight mode:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;8464268321&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Levi black skinny jeans 33&lt;/span&gt;&lt;span class="se"&gt;\"&lt;/span&gt;&lt;span class="s2"&gt; waist"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.co.uk/items/8464268321-levi-black-skinny-jeans-33-waist"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"currency"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"GBP"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"brand"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Levi's"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"M / UK 12-14"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"condition"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"New with tags"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"photos"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"..."&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"favouriteCount"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"seller"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;73959532&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"username"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"maxi83199"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"profileUrl"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.co.uk/member/73959532-maxi83199"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"scrapedAt"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-03-24T10:25:41.604Z"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Structured. Timestamped. Ready for pipelines.&lt;/p&gt;




&lt;h2&gt;
  
  
  FAQ: Technical Details
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Q: Does this use headless browsers for every request?&lt;/strong&gt;&lt;br&gt;
A: No. Only for initial session bootstrap. Data extraction uses lightweight HTTP requests via &lt;code&gt;got-scraping&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: How many items can I extract per run?&lt;/strong&gt;&lt;br&gt;
A: The &lt;code&gt;maxItems&lt;/code&gt; parameter lets you cap runs. We have tested up to 10,000 items in a single run without memory issues.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: Is there a Vinted API this connects to?&lt;/strong&gt;&lt;br&gt;
A: Vinted does not offer a public API for catalog data. This actor acts as a practical alternative by reverse-engineering the internal endpoints.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: Will my IP get banned?&lt;/strong&gt;&lt;br&gt;
A: With residential proxies and the hybrid architecture, blocks are rare. The actor implements retry logic and session refresh for edge cases.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: Can I run this on a schedule?&lt;/strong&gt;&lt;br&gt;
A: Yes, via Apify Scheduler or cron triggers. Ideal for monitoring new listings.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: What output formats are available?&lt;/strong&gt;&lt;br&gt;
A: JSON (structured), CSV, Excel, or direct API export to integrations.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Honest Bottom Line
&lt;/h2&gt;

&lt;p&gt;No scraper is "unbannable." Platforms evolve. What the hybrid architecture buys you is &lt;strong&gt;time&lt;/strong&gt; — time between Vinted deploying a new detection mechanism and you pushing an update.&lt;/p&gt;

&lt;p&gt;Because this is packaged as an Apify Actor, that update propagates to every user instantly. No pip upgrade. No breaking dependency chains. No "works on my machine."&lt;/p&gt;

&lt;p&gt;If you are still maintaining a custom Python Selenium script that breaks every two weeks, you are not scraping Vinted. You are debugging Vinted.&lt;/p&gt;

&lt;p&gt;Switch to infrastructure that was built to survive the platform, not chase it.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Ready to extract Vinted data at scale?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Actor: &lt;a href="https://apify.com/actors/IV3WPdQlMFG1cwXuK" rel="noopener noreferrer"&gt;Vinted Turbo Scraper on Apify&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Pricing: $1.50 per 1,000 results. No subscription. Free plan covers thousands of items.&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;Questions about the architecture or want to integrate this into a pipeline? Drop a comment below.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>apify</category>
      <category>vinted</category>
      <category>scraping</category>
      <category>automation</category>
    </item>
    <item>
      <title>Never Miss a Good Deal on Vinted: The Automated System in 2026</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sat, 11 Apr 2026 20:42:15 +0000</pubDate>
      <link>https://dev.to/boo_n/ne-jamais-ratar-une-bonne-affaire-sur-vinted-le-systeme-automatise-en-2026-4bil</link>
      <guid>https://dev.to/boo_n/ne-jamais-ratar-une-bonne-affaire-sur-vinted-le-systeme-automatise-en-2026-4bil</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk8aw6cg793r2wnpmcnad.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk8aw6cg793r2wnpmcnad.jpeg" alt="Cover" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  🚫 Never Miss a Good Deal on Vinted: The Automated System in 2026
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;TL;DR:&lt;/strong&gt; If you spend 30 minutes a day refreshing Vinted hunting for that Carhartt jacket in size M under €30, your time is worth more than that. Here is the no-code system that notifies you in real time, with no ban risk, no infrastructure, and nothing to pay beyond your Apify plan.&lt;/p&gt;




&lt;h3&gt;
  
  
  The problem is simple, the solution is nowhere to be found
&lt;/h3&gt;

&lt;p&gt;Vinted is the flea market of 85 million users across Europe. The problem? Good deals vanish in under 15 minutes. By the time you message the seller to ask about sizing, they have already closed the sale with someone else.&lt;/p&gt;

&lt;p&gt;You have two options:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Spend three hours a day scrolling&lt;/strong&gt;: what 95% of users do. Inefficient, time-consuming, frustrating.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automate&lt;/strong&gt;: but building your own scraper in 2026 means going up against Vinted's Datadome protection, Cloudflare blocks, IP bans, and headers that change every week.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I tried route #2 for three months. My Python script lasted 11 days before the first 403. The patched script lasted 6 days. Every Vinted update sent me back to square one.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The solution: stop building the infrastructure when someone has already built it.&lt;/strong&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  The no-code architecture that works in 2026
&lt;/h3&gt;

&lt;p&gt;The system rests on two building blocks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Apify Vinted Turbo Scraper&lt;/strong&gt; → reliable listing extraction, with the right headers and the Datadome bypass built in&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Telegram Bot API&lt;/strong&gt; → real-time push notifications on your phone&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The flow:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[Marque + Prix Max + Taille] → [Apify Actor] → [JSON propre] → [Telegram Bot] → [Notification 🔔]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;No server. No custom cron. No maintenance.&lt;/p&gt;




&lt;h3&gt;
  
  
  Step 1: Configure the Turbo Scraper on Apify
&lt;/h3&gt;

&lt;p&gt;The Turbo Scraper lets you filter by:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"searchTerms"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Carhartt"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Arc'teryx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Nike ACG"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"priceMin"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"priceMax"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;35&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"sizeIds"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"m"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"38"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"40"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"country"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"fr"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"sortBy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"created_desc"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The actor returns structured JSON:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Veste Carhartt WIP Detroit"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;29&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"M"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/items/12345678"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"seller"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"thriftking_92"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"created"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-04-11T14:32:00Z"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"photo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://images.vinted.com/..."&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You pick this JSON up via the Actor's webhook or its API endpoint.&lt;/p&gt;
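&lt;p&gt;For the API route, the standard Apify dataset endpoint is enough; replace &lt;code&gt;&amp;lt;DATASET_ID&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;TOKEN&amp;gt;&lt;/code&gt; with your own values:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl "https://api.apify.com/v2/datasets/&amp;lt;DATASET_ID&amp;gt;/items?format=json&amp;amp;clean=true" \
  -H "Authorization: Bearer &amp;lt;TOKEN&amp;gt;"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;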




&lt;h3&gt;
  
  
  Step 2: Send the results to Telegram
&lt;/h3&gt;

&lt;p&gt;Two options, depending on how hands-on you want to be:&lt;/p&gt;

&lt;h4&gt;
  
  
  Option A: Zapier / Make (zero code)
&lt;/h4&gt;

&lt;p&gt;Connect the Apify webhook → Zapier → Telegram Bot. Ten minutes flat, and it covers 95% of cases.&lt;/p&gt;

&lt;h4&gt;
  
  
  Option B: a few dozen lines of Node.js (full control)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;TelegramBot&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;node-telegram-bot-api&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;axios&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;bot&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;TelegramBot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;TELEGRAM_TOKEN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;polling&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="nx"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/webhook&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;msg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`
🔔 *Nouvelle trouvaille Vinted !*

*{item.title}*
💰 {item.price}€ — Taille {item.size}
👤 {item.seller}
🔗 {item.url}
    `&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;bot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sendMessage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;CHAT_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;parse_mode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Markdown&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;disable_web_page_preview&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ok&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;items_processed&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Deploy it on Railway, Render, or your own VPS. Cost: at most ~€5/month.&lt;/p&gt;




&lt;h3&gt;
  
  
  Step 3: Automate the scheduling
&lt;/h3&gt;

&lt;p&gt;You do not need your script running 24/7. Set up a scheduled Apify run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Schedule : toutes les 30 minutes
Coût : ~0.02€ par run (actor compute)
Résultat : notifications en temps réel sans server costs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;The real monthly cost:&lt;/strong&gt; €1-3 for 1,440 runs a month. Your coffee costs more.&lt;/p&gt;




&lt;h3&gt;
  
  
  What I learned after six months of use
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;The best deals (streetwear brands, sportswear) posted in the morning are often gone before 10am. &lt;strong&gt;Schedule runs at 6:30, 7:30 and 8:30.&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Filtering by &lt;code&gt;created_desc&lt;/code&gt; alone gives you the listings from the last 30 minutes. Anything broader just means more noise.&lt;/li&gt;
&lt;li&gt;The &lt;code&gt;sizeIds&lt;/code&gt; parameter is key: Vinted does not always filter sizes correctly on the client side, so your actor has to do it in post-processing.&lt;/li&gt;
&lt;/ul&gt;
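&lt;p&gt;That last post-processing step is tiny. A sketch, assuming the &lt;code&gt;sizeIds&lt;/code&gt; input and the &lt;code&gt;size&lt;/code&gt; output field shown above (&lt;code&gt;filterBySize&lt;/code&gt; is a hypothetical helper name):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;// Keep only items whose size matches one of the requested sizeIds,
// since Vinted's own client-side filter is not always reliable.
function filterBySize(items, sizeIds) {
  const wanted = new Set(sizeIds.map(s =&amp;gt; String(s).toLowerCase()));
  return items.filter(item =&amp;gt; wanted.has(String(item.size).toLowerCase()));
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Exact matching is deliberate here; if your feed returns composite sizes like &lt;code&gt;"M / UK 12-14"&lt;/code&gt;, swap the equality for a substring check.&lt;/p&gt;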




&lt;h3&gt;
  
  
  The trap to avoid in 2026
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Do not build your own HTTP parser.&lt;/strong&gt; Over 2025-2026, Vinted deployed a 4th-generation Datadome layer that detects Selenium headers, automated navigation patterns, and data-center IPs in fewer than 3 requests.&lt;/p&gt;

&lt;p&gt;The Vinted Turbo Scraper on Apify uses rotating residential IPs and real browser fingerprints. That is the difference between an hour of dev plus two days of maintenance, and ten minutes of config plus zero maintenance.&lt;/p&gt;




&lt;h3&gt;
  
  
  Want to test it in 2 minutes?
&lt;/h3&gt;

&lt;p&gt;Here is the direct link to the Apify actor:&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://apify.com/boo_n/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper — Apify Store&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;New accounts get their first €3 of compute free. That is enough to test the complete system.&lt;/p&gt;

&lt;p&gt;If you want a turnkey Telegram setup with automatic scheduling, say so in the comments and I will share the GitHub repo with the full stack (Node.js + Railway + Telegram).&lt;/p&gt;

&lt;p&gt;Deals do not wait. Automate, or watch them go.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;[This article is for informational purposes. Check Vinted's Terms of Service and local legislation before automating data collection.]&lt;/em&gt;&lt;/p&gt;

</description>
      <category>vinted</category>
      <category>automation</category>
      <category>apify</category>
      <category>telegram</category>
    </item>
    <item>
      <title>Why building a Vinted scraper from scratch is a trap in 2026</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Wed, 08 Apr 2026 09:23:29 +0000</pubDate>
      <link>https://dev.to/boo_n/why-building-a-vinted-scraper-from-scratch-is-a-trap-in-2026-4om7</link>
      <guid>https://dev.to/boo_n/why-building-a-vinted-scraper-from-scratch-is-a-trap-in-2026-4om7</guid>
      <description>&lt;p&gt;If you're a data extraction developer or just someone trying to build a Vinted new listings alert system, you've probably noticed something over the past few months: Vinted's anti-bot protection has become completely paranoid.&lt;/p&gt;

&lt;p&gt;I tried building my own Vinted scraper in Python last week to monitor some specific vintage deals. Total disaster.&lt;/p&gt;

&lt;p&gt;Here is what happens if you try the DIY route right now:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pure HTTP requests (Requests, HTTPX)&lt;/strong&gt;: Instant 403 Forbidden. Their Cloudflare/Datadome setup immediately flags the TLS fingerprint of standard libraries.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Headless Browsers (Playwright/Puppeteer)&lt;/strong&gt;: It works briefly, but it's incredibly slow and consumes massive amounts of RAM. Plus, Vinted will eventually flag your residential proxy IP if you don't rotate perfectly.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;After burning through two different proxy providers and getting blocked anyway, I gave up on maintaining my own infrastructure for this. &lt;/p&gt;

&lt;p&gt;While looking for an alternative, I stumbled upon an Apify Vinted actor called &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;It essentially acts as a hybrid engine—it bypasses the Datadome checks natively and just returns clean JSON. &lt;/p&gt;

&lt;p&gt;Instead of fighting with headers and proxies, this is literally all the code I run now:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;apify_client&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ApifyClient&lt;/span&gt;

&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ApifyClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_API_TOKEN&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;run&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;IV3WPdQlMFG1cwXuK&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;call&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;searchUrl&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://www.vinted.com/catalog?search_text=carhartt&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;maxItems&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nf"&gt;iterate_items&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;title&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;price&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It’s significantly cheaper than running my own headless cluster and I don't have to deal with WAF bypasses anymore. If you need to scrape Vinted listings efficiently, don't reinvent the wheel.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>How to monitor Vinted automatically for new listings (Without getting IP banned)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:09:34 +0000</pubDate>
      <link>https://dev.to/boo_n/how-to-monitor-vinted-automatically-for-new-listings-without-getting-ip-banned-44j</link>
      <guid>https://dev.to/boo_n/how-to-monitor-vinted-automatically-for-new-listings-without-getting-ip-banned-44j</guid>
      <description>&lt;p&gt;If you're into flipping clothes or just trying to snag the best vintage deals, you already know that speed is everything. A good deal on Vinted is gone in literally seconds.&lt;/p&gt;

&lt;p&gt;A few weeks ago, I decided to build a simple Python script to send a Discord notification whenever a specific brand was uploaded in my size. I thought it would take an hour. I was wrong.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Datadome Nightmare
&lt;/h2&gt;

&lt;p&gt;If you've ever tried to build a &lt;strong&gt;vinted scraper&lt;/strong&gt;, you've probably hit a wall of 403 Forbidden errors. Vinted uses heavy anti-bot protections to block automated traffic.&lt;/p&gt;

&lt;p&gt;I tried everything:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Rotating free proxies (instantly blocked)&lt;/li&gt;
&lt;li&gt;Premium residential proxies (worked for a bit, then got flagged)&lt;/li&gt;
&lt;li&gt;Playwright/Puppeteer (way too slow and resource-heavy to run 24/7 on my small VPS)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To truly &lt;strong&gt;monitor vinted automatically&lt;/strong&gt;, you need something that handles TLS fingerprinting natively. I was about to give up on my &lt;strong&gt;vinted new listings alert&lt;/strong&gt; project when I stumbled across a pre-built solution.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Discovery
&lt;/h2&gt;

&lt;p&gt;Instead of reinventing the wheel and fighting anti-bot systems daily, I found an &lt;strong&gt;apify vinted actor&lt;/strong&gt; that handles the heavy lifting. It's called Vinted Turbo Scraper.&lt;/p&gt;

&lt;p&gt;Here is the tool I use now: &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It uses a hybrid approach: it uses a real browser context to bypass the WAF, grabs the session tokens, and then uses fast HTTP requests to extract the data at scale.&lt;/p&gt;

&lt;h2&gt;
  
  
  How I set up my Discord Alert
&lt;/h2&gt;

&lt;p&gt;Using this actor, my code went from a messy 300-line Puppeteer script to a simple API call.&lt;/p&gt;

&lt;p&gt;I just set up a webhook in Discord and run a simple cron job that hits the Apify API every 5 minutes. The actor returns clean JSON with all the new items matching my search URL.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Simple example of how clean the data is&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;item&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;price&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;price&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;brand&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;brand&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;
&lt;span class="p"&gt;}));&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
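&lt;p&gt;To close the loop, here's a stdlib-only sketch of the webhook ping. The webhook URL is a placeholder and &lt;code&gt;format_alert&lt;/code&gt; is my own helper, not part of the actor; Discord's webhook endpoint accepts a JSON body with a &lt;code&gt;content&lt;/code&gt; field.&lt;/p&gt;

```python
# Sketch: push one mapped item to a Discord webhook (standard library only).
# WEBHOOK_URL is a placeholder; swap in the URL Discord gives you.
import json
import urllib.request

WEBHOOK_URL = "https://discord.com/api/webhooks/XXX/YYY"  # placeholder

def format_alert(item):
    """Build the one-line message sent to Discord."""
    return f"{item['title']} | {item['price']} | {item['url']}"

def send_alert(item, webhook_url=WEBHOOK_URL):
    # Discord's "Execute Webhook" endpoint takes {"content": "..."} as JSON.
    data = json.dumps({"content": format_alert(item)}).encode("utf-8")
    req = urllib.request.Request(
        webhook_url, data=data, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)  # fire-and-forget; add error handling for real use

print(format_alert({"title": "Carhartt jacket", "price": "35.0",
                    "url": "https://vinted.example/items/1"}))
```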



&lt;p&gt;If you're a developer trying to build a &lt;strong&gt;vinted vintage deals automation&lt;/strong&gt; pipeline, save yourself the headache. Stop fighting WAFs and use the right tools for the job.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>python</category>
      <category>scraping</category>
      <category>automation</category>
    </item>
    <item>
      <title>Scraping Vinted in 2026: Why your Python script keeps getting 403 Errors</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Mon, 06 Apr 2026 22:19:08 +0000</pubDate>
      <link>https://dev.to/boo_n/scraping-vinted-in-2026-why-your-python-script-keeps-getting-403-errors-30mi</link>
      <guid>https://dev.to/boo_n/scraping-vinted-in-2026-why-your-python-script-keeps-getting-403-errors-30mi</guid>
      <description>&lt;p&gt;If you've tried to build a &lt;strong&gt;vinted scraper&lt;/strong&gt; recently using &lt;code&gt;requests&lt;/code&gt; or &lt;code&gt;BeautifulSoup&lt;/code&gt; in Python, you probably hit a brick wall. Specifically, a &lt;code&gt;403 Forbidden&lt;/code&gt; wall.&lt;/p&gt;

&lt;p&gt;I spent the weekend trying to &lt;strong&gt;scrape vinted&lt;/strong&gt; to get notifications for some vintage jackets I was hunting. My IP got banned within 10 requests. Vinted uses Datadome and Cloudflare to aggressively block basic scraping attempts.&lt;/p&gt;

&lt;h3&gt;
  
  
  The problem with DIY Vinted Automation
&lt;/h3&gt;

&lt;p&gt;When you try to monitor new listings automatically, Vinted's WAF checks your TLS fingerprint. Standard HTTP libraries (like Python's &lt;code&gt;requests&lt;/code&gt; or Node's &lt;code&gt;axios&lt;/code&gt;) leak signatures that scream "I am a bot".&lt;/p&gt;

&lt;p&gt;You can try rotating proxies or using Playwright/Puppeteer, but Playwright is too heavy to run a fast loop (you want alerts in seconds, not minutes). I was basically out of memory running 5 browser tabs just for one search query.&lt;/p&gt;

&lt;h3&gt;
  
  
  The bypass I found
&lt;/h3&gt;

&lt;p&gt;After getting tired of dealing with TLS fingerprinting and headless browser crashes, I looked for managed solutions. I stumbled upon this &lt;strong&gt;vinted turbo scraper&lt;/strong&gt; actor on Apify:&lt;br&gt;
&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper on Apify&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It handles the Datadome bypass natively. It uses a hybrid approach:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It uses a real browser to fetch the initial tokens and solve challenges.&lt;/li&gt;
&lt;li&gt;It switches to lightweight HTTP requests for the actual data extraction.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This means you get the bypass rate of a real browser but the speed and low cost of an API. It literally returns JSON data from any Vinted search URL instantly.&lt;/p&gt;

&lt;p&gt;If you are a &lt;strong&gt;vinted data extraction developer&lt;/strong&gt; or just trying to set up a &lt;strong&gt;vinted new listings alert&lt;/strong&gt;, stop wasting your time fighting Datadome. Just use the Apify actor and plug the JSON into your Discord webhook or database.&lt;/p&gt;

&lt;p&gt;It completely saved my weekend project. If you've found other ways to bypass the 403s without spending hundreds on residential proxies, let me know below!&lt;/p&gt;

</description>
      <category>python</category>
      <category>webscraping</category>
      <category>automation</category>
      <category>api</category>
    </item>
    <item>
      <title>How to Track Vinted Price Drops for Reselling (Automation)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sun, 05 Apr 2026 22:26:45 +0000</pubDate>
      <link>https://dev.to/boo_n/how-to-track-vinted-price-drops-for-reselling-automation-1clc</link>
      <guid>https://dev.to/boo_n/how-to-track-vinted-price-drops-for-reselling-automation-1clc</guid>
      <description>&lt;p&gt;When reselling on Vinted, finding underpriced items is only half the battle. The other half is tracking when a seller drops their price so you can instantly send an offer.&lt;/p&gt;

&lt;p&gt;If you try to monitor multiple closets or search queries manually, you'll lose out to faster buyers. &lt;/p&gt;

&lt;p&gt;I used to run a custom Python script to track my favorite items, but Vinted's Cloudflare protection (and Datadome) made it a nightmare to maintain. You get a &lt;code&gt;403 Forbidden&lt;/code&gt; error unless your proxy and TLS fingerprint are perfect.&lt;/p&gt;

&lt;p&gt;Instead of maintaining my own scraper and rotating proxies, I found a tool that handles the bypassing for me: the &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt; on Apify.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why it works
&lt;/h3&gt;

&lt;p&gt;It runs a hybrid architecture. It grabs valid CSRF tokens with a real browser session in the background, then uses raw HTTP requests to fetch the data at crazy speeds. You never have to worry about getting blocked.&lt;/p&gt;

&lt;h3&gt;
  
  
  How I track price drops:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;I pass my target Vinted search URLs into the scraper.&lt;/li&gt;
&lt;li&gt;I set the Apify actor to run on a schedule (e.g., every 10 minutes).&lt;/li&gt;
&lt;li&gt;I push the clean JSON output to a Google Sheet using Make.com (formerly Integromat).&lt;/li&gt;
&lt;li&gt;A simple formula compares the new &lt;code&gt;price&lt;/code&gt; field with the previous data. If it drops below my target threshold, it sends me a Discord ping.&lt;/li&gt;
&lt;/ol&gt;
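&lt;p&gt;Step 4 can be sketched in a few lines of Python instead of a spreadsheet formula. The &lt;code&gt;id&lt;/code&gt; and &lt;code&gt;price&lt;/code&gt; field names match the actor's sample output; the prices and threshold here are illustrative.&lt;/p&gt;

```python
# Sketch of the comparison step: diff the latest scrape against the previous
# one and flag items whose price dropped below a target threshold.

def find_price_drops(previous, current, threshold):
    """Return items that are now cheaper than before AND at or under threshold."""
    old_prices = {item["id"]: float(item["price"]) for item in previous}
    drops = []
    for item in current:
        new_price = float(item["price"])
        old_price = old_prices.get(item["id"])
        # flag only genuine drops that land at or under our target price
        if old_price is not None and old_price > new_price and threshold >= new_price:
            drops.append(item)
    return drops

yesterday = [{"id": 1, "price": "30.0"}, {"id": 2, "price": "15.0"}]
today = [{"id": 1, "price": "18.0"}, {"id": 2, "price": "15.0"}]
print(find_price_drops(yesterday, today, 20.0))  # item 1 dropped from 30.0 to 18.0
```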

&lt;p&gt;If you're building any kind of vinted automation, price monitor, or alert system, don't waste time fighting WAFs. Just use a maintained data extraction tool.&lt;/p&gt;

</description>
      <category>webscraping</category>
      <category>python</category>
      <category>automation</category>
      <category>ecommerce</category>
    </item>
    <item>
      <title>How to Build a Vinted to Telegram Alert Bot in 2026 (Zero Code)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sun, 05 Apr 2026 05:21:03 +0000</pubDate>
      <link>https://dev.to/boo_n/how-to-build-a-vinted-to-telegram-alert-bot-in-2026-zero-code-13p9</link>
      <guid>https://dev.to/boo_n/how-to-build-a-vinted-to-telegram-alert-bot-in-2026-zero-code-13p9</guid>
      <description>&lt;p&gt;If you're flipping vintage clothes or reselling items from Vinted, you already know that good deals disappear in seconds. Refreshing the app manually is a waste of time.&lt;/p&gt;

&lt;p&gt;I wanted to build a simple Telegram bot to send me a notification the exact second a specific brand (like Carhartt or Nike) was posted in my size.&lt;/p&gt;

&lt;p&gt;I initially tried coding a Python scraper using &lt;code&gt;requests&lt;/code&gt; and &lt;code&gt;BeautifulSoup&lt;/code&gt;, but Vinted's Cloudflare and Datadome protection blocked me with &lt;code&gt;403 Forbidden&lt;/code&gt; errors immediately. Playwright worked, but it was incredibly slow and heavy to run 24/7 on a cheap VPS.&lt;/p&gt;

&lt;p&gt;Instead of building from scratch, I found the &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt; on Apify.&lt;/p&gt;

&lt;h3&gt;
  
  
  How it works
&lt;/h3&gt;

&lt;p&gt;This tool handles all the proxy rotation and TLS fingerprinting for you. You just give it a standard Vinted search URL, and it returns clean JSON data.&lt;/p&gt;

&lt;p&gt;To connect it to Telegram without coding:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to Make.com or Zapier.&lt;/li&gt;
&lt;li&gt;Set up a trigger: Apify -&amp;gt; Watch for Actor Run.&lt;/li&gt;
&lt;li&gt;Add an action: Telegram -&amp;gt; Send a Message.&lt;/li&gt;
&lt;li&gt;Map the fields (Title, Price, URL, Image) from the Apify JSON output to your Telegram message format.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That's it. You can schedule the Apify actor to run every 5 minutes. Whenever it finds new items matching your search URL, your Telegram bot instantly pings your phone.&lt;/p&gt;

&lt;p&gt;It's by far the easiest vinted automation setup I've tested, and it completely bypasses the anti-bot headache.&lt;/p&gt;

</description>
      <category>webscraping</category>
      <category>telegram</category>
      <category>automation</category>
      <category>nocode</category>
    </item>
    <item>
      <title>Never miss a Vinted deal: How to monitor listings automatically (2026)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sun, 05 Apr 2026 04:20:07 +0000</pubDate>
      <link>https://dev.to/boo_n/never-miss-a-vinted-deal-how-to-monitor-listings-automatically-2026-1j30</link>
      <guid>https://dev.to/boo_n/never-miss-a-vinted-deal-how-to-monitor-listings-automatically-2026-1j30</guid>
      <description>&lt;p&gt;If you are flipping clothes, you already know that the best vintage deals on Vinted are gone in seconds.&lt;/p&gt;

&lt;p&gt;I used to refresh the app manually like a maniac. Then I tried building a Python script to scrape Vinted listings, but my IP got instantly banned by Datadome. Vinted's security is no joke right now.&lt;/p&gt;

&lt;p&gt;After looking around for a workaround, I stumbled upon the &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt; on Apify.&lt;/p&gt;

&lt;p&gt;It’s honestly a game changer. You just paste your Vinted search URL (e.g., filtered for Nike, size L, under 20€), and it extracts all the new listings into clean JSON.&lt;/p&gt;

&lt;p&gt;Since it handles all the proxy rotation and bypasses Cloudflare automatically, you never get blocked. I hooked it up to a simple Discord webhook, so now I get pinged the second a new item drops.&lt;/p&gt;

&lt;p&gt;If you need to monitor vinted automatically and want a reliable apify vinted actor, save yourself the headache of building it from scratch.&lt;/p&gt;

</description>
      <category>webscraping</category>
      <category>automation</category>
      <category>ecommerce</category>
      <category>python</category>
    </item>
    <item>
      <title>How to Bypass Vinted 403 Errors &amp; Cloudflare (2026 Fix)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sat, 04 Apr 2026 13:06:46 +0000</pubDate>
      <link>https://dev.to/boo_n/how-to-bypass-vinted-403-errors-cloudflare-2026-fix-408g</link>
      <guid>https://dev.to/boo_n/how-to-bypass-vinted-403-errors-cloudflare-2026-fix-408g</guid>
      <description>&lt;p&gt;If you've tried to scrape Vinted recently using Python (&lt;code&gt;requests&lt;/code&gt;) or Node.js (&lt;code&gt;axios&lt;/code&gt;), you've probably hit a wall of 403 Forbidden errors or Cloudflare/Datadome blocks.&lt;/p&gt;

&lt;p&gt;Vinted's anti-bot system is extremely aggressive. If you try to pull data from their internal API, you need the right &lt;code&gt;x-csrf-token&lt;/code&gt; and flawless TLS fingerprinting. A standard headless Playwright setup will get flagged almost instantly unless you're heavily patching the browser.&lt;/p&gt;

&lt;p&gt;After spending days trying to rotate residential proxies and tweak headers, I found a much cleaner solution that bypasses the headache entirely.&lt;/p&gt;

&lt;p&gt;Instead of fighting the WAF yourself, you can just use the &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt; on Apify.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why it works
&lt;/h3&gt;

&lt;p&gt;It uses a hybrid architecture: it runs a real browser session in the background just to harvest valid CSRF tokens and cookies, then uses fast HTTP requests for the actual data extraction.&lt;/p&gt;

&lt;p&gt;This means you get the speed of HTTP scraping without getting blocked by Datadome.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to use it
&lt;/h3&gt;

&lt;p&gt;You just feed it a Vinted search URL (like &lt;code&gt;https://www.vinted.com/catalog?search_text=vintage+nike&lt;/code&gt;), and it outputs clean JSON.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;123456789&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Vintage Nike Hoodie"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"25.00"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"currency"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"EUR"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"brand_title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Nike"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.com/items/123456789-vintage-nike-hoodie"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It's highly optimized for speed, so you can poll it frequently if you're building a Discord sniper bot or an alert system. It handles all the proxy rotation, TLS spoofing, and token refresh logic under the hood.&lt;/p&gt;
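&lt;p&gt;If you do poll frequently, you'll want to dedupe so the same listing doesn't ping you twice. A minimal sketch, assuming each item carries the &lt;code&gt;id&lt;/code&gt; field shown in the JSON above; the fetch step itself is left abstract.&lt;/p&gt;

```python
# Poll-loop dedupe sketch: remember item ids we've already alerted on so
# frequent polling doesn't re-ping the same listing. Fetching is assumed to
# happen elsewhere (e.g. via the Apify actor's dataset).

def new_items(batch, seen_ids):
    """Return items not seen before and mark their ids as seen."""
    fresh = []
    for item in batch:
        if item["id"] not in seen_ids:
            seen_ids.add(item["id"])
            fresh.append(item)
    return fresh

seen = set()
first_poll = [{"id": 101, "title": "Vintage Nike Hoodie"}]
second_poll = [{"id": 101, "title": "Vintage Nike Hoodie"},
               {"id": 102, "title": "Carhartt Beanie"}]
print(len(new_items(first_poll, seen)))   # 1
print(len(new_items(second_poll, seen)))  # 1 (only the new id 102)
```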

&lt;p&gt;If you are tired of debugging &lt;code&gt;403 Forbidden&lt;/code&gt; responses, give this vinted automation tool a try. It completely saved my current project.&lt;/p&gt;

</description>
      <category>webscraping</category>
      <category>python</category>
      <category>javascript</category>
      <category>automation</category>
    </item>
  </channel>
</rss>
