DEV Community

Robert N. Gutierrez
How to Export Depop Listings to a Shopify CSV Using Python

For many resellers, moving from a marketplace like Depop to a dedicated Shopify store is a major milestone. However, growth often hits a technical wall: inventory migration. Manually copy-pasting titles, descriptions, and high-resolution images for hundreds of items is a bottleneck that prevents you from scaling.

This guide solves that problem by building an automated pipeline. We’ll use Python to scrape your Depop listings, transform that data into a Shopify-compatible format using Pandas, and generate a CSV file ready for instant import.

Prerequisites

Before starting, ensure you have the following:

  • Python 3.8+ installed on your machine.
  • A ScrapeOps API Key: Depop uses aggressive anti-bot measures, so we route requests through the ScrapeOps proxy aggregator to bypass these blocks. A free ScrapeOps account is enough to get started.
  • Pandas & Playwright: These libraries handle data manipulation and browser automation.

Setup Instructions

Clone the Depop.com-Scrapers repository, which contains the core scraping logic:

git clone https://github.com/scraper-bank/Depop.com-Scrapers.git
cd Depop.com-Scrapers/python/playwright
pip install playwright playwright-stealth pandas
playwright install chromium

Step 1: Extracting Product Data from Depop

To get the data, we’ll use the Playwright implementation from the repository. While the base script is designed for single URLs, we can wrap it to process multiple listings.

Create a script named run_scraper.py. This script iterates through your Depop product URLs and saves the raw data into a JSONL (JSON Lines) file. JSONL is efficient because it allows us to append data line-by-line, preventing data loss if the script is interrupted.
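The repository's DataPipeline handles the file writing for us, but the JSONL idea itself is simple: serialize each record and append it as its own line. A minimal sketch of that behavior (`append_jsonl` is an illustrative helper, not part of the repo):

```python
import json

def append_jsonl(record: dict, filename: str) -> None:
    # Append one JSON object per line; if the run is interrupted,
    # every line written so far is still valid and readable.
    with open(filename, "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

append_jsonl({"name": "Vintage Red Tee", "price": "25.00"}, "depop_raw_data.jsonl")
```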

import asyncio
from playwright.async_api import async_playwright
from playwright_stealth import stealth_async
from scraper.depop_scraper_product_data_v1 import extract_data, DataPipeline

# Replace with your ScrapeOps API Key
API_KEY = "YOUR_SCRAPEOPS_API_KEY"

async def run_export(urls):
    async with async_playwright() as p:
        # Launch a headless browser; stealth patches are applied per page below
        browser = await p.chromium.launch(headless=True)
        pipeline = DataPipeline(jsonl_filename="depop_raw_data.jsonl")

        for url in urls:
            page = await browser.new_page()
            try:
                # Apply playwright-stealth (v1.x API) to mimic a real user
                await stealth_async(page)
                print(f"Scraping: {url}")
                await page.goto(url, wait_until="domcontentloaded", timeout=60000)
                # Give dynamic images a moment to load
                await asyncio.sleep(2)

                data = await extract_data(page)
                if data:
                    pipeline.add_data(data)
            except Exception as e:
                print(f"Error scraping {url}: {e}")
            finally:
                await page.close()

        await browser.close()

if __name__ == "__main__":
    my_listings = [
        "https://www.depop.com/products/example-item-1/",
        "https://www.depop.com/products/example-item-2/"
    ]
    asyncio.run(run_export(my_listings))
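Hard-coding URLs works for a handful of items, but for a full shop it is easier to keep them in a plain text file, one URL per line, and load them before calling `run_export`. A small sketch (`listing_urls.txt` is an assumed filename):

```python
from pathlib import Path

def load_urls(path: str) -> list[str]:
    # One URL per line; skip blank lines and commented-out entries.
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    return [ln.strip() for ln in lines
            if ln.strip() and not ln.lstrip().startswith("#")]

# my_listings = load_urls("listing_urls.txt")
```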

Step 2: Mapping the Data

Depop’s data structure differs significantly from Shopify’s import requirements. We need to map the fields carefully.

| Depop Field (JSONL) | Shopify CSV Header | Transformation Needed |
| --- | --- | --- |
| `name` | `Title` | Direct map |
| `name` | `Handle` | URL-friendly slug (e.g., "Vintage Red Tee" becomes `vintage-red-tee`) |
| `description` | `Body (HTML)` | Convert plain text to basic HTML line breaks |
| `price` | `Variant Price` | Ensure numeric format |
| `images` | `Image Src` | Extract the first URL from the image list |

Shopify also requires specific columns like Vendor, Published, and Status. We will add these as constants in our transformation script.

Step 3: The Transformation Script

Create convert_to_shopify.py. This script uses Pandas to load the JSONL file, apply the mapping logic, and export the final CSV.

import pandas as pd
import json
import re

def slugify(text):
    """Converts a title into a URL-friendly handle."""
    text = text.lower()
    return re.sub(r'[^\w\s-]', '', text).strip().replace(' ', '-')

def format_html(text):
    """Converts a plain text description to basic HTML."""
    if not text:
        return ""
    return text.replace('\n', '<br>')

def transform_depop_to_shopify(input_file, output_file):
    items = []
    with open(input_file, 'r', encoding='utf-8') as f:
        for line in f:
            items.append(json.loads(line))

    df = pd.DataFrame(items)
    shopify_df = pd.DataFrame()

    # Core Mapping Logic
    shopify_df['Handle'] = df['name'].apply(slugify)
    shopify_df['Title'] = df['name']
    shopify_df['Body (HTML)'] = df['description'].apply(format_html)
    shopify_df['Vendor'] = 'My Depop Shop'
    shopify_df['Type'] = df['category']
    shopify_df['Tags'] = df['brand']
    shopify_df['Published'] = 'TRUE'

    # Pricing & Inventory
    shopify_df['Variant Price'] = df['price']
    shopify_df['Variant Inventory Tracker'] = 'shopify'
    shopify_df['Variant Inventory Qty'] = 1
    shopify_df['Variant Fulfillment Service'] = 'manual'

    # Image Handling: Grab the first image URL
    shopify_df['Image Src'] = df['images'].apply(lambda x: x[0]['url'] if (isinstance(x, list) and len(x) > 0) else '')

    shopify_df['Status'] = 'active'

    # Export to CSV with UTF-8 encoding for emojis
    shopify_df.to_csv(output_file, index=False, encoding='utf-8-sig')
    print(f"Successfully exported {len(shopify_df)} items to {output_file}")

if __name__ == "__main__":
    transform_depop_to_shopify('depop_raw_data.jsonl', 'shopify_import.csv')

Why Pandas?

Pandas handles missing data gracefully. If a Depop listing is missing a brand or category, you can easily fill those gaps with defaults like "Unbranded" or "Clothing" using .fillna() or .apply() without the script crashing.
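For example, defaults can be applied in one line per column before the mapping runs (the data here is illustrative):

```python
import pandas as pd

df = pd.DataFrame({
    "name": ["Vintage Red Tee", "Denim Jacket"],
    "brand": ["Nike", None],          # missing brand
    "category": [None, "Outerwear"],  # missing category
})

# Fill gaps with sensible defaults before building the Shopify columns
df["brand"] = df["brand"].fillna("Unbranded")
df["category"] = df["category"].fillna("Clothing")
```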

Step 4: Handling Images and Variations

A common hurdle in migration is high-resolution images. The Playwright scraper is tuned to find the src attributes pointing to Depop's CDN.

If you have multiple images per product, Shopify requires a separate row for each additional image. In these rows, the Handle remains the same, but other columns are left blank. While this script grabs the primary image, you can expand it using df.explode('images') if you need every photo transferred.
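One way to sketch that expansion, assuming the same `images` list-of-dicts shape used in Step 3 (each entry has a `url` key); the resulting rows can be concatenated onto the main DataFrame with `pd.concat` and sorted by `Handle`:

```python
import pandas as pd

def image_rows(df: pd.DataFrame) -> pd.DataFrame:
    # df carries 'Handle' plus the raw 'images' list per product.
    # Shopify expects one row per extra image: same Handle,
    # Image Src filled in, every other column left blank.
    rows = []
    for _, row in df.iterrows():
        for img in (row["images"] or [])[1:]:  # skip the primary image
            rows.append({"Handle": row["Handle"], "Image Src": img["url"]})
    return pd.DataFrame(rows)
```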

Step 5: Importing into Shopify

Once the shopify_import.csv is ready, follow these steps:

  1. Log in to your Shopify Admin.
  2. Go to Products and click Import.
  3. Upload your shopify_import.csv.
  4. If you are updating existing products with the same handle, check the box to Overwrite any current products.
  5. Preview the import. Verify that the Title, Price, and Image URL are correctly mapped.
  6. Click Import Products.

Troubleshooting

  • 403 Forbidden Errors: If Depop blocks your scraper, check your ScrapeOps API key. Depop is highly sensitive to IP reputation, so using a proxy is necessary.
  • Missing Handles: Shopify will fail if a Handle is empty. If you have non-Latin titles, ensure the slugify function produces a valid string.
  • Character Encoding: If your descriptions contain emojis, the encoding='utf-8-sig' argument in the Pandas to_csv function prevents "garbage" characters from appearing in Shopify.
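For the empty-handle case, a defensive variant of the Step 3 helper falls back to any unique identifier you have on hand when the title reduces to nothing (a sketch; the fallback value, e.g. a Depop product slug or ID, is an assumption):

```python
import re

def slugify_safe(text: str, fallback: str) -> str:
    # Same logic as the Step 3 slugify, but return a fallback handle
    # when the title strips down to an empty string (e.g. emoji-only).
    slug = re.sub(r'[^\w\s-]', '', text.lower()).strip().replace(' ', '-')
    return slug if slug else fallback
```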

To Wrap Up

By automating the extraction and transformation of your listings, you turn a multi-day manual task into a 60-second process. You can further improve this by using a profile scraper to automatically gather all listing URLs, creating a completely hands-off migration tool.
