DEV Community

agenthustler
agenthustler

Posted on

Building a Gig Economy Rate Tracker with Python (TaskRabbit, Handy, etc.)

Gig economy pricing fluctuates wildly by city, time, and demand. Build a tracker that monitors rates across platforms to find arbitrage opportunities and market trends.

Why Track Gig Rates?

Gig platforms like TaskRabbit, Handy, and Thumbtack adjust pricing dynamically. Tracking these rates reveals local labor market conditions, seasonal patterns, and pricing arbitrage between platforms.

Scraping TaskRabbit Rates

import requests
from bs4 import BeautifulSoup
import pandas as pd
from datetime import datetime

API_KEY = "YOUR_SCRAPERAPI_KEY"  # Get one at https://www.scraperapi.com?fp_ref=the52

def scrape_taskrabbit_rates(city, task_type):
    url = f"https://www.taskrabbit.com/{city}/{task_type}"
    proxy_url = f"http://api.scraperapi.com?api_key={API_KEY}&url={url}&render=true"

    response = requests.get(proxy_url, timeout=60)
    soup = BeautifulSoup(response.text, 'html.parser')

    taskers = []
    for card in soup.select('.tasker-card, .provider-card'):
        name = card.select_one('.tasker-name')
        rate = card.select_one('.tasker-rate, .hourly-rate')
        rating = card.select_one('.rating-score')

        if rate:
            taskers.append({
                'city': city,
                'task_type': task_type,
                'name': name.text.strip() if name else 'N/A',
                'hourly_rate': rate.text.strip(),
                'rating': rating.text.strip() if rating else 'N/A',
                'scraped_at': datetime.now().isoformat()
            })
    return taskers

cities = ['sf', 'nyc', 'chicago', 'austin', 'miami']
tasks = ['furniture-assembly', 'moving-help', 'cleaning', 'handyman']

all_rates = []
for city in cities:
    for task in tasks:
        rates = scrape_taskrabbit_rates(city, task)
        all_rates.extend(rates)
        print(f"{city}/{task}: {len(rates)} taskers")

df = pd.DataFrame(all_rates)
df.to_csv('gig_rates.csv', index=False)
Enter fullscreen mode Exit fullscreen mode

Parsing Rate Data

import re

def parse_rate(rate_string):
    match = re.search(r'\$(\d+\.?\d*)', rate_string)
    return float(match.group(1)) if match else None

df['rate_numeric'] = df['hourly_rate'].apply(parse_rate)
city_avg = df.groupby(['city', 'task_type'])['rate_numeric'].agg(['mean', 'min', 'max'])
print("\nAverage rates by city and task:")
print(city_avg.round(2).to_string())
Enter fullscreen mode Exit fullscreen mode

Cross-Platform Comparison

Scrape multiple platforms to find pricing gaps. Use ThorData for residential proxy rotation:

def scrape_thumbtack_rates(zipcode, service):
    url = f"https://www.thumbtack.com/{service}/near-me/?zip={zipcode}"
    proxy_url = f"http://api.scraperapi.com?api_key={API_KEY}&url={url}&render=true"

    response = requests.get(proxy_url, timeout=60)
    soup = BeautifulSoup(response.text, 'html.parser')

    prices = []
    for card in soup.select('.pro-card'):
        price = card.select_one('.price-estimate')
        if price:
            prices.append(price.text.strip())
    return prices

tr_rates = scrape_taskrabbit_rates('nyc', 'cleaning')
tt_rates = scrape_thumbtack_rates('10001', 'house-cleaning')
Enter fullscreen mode Exit fullscreen mode

Visualizing Rate Trends

import matplotlib.pyplot as plt

pivot = df.pivot_table(values='rate_numeric', index='city', columns='task_type', aggfunc='mean')
pivot.plot(kind='bar', figsize=(12, 6), title='Average Gig Rates by City')
plt.ylabel('Hourly Rate ($)')
plt.tight_layout()
plt.savefig('gig_rate_comparison.png', dpi=150)
Enter fullscreen mode Exit fullscreen mode

Automating Daily Collection

Schedule daily and track changes. Use ScrapeOps to monitor scraper health. Store in SQLite for time-series analysis:

import sqlite3

conn = sqlite3.connect('gig_rates.db')
df.to_sql('rates', conn, if_exists='append', index=False)

weekly_trend = pd.read_sql('''
    SELECT city, task_type, AVG(rate_numeric) as avg_rate, DATE(scraped_at) as date
    FROM rates GROUP BY city, task_type, date ORDER BY date
''', conn)
Enter fullscreen mode Exit fullscreen mode

Key Insights

  • Rate differences between platforms in the same city can exceed 40%
  • Seasonal patterns are predictable — cleaning rates spike before holidays
  • New cities show initially low rates that normalize within 3 months
  • ScraperAPI handles the JavaScript rendering these platforms require

Top comments (0)