
Alex Spinov

I Built a Price Monitoring Bot That Saved My Client $12,000

The phone call that started everything

Last year, a small e-commerce owner called me in a panic. His competitor was undercutting him on 200+ products — sometimes by just $0.50 — and he was losing sales every day.

He'd been checking prices manually. Every. Single. Day. For 200 products across 3 competitor sites.

I told him: "Give me a weekend."

Here's the bot I built, and how it saved him $12,000 in the first 3 months.


The architecture (surprisingly simple)

┌──────────────┐     ┌──────────────┐     ┌──────────────┐
│ Scraper      │────▶│ Price DB     │────▶│ Alert System │
│ (Python)     │     │ (SQLite)     │     │ (Email/TG)   │
└──────────────┘     └──────────────┘     └──────────────┘
       │                    │                     │
   Runs every 6h     Stores history      Sends alert if
   via cron          + calculates diff   price changes >5%

Three components. No cloud. No database server. Runs on a $5/mo VPS.


Step 1: The scraper

import re
import sqlite3

import requests
from bs4 import BeautifulSoup

def scrape_price(url, selector):
    """Scrape a price from any page using a CSS selector."""
    headers = {'User-Agent': 'Mozilla/5.0 (compatible; PriceBot/1.0)'}
    try:
        resp = requests.get(url, headers=headers, timeout=10)
        resp.raise_for_status()
    except requests.RequestException:
        return None  # Network error or non-2xx status: skip this check
    soup = BeautifulSoup(resp.text, 'html.parser')
    price_el = soup.select_one(selector)
    if price_el is None:
        return None
    # Extract the number from text like '$29.99', '$1,299.00' or '29,99 €'
    text = re.sub(r'[^\d.,]', '', price_el.get_text())
    if ',' in text and '.' in text:
        # Both separators present: the right-most one is the decimal point
        if text.rfind(',') > text.rfind('.'):
            text = text.replace('.', '').replace(',', '.')
        else:
            text = text.replace(',', '')
    else:
        text = text.replace(',', '.')
    try:
        return float(text)
    except ValueError:
        return None

# Example: scrape a product price
price = scrape_price(
    'https://example-store.com/product-123',
    '.price-current'  # CSS selector for the price element
)
print(f'Current price: ${price}')

The key insight: every e-commerce site renders its price inside an element you can target with a CSS selector. You only need to find that selector once per site — right-click the price in your browser and choose "Inspect".


Step 2: The database

def init_db():
    conn = sqlite3.connect('prices.db')
    conn.execute('''CREATE TABLE IF NOT EXISTS prices (
        id INTEGER PRIMARY KEY,
        product_name TEXT,
        competitor TEXT,
        price REAL,
        scraped_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )''')
    return conn

def save_price(conn, product, competitor, price):
    conn.execute(
        'INSERT INTO prices (product_name, competitor, price) VALUES (?, ?, ?)',
        (product, competitor, price)
    )
    conn.commit()

def get_price_change(conn, product, competitor):
    """Get the last two prices to calculate change."""
    rows = conn.execute(
        '''SELECT price FROM prices 
           WHERE product_name=? AND competitor=?
           ORDER BY scraped_at DESC, id DESC LIMIT 2''',
        (product, competitor)
    ).fetchall()
    if len(rows) == 2:
        current, previous = rows[0][0], rows[1][0]
        change_pct = ((current - previous) / previous) * 100
        return current, previous, change_pct
    return None, None, 0

SQLite is perfect here: no setup, no server, just a single file. For 200 products × 4 checks/day, a full year of history still fits in a few tens of megabytes.
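Once a few weeks of history accumulate, the same table answers trend questions directly. A minimal sketch against an in-memory copy of the schema — the hand-inserted rows are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('''CREATE TABLE prices (
    id INTEGER PRIMARY KEY,
    product_name TEXT,
    competitor TEXT,
    price REAL,
    scraped_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
)''')
conn.executemany(
    'INSERT INTO prices (product_name, competitor, price) VALUES (?, ?, ?)',
    [
        ('Wireless Mouse X200', 'Store A', 29.99),
        ('Wireless Mouse X200', 'Store A', 27.49),
        ('Wireless Mouse X200', 'Store B', 31.00),
    ],
)

# Lowest price each competitor has shown over the last 30 days
lows = conn.execute('''
    SELECT competitor, MIN(price)
    FROM prices
    WHERE product_name = ?
      AND scraped_at >= datetime('now', '-30 days')
    GROUP BY competitor
''', ('Wireless Mouse X200',)).fetchall()
print(lows)  # e.g. [('Store A', 27.49), ('Store B', 31.0)]
```

The same `GROUP BY` pattern is what powers the seasonal trend analysis mentioned in the results.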


Step 3: The alert system

import os
import smtplib
import textwrap
from email.mime.text import MIMEText

def send_alert(product, competitor, old_price, new_price, change_pct):
    direction = '📉 DROPPED' if change_pct < 0 else '📈 INCREASED'

    body = textwrap.dedent(f"""\
        {direction}: {product}
        Competitor: {competitor}
        Was: ${old_price:.2f}
        Now: ${new_price:.2f}
        Change: {change_pct:+.1f}%

        Action needed: {'Update your price!' if change_pct < -5 else 'Monitor'}
    """)

    msg = MIMEText(body)
    msg['Subject'] = f'{direction} {product} ({change_pct:+.1f}%)'
    msg['From'] = 'bot@yourdomain.com'
    msg['To'] = 'owner@store.com'

    with smtplib.SMTP('smtp.gmail.com', 587) as s:
        s.starttls()
        # Use a Gmail app password from the environment — never hardcode it
        s.login('bot@yourdomain.com', os.environ['SMTP_APP_PASSWORD'])
        s.send_message(msg)
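The results section mentions switching to Telegram alerts in month 2. Those take very little code — a sketch against Telegram's Bot API `sendMessage` method, where the token and `chat_id` are placeholders you obtain from @BotFather (stdlib `urllib` keeps the dependency list short):

```python
import json
import urllib.parse
import urllib.request

def format_alert(product, competitor, old_price, new_price, change_pct):
    """Build the alert text shared by email and Telegram."""
    direction = 'DROPPED' if change_pct < 0 else 'INCREASED'
    return (
        f'{direction}: {product}\n'
        f'Competitor: {competitor}\n'
        f'Was: ${old_price:.2f} -> Now: ${new_price:.2f} ({change_pct:+.1f}%)'
    )

def send_telegram(token, chat_id, text):
    """POST to the Bot API's sendMessage endpoint."""
    url = f'https://api.telegram.org/bot{token}/sendMessage'
    data = urllib.parse.urlencode({'chat_id': chat_id, 'text': text}).encode()
    with urllib.request.urlopen(url, data=data, timeout=10) as resp:
        return json.load(resp)
```

Telegram messages land on the owner's phone in seconds, which is why they replaced email as the primary channel.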

Step 4: Putting it all together

import time

# Product config — one entry per product (in practice, load this from config.json)
PRODUCTS = [
    {
        'name': 'Wireless Mouse X200',
        'our_price': 29.99,
        'competitors': [
            {'name': 'Store A', 'url': 'https://store-a.com/mouse-x200', 'selector': '.price'},
            {'name': 'Store B', 'url': 'https://store-b.com/products/123', 'selector': '.product-price'},
        ]
    },
    # ... 199 more products
]

def run_monitor():
    conn = init_db()
    alerts = []

    for product in PRODUCTS:
        for comp in product['competitors']:
            price = scrape_price(comp['url'], comp['selector'])
            if price is None:
                continue

            save_price(conn, product['name'], comp['name'], price)
            current, previous, change = get_price_change(
                conn, product['name'], comp['name']
            )

            if abs(change) > 5:  # Alert on >5% change
                send_alert(
                    product['name'], comp['name'],
                    previous, current, change
                )
                alerts.append(f"{product['name']}: {change:+.1f}%")

            time.sleep(1)  # Be polite: at most one request per second

    print(f'Checked {len(PRODUCTS)} products. {len(alerts)} alerts sent.')
    conn.close()

if __name__ == '__main__':
    run_monitor()

Add to crontab: 0 */6 * * * python3 /home/bot/monitor.py

Done. Runs 4 times a day, completely unattended.
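One cron gotcha worth guarding against: if a run stalls (hundreds of sites each hanging until the timeout adds up), the next scheduled run can start before the previous one finishes. Wrapping the command in `flock -n` skips a run while the previous one still holds the lock — a sketch; the lock path is arbitrary:

```shell
# In crontab, prefix the command with flock -n (non-blocking):
#   0 */6 * * * flock -n /tmp/pricebot.lock python3 /home/bot/monitor.py
# Demo of the same mechanism:
flock -n /tmp/pricebot.lock -c 'echo "lock acquired - monitor runs here"'
```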


The results

Month 1: Bot detected 47 price changes. Client adjusted 23 prices within hours instead of days. Estimated saved revenue: $3,200.

Month 2: Added Telegram alerts (faster than email). Client started proactively undercutting competitors on high-margin products. Revenue up $4,800.

Month 3: Added historical price trend analysis. Client could now predict when competitors would drop prices (seasonal patterns). Another $4,000 saved.

Total: $12,000 in 3 months from a weekend project.


What I'd do differently

  1. Use Playwright instead of requests for JS-heavy sites
  2. Add proxy rotation for sites that block repeated requests
  3. Build a simple dashboard instead of email-only alerts
  4. Add competitor stock monitoring — knowing when they're out of stock is just as valuable
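Item 2 above needs surprisingly little code. A minimal round-robin rotation sketch — the proxy URLs are placeholders for whatever your proxy provider gives you:

```python
import itertools

# Hypothetical proxy list — substitute real endpoints from your provider
PROXIES = [
    'http://proxy1.example.com:8000',
    'http://proxy2.example.com:8000',
    'http://proxy3.example.com:8000',
]

_rotation = itertools.cycle(PROXIES)

def next_proxy():
    """Return the next proxy in round-robin order."""
    return next(_rotation)

# Usage with requests (commented out — needs a live proxy):
# proxy = next_proxy()
# resp = requests.get(url, proxies={'http': proxy, 'https': proxy}, timeout=10)
```

Each scrape goes out through a different IP, so per-IP rate limits trip far less often.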

Want to build something similar?

I've open-sourced 77 web scrapers on Apify Store — many of them do exactly this kind of price monitoring. Check out my GitHub for more automation tools.

Have you built a bot that saved real money? I'd love to hear your story in the comments.


I build data pipelines and web scrapers for businesses. If you need a custom monitoring solution — email me.


More from me: 10 Dev Tools I Use Daily | 77 Scrapers on a Schedule | 150+ Free APIs
