Saturday afternoon. You've been eyeing that ₹45,000 laptop on Amazon India for three weeks. Does it dip during the Great Indian Sale? Does it spike on weekends? Nobody knows — because nobody is watching.
Today we'll fix that with ~50 lines of Python. No paid APIs, no ₹499/month SaaS, no Chrome extensions that sell your data. Just a weekend project you can finish before dinner.
What we're building
A tiny script that:
- Visits an Amazon India product URL.
- Scrapes the current price.
- Logs it to a CSV with a timestamp.
- Pings you on Telegram if the price drops below a target.
Total build time: ~45 minutes. Total cost: ₹0.
Step 1 — Install the basics
You need Python 3.9+ and two libraries:
```shell
pip install requests beautifulsoup4
```
That's it. No Selenium, no headless Chrome, no scraping service.
Step 2 — The scraper
Amazon blocks bare requests calls, so we send a real browser's User-Agent and Accept-Language headers. This works most of the time for public product pages; if you hit a CAPTCHA, just wait an hour and try again.
```python
import requests
from bs4 import BeautifulSoup

HEADERS = {
    "User-Agent": (
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 13_5) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/124.0.0.0 Safari/537.36"
    ),
    "Accept-Language": "en-IN,en;q=0.9",
}

def get_price(url: str) -> tuple[str, int]:
    r = requests.get(url, headers=HEADERS, timeout=20)
    r.raise_for_status()
    soup = BeautifulSoup(r.text, "html.parser")

    title_el = soup.select_one("#productTitle")
    price_el = soup.select_one(".a-price .a-offscreen")
    if not title_el or not price_el:
        raise RuntimeError("Title/price element not found — page layout changed.")

    title = title_el.get_text(strip=True)
    # "₹45,999.00" -> 45999
    raw = price_el.get_text(strip=True).replace("₹", "").replace(",", "")
    rupees = int(float(raw))
    return title, rupees
```
Two things worth noting:
- The CSS selectors (`#productTitle`, `.a-price .a-offscreen`) are stable on Amazon India as of 2026, but Amazon rotates layouts. If your script breaks, right-click the price → Inspect → copy a fresh selector.
- Don't skip `raise_for_status()`. A 503 usually means you hit Amazon's rate limit — back off, don't hammer it.
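The back-off can be automated. Here's a minimal retry wrapper — a hypothetical helper, not part of the 50-line script, so adjust the delays to taste:

```python
import time

def with_backoff(fn, retries=3, base_delay=60):
    """Call fn(); on failure, sleep base_delay, then 2x, 4x... before re-raising."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries, let the caller see the error
            time.sleep(base_delay * 2 ** attempt)
```

Wrap the scrape as `with_backoff(lambda: get_price(url))` so a transient 503 costs you a pause instead of a failed run.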
Step 3 — The logger
CSV is fine. You don't need a database for a personal price tracker.
```python
import csv
from datetime import datetime
from pathlib import Path

LOG = Path("prices.csv")

def log_price(title: str, rupees: int) -> None:
    new = not LOG.exists()
    with LOG.open("a", newline="", encoding="utf-8") as f:
        w = csv.writer(f)
        if new:
            w.writerow(["timestamp", "title", "rupees"])
        w.writerow([
            datetime.now().isoformat(timespec="seconds"),
            title,
            rupees,
        ])
```
After a week of data you'll have a CSV you can open in Excel or pandas. Plot it. Watch the weekend dips.
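If you go the pandas route, a quick way to spot the dips is the lowest logged price per product per day. A sketch, assuming pandas is installed and the CSV has the header written above (`daily_lows` is a name I made up):

```python
import pandas as pd

def daily_lows(csv_path="prices.csv"):
    """Lowest logged price per product per calendar day."""
    df = pd.read_csv(csv_path, parse_dates=["timestamp"])
    return df.groupby([df["timestamp"].dt.date, "title"])["rupees"].min()
```

From there, `daily_lows().unstack("title").plot()` gives one line per product in matplotlib.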
Step 4 — The Telegram alert
Create a Telegram bot via @BotFather, note the token, then message your bot once and fetch your chat ID from https://api.telegram.org/bot<TOKEN>/getUpdates.
```python
import os

BOT_TOKEN = os.environ["TG_BOT_TOKEN"]
CHAT_ID = os.environ["TG_CHAT_ID"]

def ping(msg: str) -> None:
    requests.post(
        f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage",
        data={"chat_id": CHAT_ID, "text": msg},
        timeout=10,
    )
```
Environment variables, not hardcoded tokens. Always.
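If reading the raw getUpdates JSON feels fiddly: the chat ID sits at a fixed path in the payload (field names per Telegram's Bot API; `chat_id_from_updates` is a hypothetical one-off helper, run once and thrown away):

```python
def chat_id_from_updates(updates: dict) -> int:
    """Pull the chat id out of a getUpdates response payload."""
    return updates["result"][0]["message"]["chat"]["id"]
```

Call it on `requests.get(f"https://api.telegram.org/bot{BOT_TOKEN}/getUpdates", timeout=10).json()` right after you've messaged your bot once.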
Step 5 — Tie it together
```python
PRODUCTS = [
    ("https://www.amazon.in/dp/B0CHX1W1XY", 45000),  # (url, target_rupees)
    ("https://www.amazon.in/dp/B0BDHWDR12", 12000),
]

def main():
    for url, target in PRODUCTS:
        try:
            title, price = get_price(url)
            log_price(title, price)
            if price <= target:
                ping(f"💸 {title[:60]} is ₹{price:,} (target ₹{target:,})\n{url}")
            print(f"OK ₹{price:,} — {title[:60]}")
        except Exception as e:
            print(f"FAIL {url[:60]} — {e}")

if __name__ == "__main__":
    main()
```
That's the full ~50 lines. Save it as `price_tracker.py`.
Step 6 — Schedule it
On macOS or Linux, cron runs it every 6 hours:
```shell
0 */6 * * * cd /home/you/tracker && /usr/bin/python3 price_tracker.py >> tracker.log 2>&1
```
On Windows, Task Scheduler does the same. On a Raspberry Pi? Even better — your tracker runs 24/7 on ₹200/year of electricity.
What you'll notice after 2 weeks
Running this on 5–10 products for two weeks taught me three things I didn't know:
- Amazon India prices move daily, not seasonally. Same laptop: ₹45,999 on Tuesday, ₹43,499 on Saturday, ₹46,499 on Monday.
- "Deals of the Day" aren't usually the lowest price that month. The real dip often happens the week after a sale ends.
- Pin codes matter. A product shown at ₹1,299 in Mumbai can be ₹1,399 in a Tier-2 pin code — the script above uses whatever pin code Amazon defaults to. Add a `?pincode=110001` variant if you want consistency.
Ways to extend it this weekend
If you finish early, three upgrades worth ~30 min each:
- Flipkart support — different selectors, same pattern. You now track both in one CSV.
- 7-day rolling min/max — load the CSV with pandas, alert only when price hits a new 7-day low.
- Chart generator — a weekly email with a matplotlib PNG of every product you're tracking.
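The second upgrade can be sketched in a few lines of pandas. An illustration only, assuming the CSV schema from Step 3 (`is_new_7day_low` is a name I made up):

```python
import pandas as pd

def is_new_7day_low(df: pd.DataFrame, title: str) -> bool:
    """True if the most recent price for `title` matches its 7-day minimum."""
    rows = df[df["title"] == title].copy()
    rows["timestamp"] = pd.to_datetime(rows["timestamp"])
    rows = rows.sort_values("timestamp")
    # keep only rows within 7 days of the latest observation
    cutoff = rows["timestamp"].max() - pd.Timedelta(days=7)
    window = rows[rows["timestamp"] >= cutoff]
    return bool(window["rupees"].iloc[-1] <= window["rupees"].min())
```

Gate the `ping(...)` call on this instead of the fixed target and you only hear about genuine lows.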
A small warning
Scraping at human-level frequency (every few hours, a handful of products) is fine. Scraping 10,000 URLs every 60 seconds will get your IP blocked and is against Amazon's ToS. Be a good citizen. If you need scale, pay for a proper scraping API — but for personal use, this script is plenty.
That's the whole weekend project. Clone, customize, commit. By Sunday evening you'll have a working tracker and a week of data starting to pile up.
Ping me on Dev.to if you ship it — I read every reply.
I'm Archit Mittal — I automate chaos for businesses. Follow me for daily automation content.