The Most Productive Thing I Ever Did
Two years ago, I created a ~/scripts/ folder. Every time I write a one-off script that saves me time, it goes in there. Now I have 30+ scripts that save me hours every week.
Here are the most useful ones.
1. Quick API Check
```python
#!/usr/bin/env python3
# api_check.py — Test if an API endpoint is alive
import json
import sys

import requests

url = sys.argv[1] if len(sys.argv) > 1 else input("URL: ")
try:
    resp = requests.get(url, timeout=10)
    print(f"Status: {resp.status_code}")
    print(f"Time: {resp.elapsed.total_seconds():.2f}s")
    print(f"Size: {len(resp.content)} bytes")
    if resp.headers.get("content-type", "").startswith("application/json"):
        print(json.dumps(resp.json(), indent=2)[:500])
except Exception as e:
    print(f"Error: {e}")
```
Usage: `python3 api_check.py "https://api.openalex.org/works?per_page=1"` (quote the URL so the shell doesn't treat `?` as a glob).
2. CSV Quick Look
```python
#!/usr/bin/env python3
# csv_look.py — Quick stats about any CSV
import sys

try:
    import duckdb
    f = sys.argv[1]
    print(duckdb.sql(f"SELECT COUNT(*) AS rows FROM '{f}'").fetchone()[0], "rows")
    print(duckdb.sql(f"DESCRIBE SELECT * FROM '{f}'"))
    print("\nFirst 3 rows:")
    print(duckdb.sql(f"SELECT * FROM '{f}' LIMIT 3"))
except ImportError:
    # Fallback: stdlib csv (loads the whole file into memory)
    import csv
    with open(sys.argv[1]) as fh:
        reader = csv.reader(fh)
        headers = next(reader)
        rows = list(reader)
    print(f"Columns: {len(headers)} | Rows: {len(rows)}")
    print(f"Headers: {', '.join(headers)}")
```
3. JSON Formatter
```python
#!/usr/bin/env python3
# jf.py — Format JSON from a file or stdin
# (pipe your clipboard in, e.g. pbpaste on macOS or xclip -o on Linux)
import json, sys

if len(sys.argv) > 1:
    data = open(sys.argv[1]).read()
else:
    data = sys.stdin.read()
print(json.dumps(json.loads(data), indent=2, ensure_ascii=False))
```

Usage: `python3 jf.py data.json` or `cat data.json | python3 jf.py`
4. Find Large Files
```python
#!/usr/bin/env python3
# big_files.py — Find the largest files in a directory tree
from pathlib import Path
import sys

path = sys.argv[1] if len(sys.argv) > 1 else "."
files = [(f, f.stat().st_size) for f in Path(path).rglob("*") if f.is_file()]
files.sort(key=lambda x: x[1], reverse=True)
for f, size in files[:20]:
    if size > 1_000_000:
        print(f"{size/1_000_000:.1f} MB | {f}")
    elif size > 1_000:
        print(f"{size/1_000:.0f} KB | {f}")
```
5. Port Checker
```python
#!/usr/bin/env python3
# ports.py — Check which common ports are open on a host
import socket, sys

host = sys.argv[1] if len(sys.argv) > 1 else "localhost"
common_ports = [22, 80, 443, 3000, 3306, 5432, 6379, 8000, 8080, 8443, 9090]
for port in common_ports:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(1)
    result = sock.connect_ex((host, port))
    if result == 0:
        print(f" OPEN | {port}")
    sock.close()  # close each socket, open or not
```
6. Git Stats
```bash
#!/bin/bash
# git_stats.sh — Quick repo stats
echo "Commits: $(git rev-list --count HEAD)"
echo "Authors: $(git shortlog -sn | wc -l)"
echo "Files: $(git ls-files | wc -l)"
# -z/-0 keeps filenames with spaces from breaking the count
echo "Lines: $(git ls-files -z | xargs -0 wc -l 2>/dev/null | tail -1)"
echo ""
echo "Top 5 authors:"
git shortlog -sn | head -5
echo ""
echo "Last 5 commits:"
git log --oneline -5
```
7. Quick Web Scrape
```python
#!/usr/bin/env python3
# scrape.py — Extract visible text from any URL (crude regex strip, no parser needed)
import requests, sys, re

url = sys.argv[1] if len(sys.argv) > 1 else input("URL: ")
resp = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
text = re.sub(r"<script[^>]*>.*?</script>", "", resp.text, flags=re.DOTALL)
text = re.sub(r"<style[^>]*>.*?</style>", "", text, flags=re.DOTALL)
text = re.sub(r"<[^>]+>", " ", text)
text = re.sub(r"\s+", " ", text).strip()
print(text[:3000])
```
The System
- Script goes in `~/scripts/`
- Add `export PATH="$HOME/scripts:$PATH"` to `.bashrc`
- Make executable: `chmod +x script.py`
- Use anywhere: `csv_look.py data.csv`
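The steps above boil down to a one-time setup plus a two-line ritual per new script. A minimal sketch (assumes bash; zsh users would append to `~/.zshrc` instead, and `hello.py` is just a placeholder name):

```shell
# One-time: create the folder and put it on PATH
mkdir -p ~/scripts
echo 'export PATH="$HOME/scripts:$PATH"' >> ~/.bashrc

# Per script: shebang + executable bit
cat > ~/scripts/hello.py <<'EOF'
#!/usr/bin/env python3
print("hello from scripts")
EOF
chmod +x ~/scripts/hello.py

# New shells now find it by name from any directory:
#   hello.py
```

The shebang is what lets you drop the `python3` prefix: the kernel reads the first line and hands the file to the right interpreter.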
After a year, you'll have a personal toolkit that nobody else has — perfectly tailored to YOUR workflow.
All My Scripts (Open Source)
I've published my most useful scripts on GitHub:
python-data-scripts — 10+ scripts for APIs, data processing, and automation.
What scripts are in YOUR scripts folder? I'm always looking for new additions.
I write about developer productivity and Python tools. Follow for more.
More from me: 10 Dev Tools I Use Daily | 77 Scrapers on a Schedule | 150+ Free APIs
Need web scraping or data extraction? I've built 77+ production scrapers. Email spinov001@gmail.com — quote in 2 hours. Or try my ready-made Apify actors — no code needed.