You don't need Celery for a cron job.
I've been building Python applications for years, and one pattern I see repeatedly is developers reaching for heavy-duty solutions when they just need to run a function every 5 minutes.
You want to send a daily report? Here come Redis, Celery, worker processes, and a message broker, and suddenly your simple scheduled task requires its own infrastructure documentation.
There's a better way.
The Problem with Python Scheduling Today
Let's be honest about the current landscape:
Celery is fantastic for distributed task queues at scale. But if you're not at scale? You're maintaining a Redis server for a morning email reminder.
APScheduler is capable, but the API feels like it was designed by someone who gets paid by the line of code. Configuring a simple daily job requires understanding triggers, executors, job stores, and more concepts than the actual business logic you're implementing.
Schedule is lightweight and nice, but lacks async support, persistence, and any visibility into what's actually running.
What I wanted was something that felt as natural as writing a decorator, with just enough features to be useful in production — without the overhead.
So I built FastScheduler.
The code lives at github.com/MichielMe/fastscheduler: a decorator-first Python scheduler with cron/interval/at jobs, simple persistence, and built-in run history.
Features
- 🎯 Simple decorator-based API - Schedule tasks in one line
- ⚡ Async/await support - Native async function support
- 🕐 Timezone support - Schedule jobs in any timezone
- 📅 Cron expressions - Complex schedules with cron syntax
- 💾 Persistent state - Survives restarts, handles missed jobs
- 🎨 FastAPI dashboard - Beautiful real-time monitoring UI
- 🔄 Automatic retries - Configurable retry with exponential backoff
- ⏱️ Job timeouts - Kill long-running jobs automatically
- ⏸️ Pause/Resume - Control jobs without removing them
- 📋 Dead Letter Queue - Track and debug failed jobs
Installation
# Basic installation
pip install fastscheduler
# With FastAPI dashboard
pip install fastscheduler[fastapi]
# With cron expression support
pip install fastscheduler[cron]
What If Scheduling Looked Like This?
from fastscheduler import FastScheduler

scheduler = FastScheduler()

@scheduler.every(10).seconds
def check_inventory():
    print("Checking stock levels...")

@scheduler.daily.at("09:00")
async def send_morning_report():
    await generate_and_send_report()

@scheduler.cron("0 9 * * MON-FRI")
def weekday_standup_reminder():
    notify_team("Standup in 15 minutes!")

scheduler.start()
That's it. No configuration files. No separate worker processes. No message broker.
Just Python.
Key Features
🔄 Persistence That Actually Works
FastScheduler keeps state on disk. When your app restarts, it knows which jobs exist, when they last ran, and what they missed.
scheduler = FastScheduler(
    state_file="scheduler.json",
    history_retention_days=7,
    max_history=5000
)
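The state file is ordinary JSON (judging by the extension), so you can peek at it with nothing but the standard library. The exact schema is FastScheduler's own; this sketch just pretty-prints whatever it finds:

import json
from pathlib import Path

# Dump whatever FastScheduler persisted, without assuming its schema
state = json.loads(Path("scheduler.json").read_text())
print(json.dumps(state, indent=2))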
🎨 A Dashboard You'll Actually Use
If you're using FastAPI, adding a real-time dashboard takes three lines:
from fastapi import FastAPI
from fastscheduler import FastScheduler
from fastscheduler.fastapi_integration import create_scheduler_routes
app = FastAPI()
scheduler = FastScheduler()
app.include_router(create_scheduler_routes(scheduler))
Now visit /scheduler/ and you get:
- Real-time job status with countdown timers
- Execution history with success/failure indicators
- One-click pause, resume, and cancel
- Dead letter queue for debugging failures
- Live updates via SSE — no page refresh needed
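Serving it works like any other FastAPI app. A minimal entry point, assuming the wiring above lives in the same module:

if __name__ == "__main__":
    import uvicorn

    # Serve the app (and the /scheduler/ dashboard) on localhost:8000
    uvicorn.run(app, host="127.0.0.1", port=8000)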
🌍 Timezones Without the Headaches
@scheduler.daily.at("09:00", tz="America/New_York")
def east_coast_morning():
send_market_open_alert()
@scheduler.cron("0 9 * * MON-FRI").tz("Asia/Tokyo")
def tokyo_market_open():
fetch_nikkei_data()
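These are IANA timezone names, so daylight saving shifts should be handled by the tz database rather than by you. If you want to sanity-check what a wall-clock time in another zone means on your server, plain Python will tell you; nothing here is FastScheduler-specific:

from datetime import datetime
from zoneinfo import ZoneInfo

# Compare New York wall-clock time (DST-aware) with the server's local time
print(datetime.now(ZoneInfo("America/New_York")))
print(datetime.now())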
💪 Production-Ready Failure Handling
Timeouts kill jobs that run too long:
@scheduler.every(5).minutes.timeout(30)
def quick_health_check():
    check_all_services()
Retries with exponential backoff:
@scheduler.hourly.at(":15").retries(3)
def sync_with_external_api():
    fetch_and_store_data()
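The two modifiers should compose as well. Assuming timeout() and retries() chain the same way they do individually in this post, a flaky upstream call can get both:

# Assumption: .timeout() and .retries() chain together; adjust if the API differs
@scheduler.every(10).minutes.timeout(60).retries(3)
def pull_upstream_metrics():
    fetch_metrics()  # placeholder for your own flaky call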
Dead Letter Queue captures failures:
failed_jobs = scheduler.get_dead_letters(limit=100)
for failure in failed_jobs:
    print(f"{failure['func_name']} failed: {failure['error']}")
The Full API in 60 Seconds
Interval-based:
@scheduler.every(10).seconds
@scheduler.every(5).minutes
@scheduler.every(2).hours
Time-based:
@scheduler.daily.at("09:00")
@scheduler.hourly.at(":30")
@scheduler.weekly.monday.at("10:00")
Cron expressions:
@scheduler.cron("*/15 * * * *") # Every 15 minutes
@scheduler.cron("0 9 * * MON-FRI") # Weekdays at 9 AM
One-time jobs:
@scheduler.once(60) # 60 seconds from now
@scheduler.at("2025-12-25 00:00:00") # Specific datetime
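One-time jobs are handy for deferred startup work, like warming a cache a minute after boot. A sketch, with preload_hot_queries as a placeholder for your own function:

@scheduler.once(60)  # run exactly once, 60 seconds from now
def warm_cache():
    preload_hot_queries()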
Real-World Example
from contextlib import asynccontextmanager
from fastapi import FastAPI
from fastscheduler import FastScheduler
from fastscheduler.fastapi_integration import create_scheduler_routes

scheduler = FastScheduler(state_file="app_scheduler.json", quiet=True)

@asynccontextmanager
async def lifespan(app: FastAPI):
    scheduler.start()
    yield
    scheduler.stop(wait=True)

app = FastAPI(lifespan=lifespan)
app.include_router(create_scheduler_routes(scheduler))

@scheduler.every(1).minutes.timeout(10)
def health_check():
    check_all_services()

@scheduler.daily.at("08:00", tz="America/New_York").retries(2)
async def send_daily_digest():
    users = await get_subscribed_users()
    for user in users:
        await send_digest_email(user)

@scheduler.cron("*/15 9-17 * * MON-FRI").tz("America/New_York")
def sync_crm_data():
    fetch_and_merge_crm_updates()
That's a complete background job system. No Redis. No Celery. No separate processes.
When to Use FastScheduler
Perfect for:
- Single-application background tasks
- Scheduled jobs in web applications
- Periodic data synchronization
- Automated reports and notifications
- Health checks and monitoring
Use something else if:
- You need distributed task execution across multiple workers
- You're processing millions of jobs per day
- You need exactly-once delivery guarantees
Get Started
# Basic
pip install fastscheduler
# With dashboard
pip install fastscheduler[fastapi]
# Everything
pip install fastscheduler[all]
If it saves you time, give it a ⭐ on GitHub. If you have ideas, open an issue — I'm actively improving it.
What scheduling solutions are you currently using? Let me know in the comments!