File shows up → do something. Why is this still hard?
I've built more file-processing pipelines than I can count. Media ingest systems, config reloaders, data pipelines, upload processors. They all start the same way: watch a folder, react to changes.
And every single time, I end up writing the same boilerplate:
from watchfiles import Change, watch

for changes in watch('./inbox'):
    for change_type, path in changes:
        if path.endswith('.json'):
            if change_type == Change.added:
                process_new_file(path)
            elif change_type == Change.modified:
                reload_config(path)
            # ... and so on
It works. But it's tedious. The logic for what you're watching gets tangled with how you're watching it.
I wanted something simpler.
MichielMe / flowwatch
FlowWatch
FlowWatch is a tiny ergonomic layer on top of watchfiles
that makes it easy to build file-driven workflows using simple decorators and a pretty
Rich + Typer powered CLI.
Instead of wiring watchfiles.watch() manually in every project, you declare:
- what folder(s) you want to watch
- which patterns you care about (e.g. *.mxf, *.json)
- which function should run for a given event (created / modified / deleted)
FlowWatch takes care of:
- subscribing to all roots in a single watcher loop
- debouncing and recursive watching
- dispatching events to handlers with a small thread pool
- optional processing of existing files on startup
- nicely formatted logs and a CLI overview of registered handlers
- real-time web dashboard for monitoring events
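To build intuition for how a decorator layer like this hangs together, here is a stdlib-only sketch of registration and dispatch. This is illustrative only, not FlowWatch's actual internals; `Handler`, `REGISTRY`, and `dispatch` are names invented for this sketch.

```python
# Illustrative sketch of decorator-based registration and dispatch.
# NOT FlowWatch's real internals: Handler, REGISTRY, and dispatch are
# invented names for illustration.
from dataclasses import dataclass
from fnmatch import fnmatch
from pathlib import Path
from typing import Callable

@dataclass
class Handler:
    root: str
    pattern: str
    func: Callable[[str], None]

REGISTRY: list[Handler] = []

def on_created(root: str, pattern: str = "*.*"):
    """Register the decorated function for 'created' events under root."""
    def decorator(func):
        REGISTRY.append(Handler(root, pattern, func))
        return func
    return decorator

def dispatch(root: str, path: str) -> list[str]:
    """Call every handler whose root and filename pattern match."""
    fired = []
    for h in REGISTRY:
        if h.root == root and fnmatch(Path(path).name, h.pattern):
            h.func(path)
            fired.append(h.func.__name__)
    return fired
```

The decorator only records the function; a single watcher loop can then funnel every event through one dispatch step.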
What If File Watching Looked Like This?
from flowwatch import on_created, on_modified, run

@on_created("./inbox", pattern="*.json")
def handle_new_json(event):
    print(f"New JSON file: {event.path}")
    process_upload(event.path)

@on_modified("./config", pattern="*.yaml")
def reload_config(event):
    print(f"Config changed: {event.path}")
    app.reload_settings()

@on_created("./media", pattern="*.mxf", process_existing=True)
def ingest_media(event):
    print(f"New media file: {event.path}")
    start_transcode(event.path)

run()
That's FlowWatch — a decorator-first layer on top of watchfiles that makes file-driven workflows readable and maintainable.
Your intent is obvious from the code. No wiring. No boilerplate.
Why I Built This
I work in broadcast technology where file-based workflows are everywhere. Media files land in watch folders. Metadata sidecars appear alongside them. Config files change.
Every project needed the same pattern:
- Watch one or more directories
- Filter by file extension
- Route to different handlers based on event type
- Maybe process files that already exist on startup
I kept copying the same wrapper code between projects. FlowWatch is that code — extracted into a proper library.
Core Concepts
The FileEvent Object
Every handler receives a FileEvent with everything you need:
@on_created("./uploads", pattern="*.pdf")
def handle_pdf(event):
    print(event.path)     # pathlib.Path to the file
    print(event.root)     # the folder being watched
    print(event.pattern)  # the pattern that matched
    print(event.change)   # watchfiles.Change enum

    # Convenience properties
    if event.is_created:
        process_new_file(event.path)
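As a mental model, you can picture the event as a small frozen dataclass. This is a sketch of the shape described above, not FlowWatch's actual class definition; the local Change enum stands in for watchfiles.Change so the example is self-contained.

```python
# Sketch of the FileEvent shape described above; not FlowWatch's
# actual definition. The local Change enum stands in for
# watchfiles.Change so the example runs on its own.
from dataclasses import dataclass
from enum import Enum
from pathlib import Path

class Change(Enum):
    added = 1
    modified = 2
    deleted = 3

@dataclass(frozen=True)
class FileEvent:
    path: Path      # file the event refers to
    root: Path      # folder being watched
    pattern: str    # pattern that matched
    change: Change  # what happened

    @property
    def is_created(self) -> bool:
        return self.change is Change.added
```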
Four Decorators, Four Intents
@on_created(root, pattern="*.txt") # New files
@on_modified(root, pattern="*.json") # Changed files
@on_deleted(root, pattern="*.tmp") # Removed files
@on_any(root, pattern="*.*") # All events
Process Existing Files on Startup
Set process_existing=True and FlowWatch will scan the directory on startup:
@on_created("./queue", pattern="*.job", process_existing=True)
def process_job(event):
    # Runs for files already in ./queue when you start,
    # then continues watching for new ones
    execute_job(event.path)
No more "catch up on what we missed" logic.
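Semantically, process_existing=True amounts to a one-time scan of the root before the watch loop starts. Here is a stdlib approximation of that idea; scan_existing is a hypothetical helper, not part of FlowWatch's API.

```python
# Approximation of what process_existing=True implies: enumerate files
# already on disk and treat each as a synthetic "created" event.
# scan_existing is a hypothetical helper, not FlowWatch API.
from fnmatch import fnmatch
from pathlib import Path

def scan_existing(root: str, pattern: str) -> list[Path]:
    """Files already present under root (recursively) that match pattern."""
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and fnmatch(p.name, pattern)
    )
```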
A Real-Time Dashboard
FlowWatch ships with an optional web dashboard:
Features:
- Live event counters — created, modified, deleted at a glance
- Watched directories — see exactly what folders are being monitored
- Recent activity feed — every file event with status, path, and timestamp
Standalone Dashboard
pip install flowwatch[dashboard]
from flowwatch import on_created, run_with_dashboard

@on_created("./inbox", pattern="*.json")
def handle_json(event):
    process_file(event.path)

run_with_dashboard(host="0.0.0.0", port=8000)
FastAPI Integration
pip install flowwatch[fastapi]
from fastapi import FastAPI
from flowwatch import on_created
from flowwatch.fastapi import create_flowwatch_router

app = FastAPI()

@on_created("./uploads", pattern="*.*")
def handle_upload(event):
    process_upload(event.path)

app.include_router(create_flowwatch_router(), prefix="/flowwatch")
The CLI
FlowWatch ships with a Typer + Rich CLI:
flowwatch run myproject.watchers
This imports your module, discovers all handlers, and shows you a formatted table:
┏━━━━━━━━━━━━━━━━━┳━━━━━━━━━━┳━━━━━━━━━━┳━━━━━━━━━┓
┃ Handler         ┃ Root     ┃ Events   ┃ Pattern ┃
┡━━━━━━━━━━━━━━━━━╇━━━━━━━━━━╇━━━━━━━━━━╇━━━━━━━━━┩
│ handle_new_json │ ./inbox  │ created  │ *.json  │
│ reload_config   │ ./config │ modified │ *.yaml  │
│ ingest_media    │ ./media  │ created  │ *.mxf   │
└─────────────────┴──────────┴──────────┴─────────┘
Options:
flowwatch run myproject.watchers \
  --debounce 500 \
  --max-workers 8 \
  --no-recursive \
  --log-level DEBUG
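Under the hood, "imports your module" is just a dynamic import: the decorators fire at import time, which is what populates the handler registry. A sketch of that first step; load_watchers is a hypothetical name, not the CLI's real code.

```python
# Sketch of the CLI's first step: import the dotted module path so the
# decorators run and register handlers as a side effect.
# load_watchers is a hypothetical name, not FlowWatch's real loader.
import importlib
from types import ModuleType

def load_watchers(module_path: str) -> ModuleType:
    """Import e.g. 'myproject.watchers'; decorators register on import."""
    return importlib.import_module(module_path)
```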
Real-World Example: Media Ingest
Here's a pattern I use in production:
from pathlib import Path
from flowwatch import FileEvent, on_created, on_deleted, run
INBOX = Path("/media/incoming")

@on_created(str(INBOX), pattern="*.mxf", process_existing=True)
def ingest_media(event: FileEvent) -> None:
    """New media file arrived — start processing."""
    print(f"[ingest] New media: {event.path.name}")

    # Check for sidecar metadata
    sidecar = event.path.with_suffix(".json")
    if sidecar.exists():
        start_full_ingest(event.path, sidecar)
    else:
        queue_pending(event.path)

@on_created(str(INBOX), pattern="*.json")
def handle_sidecar(event: FileEvent) -> None:
    """Metadata sidecar arrived — check if media is waiting."""
    media_file = event.path.with_suffix(".mxf")
    if is_pending(media_file):
        start_full_ingest(media_file, event.path)

if __name__ == "__main__":
    run()
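The pairing trick both handlers rely on is pathlib's with_suffix: swap the final extension and you get the companion file's expected path. Isolated below for clarity; sidecar_for and media_for are illustrative helpers, not FlowWatch API.

```python
# The sidecar pairing both handlers rely on: with_suffix swaps only the
# final extension, yielding the companion file's expected path.
# sidecar_for and media_for are illustrative helpers, not FlowWatch API.
from pathlib import Path

def sidecar_for(media: Path) -> Path:
    """Expected metadata sidecar next to a media file."""
    return media.with_suffix(".json")

def media_for(sidecar: Path) -> Path:
    """Expected media file next to a metadata sidecar."""
    return sidecar.with_suffix(".mxf")
```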
Docker-Ready
FlowWatch runs great as a worker container:
services:
  backend:
    build: ./backend
    volumes:
      - media:/media
  flowwatch:
    build: ./backend
    command: flowwatch run myproject.watchers
    volumes:
      - media:/media
    restart: unless-stopped

volumes:
  media:
When to Use FlowWatch
Perfect for:
- File ingest pipelines
- Config file reloaders
- Upload processing queues
- Build systems and asset pipelines
- Any "file arrives → do something" workflow
Use something else if:
- You need distributed file watching across machines
- You need complex DAG-based orchestration (use Airflow, Prefect)
FlowWatch is intentionally simple — a thin layer over watchfiles, not a workflow engine.
Installation
# Core library
pip install flowwatch
# With standalone dashboard (Starlette + uvicorn)
pip install flowwatch[dashboard]
# With FastAPI integration
pip install flowwatch[fastapi]
# All features
pip install flowwatch[all]
Or with uv:
uv add flowwatch
uv add flowwatch --extra dashboard
uv add flowwatch --extra fastapi
uv add flowwatch --extra all
If it saves you time, give it a ⭐ on GitHub. If you have ideas, open an issue — I'm actively improving it.
What file-watching patterns do you use? Let me know in the comments!

