Welcome to the finale of our Local Agent series.
Processing sensitive user data (PII) is risky. You can't send it to the cloud. You don't want to read it manually.
The PII Scrubber Agent demonstrates a "Level 2" Agent capability: Writing its own tools.
## The Goal
- Analyze a CSV (`users.csv`).
- Detect PII (emails, names).
- Write a Python script to scrub the data safely.
- Run the script to produce `users_cleaned.csv`.
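To make the detection step concrete, here's a minimal sketch of the kind of scan involved. The sample data, column names, and email-only matching are assumptions for the demo, not the agent's literal output:

```python
import csv
import io
import re

# Hypothetical sample mimicking the shape of users.csv
# (the real file's columns may differ).
SAMPLE = """id,name,email
1,Ada Lovelace,ada@example.com
2,Alan Turing,alan@example.com
"""

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def detect_pii_columns(csv_text: str) -> set[str]:
    """Return the set of column names whose values look like emails."""
    reader = csv.DictReader(io.StringIO(csv_text))
    flagged = set()
    for row in reader:
        for col, value in row.items():
            if value and EMAIL_RE.fullmatch(value.strip()):
                flagged.add(col)
    return flagged

print(detect_pii_columns(SAMPLE))  # → {'email'}
```

Names could be flagged the same way with a name list or an NER model, but emails are the easy, high-precision case.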
## The Architecture: "Plan, Code, Run"
Instead of rewriting the CSV line-by-line (slow and expensive), the Agent writes code to do it efficiently.
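The script the agent writes might look something like this sketch. The regex, the `[REDACTED]` placeholder, and the throwaway demo file are all assumptions, not the agent's literal output:

```python
import csv
import re
import tempfile
from pathlib import Path

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scrub_csv(in_path: Path, out_path: Path) -> None:
    """Copy a CSV, masking any value that looks like an email address."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        for row in reader:
            writer.writerow([EMAIL_RE.sub("[REDACTED]", cell) for cell in row])

# Demo on a throwaway file (contents are invented).
tmp = Path(tempfile.mkdtemp())
(tmp / "users.csv").write_text("id,name,email\n1,Ada,ada@example.com\n")
scrub_csv(tmp / "users.csv", tmp / "users_cleaned.csv")
cleaned = (tmp / "users_cleaned.csv").read_text()
print(cleaned)
```

The key point: the model generates this code once, and the code streams the whole file. Token cost stays constant no matter how many rows the CSV has.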
## The "Smarts" Requirement
This workflow broke smaller models (`llama3.2`): they would try to run the script before writing it!
We upgraded to `gpt-oss:20b`, which successfully:
- Paused to inspect the file.
- Wrote the complete Python script.
- Executed it successfully.
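That write-then-run ordering can be enforced mechanically in the final step. A sketch, where the `scrub.py` filename and the stand-in script body are hypothetical:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

workdir = Path(tempfile.mkdtemp())
script = workdir / "scrub.py"

# The guard that tripped up smaller models: the script must exist
# *before* we try to execute it.
if not script.exists():
    # Minimal stand-in body so this demo is self-contained; the real
    # agent writes the full scrubbing logic here first.
    script.write_text("print('scrubbing complete')\n")

result = subprocess.run(
    [sys.executable, str(script)],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # → scrubbing complete
```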
## Conclusion
We've built 4 agents:
- Tidy-Up: Simple Actions.
- Analyst: DB Querying.
- Archaeologist: File Editing.
- Scrubber: Code Generation & Execution.
And we did it all locally. No API keys. No data leaks. Just you, Goose, and Ollama.
The complete code is available on GitHub.
Go build something cool! 🪿