Nihal
Stop sending your code to the cloud to find bugs

I built FixMySlop to fix this.

AI tools like Cursor and GitHub Copilot now write 40-60% of the code
in many teams. Shipping is faster, but the hidden cost is real:
hardcoded secrets, SQL injection, unsafe pickle deserialization,
shell injection, weak hashing. All of it quietly sitting in your
codebase.
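To make that concrete, here's a hypothetical snippet packing a few of those patterns into one place. The names and values are made up for illustration; this is not from FixMySlop's test suite:

```python
import hashlib
import sqlite3

# Typical AI-generated "slop": a hardcoded credential, string-built SQL,
# and a weak password hash, all within a handful of lines.
AWS_SECRET = "AKIAFAKEEXAMPLEKEY"  # hardcoded secret

def find_user(conn: sqlite3.Connection, username: str):
    # Building SQL with an f-string lets `username` inject arbitrary SQL.
    query = f"SELECT id FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def hash_password(password: str) -> str:
    # MD5 is a weak hash for passwords; a real app would use bcrypt/argon2.
    return hashlib.md5(password.encode()).hexdigest()
```

Passing `' OR '1'='1` as the username returns every row in the table, which is exactly the class of bug these scanners exist to flag.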

What Is It?

FixMySlop is a free, open-source desktop app that scans your
codebase for security issues, bugs, and AI slop patterns.

  • Runs 100% locally via Ollama — your code never leaves your machine
  • No API keys, no subscriptions, no cloud
  • Combines static analysis (Ruff + Bandit) with local LLM deep analysis
  • Works on Windows, macOS, Linux
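The static half of that pipeline is just Ruff and Bandit output. As a rough sketch of what consuming it looks like, here's a tiny parser over Bandit's real `-f json` report shape; the flattening function is my own illustration, not FixMySlop's actual code:

```python
import json

def parse_bandit(report_json: str):
    """Flatten a Bandit `-f json` report into (file, line, severity, message) tuples."""
    report = json.loads(report_json)
    return [
        (r["filename"], r["line_number"], r["issue_severity"], r["issue_text"])
        for r in report.get("results", [])
    ]

# Minimal report in Bandit's JSON shape (values fabricated for illustration).
sample = json.dumps({
    "results": [
        {"filename": "app.py", "line_number": 3,
         "issue_severity": "HIGH",
         "issue_text": "Possible SQL injection vector through string-based query construction."}
    ]
})
```

The LLM pass then gets both the source and these findings as context, so it can go after the things regex-style rules can't see.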

Real Results

On a deliberately bad Python file with 40+ planted issues:

```
Static only:  42 issues found in 0.8 seconds
Turbo mode:   10 critical issues in 25 seconds
Deep mode:    44 issues in 16 seconds
Slop Score:   100/100 💀
```

Issues it caught that static tools missed:

  • Hardcoded AWS access keys and private RSA keys
  • Plaintext password comparison (no hashing at all)
  • Sensitive data leaking through URL parameters
  • Missing tests across entire classes
  • Race conditions in threading code
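That last category deserves a concrete picture. A minimal version of the race (my own example, not one of the planted issues): `counter += 1` is a read-modify-write, so two threads can interleave and lose updates, while holding a lock around it makes the result deterministic.

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n: int) -> None:
    # Racy: the read, add, and write of `counter` can interleave across threads.
    global counter
    for _ in range(n):
        counter += 1

def safe_increment(n: int) -> None:
    # Holding the lock makes the read-modify-write atomic.
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With the lock, the final count is exactly 4 * 10_000 = 40_000.
```

A pure static rule has no notion of which code runs concurrently, which is why this kind of finding needs the LLM pass.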

Three Scan Modes

⚡ Turbo — fast, top 10 critical issues, ~25s per file.
Great for daily dev workflow.

🔍 Deep — full analysis, all issues, ~16s per file.
Great for pre-commit or full audits.

Static only — instant, no LLM, great for CI pipelines.


How To Use It

```bash
# Install
git clone https://github.com/MrSpideyNihal/FixMySlop
cd FixMySlop
pip install -r requirements.txt
ollama pull qwen2.5-coder:3b

# Turbo scan
python main.py scan ./myproject --mode turbo

# Deep scan
python main.py scan ./myproject --mode deep

# Launch GUI
python main.py
```

FixMySlop auto-detects which Ollama models you have —
no manual config needed.
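Ollama exposes the installed models over its local REST API (`GET /api/tags`), which is all auto-detection needs. A sketch along those lines; the preference order here is my own guess, not FixMySlop's actual logic:

```python
import json
from urllib.request import urlopen

def installed_models(host: str = "http://localhost:11434") -> list:
    """Ask the local Ollama server which models are pulled (GET /api/tags)."""
    with urlopen(f"{host}/api/tags") as resp:
        return [m["name"] for m in json.load(resp)["models"]]

def pick_model(models: list):
    # Prefer the largest coder model available (illustrative preference order).
    for preferred in ("qwen2.5-coder:14b", "qwen2.5-coder:7b", "qwen2.5-coder:3b"):
        if preferred in models:
            return preferred
    return models[0] if models else None
```

If no model is found, the tool can fall back to telling you which `ollama pull` command to run.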


Recommended Models by VRAM

| VRAM  | Model             | Command                         |
|-------|-------------------|---------------------------------|
| 4GB   | qwen2.5-coder:3b  | `ollama pull qwen2.5-coder:3b`  |
| 8GB   | qwen2.5-coder:7b  | `ollama pull qwen2.5-coder:7b`  |
| 16GB+ | qwen2.5-coder:14b | `ollama pull qwen2.5-coder:14b` |
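The table is easy to encode if you want to script the choice. A tiny helper in that spirit (my sketch, not part of the app):

```python
def model_for_vram(vram_gb: float) -> str:
    # Thresholds follow the table above: 16GB+ -> 14b, 8GB -> 7b, else 3b.
    if vram_gb >= 16:
        return "qwen2.5-coder:14b"
    if vram_gb >= 8:
        return "qwen2.5-coder:7b"
    return "qwen2.5-coder:3b"

print("ollama pull " + model_for_vram(8))  # → ollama pull qwen2.5-coder:7b
```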

Tech Stack

  • Python 3.10+
  • PyQt5 (GUI)
  • Typer + Rich (CLI)
  • Ruff + Bandit (static analysis)
  • OpenAI-compatible API (Ollama backend)
  • 39 tests passing

GitHub

https://github.com/MrSpideyNihal/FixMySlop

Feedback, issues and PRs welcome.
Still early — lots of room to grow 🚀
