OpenClaw has 250,000 stars and is the fastest-growing open source project in GitHub history. Jensen Huang called it "the next ChatGPT." Peter Steinberger was hired by OpenAI to lead personal agents.
I decided to look under the hood. Not at features. At code quality.
## What I found in 30 seconds
355 empty catch blocks. The most popular AI agent that manages your email, calendar, and accounts silently swallows errors. When your git commit fails, when your sync drops, when your API key expires — nothing. No log. No warning. No trace. You just lose data and never know why.
564 potential hardcoded credentials. In a project that asks you to paste your API keys into configuration files. Security researchers have already flagged this, but the codebase itself has hundreds of places where secrets appear in code rather than environment variables.
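The standard remedy is to fail fast when a secret is missing instead of embedding it. A minimal sketch, assuming a Node environment (the helper name is my own, not from any of the scanned projects):

```typescript
// Hypothetical config helper: illustrative, not from any of the scanned projects.
// Bad: const API_KEY = "sk-live-abc123"; // the secret lives in source control forever

// Better: read from the environment and fail loudly when the value is absent.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```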
335 `console.log` statements in production. Debug output that ships to users. Every `console.log` is information leaking to anyone who opens DevTools.
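One common mitigation is a logger that becomes a no-op outside development. A sketch (the factory name is hypothetical, not OpenClaw's actual API):

```typescript
// Hypothetical debug-logger factory: a common pattern, not OpenClaw's actual API.
type LogFn = (...args: unknown[]) => void;

function makeDebugLogger(isProduction: boolean, sink: LogFn = console.log): LogFn {
  // In production the returned function does nothing, so debug output
  // never reaches a user's DevTools console.
  return isProduction ? () => {} : sink;
}
```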
449 double type assertions (`as unknown as`). Places where TypeScript's type system was forced into submission rather than fixed properly.
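The difference between forcing the compiler and fixing the type fits in a few lines. A hypothetical example (the `User` shape is invented for illustration):

```typescript
// Hypothetical payload type: invented for illustration.
interface User {
  id: number;
  name: string;
}

// Forced: compiles for ANY input, whether or not it is actually a User.
function castUser(raw: unknown): User {
  return raw as unknown as User; // the compiler can no longer help
}

// Fixed: a runtime type guard that narrows honestly.
function isUser(raw: unknown): raw is User {
  return (
    typeof raw === "object" &&
    raw !== null &&
    typeof (raw as { id?: unknown }).id === "number" &&
    typeof (raw as { name?: unknown }).name === "string"
  );
}
```

`castUser` silences the compiler at the exact moment a runtime check is most needed; `isUser` keeps the type system honest at the boundary.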
## For comparison
I scanned two other major projects:
n8n (162K stars, workflow automation):
- 939 `@ts-ignore` directives — nearly a thousand places where TypeScript checking is simply turned off
- 206 empty catch blocks
- 696 untyped variables

Tolaria (1.4K stars, rated 9.9/10 code quality by CodeScene):
- 10 empty catch blocks in critical paths (git operations, auto-sync)
- Zero `@ts-ignore` directives
- Only 1 `console.log`
- The 9.9/10 rating misses silent failures in the most important code paths
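Counts like these don't require special tooling. A rough sketch of heuristic counters over raw source text (regex-based, so expect some false positives and misses; a serious audit would walk the AST instead):

```typescript
// Heuristic source-text counters: a sketch, not a parser.
// The catch regex matches catch blocks whose body is only whitespace,
// including multi-line ones, since \s also matches newlines.
function countEmptyCatches(source: string): number {
  return (source.match(/catch\s*(\([^)]*\))?\s*\{\s*\}/g) ?? []).length;
}

function countTsIgnores(source: string): number {
  return (source.match(/@ts-ignore/g) ?? []).length;
}

function countConsoleLogs(source: string): number {
  return (source.match(/console\.log/g) ?? []).length;
}
```

Run over every `.ts` file in a repo and summed, functions like these reproduce the kind of totals quoted above.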
## What this means
Stars are not code quality. The most popular project has the most issues per line of critical code. GitHub stars measure hype, not reliability.
AI-generated code needs auditing. 92% of AI-generated codebases contain at least one critical vulnerability according to recent security assessments. These projects are built fast but rarely reviewed for silent failure patterns.
Empty catch blocks are the new technical debt. They are harder to find than TODO comments because they produce zero signal. Your monitoring shows green. Your users lose data.
The fix is usually trivial. Replace `.catch(() => {})` with `.catch((err) => console.warn('[context]', err))`. One line. Full visibility.
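Sketched concretely (the `logRejection` helper and the context tag are my own; the point is the one-line swap):

```typescript
// Before: the rejection disappears.
//   somePromise.catch(() => {});
//
// After: one line, full visibility. A context tag makes the log greppable.
//   somePromise.catch((err) => console.warn("[calendar-sync]", err));

// Hypothetical helper that captures warnings so the behavior is observable:
const warnings: unknown[][] = [];
function logRejection(context: string): (err: unknown) => void {
  return (err) => warnings.push([`[${context}]`, err]); // stand-in for console.warn
}

const handled = Promise.reject(new Error("sync dropped")).catch(
  logRejection("calendar-sync"),
);
```

The promise still settles exactly as before; the only change is that the failure leaves a trace.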
## What I did about it
I submitted PRs to two of these projects. Both passed CI cleanly. The fixes are minimal — no behavioral changes, just error visibility.
Code quality is not about having zero bugs. It is about knowing when something breaks.
What's the worst silent failure you've found in a popular open source project? Drop it in the comments.