I kept pasting sensitive API payloads into online JSON formatters — and feeling uneasy about it.
As an engineer working in fintech, where data leakage isn't just a bug but a compliance risk, I needed a better way. That's why I built stdout: a local-first toolkit that runs entirely in your browser or as a desktop app.
🔗 GitHub: https://github.com/cminhho/stdout
This post walks through the design decisions, technical trade-offs, and lessons learned — not as a polished case study, but as notes from someone building in public.
## Why local-first matters (especially when you handle financial data)
Most developer utilities work fine for generic tasks. But when you're debugging a lending API, inspecting a JWT that contains user PII, or mocking a payment payload, the question shifts from "does this tool work?" to "where does my data go?"
In domains like banking or lending, transmitting data to third-party servers can introduce compliance risks (PCI DSS, GDPR, local data residency laws). Even if a tool claims not to log data, the mental overhead of verifying that claim breaks flow.
Local-first software — where computation happens on-device by default — addresses this by design. No network request means no interception risk, no server logs, and no dependency on external uptime.
That principle became the foundation of stdout.
## What's inside: 59 tools, one workspace
Instead of maintaining a list of bookmarked utilities, stdout consolidates common developer tasks into a single interface:
| Category | Examples |
|---|---|
| Format & Validate | JSON, YAML, SQL, XML, CSS, HTML |
| Convert & Transform | JSON ↔ YAML, Base64, Image ↔ Base64, Epoch ↔ Date |
| Security & Encoding | JWT Debugger, Hash Generators (MD5/SHA), HMAC, URL Encode |
| Generator & Mock | UUID, Password, Mock Payload, Lorem Ipsum |
| Network & DevOps | cURL Builder, HAR Viewer, Cron Parser, .env Converter |
The goal wasn't to rebuild every utility from scratch, but to provide a consistent, offline-capable environment where sensitive data never leaves the device.
## Architecture decisions that shaped the product

### Zero-backend by default
All logic runs client-side — in the browser for the PWA, or in the Electron renderer for the desktop app. There is no API layer, no database, no telemetry.
This simplifies deployment (just static files) and eliminates an entire class of security reviews. For fintech workflows, it also means you can validate a production payload without worrying about accidental exposure.
### Modular registry for maintainability
Each tool is implemented as an isolated module and registered centrally. Here's a simplified view of how the registry works:
- `name`: unique identifier for the tool, e.g. `json-formatter`
- `component`: lazy-loaded React component, e.g. `lazy(() => import('./tools/json-formatter'))`
- `description`: short text shown in the UI
- `category`: for grouping, e.g. `format`, `security`, `convert`
Example entry structure:
```typescript
// src/tools/registry.ts (simplified)
import { lazy } from 'react';

export const registry = [
  {
    name: 'json-formatter',
    component: lazy(() => import('./tools/json-formatter')),
    description: 'Formats and validates JSON data locally.',
    category: 'format',
  },
  {
    name: 'jwt-debugger',
    component: lazy(() => import('./tools/jwt-debugger')),
    description: 'Decodes and inspects JWT tokens client-side.',
    category: 'security',
  },
  // ...57 more tools
];
```
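To give a sense of how the UI can consume this, here's a minimal sketch of grouping registry entries by category for a sidebar. This is my illustration, not stdout's actual code; the `component` field is typed loosely so the sketch runs without React:

```typescript
// Sketch: group registry entries by category for the sidebar.
// ToolEntry mirrors the registry shape above; `component` is typed
// as `unknown` so this example has no React dependency.
interface ToolEntry {
  name: string;
  component: unknown;
  description: string;
  category: string;
}

function groupByCategory(entries: ToolEntry[]): Map<string, ToolEntry[]> {
  const groups = new Map<string, ToolEntry[]>();
  for (const entry of entries) {
    const bucket = groups.get(entry.category) ?? [];
    bucket.push(entry);
    groups.set(entry.category, bucket);
  }
  return groups;
}
```

Because every entry carries its own metadata, the sidebar, search, and routing can all be derived from the registry with no per-tool wiring.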
This pattern, inspired by micro-frontend architectures, allows:
- Lazy-loading to keep initial bundle size manageable
- Independent testing per tool
- Easy extension: adding a new utility requires only a new module + registry entry
### Hybrid distribution: meet users where they are
- Web (PWA): Built with Vite, deployable as static files. Service workers cache assets after first load for offline use.
- Desktop (Electron): Cross-platform builds for macOS, Windows, Linux. Enables system-level shortcuts and native window management.
- Package manager: macOS users can install via Homebrew with `brew install --cask cminhho/tap/stdout`
The web version lowers the barrier to try; the desktop version supports power users who want native integration.
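For the PWA piece, the service-worker setup can be as small as a plugin config. A sketch assuming `vite-plugin-pwa` — the actual plugin and options stdout uses may differ:

```typescript
// vite.config.ts — sketch; the plugin choice and options here are
// assumptions, not stdout's actual configuration.
import { defineConfig } from 'vite';
import { VitePWA } from 'vite-plugin-pwa';

export default defineConfig({
  plugins: [
    VitePWA({
      registerType: 'autoUpdate',
      workbox: {
        // Precache all built assets so every tool works offline
        // after the first load
        globPatterns: ['**/*.{js,css,html,svg,woff2}'],
      },
    }),
  ],
});
```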
## Challenges & lessons learned

### Verifying "no data leaves the device"
Claiming privacy is easy; proving it is harder. Early on, I added a simple audit step:
- Open browser dev tools → Network tab
- Run each tool with sample data
- Confirm zero outbound requests
This became part of the pre-release checklist. In regulated domains, this kind of verification isn't optional — it's table stakes.
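The manual check can also be approximated in code: stub out `fetch` while a tool's pure function runs, so any accidental network call fails loudly. A rough sketch — `formatJson` is a stand-in for a tool's logic, not stdout's actual module:

```typescript
// Fail fast if the wrapped function attempts a network request.
// This guards synchronous tools only; async tools would need an
// equivalent guard held for their whole lifetime.
function withNetworkGuard<T>(fn: () => T): T {
  const originalFetch = globalThis.fetch;
  globalThis.fetch = (() => {
    throw new Error('Unexpected outbound request during tool execution');
  }) as typeof fetch;
  try {
    return fn();
  } finally {
    globalThis.fetch = originalFetch;
  }
}

// Stand-in for a tool's pure logic
const formatJson = (raw: string): string =>
  JSON.stringify(JSON.parse(raw), null, 2);
```

It's crude — it doesn't catch `XMLHttpRequest`, WebSockets, or image beacons — but it turns part of the privacy claim into an automated assertion rather than a manual step.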
### Performance with large payloads
Browsers have memory limits. Formatting a 50MB JSON file client-side can block the UI if not handled carefully.
Approach:
- Use Web Workers for heavy parsing (when needed)
- Implement chunked processing for streaming-like behavior
- Profile with Chrome Performance tab during development
Trade-off: Not every edge case is optimized yet. The goal is "fast enough for daily use" with clear paths to improve.
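The chunked-processing idea, stripped to its core: work through a large input in slices, yielding to the event loop between slices so rendering and input handling aren't starved. A sketch, not stdout's actual implementation:

```typescript
// Process a large array in fixed-size chunks, yielding to the event
// loop between chunks so the UI thread can breathe.
async function processInChunks<T, R>(
  items: T[],
  fn: (item: T) => R,
  chunkSize = 1000,
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(fn(item));
    }
    // Let pending rendering and input events run before the next chunk
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
  return results;
}
```

For truly heavy parsing the same loop moves into a Web Worker, but even this main-thread version keeps a 50MB job from freezing the page outright.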
### Keeping dependencies lean
Each npm package adds bundle size, potential vulnerabilities, and maintenance overhead.
Strategy:
- Prefer native Web APIs when possible (e.g. `crypto.subtle` for hashing)
- Audit dependencies regularly with `pnpm audit` or `npm audit`
- Ask: "Does this tool need a library, or can it be 20 lines of pure JS?"
Result: The core app stays under 2MB gzipped, even with 59 tools.
### Testing strategy
- Unit tests (Jest) for pure logic: encoding, parsing, validation
- Manual UI testing for workflow: does the output feel immediate and predictable?
- Community feedback via GitHub issues as a form of integration testing
Lesson: Ship a minimal viable set first, then iterate based on real usage patterns.
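The pure-logic layer stays cheap to test because tools are plain functions. A representative round-trip check — the base64 helpers below are stand-ins written with Node's `Buffer` for brevity, not stdout's actual modules (the browser build would use `TextEncoder` with `btoa`/`atob`):

```typescript
// Stand-in encode/decode pair for a base64 tool's pure logic.
const encodeBase64 = (s: string): string =>
  Buffer.from(s, 'utf8').toString('base64');
const decodeBase64 = (s: string): string =>
  Buffer.from(s, 'base64').toString('utf8');

// The property a Jest unit test would assert: encoding round-trips.
const roundTrips = (s: string): boolean =>
  decodeBase64(encodeBase64(s)) === s;
```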
## What's next
Short-term:
- Improve accessibility (keyboard navigation, screen reader support)
- Add 5-10 most-requested tools from community feedback
- Document contribution workflow more clearly
Long-term experiments:
- WebAssembly for compute-heavy tasks (e.g. image processing) without leaving the client
- Encrypted, user-controlled sync for settings — optional, never automatic
- Plugin API for advanced users to add custom tools
None of these are commitments. They're directions I'm exploring publicly, and feedback will shape what actually gets built.
## Try it or contribute
Run locally:
```shell
git clone https://github.com/cminhho/stdout
cd stdout
pnpm install
pnpm dev
```
macOS via Homebrew:
```shell
brew install --cask cminhho/tap/stdout
```
If you spot a bug, have a tool request, or want to discuss architecture choices, GitHub issues and PRs are welcome.
## One question for you
What's one utility you currently use online that you'd feel more comfortable running entirely offline?
I'm prioritizing the next batch of tools based on what developers actually need — not what seems clever to build. Let me know in the comments.
About the author: I'm a software engineer based in Ho Chi Minh City, working on multi-country fintech platforms. I write about architecture, product strategy, and building tools that respect user privacy. Connect on GitHub or LinkedIn for discussions on local-first technologies.

