Table of Contents
Introduction
From Prompt to Binary Code
Is It Even Feasible?
Is It Worth It?
Things That Almost Everyone Gets Wron...
The entire history of programming languages is the history of moving away from binary.
Machine code → assembly → FORTRAN (1957) → everything since.
Each layer of abstraction exists because humans couldn't reason about the layer below it at scale.
Grace Hopper was literally laughed at for suggesting that programs could be written in English-like words instead of machine code.
The question isn't whether AI can generate binary — it's whether we'd be able to verify, debug, or trust what it generated.
Languages exist for human comprehension.
Great article and question!
Totally agree. Code is the only deterministic interface we have right now. I don't think we want to end up with natural language, which is inherently non-deterministic, as our only interface. Granted, it's much easier and everyone can use it, but a lot can go wrong with that approach.
Of course it is feasible.
Humans can do it. When I was a kid I had a book on Z80 assembly language, but no assembler. I wrote some short programs by hand, using the built-in BASIC on the ZX Spectrum to insert bytes straight into memory - bytes that I'd worked out on paper. It worked, but it would obviously become untenable pretty quickly for larger programs.
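The hand-assembly described here can be sketched in Python. The opcodes below are real Z80 encodings; the three-byte routine itself is just a made-up illustration:

```python
# Hand-assemble a tiny Z80 routine the way you would on paper:
# look up each opcode in the book, write down the bytes, then POKE
# them into memory (on a Spectrum, roughly:
#   FOR i=0 TO 2: POKE 32768+i, b(i): NEXT i).
program = bytes([
    0x3E, 0x07,  # LD A, 7   ; load 7 into the accumulator
    0xC9,        # RET       ; return to the caller (BASIC)
])

# The "binary" is just these raw bytes -- no assembler involved.
print(program.hex(" "))
```

The point of the anecdote holds: nothing here is magic, but every byte had to be verified by hand, which is exactly what stops scaling.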
The key word is determinism.
Any comparison between how people built software before (assembly) and now (high-level abstractions) is pointless, because determinism was present on both sides.
AI cannot be deterministic: the same input A might produce output B on one run and a different output B′ on another.
You can argue that, given a rigid input → output → refactor loop, AI can eventually get it right and produce deterministic binaries. But at what cost in tokens and time?
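The determinism point can be made concrete with Python's own source-to-bytecode pipeline. This is only a sketch: the built-in `compile` stands in for any traditional compiler:

```python
# A traditional compiler is a pure function: same source in, same bytes out.
src = "def add(a, b):\n    return a + b\n"

code1 = compile(src, "<generated>", "exec")
code2 = compile(src, "<generated>", "exec")

# Compiling the same source twice yields bit-for-bit identical bytecode --
# exactly the reproducibility a model sampling tokens cannot promise.
assert code1.co_code == code2.co_code
print("deterministic:", code1.co_code == code2.co_code)
```

An LLM asked to emit those bytes directly gives you no such guarantee from run to run; the compiler gives it to you for free.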
Exactly. Technically it’s absolutely feasible. However, things start getting tricky with large enterprise apps, or even mid-sized ones; AI can really struggle with those.
Yes, AI can technically generate binary directly because binary is just a sequence of bits that represent machine instructions. In theory, a trained AI model could produce executable binary code. However, in practice this approach is not very feasible. Even a small mistake in the bit sequence can break the entire program, making it difficult to debug and maintain.
For this reason, most AI systems generate higher-level programming code instead, which is then compiled into binary using traditional compilers. This process is more reliable, transparent, and easier for developers to review. Many organizations working in machine learning development, including Nadcab Labs, focus on AI-assisted code generation rather than direct binary creation.
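The "reliable, transparent, and easier to review" pipeline can be sketched in Python. This is an illustration, not anyone's actual system: the standard-library `dis` module plays the role of the tooling that lets humans inspect what the compiler produced:

```python
import dis

# Step 1: the AI emits high-level source that a human can read and review.
src = "def square(x):\n    return x * x\n"

# Step 2: a deterministic compiler turns it into low-level instructions.
module = compile(src, "<ai-generated>", "exec")

# Step 3: the low-level output stays inspectable, because we kept the source.
# Find the compiled function body among the module's constants.
square_code = next(c for c in module.co_consts if hasattr(c, "co_code"))
dis.dis(square_code)  # human-readable disassembly of the "binary"
```

Skip the source layer and step 3 disappears: all you have left is opaque bytes.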
Thanks for expanding on the point of this article perfectly. You’re spot on: even a single incorrectly flipped bit can cause catastrophic issues in the system - which is more than likely when building something at this level using probabilistic methods.
Abandoning the determinism that was forged over decades in favor of a probabilistic approach simply doesn’t make any sense.
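A one-bit flip really can invert meaning at the machine level. On the Z80 (mentioned earlier in the thread), the unconditional return `RET` (0xC9) and the conditional `RET Z` (0xC8) differ by a single bit; the opcodes below are real, the scenario is illustrative:

```python
RET = 0xC9            # Z80: return unconditionally
flipped = RET ^ 0x01  # flip the lowest bit
assert flipped == 0xC8  # Z80: RET Z -- return only if the zero flag is set

# One bit turned "always return" into "sometimes return": a silent,
# data-dependent bug of exactly the kind a compiler would never emit.
print(f"{RET:#04x} -> {flipped:#04x}")
```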
Great post and topic! My biggest problem with the entire concept is that LLMs are trained on natural language by design. No matter what you do beyond the dev -> prompt -> LLM flow, it still reasons with the same language humans do.
The only thing you could possibly gain from this theory is outright hiding the source layer, which as you pointed out is where all of the collaboration happens. Even if we get to a point in the future where code exists solely for AI (which is an entirely different debate), the LLM still needs natural language to function. The entire concept is built around translating intent expressed using human words.
Sure, one could argue that binary is technically a “language.” But then the cost explodes trying to train a model to be fluent in both human reasoning and machine-level instructions. At that point you’re basically rebuilding something compilers already solve extremely well—and have been for more than 70 years!
So unless the goal is to remove the human layer entirely, I’m not sure what practical advantage direct binary generation actually provides...
Totally. Even if we want to remove the human element from the equation, there’s no need to remove traditional source code. It just makes things way more complicated.
The idea of skipping code to generate binary feels like generating a pixel-perfect UI screenshot without any underlying design system or logic. It might 'work' in the short term, but you lose the 'Why' behind the 'How'. Software engineering isn't just about shipping executables; it’s about maintainable logic and human accountability. Without source code, we lose our most important tool: verification. We shouldn't trade transparency for a black-box output just because it feels faster.
Absolutely. And it’s not even actually faster, it just feels that way, just like you mentioned.