DEV Community

Dimension AI Technologies

Just What Is Rust, Anyway?

And why does everyone say it's hard?


Every mainstream language fits a mental slot. C is a systems language. Python is a runtime ecosystem that excels at orchestration. JavaScript is a browser language that escaped into the server. Rust doesn't fit the usual slots, though.

Where Python is hard to classify because it does so many things loosely, Rust is hard to classify because it does one thing with unusual rigour.

Developers who haven't used it have heard the same three adjectives: fast, safe, and difficult. That combination is not accidental. Understanding why those three things travel together is the key to understanding what Rust actually is.

This piece is the third in a series.

The first article argued that Python is best understood as a runtime ecosystem (alongside .NET and the JVM) rather than a language with a clean output artefact.

The second argued that C's importance derives not from the language itself but from the ABI — the universal binary interface that every other language targets at its boundaries.

Rust sits at the intersection of both arguments. It is a systems language that takes the C ABI seriously, and its design is most clearly understood once that context is established.


The problem Rust was built to solve

The dominant categories of serious software vulnerability — buffer overflows, use-after-free, dangling pointers, data races — are not programming errors in the usual sense. They are permitted operations in C and C++. The language allows them. The programmer is the last line of defence, and programmers are not reliable at scale.

This is not a minor concern. Microsoft disclosed in 2019 that roughly 70% of the CVEs it addressed in the preceding decade were memory safety issues. Google reported similar figures for the Chrome codebase. These are not exotic edge cases; they are the default failure mode of systems written in C and C++.

Languages that solved this earlier — Java, C#, Go — did so with garbage collectors and managed runtimes. That works, but the cost is real: non-deterministic pauses, memory overhead, and loss of control over how data is laid out. For application development those costs are usually acceptable. For kernels, databases, real-time systems, and network infrastructure they frequently are not.

| Approach | Examples | Cost of safety |
| --- | --- | --- |
| Human discipline | C, C++ | Vulnerabilities (CVEs) |
| Runtime (GC / interpreter) | Python, Java, C#, Go | Latency and memory overhead |
| Static analysis (type system) | Rust | Development time and compile time |

Rust proposes that the third row is achievable: memory safety enforced by the compiler, with no garbage collector, no managed runtime, and no runtime cost. The question is what that actually requires.


The organising insight: correctness moves into the type system

Every language makes a choice about when to catch errors. Dynamic languages catch them at runtime — when the programme is already running. Statically typed compiled languages catch type errors at compile time. Rust extends this principle as far as it can be extended: memory safety, aliasing rules, and concurrency correctness are all encoded in the type system and verified before the programme runs.

This is not merely a technical decision. It represents a different understanding of what a compiler is for. In most languages, the compiler translates code that looks correct into code that runs. In Rust, the compiler's job includes refusing to translate code that could behave incorrectly at runtime — even if the same code would compile without complaint in C or C++.

The Rust compiler is therefore closer in spirit to a formal verification tool than to a conventional translator. Ownership, borrowing, and lifetimes are not quirks of the language's syntax or design accidents. They are the mechanism by which the type system reasons about memory at compile time.


The three mechanisms

Ownership

Every value in memory has exactly one owner at any given time. When ownership is transferred to another binding, the original becomes invalid. The compiler enforces this statically.

```rust
let s = String::from("hello");
let t = s;                      // ownership moves to t
println!("{}", s);              // compile error: s is no longer valid
```

This is not a runtime check. The compiler refuses to produce this programme. The consequence is that double-free errors become structurally impossible: if only one binding owns the memory, only one thing can free it, and that happens automatically when the owner goes out of scope.
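The fix is rarely mysterious. There are two standard ways to satisfy the compiler here: copy the value explicitly, or lend a reference instead of moving ownership. A minimal sketch (illustrative, not from the snippet above):

```rust
fn main() {
    // Option 1: clone, so each binding owns its own allocation.
    let s = String::from("hello");
    let t = s.clone();
    println!("{} {}", s, t); // fine: s still owns its memory

    // Option 2: borrow, so ownership never moves at all.
    let u = String::from("world");
    let r = &u;
    println!("{} {}", u, r); // fine: a shared borrow does not invalidate u
}
```

The clone is an explicit cost the programmer opts into, which is the point: in Rust, a deep copy never happens silently.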

Borrowing

Rather than transferring ownership, code can borrow temporary access to a value. Rust permits either multiple simultaneous read-only borrows, or a single read-write borrow — but never both at once.

```rust
let mut v = vec![1, 2, 3];
let r = &v;                     // immutable borrow begins here
v.push(4);                      // compile error: cannot mutate while borrowed
println!("{:?}", r);
```

This rule is the compile-time equivalent of a readers-writer lock. It eliminates data races before the programme runs: if the compiler can see that two threads could hold conflicting access to the same data, it refuses to compile the code. In an era where every server runs on multi-core hardware and concurrent execution is the norm rather than the exception, this guarantee matters. C requires the programmer to reason about thread safety manually. Python sidesteps the problem with the Global Interpreter Lock, which prevents true parallelism. Rust encodes the constraint in the type system and enforces it at no runtime cost.
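The rule is less restrictive than it first appears, because borrows end at their last use rather than at the end of scope. Reordering the example above so the borrows finish before the mutation makes it compile:

```rust
fn main() {
    let mut v = vec![1, 2, 3];

    // Any number of shared (read-only) borrows may coexist...
    let a = &v;
    let b = &v;
    println!("{:?} {:?}", a, b);

    // ...and once the last shared borrow has been used for the final
    // time, mutation is permitted again.
    v.push(4);
    assert_eq!(v, vec![1, 2, 3, 4]);
}
```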

Lifetimes

The compiler tracks how long each reference remains valid. If a reference might outlive the value it points to, the programme does not compile.

```rust
fn first_word(s: &str) -> &str {
    // The returned slice borrows from s; the compiler verifies that
    // the reference cannot outlive the string it points into.
    s.split_whitespace().next().unwrap_or("")
}
```

Dangling pointers — references to memory that has already been freed — are structurally prevented. The compiler requires that the proof of safety be expressible in terms it can verify; if it cannot be expressed, the code does not compile.
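When the compiler cannot infer how an output lifetime relates to the inputs, the relationship is written explicitly. A standard textbook illustration (not from the article's own examples) is a function returning whichever of two string slices is longer:

```rust
// The lifetime parameter 'a states the contract: the returned reference
// is valid only as long as *both* inputs are, because it may point into
// either one.
fn longest<'a>(x: &'a str, y: &'a str) -> &'a str {
    if x.len() > y.len() { x } else { y }
}

fn main() {
    let a = String::from("longer string");
    let b = String::from("short");
    println!("{}", longest(&a, &b)); // prints "longer string"
}
```

The annotation does not change what the code does at runtime; it gives the compiler the terms in which the safety proof is expressed.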

Together, these three mechanisms mean that a Rust programme which compiles cannot exhibit the class of memory error that accounts for the majority of CVEs in C and C++ codebases. The bugs are not caught at runtime. They are made structurally unavailable.
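The concurrency guarantee follows from the same rules. As a sketch using the standard library's `Arc` and `Mutex`: shared mutable state must live in types the compiler can verify are thread-safe, and handing a plain mutable reference to a second thread simply does not compile.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Arc gives shared ownership across threads; Mutex gives exclusive
    // access to the value inside. Both are checked at compile time via
    // the Send and Sync marker traits.
    let counter = Arc::new(Mutex::new(0));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
    assert_eq!(*counter.lock().unwrap(), 4);
}
```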


Why Rust is difficult

The difficulty follows directly from the mechanisms above. Most languages ask the programmer to write code that runs. Rust asks the programmer to write code the compiler can prove is safe — and that proof must be expressed explicitly enough for a static analyser to verify it.

Developers trained in languages that hide memory management have never been required to reason about ownership, aliasing, or reference lifetimes. These concerns exist in every language; they are simply invisible. Rust makes them explicit and mandatory.

This reframing matters in practice. The borrow checker — the component of the compiler that enforces borrowing and lifetime rules — is a common source of frustration for new Rust programmers. Experienced Rust programmers tend to describe the same component as a collaborator. The difference is not in the tool; it is in whether the programmer understands the exchange being made. The difficulty is the price of the guarantee. Once code compiles, an entire class of runtime failure has been ruled out by construction.


Why Rust compiles slowly

Rust compiles slowly relative to most comparable languages. The complaint is common, it is accurate, and it has a specific cause.

Most compilers translate code. Rust's compiler also performs static analysis: it verifies ownership transfers, checks borrow constraints, validates lifetimes across the entire programme, and confirms the absence of data races, all before emitting machine code. Monomorphisation of generics adds further work on top of that analysis. None of this has a direct equivalent in C, Go, or Java compilation.

| Language | Memory/concurrency checking at compile time |
| --- | --- |
| C / C++ | None — left to the programmer |
| Go | Escape analysis (basic); race detector available at runtime |
| Java / C# | Null safety (partial); no memory ownership model |
| Rust | Ownership, borrows, lifetimes, and data races — all static |

The compilation overhead is not a toolchain deficiency. It is the cost of the guarantees being purchased. The programme that eventually runs has already been formally checked in ways that no other mainstream systems language provides.


More on Rust and C

The C article in this series argued that C's importance derives from the ABI — the binary interface that every language targets when it needs to call outside itself. Rust in the Linux kernel still speaks C at the boundary, building safer implementations behind a C-compatible façade.

We need to say more about that observation here, because it identifies what makes Rust different from every previous attempt at a memory-safe systems language.

Java and C# solve memory safety, but they require a managed runtime. That runtime sits between the language and the machine. It manages its own heap, controls its own memory, and cannot easily share ownership with C code. Calling a C library from Java requires a bridge — JNI — that is notoriously difficult and imposes overhead. The managed runtime is the reason these languages never became genuine C replacements in systems programming: the boundary between the managed world and the unmanaged world is too expensive to cross.

Rust has no managed runtime. It has no garbage collector. Its memory layout is directly controllable. It can call C code, and be called by C code, through the same ABI mechanism that all other languages use — but without a runtime in the middle.

```rust
use std::os::raw::c_char;

extern "C" {
    // Declares a C function; calls are resolved directly via the platform ABI.
    fn strlen(s: *const c_char) -> usize;
}
```

That single declaration is enough for Rust to call any C function with no marshalling layer and no overhead; the call itself must sit inside an `unsafe` block, since the compiler cannot verify code it did not compile, but the boundary is marked rather than bridged. The reverse is equally straightforward: a Rust library can expose a C-compatible interface and be consumed by Python, Ruby, Go, or any other language as though it were a C library.
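The reverse direction can be sketched in a few lines: a Rust function exported with the C calling convention. (The function name `add` here is illustrative; in a real library, a `#[no_mangle]` attribute, spelled `#[unsafe(no_mangle)]` in the 2024 edition, would additionally keep the symbol name unmangled so external linkers can find it.)

```rust
// A Rust function with the C calling convention: any language with a
// C FFI can call this as though it came from a C library.
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    assert_eq!(add(2, 3), 5); // callable from Rust too, of course
}
```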

Rust therefore occupies a position no other memory-safe language has held: it can replace C module-by-module, function-by-function, without replacing the interfaces that the rest of the system depends on. The Linux kernel illustrates this in practice. Rust has been accepted into the kernel — first experimentally, then as a first-class language alongside C in December 2025 — and Rust modules interoperate with three decades of existing C kernel code through C-compatible interfaces. Rust is not competing with the C ABI. It was designed to work within it.


Where Rust is used

Operating systems, databases (TiKV, sled), web infrastructure (Cloudflare's core networking stack, Amazon's Firecracker virtualisation engine), embedded systems, and the Linux kernel. The common factor is consistent across all of them: memory safety without a managed runtime.

One data point is worth including. The Stack Overflow developer survey found Rust the most admired language for nine consecutive years, from 2016 to 2024. The gap between admiration and adoption has been wide and consistent, which is diagnostic rather than merely interesting: developers recognise the value of the guarantees but find the onboarding cost high. That gap has been closing since roughly 2022, as tooling, documentation, and the Cargo package manager have matured, and as the consequences of decades of memory-unsafe systems code have become harder to ignore.


What Rust actually is

Rust is a systems language that moved the programmer's memory discipline into the compiler. That single decision accounts for everything unusual about it: the strict type system, the slow compilation, the learning curve, and the guarantees.

The language is internally consistent once the central ambition is clear. The borrow checker is not arbitrary strictness; it is the mechanism by which ownership rules are enforced. The slow compilation is not a tooling failure; it is the cost of static analysis that other compilers do not perform. The difficulty is not poor design; it is the price of eliminating at compile time the errors that every other systems language discovers at runtime — or in a CVE report.

Rust is not a general-purpose language competing with Python or Go for application development. It is the first plausible answer, after fifty years, to the question of whether C-level performance and memory safety can coexist. The answer, it turns out, is yes — provided the compiler is willing to do enough of the work, and the programmer is willing to let it. It is to C's credit that it took half a century for anyone to work this out. It is an enormous advance, whose benefits are still in their infancy. Give it another 50 years to play out?
