Berke Atalay

My first Rust HTTP API

I’ve been using Rust in production for a while now, mostly for backend services.

At some point it started to feel a bit one-sided: I read other people’s blog posts, used their crates, copied their patterns… but never really shared anything back. This is a small attempt to fix that.

What follows isn’t about some fancy system at work. It’s about something much smaller: the very first HTTP API I built in Rust, back when I was still figuring out how all the pieces fit together.

It’s a tiny service. It doesn’t do much. But it’s the point where Rust stopped being a “future maybe language” and became something I could actually ship with.


Starting from nothing

I started with the usual ritual:

cargo new rust-hello-api
cd rust-hello-api

Cargo created the standard layout:

  • Cargo.toml
  • src/main.rs

The default main.rs prints “Hello, world!”. I ran it once, mostly to see that everything was wired correctly:

cargo run

Text showed up in the terminal, as expected. Nothing exciting yet.

For HTTP I decided to use actix-web. Not because I had deeply evaluated all options, but because it showed up often and seemed to work well for the kind of services I had in mind.

I added it to Cargo.toml:

[dependencies]
actix-web = "4"

That was the whole setup.


“Hello, world” over HTTP

The first goal was simple: instead of printing to stdout, I wanted Rust to answer an HTTP request.

I replaced main.rs with this:

use actix_web::{get, App, HttpResponse, HttpServer, Responder};

#[get("/")]
async fn hello() -> impl Responder {
    HttpResponse::Ok().body("Hello from Rust")
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    println!("Starting server on http://localhost:8080");

    HttpServer::new(|| {
        App::new()
            .service(hello)
    })
    .bind(("0.0.0.0", 8080))?
    .run()
    .await
}

There’s a bit more ceremony than a scripting-language hello world, but the structure is straightforward:

  • hello() is the handler for /.
  • HttpServer::new and App::new wire it up.
  • #[actix_web::main] gives me an async main.

I ran it:

cargo run

Then from another terminal:

curl http://localhost:8080/

The response:

Hello from Rust

That was the first small click. It’s still just a string, but now it’s traveling over TCP, through a router, back to curl. The language wasn’t just something I was experimenting with in isolation anymore. It was speaking HTTP.


Adding a bit of JSON

Plain text is fine for a first check, but backend work rarely stops there. I wanted to see how painful it would be to return structured data.

I pulled in serde:

[dependencies]
actix-web = "4"
serde = { version = "1", features = ["derive"] }
serde_json = "1"

Then I changed the handler:

use actix_web::{get, App, HttpResponse, HttpServer, Responder};
use serde::Serialize;

#[derive(Serialize)]
struct HelloResponse {
    message: String,
    language: String,
}

#[get("/")]
async fn hello() -> impl Responder {
    let body = HelloResponse {
        message: "Hello from Rust".to_string(),
        language: "rust".to_string(),
    };

    HttpResponse::Ok().json(body)
}
...

serde does the boring work:

  • I derive Serialize on the struct.
  • .json(body) turns it into a JSON response with the right headers.

The response changed to:

{"message":"Hello from Rust","language":"rust"}

Nothing groundbreaking, but there’s a nice feeling when the types line up. If I add another field to HelloResponse, the compiler will force me to think about it. There’s no invisible “bag of keys” being passed around.
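
For example (the version field here is made up, purely to show the point): if I decide HelloResponse should also carry a version, every place that builds one stops compiling until I deal with it:

use actix_web::{get, HttpResponse, Responder};
use serde::Serialize;

#[derive(Serialize)]
struct HelloResponse {
    message: String,
    language: String,
    version: String, // hypothetical new field, added only to illustrate
}

#[get("/")]
async fn hello() -> impl Responder {
    // If the `version` line below is missing, the compiler rejects the
    // struct literal with a "missing field `version`" error.
    let body = HelloResponse {
        message: "Hello from Rust".to_string(),
        language: "rust".to_string(),
        version: "0.1.0".to_string(),
    };

    HttpResponse::Ok().json(body)
}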


A tiny bit of state

An endpoint that always returns the same JSON is fine for demos, but it doesn’t tell you much about how working with state feels.

For this part I wanted just enough state to see:

  • how shared data looks
  • how Actix passes state into handlers
  • how Rust treats concurrency in a simple setup

I went with a small “note” type:

use serde::{Deserialize, Serialize};

#[derive(Debug, Serialize, Deserialize, Clone)]
struct Note {
    id: u32,
    text: String,
}

Then some shared state:

use std::sync::Mutex;

struct AppState {
    notes: Mutex<Vec<Note>>,
}

This isn’t meant to be a scalable design. It’s just a safe place to hang a Vec so multiple requests can touch it without the program tearing itself apart.

In main I seeded it with a couple of notes:

use actix_web::{web, App, HttpServer};
use std::env;
use std::sync::Mutex;

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    let port: u16 = env::var("PORT")
        .unwrap_or_else(|_| "8080".to_string())
        .parse()
        .expect("PORT must be a valid u16");

    let initial_notes = vec![
        Note { id: 1, text: "my first rust api".to_string() },
        Note { id: 2, text: "it actually works".to_string() },
    ];

    let state = web::Data::new(AppState {
        notes: Mutex::new(initial_notes),
    });

    println!("Starting server on http://localhost:{port}");

    HttpServer::new(move || {
        App::new()
            .app_data(state.clone())
            .service(hello)
            .service(list_notes)
            .service(create_note)
    })
    .bind(("0.0.0.0", port))?
    .run()
    .await
}

The interesting part is web::Data and Mutex:

  • web::Data is essentially an Arc around the state, so the state.clone() in the HttpServer::new closure is cheap and every handler sees the same underlying data.
  • Mutex is what makes mutating that data safe when requests run on multiple worker threads; without it, the compiler simply refuses to share it.

Then I added two endpoints:

use actix_web::{get, post, web, HttpResponse, Responder};
use serde::Deserialize;

#[get("/notes")]
async fn list_notes(data: web::Data<AppState>) -> impl Responder {
    let notes = data.notes.lock().unwrap();
    HttpResponse::Ok().json(&*notes)
}

#[derive(Debug, Deserialize)]
struct CreateNoteRequest {
    text: String,
}

#[post("/notes")]
async fn create_note(
    data: web::Data<AppState>,
    payload: web::Json<CreateNoteRequest>,
) -> impl Responder {
    let mut notes = data.notes.lock().unwrap();

    let new_id = notes.len() as u32 + 1;
    let note = Note {
        id: new_id,
        text: payload.text.clone(),
    };

    notes.push(note.clone());

    HttpResponse::Created().json(note)
}


Now the API could:

curl http://localhost:8080/notes
# [{"id":1,"text":"my first rust api"}, {"id":2,"text":"it actually works"}]

curl -X POST http://localhost:8080/notes \
  -H "Content-Type: application/json" \
  -d '{"text":"hello from curl"}'
# {"id":3,"text":"hello from curl"}

No database, no migrations, no ORMs. Just enough logic to exercise:

  • passing JSON in
  • turning JSON back out
  • mutating some shared state

The Mutex feels a bit heavy-handed for a toy list, but it’s a good reminder that “shared mutable” is not something Rust hides from you. You have to say it out loud.
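
If I ever cared about GET /notes scaling better than POST /notes, one variation (a sketch, not what I actually ran) would be to swap the Mutex for a std::sync::RwLock so reads don’t block each other. Note and CreateNoteRequest are the same types as above:

use actix_web::{get, post, web, HttpResponse, Responder};
use std::sync::RwLock;

struct AppState {
    notes: RwLock<Vec<Note>>,
}

#[get("/notes")]
async fn list_notes(data: web::Data<AppState>) -> impl Responder {
    // Any number of readers can hold the lock at the same time.
    let notes = data.notes.read().unwrap();
    HttpResponse::Ok().json(&*notes)
}

#[post("/notes")]
async fn create_note(
    data: web::Data<AppState>,
    payload: web::Json<CreateNoteRequest>,
) -> impl Responder {
    // Writers still get exclusive access.
    let mut notes = data.notes.write().unwrap();
    let note = Note {
        id: notes.len() as u32 + 1,
        text: payload.text.clone(),
    };
    notes.push(note.clone());
    HttpResponse::Created().json(note)
}

The shape of the handlers barely changes; only the lock calls say whether they read or write.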


A small step towards reality

One last thing I added before calling it a night was a tiny bit of configuration.

Hard-coding port 8080 works for the first test, but it gets annoying quickly. Reading it from the environment is enough to make it feel more like something I could containerize or move between environments:

let port: u16 = env::var("PORT")
    .unwrap_or_else(|_| "8080".to_string())
    .parse()
    .expect("PORT must be a valid u16");

It’s only a few lines of code, but once you do that, things like “run in Docker behind a reverse proxy” stop being hypothetical.
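
After that, moving the service to another port is just an environment variable:

PORT=3000 cargo run

# in another terminal
curl http://localhost:3000/notes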


What this little API gave me

Looking back, this is not a project I would show as a portfolio piece. It’s small, it uses a Mutex<Vec<_>>, and there’s no database.

But it was an important checkpoint.

A few things I took away from it:

  • Rust doesn’t have to be tackled all at once. It’s perfectly fine to grow something in small, understandable layers: text → JSON → state → config.
  • The type system actually helps. It’s not just there for blog posts. When you add or change fields in a struct, the compiler tells you where you forgot to think it through.
  • Concurrency is not invisible. The moment you reach for shared state, Rust makes you acknowledge what you’re doing. Even in a small toy server, that’s a good habit.

Since then I’ve used Rust for more serious services, with real databases, proper error types, logging, metrics, the usual production checklist. But this tiny HTTP API is where it stopped being an abstract “nice language” and turned into something I could trust with real work.

At some point I’ll probably write about the next step in that evolution: replacing the in-memory list with a real database, and what that changes in the shape of the code. For now, if you’ve been Rust-curious and haven’t built something over HTTP yet, a small service like this is a pretty good way to start.
