Martin André

Rust vs Go - Load testing webservers (>400k req/s)

TL;DR: Go can reach 270k req/s, while Rust can hit 400k req/s.

Go server

Let's get straight to business with a minimal server sample using httprouter.

package main

import (
    "fmt"
    "log"
    "net/http"

    "github.com/julienschmidt/httprouter"
)

func Index(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
    fmt.Fprint(w, "Welcome!")
}

func main() {
    router := httprouter.New()
    router.GET("/", Index)

    log.Fatal(http.ListenAndServe(":8080", router))
}

Rust server

For the Rust server we'll use Actix as our framework of choice; below is the minimal sample.

use actix_web::{web, Responder, middleware, App, HttpServer};

async fn index() -> impl Responder {
    "Welcome!"
}

fn routes(cfg: &mut web::ServiceConfig) {
    // Serve the root path so it matches the Go sample and the wrk benchmark URL.
    cfg.route("/", web::get().to(index));
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    let serv = HttpServer::new(move || {
        App::new()
            .wrap(middleware::Compress::default())
            .configure(routes)
    });
    serv.bind("127.0.0.1:8080")?
        .run()
        .await
}

Benchmark

To put both servers under heavy load, we're going to use wrk.

The benchmarks are performed on an i7-8750H (6 cores / 12 threads).

wrk -t12 -c1000 -d15s http://127.0.0.1:8080/

Results

Rust:

Running 15s test @ http://127.0.0.1:8080/
  12 threads and 1000 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     3.73ms    4.70ms  57.76ms   85.86%
    Req/Sec    33.66k     5.80k   69.35k    71.65%
  6039978 requests in 15.10s, 714.26MB read
Requests/sec: 400095.92
Transfer/sec:     47.31MB

Go:

Running 15s test @ http://127.0.0.1:8080/
  12 threads and 1000 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     5.03ms    6.11ms 102.78ms   86.66%
    Req/Sec    22.81k     4.77k   53.73k    71.19%
  4087276 requests in 15.10s, 487.24MB read
Requests/sec: 270691.36
Transfer/sec:     32.27MB

The results speak for themselves: roughly 400,000 req/s vs 270,000 req/s for Rust and Go respectively.

Conclusion

While Go might be easier to write and faster to compile than Rust, it's still the slower of the two at serving requests.

If you're hesitating, let me give you this advice: use Rust if you want raw speed, otherwise go with Go.

Cover image from DZone.

Top comments (11)

Romain Dauby

I played with the code on my computer and got better performance with Go httprouter and even better results with Go fasthttp: gist.github.com/r0mdau/ac0f416d230...
Conclusion: whatever the language, take time to understand how the framework you use works and pick what fits best for you, your team, and your project.

Gleb Pomykalov

You need to use the Rust release target instead of debug.
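
For anyone following along, this is roughly how you build and run it (assuming a standard Cargo project, so the optimized binary is what gets benchmarked):

cargo run --release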

Romain Dauby

Thank you for the comment; I edited the results. 1: fasthttp, 2: actix, 3: httprouter

Gleb Pomykalov

There is still one issue with the benchmark. FastHTTP doesn't perform any path matching against the route template, so it has an advantage there. I'd suggest using a raw hyper benchmark for Rust instead of actix.

Gleb Pomykalov

And please also enable LTO in Cargo.toml:

[profile.release]
lto = true
Gleb Pomykalov • Edited

This hyper code responds with a body and headers of exactly the same size as fasthttp. On my computer, with LTO enabled, it performs much faster than Go's fasthttp.

use std::{convert::Infallible, net::SocketAddr};
use hyper::{Body, Request, Response, Server};
use hyper::service::{make_service_fn, service_fn};
use hyper::header::{CONTENT_TYPE, CONTENT_LENGTH, SERVER};

const RESP: &str = "Welcome!";

async fn handle(_: Request<Body>) -> Result<Response<Body>, Infallible> {
    let mut resp = Response::new(RESP.into());

    resp
        .headers_mut()
        .insert(SERVER, "rusthttp".parse().unwrap());
    resp
        .headers_mut()
        .insert(CONTENT_TYPE, "application/json".parse().unwrap());
    resp
        .headers_mut()
        .insert(CONTENT_LENGTH, RESP.len().to_string().parse().unwrap());

    Ok(resp)
}

#[tokio::main]
async fn main() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 8080));

    let make_svc = make_service_fn(|_conn| async {
        Ok::<_, Infallible>(service_fn(handle))
    });

    let server = Server::bind(&addr)
        .serve(make_svc);

    if let Err(e) = server.await {
        eprintln!("server error: {}", e);
    }
}

fasthttp:

Running 20s test @ http://127.0.0.1:8080/
  12 threads and 1000 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     2.24ms  342.10us   8.60ms   70.51%
    Req/Sec    10.70k     6.59k   29.73k    55.90%
  2128672 requests in 20.03s, 270.00MB read
  Socket errors: connect 755, read 87, write 0, timeout 0
Requests/sec: 106253.60
Transfer/sec:     13.48MB

hyper:

Running 20s test @ http://127.0.0.1:8080/
  12 threads and 1000 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.52ms  294.96us  18.92ms   90.85%
    Req/Sec    14.30k     7.62k   31.12k    55.59%
  3146777 requests in 20.10s, 399.13MB read
  Socket errors: connect 755, read 96, write 0, timeout 0
Requests/sec: 156542.40
Transfer/sec:     19.86MB
Romain Dauby

Updated my gist, adding hyper, routing in fasthttp, and documenting my go and rustc versions. I still find fasthttp the best performer with this code: gist.github.com/r0mdau/ac0f416d230...

Pranay Pratyush

Seems a bit unfair to pit net/http against actix. Try fasthttp instead.
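
For context, a minimal fasthttp server comparable to the httprouter sample above could look roughly like this (a sketch on my part, not the code from the benchmarks or the gist):

package main

import (
    "fmt"
    "log"

    "github.com/valyala/fasthttp"
)

// handler writes the same "Welcome!" body as the other samples.
// fasthttp.RequestCtx implements io.Writer, so fmt.Fprint works on it.
func handler(ctx *fasthttp.RequestCtx) {
    fmt.Fprint(ctx, "Welcome!")
}

func main() {
    // fasthttp avoids net/http's per-request allocations, which is where
    // much of its speed advantage comes from.
    log.Fatal(fasthttp.ListenAndServe(":8080", handler))
}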

Paul Crane

Am I missing something? The Rust version uses async/await while the Go version does not use the equivalent (goroutines).

Steve Pryde

Are you sure? It seems to me that the ListenAndServe() method spawns goroutines to handle incoming requests.
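
To make that concrete, the accept loop underneath ListenAndServe does roughly the following (a paraphrased sketch, not the actual standard-library source; handleConnection is a stand-in name for net/http's internal per-connection handling):

package main

import (
    "log"
    "net"
)

// handleConnection stands in for what net/http does with each connection:
// parse the request, invoke the handler, write the response.
func handleConnection(conn net.Conn) {
    defer conn.Close()
    // ... request handling would happen here ...
}

func main() {
    listener, err := net.Listen("tcp", ":8080")
    if err != nil {
        log.Fatal(err)
    }
    for {
        conn, err := listener.Accept()
        if err != nil {
            log.Fatal(err)
        }
        // Each accepted connection gets its own goroutine, which is
        // essentially how ListenAndServe handles requests concurrently.
        go handleConnection(conn)
    }
}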

We should expect a performance difference for such a simple example. However, most real-world web apps are not so simple, and differences in db queries etc. could easily outweigh the choice of language.

I think Rust is almost always going to be faster than Go if both are equally optimised, simply because it is somewhat closer to the metal. But how does the effort and expertise required to achieve such optimisation compare between the two languages? However, I don't think Rust should be seen as Go's primary competitor. Rather, I think we should compare it with Java, Node.js, Python, Ruby, and C#. I expect Go would be quite competitive with all of those.

Let's also remember other factors like ease of maintaining a large codebase and hiring pool.

That said, I prefer Rust for its safety guarantees. The ease with which you can end up with data races or null pointers in Go just makes me nervous. I do see its appeal though. Simplicity is a worthy design goal.

Paul Crane • Edited

You are right. I meant my comment to come off as more of a question than a statement, but I kind of failed there. I was on my phone and couldn't check the source code for the Go version. You answered my question thoroughly, though. Thanks for the reply and article!