
Discussion on: Rust vs Go - Load testing webserv (>400k req/s)

Romain Dauby

I played with the code on my computer and got better performance with Go httprouter, and even better results with Go fasthttp: gist.github.com/r0mdau/ac0f416d230...
Conclusion: whatever the language, take the time to understand how the framework you use works, and pick what fits best for you, your team, and your project.
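
For readers who don't open the gist, here is a minimal sketch of the kind of plain fasthttp server being compared (assuming a single handler that returns the same "Welcome!" body and Content-Type seen later in this thread; the Server header value is a placeholder, and this is not the gist's exact code):

package main

import (
	"log"

	"github.com/valyala/fasthttp"
)

// handler writes a fixed body with Content-Type and Server headers;
// note there is no routing layer in front of it.
func handler(ctx *fasthttp.RequestCtx) {
	ctx.Response.Header.Set("Server", "gohttp") // placeholder value
	ctx.SetContentType("application/json")
	ctx.SetBodyString("Welcome!")
}

func main() {
	// Every request, whatever the path, hits the same handler.
	if err := fasthttp.ListenAndServe("127.0.0.1:8080", handler); err != nil {
		log.Fatalf("server error: %v", err)
	}
}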

Gleb Pomykalov

You should have used the Rust release target instead of the debug build.

Romain Dauby

Thank you for the comment; I edited the results. The ranking is now: 1. fasthttp, 2. actix, 3. httprouter.
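
As a rough illustration of the third entry in that ranking, here is a hedged sketch of a Go httprouter server returning the same fixed response (the route path and header values are assumptions, not the gist's exact code):

package main

import (
	"log"
	"net/http"

	"github.com/julienschmidt/httprouter"
)

// index writes the same fixed body and headers as the other test servers.
func index(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
	w.Header().Set("Server", "gohttp") // placeholder value
	w.Header().Set("Content-Type", "application/json")
	w.Write([]byte("Welcome!"))
}

func main() {
	// httprouter matches the request path against registered routes
	// before dispatching to the handler.
	router := httprouter.New()
	router.GET("/", index)

	log.Fatal(http.ListenAndServe("127.0.0.1:8080", router))
}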

Gleb Pomykalov

There is still one issue with the benchmark: fasthttp doesn't perform any path matching against the route template, so it has an advantage. I'd suggest using a raw hyper benchmark for Rust instead of actix.

Gleb Pomykalov

And please also enable LTO (link-time optimization) in Cargo.toml:

[profile.release]
lto = true
Gleb Pomykalov

This hyper code responds with a body and headers of exactly the same size as fasthttp's. On my computer, with LTO enabled, it performs much faster than Go fasthttp.

use std::{convert::Infallible, net::SocketAddr};
use hyper::{Body, Request, Response, Server};
use hyper::service::{make_service_fn, service_fn};
use hyper::header::{CONTENT_TYPE, CONTENT_LENGTH, SERVER};

const RESP: &str = "Welcome!";

// Respond with a fixed body and the same Server, Content-Type, and
// Content-Length headers as the fasthttp server, so the payloads match.
async fn handle(_: Request<Body>) -> Result<Response<Body>, Infallible> {
    let mut resp = Response::new(RESP.into());

    resp.headers_mut()
        .insert(SERVER, "rusthttp".parse().unwrap());
    resp.headers_mut()
        .insert(CONTENT_TYPE, "application/json".parse().unwrap());
    resp.headers_mut()
        .insert(CONTENT_LENGTH, RESP.len().to_string().parse().unwrap());

    Ok(resp)
}

#[tokio::main]
async fn main() {
    let addr = SocketAddr::from(([127, 0, 0, 1], 8080));

    let make_svc = make_service_fn(|_conn| async {
        Ok::<_, Infallible>(service_fn(handle))
    });

    let server = Server::bind(&addr)
        .serve(make_svc);

    if let Err(e) = server.await {
        eprintln!("server error: {}", e);
    }
}

fasthttp:

Running 20s test @ http://127.0.0.1:8080/
  12 threads and 1000 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     2.24ms  342.10us   8.60ms   70.51%
    Req/Sec    10.70k     6.59k   29.73k    55.90%
  2128672 requests in 20.03s, 270.00MB read
  Socket errors: connect 755, read 87, write 0, timeout 0
Requests/sec: 106253.60
Transfer/sec:     13.48MB

hyper:

Running 20s test @ http://127.0.0.1:8080/
  12 threads and 1000 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.52ms  294.96us  18.92ms   90.85%
    Req/Sec    14.30k     7.62k   31.12k    55.59%
  3146777 requests in 20.10s, 399.13MB read
  Socket errors: connect 755, read 96, write 0, timeout 0
Requests/sec: 156542.40
Transfer/sec:     19.86MB
Romain Dauby

I updated my gist, adding hyper, adding routing to fasthttp, and noting my Go and rustc versions in the doc. I still find fasthttp the best performer with this code: gist.github.com/r0mdau/ac0f416d230...
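
For anyone reproducing the updated gist, "routing in fasthttp" is commonly done with the fasthttp/router package; here is a sketch under that assumption (not the gist's exact code):

package main

import (
	"log"

	"github.com/fasthttp/router"
	"github.com/valyala/fasthttp"
)

// index returns the same fixed response as the plain fasthttp handler.
func index(ctx *fasthttp.RequestCtx) {
	ctx.SetContentType("application/json")
	ctx.SetBodyString("Welcome!")
}

func main() {
	// With a router in front, fasthttp now pays for path matching on each
	// request, as actix and httprouter already do.
	r := router.New()
	r.GET("/", index)

	if err := fasthttp.ListenAndServe("127.0.0.1:8080", r.Handler); err != nil {
		log.Fatalf("server error: %v", err)
	}
}

With the router in place, the Go and Rust servers pay a comparable per-request routing cost, which makes the comparison closer to like-for-like.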