Four WebSocket libraries are tested today:

- lxzan/gws (github.com/lxzan/gws)
- gorilla/websocket (github.com/gorilla/websocket)
- nhooyr/websocket (github.com/nhooyr/websocket)
- gobwas/ws (github.com/gobwas/ws)
Env
WebSocket Protocol
Correctness comes before performance, so the WebSocket protocol implementation is tested first. Each package runs with its default configuration, and compression is disabled for this test to save time (a minimal echo-server sketch is shown after the results table).
Although gorilla/websocket and nhooyr/websocket claim to pass all autobahn-testsuite cases, the results below suggest that a full pass may require some additional code from the developer.
- command
docker run -it --rm \
-v ${PWD}/config:/config \
-v ${PWD}/reports:/reports \
crossbario/autobahn-testsuite \
wstest -m fuzzingclient -s /config/fuzzingclient.json
- result
Package | Pass | Info | Non-Strict | Unclean | Failed
---|---|---|---|---|---
lxzan/gws | 294 | 3 | 4 | 0 | 0
gorilla/websocket | 223 | 3 | 0 | 85 | 75
nhooyr/websocket | 173 | 3 | 0 | 0 | 125
gobwas/ws | 138 | 3 | 10 | 0 | 150
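For reference, each server under test is essentially a plain echo endpoint running with default options. The post does not include the harness code, so the following is only a sketch of such a server, written here against gorilla/websocket with compression explicitly disabled to match the test setup (the port and route are assumptions taken from the tcpkali commands below):

```go
package main

import (
	"log"
	"net/http"

	"github.com/gorilla/websocket"
)

// Compression is explicitly disabled, matching the test setup described above.
var upgrader = websocket.Upgrader{EnableCompression: false}

// connect upgrades the HTTP request and echoes every message back unchanged.
func connect(w http.ResponseWriter, r *http.Request) {
	c, err := upgrader.Upgrade(w, r, nil)
	if err != nil {
		return
	}
	defer c.Close()
	for {
		mt, msg, err := c.ReadMessage()
		if err != nil {
			return
		}
		if err := c.WriteMessage(mt, msg); err != nil {
			return
		}
	}
}

func main() {
	http.HandleFunc("/connect", connect)
	log.Fatal(http.ListenAndServe(":8001", nil))
}
```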
RPS
// 1000 connections, 500 messages/second, 1000 Byte Payload
tcpkali -c 1000 --connect-rate 500 -r 500 -T 30s -f assets/1K.txt --ws 127.0.0.1:${port}/connect
- gws
Destination: [127.0.0.1]:8000
Interface lo address [127.0.0.1]:0
Using interface lo to connect to [127.0.0.1]:8000
Ramped up to 1000 connections.
Total data sent: 12919.8 MiB (13547411965 bytes)
Total data received: 12854.5 MiB (13478908970 bytes)
Bandwidth per channel: 7.178⇅ Mbps (897.2 kBps)
Aggregate bandwidth: 3594.175↓, 3612.441↑ Mbps
Packet rate estimate: 316194.9↓, 581166.7↑ (3↓, 2↑ TCP MSS/op)
Test duration: 30.0017 s.
- gorilla
Destination: [127.0.0.1]:8001
Interface lo address [127.0.0.1]:0
Using interface lo to connect to [127.0.0.1]:8001
Ramped up to 1000 connections.
Total data sent: 7077.0 MiB (7420776528 bytes)
Total data received: 7089.8 MiB (7434174595 bytes)
Bandwidth per channel: 3.961⇅ Mbps (495.1 kBps)
Aggregate bandwidth: 1982.319↓, 1978.746↑ Mbps
Packet rate estimate: 272613.9↓, 173441.2↑ (2↓, 12↑ TCP MSS/op)
Test duration: 30.0019 s.
- nhooyr
Destination: [127.0.0.1]:8002
Interface lo address [127.0.0.1]:0
Using interface lo to connect to [127.0.0.1]:8002
Ramped up to 1000 connections.
Total data sent: 5103.5 MiB (5351431830 bytes)
Total data received: 5140.6 MiB (5390317539 bytes)
Bandwidth per channel: 2.856⇅ Mbps (357.0 kBps)
Aggregate bandwidth: 1437.359↓, 1426.990↑ Mbps
Packet rate estimate: 135048.1↓, 124004.1↑ (1↓, 14↑ TCP MSS/op)
Test duration: 30.0012 s.
- gobwas
Destination: [127.0.0.1]:8003
Interface lo address [127.0.0.1]:0
Using interface lo to connect to [127.0.0.1]:8003
Ramped up to 1000 connections.
Total data sent: 3364.6 MiB (3528061499 bytes)
Total data received: 3440.3 MiB (3607388324 bytes)
Bandwidth per channel: 1.893⇅ Mbps (236.7 kBps)
Aggregate bandwidth: 961.961↓, 940.808↑ Mbps
Packet rate estimate: 89305.6↓, 84530.8↑ (1↓, 18↑ TCP MSS/op)
Test duration: 30.0003 s.
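To translate tcpkali's byte totals into an approximate message rate, divide the received bytes by the payload size and the test duration. The sketch below does this for the gws run; note that the 1000-byte payload figure ignores WebSocket framing overhead, so the result is only a rough estimate:

```go
package main

import "fmt"

func main() {
	// Figures taken from the gws tcpkali run above.
	const (
		bytesReceived = 13478908970.0 // "Total data received"
		payloadSize   = 1000.0        // bytes per message (framing overhead ignored)
		duration      = 30.0017       // seconds
	)
	msgsPerSecond := bytesReceived / payloadSize / duration
	fmt.Printf("approx. %.0f messages/second echoed\n", msgsPerSecond) // ~449k msg/s
}
```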
Latency
- 1000 connections, 100 messages/second, 1000 Byte Payload
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
18305 caster 20 0 720780 38116 7332 S 248.8 1.0 24:29.55 gorilla-linux-amd64
18325 caster 20 0 720952 52544 7180 S 161.1 1.3 15:57.80 gws-linux-amd64
18346 caster 20 0 721460 50064 7364 R 311.3 1.3 20:49.94 nhooyr-linux-amd64
2797 caster 20 0 721068 20932 7048 S 322.6 0.5 23:07.91 gobwas-linux-amd64
- 10000 connections, 10 messages/second, 1000 Byte Payload
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
19430 caster 20 0 1070196 395408 6924 S 294.0 9.9 3:44.56 gws-linux-amd64
19618 caster 20 0 930480 267108 7268 S 313.0 6.7 9:01.10 gorilla-linux-amd64
20939 caster 20 0 1067980 372916 7236 R 455.8 9.3 12:12.72 nhooyr-linux-amd64
3845 caster 20 0 791984 90576 7096 S 426.6 2.3 20:13.87 gobwas-linux-amd64
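The latency-measurement code itself is not shown above, only the resource usage captured with top during the runs. One simple way to sample round-trip latency is to timestamp a message on the client, wait for the echo, and measure the elapsed time. The sketch below illustrates that idea with the gorilla/websocket client; it is an assumption about how such a measurement could be done, not the harness used for these numbers:

```go
package main

import (
	"fmt"
	"time"

	"github.com/gorilla/websocket"
)

func main() {
	// Connect to one of the echo servers started for the benchmark.
	conn, _, err := websocket.DefaultDialer.Dial("ws://127.0.0.1:8000/connect", nil)
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	payload := make([]byte, 1000) // 1000-byte payload, as in the latency test

	for i := 0; i < 100; i++ {
		start := time.Now()
		if err := conn.WriteMessage(websocket.BinaryMessage, payload); err != nil {
			panic(err)
		}
		if _, _, err := conn.ReadMessage(); err != nil {
			panic(err)
		}
		fmt.Printf("round trip %d: %v\n", i, time.Since(start))
	}
}
```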
Final Result
As the numbers show, on every metric except memory usage the ranking is:

gws > gorilla >> nhooyr > gobwas