It's tricky that there are no benchmarks for this category. I'm looking for the fastest static web server to get large files uploaded in a snap. I realize there are several variables involved and I can't solve network speeds, but I can do some compression on the client or chunk the requests.
Example:
- https://medium.com/javascript-in-plain-english/implementing-chunk-requests-and-uploading-large-files-30-faster-1c45bfe79938
- https://blog.daftcode.pl/how-to-make-uploading-10x-faster-f5b3f9cfcd52
- https://www.npmjs.com/package/zlib-wasm
Use case
What are these files? .gltf 3D models, which are actually a JSON format with a lot of base64 encoding.
I want to serve the files from a fast Java service and also have them persist for loading into a 3D viewer.
Files could be 50 MB+.
So I want to get the files in and out of the server as quickly as possible. Does anyone have any recommendations for frameworks or servers? Language is not a constraint here.
Top comments (5)
.gltf files are explained pretty well on Wikipedia: en.wikipedia.org/wiki/GlTF
It sounds to me like your situation would be as easy as uploading with a PUT request (or several PUT requests if you're chunking the file on the client side), and a GET request to your reverse proxy for downloading the file to the browser for display. Of course you could add compression on the server and client side to speed things up.
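The chunked-PUT idea could look roughly like this sketch (the /upload endpoint and the Content-Range convention are assumptions; the server would need to reassemble the ranges itself):

```javascript
// Sketch of client-side chunking: split a file into fixed-size slices and
// PUT each one with a Content-Range header describing its byte range.
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB per request (tunable)

function makeChunks(byteLength, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let start = 0; start < byteLength; start += chunkSize) {
    const end = Math.min(start + chunkSize, byteLength) - 1;
    chunks.push({ start, end }); // inclusive byte range
  }
  return chunks;
}

// file is a File/Blob; url is a hypothetical upload endpoint.
async function uploadFile(file, url) {
  for (const { start, end } of makeChunks(file.size)) {
    await fetch(url, {
      method: "PUT",
      headers: { "Content-Range": `bytes ${start}-${end}/${file.size}` },
      body: file.slice(start, end + 1), // slice end is exclusive
    });
  }
}
```

Chunking also lets you retry just the failed slice of a 50 MB upload instead of the whole file, which is where most of the perceived speedup in those articles comes from.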
Draco compression might be something you want to look into: google.github.io/draco/.
Localhost:8080 ;)
I don't follow, sorry?
Nginx is one of the fastest servers out there. With brotli or gzip compression enabled, you will save a lot of traffic.
For simple uploading, you can use nginx's upload module.
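A minimal config along those lines might look like this sketch (paths, port, and the 100m limit are placeholders; brotli requires the ngx_brotli module, and the upload location requires nginx-upload-module to be compiled in):

```nginx
http {
    gzip on;
    gzip_types application/json model/gltf+json;

    server {
        listen 8080;
        client_max_body_size 100m;    # allow 50 MB+ model uploads

        location /models/ {
            root /var/www;            # serve stored .gltf files straight from disk
        }

        location /upload {
            upload_pass  @done;       # nginx-upload-module: buffer upload to disk
            upload_store /var/uploads;
        }

        location @done {
            return 201;
        }
    }
}
```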
Hey, yeah, this is established knowledge, but if you believe in benchmarks then nginx is in 109th place:
techempower.com/benchmarks/
But none of the benchmarks actually cover disk I/O reads and writes, so who's to say? A server could have blistering request handling and poor I/O capabilities.
I guess hardware is a big factor as well.