Hi, thank you so much for this test! I wanted to try it, but creating a dataset with 1,000,000,000 rows takes a very long time, maybe a week, and the final size would be about 20 GB. I have created a bash script that generates the data. Is there a more optimized way to do this?
You are welcome!
The resulting file should be around 13 GB, and there is a faster version of the generator available in the original repo at github.com/gunnarmorling/1brc: create_measurements_fast.sh.
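If you want to roll your own generator instead, the usual reason a bash loop is slow is that it forks processes and issues a syscall per line. Below is a minimal Python sketch that batches writes through a large buffer; the station names, temperature range, and file name are illustrative assumptions, not the official 1brc station list or format spec beyond the `Station;Temp` shape.

```python
import random

# Illustrative station names, not the official 1brc list.
STATIONS = ["Hamburg", "Bulawayo", "Palembang", "St. John's", "Cracow"]

def generate(path: str, rows: int, batch: int = 100_000) -> None:
    """Write `rows` lines of 'Station;Temp' data in large batches.

    Building 100k lines in memory and flushing them with one writelines()
    call avoids the per-line process forks and syscalls that make a naive
    bash loop slow.
    """
    with open(path, "w", buffering=1 << 20) as f:
        remaining = rows
        while remaining > 0:
            n = min(batch, remaining)
            lines = [
                f"{random.choice(STATIONS)};{random.uniform(-40.0, 45.0):.1f}\n"
                for _ in range(n)
            ]
            f.writelines(lines)
            remaining -= n

generate("measurements_sample.txt", 10_000)
```

Scaled to a billion rows this still takes a while in pure Python, so for the full dataset the repo's create_measurements_fast.sh is the better option; the sketch just shows the batching idea.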