Discussion on: How do you merge millions of small files in a S3 bucket to larger single files to a separate bucket daily?

Thomas Werner

I think it mostly depends on what the expected target output format is. If you simply need to concatenate all events into a list (JSON array), it could probably be done by opening an output stream for a file in the target S3 bucket and writing each source JSON file to it one after the other. Memory consumption should stay roughly constant, given that the input JSON files are all about the same size. You only need to make sure the list of event file paths/handles isn't loaded into a collection all at once, so you don't run out of memory.
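Here is a rough sketch of what that could look like with boto3, using a paginated listing (so keys are fetched lazily) and a multipart upload as the "output stream" to the target bucket. Bucket names, prefix, and key are placeholders, and it assumes each source file contains a single JSON object:

```python
import boto3

# Hypothetical names, just for illustration
SOURCE_BUCKET = "events-raw"
SOURCE_PREFIX = "2021/06/01/"
TARGET_BUCKET = "events-merged"
TARGET_KEY = "2021/06/01/merged.json"
PART_SIZE = 5 * 1024 * 1024  # S3 multipart parts (except the last) must be >= 5 MB

s3 = boto3.client("s3")

def merge_day():
    upload = s3.create_multipart_upload(Bucket=TARGET_BUCKET, Key=TARGET_KEY)
    parts = []
    buffer = bytearray(b"[")  # open the JSON array
    part_number = 1
    first = True

    # The paginator yields keys page by page, so the full listing
    # is never held in memory at once.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=SOURCE_BUCKET, Prefix=SOURCE_PREFIX):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=SOURCE_BUCKET, Key=obj["Key"])["Body"].read()
            if not first:
                buffer.extend(b",")
            buffer.extend(body.strip())  # assumes one JSON event per source file
            first = False

            # Flush the buffer to the target object as soon as it reaches part size,
            # so memory stays bounded regardless of how many files there are.
            if len(buffer) >= PART_SIZE:
                resp = s3.upload_part(
                    Bucket=TARGET_BUCKET, Key=TARGET_KEY,
                    UploadId=upload["UploadId"],
                    PartNumber=part_number, Body=bytes(buffer),
                )
                parts.append({"ETag": resp["ETag"], "PartNumber": part_number})
                part_number += 1
                buffer = bytearray()

    buffer.extend(b"]")  # close the JSON array; the last part may be smaller than 5 MB
    resp = s3.upload_part(
        Bucket=TARGET_BUCKET, Key=TARGET_KEY,
        UploadId=upload["UploadId"],
        PartNumber=part_number, Body=bytes(buffer),
    )
    parts.append({"ETag": resp["ETag"], "PartNumber": part_number})

    s3.complete_multipart_upload(
        Bucket=TARGET_BUCKET, Key=TARGET_KEY,
        UploadId=upload["UploadId"],
        MultipartUpload={"Parts": parts},
    )
```

Something like this could run once a day (e.g. from a scheduled Lambda or a small container job) with the prefix set to the previous day's date.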

But it sounds like you need to apply more complicated merge logic? What does an example event file look like, and what's the expected result format?