
Discussion on: Large file uploads to an S3 bucket done neatly in Laravel

Jasper Frumau

Thanks a lot for this, Adam. Going to try this out with Spatie Laravel Backups being sent to Digital Ocean Spaces. Any tips for combining multiple PUT/POST objects into one key to be sent serialized? This is to avoid rate limiting from making multiple requests, on top of the rate limiting that sometimes occurs when a request with large files takes too long, which your macro may remedy.

Adam Crampton

Sorry for the slow response!

Personally I like combining data and passing JSON between endpoints, as Laravel provides pretty good tools for dealing with this.

e.g. you can easily convert a model collection to JSON, then reconstruct it back into a collection of that model class using Laravel's hydrate method (really handy when passing data back and forth between APIs, Redis, etc.).
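
A minimal sketch of that round trip, assuming a User Eloquent model with an active column (any model works the same way):

```php
use App\Models\User;

// Serialize a collection to JSON (handy for caching in Redis or passing between endpoints).
$json = User::where('active', true)->get()->toJson();

// Later, rebuild a collection of User models from the decoded attribute arrays.
$users = User::hydrate(json_decode($json, true));
```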

Hope that at least sort-of answers your question :)

Jasper Frumau

Thanks Adam. Models can be converted to JSON with ease, true. I was more looking into having several images added (PUT) and/or loaded (GET) from object storage. It seems I need to combine them and store them as JSON with the images as base64, perhaps in one large object, and then pull it back in and split it again to load them. Perhaps an object array of images tied to one key. Just not enough experience yet.
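
Something like this rough sketch is what I had in mind, assuming a Spaces disk named "spaces" and a local disk configured in config/filesystems.php:

```php
use Illuminate\Support\Facades\Storage;

// Bundle several local images into one JSON object under a single key (one PUT instead of many).
$bundle = collect(['photos/a.jpg', 'photos/b.jpg'])
    ->mapWithKeys(fn ($path) => [$path => base64_encode(Storage::disk('local')->get($path))])
    ->toJson();

Storage::disk('spaces')->put('bundles/photos.json', $bundle);

// Later, a single GET retrieves the bundle, which can be split back into individual images.
$images = collect(json_decode(Storage::disk('spaces')->get('bundles/photos.json'), true))
    ->map(fn ($encoded) => base64_decode($encoded));
```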

Not sure if that is the way to go anymore though, so I have gone back to block storage using Digital Ocean Volumes instead of Spaces. I have been fighting rate limiting when PUTting image files (200 requests per second max / 150 GB per 24 hrs) and retrieving/GETting them for a while now, and decided to move back to server and/or volume storage, for now at least.

If you do know of ways to store images on object storage without surpassing rate limits, whether S3's limitations or those of Digital Ocean Spaces, do let me know.

Thanks.