TL;DR
I’ve created async versions of JSON stringify and parse, plus a whole bunch of array functions, including sort, that don’t block the main thread.
Ok, I understand now. It’s all about requestIdleCallback. What got me interested is that I have a repo at github.com/calvintwr/blitz-hermite...
It is a client-side JS image resizer, which is overall performant in the context of downsizing huge client images (DSLR, whatever, just bring it) before transmitting them to the server for storage. The time saved comes from not uploading the massive image to the server and sending a resized version back to the client. Almost all online resizers and social media sites do this slower server-side resizing; I really wonder why.
In Blitz, I’m using a worker to do the resampling, but it glitches when the worker sends back the completed work, albeit just for a short while. I might think about how to Coroutines this.. hah.
Right, well if you can encode it using a Uint8Array in the worker, there's a version of JSON parse that can handle them, and you can use transferable objects with Uint8Arrays, which should be near-instant. Of course, you could do anything else you like with the array using a generator-based coroutine.
Ahhh yes, that can work. Although the next challenge is to split out your async JSON parser, because I wouldn't want to bring in the full js-coroutines library just to do that part.
I think you could probably just get away with the coroutine runner in this article and not bother with js-coroutines; that's probably a few lines of code that will deal with your requirements.
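For anyone landing here, a minimal runner along the lines the article describes might look like this. This is my own sketch, not js-coroutines itself: it steps a generator while the browser reports idle time via `requestIdleCallback`, and falls back to a fixed time budget with `setTimeout` where that API is missing (Node, older Safari).

```javascript
// Minimal generator-based coroutine runner (a sketch, not the library's code).
// Steps the generator while idle time remains, then reschedules itself,
// so long-running work never blocks the main thread for a whole frame.
function run(generator, fallbackBudgetMs = 10) {
  return new Promise((resolve, reject) => {
    const schedule =
      typeof requestIdleCallback === "function"
        ? requestIdleCallback
        : (fn) =>
            setTimeout(() => {
              // Fallback deadline: a fixed budget measured from slice start.
              const start = Date.now();
              fn({
                timeRemaining: () =>
                  Math.max(0, fallbackBudgetMs - (Date.now() - start)),
              });
            }, 0);

    function step(deadline) {
      try {
        let result = generator.next();
        // Keep stepping until the generator finishes or time runs out.
        while (!result.done && deadline.timeRemaining() > 1) {
          result = generator.next();
        }
        if (result.done) resolve(result.value);
        else schedule(step); // hand control back to the event loop
      } catch (err) {
        reject(err);
      }
    }
    schedule(step);
  });
}
```

Usage is just `await run(someGenerator())`, where the generator `yield`s every so often inside its hot loop.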
Thanks mate, you are awesome!
I found out that Uint8Array is not a transferable object, is it? developer.mozilla.org/en-US/docs/W... But ArrayBuffer is. And ImageData has the property .data.buffer, which is an ArrayBuffer, so I have used that to transfer between the main and worker threads: github.com/calvintwr/blitz-hermite... ✌️👍
Off-topic: another approach to handling resource-intensive calculations would be to use a worker thread instead of the main thread. developer.mozilla.org/en-US/docs/W... Cool project btw 😄
Completely agree, and I do mention it in the write-up above. The issue is often the time taken to serialize and deserialize the workload to the other thread. If you are processing binary then it's totally fine, and ideal, to do it on a worker thread (because you can use transferable objects). Structured cloning (to threads) used to be very broken; it's now much better, but it's still pretty much in the same speed range as a serialize/deserialize to JSON, which can definitely cause frame-loss glitches.
I've been wondering idly about adding threading to js-coroutines, using the coroutines to create the packets so that it doesn't glitch on large payloads. The thing is, it's pretty specialist to actually need that other thread, so I'm not sure where the benefit cut-off would be...
When your objects are so big, you have to LZ-compress your JSON before sending it to the client, then do segmented decompresses so you don't crash the browser. We have to do it this way because the objects have to be available when the user loses connection.
Sounds interesting... Seems like you could do with an uncompress that also yields in that case?
@CaseyCole I've added lz-string compression to the library, more details here. Not sure it will help in your case, but I'm gonna need it! Thanks for pointing me in this direction...
Wow, really cool!
Thanks :) Really can't believe I didn't think of it before! Seemed so obvious once I'd implemented it.
Hi Mike, awesome work! I'm wondering if this can be applied to the backend. What I mean is: if you're able to process arrays of a million records, could this replace the event loop in Node?
In Node, a long-running synchronous task blocks the event loop. Would your generator technique allow more scaling by having HTTP calls overlap and doing work during main-thread idle time?
You know, I think it actually would, yes, with a modified form that only runs for a particular period of time.
At the moment the polyfill does a requestAnimationFrame and then uses setTimeout to measure the amount of time remaining. On Node I guess we'd just say "you are allowed 20ms" and then skip to the next turn of the main loop.
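A Node-side version of that idea might look like the sketch below (my own guess at the shape, assuming a fixed budget per slice; `runOnNode` is a hypothetical name). There are no frames to wait for, so each slice simply gets a time allowance, and `setImmediate` skips to the next turn of the event loop so pending I/O and timers can run between slices.

```javascript
// Sketch: run a generator on Node in fixed time slices, yielding to
// the event loop between slices via setImmediate.
function runOnNode(generator, budgetMs = 20) {
  return new Promise((resolve, reject) => {
    function slice() {
      const start = Date.now();
      try {
        let r = generator.next();
        // Step until the generator finishes or the 20ms-style budget is spent.
        while (!r.done && Date.now() - start < budgetMs) r = generator.next();
        if (r.done) resolve(r.value);
        else setImmediate(slice); // let I/O callbacks and timers run first
      } catch (err) {
        reject(err);
      }
    }
    setImmediate(slice);
  });
}
```

Overlapping HTTP handlers would then interleave with the heavy work instead of queuing behind it.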
I'll get to doing that.
Terrific, Mike! Imagine if you could get Node to scale to 500 thousand or more? Currently Node's scaling is pretty poor: on TechEmpower, raw Node manages only 176,190 requests per second (128th place).
techempower.com/benchmarks/#sectio...
I'm also a js dev so if you need help let me know.
Unjam your server: NodeJS collaborative multitasking
Mike Talbot ⭐ ・ Jul 13 '20
And I'd be delighted to get any help anyone would like to give. I'm sure that there are some other very useful functions that could be added!
I wrote a util almost exactly like this for Unity, to make sure VR games don't drop any frames: github.com/foolmoron/UnityUtils/bl...
Using requestIdleCallback to time the process is a genius fit for JS though! Definitely gonna see where I can use this in my JSON handling.
Really great project, Mike!
Good work and thanks for sharing!
Another DoEvents in a more complex "modern" way.
Yes, with an understanding of how much time is left on the current frame rather than just pumping the message queue.
Thanks for building this, Mike. I was looking for such a library for the last 3-4 months.