Honestly, long term I think none of them is good, because they all try to mask the real issue: slow, heavy pages.
This shouldn't happen on desktop at all, because desktops are usually on low-latency connections (wired or Wi-Fi).
On mobile, on the other hand, it sounds like a good idea in theory, but once you factor in cost per MB, your users might not want you making those decisions for them. They are probably on a data budget, and websites are big enough already. I know there are native web APIs for detecting slow connections, but aren't those slow connections precisely the supposed main beneficiaries of these predictive preloading mechanisms?
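As a sketch of what that detection could look like: Chromium-based browsers expose the Network Information API as `navigator.connection`, with a `saveData` flag and an `effectiveType` string. The thresholds below are my own assumptions about when preloading is acceptable, not anything these libraries actually do.

```javascript
// Decide whether predictive preloading is acceptable for this user.
// `conn` is expected to look like navigator.connection from the
// Network Information API; the cutoffs here are assumptions.
function shouldPreload(conn) {
  if (!conn) return false;          // API unsupported: be conservative
  if (conn.saveData) return false;  // user explicitly asked to save data
  // Treat anything slower than 4g as too expensive to speculate on.
  return !['slow-2g', '2g', '3g'].includes(conn.effectiveType);
}

// In the browser you would call it like:
// if (shouldPreload(navigator.connection)) { /* warm up the likely next page */ }
```

The irony the paragraph above points at: the users this check would exclude are exactly the ones preloading was supposed to help.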
I have free 4G everywhere, and I know my phone's bottleneck is the amount of JS to parse and execute. But if I were on a 3G plan with crazy per-MB prices (like they have in the US), I would be really mad if someone made me download an extra 1 MB I will never see, because I got tired of the page at the "Hey, this is our app, download it now!" / "Remember about cookieZZ!!!" / "Pay us on Patreon" stage and left before ever using those needlessly downloaded megabytes.
So I would focus on making pages smaller: less JS, compress every little graphic (or switch it to a better format), less CSS, fewer third-party slowdowns, load on demand what isn't needed upfront, speed up DB queries, minimize TLS handshakes (especially on the critical path).
That is where I would spend my time as a performance-oriented web developer. I once built a fast website, and adding things like instant.page (or any of its variants, including quicklinks and guess.js [that one actually required adding GA, so it's a no-go for some]) or Turbolinks actually hurt performance.
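For context on why adding such a library can hurt an already-fast site: the mechanism these hover-preloaders share is roughly "on mouseover of a link, inject a prefetch hint for it", so you pay for an extra script plus speculative requests. A rough sketch of the shared idea, not any library's actual implementation:

```javascript
// Rough sketch of the hover-preload pattern used by libraries like
// instant.page: when a link gains hover, add <link rel="prefetch">
// for its target so the next navigation hits the cache.
// `doc` is passed in (instead of using the global `document`) so the
// helper can be exercised outside a browser.
function prefetchLink(doc, href) {
  const link = doc.createElement('link');
  link.rel = 'prefetch';
  link.href = href;
  doc.head.appendChild(link);
  return link;
}

// In the browser you would wire it up roughly like:
// document.addEventListener('mouseover', (e) => {
//   const a = e.target.closest('a[href]');
//   if (a) prefetchLink(document, a.href);
// });
```

On a page that already loads instantly, this is pure overhead: one more script to parse plus requests for pages the user may never visit.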
By accident I just found this on HN, and the comments are largely saying the same thing I wrote above; the sentiment is: don't send me trash in the first place.
news.ycombinator.com/item?id=21905301