DEV Community

Discussion on: Speed up your website in 1 minute with Instant.page

Shaiju T

Thanks :). Here is another plugin; the author in this and this post compares Quicklink and Instant.page and says that Quicklink doesn't have options like:

  • Limits preloads per second.
  • Stops preloading on slow response.
  • Stops preloading if the server crashes.

@pavelloz What do you both think? Which one is better?

Paweł Kowalski • Edited

Honestly, long term I think neither of them is good, because they are trying to mask the real issue, which is slow, bloated pages.

This shouldn't happen on desktop at all, because desktops are usually on low-latency connections (wifi).

On mobile, on the other hand, it should in theory be a good idea, but when you think about cost per MB, your users might not want you to make those decisions for them. They are probably on a budget, and websites are big enough already. I know there are ways to detect slow connections with native web APIs, but aren't those slow connections supposed to be the main beneficiaries of those predictive preloading mechanisms?

I have free 4G everywhere, and I know my phone's bottleneck is the amount of JS to parse and execute. But if I had 3G that I had to pay crazy amounts of money for (like they do in the US), I would be really mad if someone made me download an additional 1 MB that I would never see, because I got tired of the webpage at the "Hey, this is our app! Download it now!" / "Remember about cookieZZ!!!" / "Pay us on Patreon" phase and left before I ever used those needlessly downloaded MBs.

So I would focus on making pages smaller: less JS, compressing every little graphic (or changing its format), less CSS, fewer 3rd-party slowdowns, loading on demand what's not needed immediately, speeding up DB queries, and minimizing SSL handshakes (especially on the critical path).
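Two of those ideas (smaller image formats and loading on demand) can be sketched in plain HTML; the file names below are placeholders, not anything from the discussion:

```html
<!-- Serve a smaller modern format when the browser supports it,
     fall back to JPEG; hero.* are placeholder file names. -->
<picture>
  <source srcset="hero.avif" type="image/avif">
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" alt="Hero image" width="1200" height="600" loading="lazy">
</picture>

<!-- Defer non-critical JS so it doesn't block parsing. -->
<script src="analytics.js" defer></script>
```

`loading="lazy"` keeps below-the-fold images from being downloaded until the user scrolls near them, which is the opposite trade-off to speculative preloading.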

I would spend my time there as a performance-oriented web developer. I once built a fast website, and adding things like instant.page (or any of its variants, including Quicklink and guess.js [this one actually required adding GA, so it's a no-go for some]) or Turbolinks actually hurt performance.

Paweł Kowalski

By accident, I just found this on HN; the comments are kind of saying the same thing I wrote above, and the sentiment is: don't send me trash in the first place.

news.ycombinator.com/item?id=21905301

Paweł Kowalski

Oh, and when it comes to using a 3rd-party WordPress plugin (I don't use WordPress), I would think twice before installing an unknown plugin from an untrusted source on my server, just to get something that is available as (usually) 1-2 KB of open source code on GitHub.
That's a dangerous game, and as a WP administrator I wouldn't do it, that's for sure.

Shaiju T • Edited

Thanks. When you said minimize SSL handshakes, do you mean using the preconnect link tag? Also, instead of using these plugins to prefetch the whole next page, couldn't we use prefetch to load just the JS and image files of the next page? Is that a good scenario? Are there any other scenarios where we can use them wisely?

Paweł Kowalski • Edited

Well, you can minimize handshakes by self-hosting everything you can on your own domain. As a positive side effect, it also stops 3rd-party dependencies from being your dependencies.
I use preconnect for the project's CDN domain and for those 3rd parties that I didn't manage to self-host.
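For reference, a preconnect hint is just a link tag in the head; the CDN domain below is a placeholder:

```html
<!-- Open the DNS + TCP + TLS connection early, before the first resource
     from that origin is requested. cdn.example.com is a placeholder. -->
<link rel="preconnect" href="https://cdn.example.com" crossorigin>
<!-- Cheaper fallback for older browsers: resolve DNS only. -->
<link rel="dns-prefetch" href="https://cdn.example.com">
```

This saves the handshake round-trips for the first request to that origin, which is exactly the critical-path cost mentioned above.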

I think a better approach to JS performance is dynamic code splitting (i.e. in webpack - import() - docs here). It not only makes your initial bundle smaller, but also defers downloading dependencies/plugins/vendor code until after the initial bundle has been executed. And because webpack is an amazing tool, you can also set up prefetching/preloading for those dynamically created chunks - webpack.js.org/guides/code-splitti... .
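A rough sketch of what that looks like, assuming a webpack build; the `./chart` module, the chunk name, and the button element are illustrative, not from the thread:

```javascript
// Main bundle: load the chart code only when the user actually asks for it.
// The "magic comments" are webpack directives: webpackChunkName names the
// emitted chunk, and webpackPrefetch tells webpack to emit a prefetch hint
// so the browser fetches the chunk during idle time.
button.addEventListener('click', async () => {
  const { renderChart } = await import(
    /* webpackChunkName: "chart", webpackPrefetch: true */ './chart'
  );
  renderChart(document.querySelector('#chart'));
});
```

The nice property is that the prefetch is scoped to a chunk you know the user is likely to need, instead of speculatively downloading whole next pages.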