A Senior Developer working mostly with PHP and JavaScript, with a bit of Python thrown in for good measure, all on Linux. My tooling is simple: GitLab and JetBrains where possible.
I find the Lighthouse audit from Chrome's developer tools useful in these instances. It gives a whole host of things to investigate. Quickly running it against your site, there are some really quick wins:
Set height and width attributes on images. The rendering engine will then know up front how much space to reserve for each image, rather than having to calculate it as the image loads
Either increase the size of the thumbnail space on screen, or reduce the size of the thumbnail images served to the user. That will either reduce the required bandwidth (for smaller images) or reduce computation for scaling (if there's more space allocated)
Combine CSS and JavaScript where possible. The page has over 100 requests just to serve the home page. That's a lot of things for the browser to download, even over HTTP/2. 78 JavaScript files - that's a lot to fetch and have to wait for
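As a rough sketch of that kind of check (the file contents and `example.com` URL here are placeholders, not your actual page), you can save a copy of the home page and count how many separate stylesheets and scripts it references. The sample markup also shows the explicit `width`/`height` attributes mentioned above:

```shell
# Count the external CSS/JS files a page references.
# home.html is a stand-in; save your real page first with e.g.
#   curl -s https://www.example.com/ > home.html   (example.com is a placeholder)
cat > home.html <<'EOF'
<img src="thumb.jpg" width="320" height="180" alt="thumbnail">
<link rel="stylesheet" href="a.css"><link rel="stylesheet" href="b.css">
<script src="vendor.js"></script><script src="app.js"></script><script src="analytics.js"></script>
EOF
echo "CSS files: $(grep -o '<link[^>]*stylesheet[^>]*>' home.html | wc -l)"
echo "JS files: $(grep -o '<script[^>]*src=' home.html | wc -l)"
```

It's a crude grep, not a parser, but it's a quick way to see whether a bundling step is actually reducing the request count.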
In terms of performance - I have Gigabit broadband, and I've seen load times on the page from 1.5 seconds up to 7 seconds; the lower figure is when the assets are cached. For someone on a lower-speed connection, it's potentially prohibitively slow (and that's just the home page).
I'm a web developer and founder of Spyrath Dev. A graduate from RGU, I've worked on many web projects from custom designed WordPress to full-stack Node.js web apps.
Similar to what Gary said, make sure your image optimisation plugin resizes images to the minimum size needed, and install a plugin like Hummingbird to combine and minify CSS/JS files (though that can break some things, so you may need to add exclusions).
Hummingbird will scan your site using Google Lighthouse and make it easy for you to action what it recommends. Make sure you have GZIP compression enabled too, for all content types.
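To get a feel for what GZIP buys you, here's a small local sketch (the URL is a placeholder, and it assumes a `gzip` build that supports `-k`, as GNU gzip 1.6+ and BSD gzip do). Repetitive text assets like CSS typically shrink dramatically:

```shell
# To confirm your server actually compresses responses, a request like this
# (example.com is a placeholder) should report "content-encoding: gzip":
#   curl -sI -H 'Accept-Encoding: gzip' https://www.example.com/ | grep -i content-encoding
# Locally, compare a text asset before and after compression:
printf 'body { margin: 0; }\n%.0s' $(seq 1 200) > sample.css   # ~4 KB of repetitive CSS
gzip -kf sample.css                                            # writes sample.css.gz, keeps the original
echo "original: $(wc -c < sample.css) bytes, gzipped: $(wc -c < sample.css.gz) bytes"
```

On the "all content types" point: if you're on nginx, for example, the `gzip_types` directive controls which MIME types get compressed (by default only HTML is), which is usually where CSS, JS, and JSON slip through uncompressed.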
I actually use Cloudinary for all images so they are off the server, but I will for sure check out Hummingbird!