I would like to point out two things in this overall very well written and covering article (because it is good!).
First off, it sounds like you are referring to Single Page Applications when you write about CSR. It might be me who's old, but client-side rendering can happen even when SSR is used, and it is definitely used outside of SPAs, for example on light HTML sites with low interactivity or little dynamic content.
Also, the old myth that crawlers (which everyone says, but we all mean Google) cannot parse CSR sites or SPAs has been debunked. They do in fact use strategies to look at page contents even if you use an SPA, dynamic content, or just lousy lazy loading, so it's not a total no-go for SEO. But there should be a fair warning: if you are building an SEO-reliant site and you are going for an SPA, you should ask yourself how you ended up at that decision 😁👌
Cheers mate and thanks for a good read!
In practice, search engines can parse CSR pages, but unless you have a very popular site, they probably won't.
Something I should probably do a post on is the concept of a 'crawl budget' -- if you have a new site, and submit it to Google Search Console, it is extremely unlikely that Google will crawl your pages even if they are perfect. This is because they a) are trying to be good citizens and not hugging a new site to death, and b) because they think it's unlikely that your site has value. They may crawl a page or five and may or may not index those pages, but until you show that you have value, the amount of their crawl budget that they're going to spend on your low-authority, brand-new site is very, very small.
CSR sites force the crawler to spin up a JavaScript interpreter and are parsed more slowly, which means they consume more of your crawl budget, which makes it less likely that all of your pages will be crawled. Google also does a full page refresh on every page scan, which means that, unlike your users, it has to download the entire React (or whatever) framework with every page it crawls. This further eats into the budget.
You can overcome this by showing Google that you have value. Traffic hitting your site shows value, as do links, and all the other common off-site SEO methods, and you can eventually get popular enough that they'll upgrade your crawl budget a bit at a time. If you become a top x destination site like Facebook or Twitter, then it won't matter at all that your site is CSR and you'll have won, but that's obviously hard, and up until you're the world's most popular site, you can get a lot more crawling done by implementing basically any kind of SSR in the early days to maximize your crawl budget, which usually equates to more pages crawled more quickly.
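To make the "any kind of SSR" suggestion concrete, here is a minimal sketch in Next.js style (the framework mentioned elsewhere in this thread): data is fetched on the server per request and the crawler receives fully rendered HTML, with no JavaScript interpreter required. The `fetchPosts` helper and the data shape are hypothetical stand-ins for a real data source.

```javascript
// Hypothetical stand-in for a real API or database call.
async function fetchPosts() {
  return [{ id: 1, title: "Rendering strategies" }];
}

// In Next.js, a function like this runs on the server for every request
// (server-side rendering), so the response is complete HTML.
async function getServerSideProps() {
  const posts = await fetchPosts();
  return { props: { posts } };
}

// Simplified version of the render step the framework performs on the server.
function renderPage({ posts }) {
  return `<ul>${posts.map((p) => `<li>${p.title}</li>`).join("")}</ul>`;
}

getServerSideProps().then(({ props }) => {
  console.log(renderPage(props)); // logs: <ul><li>Rendering strategies</li></ul>
});
```

The point is that a crawler hitting this page spends its budget on one cheap HTML download, instead of downloading and executing the whole client bundle.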
Hey, thanks for adding these points and I totally agree with your points 😃. It's just that covering every possible combination of these strategies and everything about these strategies in a single article is difficult 🥲.
And just to cover the basics of these along with implementation, it became the longest article that I wrote so far. Thanks for reading it and sharing your feedback 😃
Thanks, glad that you found it useful! :)
Congratulations 🥳! Your article hit the top posts for the week - dev.to/fruntend/top-10-posts-for-f...
Keep it up 👍
I want to switch to nextjs so bad... If only they added support for vite instead of webpack. Great article explaining the different render strategies!
Thanks 😊. Vite is ❤️
Great Article @sjsouvik
Thanks 😊
ISR is not a rendering strategy.
I think this is 100% technically correct, but semantically, if you have a slowish release cycle, and want the benefit of fallback data from caching, but don't want to have to generate on each page request, and also don't want to have to delay page updates for a week until the next build cycle -- then ISR is "the answer"
For that reason, I appreciate its inclusion in the article as a strategy.
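For readers who haven't seen it, ISR in Next.js boils down to a `revalidate` field returned from `getStaticProps`: the page is served statically, but the framework regenerates it in the background at most once per interval. A rough sketch, with a hypothetical `fetchPosts` helper standing in for real data fetching:

```javascript
// Hypothetical stand-in for a real data source.
async function fetchPosts() {
  return [{ id: 1, title: "Hello ISR" }];
}

// In Next.js, returning `revalidate` from getStaticProps enables ISR:
// the static page is regenerated in the background at most once every
// 60 seconds, instead of on every request (SSR) or only at build time (SSG).
async function getStaticProps() {
  const posts = await fetchPosts();
  return {
    props: { posts },
    revalidate: 60, // seconds between background regenerations
  };
}
```

Which is exactly the middle ground described above: fresher than a weekly build cycle, cheaper than generating on every request.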
Ok, would you like to explain why you think so?
ISR is, in my opinion, a framework feature to update a file on the file system.
I could agree that it is in some sense a cache-busting feature for SSR or statically generated pages. But the "solution" becomes something you could call ISR