It is exactly what I'm doing :)
You are right, I actually somehow skipped the last url assignment. One possible improvement, then, would be to fetch the headers once, use URLSearchParams to get/set the page number, and load all the pages in parallel, returning the results via Promise.all(...)
That would be N pages at once, instead of N pages one after the other ;-)
Edit: my suggestion is based on the fact that GitHub returns the last page too, but I guess for your article what you are doing is already good enough as an example.
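To make the suggestion concrete, here is a rough sketch of what I mean. The helper names (`getLastPage`, `pageUrls`, `fetchAllPages`) and the `/user/repos` URL are just illustrative, not from the article; the idea is only: read the `Link` header once to find the last page, then fire all the requests together with Promise.all.

```javascript
// Hypothetical sketch: parse GitHub's Link header once, then fetch
// every page in parallel instead of one after the other.

function getLastPage(linkHeader) {
  // GitHub's Link header looks like:
  // <https://api.github.com/user/repos?page=2>; rel="next",
  // <https://api.github.com/user/repos?page=5>; rel="last"
  const match = linkHeader.match(/<([^>]+)>;\s*rel="last"/);
  if (!match) return 1; // no "last" link means a single page of results
  return Number(new URL(match[1]).searchParams.get("page"));
}

function pageUrls(baseUrl, lastPage) {
  // Build one URL per page, using URLSearchParams to set the page number
  return Array.from({ length: lastPage }, (_, i) => {
    const url = new URL(baseUrl);
    url.searchParams.set("page", String(i + 1));
    return url.toString();
  });
}

// N pages at once instead of N sequential round trips
async function fetchAllPages(baseUrl) {
  const first = await fetch(baseUrl);
  const lastPage = getLastPage(first.headers.get("link") ?? "");
  const pages = await Promise.all(
    pageUrls(baseUrl, lastPage).map((u) => fetch(u).then((r) => r.json()))
  );
  return pages.flat();
}
```

The first request is still needed to learn how many pages exist, but everything after that runs concurrently.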
Thanks for the suggestion! Your solution would work perfectly :)
I don't think my solution is valid for all scenarios, but it might be good sometimes.