Considering how little time is genuinely free for deliberate learning activities, and how vast the potential body of knowledge is, it is wise to know what not to do.
I don't have a list, but I have some criteria against which I assess topics I might spend time on.
Longevity / Transferability
I generally prefer to focus on topics that are not bound to a specific technology, or (actually more common) where the technology is only the means of presenting an idea.
Usefulness / Applicability
There are many concepts that satisfy the first criterion, but when the chance of ever applying them is slim, other things take priority: use it, or lose it. Time is too valuable to spend on learning things that will soon fall into oblivion. Never say never, though: in my first year after university, I found several use cases for algorithms/concepts from the more theoretical courses of my curriculum, while some practical courses appeared rather naive in hindsight. Also, if I'm very committed to a specific technology, I keep up with it. The technologies I commit myself to are rather clearly defined for me, albeit I reassess that decision regularly.
Sometimes a job just needs to be done. Let's not pretend that everything we need to learn will be beautiful, timeless ideas. Sometimes it's some poorly documented functionality of some obscure library/server/database/you name it that needs to be grasped. That's where it comes down to a judgement call (and certainly a matter of confidence and experience): do I rely on copy-pasting from StackOverflow, or do I swallow the bitter pill and really research and understand it from the ground up?