Your explanation of "linearithmic" sounds as though it starts out linear, shifts to logarithmic at some point, but ends up worse than either. That is wrong. Heap sort, for example, is O(n log n) because it works in essentially two phases. It first builds a heap in O(n), then repeatedly extracts the maximum and restores the heap--sometimes referred to as sifting down--at O(log n) per extraction. Since there are n extractions, the second phase costs O(n log n), which dominates the linear build phase.
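To make the two phases concrete, here is a minimal heap sort sketch (not from the original comment; a standard textbook formulation). Phase 1 is the bottom-up O(n) heapify; phase 2 performs n extractions, each with an O(log n) sift-down.

```python
def sift_down(a, start, end):
    """Restore the max-heap property for the subtree rooted at `start`,
    considering only indices below `end`."""
    root = start
    while True:
        child = 2 * root + 1  # left child
        if child >= end:
            break
        # Pick the larger of the two children, if a right child exists.
        if child + 1 < end and a[child] < a[child + 1]:
            child += 1
        if a[root] < a[child]:
            a[root], a[child] = a[child], a[root]
            root = child
        else:
            break

def heap_sort(a):
    n = len(a)
    # Phase 1: build a max-heap bottom-up -- O(n) total.
    for start in range(n // 2 - 1, -1, -1):
        sift_down(a, start, n)
    # Phase 2: swap the max to the end and sift down to repair the heap.
    # n iterations, each O(log n), so O(n log n) overall.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end)
    return a

print(heap_sort([5, 2, 9, 1]))  # → [1, 2, 5, 9]
```

The asymptotic cost comes from repeating the logarithmic sift-down n times, not from adding a linear term to a logarithmic one.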
The way I understand things is in relation to other things. Analogy: what can it be compared to? Absence: what would we have to do without it? Example: how does it work?
Thanks for the clarification, Erick! I added a footnote with your explanation. Any other feedback?