Nice article. Although it's not a "thing of the past" per se. C# and Java, for example, do optimizations behind the scenes where using a smaller type like short can actually cause a performance drop. There are code examples (in C#, in this case) where people switched from short to int in loops and gained performance just by writing the loops the usual way (at least everyone I know uses int in loop counters). The point is, someone tried to optimize by reasoning about type sizes and got the opposite result. So it's not that we don't think about this; it's that weird things can happen when you try to micro-optimize.
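A minimal sketch of the pattern being described, in Java (class and method names are hypothetical, and whether a real difference shows up depends on the JIT). In Java, arithmetic on short operands is promoted to int anyway, so a short loop counter just forces a narrowing conversion back to short on every increment rather than saving anything:

```java
public class LoopCounterDemo {
    // short counter: i++ compiles to an int add plus a cast back to short.
    // Note: would also loop forever if data.length exceeded Short.MAX_VALUE.
    static long sumWithShort(int[] data) {
        long total = 0;
        for (short i = 0; i < data.length; i++) {
            total += data[i];
        }
        return total;
    }

    // int counter: plain int arithmetic, the "usual" style.
    static long sumWithInt(int[] data) {
        long total = 0;
        for (int i = 0; i < data.length; i++) {
            total += data[i];
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = new int[1000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        // Both produce the same sum; only the counter type differs.
        System.out.println(sumWithShort(data)); // 499500
        System.out.println(sumWithInt(data));   // 499500
    }
}
```

Both versions compute the same result; the point is that the short version does strictly more work per iteration at the bytecode level, which is why "shrinking" the counter type can backfire.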
That is very strange indeed. I should run performance tests to make sure this doesn't happen in my case. Thanks for the heads up!