Good article.
I'd expect the compiler to optimize away this kind of difference (i.e. ignore short and use long instead).
Maybe int32 is on par with int64 on a 64-bit machine because these machines are still expected to execute a lot of 32-bit code and the hardware is optimized for the conversion?
Yeah, that would be my guess as well (but didn’t dig any deeper).
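For anyone who wants to poke at this themselves, here is a minimal C sketch (my own addition, not from the article or this thread) that sums the same values stored as int32_t and as int64_t. Timing the two loops, or looking at the generated assembly with something like gcc -O2 -S, shows whether the compiler actually treats the two widths differently on a 64-bit machine.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)  /* ~16M elements */

/* The two loops live in separate functions so they are easy to find
 * and compare in the compiler's assembly output. */
static int64_t sum32(const int32_t *v, size_t n) {
    int64_t s = 0;
    for (size_t i = 0; i < n; i++) s += v[i];
    return s;
}

static int64_t sum64(const int64_t *v, size_t n) {
    int64_t s = 0;
    for (size_t i = 0; i < n; i++) s += v[i];
    return s;
}

int main(void) {
    /* Note: the int64_t array is twice the size of the int32_t array,
     * so memory traffic differs between the two runs, not just ALU width. */
    int32_t *a = malloc(N * sizeof *a);
    int64_t *b = malloc(N * sizeof *b);
    if (!a || !b) return 1;
    for (size_t i = 0; i < N; i++) { a[i] = (int32_t)i; b[i] = (int64_t)i; }

    clock_t t0 = clock();
    int64_t s1 = sum32(a, N);
    clock_t t1 = clock();
    int64_t s2 = sum64(b, N);
    clock_t t2 = clock();

    printf("int32 sum %lld in %.3fs, int64 sum %lld in %.3fs\n",
           (long long)s1, (double)(t1 - t0) / CLOCKS_PER_SEC,
           (long long)s2, (double)(t2 - t1) / CLOCKS_PER_SEC);
    free(a);
    free(b);
    return 0;
}
```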