I've been reading Thinking Better: The Art of the Shortcut by the Oxford mathematics professor Marcus du Sautoy, and he raises a couple of good points that apply to coding.
- Shortcuts are not the same as cutting corners. Though both involve "cutting," a shortcut is a quicker path to the same end goal you would have reached if you had done it the long, hard way, while cutting corners gives you an inferior product.
- We're spoiled by ever-faster computers, so we often choose brute force over thinking through a shortcut.
An example of the second point:
If I wanted to figure out how many possible combinations there are for an old-school iPhone's 4-digit passcode, I could have a computer enumerate them all and count them like this:
```python
import itertools

# Brute force: generate every 4-digit combination, then count them
it = itertools.product(range(10), repeat=4)
itlist = list(it)
print(len(itlist))
# 10000
```
Or, I could remember my middle-school algebra and recall that the number of possible combinations is the product of the number of options for each digit. A phone keypad only offers the digits 0-9, so we have 10 options for each of the 4 digits in the passcode, or 10 * 10 * 10 * 10 (10^4). Therefore, we would get the same solution with:
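A minimal sketch of that shortcut:

```python
# Shortcut: multiply the number of options per digit (10 per digit, 4 digits)
print(10 ** 4)
# 10000
```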
Timing the two methods, the second ran in about a tenth of the time of the first. The more we scale the problem (e.g., a 10,000-digit passcode, or one that allows all alphanumeric characters), the more noticeable the inefficiency of brute force becomes.
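The comparison can be reproduced with the standard library's `timeit`; the function names below are my own illustrative wrappers, not the originals, and the exact ratio will vary by machine.

```python
import itertools
import timeit

def brute_force():
    # Enumerate every 4-digit passcode and count them
    return len(list(itertools.product(range(10), repeat=4)))

def shortcut():
    # Multiply the options per digit: 10 * 10 * 10 * 10
    return 10 ** 4

# Both approaches agree on the answer
assert brute_force() == shortcut() == 10_000

# Time each one; brute force pays for generating all 10,000 tuples
print("brute force:", timeit.timeit(brute_force, number=1_000))
print("shortcut:   ", timeit.timeit(shortcut, number=1_000))
```

Scaling the password length up makes the gap grow quickly, since the brute-force version materializes every candidate while the shortcut stays a single arithmetic operation.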
Refactoring code isn't necessarily just about measuring whether one of two routes to the same output is faster (I'm not discounting the fact that they each get there differently!), but also about whether we could reach the same final place by skipping or cutting out a step in the first place, without cutting corners and creating less robust code.