It's a fact of life: you're going to spend the majority of your coding time debugging.
If you've been at this a while, you probably have debugging down to some sort of an art. For the rest of y'all, here's a few rules from the trenches:
Rule #1: Assume nothing. Assume absolutely nothing.
Until you have all the data staring you in the face, you cannot solve the problem. Take nothing for granted! Check the values of all the variables involved. Read the stack trace carefully. Step through your code.
Nine times out of ten, the bug will hide in the one area of the problem you think you can take for granted.
I also like how John Carmack puts it...
Most bugs are a result of the execution state not being exactly what you think it is.
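For instance, here's a minimal sketch (my own toy example, not from any particular war story) of making your assumptions about the execution state explicit instead of taking them for granted:

```cpp
#include <cassert>
#include <iostream>
#include <vector>

// Toy example: state your assumptions in code instead of in your head.
double average(const std::vector<double>& samples) {
    // Assumption: the caller never passes an empty vector. Assert it
    // rather than assume it -- this is exactly the kind of "obvious"
    // fact that turns out to be false at 2 A.M.
    assert(!samples.empty() && "average() called with no samples");

    double sum = 0.0;
    for (double s : samples) {
        sum += s;
    }

    // While hunting a bug, print (or inspect in a debugger) the values
    // that are actually there, not the values you think are there.
    std::cerr << "average: n=" << samples.size() << " sum=" << sum << '\n';
    return sum / samples.size();
}

int main() {
    std::vector<double> data{1.0, 2.0, 4.0};
    std::cout << average(data) << '\n';   // prints 2.33333
}
```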
By the way: never withhold any of this information from anyone helping you debug. Show them the running code and the data, plus all the details of the output and error messages. Be generous. Withholding information wastes everyone's time.
Rule #2: The problem is you, until proven otherwise.
It is so tempting to jump to the conclusion there's a bug in the library, the language, the environment, the tools, or whatever. In reality, for every one bug you find in someone else's code, you'll usually find at least a hundred in your own.
Assume your code is the problem, until you have absolutely, quantifiably proven otherwise. Get someone else to check your code. Read the documentation. Try to replicate the issue in a separate environment. If you're certain your code is not the issue, check again.
Rule #3: The number of ways code can fail is theoretically infinite.
If I may be so audacious, I'd like to actually coin that as McDonald's First Law. It bears repeating.
The number of ways code can fail is theoretically infinite.
It never ceases to amaze me how many new and interesting ways my own code can absolutely blow itself to smithereens. This is one more reason why you assume the problem is you (Rule #2) - even when you've covered every conceivable exceptional state or fail case, there's still one more to be had. Once you add in the unpredictability of other people's code, you're in for a lifetime of surprises.
If you don't find yourself staring at the computer screen yelling "what in blue blazes is going on here?" or some such, at least once a year, you're doing it wrong.
Rule #4: Complexity is your enemy.
Real code is complex. That's why debugging it is such a pain in the tailfeathers! When you run up against a particularly difficult problem, it helps to create a Minimal, Complete, Verifiable Example (or MCVE, in Stack Overflow lingo).
Replicate the problem in a reasonably fresh environment with as little code and as few dependencies as possible. If it works, keep adding pieces back in until it blows up. Whatever you last added is almost certainly part of the problem!
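To give you a feel for it, here's a hypothetical MCVE in C++. Everything is hardcoded, there's exactly one suspect code path, and the whole thing compiles and runs on its own. (The std::map behavior it demonstrates is real; the scenario around it is invented for illustration.)

```cpp
// Hypothetical MCVE: the smallest complete program that still shows the
// surprising behavior. No frameworks, no database, no config files.
#include <iostream>
#include <map>
#include <string>

int main() {
    std::map<std::string, int> counts;

    // Expectation: looking up a missing key leaves the map untouched.
    // Reality: operator[] inserts a default-constructed value (0), so
    // the map silently grows. The "bug" was my assumption (see Rule #2).
    if (counts["widgets"] == 0) {
        std::cout << "no widgets sold\n";
    }
    std::cout << "map size is now " << counts.size() << '\n';   // prints 1
}
```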
Rule #5: If it's weird, it's memory.
This doesn't just apply to "sharp-edged" languages like C and C++, where you can twiddle bits to your heart's content. Any bug which involves utter and complete garbage coming from anywhere but a user is almost certainly related to a memory error.
Undefined behavior and memory errors exist at the very heart of the stack that all modern programming is so precariously perched upon. Dig through your favorite language's list of exceptions, and you'll almost certainly find something relating to memory.
Now, how you go about actually detecting and fixing said memory errors depends entirely on your language and platform.
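For the C and C++ crowd, here's a sketch of the classic shape of the beast, an off-by-one heap overwrite. (The snippet is invented purely for illustration.) Tools like Valgrind, or AddressSanitizer via a compiler flag such as -fsanitize=address, will usually pinpoint this sort of thing immediately.

```cpp
// Illustrative memory error: writing one element past the end of a
// heap allocation. The program may "work", crash, or print garbage
// depending on the run -- the hallmark of a memory bug.
#include <iostream>

int main() {
    int* values = new int[3];
    for (int i = 0; i <= 3; ++i) {   // bug: i == 3 writes past the end
        values[i] = i * 10;
    }
    std::cout << values[0] << '\n';
    delete[] values;
    return 0;
}
```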
Rule #6: You probably forgot a semicolon somewhere.
I once spent two days trying to fix an infinite loop in my code. I was tearing my hair out, half-convinced my computer was possessed by a dark force with evil intent, when my mother wandered into the room.
She stared at the screen for a moment, and then pointed. "Don't you need a semicolon in the middle of line 56?"
My mother is not a programmer. And yet, guess what I had forgotten on line 56? A semicolon.
In the years since, I've found that a great many terrifying and mysterious bugs were because of missing tokens: semicolons, commas, backslashes, spaces. There's a chasm of difference between "valid code" and "correct code", and every one of us will fail to make the Evel Knievel-style jump over said chasm at least once a week.
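I won't pretend this is my original line 56, but here's an illustrative sketch of how one stray semicolon produces perfectly valid, perfectly wrong C++:

```cpp
#include <iostream>

int main() {
    int found = 0;
    for (int i = 0; i < 10; ++i);   // <-- this stray semicolon IS the loop body
    {
        // This block is not the loop body; it runs exactly once,
        // after the (empty) loop has already finished.
        ++found;
        std::cout << "checking...\n";
    }
    std::cout << "found = " << found << '\n';   // prints 1, not 10
}
```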
Chances are, you won't be able to see these little errors by yourself. We all get code blind! If you can't spot it after a few minutes, find another coder to look at it for you, and get comfortable laughing at your stupid mistakes. You'll be making a lot of them.
Rule #7: Never underestimate the ingenuity of complete fools.
The complete quote, in case you're wondering, is...
"A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools." -Douglas Adams
Sooner or later, the irresistible force that is your code will meet the implacable idiocy that is your user. (Don't sneer...chances are, that user is going to be you at 2 A.M. with insufficient caffeine in your system.) Wetware is notoriously buggy: eventually, someone is going to ask your code to do something that defies the very laws of physics, mathematics, and good sense.
To prevent this, we write tests that are absolutely insidious:
"QA Engineer walks into a bar. Orders a beer. Orders 0 beers. Orders 999999999 beers. Orders a lizard. Orders -1 beers. Orders a sfdeljknesv." -Bill Sempf
As they say, "the best defense is a good offense," and that applies to debugging as well. Find as many ingenious ways to break your code yourself as you can. Discover what happens when you give it garbage input. Break the rules of your own API. Bring up the interface, and then ask your cat to sit on your keyboard. In short, do terrible things to your code.
(P.S. I'm not kidding about the cat thing. One of my coding friends has a cat that literally passed a practice test for a HAM radio license with an 85% while sitting on my sleeping friend's lap. I am dead serious.)
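(P.P.S. If you want something more concrete than a cat, here's a hypothetical sketch of what those insidious tests might look like, using an imaginary order_beers() function I made up for the occasion.)

```cpp
// "QA engineer walks into a bar" tests for an imaginary order_beers()
// function. The point: feed it everything a fool (or a cat) might type,
// not just the happy path.
#include <cassert>
#include <cstddef>
#include <optional>
#include <string>

// Imaginary function under test: parses an order, rejects nonsense.
std::optional<int> order_beers(const std::string& input) {
    try {
        std::size_t pos = 0;
        int n = std::stoi(input, &pos);
        if (pos != input.size() || n < 0 || n > 100) {
            return std::nullopt;   // trailing junk, negative, or absurd
        }
        return n;
    } catch (const std::exception&) {
        return std::nullopt;       // not a number at all
    }
}

int main() {
    assert(order_beers("1") == 1);        // orders a beer
    assert(order_beers("0") == 0);        // orders 0 beers
    assert(!order_beers("999999999"));    // orders 999999999 beers
    assert(!order_beers("a lizard"));     // orders a lizard
    assert(!order_beers("-1"));           // orders -1 beers
    assert(!order_beers("sfdeljknesv"));  // orders a sfdeljknesv
    return 0;
}
```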
Rule #8: Old fashioned paper is your best friend.
The modern programmer has an entire arsenal of debugging tools at their disposal: debuggers, static analyzers, memory checkers, code profilers, testing frameworks, and the list goes on! But don't overlook the oldest and most reliable debugging tool on the planet - a piece of paper, a pencil, and your own eyeballs.
Desk checking, as it's called, is an utterly invaluable tool. While you could chase your tail for hours using all your complex tools, nothing beats reading the code yourself, writing down the values of each variable as it changes, performing the math, and evaluating every logical statement. Not only do you get great practice in thinking like the computer, but you'll be amazed at how quickly you zero in on the problem.
Desk checking is my personal first line of defense, using it either before or with my debugger. I've saved hours of confusion and heartache this way.
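If you've never desk checked before, here's a toy snippet (invented just for this) with the hand trace written out as comments. On paper, you'd draw the same table yourself, one row per pass through the loop.

```cpp
#include <iostream>

int main() {
    int total = 0;
    for (int i = 1; i <= 4; ++i) {
        total += i * i;
    }
    // Desk-check trace, written by hand:
    //   i=1: total = 0  + 1  = 1
    //   i=2: total = 1  + 4  = 5
    //   i=3: total = 5  + 9  = 14
    //   i=4: total = 14 + 16 = 30
    std::cout << total << '\n';   // prints 30
}
```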
Rule #9: That bug you're hunting may well be the Jabberwocky.
Programming is weird. For reasons we don't entirely understand, software doesn't necessarily follow the laws of mathematics or physics. At times, it appears to have a mind of its own.
Consider the following, all taken from the Jargon File:
heisenbug (n): A bug that disappears or alters its behavior when one attempts to probe or isolate it.
mandelbug (n): A bug whose underlying causes are so complex and obscure as to make its behavior appear chaotic or even non-deterministic.
schroedinbug (n): A design or implementation bug in a program that doesn't manifest until someone reading source or using the program in an unusual way notices that it never should have worked, at which point the program promptly stops working for everybody until fixed.
As mythical as they may sound, they're very much real. I have actually observed and confirmed all of these in the wild at least once.
In short, a lifetime of debugging is going to produce some wild stories. While most bugs are Bohr bugs - that is, they manifest under a well-defined set of conditions - and most are your fault (Rule #2), be prepared for the bizarre. Never assume you understand the computer.
Which leads me to an important corollary: your computer will take all estimations of time and difficulty as a personal challenge. If you say "implementing X will be easy" or "it should only take me a weekend to do Y," you have practically guaranteed your code will be fraught with freakishness.
For example, I once estimated a project would take one week. It defied the efforts of four developers to complete, across a span of three years. (It just reached feature-complete last month.)
Welcome to programming. Please mind the furniture on your way down the rabbit hole.
Rule #10: Have you turned it off and on again?
It may sound utterly silly to suggest this, but it works more often than you'd expect, even with software bugs! Once I've been at it for a while, and confirmed the problem isn't me, I restart the computer. There have been several occasions where this fixed the problem entirely, never to occur again.
However...and this is mysteriously important...you have to rule yourself out as the problem first. This phenomenon is addressed in one of the legendary MIT AI Koans:
A novice was trying to fix a broken Lisp machine by turning the power off and on.
Knight, seeing what the student was doing, spoke sternly: "You cannot fix a machine by just power-cycling it with no understanding of what is going wrong."
Knight turned the machine off and on.
The machine worked.
Rule #11: If all else fails, talk to a duck.
AUTHOR'S NOTE: I originally planned on 10 rules, but I was off by one in my numbering. Of course an article about debugging will have a bug.
Rubber ducking is a popular practice among developers, and for good reason. Often, just talking out loud helps you process the problem differently, leading you to a solution. To this end, a lot of programmers keep a rubber duck on their desk.
Mind you, it doesn't have to be a rubber duck. In my case, it's a Funko vinyl figure of Doctor Whooves (from My Little Pony: Friendship is Magic). Whatever floats your boat...or duck.
You'll be spending the majority of your coding career debugging, and there's no telling what bizarre things you'll uncover. You'll plumb the depths of human stupidity (especially your own), experience the raw power of applied mathematics made angry, and poke your fair share of apparent holes in the space-time continuum. Assuming you survive, you'll have plenty of stories to tell your children.
I shall leave you with the immortal, yet misquoted, words of The Legend of Zelda:
It's dangerous to go alone...........WELL THEN, good luck!
Top comments (8)
There is actually a second part, which I feel is quite relevant here:
Ha! That's now going to be included in all my future retellings of that joke.
Great article! Sometimes turning it off and on again actually breaks things 😁 (cue mysterious music)
A few weeks ago a company I'm working for had a power outage over the weekend. The following Monday my colleagues on site noticed that every single online payment request was failing with permission denied error responses. Every development workstation had the same issue.
Did they all use the wrong access credentials? Was the payment processor temporarily down? No, it worked for me (outside the building).
We dug around, scratched our heads. Then at some point something dawned on me. I asked one colleague on the phone: so, what's the time on your desktop? He gasped and realized that the desktop time was off by one minute compared to his phone.
The power outage had caused all the computers in the building to lag behind by one minute. The payment processor requests were signed with the current Unix timestamp, which the payment processor API rejected for being too old (probably to prevent replay attacks).
So... better check your time when you turn it off and on again 😆
I'd like to read more of those bug classifications, especially timing bugs, frankenbugs (which turn into other, worse bugs while you're fixing them, making you question the fix), and the insidious heteroptera featurus nonimplementus...
A really good article. Thanks, I enjoyed reading it. As someone who develops in C++ a lot, I know too well what you're talking about. I've seen bugs that defied all logic and reason. How many times I thought I'd found a bug in the compiler (and I did, a couple of times), but it was just me or my colleague.
When I program in C#, Ruby, Scala, or JavaScript, I find myself using the debugger much less. I actually think I've never used the Ruby debugger. I don't even know if there's one.
I abide by Rule #2, which can unfortunately drive one crazy when you're dealing with an actual library, compiler, or OS bug. I really hate those defects!
They used to be rare, but now, with npm and pip, they seem quite common. :(
I don't talk to a duck. I go downstairs to smoke and then I fix them.
"#10" - my favorite