Programmers spend a large share of their time debugging rather than writing code. You probably had some training in a language or framework--but how did you learn to fix the defects in your software?
When you fell in love with programming (or at least decided it was a remunerative career), you probably thought of it as a creative endeavor. You'd design great software, write the code, and poof!—it'd work perfectly the first time.
In the real world, you spend a good chunk of your time debugging code rather than writing new stuff. I'm sure I could dig up some obscure percentage of developer time that's devoted to fixing defects rather than creating new functionality, but I doubt you need to hear a number. You can all too easily picture the days you spent hunting the Bug From Hell, and its effect on your project schedule.
Now, there are lots of ways that programmers can and do learn new software skills, whether it's reading a book, attending a tech conference, or visiting sites like JavaWorld.com. (I'm rather glad you do the latter.) However, these usually focus on tools, such as languages or frameworks, and not on meta-techniques, such as "How to find that bug in two hours instead of two days." Languages may come and go, and so will IDE debuggers, but the ability to discern which rock your bug is hiding under is one that will stay with you forever.
A large part of debugging skill is, of course, experience. That might be your own experience, or the opportunity to be A Grasshopper at the Feet of A Master Programmer. I also suspect that some people have an innate talent for troubleshooting (equally relevant to fixing a broken car or a misbehaving application), and those of us without it can only pout in envy.
However, some of this can be learned. For example, one master programmer of my acquaintance had an axiom: if you've been looking for a bug for a (relatively) long time and can't find it, "you're looking in the wrong place." Obvious-sounding, but certainly true... and how often have you wasted time digging through the XYZ module when the problem was somewhere else entirely?
I asked several developers for the ways they learned or improved their debugging skills. A surprising number of them talked about their mastery of the IDE's debugger or some other tool expertise, but what I most wanted was their advice on improving one's ability to find and fix errors. Here's a short summary of their responses.
- Be disciplined. Debugging is a process, said one developer, not a series of random events. Don't randomly tweak knobs; follow the code's execution process. Just like fixing a lawnmower, he said. Does part A get the input it needs? How about the output? If that's OK, move on.
- To improve your skills, debug other people's code rather than your own. It's easier to see the faults in another person's assumptions than to see your own. You might do this as part of peer code reviews or peer debugging sessions. Doing so will help you recognize the common causes of defects more quickly, promised one developer, and teach you to recognize (and abandon) your own bad development practices.
- Pretend you're the compiler. Find and correct as many errors as you can before pressing the Compile button. While most modern IDEs flag mistakes as you type (as Visual Studio's IntelliSense does), you'll learn less from their automation than you will from consciously examining the code yourself. (In the same way, you'll never learn to spell correctly by relying on a spell checker to do all the work.)
- Learn to fix bugs as early in the development process as you can. That might mean something formalized, such as test-driven development. That also means devoting time to debugging your design instead of barreling into coding.
- Debugging is easiest when you can hold the whole system in your head. Don't make the mistake of focusing on only one part of an application. Pay attention to the interrelationships between modules. Read the code at multiple levels of abstraction, advised one programmer. "Finding the bug is the hardest part, and it takes a clear understanding of what multiple pieces of the code are doing," she said.
- Part of the same bit of advice, I think, is someone else's suggestion: gain a good understanding of the system one level down from what you are working on. "If you are debugging a system level C program, it helps to know some assembly and something about the OS," explained a system software lead engineer. "If you are debugging a J2EE app, it helps to know something about Java threads, RMI and GC." In many cases, he pointed out, error messages come from that one-level-down layer. "If you can understand what that means, it will help you figure out what is going wrong at your level of abstraction," he explained.
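The "fix it like a lawnmower" tip above — check each part's input and output before moving on — can be sketched in code. This is a minimal, hypothetical example (the two-stage pipeline and the `parse`/`total` names are mine, not from any developer I quoted): you confirm what stage A received and produced before asking whether stage B is broken.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StageCheck {
    // Stage A: parse comma-separated raw input into numbers.
    static List<Integer> parse(String raw) {
        return Arrays.stream(raw.split(","))
                .map(String::trim)
                .map(Integer::parseInt)
                .collect(Collectors.toList());
    }

    // Stage B: sum the parsed numbers.
    static int total(List<Integer> nums) {
        return nums.stream().mapToInt(Integer::intValue).sum();
    }

    public static void main(String[] args) {
        String raw = "1, 2, 3";
        // Does stage A get the input it needs? Does it produce the
        // output stage B expects? Verify each step before moving on.
        List<Integer> parsed = parse(raw);
        System.out.println("after parse: " + parsed);  // prints: after parse: [1, 2, 3]
        int sum = total(parsed);
        System.out.println("after total: " + sum);     // prints: after total: 6
    }
}
```

If stage A's output looks right, the bug is downstream; if not, there's no point staring at stage B.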
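One concrete place where that one-level-down layer shows up is in a stack trace: the exception's type and message come from the runtime beneath you, and reading the frames is how you connect them back to your own code. A contrived sketch (the class and helper names are invented for illustration):

```java
public class LevelDown {
    // Hypothetical helper: dereferences its argument, so passing null
    // triggers a NullPointerException from the JVM -- the level below.
    static int lengthOf(String s) {
        return s.length();
    }

    // Summarize where a thrown exception originated: its type (reported
    // by the runtime) plus the topmost frame, which is in our own code.
    static String describe(Throwable t) {
        StackTraceElement top = t.getStackTrace()[0];
        return t.getClass().getSimpleName() + " thrown in " + top.getMethodName();
    }

    public static void main(String[] args) {
        try {
            lengthOf(null);
        } catch (NullPointerException e) {
            System.out.println(describe(e));  // prints: NullPointerException thrown in lengthOf
        }
    }
}
```

Knowing what the JVM means by `NullPointerException` is the one-level-down knowledge; the frame list is what carries that meaning back up to your level of abstraction.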
A few developers also recommended extra resources. Among them is David Agans' book, Debugging, which promises nine indispensable rules, and Why Programs Fail: A Guide to Systematic Debugging, which is about to be released in a second edition. The developer who recommended the latter says it teaches a systematic approach to debugging with plenty of hands-on examples. Another suggested an online essay, Ten skills of highly effective software testers.
I like all those answers, but I suspect there's more wisdom to be shared. How did you gain your debugging skills? How have you helped others improve theirs?