An interesting conversation at work today. I was talking with the guy, L (no – a different one), who has been responsible for our IT for the past 3.5 years. L isn’t the guy that always does the work, although he can. His real job is as an engineer in the company and that’s his primary focus. However, he does handle the technology strategy and monitors the folks that do the IT work. (He and I are talking quite a bit since I’m trying to take some of the load off in this area).
Anyway, L made the observation that he is often frustrated by IT people in their approach to diagnosing a problem. His experience, both with our ISP and sometimes our desktop support, is that IT people will poke at a problem system in a non-systematic way, replacing components, etc., until the problem disappears. They may never identify the problem, but they did make it go away.
In my experience, this is not true across the board; there are many IT folks who do a post-mortem on problems, who diagnose issues methodically to pinpoint the cause, etc. But he’s right – there are many who don’t.
The funny thing is that the methodical approach comes naturally to L, with his engineering background, but doesn’t come naturally to many other people and particularly people without any training in methodical testing. I recognized some of this back in high school. One thing that separated fair computer users from the great (and yes, this was back in the 80s) was diagnostic ability. My college roommate, for example, was lousy at diagnostics. He knew a fair amount about computers, but didn’t have a methodical approach for diagnosing problems.
L was suggesting that people should identify experiments, knowing in advance what the different results would indicate. As an engineer, he called them experiments. Many people with natural diagnostic ability do this instinctively. For example, the symptoms of a problem are known, and the cause could be either hardware or software. Some people naturally recognize this and run tests to rule out one or the other. Okay, the problem is in the hardware – can we tell if it’s the system or the network? And so on.
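This "experiments" approach can be sketched as a decision tree, where each test is chosen so that every possible outcome rules out part of the fault space, instead of swapping parts at random. A minimal sketch in Python – all the names and the toy fault tree here are my own hypothetical illustrations, not anything L actually wrote:

```python
# Walk a decision tree of diagnostic "experiments".
# Each interior node is (experiment_name, {result: subtree});
# a leaf is just a diagnosis string.

def diagnose(node, observations):
    """Follow the tree using the observed result of each experiment."""
    while not isinstance(node, str):
        experiment, branches = node
        result = observations[experiment]  # outcome of running that experiment
        node = branches[result]
    return node

# A toy fault tree: first rule hardware in or out, then system vs. network.
FAULT_TREE = (
    "works_on_known_good_hardware",       # experiment 1: try a known-good box
    {
        True: "software problem",         # symptom followed the software
        False: (
            "local_loopback_ok",          # experiment 2: machine talks to itself?
            {
                True: "network hardware problem",
                False: "system hardware problem",
            },
        ),
    },
)

# Each observation is the pre-planned outcome of one experiment.
print(diagnose(FAULT_TREE, {
    "works_on_known_good_hardware": False,
    "local_loopback_ok": True,
}))  # → network hardware problem
```

The point isn’t the code, it’s the discipline it encodes: before running a test, you already know what each result will tell you.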
The funny thing is that I don’t think universities ever teach this skill. Computer Science does in a sense: if you can’t diagnose problems in your program in a rapid, methodical way, then you’ll probably fail out. But this seems to be more weeding out than teaching. IS courses and Engineering courses aren’t any better to my knowledge. I don’t think I’ve ever seen a “Debugging” course offered. So we’re left with people who can perform diagnostics instinctively, and people who can’t and who replace and test parts randomly.
Am I missing something – are there courses in debugging? If not, should there be? I tend to think of debugging/diagnostics as a skill separate from coding or engineering. If that’s the case, then it can and should be taught. Hell, there’s a whole television show (“House”) based on medical diagnostics – the least we can do is teach future programmers, engineers and IT people the same skills.