History of Debugging – Techniques Before Protected Memory

debugging, history, memory

Nowadays, when I make a programming mistake with pointers in C, I get a nice segmentation fault: my program crashes, and the debugger can even tell me where it went wrong.

How did programmers debug back when memory protection wasn't available? I can see a DOS programmer fiddling away and crashing the entire OS when he made a mistake. Virtualization wasn't available, so all he could do was restart and retry. Did it really go like that?

Best Answer

I can see a DOS programmer fiddling away and crashing the entire OS when he made a mistake.

Yeah, that's pretty much what happened. On most systems that had memory maps, location 0 was marked invalid, so that null pointers, the most common error case, could be detected easily. But there were lots of other cases, and they caused havoc.
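
To make the contrast concrete, here is a minimal C sketch (my own illustration, not part of the original answer) of the null-pointer case described above. The DOS behaviour noted in the comments assumes the usual real-mode memory models.

    #include <stdio.h>
    #include <stddef.h>

    int main(void)
    {
        int *p = NULL;   /* the most common error case: a null pointer */

        /* On a system whose memory map marks location 0 invalid, this store
         * traps immediately (a segmentation fault on Unix-like systems), and
         * a debugger can point at this exact line.
         *
         * On real-mode DOS there was no such check: depending on the memory
         * model, the write landed at the start of the program's own data
         * segment or in the interrupt vector table at 0000:0000, silently
         * corrupting memory and often taking the whole machine down later. */
        *p = 42;

        puts("not reached on a protected-memory system");
        return 0;
    }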

At the risk of sounding like a geezer, I should point out that the current focus on debugging is not the way of the past. Much more effort used to go into writing correct programs than into removing bugs from incorrect ones. Partly that was a deliberate goal, but a lot of it was because the tools made anything else hard. Try writing your programs on paper or on punched cards, not in an IDE, and without the benefit of an interactive debugger. It gives you a real appreciation for correctness.
