
Demystifying Debuggers by ibobev
Debuggers exist at the intersection of many parts of the computing ecosystem—they must contend with intricate details of kernels, compilers, linkers, programming languages, and instruction set architectures.
Becoming familiar with debuggers has improved my programming abilities, made debuggers more useful in my day-to-day programming, and broadened my general knowledge of computing. Back in January, the RAD Debugger—the project I work on full-time—was open sourced to the public, to mark the start of its open alpha phase. I’ve been working on the debugger, or the technology on which it depends, full-time for almost four years now. The project has taught me an enormous number of lessons, through exposure to an enormous number of problems. There is still a lot of work to do, so I expect it will keep teaching me for many years to come.
But perhaps most importantly, debuggers are an intricate piece of the puzzle of the design of a development platform—a future I become more interested in every day, given the undeniable decay infecting modern computing devices and their software ecosystems.
To emphasize their importance, I’d like to reflect on the name “debugger”. It is not a name I would’ve chosen, because it can give the impression that a debugger is an auxiliary, only-relevant-when-things-break tool. Of course, a debugger is used to debug—which is why it was named as such—but it is also enormously useful for analyzing the behavior of working code, and for verifying that code is correct with respect to one’s expectations of it.
A good debugger provides clear and insightful visualizations of what code is doing. This also makes debuggers enormously useful educational tools—for beginners and experts alike—because they make visible what is normally opaque. They provide these features by dynamically interacting with running programs—as such, they can also dynamically modify code. At the limit, this approximates (or employs) JIT-compilation and hot-reloading, giving traditionally compiled toolchains far more runtime flexibility for developers.
For these reasons, “debugger” is much too special-purpose a name for the full set of capabilities that debuggers actually provide—they offer glimpses into the lower-level inner workings of a computer. If one designed a computing s
6 Comments
captn3m0
Related: https://nostarch.com/building-a-debugger is close to publication (currently in Early Access) and covers building a debugger for x64.
thasso
If you are interested in debuggers, there was a post series by Sy Brand a few years back:
https://blog.tartanllama.xyz/writing-a-linux-debugger-setup/
Eli Bendersky also wrote about debuggers (I think his post is a great place to start):
https://eli.thegreenplace.net/2011/01/23/how-debuggers-work-…
I was fascinated with debuggers a while back exactly because they were so mysterious to me. I then wrote a ptrace debugger myself [1]. It features pretty simple implementations of the most common stuff you would expect in a debugger. Though I made the grave mistake of formatting all the code in GNU style.
[1]: https://github.com/thass0/spray/tree/main/src
apples_oranges
Very interesting topic. Once you know how they work, the next fun thing is writing code that can detect or prevent debugging (and thus keep your DRM or copy protection from being circumvented..) ;)
furkansahin
Amazing! I'll follow. For what it's worth, I owe my career to the Eclipse debugger. At some point I started using it so much that my friends started to call me "debugger". I find writing code together with a debugger extremely educational.
wiz21c
I've used debuggers now and then. What's the state of the art nowadays (in terms of cool functionalities)? (too lazy to ggl or gpt it)
tilne
> But perhaps most importantly, debuggers are an intricate piece of the puzzle of the design of a development platform—a future I become more interested in every day, given the undeniable decay infecting modern computing devices and their software ecosystems.
I agree with this sentiment, yet still I’m wondering if it’s fully justified. There has never been more bad software than right now, but there has never been more good software either, no?
It’s not super relevant to the main contents of the article. Just a bit that caught my attention with regards to how it made me think.