
I used to do that for years, and then I discovered gdb and IDE integrations with it (Code::Blocks, CodeLite), and later Visual Studio. I have no idea why anyone would subject themselves to developing without breakpoints in 2023, other than to larp as an 80s MIT hacker.


I started out writing C with an IDE, then realized much later on that I don't really need breakpoints; I just log stuff. Whatever I'm testing often won't easily run in the debugger (or won't reproduce the bug there due to timing), and even when it does, it's not significantly easier than logging.

Also, every new job I've had, I've watched my coworkers spend like a week trying to figure out how to make the IDE work with whatever environment, which then changes later... I just skip that.


Debuggers have logpoints as well as breakpoints. Learning to use a debugger gives you this kind of basic "log debugging", with more advanced techniques (traditional breakpoints, breaking on value change, etc.) at your fingertips.
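For example, in gdb a logpoint is a dprintf and break-on-value-change is a watchpoint (the file, line, and variable names here are invented):

    (gdb) dprintf parser.c:142,"tok=%s state=%d\n", tok->text, state
    (gdb) watch queue_len
    (gdb) run

dprintf prints every time the line is hit without stopping; watch stops whenever queue_len changes. By default dprintf calls printf in the program being debugged; `set dprintf-style gdb` makes gdb itself do the printing, which helps on targets with no console.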

> Also, every new job I've had, I've watched my coworkers spend like a week trying to figure out how to make the IDE work with whatever environment, which then changes later... I just skip that.

A single week to achieve better productivity? That's a sweet deal. It's about as much time as you need to figure out how the company email works, learn the quickest route from your desk to the cafeteria, and so on. It's absurd to think you'll be 100% productive that first week anyway, so why not use the opportunity to familiarize yourself with the tooling as well?


If it actually took just a week to get better productivity, and that setup never broke later, then sure. In practice it doesn't help, because so few things are actually runnable in the debugger (basically just unit tests). Nobody else on my team even uses the debugger; they only use IDEs out of comfort, and so far they've had to switch IDEs three times in three years.

Past jobs had similar caveats. The only time I've ever been able to use a debugger consistently was in school.


Personally, with Visual Studio and vcpkg, I never have any issues setting up the environment. Vcpkg in particular means I don't have to do any manual linking, and it handles x86 vs. x64 automatically as well.
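For example, the classic-mode workflow is roughly this (fmt is just a stand-in package):

    vcpkg install fmt:x64-windows fmt:x86-windows
    vcpkg integrate install

After `integrate install`, Visual Studio projects pick up the installed packages' headers and libraries automatically for whichever architecture you're building.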


> It's current year!

But in all seriousness: using a debugger can be useful, and even though I've given it a try numerous times, I mostly avoid it because it doesn't fit the way I think.

Debugging is about searching for the source of the problem. With print debugging I'm always leaving behind breadcrumbs, which I can inspect all at once at the end. If I'm going down a wrong path, I delete the stale ones and add new ones until I find the issue.
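Concretely, the breadcrumbs look something like this for me (the TRACE name and the stand-in variables are made up):

    #include <stdio.h>

    /* print file:line plus a message, then keep running */
    #define TRACE(...) do { \
        fprintf(stderr, "%s:%d: ", __FILE__, __LINE__); \
        fprintf(stderr, __VA_ARGS__); \
        fputc('\n', stderr); \
    } while (0)

    int main(void) {
        size_t len = 42;   /* stand-ins for real program state */
        int state = 3;

        /* scatter these along the suspected path... */
        TRACE("enter parse, len=%zu", len);
        TRACE("state=%d after header", state);
        /* ...then read the whole trail at once when the run ends */
        return 0;
    }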

With a debugger I have to mouse around, set breakpoints, run the code, inspect where I am, step through, decide that this breakpoint is useless, and hold all that state in my mind. If I get lost, I have to start from scratch.

Print debugging matches my way of thinking much more.


Logpoints! They're like breakpoints, but they don't stop execution, and they don't require whatever you're writing C for to have any sort of console output. Of course, you do need a debug port enabled, which often isn't the case on production hardware, so you get stuck printf debugging by blinking an LED and probing it with a logic analyzer. Royal PITA.
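The LED trick is roughly this sketch (the register address and pin bit are hypothetical; substitute your part's actual GPIO output register):

    #define GPIO_OUT (*(volatile unsigned int *)0x40020014u)  /* hypothetical */
    #define DBG_PIN  (1u << 5)                                /* hypothetical */

    static void dbg_pulse(int n) {
        while (n--) {
            GPIO_OUT |= DBG_PIN;    /* pin high */
            GPIO_OUT &= ~DBG_PIN;   /* pin low: one pulse on the analyzer */
        }
    }

Then, say, dbg_pulse(3) in the retry path: three pulses on the trace means "got here".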


If they were larping as an MIT hacker, they would be using a debugger extensively.

Now, if they were larping as a PC/Mac one... (or as certain groups of Unix weenies)



