Okay, granted, I'm talking about a project over which I have full control and knowledge. On other projects, especially where I'm unfamiliar with the code and/or there are fewer tests, I'll use a debugger more.
It's also true that in some cases I do get out a debugger, but increasingly over the years I tend to already have a test for the code in question, and often it's simply obvious what's wrong (the test has failed on an assert), so I just fix the code.
The times I do pull out the debugger are when either a) I don't have a test (in which case I should write one, so this is less likely) or b) the failure makes no sense at all and my mental model and tests don't match the code.
I'm going to say that in about the past 18 months I've used a debugger three times.
Stats on project I'm working on (alone):
Hence: I prefer not to use a debugger, but rather write tests.
Bonus: Once you fix a bug by first writing a test that fails, you know when you've fixed it, you know you'll know if you ever break it again, and you have one more test.
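To make that workflow concrete, here's a minimal sketch in Python (the thread never names the project's language, and `median` is a made-up example): write a test that reproduces the bug, watch it fail, fix the code, and keep the test around as a regression guard.

```python
def median(values):
    """Return the median of a non-empty sequence of numbers."""
    ordered = sorted(values)  # the one-line fix: the buggy version skipped sorting
    n = len(ordered)
    mid = n // 2
    if n % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

def test_median_handles_unsorted_input():
    # This test failed against the buggy version (which returned 1 for the
    # first input), passes after the fix, and now catches any regression.
    assert median([3, 1, 2]) == 2
    assert median([4, 1, 3, 2]) == 2.5

test_median_handles_unsorted_input()
```

The payoff is exactly as described above: the red-then-green cycle proves the fix, and the suite grows by one test.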
Just out of curiosity now, what is the split in lines/files of code between tests and application code?
And also curious, what programming language?
(*) Name redacted because it's new, unreleased, and in development. I doubt it will see the public eye for a few years or more (if ever). It is, however, written in itself, because "dogfooding" (and heavy testing) is the only way to hope to succeed at such insanity.
FYI: the counts are pretty inaccurate; I didn't use SLOC tools, just find/xargs/wc. At a guess, about 15% of the lines are blanks, comments, or junk.
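For anyone wanting to reproduce that kind of rough count, here's a sketch of the find/xargs/wc approach (the `/tmp/loc_demo` tree and the `.txt` extension are made up for the demo; a real project would use its own paths and source extensions). Note that wc counts every line, blanks and comments included, which is exactly why such figures run high.

```shell
# Build a tiny scratch tree so the pipeline has something to count.
mkdir -p /tmp/loc_demo/src
printf 'line1\nline2\n\n' > /tmp/loc_demo/src/a.txt   # 3 lines, incl. one blank
printf 'line1\n' > /tmp/loc_demo/src/b.txt            # 1 line

# The rough count: find the files, let wc tally them, keep the grand total
# (the last line wc prints). -print0/-0 keeps odd filenames safe.
find /tmp/loc_demo -name '*.txt' -print0 | xargs -0 wc -l | awk 'END {print $1}'
```

With the two demo files this prints 4, blank line included, which is the over-count the comment is apologizing for.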
I guess we can add one more unpopular software opinion:
We do need Yet Another Programming Language!