sunny

Do you ever wish your pair programming sessions had a neutral third voice?

Hey devs 👋

Curious if others have experienced this:

During pair programming or team discussions, we often get into long debates about how to implement something. It starts with, “Should we do it this way or that way?” and suddenly we’re 40 minutes in, still discussing pros and cons.

Sometimes it feels like the loudest opinion wins—not necessarily the best one.

Lately, I’ve been wondering:

What if there was a way to have a neutral voice in the room—one that’s not emotionally invested, knows your codebase, understands business priorities, and can surface trade-offs objectively?

Like... an assistant that:

Knows when tech debt is okay vs. when it’s a red flag

Helps compare approaches in real-time

Keeps the discussion aligned to delivery goals

I’m not sure if this is something other teams run into or if it's just me overthinking things, but I'd love to hear:

Do you run into these kinds of debates?

How do you resolve them?

What do you wish existed to help?

No pitch, no tool—just genuinely curious how other devs think about this.
Appreciate any thoughts 🙏

Top comments (6)

Oscar

“Should we do it this way or that way?”

I would personally never trust an LLM to answer this for me. Unless it's for something fundamental, in which case I either already understand how to answer that question myself, or I need to understand how to answer that question myself. I do run into these debates with my coding partner a lot, but I think there's a lot of very valuable (perhaps contextual) knowledge that comes from having these debates, and I would never feel comfortable not understanding why we made a certain choice (or again, trusting an LLM to make that choice for me).

sunny

Totally fair, and I really appreciate the thoughtful response 🙌

I 100% agree that understanding why a decision is made is crucial—especially when it comes to architecture or trade-offs. The goal wouldn’t be to replace that process or make decisions for you, but more to act like a third teammate who can surface considerations you might miss (like hidden tech debt, coupling risks, delivery impact) without being emotionally invested.

Think of it less like “the LLM tells you what to do” and more like “here’s a summary of trade-offs or risks based on your context—now you decide.”

Curious—if there were a way to retain full understanding and control but speed up the back-and-forth or bring more objectivity into it… would that feel more useful? Or still too risky?

Oscar

Thanks for the response, and I do like that idea! Something that intentionally only acts as a third thought in the room. That's really what I've found AI stuff to be useful for -- it doesn't do anything for me, but it points out potential downsides about the way that I'm doing things. Half the time it has no idea what it's talking about, but half the time I go "oh, yeah, maybe I should tweak that".

EIO • Emmanuel Imolorhe

The only sustainable solution I see to this is your AI assistant, like Copilot or Cursor's list of GPTs.

Seif Sekalala

Great idea!

Wayne Rockett

That is a really good idea.