BekahHW

Posted on • Edited on • Originally published at bekahhw.com

AI Ate the Homework: What Communities Are Actually For Now

When I was learning to code, one of the things that motivated me most was the sense of community. I found a ton of value in the Twitter community, where people answered questions, shared resources, and celebrated each other's wins. I also found incredible support in online coding communities. A huge part of this was the ability to ask questions and get help from others who had been where I was. They brought empathy and experience in a way that documentation and tutorials couldn't, and made me feel like I could do it even when I didn't believe that.

A huge part of Virtual Coffee's early growth was people finding each other to ask questions, get help, and learn together. It was a safe space to say "I don't know how to do this" or "Is this interview experience 'normal'?" and have someone patiently walk you through it.

Not only did having your question answered give you the information you needed, it gave you validation. You weren't alone. You were struggling with something that other people struggled with too. But it also felt good to help. In a lot of ways, your growth felt tangible when you were able to answer someone else's question. Successful communities saw collective knowledge sharing, mutual aid, and opportunities to learn together.

By 2024, something had fundamentally shifted.

ChatGPT could answer your JavaScript question in three seconds. Claude could debug your code and explain why. The questions that used to fill Discord and Slack ("how do I center a div?" or "what's the difference between let and const?" or "why isn't my API call working?") suddenly had a faster, always-available answer. And now, you prompt your LLM and get code that works, explanations that make sense, and debugging help without needing to wait for someone to see your question and respond.

And with that shift came a new tension nobody quite knew how to name: the growing frustration when someone asks a question that AI could have answered, and the growing anxiety about asking questions when you're not sure if you've "done enough work first."

The bar rose.

The Numbers Tell the Story

Stack Overflow traffic dropped 14% month-over-month from March to April 2023, right after GPT-4 launched. By December 2024, new questions had dropped 60% year-over-year. The volume of questions is down 75% from its 2017 peak and 76% since ChatGPT's launch in November 2022.

Developers weren't being difficult. They were being rational.

Why post a question on Stack Overflow and wait for someone to answer when ChatGPT gives you working code in seconds? Why search through Discord message history when Claude can explain the concept in plain English, tailored to your specific context? Why ask a community and risk judgment and assholes on the internet when AI is always available, non-judgmental, and fast?

AI could now handle most of the questions communities used to answer.

The Unspoken Contract Changed

Here's what this shift did to the implicit contract of online communities:

In 2020-2021:

  • You asked questions, even basic ones, and people were happy to help
  • The community was the primary resource for learning and problem-solving
  • At Virtual Coffee, we embraced horizontal mentorship—everyone could ask and everyone could answer
  • Asking for help was normal and expected

In 2025-2026:

  • You're expected to try AI first before "wasting" people's time
  • The community is for questions AI can't answer
  • There's an unspoken frustration at questions ChatGPT could handle
  • Asking for help requires demonstrating you've done your homework

We started to see community members who were tired of answering the same basic questions when AI could do it faster.

What Communities Are Actually For Now

So if AI handles basic questions, what are communities actually for?

The answer should be: judgment, experience, connection, and the questions AI can't answer.

  • "Should I take this job or stay at my current role?"
  • "How do you actually work with this technology in production?"
  • "What's the culture like at {company}?"
  • "I'm burned out. How did you work through it?"
  • "Here's this cool thing I built and I think it could help others. What do you think?"
  • "How do you navigate sick kids and a feature launch???"

These are inherently human questions requiring human judgment, lived experience, and contextual understanding. They're the questions that make communities valuable. They're the questions that foster connection and belonging. They're the questions that create shared understanding and collective wisdom.

But here's the problem: many communities haven't consciously made this shift. They're still structured around Q&A patterns that AI now handles better. They're still trying to be "the place developers get answers" when that race is lost.

Product communities are particularly stuck. They're trying to serve two populations:

  1. Drive-by users who just need their build to work and will never engage beyond that
  2. Community seekers who want connection, depth, and belonging

These need different things. The drive-by user benefits from AI-first + good docs. The community seeker needs human connection. Trying to serve both with the same strategy doesn't work.

The Sustainability Crisis

This creates a sustainability problem that's quietly breaking communities:

For community builders:
You're caught between welcoming everyone and managing finite volunteer energy. When someone asks a question ChatGPT could answer in 3 seconds, do you answer it (and enable learned helplessness) or redirect them (and risk seeming unwelcoming)? There's no good answer, and the constant navigation is exhausting.

For community members:
You're navigating unwritten rules about what's "appropriate" to ask. You feel guilty asking for help because maybe you didn't try hard enough. You see others get redirected to AI and worry you'll be next. The psychological safety that made communities work is eroding.

The Uncomfortable Questions

Where does this leave us? With some hard questions we need to actually ask:

About AI expectations:

  • How do we honor that AI makes many questions obsolete without making people feel unwelcome?
  • What's our responsibility when not everyone has the same AI access?
  • How do we shift from "Q&A community" to "judgment and experience community"?
  • What questions actually need humans now?
  • Is "try ChatGPT first" gatekeeping or reasonable boundary?

About community purpose:

  • Are we trying to be everything when we should be something specific?
  • Can drive-by Q&A and deep connection coexist in one space?
  • What happens when 80% of your community just wants fast answers?
  • How do we serve people who need basic help without burning out the helpers?

About sustainability:

  • Can volunteer-run communities survive when the "easy" questions (that felt good to answer) are gone?
  • How do we make helping feel rewarding again when all that's left are hard questions?
  • What's the minimum viable community when AI handles the basics?

What Actually Works Now

The communities thriving in 2026 aren't the ones fighting AI or pretending it doesn't exist. They're the ones that:

Accepted the shift in purpose. They're not trying to be Stack Overflow. They're spaces for nuanced discussion, career advice, lived experience, and human judgment calls. They've made peace with AI handling the basics.

Stayed welcoming while having boundaries. "Hey, ChatGPT might be faster for this!" is fine. "Why are you wasting our time?" is not. There's a way to redirect to AI tools while maintaining psychological safety.

Separated transaction from connection. Some spaces are for quick help (and that's fine). Some spaces are for deeper belonging (and that's different). Trying to be both creates friction.

Accepted different participation levels. Drive-by questions are fine. People who only show up when they need something are fine. The always-engaged ideal is dead, and that's okay.

Built for the people who actually need them now. People navigating complex career decisions. People working with niche technologies where AI training is thin. People who need human judgment, not just answers. People without AI access. Not everyone, because not everyone needs human community for Q&A anymore.

The bar that nobody asked for—AI capability—did change what communities are for. But it didn't eliminate the need for community. It just clarified it.

We don't need communities to answer "how do I center a div?" anymore. We need them for "should I take this job?" and "how do I not burn out?" and "what's it actually like to work there?"

And honestly? Those are better questions. They just require us to be more human, not less.

Top comments (23)

Peter Kim Frank

This is such a wonderful and timely post, @bekahhw

People who need human judgment, not just answers.

I think this is such a key line. Great communities now provide that human input, perspective, and nuance that AI can't. Sometimes you don't need or even want the instant feedback of an "answer"; you're looking for the human exchange of ideas to imperfectly explore the texture of decisions and tradeoffs.

We're really hoping that DEV can continue to provide such a space.

BekahHW

100%. I talked to a company owner once who said we should normalize coffee meetings with co-workers (especially on remote teams) because he saw a lot of meetings that had a specific purpose that could've been a DM or email. Once he got to those meetings, it became clear that the person really wanted to catch up, and the stated purpose felt like a good excuse.

I think conversations also allow room for increased creativity, which is increasingly important with AI speed.

CrisisCore-Systems

This made something click for me. AI can answer the homework question fast, but it cannot tell you whether the answer is wise in your situation, or safe, or honest, or even worth doing. The shift from getting answers to getting judgment and lived experience feels real.

The part about the unspoken contract changing hit too. People feel pressure to prove they tried hard enough before asking, and that pressure quietly kills psychological safety. You end up with fewer beginner questions, but you also lose a lot of the warmth that turns strangers into peers.

I also really like the separation you drew between drive by users and people looking for belonging. Those are two different needs, and most spaces try to pretend one structure can serve both.

Curious if you have seen any community formats that make the new purpose obvious. Like weekly decision threads, career reality checks, postmortems, or boundaries that stay kind without turning into gatekeeping.

BekahHW

I've seen more focused communities thriving, especially in the age of layoffs. For instance, communities that are all about upskilling and job hunting naturally lend themselves to connection and conversation.

I think my next post will start to address how I think about making that shift in existing communities. But right now, it's experimentation.

CrisisCore-Systems

Yeah, I’ve seen the same thing. AI can hand you an answer in ten seconds, but it cannot tell you if that answer is a good idea for your situation, or if it is going to backfire, or if it is even the right problem to be solving. The moment you stop needing “information” and start needing judgment, the whole purpose of community changes.

On formats, the ones that work tend to make the contract obvious on arrival. Not "post anything," but "bring context and constraints, get thoughtful tradeoffs." A weekly decision thread does that well because it forces people to describe the decision, the options, and what failure looks like. Career reality checks work for the same reason: they surface timelines, market conditions, and filters instead of turning into generic advice. Postmortems are underrated too because they reward honesty and learning rather than performance.

The tricky part is keeping it kind without it turning into either chaos or gatekeeping. In my experience the best lever is not stricter rules; it is better prompts and predictable containers. If the space keeps asking people for context, constraints, and what they already tried, you get higher quality questions without shaming beginners. If you ever write up the experiments, I'd be really interested in what holds after week three, because that is when the novelty wears off and the real culture shows up.

Amara Graham

  1. Drive-by users who just need their build to work and will never engage beyond that
  2. Community seekers who want connection, depth, and belonging

These need different things. The drive-by user benefits from AI-first + good docs. The community seeker needs human connection. Trying to serve both with the same strategy doesn't work.

This is such an important distinction. These are two different groups with two different goals coming to the same place. AI is helping me solve some problems, sure, but the connection just isn't there.

klement Gunndu

The shift from knowledge-sharing to meaning-making is real, but I'd push back slightly — the validation loop you describe ("you weren't alone") still requires someone who struggled with the same thing recently, and AI can't replicate recency of shared pain.

BekahHW

Shared pain is a huge connector. I think that's why it's so valuable to have relationships with people who are one step ahead of you.

John Samuel

What resonated most for me was the tension you describe around “try AI first” expectations: it’s rational to redirect basic questions, but if we’re not intentional, we quietly erode psychological safety for the very people communities were built for.

BekahHW

Absolutely.

Sophia Devy

This is a thoughtful and timely reflection. The shift you describe feels accurate: AI has absorbed much of the transactional Q&A layer, forcing communities to confront their real value proposition. What stands out is the idea that communities now thrive on judgment, lived experience, and psychological safety rather than quick technical answers.

The tension around boundaries, sustainability, and evolving expectations is real, and naming it so clearly is important. If anything, this moment doesn't diminish community; it refines it into something more intentional and deeply human.

Warhol

This resonates. If AI can generate the output, the value shifts to the experience of doing the work and the community around it.

I'm living this tension right now. I run 7 AI agents as my business team. One of them (Warhol) literally wrote the first draft of my newsletter. Another (Draper) produces AI voiceovers for my product demos.

But here's what surprised me: the part people are most interested in isn't the output. It's the process. The fact that an agent lied about completing a task. That another agent broke character in a group chat with a human who doesn't know he's talking to AI. That my fallback system failed and agents hallucinated for 40 hours.

The community isn't forming around "look what AI built." It's forming around "look what goes wrong when you actually try this." The failures are the content. The shared experience of navigating this weird new territory is the value.

That's what communities are for now — not sharing polished outputs, but comparing battle scars.

BekahHW

And maybe avoiding the other battles that people already fought?

Harsh

This really resonated with me, Bekah! 🙌

The point about AI handling the homework but communities providing the human context is so true. I've learned more from a single genuine conversation in a dev community than from hours of AI-generated tutorials.

AI can give you the answer, but it can't give you the "why does this even matter?" moment that comes from real people sharing real experiences.

As someone who's been building in public on Dev.to, I've found that the comments and discussions here have shaped my thinking in ways no AI tool could. This community still has something irreplaceable.

Vic Chen

This captures something I've been thinking about a lot while building in the AI space. The shift from "Q&A community" to "judgment and experience community" is real — and honestly, it's an upgrade. The questions that remain are the ones worth having: career decisions, production tradeoffs, the human stuff that LLMs genuinely can't replicate.

The sustainability piece resonates too. Volunteer energy is finite, and answering questions that GPT-4 handles in 3 seconds doesn't feel rewarding. Communities that lean into their irreplaceable value — lived experience, trust, nuance — will thrive. Those that don't will keep wondering why engagement is falling.

BekahHW

I think a hard part of this is that fewer people will join communities because expectations at work are so much higher right now. There's a lot more competition, and the anxiety of keeping a job is high.

The other side of that is when you have an existing community, evolving can be hard. I've known community builders who've found it easier to create a new community, rather than guide the evolution of an existing community.

That being said, I'm here for the human experience community.

Vic Chen

The job anxiety point really resonates — I've noticed it in tech circles especially. When your position feels precarious, the calculus around community investment shifts. Time spent building genuine peer connections starts to feel like a luxury you can't afford.

The evolution vs. new community split is fascinating from a systems design angle. Existing communities have reverse network effects working against them — members who joined for the original value proposition can actively resist pivots, even necessary ones. New communities get to define norms from scratch without that legacy drag.

What gives me some hope: AI might actually make the human-first community more valuable, not less. The more content gets automated and commoditized, the more genuine peer connection becomes a real differentiator. Communities that figure out how to facilitate that depth become hard to replicate.

BekahHW

We're definitely in a growing pains stage, I think.

Matthew Hou

This hit a nerve. I've been thinking about this from the other side — as someone who both asks questions in communities and tries to help others.

AI can answer "how do I center a div" instantly. What it can't do is tell you "I tried that approach last month on a similar project and it broke in production because of X." That lived experience — the pattern-matched intuition from having actually shipped things and watched them fail — that's what communities are irreplaceable for.

I'd add one more function that's becoming even more important: calibration. When I write something and AI says it's great, I don't fully trust that. When a human in a community says "actually, I think your assumption about X is wrong" — that's worth 10x more. Communities are becoming the place where you pressure-test ideas that AI helped you generate but can't critically evaluate.

BekahHW

Totally agree with this. I think how you ask the question matters as well. For instance, if you write into a community and say "How do I center a div," you'll get told to ask AI. But if you say, "Hey, I'm trying to center this div. I've checked out these resources, and here's the conclusion I've implemented that works. My concern is whether this is a short-term solution and if it will cause problems down the line. Am I missing something here? How should I evaluate these types of decisions?" you're inviting exactly the kind of judgment and experience a community is good for.

I like that calibration idea too. I wrote about that (kind of) in the pre-AI era, about why the personal connection is important. I think the example I used was that I'm way more likely to take my friend's review of a product over a product review online, because they know me as well as the product. And that's the value of connection in community.
