DEV Community

I’m Learning AI in Public, and I Think Developers Need to Chill a Bit

Jonathan Murray on March 24, 2026

I’ve been going hard learning AI and tech. Building, breaking stuff, rebuilding it, reading docs at weird hours, trying to connect dots faster than...
Ryan Lay

I appreciate your rule: "assume the person is smart, and my explanation is the bottleneck". That's a really big perspective shift that keeps an exchange from turning condescending and makes it an educational experience instead. A lot of the time, especially when there's a language barrier or somebody is speaking a non-native language, there's an assumption that their experience and knowledge are only as good as their ability to communicate.

Jess Lee

Great post. It really is easy to forget that we're early - especially because changes are happening so fast it can feel like we're behind. But we're not!

I had a similar experience teaching my totally non-technical partner how to vibe code. Keeping him encouraged was definitely a big component of figuring out how to translate it all.

Winter

Great post! I do not think the issue is only that developers are “ahead.” The developer population is large, and the field itself is vast, encompassing many subfields. Individuals working in emerging areas such as AI, blockchain, and quantum computing may appear to be at the cutting edge, but these represent only a subset of the broader technology landscape.

Many professionals continue to work on applications, infrastructure, security, and other foundational areas. These roles are critical, even if they are not perceived as being at the forefront of innovation.

In reality, I believe many developers feel "behind," just like most of the general population, since no individual can master all of technology and everyone is trying to catch up. The issue you're describing has many root causes in my opinion, but one of the fundamental problems is that impostor syndrome is prevalent within the technology community. The expectation to appear as an expert, even when one is not, is widespread, and that pressure has only intensified with the rapid emergence of new technologies. As a result, individuals may feel compelled to project more knowledge or competence than they possess, because not knowing isn't acceptable; they often come off as arrogant, either because they lack the knowledge to explain or because they're gatekeeping what they know.

True mastery and confidence should arise from different sources than most believe. Mastery is not merely knowing a concept, it is the ability to explain it clearly and teach it effectively to others. While acquiring knowledge might be challenging, transferring that knowledge in a comprehensible manner is significantly more difficult. Confidence, similarly, should not stem from displaying expertise, but from a clear understanding of one’s values, strengths, and identity.

Do not allow external perceptions or internal doubts to prevent you from supporting others. Helping others creates a positive cycle and empowers your own growth. Reflect on your own early experiences: uncertainty was unavoidable, and guidance would have been valuable. By offering that support to others, you contribute to a culture of mutual growth, and ideally those you help will extend the same support forward, strengthening the community over time.

Life is short, but the impact you can have on others is substantial. That is life’s force multiplier.

Jonathan Murray

Substantial thoughts here! Thank you for your thoughtful addition :)

Elyse West

Such an insightful post!! Your points are not only highly relevant to AI, but to leadership, community, and inclusivity in general. The beauty of AI is that it's actually a great moment of technical reset. It is rapidly advancing, but you don't need a computer science degree to understand LLMs or even RAG. Hopefully that ultimately leads to fewer people feeling like impostors, and not another ivory tower being built in the tech landscape.

Jonathan Murray

Thanks Elyse!

Swift

I really love the point about how early we are in this journey. There are only 27 million software engineers in the whole world today. There are a BILLION people who are about to have the power to code and build agents with AI. Only 3% of the market that's coming exists today!

Jonathan Murray

Completely agree, we don’t know what we’re about to embark on and there are going to be folks who can’t code right now that build a generationally changing product with natural language. 🤯

L. Cordero

Great read, thank you. Feels really grounding and I appreciate your perspective.

Jonathan Murray

Thank you for your feedback!

Nova Elvaris

Your rule about "my explanation is the bottleneck" is something I wish more technical writers internalized. I've been writing about AI workflows for developers, and the hardest part isn't the technical content — it's resisting the urge to front-load jargon that makes me feel credible but makes the reader feel lost.

The dad-studying-for-a-drone-license story perfectly illustrates something I've been thinking about: the real measure of whether AI tooling is working isn't developer adoption metrics — it's whether a 73-year-old can pick it up and do something meaningful with it on day one. That's a much harder bar to clear than "developers love it."

One pattern I've found useful for the "stadium test" is what I call the "so what" chain. Every technical concept gets followed by "so what does that mean for you?" until you land on something concrete. Context window → your assistant forgets stuff after a while → so keep your important notes in a file it reads every time. Three steps from jargon to actionable. What's your process for finding that translation layer when you're explaining something new?
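The last link of that chain ("keep your important notes in a file it reads every time") could be sketched in a few lines. This is a hypothetical illustration, not anything from the post: the `notes.md` filename and the `build_prompt` helper are made up for the example.

```python
from pathlib import Path

NOTES = Path("notes.md")  # hypothetical persistent notes file

def build_prompt(user_message: str) -> str:
    """Prepend the notes file to every turn, so the assistant
    "re-reads" it each time instead of relying on a context
    window that eventually forgets earlier messages."""
    notes = NOTES.read_text() if NOTES.exists() else ""
    return f"Project notes:\n{notes}\n\nUser: {user_message}"
```

Every concept in the chain maps to a line here: the context window is what the docstring works around, and the concrete action is one `read_text()` call.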

Mykola Kondratiuk

honestly the anxious dev reaction makes sense if you think about it from a career risk angle - people are pattern-matching to past automation waves where the bottom of the skill ladder got wiped first. the difference this time is it hits the middle first (boilerplate, glue code, ticket-to-PR stuff) which is where a lot of career progression used to live.

what i keep seeing on the PM side is the people who chill about it fastest are the ones who got curious early. not because AI is great, but because they stopped treating it as a threat to defend against and started poking at the edges. learning in public like you are doing is exactly that.