Imagine it’s 2011. The web is mostly server-rendered PHP templates, maybe a bit of jQuery if you’re feeling fancy, or sometimes no JavaScript at al...
Very interesting point. I once read that the Roman empire could have invented the steam engine and electricity. But manual labor was so cheap that tech progress promised no value.
In 2011, we had more fragmented browser engines compared to 2026, which is why we needed MooTools and jQuery in the first place. Visual cross-browser development still isn't the LLM's strongest field right now, and I can hardly imagine how they'd have coped with it back then; I guess Flash was also still around. But let's just imagine they did: either we'd be stuck in the millennium mindset, or all browsers would support TypeScript natively, Angular, React, and Vue would all have converged into a common, AI-written pattern library, and software hype cycles wouldn't have dared to introduce as many breaking changes. So, honestly, maybe I'd prefer that alternative reality.
I absolutely love this comment 😄
Also… you just reminded me that Flash existed. Somehow my brain keeps conveniently forgetting it 😅 I tried a few times back then, but my system just refused to fully commit to learning it.
And that alternative future you described, with native TypeScript support in browsers and some kind of converged, AI-shaped pattern library… I have to admit, I really like that vision. That said, I’m not sure I’d be that optimistic. My gut feeling is that we’d still eventually arrive at something like what we have today, just… later.
But then again, maybe later would also mean better.
Repeat the same thought experiment in 2030! Then we'll know where the current LLM hype went and what it did to coding, maintenance, software architecture, and management decisions. I think the current kind of AI arrived too quickly and too early, and nobody was really prepared, so the software scene is still moving on with its updates and breaking changes, and AI will have a hard time catching up with all the outdated code out there. Then again, Stack Overflow had declining user engagement and its top-voted answers were becoming obsolete, all without any AI. In the end, maybe AI doesn't make that much of a difference after all in the long run?
Haha, we’ll see! 😄
But this is exactly what I mean. AI often struggles to keep up and ends up suggesting outdated solutions. Even in simple demos I sometimes get older Tailwind versions pushed on me. Small thing, but it says a lot.
And you’re right, the focus is already shifting. At conferences, instead of talking about new technologies, we’re more and more talking about our “armies of agents”.
So yeah, in the best case, maybe AI won’t make that big of a difference long term.
But I really hope I’m wrong and that by 2030 we’ll all have flying cars parked outside our houses 😄
I have a few questions.
If letting the frontend orchestrate the state was such a good idea, why are modern JavaScript frameworks going back to server-side rendering?
What makes the web modern?
Does AI speed up innovation?
What are the new ideas that really made a difference?
If I had to answer what makes the web modern for me, it would be making it possible to view a website on different screen sizes, and making sure the data you send to a website is not intercepted.
Do things like server side rendering and keeping state in the browser really matter? They are just tools that need to be used in the right circumstances.
I think what shocks most developers about AI use is how little people care about what makes a technically good website. As a PHP developer, I was aware of this long before AI, through the continued use of WordPress.
I think as developers we have got ourselves a bit of a god complex. We tend to over-engineer. Only the newest technology is good enough.
And because AI is trained on "old" code I think a lot of developers experience a mental breakdown. Use AI to work faster, but what about the hottest new thing?
I think the developers who work with a more conservative mindset are better equipped to add AI to their workflow, because they are not going to use all the AI features all at once.
Innovation is not driven by AI. Innovation is driven by humans.
It is the massive compute and memory at the base of AI that is accelerating some of these innovations. But that was already driving innovation. Not so long ago I saw the term "supercomputers" mentioned a lot when breakthroughs were announced.
So the question is how much is AI contributing to innovations?
As for the question "Are you comfortable with index.php?": I say yes. It is an entry point for the application, something found in every language, including JavaScript. Most call it main.
David, I swear I was thinking about you yesterday, like “he’d have a great take on this”, so I guess I summoned you 😄
I both agree and disagree with the idea that AI doesn’t drive innovation. On its own, sure, it won’t invent anything groundbreaking. But it’s incredibly good at structuring thoughts and speeding up iteration, which can indirectly accelerate innovation. It won’t come up with anti-gravity engines, but it can definitely help us get to good ideas faster.
And on the point about developers overcomplicating things… 100% agree. For me, this connects a lot with things like SSR. I get that we often build it for SEO, but when I see critical bugs in something like React Server Components, I keep wondering: if we know we don’t need SSR and probably never will (like in some internal or government apps), is it really worth forcing it in?
I’m working with Angular in that kind of setup, and honestly, the simplicity and stability saved me more than once.
The funny part is, the longer I write this, the more I realize I actually agree with you 😄 Although I’m not fully convinced that “modern web” is just responsiveness and security. I remember building things 15 years ago and… it was rough. We’ve definitely come a long way beyond that.
I agree with you, modern isn't only security and responsiveness. Those were the two things that I came up with when I was thinking about the foundations of a modern website.
A component based workflow is a big improvement over the almost static templates of the old days. But I do think we didn't need to move render engines to the browser. So I consider that less of an improvement than the other two things.
You are right, AI is a good tool when used wisely. When you read more than just the headlines, people who are deep in AI development acknowledge that they already had the idea before the AI took the same route. This shows the potential, but we are not there yet. And maybe we never will be?
Doing the same thing for the same problem/feature is not a bad thing, and that you can leave up to an AI. It is the solutions that aren't that straightforward that will require our attention, even with AI.
Again, a great point.
I personally think SPAs are great, but exactly as you said, they’re just a tool, not a silver bullet. For me, they work really well in enterprise or government-type applications, but can be quite painful in e-commerce.
And yeah, the more complex, exploratory problems are still very much in human hands. A friend of mine builds advanced game engines and he often says AI just has no answers to his questions 😄
Absolutely true, innovation is made by humans.
@steve_fabricetegawendek Yes, BUT!!! 😄
Have you tried creating something with AI, whether it’s a story or an idea for an app? It’s honestly a powerful accelerator. Let’s not pretend it isn’t 🙂
This is a fascinating perspective. The idea that AI acts as a "stabilizer" by reinforcing the most documented patterns is a great counterpoint to the usual "acceleration" narrative. It makes sense that if we remove the friction of 2011-era development, we might have lost the incentive to innovate toward the modern stacks we use today. Excellent food for thought.
Thanks so much! 😄
Yeah, exactly — instead of React newsletters we’d probably be getting “top 10 index.php tips” in our inbox 😂
2011 huh. IIRC PHP and JS were not that dominant.
Flash and Java applets were the real deal around that time.
If there were AI in 2011, we would have really wacky Flash sites !!
And really insecure Java applets !!
Hahaha, thanks for bringing back those beautiful times 😄
And yes!! I remember that too! You could literally download a broken Java applet, poke around, fix something locally and just run it again like nothing happened 😂
Website owners hated this one trick. 😂
Hahaha, I remember that too 😄
There was a moment when people were seriously predicting the end of JavaScript… and here we are, with half the internet running on it and node_modules heavier than entire operating systems 😂
Now that I think about it, let's say each megacorp owned its own LLM tech stack. It would be a wild battle royale. We would have a modern web, but not an open web.
1.) Sun making their own Java browser to run exclusive Java applets and content. Everything would be a Java desktop app, and the web would be a minor secondary part. Compatible across all operating systems: Windows, Macintosh, Linux, iOS, Android, Symbian, Windows Mobile.
2.) Microsoft and Adobe making a Silverlight browser, which would allow them to assert Flash dominance. It would run on Windows and Windows Mobile, but everyone else would have to use some sort of translation layer. You could run macros on Microsoft websites.
3.) Apple would have their own browser which would allow you to visit only a limited set of websites. Those websites would need to be written in some sort of SwiftUI, and they could only be hosted on Apple servers, which provide their own translation layer.
4.) There would be an Opera browser available that could browse the whole internet. People would use it to download funny LLM ringtones from Zedge. And it wouldn't break most websites.
5.) There would be one separate bank app for each bank, and it could only run that bank's website. This would be a repackaged Sun browser. It's like Proton, but in Java.
6.) Google would be a search toolbar and not a website. Every query would open the intended app. Google would pay users to click and download new apps, or to listen to ads for 45 seconds.
Hahaha yes, exactly 😄
And don’t forget, in the Apple browser those same websites that are free in the Microsoft one would suddenly be behind a paywall 😂
Also I can totally imagine those 45-second ads per search… like “watch this ad to unlock your query” xDDDDD
First, congratulations on your next stand-up: Rewrite or Refactor!
I also like to play with time, and I think your perspective is valid. LLMs/agents are strong on average knowledge, which causes stabilization.
But they also improve creativity, because they speed up the creative thinking process on any kind of topic.
Last: I drink my coffee much faster than AI can solve any problem.
Exactly! 😄 Rewrite or Refactor does feel a bit like a stand-up sometimes, although there’s definitely plenty of technical depth in there too!
And I totally agree on the creativity part. For me, AI is a huge creativity boost as well, it really speeds up the thinking and experimenting process. But… yeah, there’s a catch. To actually use it that way, you need to bring some creativity yourself first. Otherwise it just reinforces the obvious paths.
And also, Mordor coffee absolutely wins this race ☕😂
Interesting post. I'm seeing fewer projects that are genuinely solving a problem. In the early days of new tools, people would say "are you crazy, that's impossible", and yet those tools ended up transforming the software industry. Now it feels like we are mostly stacking abstraction on top of abstraction, and fewer people are pushing the envelope with crazy ideas that could actually change the industry by creating something that makes our lives easier.
Btw, I wasn't writing code back in 2011, but I experienced that exact era for fun when I started learning, before touching fancy frameworks. That raw friction of figuring things out from scratch helped me a lot.
Oh yes, this is so underrated.
That whole phase of digging into the fundamentals, syntax, how things actually work under the hood, connecting the dots yourself, it pays off for years. I went through a similar phase at some point in my career, spending a lot of time on exactly that, and I still rely on it today.
It really changes how you see everything later. Frameworks stop feeling like “magic” and start looking like what they actually are, just tools built on top of those same fundamentals.
That’s also why I’ve never really been a fan of any particular framework. For me, they’re just tools. Useful ones, sometimes great ones, but still just tools 🙂
Did you get your beer using the htbmcp protocol? 😁
Hahaha Pascal, absolutely NOT! 😄 If it had been the htbmcp protocol, there definitely wouldn’t have been a hangover 😂
honestly, no - I think we'd have gotten to the same place faster. the frontend complexity wasn't driven by a lack of AI, it was product demands. AI would've just compressed the jQuery-to-React timeline.
Haha, we’ll see in 5 years 😄 I really hope you’re right!
And yeah, these thought experiments are fascinating. I love how they force you to look at the same reality from a completely different angle and suddenly things you took for granted don’t feel so obvious anymore.
2031 reminder set. and yeah - the best thought experiments are the ones that make you lose confidence in your priors halfway through
Hahaha, yeah, and then we’ll probably be sitting there wondering what things would look like without AI 😄
Like I said, I really hope I’m wrong and by then we’ll all have hydrogen-powered flying cars parked outside our houses!
the flying cars + no AI doom combo would be genuinely nice :) and yeah - those thought experiments that flip your confidence halfway through are the only kind worth having
Yeah, exactly, nothing beats a good thought experiment 😄 Especially right before sleep, just to make falling asleep a bit harder 😅
Disclaimer: I'm not a fan of "ai" for a lot of reasons, but most of all because it's essentially really advanced T9/Markov chains on steroids, and because a lot of people who drink the koolaid seem to forgo critical thinking. So if there's disdain in my tone, you're absolutely right. However, this disclaimer is relevant, and I am going to respond in depth.
You touch on exactly the core of "ai".
Because the inconvenient truth is that "ai" has a knowledge horizon problem, and much like how the human brain works, its inherent internal statistical model and inference models merely improve on statistical connections between the internal vectors.
In human terms and as you already mentioned, "ai" favours the most common/valued paths.
Consequently, when an "ai" engine does not have a common path, it will try to extrapolate a solution, which leads to modelling outputs that resemble existing acceptable paths/patterns.
I.e. "ai" hallucinates!
One of the other challenges here is that the way we've trained "ai" implementations/models has always been focused on externally controlled input. Or in other words, we pre-chewed the input for the machine. We gave it baby food. And because of that, to my best understanding at least, "ai" does not have a fundamental understanding of the deeper relationships between different concepts.
"ai" knows how to express a mathematical function, and "ai" can (re)produce functions, but it does not have a more fundamental understanding of what E=mc² means than you or I do, beyond that it statistically has to do with the speed of light.
Now... With all of that said...
I think the "ai" companies let the "genie" out of the box too soon. Because money needed to be made. I don't think humanity was ready for "ai".
I think that if current day "ai" was transported to 2010 or even 2000, it would've stifled progress. Because like you and others said... It just optimizes what exists and without money driving the need to progress and innovate, I don't believe that creativity would have come to this point.
Maybe some hackers would... But that lands me to the point of my disclaimer...
The magic thinking machine that is "ai" also seems to inhibit critical thinking.
If "ai" was more mature, then maybe. But looking at how management is already wooed by current "ai" and looking at the current state of "civilization", I have a really hard time seeing how "ai" would make a positive change at any point in the past.
Because after all, doesn't programming distill to a lot of the same patterns, repeated ad nauseam?
Which drives home that "ai", as really advanced T9, is great at repeating things.
But "ai" can't synthesize that.
So we need to create content for "ai" to ingest, but if it can't synthesize and distill the underlying fundamentals, progress will always be restricted by our dependency on "ai".
And colonialism/capitalism says that progress isn't as important as making money.
So I have a hard time believing that we'd do better 10 or 20 years before than we do currently.
There's just more content "ai" could be trained on right now.
Just my 2 cents.
Very interesting perspective, there’s definitely a lot of truth in that.
I think tools like this, used consciously and in the right hands, can really speed up the more repetitive parts of the work. They’re also great for quick iteration or even just structuring thoughts.
I especially liked your point about business and innovation, because those two don’t always go hand in hand.
We’ll see how it all plays out. Thanks for such a thoughtful and in-depth comment 🙂
This is a good thought experiment. There is only so much you can stretch improvisation or innovation through upgrades before reaching a saturation point. The present iterations of the Bluetooth and WiFi protocols are essentially production grade, with little scope for innovation beyond minor optimizations.
While React may not be a perfect frontend framework, it's likely the peak of what can be achieved within the limitations of current W3C standards. There are no "framework tricks" left in the bag; any meaningful innovation now must come from an upgrade to the standards themselves. I doubt even WebAssembly can push the needle much further on its own.
If LLMs had existed in 2011, they would have hallucinated solutions using the tools of the era - jQuery, CodeIgniter, WordPress, or Backbone.js. Yet, that small group of innovators would have still conceived something like React, because the architectural vacuum for a reactive state library was still there to be filled.
Exactly—and that highlights the difference between optimization and paradigm shifts.
Most progress happens as refinement inside an existing ceiling: faster protocols, cleaner abstractions, better tooling, lower friction. Bluetooth, WiFi, and modern frontend ecosystems are examples of systems that have matured to the point where gains are increasingly incremental rather than revolutionary.
React’s success wasn’t just clever engineering—it arrived because the web had reached an architectural bottleneck. Complexity in UI state management had outgrown the tools of that era, and React solved a structural problem the ecosystem was ready for. That kind of leap doesn’t happen because someone stacks more tricks onto old tools; it happens because constraints expose a vacuum.
That’s also why LLMs, if dropped into 2011, would likely remix existing patterns rather than originate the next paradigm. Models are strongest within the language of the present. They extrapolate from what exists. But genuine breakthroughs often emerge when someone notices what doesn’t exist yet.
Innovation usually comes from tension: when current standards, assumptions, or primitives can no longer support new demands. At that point, the next React, next protocol, or next computing model appears—not as magic, but as a necessary response to accumulated friction.
So the frontier may not be “better frameworks” or “smarter autocomplete.” It may be new standards, new primitives, new interaction models, or entirely new computing surfaces that today still look unnecessary or impossible.
It's interesting! What do you think about WebMCP in this context?
WebMCP is interesting precisely because it fits this pattern.
If we think of today’s web as reaching maturity in many visible layers—frameworks, rendering models, component systems, deployment pipelines—then the next meaningful shift may not come from prettier abstractions on the page. It may come from changing how software systems connect, reason, and coordinate behind the page.
That is where something like WebMCP becomes relevant. Instead of asking, “How do we build another better frontend stack?” it asks, “How do tools, models, browsers, APIs, and services speak a common language?” That is a standards-layer question, not a framework-layer question.
Historically, those protocol shifts matter more than they first appear. HTTP mattered more than any single website. REST mattered more than many apps built on top of it. OAuth mattered more than countless login forms. Shared interfaces often unlock larger ecosystems than individual products.
If WebMCP succeeds, its importance would not be in flashy features. It would be in reducing friction between humans, AI systems, browsers, SaaS tools, and local software. Once interoperability becomes normal, entirely new products can emerge that currently feel too messy to build.
That said, adoption will be the real challenge. If existing integrations, browser APIs, and plugin systems feel “good enough,” many teams won’t move quickly. Standards only win when they solve enough pain to outweigh migration inertia.
So in this context, WebMCP feels less like a shiny innovation and more like a possible infrastructure move—the kind people underestimate early, then later wonder how they lived without if it reaches critical mass.
I’d push back on that a bit.
React is a great tool, but at the end of the day it’s still just a view library wrapped with an ecosystem. It doesn’t feel like the “peak” of what’s possible on the web.
A lot of what we’re seeing now is iteration on similar ideas, just packaged differently. The real limits are probably in the platform itself, not in whether we use React, Vue, or something else.
So I’m not convinced we’re anywhere near the ceiling yet 🙂
This is a fascinating thought experiment! The idea of AI as a "stabiliser" rather than an accelerator makes so much sense—if it only reinforces what's already common, we might never have felt the "pain" required to move past jQuery or PHP. It really makes you wonder if we're currently in a feedback loop that’s slowing down the next big paradigm shift. Great read!
Thank you so much! 😄
And yeah… I really hope I’m wrong! Innovation might actually move faster but the real question is whether people will want to adopt it if it’s harder for AI agents to work with 😉
Nah we won't
You made a valid point and a good question! If we had had AI in 2011, it would have made my earlier career easier, and school too, hehehe... If we had AI back then, our lives might be very different today. Food for thought!
Exactly, thanks Ben! 😄 You can extend this way beyond IT, even to school.
I honestly doubt kids are still writing essays at home the same way we used to 😅 Teachers are not that naive anymore.
Omg, Sylwia! I should ask my cousins who are in university now. It brings back old memories of writing essays :)
Hahaha, you definitely should! 😄 Ask them how advanced teachers are these days with spotting all this 😄
hahhaha :). I should ask them.
AI doesn't push frontiers, it paves the roads we've already built. The real risk isn't replacement, it's getting so comfortable with 'good enough' that we stop asking what's next
AI doesn’t truly create frontiers—it strengthens and expands what humanity has already built. Its greatest value is making existing systems faster, smarter, and more accessible. But convenience alone has never changed the world.
The real risk isn’t AI replacing people. It’s people becoming satisfied with “good enough” and losing the drive to pursue what comes next. When everything becomes instant and effortless, ambition can quietly fade.
AI should be a tool, not a substitute for curiosity. It can help us move faster, but it cannot decide where we should go. It can provide answers, but it cannot replace vision.
The future belongs to those who use AI to remove limits while still pushing themselves to think deeper, create more boldly, and question what others accept. The issue is not whether AI becomes stronger—it is whether humans remain hungry enough to keep advancing.
That’s a really thoughtful take.
I think there will always be curious people pushing things forward. The real question is whether the more comfortable majority will be willing to adopt those new ideas if what they already have feels “good enough.”
Exactly. Progress has rarely depended on everyone being curious at the same time. It usually begins with a small minority willing to tolerate uncertainty, discomfort, and ridicule long before the majority sees value in it.
The real friction point is adoption, not invention. New ideas often lose not because they are weak, but because existing systems are convenient enough. “Good enough” is one of the strongest forces in society—it protects habits, infrastructure, business models, and identities.
Most people do not compare the future against its potential; they compare it against the comfort of the present. If the current tool works, if the current routine pays, if the current platform entertains, then radical change can feel unnecessary or even threatening.
That means innovators face two battles: creating something better, and making people feel the pain of staying where they are. Until the limitations of the old model become obvious, adoption stays slow.
History shows this pattern repeatedly. Many breakthroughs existed before society was ready for them. Timing matters almost as much as brilliance.
So yes—curious people will always exist. The larger question is whether comfort delays progress by five years, fifty years, or forever.
Exactly—and history suggests the majority rarely adopts change because it is newer or smarter. They adopt it when staying the same becomes more inconvenient than moving forward.
Curious people create the prototypes, take the early risks, and prove what is possible. But mass adoption usually happens later, when the old system starts creating visible friction: wasted time, lost money, poor experience, competitive pressure, or social momentum.
“Good enough” can delay progress for years because comfort masks hidden costs. People tolerate inefficiency they’ve normalized until an alternative becomes dramatically easier, cheaper, or unavoidable.
So the innovators build the future, but the majority often waits for a trigger. Sometimes that trigger is necessity. Sometimes it is status. Sometimes it is simply that the new thing becomes easier than the old one.
That is why many breakthroughs look sudden from the outside. In reality, the idea existed earlier—the conditions for adoption just had not arrived yet.
That might be exactly it!
I don’t think we’d be stuck in PHP forever, but I do think the transition would’ve been slower. Less pain usually means less urgency to reinvent things.
Yeah, there’s definitely something to that! Makes you wonder where we’d actually be today… maybe slowly “discovering” streaming much later instead of being pushed into it by all that chaos? 😄
Interesting take—AI doesn’t just accelerate output, it biases us toward the most “proven” paths. Feels plausible that in 2011 it would’ve reinforced server-rendered patterns instead of pushing SPAs forward.
The real tension now is convenience vs exploration—AI makes shipping easier, but innovation still requires deliberate friction.
Exactly! 😄
But on the other hand… when I look at the frontend ecosystem today, part of me thinks we might actually benefit from a bit of unification in the stack 😅
I know, I’m kind of contradicting myself here… but hey, why not 😄
This really clicked for me. Especially the idea that AI might stabilize things more than it accelerates them.
I’ve been thinking about it from a slightly different angle. It feels less like acceleration and more like information collapsing over time.
If you look at it in phases, it almost tells a story.
Early on, information was scarce. You had docs, maybe a forum, maybe some random blog. You had to figure things out yourself. That forced people to actually understand systems at a deeper level. Slow progress, but strong foundations.
Then came the Stack Overflow and GitHub era. Suddenly answers were everywhere. You didn’t have to solve everything from scratch anymore. You could search, reuse, remix. That’s when things really started to speed up, but also when patterns started to standardize.
After that, frameworks and cloud abstractions took over. Now complexity wasn’t just shared, it was packaged. You didn’t need to know how it worked, just how to use it. Productivity went way up, but fewer people were thinking about what was happening underneath.
And now we’re here with AI. Not just retrieving or assembling, but generating.
That’s the shift that feels different.
The gap between problem and solution is getting so small that exploration becomes optional. Before, friction forced you to think. Now convenience lets you skip that step.
So instead of a lot of people exploring in different directions, it feels like a smaller group is pushing boundaries, while everyone else converges on what AI already knows.
Which doesn’t kill innovation. But it definitely reshapes it.
Less chaos. More consistency. Maybe fewer weird breakthroughs that come from people struggling through things the hard way.
I don’t think we’d still be sitting in index.php 😄
But I do think we might have stayed there longer, just because it would’ve been good enough for longer.
Feels like the real shift isn’t whether we innovate, but how much discomfort we’re willing to sit with before we decide to do something new.
Curious how others see it. Are we trading exploration for efficiency, or just moving where exploration happens?
Exactly, that’s how it feels, we probably would’ve stayed there longer 😄
On the other hand, maybe we wouldn’t have been changing direction every other weekend. Maybe things would’ve been… simpler.
Although at a higher level it always felt like we were iterating on the same core ideas anyway, just in different forms.
Thanks for such a great comment and perspective, really enjoyed this one 🙂
You got the point. Since the generative, transformer-based AI only predicts "the next syllable" (of course, I know the concept is way more complex, but I prefer to focus on this for now)… it's basically stuck on the as-is.
So, if it had been introduced back in 2011, it would have suggested better ways of coding in PHP. Let me add that it would have proposed WordPress everywhere, too.
This is the risk of using it without focusing on its characteristics. No matter what you think about it, it's just an efficient predictive model. It can't introduce new ways of doing things from scratch; it only optimizes the already known ways, if and where possible.
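To picture why, here's a minimal sketch with made-up numbers: sampling the "next token" from a 2011-flavoured distribution. The framework names and logit values are purely hypothetical; the point is that the sampling step structurally favours whatever dominated the training data.

```typescript
// Hypothetical logits for "which frontend approach to suggest" in 2011.
const logits: Record<string, number> = {
  jquery: 4.0, // massively represented in the training data
  backbone: 1.5,
  "reactive DOM diffing": 0.2, // the "next React": barely any examples yet
};

// Softmax sampling with temperature: lower temperature = safer, more modal output.
function sample(temperature: number): string {
  const names = Object.keys(logits);
  const weights = names.map((n) => Math.exp(logits[n] / temperature));
  const total = weights.reduce((a, b) => a + b, 0);
  let r = Math.random() * total;
  for (let i = 0; i < names.length; i++) {
    r -= weights[i];
    if (r <= 0) return names[i];
  }
  return names[names.length - 1];
}

// At temperature 0.5, "jquery" wins ~99% of the time; even at 1.5 the
// genuinely novel idea stays a rare outlier. Optimization, not invention.
console.log(sample(0.5));
```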
Generative AI doesn't "think" the way we do at all. It replicates reality, like most people do, but without any human thinking process behind it. I work for a BPO company that provides call center solutions, for example, and I'm really impressed by the way it mimics human agents' tone of voice, etc., but it has never "created" new approaches. It wasn't built to create anything, despite the hype, but to automate and optimize the existing.
And I think it does this at its best.
Exactly how I see it too, and that’s where my concerns come from.
If AI mostly optimizes what already exists, it makes total sense that it could keep reinforcing current patterns instead of pushing us into new ones. And that’s the part that worries me a bit.
But honestly, I really hope I’m wrong. Maybe this is just the early stage, and in 10–15 years we’ll look back and realize it actually unlocked a completely new level of innovation, something we can’t even fully imagine yet 😄🚀
AI in 2011 might have reshaped the web dramatically—possibly less search-driven, more assistant-driven. But the modern web likely still evolves, just with a very different interface and economy.
Huh, that’s actually a really interesting angle — I hadn’t thought about the web becoming less search-driven!
I was around that whole era. 2011 was peak Ruby on Rails time, and I have to say that it had the same vibes as AI today to me. Suddenly it was much easier and faster to build apps. Lots of new people creating side projects and experimenting. So maybe having AI then wouldn’t have been much different. That experimentation phase and “Web2” boom would still have happened, where new UX patterns emerged.
Oh yes! 😄 I remember that whole Ruby on Rails “I can build an entire app in a day” era so well.
It really did have a similar vibe. Suddenly things felt possible, people were experimenting like crazy, and a lot of ideas came out of that phase.
This is an amazing article and it explains really well what a lot of us are all thinking.
However, using 2011 as an example of the not-yet-modern web and old JavaScript just makes me feel old 😂
Jokes aside, good work. It’s really well written and something to reflect on.
Haha, thank you so much! 😄
And yeah… I know exactly how you feel. Unfortunately none of us are getting any younger 😂
I think you definitely have a point. But I also think you can get around this by doing good engineering with AI: Give it small, focused tasks. To me the "implement me this feature" thing is a mistake. It's tempting to do, and it can work, but it ignores the basic engineering principle of never changing too much at the same time.
Another method of working with AI is to let it basically stay at the level of individual files, or maybe a couple of files if it needs to change them. Honestly the number of files is not important. What is important is: keep the change small and focused, and precisely express to the LLM what you want. If you do that, you stay in full control over the technologies and patterns used. You only lose control over those things if you ask the AI to do too much at once.
You can use the AI as a sparring partner for design, but you should ultimately decide on the design yourself and then use AI in agent mode only to implement the design. That's how I use it and so far pretty successfully.
Is this approach potentially slower? Yes, I think so. But I think it's also much safer, and much more capable of getting around exactly the problem you describe. I think as long as a new technology is documented well, even if AI doesn't have many examples yet it can still do a good job. If you keep your changes small and focused.
Yeah, that’s exactly how I had to approach WebGPU 😄
AI did help a bit, but I really had to feed it tiny, focused pieces and then stitch everything together myself. And even then, it meant chasing errors in the console across different browsers xD
Exactly! This has been my thought since AI came to my attention. It's fantastic at ingesting current knowledge and synthesizing based on that. But where do new paradigms come from? Do they appear at all? I don't see how.
The framing is sharp. I've been shipping a React Native + Supabase side project with Claude Code for 6 months now (750+ commits, solo), and I notice exactly what you describe: AI makes me faster at what already exists, not braver at what doesn't.
The flip side though, before AI, a lot of us didn't ship at all. I wouldn't have built 70% of my app without it. The web didn't need 2011-me armed with ChatGPT; the web needed 2011-me forced to read docs for 3 days. But the app ecosystem today is absorbing a wave of people who wouldn't have participated at all, and some of them will invent things.
So maybe both are true: AI flattens innovation at the frontier, but widens participation at the base. Net effect on "what gets built" might be bigger, not smaller, just from different people.
I actually see it a bit the other way around.
The people who are genuinely interested in innovation will probably iterate even faster with AI. It’s more the “average” developers (which is most of us, realistically) who might stick to the safe paths and not push beyond them.
I do agree that more people will be able to build things now, and that’s great. But without the underlying fundamentals, without going through that phase of really understanding the logic, conditions, variables, all the messy bits… I’m not sure how far they’ll get when it comes to delivering something truly production-ready.
At some point, that gap tends to show.
But hey, we’ll see how it plays out 🙂
Counterfactual that bites for a solo dev: in 2011, you had to slowly understand the entire stack before you could ship. The grind was the moat. In 2026 I shipped 850+ commits in 6 months on TAMSIV with Claude Code, and the difference isn't speed, it's which problems get attention.
Without AI in 2011, my evenings would have been swallowed by boilerplate, build configs, RN gesture-handler quirks. With AI in 2026, those evaporate, and what's left is the part that AI can't help with much yet: figuring out what's actually worth building, what to cut, how the pieces should fit together over months.
So I don't think we'd have the modern web. But I'm not sure the version we get now is more "innovative". It's just less filtered. The cost of trying a bad idea dropped to near-zero, which floods the surface with attempts. The signal-to-noise ratio of the modern web in 2026 might be the actual question.
I kind of feel like, in a way, it’s always been like this.
Sure, building a full end-to-end product used to be much harder, no doubt. But there were still tons of creators out there, people experimenting, shipping things, trying ideas.
Maybe what changed isn’t the number of attempts, but how visible and easy they are now 🙂
My lecturer once said in class that in this era there are so many innovations everywhere that it is difficult to develop a truly new idea, and I agree, because when I look for a new idea, an LLM only gives very generic answers based on previously existing data, even when I give a detailed prompt. I can't imagine an LLM already existing in 2011; maybe we really would have ended up in a very "modern" world of web development according to AI, when in fact it would only be the stability that AI creates. I am not an expert in the field of web development, but the arguments and questions you gave are very good. Maybe the development of the web toward a modern state would have continued anyway, just very slowly, and it would not be as big as it is now. But then again, I'm not an expert just yet; I'm just a newbie and I just wanted to share my opinion.
Yeah, I think you’re absolutely right, LLMs tend to lean toward what already exists, so a lot of the output feels… safe and familiar.
But at the same time, for someone who’s ambitious and likes to experiment, AI can actually feel like a superpower.
It won’t give you a truly new idea on its own, but it can help you explore faster, test more variations, and push things further than you normally would. So in a way, it depends a lot on how you use it 🙂
Very interesting questions - I'm on the fence about this ...
Yes, AI tends to strengthen the status quo when it comes to languages, frameworks, and tech stacks - but, I think "real" innovation (like the invention of the SPA/API paradigm) would have happened anyway - as you already said (very true), innovation is mostly done by a relative small group of people, and those people would/will forge ahead, AI or no AI ...
So, if we had AI in 2011 I think we still would have 'invented' new paradigms like SPA, because the logic (you could even say the necessity) of that happening is inevitable ...
I don't know if it would have gone quicker or slower, but I suspect it wouldn't have mattered much.
Where I do think AI might have a "suppressing" effect is in the proliferation of competing languages, frameworks, and tech stacks - but, you can question whether having a dozen different frontend frameworks (which for the largest part do the same thing, just with different syntax/semantics) is real "innovation" to begin with ...
I think people were reaching a point where they were getting tired of the framework proliferation and the fragmentation anyway - "new framework fatigue" had set in a long time ago, before AI came on the scene.
So I'm of the opinion (and have been for some time) that a bit more "standardization" is a good thing ... but, having only one option (e.g. only React) does take things too far - if that would be the effect AI is having then we need to do something about it.
P.S. of course there's plenty of innovation at the moment - it's just that most of that innovation is focusing on AI itself (LLMs, agents, MCP, RAG, etc etc ...)
Yeah, I get that feeling too! At some point the frontend stack really went a bit wild 😄
We had this phase where things were changing constantly. Redux was “the best thing ever”… and a year later it was already considered outdated. Even the dominant frameworks eventually stabilized, but that earlier chaos was something else.
And you’re right about the shift. If you look at what people share now, a lot of it is AI-related. Achievements, experiments, tools… much less about webdev itself.
That part does worry me a bit too. It feels like innovation in webdev has slowed down quite a lot.
Anyway, thanks for such a great comment, really enjoyed reading it 🙂
Thank you! Well maybe it's that webdev has matured a lot, and there's a bit less space for innovation - but that doesn't (shouldn't) mean it's stopping ... obviously AI is brand new (well, relatively speaking) so that's the thing people are "going wild" ;-) at now ...
For me the most interesting aspect of AI is how we "tame" it, with good structured processes around it, so that it doesn't all end in producing mountains of unmaintainable "slop" full of bugs and security leaks - knowing when to use AI and when not, using it 'responsibly' - and not prematurely firing those devs (even juniors) who still have to guide and check the AI!
That's kind of the point that I'm trying to make in almost every comment ;-)
But your article was certainly food for thought!
Absolutely 😄
I was actually talking to a hacker from AWS recently, and she said she loves AI, because all the code it generates makes her job much easier 😂
This is a really interesting angle, especially the idea that AI might stabilize things instead of accelerating them.
What stands out to me is that AI doesn’t just help us build — it shapes what we consider “buildable.” If it’s trained on what already exists, then by default it keeps reinforcing the current path, not necessarily the next one.
So in a 2011 scenario, I honestly think we wouldn’t have rushed into SPAs the way we did. Not because the idea wasn’t valuable, but because the friction that forced that shift would’ve been reduced. And historically, most big shifts in the web came from discomfort, not convenience.
At the same time though, I don’t think exploration disappears — it just becomes more concentrated. The difference is that instead of many developers experimenting out of necessity, you’d have a smaller group pushing boundaries while everyone else optimizes around stable patterns.
So the web probably still evolves, but the distribution of innovation changes. Less chaotic exploration, more centralized breakthroughs.
The real question then becomes: does that make the ecosystem stronger… or just more predictable?
Yeah, I really like that idea of “concentrated innovation”.
It kind of flips the usual narrative. Exploration doesn’t disappear, it just shifts into the hands of a smaller group, while everyone else builds on top of what’s already stable and well-supported.
And the more I think about it, the more interesting that trade-off becomes. On one hand, you get more consistency and less chaos. On the other, you risk losing that messy, widespread experimentation that used to push things forward in unexpected ways.
I’m honestly not sure which version is better. Maybe we get fewer breakthroughs, but more mature ones? Or maybe we just end up optimizing the same ideas over and over again.
Interesting perspective — your point about AI potentially stabilizing existing patterns instead of accelerating innovation really stands out. Sometimes the biggest shifts in web development came from discomfort, experimentation, and even bad tools forcing better ideas. It makes you wonder whether convenience can quietly slow creativity even while making developers more productive. Great thought-provoking read.
Thanks a lot! 😄 Exactly, I like pushing people (and myself!) to think a bit deeper about these things 🙂
Yes, convenience can kill discovery. AI should be used post-learning, not as a replacement for foundational knowledge.
I absolutely agree with that.
A friend of mine is currently learning C++, and it’s honestly kind of beautiful to watch. Seeing someone go through the fundamentals, things we now take for granted, really reminds you how much depth there is underneath everything we do 🙂
This question genuinely made me pause — always the sign of a great thought experiment.
The "cognitive miser" point is what sticks with me most. And this might just be my though but but i believe that the modern web wasn't built because PHP was uncomfortable for one developer — it was because a whole generation hit the same walls simultaneously and collectively decided to push forward. AI in 2011 might not have killed innovation, but it could've quietly defused that shared frustration before it had a chance to boil over.
No shared pain, no shared revolution.
And the unsettling part? That widening gap between the few who'd still push boundaries and the majority comfortably optimizing what already works — that might be less of a hypothetical and more of a description of right now. Congrats on JSNation
Thanks so much for the kind words and the JSNation congrats!
And yeah… that’s exactly the unsettling part. What if there was no shared pain at all? If everything in PHP just felt smooth, fast, and “good enough” for everyone?
Would we ever feel the need to move forward?
That’s such an interesting question, and maybe (luckily?) one we’ll never actually get to answer.
Really thoughtful piece, Sylwia. The "cognitive miser meets AI" framing hit hard 😅
I've noticed this exact pattern working on frontend tooling: when I ask an LLM for a solution to a common problem, I get 5 clean, production-ready examples. When I ask about something emerging (like View Transitions API or native module federation), the answers become generic or confidently wrong.
Your 2011 thought experiment is especially compelling. I wonder if the real question isn't "would AI have slowed us down?" but "would AI have changed who gets to innovate?"
Because friction doesn't just slow people down — it also filters for who's willing to endure it. If AI removes that filter, maybe we get more experimentation from a broader group, even if the average suggestion is conservative.
Either way, your talk at JSNation sounds like the right conversation to be having. Safe travels (and maybe skip the non-alcoholic beer next time 😉).
Oh yes, exactly! The human + AI combo for innovation is like a turbo boost. But I’m afraid most people will just take the easier path and get a bit lazy with it.
And yeah… after that whole situation I’ve developed some kind of aversion to beer 😅 I can’t even drink the alcoholic one anymore. Yesterday I poured half a glass and didn’t manage a single sip xD
Well… looks like I’m heading into a forced beer abstinence era 😂
The premise is wrong - but you aren't - just missing the point.
First things first: AI has been around since the 1950s. It has influenced coding for as long as there has been coding. Language models have also been around for a long time; what we added in the last few years is the "Large" part of LLM. And yes, it has changed coding significantly in the last few years, but you're confusing coding and frameworks with change in software.
Modern frameworks like Next.js and the like are solving for convenience, not actually adding features to web technology. Everything that Next.js does today was possible 20 years ago, and as one who did code for the web back then: we actually did it all. In fact, we did more than what it offers; it just wasn't that easy to do. If we had had LLM-based coding agents then, we probably wouldn't need modern frameworks.
But actual advances in technology and software? Those needs would still appear. They are generated by scale and market changes, not by "coding vibes". Stuff like Docker would still be needed, since having a repeatable environment that you can deploy everywhere, and that works consistently between dev and production, is a need that grew from scale issues. That problem wouldn't have been affected (much) whether we wrote the code it bundles ourselves or generated it with a coding agent.
Personally, I think web development is in a terrible state - where the same (already solved) problem is being re-solved 15,000 times.
Bottom line: if LLM coding agents had existed 15 years ago as they do today, I think we would have missed some "hip" frameworks that solved for convenience rather than actual technological advancement - which is just fine in my eyes.
That’s a fair perspective 🙂 And just to clarify, I’m fully aware that AI has been around for decades, I actually mentioned that in the second paragraph of the post. My point was more about how the recent wave of LLMs changed the experience of building software.
I’d also push back a bit on the idea that frameworks like Next.js are “just convenience.” Saying that everything was possible before, just harder, is a bit like saying a washing machine is just convenience because we could always wash clothes in a river. Technically true, but it changes how, what, and how much we build in practice 🙂
And I’m not sure I agree that web development today is just re-solving the same problems over and over, unless someone is only building landing pages, forms and dashboards. Once you get into more complex systems, scale, integrations, or performance-sensitive apps, it still feels like very real, evolving challenges.
But I do get your point about trends and “hip frameworks”. There’s definitely some truth there 😄
Awesome article! It’s true that AI agents go with the most trained trends by default, which currently seems to be Next/React + Tailwind + ShadCN. I asked ChatGPT what it would do if it didn’t have to worry about a human reviewing the actual code but rather reviewing the results only, and it said it wouldn’t pick a framework at all, which I thought was interesting.
I think there’s an opportunity to let AI make its own system and iterate on it using the same sort of antagonistic feedback that it uses in physical engineering, where it is asked to take an existing mechanism like a car chassis and make it 40% lighter without loss of integrity. Those result in things we would never have dreamed of but perform better than anything else before it; who knows what would be created for the web if the same approach was taken there?
That’s a really interesting angle.
A friend of mine (CTO) actually experimented with something similar. He gave non-technical managers Claude Code and asked them to build an app. And yes, it did build one… and yes, without any framework, just huge chunks of JS stuffed into a single HTML file. It technically worked, but was completely unmaintainable.
He was laughing that, well… “we’re not there yet” 😄 and said next time he’d at least give them a framework upfront.
So yeah, I love the idea, but based on that experience I’m not quite that optimistic yet 🙂
This is such a fascinating 'what-if.' In 2011, we were still figuring out the 'why' behind patterns like Responsive Design and MVC. If we had GenAI back then, I suspect we would have traded understanding for velocity.
We might have ended up with a 'Modern Web' that works, but is fundamentally un-auditable because the underlying logic was generated rather than reasoned through. We’d be dealing with 15-year-old 'Black Box' technical debt today. Sometimes the friction of manual coding is what actually builds the architectural integrity we rely on later.
Exactly, that could totally have happened! I’m honestly a bit terrified of a 2026 where I’m still writing index.php 😂
This is really interesting, mainly the part about AI stabilizing things. On the other hand, for innovation, I do not think we would have everything we have now if that had been the case. We would have different things, and less advanced than now.
People would have focused on stabilizing and creating patterns rather than creating SPAs. I do not think we would never have gotten them, just later, because the focus would have shifted as things became less uncomfortable. As you said, we are cognitive misers, so fewer people would feel the need to push forward, but some would still try.
Yeah, there might be something to that. I also have a feeling we’d eventually end up in a similar place as today, just… later.
Even the shift in what people value kind of shows it. Back then, people were proud of knowing frameworks, new tools, new paradigms. Now we’re starting to see people being proud of knowing agents and AI workflows.
It’s like the focus moves, but the underlying dynamic stays the same 🙂
Sylwia, you just casually dropped a philosophical nuke on my Wednesday. If we had LLMs in 2011, we wouldn't just be stuck on PHP...we'd be running a heavily optimized, AI-hallucinated version of jQuery that somehow requires 128GB of RAM just to animate a tag.
You're dead right about the friction. The only reason we invented SPAs, the Virtual DOM, and the 47,000 JavaScript frameworks currently haunting my node_modules folder is because we were suffering. Pure, unadulterated developer trauma. If an AI had been there to pat us on the head and hand us a flawlessly formatted, working index.php file, we would have taken the blue pill, shipped to production via FTP, and gone to the pub.
Honestly? A timeline where I don't have to debug Webpack configs because an AI just wrote the greatest PHP monolith ever conceived sounds like a vacation. Plug me back into the Matrix. I'll see you in the index.php mines! 😁
Hahaha I love this 😄
And oh wow, you just unlocked a memory I almost deleted from my brain… deployment via FTP. Just dragging files and hoping nothing breaks in production 😂
And yeah, on one hand that simpler world does sound like a vacation… but then I think about it and realize we might still not have promises and would be stuck deep in jQuery callback hell...
Suddenly modern tooling doesn’t feel that bad anymore 😄
How would we have gotten enormous amounts of data to train AI in 2011? Sites like Reddit and Twitter shut down easy access to scraping as soon as people figured out that data is super valuable. Browser vendors would have focused on AI instead of improving web standards as much. It would not have been the same.
I totally get that, the world probably wouldn’t look anything like it does today. My example was more of a thought experiment to frame what’s happening now.
But yeah, that’s a fascinating observation too. It really could have gone in a completely different direction!
The counterfactual that actually keeps me up: without 2010s-era JS framework churn as forcing function, we might never have gotten streaming/SSR solved end-to-end. A lot of what makes RAG and agent UIs feel responsive today (incremental token rendering, partial hydration, resumable streams) is the distilled scar tissue from React/Vue/Svelte arguing about hydration for a decade.
An "AI-in-2011" timeline probably skips server-rendered PHP straight to something like an RSC-ish model — but also probably never evolves the streaming primitives we now rely on to make LLM output feel alive. Cleaner stack, worse UX for generative interfaces. Net: I think we'd have a quieter, saner web and a much clunkier AI-native one.
That’s actually a really great perspective — I hadn’t thought about it that way!
I especially like the idea that all that framework churn and “hydration wars” might have indirectly shaped what makes modern AI interfaces feel responsive today. It’s funny how something that felt like pure chaos at the time can end up leaving behind really valuable building blocks.
Makes me appreciate that whole messy phase a bit more 🙂
This is actually uncomfortable to think about, but it makes sense.
Feels like AI doesn’t just help you move faster — it quietly narrows your choices. You stop exploring because the “good enough” answer is already there in 2 seconds.
I’ve caught myself doing this a lot. Earlier I would dig around, try weird approaches, break things. Now it’s more like: prompt → get something working → move on.
Not sure if that’s progress or just efficiency at staying in the same lane.
Curious if people who started coding with AI will even feel this difference.
I’ve actually been wondering what it’s like for people who started coding in the AI era. My gut feeling is that the more ambitious ones eventually hit a point where they go: “okay, stop, I actually want to understand what’s going on”… and they circle back to the fundamentals.
Because at some point, “it works” just isn’t enough anymore.
This is such a fascinating question and honestly, a little uncomfortable to sit with.
I think you're right to ask. The modern web was built on experimentation, failure, and "let's try this and see what breaks". Responsive design wasn't a prompt. React wasn't generated. People struggled, argued, built things that didn't work, and eventually figured out what did.
If AI had been there to optimize everything, would we have ever gotten CSS Grid? Would someone have prompted "make a component system" and stopped at the first working answer? Probably.
The scariest part isn't what AI would have done. It's what we wouldn't have done because AI gave us a good enough answer before we had a chance to be curious.
That said, maybe AI would have accelerated things too. Maybe we'd be 10 years ahead. But we'd also have missed the messy, beautiful process of figuring it out ourselves.
What's one web technology you're glad we struggled to invent without AI?
Thanks for this — made me think. 🙌
I love that framing, especially the “what we wouldn’t have done” part. That’s such a good way to put it.
It’s easy to think about what AI helps us build faster, but much harder to think about what it quietly removes from the process. Curiosity, trial and error, even those “this is probably a bad idea but let’s try anyway” moments.
And that’s where a lot of real innovation used to come from.
As for your question… honestly, things like CSS layout evolution come to mind. Flexbox, then Grid. They didn’t appear because someone asked for the “optimal layout system”. They came from years of people struggling with floats, hacks, and workarounds.
Would we have gotten there faster with AI? Maybe.
But would we have explored as many different paths along the way? I’m not so sure.
The AI isn't stabilizing anything. Devs are. It defaults to React because that's what 90% of prompts ask for. The people building WGSL shaders still build them; they just get less autocomplete. Nothing changed except the floor got higher.
That’s a really fair point, especially about the floor getting higher.
I agree that AI is just a tool and devs make the decisions. But the interesting part is how it reduces friction and plays into our natural tendency to take the easier path.
In the past, discomfort pushed people to experiment. Now a working solution is often one prompt away. AI won’t challenge us unless we want it to. It just gives us what already works.
So it doesn’t stabilize things on its own, but it can make it much easier for us to stay in familiar territory 🙂
Yes, but it would definitely have happened in another way. Maybe with different tools, maybe tools that don't even exist today. But for sure the modern web would be a lot more developed.
If you had to play a bit of a prophet/philosopher here, what do you think the web would actually look like today in that scenario? I’m really curious!
Maybe static web pages wouldn't exist and everything would be powered by AI. Even small businesses and communities would have more interactive web pages. Also, by this time the cost of creating a website might have been suppressed on a large scale. I guess frontend frameworks might lose their popularity, since great user experiences could easily be generated by AI, and specialized roles might only exist in the backend.
Very possible, honestly!
I see AI as an accelerator—it boosts you in whatever direction you’re already heading.
My journey started on public desktops in 2001.
My curiosity drives me to use AI to explore the "future web," which ironically looks like the old way: offline-first, simple, and fast.
If AI existed in 2011, the lane-split would be the same as it is today. Some would use it to innovate, while others would still be struggling with recognizing a paste icon. 😅 For me, AI is just the tool that helps me return to the simplicity of vanilla, resilient architecture the Gnoke way.
Yeah, there’s definitely something in that simplicity.
And funny you mention it, because when I first heard about React Server Components, my immediate thought was… wait, I’ve seen this pattern before somewhere 😄
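Roughly what I mean: an RSC-style component is conceptually a PHP template again. It runs on the server, talks to the data layer directly, and ships markup (`db.query` below is a completely made-up helper, just to show the shape):

```jsx
// Runs only on the server, like a PHP page: query, then render.
async function PostList() {
  const posts = await db.query('SELECT id, title FROM posts'); // hypothetical data access
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
}
```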
Really sharp framing, and I think you're closer to right than you give yourself credit for. But I'd push the argument one step further: the 2011 thought experiment actually understates the effect, because AI wouldn't just have reinforced PHP — it would have made PHP genuinely good enough for longer. Better scaffolding, cleaner patterns, instant boilerplate. The pain that pushed us toward SPAs wasn't purely technical; it was the friction of writing the same form-handling code for the thousandth time. Remove that friction and the motivation to invent Backbone, Angular, React gets a lot weaker.
Exactly, that’s a great point.
There’s also that well-known argument, and honestly a valid one, that if something is already good and proven, why rush into new shiny things.
We’ll see where this goes. In theory, you can train AI on a new language or framework just from the documentation. But the real question is everything around it. The ecosystem, the libraries, the patterns that only emerge over time through real usage.
That’s the part you can’t bootstrap overnight.
Really interesting times 😄
This is one of those posts that will stick with me for weeks. You've articulated something I've felt but couldn't put into words.
The cognitive miser point is especially brutal. We're not just offloading boilerplate to AI, we're offloading architectural decisions. And the scary part? Most of the time it works well enough that we don't question it.
I wonder if we're heading toward a world where 'good enough' becomes the new 'best practice', not because developers are lazy, but because the path of least resistance is now algorithmically reinforced.
Your 2011 thought experiment is terrifyingly plausible. PHP would have lived another decade, maybe more. And someone suggesting React in 2013 would have been laughed at because 'AI doesn't recommend it.'
We're not being replaced. We're being… pacified. And that's almost worse.
Brilliant post, Sylwia. Definitely following for more takes like this 😆😆😅
Exactly, that’s such a good way to put it.
Innovation might be happening faster, but for it to actually matter, someone has to use it. And the average developer, especially under business pressure, will most likely stick to what’s already proven and “good enough”.
So even if new ideas appear, adoption is a whole different story.
And thank you so much for the kind words, I really appreciate it 🙂
Great read! I saw a post from ChatGPT comparing the introduction of AI to a productivity revolution. I thought it was pretty accurate. I would surmise that if AI had existed earlier, the ripple effect would have transformed the modern web we see today as well.
Thanks, Evan! 😄 Now I’m really wondering what that would actually look like today.
The WebGPU example really clicked for me. Hit that same wall myself. Everything works until you leave familiar ground, then it just starts inventing things with total confidence. You end up spending more time cleaning up what it gave you than if you'd opened the docs from the start.
The cognitive miser part stuck with me most. I do this on boring Tuesdays, nothing urgent. Something runs, I ship it, and whether it was actually the right choice just never comes up. Hard habit to break once you see it.
On 2011: faster execution, slower imagination. Cleaner PHP apps, not React. The people who actually built SPAs were mostly just annoyed enough to try something nobody asked for. AI takes that annoyance away before it becomes anything.
Congrats on JSNation. That kind of topic only comes up after the damage is done.
Exactly! 😄
A friend of mine works on game engines and says the same thing: AI just isn’t that helpful there, because that knowledge simply isn’t widely available online.
With WebGPU it wasn’t completely useless either, but it definitely needed a lot of hand-holding and corrections along the way.
Great question 🤔 .....
AI in 2011 could have been a great tool to improve PHP. In some ways, I can imagine PHP evolving toward something similar to SPAs. The churn of resolving paradigm shifts with a new framework every six months would probably have moved slower than what we experienced.
The AI impact could have been smaller due to the limited tech stack we had at that time. Awesome, yes, but not as big as it is right now.
I truly love the entire journey I've had up to today to learn and become a frontend developer. It probably wouldn't have been as fun and challenging if I had had an AI tool from the very beginning. I would surely have taken the easy path and wouldn't have learned as much.
Oh yeah, that’s a really good point. AI back then wouldn’t have had nearly as much “fuel” to learn from.
And exactly, today we definitely code faster with AI. No doubt about that.
But at the same time, if you don’t understand the fundamentals, you’ll hit a wall sooner or later anyway. At some point, “it works” stops being enough, and that’s usually where things get… interesting 😄
This is such a fascinating thought experiment! 🤔 As someone working with ML systems daily, I've definitely noticed this pattern. AI tools are incredible at optimizing what already exists, but they tend to push us toward established solutions rather than truly innovative ones.
I love the "cognitive miser" concept - it perfectly captures what I see in data engineering. We reach for AI-generated solutions because they're efficient, but then we end up building the same architectures over and over. It's like we're training ourselves to stay within the boundaries of what's already been done. 🔄
What really resonates with me is how this affects cutting-edge work. When I'm implementing something completely new, AI tools often feel like they're actively working against the innovation, suggesting safer, more conventional paths instead. 🛤️
Really looking forward to your next piece in this series! ✨️
I don't know, but I do believe I would be more than a web dev if it had been released in 2011.
Haha, we’ll see in 2031… or maybe even 2041! 😄
Guys, so many comments and not a single “this was written by ChatGPT”? I’m disappointed! 😂
Right! A lot of people do exactly that nowadays 😁
Really interesting perspective. The idea that AI reinforces existing patterns more than it drives new ones feels very real—especially when you see how quickly it defaults to “safe” stacks.
Thanks, glad it resonates 🙂
Loved it
Thanks a million 😊💖
AI did exist in 2011, just not like what we've seen in the past 3 years. I think it would have been great for the modern web. Developers wouldn't be nagging about vibe coders all day.
Yeah, exactly — the early experimental stuff goes way back, even to the 70s 😄
No