DEV Community

bfuller

LLMs, DevOps, and Big Data Musings

I have had the privilege of talking with several folks over the past few weeks about the state of the world. Well, the tech world, to be precise. I was inspired by my chat this morning with the wonderful Beth Glenfield. We talked about product, trends, AI, and potential future pain points.

One of the things we both love about being product managers is that we get to watch trends: the ebb and flow of ideas, how they are implemented, the gaps, and the impact of those gaps.

Remember the beginning of the DevOps movement, before the cloud was pervasive, when sysadmins didn't see a reason to automate? They didn't because while automation allowed for efficiency, it didn't necessarily reduce complexity. So the first things they automated were the boring, annoying processes and tasks. Automation adds layers, and layers add complexity. Then the cloud came along and forced the need for different layers and more complexity. The cloud is faster, and faster is better, right? For many, it felt like you HAD to move to the cloud to stay competitive. That created a movement, and a successful one.

Now we are on the precipice of another movement. LLMs and ML are that movement. You can see it. I can almost hear the earnest manager I spoke to years ago. He asked, “Do we have the DevOps now?” I’m excited to hear what the LLM version of this is going to be.

But here’s a lesson from the DevOps movement for the next wave: be mindful of the bottlenecks you are creating. If you only make one team 10x more efficient but not the others, you haven’t really gained anything. This is where internal discoverability, guardrails, reliability, tools that let teams reduce toil from code to security, and tools that help your TPMs become critical. These products are out there right now. Some have LLM features; others matter because you frankly have a gap today, and that gap is only going to widen as your use of LLMs becomes more pervasive.
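The bottleneck point is essentially the theory of constraints: end-to-end throughput is capped by the slowest stage, so a 10x speedup anywhere else barely registers. A minimal sketch of that arithmetic (the stage names and weekly rates here are hypothetical, not from the post):

```python
# Hypothetical delivery pipeline: how many work items each stage
# can handle per week. These numbers are made up for illustration.
stages = {"dev": 10, "review": 8, "qa": 6, "release": 7}

def throughput(pipeline):
    # A serial pipeline moves no faster than its slowest stage.
    return min(pipeline.values())

before = throughput(stages)  # qa, at 6/week, is the bottleneck
stages["dev"] *= 10          # make every dev "10x"
after = throughput(stages)   # still capped by qa at 6/week
print(before, after)
```

Making `dev` ten times faster leaves `before == after`; the gain only shows up once the other stages (the CS, QA, TPM, and ops work the post lists below) scale too.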

Further, AI tooling will need to move beyond “I can code a thing” or “I can create art” and become that dedicated data store or focus. To be effective, LLMs need clean, unbiased, and relevant data to provide answers that make sense. Our grand AI experiment will need to incorporate lessons from the past: from the big data camp, or even the battle between Betamax and VHS.

While Devin might be able to code, LLMs are always lagging; humans are the innovators. Moreover, if you don’t have juniors who learn how to code, debug, and write tests, you don’t get mid-level devs who start to architect and build more complex features, or seniors and principals who do some gnarly problem-solving and are systems thinkers. It serves no one to replace a person with “AI”. I’m pretty sure that’s the start of a dystopian story.

Why do I mention this? I mention it because I don’t believe “AI” is going to replace us as people. I do think, similar to DevOps, it is going to change how we work and the kind of work we do. That’s not a bad thing. It means systems thinking, internal discoverability, reliability, operational excellence, and context are going to rule the next wave.

We need those systems thinkers more than ever. What if “AI” turned every dev into a 10x dev? Do we have 10x CS, PS, TPMs, RelEng, Ops, QA, AppSec, Product, Marketing, Sales, Product Marketing, tech writers, and pre- and post-sales engineers? Those systems thinkers will help us avoid the impending bottleneck(s).

Building a product is a team effort. I think we’ve gotten over the factory analogy, right? Products from idea to support to sunset are complex. It is Jeremy Bearimy in the best way possible.


LLMs are here to stay. What’s going to be interesting is seeing what we’ve learned from our past mistakes. What I do know is that the leaders I’ve met who are building the next wave of tools understand the space, how we got here, and where we need to go. In short, I’m excited for the future.
