🦄 Making great presentations more accessible.
This project enhances multilingual accessibility and discoverability while preserving the original content. Detailed transcriptions and keyframes capture the nuances and technical insights that convey the full value of each session.
Note: A comprehensive list of re:Invent 2025 transcribed articles is available in this Spreadsheet!
Overview
📖 AWS re:Invent 2025 - Scaling Global Energy Transformation: Kraken's AI-Powered Journey on AWS (IND3324)
In this video, Tyler from Kraken presents their AI philosophy for the energy and utilities sector. Kraken operates as a cloud-native operating system for energy utilities, built on AWS, encompassing customer care, billing, flexibility management, generation optimization, and field workforce tools. The platform features a unified data model that consolidates traditionally siloed systems, enabling 200 deployments daily. Their utility-grade AI includes Magic Ink for automated email responses, Customer Intent Dashboard for sentiment analysis, and AI-driven field scheduling that opens 150,000 additional site visits annually. Kraken prioritizes security through data redaction before feeding information to LLMs and offers clients multiple LLM options. The roadmap includes Agent Studio, enabling agentic AI where energy specialists manage autonomous AI bot teams. Proven at scale with Octopus Energy's 8 million customers, their AI tools achieve 89% acceptance rates and improve both customer satisfaction and employee retention.
; This article is entirely auto-generated while preserving the original presentation content as much as possible. Please note that there may be typos or inaccuracies.
Main Part
Introducing Kraken: An Operating System for the Energy Utility Industry
Happy Wednesday everyone. As you mentioned, my name is Tyler, and I'm from Kraken. Thank you for joining. I think the greatest gift you can give is your time, so I very much appreciate that you've stepped aside from all the craziness happening to hear a little bit about Kraken and our AI journey. We are in Las Vegas, and if you've hit the poker tables, you may be familiar with the concept of the ante. In this world, in this room, talking about AI is the ante. It is the table stakes. So I'm here to throw in our ante and talk to you a little bit about AI, zooming in specifically on the vertical of energy and utilities.
Over the next 20 minutes, I want to introduce you to Kraken, familiarize you with us, and then take you on the AI journey we've been on, and specifically introduce you to our AI philosophy that is unique and specific to the energy and utility space.
First and foremost, you may not be familiar with Kraken, so let me start with our why. Kraken is a company focused on transforming the energy system. Specifically, we think of ourselves as an operating system for energy utilities globally. Similar to what AWS has done in providing a platform for everyone to innovate, Kraken is trying to do the same for an industry that, to be honest, has been a bit stale for a number of decades: the energy and utility industry. Utilities have not had modern technology to move forward how our energy system operates. That's what Kraken is seeking to do.
How are we doing that? There are really sort of four fundamental aspects of what Kraken is. First and foremost, we are a customer care and billing engine. If you're familiar with the energy space, you may be familiar with acronyms like CIS, Customer Information Systems. But what's unique about Kraken Customer is that it unifies a lot of pieces of technology that historically have been disparate. A CIS, a CRM, a meter data management system, all those have not existed as a singular thing. That's where Kraken starts.
As we move around the circle clockwise, we also have a flexibility arm. So the control of electric energy assets, whether it's a thermostat in our home, a battery or an EV, these can now be used to balance the grid. That's what the residential flexibility is doing. We're also helping on the generation side. So power plants, big grid scale batteries, Kraken is optimizing those to deliver value. And then lastly, we have a field workforce tool, so the rolling of trucks to actually do work in the field. And when you look at this, this is the platform that if you're an energy company today, you need all these pieces working together. And that hasn't existed before.
We're built on AWS. So, digging a little bit into the core of our technology: what's unique about Kraken is its breadth. There have not been technology solutions that cover those four areas before. Another key aspect, which is really important as we talk about AI, is the fact that Kraken has a singular data model. Historically in our space of energy, you've had different pieces of data siloed, whether in your customer relationship management system or separately in how you're doing meter-to-cash. What's unique about us is that we've built a data model specific to energy, and that's fundamental to the AI I'm going to talk about.
We're cloud native. We grew up in AWS. We've been there forever. We have a layered architecture. I'll introduce you to that in a second. What's also unique about our space, everyone in this room is probably familiar with the acronym CI/CD. It is not an acronym that is familiar in the utility space. Continuous integration, continuous deployment. The fact that Kraken changes 200 times a day, that's really, really scary to the people that Kraken's working for, because these are utilities that are used to change maybe once a year. And so it's a fundamental shift for them. And we're doing a bunch of AI which I'm going to talk about.
We're built on AWS. AWS has been sort of a foundational partner because it is a brand that our utility space knows. Also, the fact that they have a geographical presence across the globe gives a lot of our clients comfort when it comes to sort of the resiliency that these systems need to have.
I was told it's table stakes to have an architecture slide, so I'm going to have a few. In terms of Kraken's architecture, we are really a layered cake. The core of Kraken is a global code base; 80% of the innovation we're trying to do spans the globe. Our CTO has a motto: core first. But we recognize that because Kraken operates globally, we're going to need bespoke elements. So our code base has a territory layer that's specific to, say, the UK versus the US versus Australia. And then we also have a utility and client layer, which allows us to push code across this layered architecture.
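To make that layering concrete, here is a minimal, purely illustrative sketch of the pattern: a global core implementation that a territory or client layer can override. None of these names come from Kraken's actual code base; Python is used only as an example language.

```python
# Illustrative sketch only: a "core first" layering pattern where
# territory- or client-specific behavior overrides a global default.

from typing import Callable, Dict

# Global core behavior (the ~80% shared everywhere)
def core_format_bill_period(start: str, end: str) -> str:
    return f"{start} to {end}"

# Territory/client overrides registered on top of the core
_overrides: Dict[str, Callable[[str, str], str]] = {}

def register_override(territory: str, fn: Callable[[str, str], str]) -> None:
    _overrides[territory] = fn

def format_bill_period(territory: str, start: str, end: str) -> str:
    # Fall back to the core implementation unless a territory layer overrides it
    return _overrides.get(territory, core_format_bill_period)(start, end)

# A hypothetical UK-specific tweak living in the "territory layer"
register_override("UK", lambda start, end: f"{start} to {end} (inclusive)")

if __name__ == "__main__":
    print(format_bill_period("US", "2025-01-01", "2025-01-31"))
    print(format_bill_period("UK", "2025-01-01", "2025-01-31"))
```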
Another fundamental aspect, before we talk about AI, is the philosophy that has to root the product we are building. In our space, if you ask an energy utility what is behind the meter, they will say the customer. What Kraken is saying is fundamentally different: to the users of energy, what is behind the meter is actually the utility. That shift in mindset is fundamental to Kraken's philosophy. It's about customer centricity, and this has been our ethos from the beginning. We feel that in order to serve the energy system of today, you need to put the customer first. I provide all that context because I want it to be rooted in the AI story that I'm going to introduce you to.
Kraken's Utility Grade AI: A Unified Data Model Built for Energy at Scale
So, Kraken's utility-grade AI. What is unique about what we're doing with AI comes down to three fundamental aspects.
First, it is the unified data model I mentioned earlier. AI is so much richer when you can feed it rich data sets. In our space, that has been hard to do because data has been siloed and has not been built around a data model that is intelligent and intuitive for people to use. What Kraken has done is centralize all of that data in a way that can be fed into AI. I'll talk a little bit more about that.
The second is the fact that what Kraken is doing is bespoke, built for the vertical we operate in: energy and utilities. That is really important because a lot of the legacy technology in this space was not focused narrowly on utilities and their use cases, which is specifically what Kraken is trying to do. And lastly, what is unique is that Kraken is doing this in our space at scale, which is not something a lot of our legacy solutions are familiar with.
The way we're doing that ties into a little bit of Kraken's origin story: we've grown up in parallel to an energy company. You may have heard of Octopus Energy, the largest energy retailer in the United Kingdom. They now serve about 8 million customers, and they're using Kraken as their technology. What's unique about Kraken is that we, as a technology company, have actually grown up inside of one of our clients. That allows us to test and prove our AI at scale.
So let's talk a little bit about the data model. This is, in a very simplified way, the system architecture of most of the utility clients we work with. These are their silos, their key technology pieces, but they're all disparate things. What that leads to is a spaghetti soup that makes innovation, change, and access to data nearly impossible. And if you were to overlay AI solutions on top of this, it would get even messier, because the source data is a mess.
Enter Kraken's unified, clean data model. All of this exists inside a singular instance of Kraken, which empowers data teams inside our clients to access and generate data and make decisions a lot faster. It also allows us to feed AI tools in a way that creates richer and better customer experiences. I'll introduce you to a few of those stories in a second.
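As a purely hypothetical illustration of what a unified model buys you (this is not Kraken's actual schema), imagine billing, metering, and CRM-style data hanging off a single account record instead of living in three separate systems:

```python
# Toy illustration of a unified energy data model: data that traditionally
# sits in a CIS, an MDM system, and a CRM all keyed to one account record.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class MeterReading:          # traditionally a meter data management system
    read_at: datetime
    kwh: float

@dataclass
class Interaction:           # traditionally a CRM
    occurred_at: datetime
    channel: str             # "email", "call", ...
    summary: str

@dataclass
class Account:               # traditionally a CIS / billing engine
    account_id: str
    tariff: str
    readings: List[MeterReading] = field(default_factory=list)
    interactions: List[Interaction] = field(default_factory=list)

    def usage_since(self, since: datetime) -> float:
        """One query answers a question that used to span three systems."""
        return sum(r.kwh for r in self.readings if r.read_at >= since)
```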
So what does our AI infrastructure fundamentally look like? At its core, we have a platform that spans the data across the multiple verticals we support. On top of that, we've built our own fundamental AI infrastructure layer. Sitting on top of that is an access layer to both products we have built and this concept of Open Kraken. Part of what we're trying to do with this operating system for energy is to give our clients the opportunity to bring their own technology and tap into this ecosystem, because we don't want it to be simply "you're locked into Kraken"; we want to build and enable innovation. I'm going to introduce you to both Agent Assist and Agent Studio in a second.
Another fundamental aspect of what we're trying to do is that AI itself is not a product. AI is embedded throughout what Kraken is doing. I introduced you a second ago to the four key areas where Kraken works. This is a view into some of the specific AI capabilities that span the Kraken platform. On the customer side, we have a suite called Agent Assist. Magic Ink, because we are rooted in nautical sea beasts and squids have ink, is a tool used to generate suggested email responses to a customer. I'm going to introduce you to that a little bit more.
The Customer Intent Dashboard measures all the calls and communications coming in from customers. When you serve 8 million customers, or millions of customers, understanding the sentiment across all their communications is really important. Storyline gives a singular view of everything happening with a customer. On the flexibility side, we're using AI to support grid relief and optimize devices in a very smart way. On the generation side, it's similar: how do you scale and optimize a wind turbine? AI can support that. And then in field, I'm going to walk you through a bit of this as well, but the point is that AI runs across what Kraken is trying to do, because that's where the value is.
AI in Action: From Field Optimization to Magic Ink's Customer Service Revolution
So, a few specific use cases; let's zoom into some of these. On the field side, we are using AI to intelligently schedule a field workforce, the rolling of trucks. When you have thousands of field engineers, optimizing that schedule is a complex problem, especially when someone calls out sick. So we're using AI to optimize schedules, and what we found over the course of a year, for a workforce of that size, is that we can open up 150,000 site visits through schedule optimization alone, which also has a significant carbon impact just from the trucks we roll.
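To give a feel for the shape of that problem (and nothing more), here is a deliberately tiny, hypothetical sketch of reassigning jobs when an engineer calls out sick; Kraken's real optimizer is far more sophisticated than this greedy rule.

```python
# Toy illustration of field-schedule repair: hand each orphaned job to the
# colleague who can reach it cheapest. Purely illustrative, not Kraken's logic.

from typing import Dict, List, Tuple

def reassign_jobs(
    sick_engineer: str,
    schedule: Dict[str, List[str]],           # engineer -> list of job ids
    travel_cost: Dict[Tuple[str, str], int],  # (engineer, job) -> minutes of travel
) -> Dict[str, List[str]]:
    orphaned = schedule.pop(sick_engineer, [])
    if not schedule:
        raise ValueError("no remaining engineers to absorb the work")
    for job in orphaned:
        # Greedy choice: give the job to whoever can reach it with least travel
        best = min(schedule, key=lambda e: travel_cost.get((e, job), 10**6))
        schedule[best].append(job)
    return schedule
```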
On the residential flexibility side, I talked about how Kraken is also optimizing a bunch of devices. What you see here is a combination of a user experience and, behind the scenes, the intelligent optimization of electric vehicles and batteries to deliver grid relief.
What I've mentioned before is that you have to put the customer at the center of what you're doing. And when you do that, you can yield programs that have an extremely low churn. The point is, create value for the customer and they're going to stick around, and AI is helping us with that.
So let's zoom a little further into the tech side of this. A key aspect of the space we operate in, energy utilities, is risk avoidance. Our clients are organizations that have to deliver power safely, reliably, and affordably, so security really matters to them. Within our own infrastructure, each of our clients has single tenancy and their own instance of Kraken, hosted within an AWS region of their choosing.
Another key aspect of what Kraken has built internally is a data redaction tool: before we feed information into an LLM, we screen the data using our own extraction to make sure personally identifiable information and sensitive data stay within our own secure infrastructure as we then feed into other LLMs. We're also providing our customers optionality around which LLM they prefer to use.
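As a rough illustration of that redact-then-restore pattern (this is not Kraken's actual tool), the idea is to replace sensitive values with placeholders before the text leaves your infrastructure, keep the mapping locally, and swap the real values back in afterwards:

```python
# Minimal sketch of redacting PII before an external LLM call and restoring it
# afterwards. Only email addresses are handled here, purely for illustration.

import re
from typing import Dict, Tuple

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def redact(text: str) -> Tuple[str, Dict[str, str]]:
    """Swap each email address for a placeholder and remember the mapping."""
    mapping: Dict[str, str] = {}
    def _sub(match: re.Match) -> str:
        token = f"<PII_{len(mapping)}>"
        mapping[token] = match.group(0)
        return token
    return EMAIL_RE.sub(_sub, text), mapping

def restore(text: str, mapping: Dict[str, str]) -> str:
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

if __name__ == "__main__":
    prompt, pii = redact("Customer jane.doe@example.com asked why her bill rose.")
    # `prompt` (placeholders only) is what would be sent to the external LLM;
    # the line below is just a stand-in for that call.
    llm_answer = f"Reply to {list(pii)[0]} explaining the tariff change."
    print(restore(llm_answer, pii))
```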
So let me walk you through, at a deeper level, just one more example of how this works. I introduced you a second ago to Magic Ink. The problem it is trying to solve: if you are an energy specialist answering a phone call or an email from a customer asking why their bill is high and what they can do to reduce the cost of their energy, and you didn't have technology or AI, that would be a fairly hard problem to solve. You'd have to go find multiple types of data to get context on that customer, and you'd have to get information about what types of programs could support them.
So this is a prime use case of how AI can empower a human, in this instance, a customer service representative, to better serve their end customer. So how are we doing that? This starts with a query, basically a human that says, you know, I need help answering this question. That then feeds into our prompt engineering, and that's primarily what we've built in-house, which is how can we feed an LLM an intelligent prompt to then generate an intelligent answer to send back to a customer.
So how are we doing that? We feed the question into our own engine. On the top here, we have retrieval-augmented generation, where we're pulling in a bunch of different information from the client: it could be their knowledge base, it could be information about that specific customer. Then we marry that up, on the bottom here, with customer data. What is your energy usage? Where do you live? What other information can we use to feed, in this instance, into OpenAI?
We've extracted that personally identifiable information, so we're feeding OpenAI an intelligent prompt, and the response is fed back to us. We transpose the personally identifiable information back in, quality check the result against key data elements in our database, and generate a suggested answer for the customer. This is a workflow that probably would have taken 10 minutes; now it's automated through AI in a matter of seconds.
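A hedged sketch of that overall workflow, with every function a stand-in rather than Kraken's API: retrieve context, assemble a prompt from the customer's data, call the LLM, fill details back in, and sanity-check the draft before suggesting it to the agent.

```python
# Sketch of a Magic Ink-style suggested-reply pipeline. All functions here are
# hypothetical stubs that only illustrate the flow described in the talk.

from typing import Dict, List

def retrieve_kb_snippets(question: str) -> List[str]:
    # Stand-in for retrieval-augmented generation over the client's knowledge base
    return ["Time-of-use tariffs shift usage to cheaper off-peak hours."]

def call_llm(prompt: str) -> str:
    # Stand-in for the external LLM call (OpenAI or another provider)
    return "Your usage rose 20% in <MONTH>; a time-of-use tariff could lower costs."

def quality_check(draft: str, facts: Dict[str, str]) -> bool:
    # Simple guard: every fact we quote must appear in our own records
    return all(value in draft for value in facts.values())

def suggest_reply(question: str, customer: Dict[str, str]) -> str:
    context = "\n".join(retrieve_kb_snippets(question))
    prompt = (
        f"Customer question: {question}\n"
        f"Context: {context}\n"
        f"Usage data: {customer['usage_summary']}\n"
        "Draft a helpful, accurate reply."
    )
    draft = call_llm(prompt).replace("<MONTH>", customer["month"])
    facts = {"month": customer["month"]}
    return draft if quality_check(draft, facts) else "ESCALATE_TO_HUMAN"

print(suggest_reply("Why is my bill high?",
                    {"usage_summary": "620 kWh, up 20%", "month": "November"}))
```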
Delivering Value at Scale: Results and the Future of Agent Studio
This only really matters if you're delivering value at scale. So here are a few data points from a number of our customers on the value that AI is generating. I've talked about productivity, but a key element here is that these tools, when done best, both deliver customer value and make the employees serving those customers better. That is significant for customer call centers, which see heavy turnover when their employees are not satisfied.
What we find is that the agents using these AI tools, Magic Ink for example, love it; it makes their job easier and faster. The folks getting the responses, the end customers, like the answers more than the human-generated ones. It's creating a significant amount of efficiency across the board, and you can see that 89% of the proposed responses are ultimately accepted.
So lastly, where is Kraken going for our customers? Our roadmap in this space is what we call Agent Studio. The idea is to use agentic AI and everything we've learned to create autonomous agents. Today, energy specialists use our tools to answer questions; with Agent Studio, that energy specialist can manage their own team of AI bots, with access to dashboards, operating as a little mini team that oversees and supervises the AI as it answers questions. That is what Kraken is focused on for the next year, and again, all built on top of AWS.
That's all I've got. Thank you. I really appreciate the time and the space, and again, the greatest gift is your time, so thank you very much. Enjoy your time in Vegas.
; This article is entirely auto-generated using Amazon Bedrock.