Kazuya

AWS re:Invent 2025 - Vibe modernize your .NET applications using AWS Transform and Kiro (MAM343)

🦄 Making great presentations more accessible.
This project aims to enhance multilingual accessibility and discoverability while maintaining the integrity of original content. Detailed transcriptions and keyframes preserve the nuances and technical insights that make each session compelling.

Overview

📖 AWS re:Invent 2025 - Vibe modernize your .NET applications using AWS Transform and Kiro (MAM343)

In this video, Alex and Prasad demonstrate modernizing a legacy .NET Framework application called Contoso University through seven steps using AWS Transform for .NET and Kiro. They port the .NET Framework 4.8 application to .NET 8, fix compilation errors, replace SQL Server with PostgreSQL, migrate MSMQ to Amazon SQS, extract a notification microservice from the monolith, refactor Razor Views to a React frontend, and deploy everything on AWS using CDK. The session showcases both vibe coding and spec-driven development approaches in Kiro, with spec-driven development creating requirements, design documents, and implementation tasks automatically. The final architecture runs on Amazon EC2 Linux instances with Aurora PostgreSQL, CloudFront, and SQS, transforming a Windows-dependent monolith into a cloud-native microservices application.


This article is entirely auto-generated while preserving the original presentation content as much as possible. Please note that there may be typos or inaccuracies.

Main Part

Introduction: Modernizing Legacy .NET Framework Applications with Agentic AI

Good afternoon everyone. Welcome to this code talk. I assume that you would rather see code than PowerPoint slides, so that is the goal of this session. I also assume that most of you have a .NET developer background and are dealing with some legacy .NET Framework applications that you need to modernize. Welcome everyone. I'm Alex, and I'm based in Stockholm, Sweden. I'm Prasad, and I'm based in London, UK. In this session, we will show you how you can modernize your legacy .NET Framework applications using agentic AI tools like AWS Transform for .NET and Kiro. We'll do a lot of live coding because this is a code talk.

Before we get started, I'd like to ask: how many of you are still managing some kind of legacy .NET Framework applications? It looks like almost everyone in the room. Perfect, so you're in the right session. Let's get started.

When we talk about legacy .NET applications, there are three components that we need to modernize. The first is the framework and the language in which it is written. .NET Framework, as you know, can run only on Windows and is proprietary. We would like to modernize it to the latest version of .NET, such as .NET 8 or .NET 10, both of which are long-term support versions. That is the first part of modernization: code modernization.

The second is architecture modernization. This means that your application might be a monolith, and when you are modernizing, you would like to use best practices and architectural patterns. You will be using microservices and event-driven architectures. That is architecture modernization. Finally, there is infrastructure modernization. Your .NET Framework applications can run on Windows, but you would like to run them on Linux. If it is an ASP.NET application, it will probably be running on IIS. Instead of that, you would like to run it on serverless containers or EC2 Linux.

In this particular session, we will look into modernizing all three aspects of the legacy application. To do that, we have taken an open source application called Contoso University. It is an ASP.NET application with standard features: a university management application with features like student enrollment, course management, and instructor assignment. As you can see, it is built quite heavily on Windows components. For queues, it uses MSMQ; for the database, SQL Server. The presentation layer uses Razor views with MVC controllers. It has a service layer with a notification service, which sends out notifications via MSMQ. In the data layer, it uses Entity Framework to connect to and communicate with SQL Server. That is the architecture we are going to modernize.

Understanding the Contoso University Application and the Seven-Step Modernization Plan

Let's look at the code and how the current codebase looks. This is quite a typical .NET Framework application which contains components that you probably see in your applications from the last twenty years. Let me switch to the code.

I have this application in my GitHub repository. As you can see, it is a .NET Framework 4.8 application. It has a huge project file, and all the configuration is in web.config. We have a bunch of controllers, a database with an initializer and a database context, and some views. As I said, this is a university management application, so we have views to manage courses, departments, instructors, and students. This is not a small hello world application, but it is not an enterprise application either, which makes it a very good application for demo purposes.

What are we going to do with this application? Let's switch to the presentation again and see what we are going to do. We have seen the codebase and architecture of the legacy application. Let's talk about the tooling that we will be using to modernize this. First, we will be using AWS Transform for .NET, through which we will be porting your .NET Framework application to the latest version of .NET. In this session, we will be doing .NET 8. Then we will be using Kiro. We will take the ported code from AWS Transform for .NET and then use Kiro to fix any compile-time errors, build it locally, and then modernize the codebase. We will look into each of the steps and what we are going to do when I say modernize.

Step number one is automatically porting your codebase to the latest version of .NET using AWS Transform. Step two is fixing any compile-time errors and building locally. That's when we'll start refactoring: we'll replace SQL Server with PostgreSQL, then replace MSMQ with Amazon SQS. What we are trying to do is remove all the Windows and legacy components to make it a more cloud-native application. Then we're going to extract microservices from this monolith, and then refactor the UI. Currently, as you have seen, the UI is based on MVC controllers and Razor views; we are going to move it to a React app. Finally, we are going to deploy it on AWS by creating a CDK project. In the next 45 minutes or so, we're going to go through all of these steps and why you should modernize. With that, let's start with step one, updating your .NET Framework application to .NET 8.

Step 1: Automatic Code Porting with AWS Transform for .NET

Let's switch to the code. As Prasad said, the first service we're going to use is AWS Transform, and yesterday we completely changed the UI of AWS Transform by adding new features. This is the new UI of AWS Transform. I already have a workspace created. If I go to this workspace, I already have a connector configured to my GitHub repository. AWS Transform can access the code in my GitHub repository. This is the sample Contoso University project. The connector is active, and it's a connector for GitHub. What I'm going to do is create a new job by saying create job.

How many of you here have used AWS Transform? A few of you. Have you heard about this service? It went GA earlier this year. A few of you. Cool. Now it asks me what kind of transformation job I want to do because AWS Transform can do multiple things. It can transform your mainframe applications and VMware-based applications, but in this case, we're going to do .NET modernization. Now it's called Windows modernization because yesterday we launched a new feature, so it can not only modernize your .NET Framework applications, it can also modernize your SQL Server database. I'm saying I want to do Windows modernization, and in this case, I'm going to do .NET modernization.

This is the job it's going to create for me. As you can see, it's a more agentic AI experience with a modern UI to go through. This is not a fully automated experience, so it involves me as a developer in the loop. As you see, it's saying I'm awaiting user input, and what it wants me to do is say where I get the source code to be transformed from. We have two types of connectors supported. One is through code connections, so if your source code is in GitHub or Azure DevOps, you can do it through code connections. Or if your source code is in your S3 bucket, maybe you're not storing it in GitHub or elsewhere, you can connect to your S3 bucket. In my case, it's in GitHub, so I'm using code connections to connect to my repository.

Now it will start discovery, so it will connect to my source code repository and try to see what kind of .NET Framework projects or older versions of .NET Core projects I have in my repository. Again, it's awaiting user input. It discovered that in my GitHub repo, I have two projects: Gadgets Online and Contoso University. The beauty of AWS Transform is that it can transform multiple projects at the same time. I don't need to go project by project. I can select multiple projects and transform them in parallel. But in this case, I want to do a transformation of only one project, so I'm selecting Contoso University. Then I'm saying start assessment.

In the assessment phase, it will look at my project to understand what kind of .NET versions it's using and what kind of NuGet packages it's using. It will look at any dependencies to third-party NuGet packages or dependencies to other repositories, so it can automatically discover what I have. I have two projects in my GitHub repository, and actually this project is using a NuGet package which is produced by another project, so it will automatically map these dependencies.

I will select the project, submit the job, and then run the job. For this project, the job took approximately half an hour to run. Since we don't have time to do a live run of this job, I'll go back to my workspace and show you the results of the previous run. I chose the connector type, discovered the repository, and performed the transformation. Here are the transformation results. Because the tool can perform transformation of multiple projects at the same time, the outcome of transformation is a new branch in each project. In my case, it created a branch which I named "1-transform" because it's the first step of our seven-step process. It transformed approximately 3,000 lines of code in one repository and one project. I can go deeper and see all the transformation details, but this is our starting point. We performed the transformation using AWS Transform, and the transformed code is in this transformed branch.

Step 2: Fixing Build Errors and Compiling Locally Using Kiro's Vibe Coding

What's next? What Alex showed here is AWS Transform for .NET. For those who are not aware of this service, it's an agentic AI experience for modernizing your .NET applications. There are other capabilities as well, such as transforming VMware and mainframe applications, but specifically for .NET, what Alex has shown is large-scale porting. By large-scale porting, we mean that you can select more than one application to be modernized in one go. We also have an IDE extension, a Visual Studio extension for AWS Transform for .NET, for developers who want to work in a developer-friendly environment; they can use the Visual Studio extension and have the same experience.

Regardless of which pathway you choose, whether the web console that Alex demonstrated or the IDE experience, the tool does three things. First, it analyzes your codebase, which is the assessment part. Then it performs the transformation, rewriting your code to make it compatible with the latest version. Finally, it validates the results. The goal is to come up with a codebase that is Linux-ready and compatible. Now that we've done this, we will take the ported code and use Kiro to fix any compile-time errors and build it locally.

Let's look at step two now. I'm switching to Kiro. This is the branch that was created by AWS Transform. If I go to my "1-transform" branch, you can see that it was committed by AWS Transform with 55 files changed; there's a lot of code changed. If I go to the source code, you see that it's now a .NET 8 project. The project file is much smaller because it's .NET 8, and the target framework is net8.0. My configuration has moved to application settings, all the static files have moved to wwwroot, and I have Program.cs and Startup.cs. AWS Transform did all the undifferentiated heavy lifting of porting my .NET Framework application to .NET 8.
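For readers who haven't seen the difference, here is a minimal sketch of the SDK-style project file that a port like this typically produces; the file, package names, and versions are illustrative, not the exact output of AWS Transform:

```xml
<!-- ContosoUniversity.csproj after porting: the SDK-style format replaces
     the hundreds of lines of MSBuild boilerplate in the old csproj. -->
<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>
  <ItemGroup>
    <!-- packages.config entries become PackageReference items. -->
    <PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="8.0.0" />
  </ItemGroup>
</Project>
```

Source files no longer need to be listed one by one; everything under the project folder is compiled by default, and configuration moves from web.config to appsettings.json.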

However, there are still a few things left. If I go to this project and run a .NET build, there are a few build errors. In many cases, the transformed output will be good and compilable. In some cases, like this one, there are still some things missing. Here it's a mismatch with some NuGet package versions. As a developer, I would probably be able to fix this in a few minutes to half an hour, but let's see if we can fix it automatically using Kiro. I'll clear that, and then I'll chat with Kiro. I'm saying, "I got build errors. Please fix it." I'm trying to be polite with Kiro, so it works.

The first thing it's going to do is build the project itself so it can hook to the console output and see what the build errors actually are. It got the output and says, "I see the issue. There is a package conflict." So it reads the project file, updates some versions, and I can see all the changes that were done. It says, "OK, I want to build it again." So it builds it. There are two more errors. Some code is still using the MVC namespace which doesn't exist in .NET 8. It did a few more changes and tries to build it again. These are the steps which I would do myself as a developer. I'm offloading this debugging experience to Kiro, and I'm just following along to see if it makes sense from my developer perspective. Then it found there is also an issue in the Error.cshtml Razor view. It's using System.Web.Mvc which is not available in ASP.NET Core. It fixed it. There's one more error. It says there is a post-build copy task which it fixed. It keeps patting itself on the back. Great progress. The .NET build is complete. No more errors, but there was a warning. Kiro says, "There is a warning. Let me fix this warning as well," so it's trying to make it as good as possible. We run it. It's fixed. Build succeeded, zero warnings, zero errors. How long did it take for me and Kiro to fix these build issues? One or two minutes, probably. Now our project is compiling locally and it's a .NET 8 project already. I'm running it on Mac. It's not .NET Framework, it's .NET 8. What's next?
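As an illustration of the kind of namespace fix involved (a hypothetical controller, not the session's actual diff), the System.Web.Mvc types map to Microsoft.AspNetCore.Mvc equivalents:

```csharp
// Before (.NET Framework):
// using System.Web.Mvc;
// public class HomeController : Controller { public ActionResult Index() { ... } }

// After (.NET 8 / ASP.NET Core):
using Microsoft.AspNetCore.Mvc;

public class HomeController : Controller
{
    // ActionResult still exists, but IActionResult is the idiomatic return type.
    public IActionResult Index() => View();

    // Attributes like [HandleError] are gone; error handling moves to
    // middleware such as app.UseExceptionHandler("/Home/Error").
    public IActionResult Error() => View();
}
```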

Now it is building locally, so let's see what we can do next. But how did we do that? We did it using Kiro, the AI IDE, which went into general availability a couple of weeks ago after being in preview for some time. What's beautiful about this AI IDE is that you provide the intent of what you want to achieve in natural language; you don't write the actual code, Kiro writes it on your behalf. There are two ways to provide your intent. The first is what Alex showed, mainly known as vibe coding or prompt engineering: you provide the prompt and Kiro makes the changes on your behalf. It's pretty good for tasks like fixing errors, troubleshooting, or building rapid prototypes; I use it all the time for creating proofs of concept for customers. But if you are doing more complex tasks and building production-grade applications, vibe coding might not be the best way forward; you might want to use spec-driven development instead. Kiro supports spec-driven development, and we'll dive deep into it in a moment when we get to the more complex parts of the modernization. For now, let's look at where we are in our journey. We completed the first two steps. The third step is replacing SQL Server with PostgreSQL. Alex, I believe you are going to continue with vibe coding for that.

Step 3: Migrating from SQL Server to PostgreSQL Through Interactive Debugging

Yes, that's right. If you look at the architecture diagram, the first and second steps are done, and we're going to focus here on the SQL Server. We'll replace this with PostgreSQL. Let me switch to the code. I'm running it on Mac, and the project depends on SQL Server LocalDB, so I still cannot run it. But I have a local instance of PostgreSQL running on my machine, so I want to switch from SQL Server to PostgreSQL. First, let me switch to my PostgreSQL, so I can create a new database. It's a brand new empty database. Create database Contoso99. I don't know why 99, but OK. I just created this new database and then let me switch to Kiro.

I have a local running database named contoso99 with a username and an empty password. This is my intent for what I want to do with this application. Kiro says it will help me switch from SQL Server to PostgreSQL: let me check your current setup and make the necessary changes. It's checking my project file, my application settings, the database context, and the startup file, and now it's starting to make the changes. I'm doing everything live; nothing is recorded. Let's see how it does. What model are we using? We're currently using Claude Sonnet 4.5, which is the latest version of Claude Sonnet. It's asking me to approve a restore. A question from the audience: can you point out on screen where the model can be changed? For those who haven't used Kiro: yes, we can switch between different models right here.
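The shape of the change Kiro makes here, sketched under the assumption that the app uses EF Core with the Npgsql.EntityFrameworkCore.PostgreSQL provider (the context name SchoolContext is illustrative):

```csharp
using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);

// Before: options.UseSqlServer(connectionString);
// After: point EF Core at the local PostgreSQL instance instead.
builder.Services.AddDbContext<SchoolContext>(options =>
    options.UseNpgsql(
        builder.Configuration.GetConnectionString("DefaultConnection")));

var app = builder.Build();
app.MapDefaultControllerRoute();
app.Run();
```

This pairs with a matching connection string in appsettings.json, e.g. `"DefaultConnection": "Host=localhost;Database=contoso99;Username=postgres"`.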

How many of you here have used Kiro or have hands-on experience with Kiro? Very few. Has anyone used Kiro to modernize old applications or only for prototyping? I highly recommend trying it out. It's not free, but once you install it, it gives you a good amount of credits, and then every month you get some credits free. It's a fork of VS Code, so you'll still be in a very friendly development environment. You can simply install the IDE and that's pretty much it. You don't need to install VS Code for it. You can simply go to Kiro and install it.

It did the first batch of changes: it updated the application settings, the project file, and the database context. Let's try to run it. While that's running, there was a question about a Visual Studio plug-in: for Visual Studio, we have a plug-in for AWS Transform; for Kiro, I'm not sure yet. I tried to run it and there is an error: an unhandled Entity Framework exception saying DateTime2 is not supported. I'm probably using DateTime2 in my SQL Server code, but DateTime2 is not available in PostgreSQL. As a developer, I could fix it myself, but I'm going to copy the error and ask Kiro: I have a runtime exception, and I'm pasting the error. This is a new way of development: we keep guiding Kiro, let's say Kiro is a junior developer, and it keeps doing the work for us. It's a kind of pair programming where a friendly developer is always there for you. It's finding the references to DateTime2 in different projects, updating them, and accepting the edits; then it says let me check the other models. This is something I as a developer would do: if I fix an error, I try to see if similar errors exist in other code files. I was able to build it. Let me run it again. It looks like there is DateTime2 somewhere else.

It looks like there is a runtime error somewhere else, now in the Person table, but let me just copy this stack trace and paste it into the same session so Kiro keeps the context. This is my conversation about the PostgreSQL migration. Kiro says it found issues in the Person, Student, and Instructor models, so it's fixing them across all the other classes.

While it's working, a question from the audience: what makes this different from using something like Cursor with the same model? It has much deeper integration with your AWS services, and we'll show you how you can deploy it all using a CDK project on AWS.

OK, so let me build it and run it again. Dotnet build. Now at least I don't have runtime errors, and it's running on localhost:5000, so let's see. This is my application, up and running: a .NET 8 application running on my Mac. If I go to the Students page, though, something is not working. I get another runtime error. The page is loading, but it's not able to load the data; the connection string was not initialized. Let me just copy the error.

While Alex is fixing it, another question: does it support other models than what we've seen? You can see there is Claude Sonnet 4.5, Claude Sonnet 4, Claude Haiku, and Claude Opus, among others; you can try them out and experiment. Kiro is now trying to fix the runtime error where the connection string was not initialized correctly, searching for where the connection string is set up.

One more question: regarding the AI part, if you have GitHub Copilot or something similar, would it integrate with that, or is this a separate service? Kiro is an IDE in itself with built-in AI capabilities. I don't think it integrates with GitHub Copilot today, but I will try it out.
OK, so maybe it's not vibe coding mode anymore; it's more like vibe debugging mode. We're trying to fix this issue with PostgreSQL. Let's run it again. I think it's still on port 5000. Now the page loads and there are no more runtime errors, but there is no data on this page. I know that this project has a database initializer which seeds the sample data, so I can simply tell Kiro that I want the database seeded with sample data. As a developer, you still need to know what needs to be done; you keep instructing Kiro to do it on your behalf, and it keeps doing it, hopefully. Again, this is like my pair programmer: I'm talking to Kiro and we're solving the issues together. Another question along the same lines: compared with similar tools, it's a lot more efficient, I would say; similar capabilities, but as I mentioned, much deeper integration with the AWS services when it comes to deployment. We'll look into that.

We'll look into how to create a CDK project and deploy it on AWS. The database initializer was not being called, so now we are calling the initializer. Let's build. Another audience question, about sticking with Visual Studio Code: you can continue using that, but Kiro is a great alternative. Maybe let's leave that for discussion after the session, because we're really short on time and want to show what we need to show. We're still progressing with this upgrade from SQL Server to PostgreSQL, and there is one more error, which I hope is the last one: an error when it's trying to read the data.
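The missing seeding step amounts to invoking the initializer once at startup; a minimal sketch with illustrative names (SchoolContext, DbInitializer):

```csharp
// Program.cs, after building the app: seed the fresh PostgreSQL database.
using (var scope = app.Services.CreateScope())
{
    var context = scope.ServiceProvider.GetRequiredService<SchoolContext>();
    context.Database.EnsureCreated();   // create the schema on the empty database
    DbInitializer.Initialize(context);  // insert sample students, courses, etc.
}

app.Run();
```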

PostgreSQL requires DateTime values to have a Kind specified, whether it's UTC or local. This is a PostgreSQL-specific feature. Let me make some changes. Let's build it and run it. Let me update the page. Now we have the data. We just upgraded the application, and it's now running locally on my Mac OS. It's a .NET 8 application, and instead of SQL Server, it's using PostgreSQL. It took us about twenty minutes in total of vibe coding with Kiro. That's great. What's next?
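The DateTime Kind issue comes from Npgsql mapping DateTime to `timestamp with time zone`, which only accepts UTC values; a sketch of the two usual fixes:

```csharp
// 1. Produce UTC values where the code previously used local time.
var enrollmentDate = DateTime.UtcNow;

// 2. Stamp an explicit Kind on values constructed from literals (e.g. seed
//    data), since their Kind is Unspecified by default.
var term = DateTime.SpecifyKind(new DateTime(2005, 9, 1), DateTimeKind.Utc);
```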

Steps 4-5: Replacing MSMQ with Amazon SQS and Extracting Microservices Using Spec-Driven Development

Let's see where we are in our journey. We modernized from SQL Server to PostgreSQL. The next step is to migrate from MSMQ to Amazon SQS. If you look at the architecture diagram, where the number three is written, we replaced SQL Server with PostgreSQL there. Now we're going to focus on MSMQ and make it Amazon SQS. I think we're going to do this with spec-driven development, and this is where the actual differentiation comes into the picture. Let's switch to code. I will start a new session, and instead of vibe coding, I'm now going into spec coding mode. I already have my prompt ready. I don't want to type it.

I'm saying the current application is using MSMQ for notifications, and I want to switch to SQS implementation. I want just base functionality with no extensive error handling, no unit tests, and no local testing, so I want to keep it at MVP level. I already have SQS created, and this is the URL of the queue. Now I'm in spec development mode. What does it do in spec development mode? First, it's going to create the requirements document. It will create a requirements document based on my project. What does it mean upgrading MSMQ to SQS? We will see this document in a second. It's created the requirements, and now it will ask me to review it. You see there is a new folder created called .kiro, and under .kiro there is a specs folder. It created this SQS notification migration folder. The name makes perfect sense based on my intent that I want to migrate to SQS. It's created with a requirements document. This specification defines the migration of the notification system from in-memory queue implementation to SQS.

Here are the requirements. As an administrator, I want notifications to be sent to SQS instead of an in-memory queue. So notifications are persisted and can be reliably delivered.

The second requirement is that as a developer, I want to use the AWS SDK for .NET so that the application can communicate with SQS. That also makes sense. As a developer, I want the queue URL to be configurable so that I can have different queues for my production environment, my testing environment, and so on. Then, as a developer, this is quite important: I want the existing notifications API to remain untouched. I want to change the underlying implementation but not the interfaces. All the services that use the notification service should remain untouched.
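Kiro's spec mode conventionally writes three documents per spec under `.kiro/specs`; a sketch of the layout (folder and file names as typically generated, here illustrative):

```
.kiro/
└── specs/
    └── sqs-notification-migration/
        ├── requirements.md   # user stories with acceptance criteria
        ├── design.md         # current vs. target architecture, interfaces, config
        └── tasks.md          # ordered implementation tasks
```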

All these requirements make sense to me. I'm moving to the design phase. These requirements are quite generic, so I would say these are requirements for every project where you migrate from MSMQ to SQS. Now what Kiro will try to do is apply these requirements to my specific project. What do I need to do in this project to implement these requirements? This is the design phase, and it takes into account the whole project structure, my classes, my dependencies, and so on. Kiro is working on this design document.

The design document is created. It says: this is the current architecture. Kiro went through my source code to understand what the current architecture looks like: I have a notification service, a base controller, and a notification controller. Then it shows how the target architecture should look and the changes that have to be implemented. It shows some code snippets of how the new notification service should look, the interfaces, some key design decisions, the configuration that I will add to appsettings.json for the queue, and the new dependencies: the project must now depend on the AWS SDK for SQS.

When the notification is saved to SQS, it has to be serialized. This is the class, and this is how the serialization will look. As a developer, I can review this design document and make changes to it, but for me it is now good enough, so I'm moving to the implementation plan. The whole process is: once you provide the intent, you first get requirements, then design, and then the implementation plan. Kiro is now creating tasks, which are single units of implementation that Kiro is going to work through with me. It is asking me whether I want to go fast, keeping some tasks optional, or whether I want to make all the tasks required.
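A sketch of what the designed service might look like, assuming the AWSSDK.SQS NuGet package; the interface and model names (INotificationService, Notification) are illustrative, since the whole point of the design is that existing interfaces stay untouched:

```csharp
using System.Text.Json;
using Amazon.SQS;
using Amazon.SQS.Model;
using Microsoft.Extensions.Configuration;

public class SqsNotificationService : INotificationService
{
    private readonly IAmazonSQS _sqs;
    private readonly string _queueUrl;

    public SqsNotificationService(IAmazonSQS sqs, IConfiguration config)
    {
        _sqs = sqs;
        _queueUrl = config["Sqs:QueueUrl"]
            ?? throw new InvalidOperationException("Sqs:QueueUrl is not configured");
    }

    // Same signature as the MSMQ-backed implementation; only the transport changes.
    public async Task SendAsync(Notification notification)
    {
        await _sqs.SendMessageAsync(new SendMessageRequest
        {
            QueueUrl = _queueUrl,
            MessageBody = JsonSerializer.Serialize(notification),
        });
    }
}
```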

We are short on time, so let's do an MVP level of implementation. Kiro has created the task list. These are the tasks. The first task is to add the SDK dependency. I need to add the NuGet package and then add the configuration to appsettings.json. Then I need to update the notification service to use SQS. I need to implement sending notifications with SQS instead of using MSMQ. I need to do the serialization, send the message, and get the result. I need to implement retrieving notifications and so on. These are the tasks that have to be implemented to fulfill my requirements.
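The configuration task from that list might land in appsettings.json roughly like this (queue URL, account ID, and region are placeholders):

```json
{
  "Sqs": {
    "QueueUrl": "https://sqs.us-east-1.amazonaws.com/123456789012/contoso-notifications"
  },
  "AWS": {
    "Region": "us-east-1"
  }
}
```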

For each task, I can just say start task, and then Kiro will do the implementation of that specific first task. It will add the dependency to the SDK, add the new package, and update my configuration so that my appsettings.json is updated. Kiro is working on this task. It follows the full SDLC cycle: first creating the requirements, then the design, then the tasks, and then it starts implementing the tasks on my behalf. It follows my own mental model. As a developer, I would probably do the same steps. I would figure out what the requirements are, then figure out how these requirements are applicable to my specific project, and what I need to do—exactly what tasks.

Kiro is working, and this is the beauty of a live demo: you don't know what to expect. You got the idea, so let me quickly switch to a new branch. I will just undo all the changes. Discard everything, and I already have a branch with the implemented feature. Let me quickly switch to it. I have this branch for SQS. Switch. This is where the feature is, these are requirements, and I've already implemented this feature. Sure. Let's fast forward. What's next? Let's go to the presentation mode.

Thumbnail 2520

What we have seen is spec-driven development, and that's the main differentiating factor. Vibe coding is good and available in multiple other tools as well, and it is good for rapid prototyping and quick issue fixing. But if you need more in-depth thinking and production-level projects to be built, that's where spec-driven development comes into the picture, which is what we have just seen.

Thumbnail 2530

Thumbnail 2540

Now let's talk about the journey and where we are right now. We did the MSMQ to SQS migration. The next part is extracting a microservice from this monolith. That's what we're going to work on next: extracting this notification service into its own service, again using spec-driven development. Right, Alex?

Thumbnail 2560

Thumbnail 2570

Thumbnail 2580

Yes, let's switch to demo. For the sake of time, I will show what kind of specification was generated and how the project was changed. I will switch to another branch for the notification service. If I go to the Kiro folder, I now have another specification that was created with notification service extraction. If I go to requirements, it says these are requirements for extracting the notification service from this monolithic application into a separate .NET 8 microservice so I can deploy it independently and scale it independently. This is the beauty of microservices.

Thumbnail 2600

Thumbnail 2610

Thumbnail 2630

Thumbnail 2640

Thumbnail 2650

This is the intent you provided, from which it created this specification. The first requirement states: as an architect, I want the notification service to be extracted into a separate WebAPI project so it can be deployed and scaled independently. As a developer, I want to expose REST API endpoints so the main application can send notifications over HTTP. The endpoints should be exposed at /api/notifications, and so on. As a developer, I want the notification API to expose endpoints to receive notifications. As a developer, I want to use HttpClient to communicate with the notification API. I don't want direct references; I want to decouple it and use HTTP calls between the main application and the notification service.
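The decoupling the requirements describe can be sketched as a small HTTP client whose base URL comes from configuration. This is a hedged TypeScript illustration of the pattern (the session's real client is C# with HttpClient; the class and field names here are assumptions):

```typescript
// Hedged sketch: the main app calls the extracted notification service
// over HTTP instead of in-process. The base URL is injected from
// configuration so it can differ per environment.
interface NotificationRequest {
  recipient: string;
  subject: string;
  body: string;
}

class NotificationClient {
  constructor(private readonly baseUrl: string) {}

  // Build the endpoint URL; kept as its own method so it is easy to test.
  endpoint(): string {
    return `${this.baseUrl.replace(/\/+$/, "")}/api/notifications`;
  }

  // POST the notification; fetch is built into Node 18+ and browsers.
  async send(n: NotificationRequest): Promise<boolean> {
    const res = await fetch(this.endpoint(), {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(n),
    });
    return res.ok;
  }
}
```

Because only the client implementation changes, callers inside the main application keep the same interface they used for in-process calls, which is exactly what the "existing controllers continue working without changes" requirement asks for.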

Thumbnail 2660

Thumbnail 2670

Thumbnail 2680

Thumbnail 2700

As an operator, I want the URL to be configurable, so the URL of the notification API can be changed based on the environment. The last one is proper dependency injection. Everything makes sense from my developer standpoint: proper configuration files, and I want existing controllers to continue working without changes. The switch from in-process calls to HTTP calls should happen at the lower implementation level so that the APIs themselves do not change. It then created a design based on this. The design shows the flow for the notification service.

Thumbnail 2710

Thumbnail 2720

Thumbnail 2730

Thumbnail 2740

Thumbnail 2760

The notification service is a separate application that exposes a Notification API, and the Notification API uses SQS. The design document shows how this specification is implemented in the codebase. This resulted in a list of tasks, including creating a new project, setting up appropriate models, and dependency injection. I have already implemented all these tasks. There is a new project created alongside the Contoso University project called NotificationAPI. It has one controller, a service, and a model. It uses the AWS SDK for SQS to communicate with the queue, and I removed the SQS references from the main project.

Thumbnail 2770

Thumbnail 2780

Thumbnail 2800

Thumbnail 2810

Steps 6-7: Refactoring the UI to React and Deploying to AWS with CDK

We are a bit short on time, so we will take questions at the end. We will hang around even after the session is over to address all questions. We have around 14 minutes left to showcase a few extra things. We completed step 5, which is extracting a microservice from the monolith. From an architecture perspective, this is how it looks now. We are incrementally modernizing the application. The next focus is refactoring the UI. When you talk about ASP.NET Core applications, their UI is in Razor Views, for example, and we want to move it to a modern UI framework like React. That is what we are going to do in the next step: refactoring the UI.

Thumbnail 2830

Thumbnail 2850

Thumbnail 2860

Thumbnail 2870

Thumbnail 2880

Thumbnail 2890

Thumbnail 2900

Let me switch to the branch where I already worked on the UI. I will show the tasks for this React migration. The list of tasks is really extensive because we are planning to set up backend API infrastructure, switching from MVC controllers to API controllers. We need to create all the response models and data transfer objects, create the Students API controller, create the Courses API controller, create the Instructors API controller, and handle departments and so on. Then we initialize the React application using TypeScript, add the reference to Material UI, and configure the API. Then we start implementing the forms one by one. First, we build the Students page, create the student list page, create the student form page, and create the student create page. The same goes for departments and courses. It is a very extensive list of tasks. These are tasks which I as a developer would need to do myself. If I get a task to convert from Razor Views to a React application, I know there are 50 forms which I need to create manually, and these are tasks which I need to do form by form, implementing each new form.
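A recurring step in those tasks is mapping the new API controllers' DTOs into shapes the React pages can render. A hedged sketch of that pattern for the students list (the DTO fields follow the classic Contoso University model, but the exact shapes in the session's code may differ):

```typescript
// Hedged sketch: map a students API DTO into a row the React list renders.
// Field names are assumptions based on the Contoso University sample model.
interface StudentDto {
  id: number;
  lastName: string;
  firstMidName: string;
  enrollmentDate: string; // ISO 8601 string as serialized by the API
}

interface StudentRow {
  id: number;
  fullName: string;
  enrolled: string; // YYYY-MM-DD for display
}

function toStudentRow(dto: StudentDto): StudentRow {
  return {
    id: dto.id,
    fullName: `${dto.firstMidName} ${dto.lastName}`,
    // Normalize the API timestamp to a date-only display string.
    enrolled: new Date(dto.enrollmentDate).toISOString().slice(0, 10),
  };
}
```

Keeping mapping functions like this separate from the components is what makes the 50-odd generated pages testable without mounting any React UI.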

I have already implemented these tasks. To answer your question, Prasad, I think it took maybe 4 or 5 hours to implement all these tasks. You might think that 5 hours working together with Kiro is a significant amount of time. However, if you do the same migration manually from Razor Views to React, it could take maybe a couple of weeks for me to implement all 50 forms, APIs, and controllers. Together with Kiro, it is less than a day. This is with the Contoso University UI application implemented with everything—pages, components, and everything else—structured in a very nice way. This is also a way for me to learn what the best practices are nowadays for implementing React applications.

Spec-driven development really stands out because you just create the specifications. Then you can manually edit those specifications if needed. As it implements a task, you can revert a task if needed. You can keep guiding it and it will keep doing it on your behalf. When it comes to vibe coding, that is pretty straightforward, but spec-driven development is where you actually need to follow that specific structure. It will help you implement really complex aspects of it.

Thumbnail 3010

Thumbnail 3020

Thumbnail 3030

So now we have refactored the UI, and the final part is creating the CDK project. If you want to look at the diagram, this is how it looks: the presentation layer changes to a React app, and instead of MVC controllers, it is now API controllers. Right? So now let's do the last part, which is deploying this whole modernized application. The codebase is running and modernized locally, but let's deploy it on AWS by creating a CDK project.

Thumbnail 3040

Thumbnail 3070

Yeah, so again, I already created a specification and the CDK project. For CDK it took me maybe a couple of hours to create the project and deploy it, and then there were some small issues to fix. But again, instead of maybe a couple of days, it took a couple of hours. So let me switch. I was using spec-driven development, so this is the specification created for the CDK deployment.

Thumbnail 3080

Thumbnail 3100

From a requirements perspective, I want to deploy this Contoso University application using CDK with C#, because CDK supports multiple languages and we want to use C#. It detected that the application has three components: the React frontend, the main backend, and the notification service API. It must also provision the required infrastructure, compute and database, so instead of my local PostgreSQL database it should provision an Aurora PostgreSQL database, plus supporting resources like an Application Load Balancer, Secrets Manager to securely store my database credentials, and so on.

Thumbnail 3120

Thumbnail 3130

Thumbnail 3140

Thumbnail 3150

And then it created the tasks: create the CDK project, create the EC2-based deployment, deployment scripts, and so on. The output is this CDK project with multiple constructs: constructs for my EC2 instances for backend services, a construct for the database, one for the frontend, one for messaging. It also created a number of scripts, because when it deploys to EC2 it has to provision resources first, then get the URL of the load balancer, then update the React application with this load balancer URL for the backend API, and then redeploy the React application. Kiro automatically discovered these dependencies, that the deployment must happen in these steps, in this order, and it created all of this for me.
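That deployment ordering can be sketched as a small orchestration script. Everything below is an assumption for illustration, not the session's generated scripts: stack names, the CloudFormation output key, and the frontend environment variable are all hypothetical, and the script defaults to a dry run that only records the commands it would execute:

```typescript
// Hedged sketch of the deploy ordering Kiro discovered: backend first,
// then bake the resulting ALB URL into the React build, then frontend.
// Set DRY_RUN=0 to actually execute the commands; by default we only
// record them, which keeps this sketch safe to run anywhere.
import { execSync } from "node:child_process";

const dryRun = process.env.DRY_RUN !== "0";
const plan: string[] = [];

function run(cmd: string): string {
  if (dryRun) {
    plan.push(cmd); // record the intended command instead of executing it
    return "";
  }
  return execSync(cmd, { encoding: "utf8" }).trim();
}

// 1. Provision the backend first: EC2 instances, Aurora PostgreSQL, ALB, SQS.
run("cdk deploy ContosoBackendStack --require-approval never");

// 2. Read the ALB URL the backend stack exported (placeholder when dry).
const albUrl = dryRun
  ? "http://example-alb.local"
  : run(
      "aws cloudformation describe-stacks --stack-name ContosoBackendStack " +
        "--query \"Stacks[0].Outputs[?OutputKey=='AlbUrl'].OutputValue\" --output text"
    );

// 3. Rebuild the React app with the backend URL baked into its config.
run(`REACT_APP_API_BASE_URL=${albUrl} npm --prefix frontend run build`);

// 4. Deploy the frontend stack (S3 bucket behind CloudFront) last.
run("cdk deploy ContosoFrontendStack --require-approval never");
```

The key point is the data dependency: the frontend build cannot happen until the backend stack has produced a load balancer URL, which is why a single `cdk deploy --all` is not enough here.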

Thumbnail 3170

Thumbnail 3180

Thumbnail 3190

This is the application deployed on AWS behind CloudFront. Something failed recently, my load balancer died, so the courses page currently fails to load, but this is the React application connected to the Amazon Aurora PostgreSQL database through the load balancer. Again, everything was deployed by the CDK project that Kiro created for me.

We can show the architecture diagram, so let's switch to the slides and look at the architecture. We did the last step, the CDK project. Looking at the architecture diagram, you can see the frontend: the React app is deployed in an S3 bucket and fronted by Amazon CloudFront. The web API is deployed on an Amazon EC2 Linux instance with ASP.NET Core API controllers, and the same code uses Entity Framework to talk to Amazon Aurora PostgreSQL. So it's not just the local PostgreSQL instance; we've deployed Amazon Aurora PostgreSQL.

And as you can see, there's another service, the notification service, deployed on a separate EC2 instance, and it talks to Amazon SQS. These are all the components that were modernized and deployed on AWS. This is a very simplified version of the architecture, because there is also an Application Load Balancer with path-based routing, for example to the notification service, and there is also Secrets Manager to connect to the database, and so on. Here we're just showing the deployment components so you can mentally map our legacy monolithic application to the new architecture.
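The path-based routing mentioned above can be illustrated with a tiny routing function: the load balancer inspects the request path and forwards it to the right target group. The rule paths below are assumptions, not the session's actual ALB listener rules:

```typescript
// Hedged sketch of ALB path-based routing: /api/notifications/* goes to
// the notification service instance, everything else to the main backend.
// The actual listener rules in the session's CDK project may differ.
type Target = "notification-service" | "main-backend";

function routeTarget(path: string): Target {
  return path.startsWith("/api/notifications")
    ? "notification-service"
    : "main-backend";
}
```

In the real deployment this logic lives in the ALB listener rules, not in application code, but the effect is the same: one public endpoint fanning out to two independently scaled services.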

Thumbnail 3290

Thumbnail 3300

Summary: From Legacy Monolith to Modern Cloud-Native Architecture

That's where the whole differentiation of Kiro comes into the picture: spec-driven development, a structured way of doing it, and then deploying it on AWS. If I try to summarize what we have achieved in this session: we started from a legacy ASP.NET application with Razor Views, MVC controllers, and Microsoft components like MSMQ and SQL Server, and by doing the seven steps we ended up with this.

Thumbnail 3340

By following these seven steps, we ended up with a modernized application. The first step was porting the .NET Framework application to .NET 8. Then we fixed the errors and built it locally. We replaced SQL Server with PostgreSQL and replaced MSMQ with Amazon SQS. Next, we broke up the monolith by extracting a microservice, specifically the notification service. Then we refactored the UI to React, and finally, we used CDK to deploy it onto AWS with all the components required to make it a functioning application.

That is pretty much what we have for today. We do have a few resources for you to get started: AWS Transform for .NET for upgrading your .NET applications on AWS, and Kiro for doing not only vibe coding but also spec-driven development, which is where the real strength of Kiro as an AI IDE comes into the picture. I will pause for a moment for everyone to take pictures.

Thumbnail 3350

Thumbnail 3380

Thumbnail 3390

We also have a few resources, including many agentic AI courses on AWS Skill Builder. Feel free to check them out, as many of them are free courses on agentic AI on AWS. Thank you very much. Hopefully you enjoyed it, and we are here for any questions.


; This article is entirely auto-generated using Amazon Bedrock.
