Kazuya

AWS re:Invent 2025 - Amazon GameLift Streams Powers Bandai Namco Entertainment's Metaverse (IND394)

🦄 Making great presentations more accessible.
This project aims to enhance multilingual accessibility and discoverability while maintaining the integrity of the original content. Detailed transcriptions and keyframes preserve the nuances and technical insights that make each session compelling.

Overview

📖 AWS re:Invent 2025 - Amazon GameLift Streams Powers Bandai Namco Entertainment's Metaverse (IND394)

In this video, Bandai Namco Entertainment and AWS Professional Services present their joint project creating the Gundam Metaverse, a global 3D space accessible from any device. The team of up to 80 engineers utilized Amazon GameLift Streams to deliver browser-based streaming at 1080p/60fps across 20+ countries, implementing multi-location stream groups to achieve under 20ms latency in Tokyo and under 150ms between Tokyo and Oregon. They developed custom auto scaling logic using Lambda and EventBridge to balance cost and user experience by dynamically adjusting capacity based on surplus rate thresholds of 30% and 50%. A single Unreal Engine 5 build supported PC, iOS, and Android through virtual gamepad implementation in the web SDK. Puppeteer-based automated testing on Amazon ECS enabled large-scale multiplayer simulation. The project also integrated Login with Amazon and Amazon API to create a seamless in-metaverse shopping experience using overlay web screens to handle authentication while leveraging PC cache. The solution is being standardized for partner adoption across character metaverse, digital twins, and training applications.


; This article is entirely auto-generated while preserving the original presentation content as much as possible. Please note that there may be typos or inaccuracies.

Main Part

Thumbnail 0

Introduction: Bandai Namco and AWS Collaborate on Gundam Metaverse Project

Hi, my name is Masamichi Tanaka. I'm a Senior Principal Strategic Engagement at AWS. This project is a joint project between Bandai Namco and AWS, and we'll be talking about how we created the metaverse space throughout the session.

Thumbnail 20

In this session, we'll talk about some of the technologies we used in this metaverse creation, namely three different solutions that we provided. First, AWS Professional Services. Have any of you ever used professional services in your projects before? Anybody?

Thumbnail 40

AWS Professional Services is an offering we provide to customers when a project involves complicated issues. We offer strategic and technical consulting, bringing best practices from engagements worldwide to jumpstart your project. We also handle actual integration work with professional services engineers, and we can draw on the partner network, so we can take on large projects and deliver the best results. Amazon GameLift Streams is a technology we use to provide on-demand, low-latency video streaming to customers regardless of which devices they are using. In this project, we were able to stream the Gundam metaverse space to users on smartphones, for instance. We also used the Amazon API to connect Amazon to the space so that one-click shopping was possible.

Thumbnail 120

Let's dive into the team structure. On the left side, we have the Bandai Namco Entertainment team. We're fortunate to have two of the gentlemen from the team on stage today. On the right, we have the AWS Professional Services team. At its peak, the project had 80 engineers. Bandai Namco took care of strategy planning, creative direction, and quality control, and AWS Professional Services took care of the infrastructure, game servers, and 3D space development as well as project management.

Thumbnail 170

I'm not sure if you're aware of the IP Gundam, but we'll start off the session with a video so that you have an idea of what Gundam Metaverse is about.

Thumbnail 180

Thumbnail 190

Thumbnail 200

Thumbnail 210

Thumbnail 220

Thumbnail 230

Building a Global Community: The Vision and Business Challenges Behind Gundam Metaverse

Let's welcome Daisuke Omori to the stage. Thank you for joining us today. I'm Daisuke Omori from Bandai Namco Entertainment. The Bandai Namco Group is built on character brands. We cover toys, games, animation, music, and amusement.

Thumbnail 260

Within the group, we are a digital entertainment company focused on console and mobile games. We handle over 500 IPs each year. From long-run titles to new ones, we bring them to people worldwide. This scale shows why we must connect fans across generations and regions. In this project, we are taking on a new challenge with Mobile Suit Gundam.

Thumbnail 280

Thumbnail 310

Mobile Suit Gundam started in 1979. It's known for realistic world building and human drama. Since then, many TV and film titles have followed. For decades, people around the world have enjoyed this IP. How do we give fans a place to gather worldwide and help the community grow? That is why we are building the metaverse.

Thumbnail 340

This is the concept for the Gundam metaverse we're building. First, fans can experience and dance in iconic worlds together. Second, accessibility, so anyone can join anytime, anywhere on any device. Third, one smooth journey where you can watch, join live, and shop in one place. With these three core ideas, we aim to create a place where fans around the world gather, have real conversations, and help the community grow.

Thumbnail 390

That's the background. Now, our business challenges. First, global connectivity. With different regions and networks, we required scalable ways to bring everyone into one shared space. Second, accessibility: a smooth 3D experience anytime, anywhere, on any device. Third, one combined experience where you can watch, join live, and shop in one smooth journey. Working with AWS Professional Services, we solved these challenges with cloud technology.

Thumbnail 450

System Architecture Overview: Leveraging Amazon GameLift Streams for Multi-Regional, Multi-Platform Delivery

On the next page, Akinori from AWS will present the technical details. Hello everyone. I'm Akinori from AWS Professional Services. In this section, I will introduce the system architecture of the Gundam Metaverse, which has been provided to Gundam fans in more than 20 countries worldwide.

Thumbnail 460

Let me first highlight AWS Professional Services' role in this project. We provided comprehensive support across multiple areas beyond cloud infrastructure, from strategic planning support to the build, learn, and growth phases of the metaverse application development. Our deep understanding of both strategy and applications enabled us to build a global scalable cloud infrastructure that maximizes value. Today's session will specifically focus on the cloud infrastructure area.

Thumbnail 500

Here is the architecture overview of the Gundam Metaverse. Let me walk you through each component following the user flow. In step one, players authenticate using BANDAI NAMCO ID to access the web UI. Next, step two is a key point. When players access the web UI, the game application is launched on GPU instances in AWS cloud using cloud gaming technology, and then the game session is established.

In step three, players access the game session through their web browser by sending mouse and keyboard input. Moving to step four, the game server backend provides online multiplayer functionality to clients by sharing information such as other players' positions and movements. In step five, we have implemented integration with Amazon to provide an immersive shopping experience within the metaverse. In step six, all player activity logs are collected in the analytics platform, providing dashboard insights in near real time.

Thumbnail 600

Finally, in step seven, we have also implemented a CI/CD pipeline optimized for cloud gaming. This allows developers to deliver updates to players smoothly.

Thumbnail 650

This slide introduces the key technologies used in our architecture. They include Amazon GameLift Streams for browser-based metaverse experiences and the Amazon API for seamless shopping experiences. The game platform is built with Unreal Engine 5 and the Nakama game server. In our development process, we leveraged JetBrains TeamCity and Perforce Helix Core to enable automated build and deployment. Today we will explain Amazon GameLift Streams and the Amazon API in detail, as these two services played crucial roles in addressing our business challenges. Amazon GameLift Streams enabled multi-region and multi-platform delivery, achieving global reach and an accessible experience. The Amazon API enables seamless purchasing of Amazon products within the metaverse, facilitating immersive shopping.

Thumbnail 690

In the following slides, we'll take a deeper look at Amazon GameLift Streams. Let me explain what Amazon GameLift Streams is. It's a recently released AWS service that enables browser-based game delivery. With GameLift Streams, games can run on GPU instances in AWS cloud, allowing on-demand low-latency streaming to players worldwide. This streaming functionality supports up to 1080p resolution at 60 frames per second.

Thumbnail 720

GameLift Streams operates through a global network of endpoints. As shown on this map, we have access to 10 regions worldwide, including beta access locations. This extensive coverage enables us to provide low-latency streaming by connecting players to their nearest endpoint. For the Gundam metaverse, this global infrastructure has been essential in delivering consistent performance across different regions.

Thumbnail 760

Another powerful feature of GameLift Streams is its device accessibility. As shown here, streaming is possible on any device equipped with a web browser, from TVs and tablets to computers and various streaming devices. This allows us to deliver gaming experiences even to devices with limited processing power. As a result, our potential player base is significantly expanded.

Thumbnail 800

This slide shows actual gameplay footage from the Gundam Metaverse. By utilizing GameLift Streams, we have successfully provided a browser-based metaverse experience, enabling Gundam fans from over 20 countries worldwide to interact with each other. Furthermore, while we initially only supported PC, we were able to quickly expand support to mobile devices using GameLift Streams.

Thumbnail 830

In the following sections we will explain detailed GameLift Streams architecture from two key aspects. First, multi-regional scalability. The Gundam metaverse required global game delivery, and GameLift Streams enabled us to scale by region based on demand. We also optimize capacity by analyzing real-time and cross-regional access patterns. Second, multi-platform accessibility. Browser-based access through GameLift Streams allowed us to deliver to multiple devices using a single build. We also ensured quality through automated end-to-end testing, integrating with standard browser-based testing tools.

Technical Deep Dive: Network Latency Optimization and Dynamic Capacity Management with Multi-Location Stream Groups

For more detailed insights on these aspects, I'll hand over to Natsuhiro Maruyama from Bandai Namco Entertainment. All right, thank you everyone. I'm Natsuhiro Maruyama from Bandai Namco Entertainment and I'd like to walk you through some of the thought process that went into the technical implementation of the Gundam metaverse. So there were two key requirements for making it globally accessible.

Thumbnail 910

Thumbnail 940

We needed to provide a comfortable experience from anywhere in the world and enable access from any device. We went with a multi-region, multi-platform strategy supporting over 20 countries and 3 platforms. However, this came with 4 technical challenges: network latency optimization, dynamic capacity management, platform experience equity, and multi-platform testing framework, each involving complex trade-offs.

Thumbnail 960

Network latency optimization came with 3 additional challenges to address. We needed to optimize latency in each region, secure thousands of GPU servers to match user demand, and manage the operational complexity of a globally distributed infrastructure. To solve this, we utilized Amazon GameLift Streams' feature called multi-location stream groups. The advantage of this feature is that once you configure a build in the primary location, GameLift Streams automatically deploys the same binary to all other locations.

This centralized the infrastructure management across multiple regions, significantly reducing engineering resources and operational burden while enabling global deployment. In terms of performance, we achieved low latency of under 20 milliseconds within the Tokyo region and under 150 milliseconds between Tokyo and Oregon regions. Operationally, we can deploy to multiple regions with a single CI/CD pipeline, which has shortened the release cycle and maintained development velocity. By leveraging this feature, we were able to clearly separate client-side and server-side responsibilities, greatly simplifying both development and operations.

Thumbnail 1030

Thumbnail 1050

Let me explain the specific implementation details of how we select regions using GameLift Streams. First, the client measures latency to each region. In this example, the latency to the Tokyo region was 20 milliseconds and to the Oregon region was 118 milliseconds. Based on the latency measured on the client side, we pass regions ordered by latency as the Locations parameter in the StartStreamSession API. In this example, Tokyo is first and Oregon is second. GameLift Streams evaluates regions in this order and automatically allocates a session in the optimal region. You can confirm the allocated region with the GetStreamSession API.
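The article does not reproduce the actual call, so here is a minimal sketch of the flow just described, using the AWS SDK for JavaScript v3. The package and field names (`@aws-sdk/client-gameliftstreams`, `Identifier`, `ApplicationIdentifier`, `SignalRequest`, `Locations`) are assumptions based on the GameLift Streams API and should be checked against the current SDK documentation; this is not Bandai Namco's actual implementation.

```typescript
// Sketch: request a stream session, preferring regions by measured latency.
// Package and field names are assumptions; verify against the current
// @aws-sdk/client-gameliftstreams documentation.
import {
  GameLiftStreamsClient,
  StartStreamSessionCommand,
  GetStreamSessionCommand,
} from "@aws-sdk/client-gameliftstreams";

const client = new GameLiftStreamsClient({ region: "ap-northeast-1" });

export async function startSessionByLatency(
  streamGroupId: string,
  applicationId: string,
  webRtcOffer: string,               // SDP offer produced by the web SDK in the browser
  regionsOrderedByLatency: string[]  // e.g. ["ap-northeast-1", "us-west-2"]
) {
  // GameLift Streams tries the locations in the given order and allocates a
  // session in the first region with available capacity, so the client does
  // not need its own retry logic.
  const started = await client.send(
    new StartStreamSessionCommand({
      Identifier: streamGroupId,
      ApplicationIdentifier: applicationId,
      Protocol: "WebRTC",
      SignalRequest: webRtcOffer,
      Locations: regionsOrderedByLatency,
    })
  );

  // Confirm which region the session was actually allocated in.
  const session = await client.send(
    new GetStreamSessionCommand({
      Identifier: streamGroupId,
      StreamSessionIdentifier: started.Arn!,
    })
  );
  return session.Location;
}
```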

Thumbnail 1090

As you can see, the region selection logic was quite simple to implement. Now, let's see how it behaves when a region has no available capacity. Just like before, we call the StartStreamSession API with Tokyo first. GameLift Streams checks Tokyo's capacity first, but let's say there is no availability in Tokyo. It then automatically checks the next region, which is Oregon in this case, and allocates a session if there's availability there. What is important is that there is no need to implement retry logic on the client side.

Thumbnail 1140

Thumbnail 1160

This clarified client and server responsibilities and really simplified our development. By sequentially trying multiple regions, we improved our success rate in allocating sessions and made overall operations more stable. Also, it's easy to add regions with multi-location stream groups. As the number of users increases, we can flexibly add new regions. In this example, we've used the GameLift Streams console to add 4 regions: Ohio, Frankfurt, Ireland, and North Virginia. These additional regions work immediately by simply passing them as a list in the locations parameter. In this example, we've specified 6 regions. If there is no available session in Tokyo, the system will allocate a session in Ireland, which is the second region on the list.

Thumbnail 1190

As you can see, multi-location stream groups enabled us to optimize latency in a global environment. However, we faced a new challenge: how to predict capacity in each region and adjust it according to demand. Let's take a look at this dynamic capacity management challenge. There's a trade-off we have to consider. Having more available sessions improves user experience, but it also increases infrastructure costs.

Thumbnail 1230

GameLift Streams offers two options. Always on provides instant access, but it's costly. On-demand takes over 60 seconds to start, but scales dynamically. Each had its own pros and cons, and neither alone could really resolve this trade-off. For the Gundam metaverse, we chose the always on option to prioritize player experience. To balance this with cost optimization, we implemented two approaches.

First is custom auto scaling. This dynamically adjusts always on capacity according to demand. I'll come back to this later in the next slide. And the second component is adaptive region auto scaler. Under normal conditions, we consolidate users from Asia in the Tokyo region and users from other areas in the Oregon region, which offers the lowest infrastructure cost. But when available sessions run low or network latency increases, additional regions automatically activate, expanding to as many as 6 regions. This allows us to operate with the minimum necessary regions while flexibly responding to demand surges.

Thumbnail 1290

These two approaches optimize the balance between cost and user experience. So going back to the custom auto scaling, let me explain in a little more detail. First, let me go over the foundational concepts in GameLift Streams. There are 3 key metrics: desired capacity, allocated capacity, and idle capacity. Desired capacity is the total capacity you request, and it consists of two parts: allocated capacity, which is actively in use, and idle capacity, which is running but not yet assigned to any session.

For example, if desired capacity is 100, allocated capacity might be 70 and idle capacity would be 30. To maintain a good user experience, we need to secure a certain amount of idle capacity while appropriately performing scale-in and scale-out operations to reduce wasted cost. Therefore, we implemented custom logic that takes all three metrics into account. The next slide explains this implementation in detail.

Thumbnail 1360

The custom auto scaling logic dedicated to GameLift Streams is implemented in a Lambda function triggered every minute by EventBridge. This diagram shows the Lambda function implementation. First, Lambda retrieves the current desired capacity, allocated capacity, and idle capacity metrics with the GetStreamGroup API. Then it calculates the surplus rate, which is idle capacity divided by desired capacity.

If the surplus rate is 30% or less, the function determines that idle capacity is too low to accept new users and calls the UpdateStreamGroup API to increase the desired capacity. Conversely, if the surplus rate is 50% or more, capacity is excessive and it scales in. By achieving dynamic capacity management like this, we were able to balance cost and user experience.
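As a rough illustration of the logic above, here is a sketch of such a Lambda handler in TypeScript (invoked every minute by an EventBridge rule, as described). The GetStreamGroup and UpdateStreamGroup API names come from the talk, but the request and response field names (`LocationStates`, `RequestedCapacity`, `IdleCapacity`, `AlwaysOnCapacity`) and the scaling step size are assumptions, not the team's actual code.

```typescript
// Sketch of the per-minute auto scaling Lambda (triggered by an EventBridge rule).
// API names come from the talk; exact field names and the scaling step are assumptions.
import {
  GameLiftStreamsClient,
  GetStreamGroupCommand,
  UpdateStreamGroupCommand,
} from "@aws-sdk/client-gameliftstreams";

const client = new GameLiftStreamsClient({});
const STREAM_GROUP_ID = process.env.STREAM_GROUP_ID!;
const LOCATION = process.env.LOCATION ?? "ap-northeast-1";

const SCALE_OUT_THRESHOLD = 0.3; // surplus rate at or below this: add capacity
const SCALE_IN_THRESHOLD = 0.5;  // surplus rate at or above this: remove capacity
const STEP = 10;                 // capacity units added/removed per run (assumption)

export const handler = async (): Promise<void> => {
  const group = await client.send(
    new GetStreamGroupCommand({ Identifier: STREAM_GROUP_ID })
  );

  // Assumed shape: per-location capacity numbers on the stream group response.
  const loc = group.LocationStates?.find((l) => l.LocationName === LOCATION);
  if (!loc) return;

  const desired = loc.RequestedCapacity ?? 0;
  const idle = loc.IdleCapacity ?? 0;
  if (desired === 0) return;

  // Surplus rate = idle capacity / desired capacity.
  const surplusRate = idle / desired;

  let newCapacity = desired;
  if (surplusRate <= SCALE_OUT_THRESHOLD) {
    newCapacity = desired + STEP;                 // too little headroom: scale out
  } else if (surplusRate >= SCALE_IN_THRESHOLD) {
    newCapacity = Math.max(STEP, desired - STEP); // excess headroom: scale in
  } else {
    return; // within the comfortable band, do nothing
  }

  await client.send(
    new UpdateStreamGroupCommand({
      Identifier: STREAM_GROUP_ID,
      LocationConfigurations: [
        { LocationName: LOCATION, AlwaysOnCapacity: newCapacity },
      ],
    })
  );
};
```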

Thumbnail 1420

Achieving Platform Experience Equity: Virtual Game Pads and Automated End-to-End Testing for Cloud Gaming

Next, I'll explain the technical challenges related to the multi-platform strategy. Here we face two technical challenges. First is platform experience equity. How do we provide a fair experience across all devices? And second is multi-platform testing framework. How do we ensure quality across diverse platforms? I'll explain how we address these two challenges.

Thumbnail 1470

The Gundam metaverse is designed to be playable on any device: PC, iOS, or Android. We identify the device using the user agent header sent by the web browser and switch the web UI layout accordingly. To properly send player control inputs to the game on PC and mobile devices, we implemented GameLift Streams' web SDK in the browser. The web SDK comes with default functionality for keyboard and mouse inputs as well as automatic controller recognition.
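As a small illustration (not the team's actual code), a user-agent check of this kind might look like the following; the CSS class name is purely illustrative.

```typescript
// Illustrative sketch: choose a UI layout from the browser's user agent.
type Layout = "pc" | "mobile";

function detectLayout(userAgent: string): Layout {
  // iOS and Android browsers identify themselves in the user agent string;
  // everything else is treated as a PC with keyboard and mouse input.
  return /iPhone|iPad|Android/i.test(userAgent) ? "mobile" : "pc";
}

// Example: only show the virtual gamepad UI on mobile layouts.
const layout = detectLayout(navigator.userAgent);
document.body.classList.toggle("show-virtual-gamepad", layout === "mobile");
```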

Thumbnail 1500

Additionally, it's possible to add virtual game pads and specify custom key mappings. For the Gundam metaverse, we implemented virtual game pads for mobile device operation. This slide shows a sample implementation of the virtual game pad for mobile users. On the left side you can see the UI component definitions. We define joysticks and buttons along with event listeners for player operations.

On the right side you can see the SDK integration code. First, we create a virtual game pad object and register it with the GameLift Streams SDK. Then, when browser events occur, we receive them and convert them to virtual game pad axes and button inputs. Finally, by calling process game pads, these inputs are sent to the Unreal Engine client on GameLift Streams.
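The slide code itself is not reproduced in the article, so the following is a condensed sketch of that flow: DOM touch events are translated into axis and button state, which is forwarded to the streaming SDK every frame. The `registerVirtualGamepad` and `processGamepads` calls are placeholders for whatever the GameLift Streams Web SDK actually exposes, and the element IDs are illustrative.

```typescript
// Condensed sketch of the virtual gamepad flow. `streamSdk.registerVirtualGamepad`
// and `streamSdk.processGamepads` are placeholders for the real GameLift Streams
// Web SDK methods; element IDs are illustrative.
interface VirtualGamepadState {
  axes: [number, number, number, number]; // left stick X/Y, right stick X/Y
  buttons: boolean[];                     // pressed state per button index
}

declare const streamSdk: {
  registerVirtualGamepad(state: VirtualGamepadState): void;
  processGamepads(): void;
};

const pad: VirtualGamepadState = { axes: [0, 0, 0, 0], buttons: Array(16).fill(false) };
streamSdk.registerVirtualGamepad(pad);

// Left on-screen joystick: convert the touch position into normalized axis values.
const joystick = document.getElementById("left-joystick")!;
joystick.addEventListener("touchmove", (e) => {
  const touch = e.touches[0];
  const rect = joystick.getBoundingClientRect();
  pad.axes[0] = ((touch.clientX - rect.left) / rect.width) * 2 - 1;  // -1 .. 1
  pad.axes[1] = ((touch.clientY - rect.top) / rect.height) * 2 - 1;  // -1 .. 1
});
joystick.addEventListener("touchend", () => { pad.axes[0] = 0; pad.axes[1] = 0; });

// "A" button mapped to gamepad button 0.
const buttonA = document.getElementById("button-a")!;
buttonA.addEventListener("touchstart", () => { pad.buttons[0] = true; });
buttonA.addEventListener("touchend", () => { pad.buttons[0] = false; });

// Each animation frame, forward the current state so the inputs reach the
// Unreal Engine client running on GameLift Streams.
function tick() {
  streamSdk.processGamepads();
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```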

Thumbnail 1550

To wrap up this technical portion of the presentation, let me talk about the development environment unique to cloud gaming. In traditional multi-platform development, you typically need to create separate binaries for various devices and conduct QA testing for each platform. However, with GameLift Streams, a single binary supports multiple platforms because the browser handles all input and the server only streams video. This drastically simplifies the development process and enables faster iterations. Additionally, because it's browser-based, web automation tools work seamlessly, allowing us to execute large-scale multiplayer tests with ease.

Thumbnail 1590

Thumbnail 1610

To test multiplayer scenarios at scale, we spin up multiple GameLift Streams instances and simulate browser interactions. Using Step Functions and Amazon ECS, we launch Puppeteer-based headless browsers that perform automated operations and record sessions to Amazon S3. This video shows how Puppeteer enables scalable end-to-end testing. It automatically spins up multiple clients, controls the browsers, handles logins, and performs in-game actions. By running hundreds of these sessions in parallel, we can simulate thousands of simultaneous players. All tasks are recorded, so we can verify issues anytime, whether after hours or after fixes. This helped us identify issues that occurred only when multiple users connected to the same server simultaneously. Interestingly, these problems were not visible in server monitoring. This brought unexpected benefits to our cloud gaming project and significantly improved the overall quality of our game development.
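To make the setup concrete, here is a simplified sketch of one such headless test client. In the actual pipeline these run as ECS tasks started by Step Functions and the recordings go to S3; the URL, selectors, and credentials below are placeholders, not the project's real values.

```typescript
// Sketch of one headless test client. In the real pipeline, many of these run
// in parallel as ECS tasks started by Step Functions; the URL, selectors, and
// credentials are placeholders.
import puppeteer from "puppeteer";

async function runTestClient(clientId: number): Promise<void> {
  const browser = await puppeteer.launch({
    headless: true,
    args: ["--no-sandbox", "--use-gl=swiftshader"], // software WebGL for headless video
  });
  const page = await browser.newPage();
  await page.setViewport({ width: 1280, height: 720 });

  // Log in and enter the metaverse (placeholder URL and selectors).
  await page.goto("https://example.com/metaverse", { waitUntil: "networkidle2" });
  await page.type("#login-id", `test-user-${clientId}`);
  await page.type("#login-password", process.env.TEST_PASSWORD ?? "");
  await page.click("#login-button");
  await page.waitForSelector("#stream-canvas", { timeout: 120_000 });

  // Simulate in-game input against the streamed canvas for a few minutes.
  const until = Date.now() + 5 * 60_000;
  while (Date.now() < until) {
    await page.keyboard.down("KeyW"); // walk forward
    await new Promise((r) => setTimeout(r, 2_000));
    await page.keyboard.up("KeyW");
    await page.screenshot({ path: `/tmp/client-${clientId}-${Date.now()}.png` });
  }

  await browser.close();
}

// A single ECS task can host several clients; hundreds of tasks in parallel
// approximate thousands of simultaneous players.
Promise.all(Array.from({ length: 4 }, (_, i) => runTestClient(i))).catch(console.error);
```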

Thumbnail 1690

Seamless Shopping Integration: Using Login with Amazon and Amazon API for Immersive Commerce Experience

So, that concludes the technical walkthrough about cloud gaming, and I would like to bring up Mass from AWS again, and he will talk about the integration with Amazon. Thanks, Natsu. I'll now introduce the immersive shopping experience in the metaverse space realized through Amazon. In the Gundam metaverse, we utilize two technical elements, namely Login with Amazon and Amazon API to create a seamless purchasing experience within the metaverse space.

Thumbnail 1700

Thumbnail 1710

Thumbnail 1740

In the seamless purchasing experience integrated with Amazon, players can discover and purchase products without leaving the metaverse space. More specifically, users can view Amazon product details within the space, along with pricing and shipping information, and when they decide to purchase an item, they simply press a button; the order is sent to Amazon and the item is delivered to them. In the Gundam metaverse, we developed an Amazon-integrated metaverse shop that enables this seamless shopping experience using two technical elements, which we'll talk about in detail. One is Login with Amazon and the other is the Amazon API.

Thumbnail 1760

This diagram shows the architecture of the Amazon-integrated metaverse shop. When players visit the shop for the first time, they're prompted to log in using Login with Amazon. Once they click the button and enter their credentials, their Amazon account is linked to their account within the metaverse space. Then they can browse and purchase products: all they need to do is click a button, and the order information is sent to Amazon via the API. After the order is placed, it is processed within Amazon and the purchase is delivered through Amazon. Players can also use the Amazon website for customer support.

Thumbnail 1810

Let me dive deep into how we integrated Login with Amazon. Login with Amazon provides a way for users to link their Amazon accounts to other services, such as the metaverse space. In the Gundam Metaverse, when a user presses the Login with Amazon button, the metaverse shop pops up a web screen displayed as an overlay. We'll describe why we use an overlay to do this.

Thumbnail 1850

The Gundam metaverse runs on cloud gaming, and because the game itself is not running on the player's own device, the Amazon account link could not be cached and would have to be re-established each time. This presented a couple of issues. Normally, linking an Amazon account requires displaying an authentication screen showing the terms of service and privacy policy, then users need to input their username and password, and after that a permission confirmation screen is displayed. That's a lot, right? If they drop off during that process, they need to redo the whole thing.

Thumbnail 1910

We decided to use an overlay to overcome this issue. While a web screen running on the user's PC can utilize the browser cache, a screen rendered in the cloud cannot. We addressed this by implementing an overlay web screen, external to the streamed metaverse game and running on the player's PC, that uses the PC's cache for the authentication screens. This allowed us to obtain authentication tokens for subsequent Amazon API calls and also gave us a way to capture user actions during purchases.

Thumbnail 1940

Thumbnail 1960

This slide shows an implementation example. As you can see, a small amount of code allowed us to display the standard Amazon authentication screen without having to create everything from scratch. Next, I'll describe examples of Amazon API usage within the metaverse. Using the API, we are able to pull any information available on Amazon. For instance, users can see product details and delivery timing, and preview orders. Once they see all that information and decide to purchase, all they have to do is click a button and the order information is sent to Amazon via the API.
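The slide's snippet is not reproduced in the article, but a comparable sketch using the standard Login with Amazon website SDK (which exposes a global `amazon.Login` object) might look like this. The client ID, backend endpoint, and scope are placeholders, not the project's actual values.

```typescript
// Sketch of the overlay's authentication code using the standard Login with
// Amazon website SDK. The SDK script is loaded from Amazon as usual and calls
// window.onAmazonLoginReady once ready; the client ID below is a placeholder.
declare const amazon: {
  Login: {
    setClientId(clientId: string): void;
    authorize(
      options: { scope: string },
      callback: (response: { access_token?: string; error?: string }) => void
    ): void;
  };
};

(window as any).onAmazonLoginReady = () => {
  amazon.Login.setClientId("amzn1.application-oa2-client.EXAMPLE"); // placeholder
};

document.getElementById("login-with-amazon")!.addEventListener("click", () => {
  amazon.Login.authorize({ scope: "profile" }, (response) => {
    if (response.error) {
      console.error("Login with Amazon failed:", response.error);
      return;
    }
    // Hand the access token back to the metaverse shop backend, which uses it
    // for subsequent Amazon API calls on behalf of this player (placeholder path).
    fetch("/api/amazon/link-account", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ accessToken: response.access_token }),
    });
  });
});
```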

Thumbnail 2010

Through the API, we tracked where users actually dropped off during the purchase steps, enabling us to collect data useful for shop design. Finally, the post-purchase process is easy to explain: it's the same as when you make a purchase on the Amazon.com website. After you place an order in the metaverse, you receive the goods from Amazon, and you can use the website for any type of interaction with Amazon, including customer service. This mechanism essentially gave Bandai Namco the ability to create a shopping experience without building any commerce backend of its own; all they had to do was connect to Amazon using the Amazon API and list their goods on Amazon.

Thumbnail 2060

Thumbnail 2070

So that concludes our presentation on the Gundam Metaverse. Some of the takeaways from the session are listed here. First is the development of a regionally scalable metaverse. Second, we optimized cost and experience through custom auto scaling. Third, we enhanced service quality with automated end-to-end testing. And last but not least, we integrated Amazon into the space to provide a seamless shopping experience. I hope these insights benefit you in the future, and I'd like to pass the mic to Omori-san for what's next.

Thumbnail 2120

Thumbnail 2180

In closing, here is what comes next. Metaverse technology can be used across businesses. We are starting with a character metaverse, and with the same base we can expand to digital twins, simulation, and training. This setup allows us to launch small 3D spaces quickly and take them worldwide with steady quality. We are standardizing the system, including the backend, front end, and client, so partners can use it as well. Several partners have already begun building new spaces with us. We are working with partners on real use cases. If you're interested, please reach out here at re:Invent. Thank you for your time and attention. It's been an honor to share our work with you.


; This article is entirely auto-generated using Amazon Bedrock.
