The gaming industry, once a small, niche market, earned $120 billion in 2019.
It shows no signs of slowing down as experts predict that these figures will reach $196 billion by 2022.
Cloud computing is one of the most important factors that have contributed to this tremendous growth. We have already witnessed how video and music streaming services such as Netflix and Spotify have raised the bar and revolutionized the entertainment industry.
Cloud gaming is the next big thing, but its huge potential to dominate the market depends on low, near-zero latency.
Here’s what this means for data centers.
What Is Cloud Gaming?
The concept of gaming has been evolving over the years. Traditionally, gamers purchase discs for their consoles or PCs or download a game onto their hard drives.
In both cases, the gaming experience depends on the speed of your computer processor – if it’s not fast and powerful enough, the overall gaming performance will suffer. Although other components such as RAM and graphics play an important role too, they won’t be able to operate at their full potential if the CPU is too weak.
With cloud gaming, you don’t need a disc, console, or even computer in order to play your favorite game. You can stream it on any device, just like you do when you want to listen to your playlist on YouTube or watch a movie on Netflix.
In other words, your game exists on a remote server – in a data center packed with servers, to be more precise – and it reaches you as a series of compressed video frames. But unlike Netflix or YouTube, where these compressed frames are sent to your device through one-way streaming signals, cloud gaming requires a very fast two-way connection.
Gaming is an interactive process. Namely, after receiving a streaming signal from the server on your device, the inputs you make have to be communicated back to the server in order for the actions you want to take in the game to happen. The remote server then makes adjustments that correspond with your inputs.
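This round trip is what makes cloud gaming so latency-sensitive. A minimal sketch of one interactive cycle, with a simulated server (the `remote_server` function and its 4 ms processing delay are hypothetical stand-ins, not any real platform's API):

```python
import time

def remote_server(player_input):
    """Hypothetical game server: applies the player's input and
    returns the next compressed video frame (simulated here)."""
    time.sleep(0.004)  # pretend rendering + encoding take ~4 ms
    return f"frame after '{player_input}'"

def play_one_action(player_input):
    """One interactive round trip: input goes up, a frame comes back.
    The elapsed time is the latency the gamer actually feels."""
    start = time.monotonic()
    frame = remote_server(player_input)  # in reality: network + server work
    round_trip_ms = (time.monotonic() - start) * 1000
    return frame, round_trip_ms

frame, rtt = play_one_action("jump")
print(frame)
print(f"round trip: {rtt:.1f} ms")
```

Even in this toy version, the round trip can never be shorter than the server's own processing time – and in the real world, network distance is added on top of it.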
What Is Latency?
Lag is the gamers’ archenemy.
This phenomenon, also known as latency, refers to the delay between a gamer's action and the server's reaction. Since streaming signals travel from the remote server to the end-user and back again, this communication has to be extremely fast to prevent lag.
Otherwise, gamers will experience reduced responsiveness, as their video signal will freeze, fragment, or stutter. This is annoying when you watch videos, and when you’re playing a game, lag can completely ruin your experience.
Besides the fact that servers are miles away from gamers, there’s another important factor that can lead to latency – millions of players engaging with a particular game at once. That’s why gaming companies need to be ready for such surges and build the infrastructure capable of handling resource-intensive tasks and processes.
In order to deliver a high level of performance and reliability, data centers require a vast amount of electricity, which is why preventing power-quality issues is essential for improving their efficiency, reducing costs, and improving user experience.
Why Are Data Centers So Important for the Cloud Gaming Experience?
A simple answer would be that without data centers, gamers would struggle with crippling latency – playing any game without frustration would be practically impossible.
Let’s take some of the most popular games as an example.
With 200 million players from all around the world, Fortnite is quite a demanding game in terms of infrastructure. To be more precise, it relies on 12 AWS data centers providing 24 availability zones. Since peak load can be ten times higher than the minimum, each of these data centers has to accommodate such extreme swings.
World of Warcraft is another immersive game, and it's powered by 17 data centers processing 100 GB per second.
All these figures demonstrate that seamless delivery won’t be possible without a number of reliable data centers. Let’s not forget that gamers invest in state-of-the-art equipment in order to ensure the best gaming experience, and they won’t put up with latency issues.
That’s why it’s only logical that cloud gaming services do everything they can to deliver a near-zero latency experience.
How to Improve the Data Center Infrastructure?
There are various tools and solutions designed to help data centers support cloud gaming and reduce latency, and low latency has become one of the key differentiators among top service providers.
Low and predictable latency depends on the proximity to the end-user, which means that service providers have to build powerful data centers and edge nodes closer to their users.
Edge computing technology does this by physically shortening the distance that gaming content has to cover. In a nutshell, this is a so-called distributed architecture that combats latency and improves response time by placing applications, data storage, and other resources geographically closer to the locations where they’re needed.
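The core routing decision behind edge computing can be sketched very simply: given a player's location, pick the geographically nearest node. The node names and coordinates below are hypothetical, and real platforms route on measured network latency rather than raw distance, but the idea is the same:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Hypothetical edge-node locations: name -> (latitude, longitude)
EDGE_NODES = {
    "frankfurt": (50.11, 8.68),
    "virginia": (38.95, -77.45),
    "singapore": (1.35, 103.82),
}

def nearest_edge(player_lat, player_lon):
    """Route the player to the closest edge node."""
    return min(EDGE_NODES,
               key=lambda n: haversine_km(player_lat, player_lon, *EDGE_NODES[n]))

print(nearest_edge(48.85, 2.35))  # a player in Paris -> "frankfurt"
```

Shorter distance means fewer router hops and less propagation delay – which is exactly how edge nodes shave milliseconds off every round trip.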
Recently, the three biggest public clouds – AWS, Microsoft Azure, and Google Cloud Platform – have started providing edge computing capabilities.
This is particularly important for cloud gaming because, in 2019, Google launched its highly anticipated Stadia, a platform that allows gamers to stream and play games on their TV, laptop, or mobile at up to 4K resolution. With its massive network and computing resources, Google can focus on delivering superb performance.
Similarly, Microsoft launched its xCloud game streaming platform, which lets users play games on their mobile phones.
Cloud gaming providers that don't have their own data centers can take advantage of colocation facilities – data centers that rent out rack space, equipment, cooling, bandwidth, and other resources. Providers can cache their content there and deliver it to end-users based on proximity, significantly reducing latency.
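The caching logic amounts to: try the colocation sites closest to the player first, and fall back to the origin data center only on a miss. A minimal sketch, with made-up site names and cached titles:

```python
# Hypothetical colocation sites and the game content each has cached
CACHES = {
    "colo-east": {"fortnite", "wow"},
    "colo-west": {"fortnite"},
}

def serve(game, sites_by_proximity):
    """Serve content from the nearest colocation site that has it cached;
    fall back to the origin data center on a miss everywhere."""
    for site in sites_by_proximity:
        if game in CACHES.get(site, set()):
            return site       # cache hit close to the player: low latency
    return "origin"           # miss: the long, slow trip to the origin DC

print(serve("wow", ["colo-west", "colo-east"]))   # served from colo-east
print(serve("halo", ["colo-west", "colo-east"]))  # falls back to origin
```

Every cache hit replaces a long haul to the origin data center with a short hop, which is exactly where the latency savings come from.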
The cloud has revolutionized the gaming industry, as it offers different benefits and functionalities to players. However, to keep pace with this technological advancement, data centers have to adjust their infrastructure in order to eliminate latency.