Game on! How broadband providers can monetize ultra-low latency services for gamers
There are more than 2.5 billion active gamers around the world, and they really care about the performance of their connectivity. So much so that online gaming represents a significant opportunity for innovative service providers. According to PwC, the gaming industry topped USD 118 billion in 2018, and will continue to grow across all segments and markets.
It's not just teens playing online games: gamers between 18 and 35 years old represent over 40% of the overall gaming community (ESA, 2019), so we are talking about a demographic with significant disposable income. Gamers spend an average of 6.5 hours a week online, with 30% gaming for over 12 hours per week (Limelight Networks, 2018).
Service providers must offer value to compete in the massively popular battle royale and cloud gaming arena. It’s not about having a 100 Mb/s, 1 Gb/s or even 10 Gb/s residential broadband connection; it’s about offering the best and most consistent latency profile. A 500 Mb/s connection won’t make you a better gamer than your neighbor who’s on 100 Mb/s, but a consistent low-latency service will. And let’s face it, service providers can only squeeze so much extra revenue from high-tier broadband services when they have to double the bandwidth every 18 months for the same price just to keep up with the competition.
So how can service providers capitalize on the trend for online gaming? Let’s take a look.
The various dimensions of online gaming
The online gaming segment has a multitude of facets and dimensions.
- Platforms: PC, console (PS4, Xbox, etc.), mobile
- Multiplayer: battle royale (Fortnite, Apex Legends, Call of Duty, etc.); multiplayer (2-8 players); Massively Multiplayer Online Role-Playing Games (MMORPGs) (World of Warcraft, Final Fantasy XIV Online, Elite Dangerous, EVE Online, etc.)
- Cloud gaming: NVIDIA GeForce NOW, Google Stadia, etc.
Each has unique bandwidth and latency profiles and requirements, from the very relaxed to the very stringent. With cloud gaming, for example, all of the heavy processing is offloaded to a cloud gaming engine, which then streams the game as a video feed back to the end user. This allows low-end devices like smart TVs or mobile handsets to run high-end gaming titles as if on a PC or console. In this situation the required latency profile is roughly 100 ms round-trip, of which the network transport shouldn’t account for more than 30 ms. The breakdown looks like this:
- I press a key in my game to move my character. This command must make its way across the internet to the cloud gaming server in under ~15 ms
- The gaming server will then take ~45 ms to process the input and produce the required output (including rendering and encoding the video frame)
- At which point the traffic will take ~15 ms again to reach me
- And another ~15 ms for decoding and rendering on my device.
And we are at 90 ms. This might seem relatively simple to deliver, but there are a few complications.
- The first is that delivering 1080p resolution at 60 fps needs a minimum 25 Mb/s network connection, and that bandwidth is used constantly during your gaming session. Playing the same game online in the traditional way might generate 5 GB/month of data; the same usage over a cloud gaming service would generate 500 GB/month, a 100x increase!
- The second is the variability in latency. There are multiple latency choke points between you and the cloud gaming provider, such as the performance of your device, the amount of traffic being processed in your home network using the same broadband connection, whether you are connected over Wi-Fi, Ethernet or 4G/5G, etc. These different points compound the latency effect and can lead to a bad gaming experience.
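The 100x jump in data usage above can be sanity-checked with rough arithmetic. This is only a sketch: the 25 Mb/s stream rate comes from the text, the 12 hours/week matches the heavy-gamer figure cited earlier, and 4.33 weeks per month is an assumption.

```python
# Rough monthly data usage for a cloud gaming subscriber at 1080p/60fps.
STREAM_MBPS = 25        # constant video stream rate from the text above
HOURS_PER_WEEK = 12     # the heavy-gamer usage figure cited earlier
WEEKS_PER_MONTH = 4.33  # assumed average

bytes_per_second = STREAM_MBPS * 1_000_000 / 8
seconds_per_month = HOURS_PER_WEEK * 3600 * WEEKS_PER_MONTH
gb_per_month = bytes_per_second * seconds_per_month / 1e9

print(f"{gb_per_month:.0f} GB/month")  # roughly 585 GB/month
```

A heavy gamer at a constant 25 Mb/s lands in the same ballpark as the 500 GB/month figure, versus single-digit gigabytes when the game runs locally and only sends control traffic.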
This chart helps illustrate the user experience.
This shows the average latency of a cloud gaming session over a 10-minute period. The average latency of the session works out to 57.44 ms, which sounds very good: almost half of our 100 ms target. Unfortunately, on at least two dozen occasions the latency crossed that threshold, causing significant visual degradation and a bad gaming experience. This is what we mean by the importance of having not just a low-latency profile, but a consistently low-latency profile.
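The gap between a healthy average and a poor experience is easy to reproduce with synthetic numbers (these are illustrative, not the measurements behind the chart):

```python
# A synthetic 10-minute session sampled once per second: the mean sits
# well under the 100 ms budget, yet 20 samples still blow through it.
samples = [55] * 580 + [120] * 20  # 600 samples, mostly ~55 ms

avg = sum(samples) / len(samples)
spikes = sum(1 for s in samples if s > 100)

print(f"average = {avg:.1f} ms, samples over 100 ms = {spikes}")
# average = 57.2 ms, samples over 100 ms = 20
```

Every one of those 20 spikes is a dropped or late video frame from the player's point of view, which is why the mean alone says little about perceived quality.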
What can innovative service providers do to cater to this valuable market segment?
As I touched on earlier, latency is the cumulative effect of all the various packet processing points in the network, and the worst offender is in-home Wi-Fi. Nokia has pioneered and developed unique technology to address this particular challenge, with our “Low Latency, Low Loss, Scalable Throughput” (L4S) protocol, an IETF standard, and our next-generation Active Queue Management (PI2), which supports both classic IP traffic and L4S traffic at the same time. Here is an example of how it improves latency in real-world cases on a 100 Mb/s residential connection, in which we increase the network traffic load over time.
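The core idea behind coupling PI2 and L4S (standardized by the IETF as the DualQ Coupled AQM in RFC 9332) can be sketched in a few lines. The constants and the 10% example probability below are illustrative, not Nokia's implementation:

```python
# Sketch of the DualQ Coupled AQM signalling relationship (RFC 9332):
# one base probability p' drives both queues. Classic traffic is dropped
# with the much rarer p'^2, while L4S traffic is ECN-marked at k * p',
# giving scalable flows frequent, fine-grained congestion signals that
# keep the queue (and therefore latency) short.
K = 2.0  # coupling factor (RFC 9332 default)

def signal_probabilities(p_base: float) -> tuple[float, float]:
    """Return (classic drop probability, L4S ECN-mark probability)."""
    p_classic = p_base ** 2          # squared: drops stay rare for classic flows
    p_l4s = min(1.0, K * p_base)     # linear: early, gentle ECN marks for L4S
    return p_classic, p_l4s

# At a base probability of 10%, classic flows see ~1% drops while
# L4S flows are marked ~20% of the time.
print(signal_probabilities(0.1))
```

The asymmetry is the point: L4S endpoints react to many small ECN marks instead of a few large losses, so queues never need to grow deep enough to cause the latency spikes shown earlier.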
Software-defined access networks (SDAN)
Once we’ve addressed the first link in the chain, it’s time to look at the access network itself. Latency characteristics vary across access networks, whether DSL, DOCSIS, FTTH, fixed wireless access, etc. And while one might argue that latency is no issue on an FTTH network, let me present a few counterarguments. As we already discussed, it’s not just about low latency but about a very consistent latency profile. FTTH offers latency profiles other access technologies can only dream of, with the exception of 5G. But there are still ways to improve it.
- SDAN offers methodologies to create a network “slice” just for gamers: a slice with improved traffic profile characteristics, with latency well below 1 millisecond!
- This slice makes it much easier to sell, manage and operate a gaming service offering, keeping gamers separate from other FTTH clients and services.
- SDAN can also be an effective driver for lucrative mobile-edge computing services, which have already started showing up in the gaming space.
IP and optical networking optimizations for gaming latency
So far we’ve been able to optimize the latency in both the home and access networks. But what can we do about the rest of the network, which typically means the internet in general? Most popular online games are hosted in a variety of datacenters geographically dispersed around the world. Some gaming studios build and manage their own, while others rely on cloud platform providers like Microsoft Azure and Amazon.
Most service providers take advantage of numerous public and private IP peering points to offload traffic toward its end destination as efficiently and quickly as possible, managed by routing protocols like BGP or by internal traffic policies. Unfortunately, these decisions are traditionally agnostic to the traffic type and profile, and may send traffic over a path that has more bandwidth available yet offers a worse latency profile.
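The mismatch is easy to illustrate with a toy path-selection example. The peering-point names and numbers below are made up; a controller that does this against live measurements is what the following section describes.

```python
# Two candidate peering points toward the same gaming datacenter:
# capacity-driven selection and latency-driven selection disagree.
peering_points = {
    "peer-A": {"spare_bandwidth_gbps": 40, "latency_ms": 28},
    "peer-B": {"spare_bandwidth_gbps": 10, "latency_ms": 9},
}

# A bandwidth- or policy-driven choice picks the "bigger" pipe...
by_bandwidth = max(peering_points,
                   key=lambda p: peering_points[p]["spare_bandwidth_gbps"])
# ...while a latency-aware choice picks the path gamers actually need.
by_latency = min(peering_points,
                 key=lambda p: peering_points[p]["latency_ms"])

print(f"capacity-first picks {by_bandwidth}, latency-first picks {by_latency}")
```

For bulk traffic the capacity-first answer is perfectly reasonable; for a cloud gaming session working inside a ~30 ms transport budget, it is the wrong one.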
Nokia offers an innovative solution to address this: the Segment Routing Interconnecting Controller (SRIC), part of the Nokia Network Services Platform (NSP). NSP provides operators with best-in-class methodologies for assuring and optimizing gaming services across various networks, and even across multiple equipment vendors.
Service providers can now create latency-sensitive templates that monitor latency to all of the various gaming datacenters and services, across all available peering points. This allows real-time traffic routing decisions that always offer the best latency profile, regardless of which game is being played and which datacenter is being used. NSP/SRIC also ties in very nicely with the SDAN solution described above, creating a true end-to-end, best-in-class gaming service.
Share your thoughts on this topic by joining the Twitter discussion with @nokia or @nokianetworks using #WiFi #Connectivity #broadband #FTTH