Optimizing LTE Networks and Communication Protocols of Reliable Real-time Applications

31 October 2019

We study options for supporting reliable real-time use cases over commercial Long Term Evolution (LTE) networks. We consider two approaches to facilitating such services: network optimization, which is available to the operator, and optimization of the application communication protocol, which is available to the application developer. To assess the applicability of these methods, we use latency measurements from a commercial LTE network deployed in a lab environment, where we fully control the over-the-air traffic and the radio configuration parameters and can freely change the communication protocol of the reliable real-time application. The experimental results show that the achievable latency and reliability are mainly limited by two features of the LTE radio system: the Discontinuous Reception (DRX) pattern configured at the UE and the Scheduling Request (SR) periodicity. By disabling DRX or applying less restrictive DRX and SR patterns, and by using a grant-free uplink technique, we show how the packet latency distribution improves, albeit at the cost of network efficiency. For cases where such optimization is not feasible, we show different approaches available to the application developer for selecting an optimal communication protocol: for instance, frequently transmitting small packets to sustain network activity and thereby reduce the probability of the UE entering DRX mode, or aligning the application-layer message generation interval with the SR opportunities and DRX pattern of the underlying radio system.
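
As a rough illustration of the application-side techniques, the following Python sketch combines both ideas: it sends small keep-alive packets before an assumed DRX inactivity timer would expire, and it schedules each application message on the first SR opportunity at or after the nominal message interval. The SR period, DRX inactivity timer, keep-alive margin, and receiver address are illustrative assumptions, not values from the measurements; real SR and DRX parameters are RRC-configured by the network and would have to be estimated or obtained from the operator.

    import math
    import socket
    import time

    # Illustrative values only; actual SR periodicity and DRX timers are
    # RRC configuration parameters and differ per network.
    SR_PERIOD_MS = 10             # assumed spacing between SR opportunities
    DRX_INACTIVITY_MS = 100       # assumed inactivity timer before DRX entry
    SERVER = ("192.0.2.1", 5000)  # hypothetical receiver (RFC 5737 test address)

    def next_sr_opportunity(t_ms):
        """First SR opportunity at or after time t_ms, assuming
        opportunities occur every SR_PERIOD_MS starting at t = 0."""
        return math.ceil(t_ms / SR_PERIOD_MS) * SR_PERIOD_MS

    def run(message_interval_ms=250.0, num_messages=20):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        last_tx_ms = time.monotonic() * 1000.0
        for _ in range(num_messages):
            # Align the next application message with an SR opportunity so
            # the modem does not hold the packet waiting for one.
            target_ms = next_sr_opportunity(last_tx_ms + message_interval_ms)
            while (now_ms := time.monotonic() * 1000.0) < target_ms:
                # Keep-alive: a tiny packet shortly before the inactivity
                # timer would fire keeps the UE out of DRX, at the cost of
                # extra uplink traffic and battery drain.
                if now_ms - last_tx_ms >= 0.8 * DRX_INACTIVITY_MS:
                    sock.sendto(b"ka", SERVER)
                    last_tx_ms = now_ms
                time.sleep(0.001)
            sock.sendto(b"application-message", SERVER)
            last_tx_ms = time.monotonic() * 1000.0

    if __name__ == "__main__":
        run()

The keep-alive path trades uplink capacity and UE battery for latency, mirroring the network-efficiency cost of relaxing DRX on the operator side, while the SR alignment removes up to one SR period of queuing delay without adding any traffic.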