Validating 5G device and network performance in a lab environment

By Jessy Cavazos

Understanding 5G handset and network equipment performance prior to live deployment is vital to a timely and smooth 5G service rollout.

The number of 5G device announcements is accelerating. Data published by the Global mobile Suppliers Association (GSA) in July 2021 shows that the number of 5G phone models announced by device makers nearly tripled between June 2020 and June 2021. Continuous hardware and software development for network components and the emergence of new features and improvements from 3rd Generation Partnership Project (3GPP) releases also drive a steady increase in demand for new and better testing capabilities.

The constant flow of new device announcements presents a significant challenge to mobile network operators (MNOs) and network equipment manufacturers (NEMs). For a start, the market is being flooded with software releases and bug fixes for devices and network equipment. Building a nationwide 5G network may mean 10 major base station software updates per year, not including mid-releases and bug-fix builds.

In addition, reducing performance issues in a production network is a significant challenge for operators. This challenge can be minimized by addressing it upstream in earlier phases of the product development workflow. The need for early device and network equipment performance testing is even greater with the additional complexity and network configuration combinations brought by Open Radio Access Network (O-RAN) technology.

Here, the Test and Integration Focus Group (TIFG) of the O-RAN Alliance has formalized and released specifications for end-to-end O-RAN system performance testing using real user equipment (UEs) in the laboratory and in the field.

Test execution and results analysis is another problem area for engineers at both NEMs and MNOs. The volume of data gets overwhelming quickly. Many engineers struggle to get the insights they need to calibrate device and equipment performance and to understand whether performance is improving or degrading between models, versions, and network configurations.

Collecting a huge amount of data is not the issue, though. The challenge lies in understanding and analyzing test results over time. Many engineers want a single score that enables quick comparisons, along with seamless capabilities to troubleshoot performance degradations and discrepancies. They are also looking for ways to improve the efficiency, flexibility, and cost-effectiveness of test campaigns.

Overcoming performance benchmarking challenges

Overcoming the challenges mentioned above requires rapid test plan configuration and execution, easy access to historical results and trends, and in-depth analytics for troubleshooting. Getting a single performance score from a group of key performance indicators (KPIs) also accelerates the comparison of devices, hardware builds, and software versions under test.
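To illustrate, a composite score of this kind is typically a weighted aggregate of normalized KPIs. The following Python sketch shows one possible formulation; the KPI names, target ranges, and weights are illustrative assumptions, not Keysight's actual scoring method.

```python
# Hypothetical composite-score calculation: normalize each KPI against an
# assumed target range, then combine the results with user-defined weights.
# KPI names, ranges, and weights are illustrative only.

KPI_RANGES = {
    "dl_throughput_mbps": (0.0, 1500.0),   # higher is better
    "ul_throughput_mbps": (0.0, 200.0),    # higher is better
    "latency_ms":         (100.0, 10.0),   # lower is better (range reversed)
    "call_setup_success": (0.90, 1.00),    # higher is better
}

KPI_WEIGHTS = {
    "dl_throughput_mbps": 0.4,
    "ul_throughput_mbps": 0.2,
    "latency_ms":         0.2,
    "call_setup_success": 0.2,
}

def normalize(value: float, low: float, high: float) -> float:
    """Map a raw KPI value onto a 0-1 scale; a reversed range handles
    'lower is better' KPIs such as latency."""
    score = (value - low) / (high - low)
    return max(0.0, min(1.0, score))

def composite_score(kpis: dict) -> float:
    """Weighted sum of normalized KPIs, scaled to 0-100."""
    total = sum(
        KPI_WEIGHTS[name] * normalize(value, *KPI_RANGES[name])
        for name, value in kpis.items()
    )
    return 100.0 * total / sum(KPI_WEIGHTS[name] for name in kpis)

# Example: score one test run for a device under test.
run_kpis = {
    "dl_throughput_mbps": 1120.0,
    "ul_throughput_mbps": 85.0,
    "latency_ms": 22.0,
    "call_setup_success": 0.995,
}
print(f"Composite score: {composite_score(run_kpis):.1f} / 100")
```

Keeping the normalization ranges and weights fixed across runs is what makes the resulting scores comparable between devices, builds, and software versions.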

First and foremost, consistency in test case execution is key to making comparisons easy to understand and track over time. Next, automating the entire workflow across different devices and network components is a must. Finally, comparing parameters to spot configuration changes in the network helps avoid degradation in device and network performance.

Other factors like the ability to integrate lab equipment and instruments into the test setup and the availability of defined, packaged, and modifiable test cases help increase test campaign efficiency, flexibility, and cost-effectiveness while enabling more sophisticated test cases for stress testing and handover scenarios.

Figure 1 provides an overview of Keysight’s Performance Benchmarking (PBM) solution for 5G mobile devices and base stations in a laboratory environment. It consists of three main software components. First, Nemo 5G Device Analytics helps configure the test parameters, run the test cases, and perform the analytics. Second, Nemo Outdoor connects to the device to collect diagnostic information and create a measurement file. Third, OpenTAP is an automation platform that provides a scalable test framework for equipment to communicate with each other.

Figure 1 The test solution enables performance benchmarking of mobile devices and base stations in a laboratory environment. Source: Keysight

To evaluate 5G devices and software versions using live base stations, you need to connect the device to a live network, as shown in Figure 2. To test 5G base stations and their software versions, you need to use a reference device so that the base station becomes the device under test (Figure 3).

Figure 2 A device connected to a live network and a test solution enables 5G device benchmarking. Source: Keysight

Figure 3 A 5G reference device connected to a network element and a test solution enables 5G base station benchmarking. Source: Keysight

Accelerating time to insight during test execution

With a performance benchmarking solution, engineers can configure a test in a few steps, whether they are benchmarking a device or a network element. Engineers select the network and device types, the logs directory to store the measurements, and the device or network element under test. Then they name the test project and test plan and choose the test cases to include in the test plan. Parameters are pre-defined for each test case, which accelerates the setup process, but engineers can modify them to suit their needs and save these settings, as shown in Figure 4.

Figure 4 Pre-defined test cases accelerate the test setup. Source: Keysight
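As a rough illustration of what such a test plan captures, the following sketch models the selections described above; the field names and test case identifiers are hypothetical and do not reflect the actual Nemo 5G Device Analytics configuration format.

```python
from dataclasses import dataclass, field

# Hypothetical data model for a benchmarking test plan; field names and
# test case identifiers are illustrative, not the product's actual schema.

@dataclass
class TestCase:
    name: str
    parameters: dict = field(default_factory=dict)  # pre-defined, user-editable

@dataclass
class TestPlan:
    project: str
    name: str
    network_type: str          # e.g. "5G NSA" or "5G SA"
    device_type: str           # device or network element under test
    logs_directory: str        # where measurement files are stored
    test_cases: list = field(default_factory=list)

plan = TestPlan(
    project="5G-device-benchmark",
    name="fw-2.1-regression",
    network_type="5G SA",
    device_type="smartphone",
    logs_directory="/data/nemo-logs",
    test_cases=[
        TestCase("dl_throughput", {"duration_s": 60, "transfer": "FTP"}),
        TestCase("ul_throughput", {"duration_s": 60, "transfer": "FTP"}),
        TestCase("ping_latency",  {"packet_count": 100}),
    ],
)
```

Once saved, the same plan can be executed repeatedly, which keeps results consistent and comparable across runs.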

Pre-defined scoring methods further speed up the test setup process. Engineers can customize which test cases the scoring method takes into account and change the importance of each test case (Figure 5). They can also see the parameters for each test case, modify them, change their individual weights, and save this information for future use (Figure 6). They can then execute the test as many times as needed.

Figure 5 Modify the weight of each KPI in the scoring method to match your needs. Source: Keysight

Figure 6 See the details making up each KPI and change the values as needed. Source: Keysight

Understanding performance using analytics

Understanding the performance of a 5G device or base station requires a range of new capabilities. For starters, being able to view the KPIs that affect scoring in real time during the execution of the test plan can help spot issues ahead of time (Figure 7).

Figure 7 Viewing the KPIs in real time enables you to spot issues early in the test process. Source: Keysight
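Conceptually, spotting issues during execution amounts to checking each incoming KPI sample against a limit as it arrives. The snippet below is a minimal, hypothetical illustration; the KPI names and thresholds are assumptions, not part of the solution.

```python
# Hypothetical real-time check: flag a KPI sample as soon as it drops below
# an assumed threshold during test execution. Names and limits are invented.

THRESHOLDS = {"dl_throughput_mbps": 800.0, "ul_throughput_mbps": 50.0}

def check_sample(kpi: str, value: float) -> None:
    """Print a warning the moment a streamed KPI sample violates its threshold."""
    limit = THRESHOLDS.get(kpi)
    if limit is not None and value < limit:
        print(f"WARNING: {kpi} = {value:.1f} below threshold {limit:.1f}")

# Samples as they might arrive during a throughput test case.
for value in (1130.0, 940.0, 610.0, 1020.0):
    check_sample("dl_throughput_mbps", value)
```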

Access to the test plan execution history helps when comparing test runs. Engineers can select individual test runs to analyze the results further and compare the KPIs for different devices or software versions (Figure 8).

Figure 8 The benchmarking solution enables the comparison of KPIs for different devices and software versions. Source: Keysight
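A run-to-run comparison of this kind reduces to computing the change in each KPI between a baseline and a candidate run and flagging degradations. The following sketch illustrates the idea with invented KPI names and values; it is not the solution's actual comparison logic.

```python
# Hypothetical comparison of KPI results from two test runs, e.g. two
# software versions of the same device. KPI names and values are invented.

baseline  = {"dl_throughput_mbps": 1120.0, "latency_ms": 22.0, "score": 74.7}
candidate = {"dl_throughput_mbps": 1065.0, "latency_ms": 27.5, "score": 70.1}

# For most KPIs a drop means worse performance; for latency, a rise does.
LOWER_IS_BETTER = {"latency_ms"}

for kpi, old in baseline.items():
    new = candidate[kpi]
    change_pct = 100.0 * (new - old) / old
    degraded = change_pct > 0 if kpi in LOWER_IS_BETTER else change_pct < 0
    flag = "DEGRADED" if degraded else "ok"
    print(f"{kpi:>20}: {old:8.1f} -> {new:8.1f} ({change_pct:+.1f}%)  {flag}")
```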

Beyond this, drill-down and troubleshooting capabilities are essential when faced with unexpected results. An important capability is the ability to open specific KPIs and see the underlying data over time, as shown in Figure 9.

Figure 9 Viewing the data of specific KPIs side by side facilitates quick troubleshooting. Source: Keysight

Access to messaging information is another useful capability, both for troubleshooting and for understanding how the signaling flows between the device and the base station. The benchmarking solution lets engineers select messages to see the decoding information (Figure 10). They can also calculate the latency between signaling messages.

Figure 10 Ability to see decoding information from specific messages helps in troubleshooting. Source: Keysight
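Computing the latency between signaling messages essentially means taking the difference between message timestamps in the trace. The sketch below illustrates the calculation with an invented RRC setup exchange; the message names and timestamps are assumptions, not data from a real measurement file.

```python
from datetime import datetime

# Hypothetical signaling log: message names and timestamps are invented
# for illustration; a real trace comes from the measurement file.
messages = [
    ("2021-09-01 10:15:02.143", "RRCSetupRequest"),
    ("2021-09-01 10:15:02.171", "RRCSetup"),
    ("2021-09-01 10:15:02.190", "RRCSetupComplete"),
]

def latency_ms(start_msg: str, end_msg: str) -> float:
    """Elapsed time in milliseconds between two named signaling messages."""
    stamps = {name: datetime.strptime(ts, "%Y-%m-%d %H:%M:%S.%f")
              for ts, name in messages}
    return (stamps[end_msg] - stamps[start_msg]).total_seconds() * 1000.0

print(f"RRC setup latency: {latency_ms('RRCSetupRequest', 'RRCSetupComplete'):.0f} ms")
```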

It is also helpful to identify differences between parameter configurations in the network. The benchmarking solution, for example, lets test engineers select a view that shows only the differences, which accelerates the analysis (Figure 11).

Figure 11 Seeing the differences in parameter configuration accelerates the troubleshooting process. Source: Keysight
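Conceptually, such a differences-only view is a diff of two parameter sets. The following sketch illustrates the idea with hypothetical 5G NR parameter names and values; it is not the solution's actual comparison implementation.

```python
# Hypothetical parameter comparison between two network configurations,
# e.g. before and after a base station software update. Only parameters
# whose values differ are printed, mirroring a "differences only" view.

config_a = {"ssb_periodicity_ms": 20, "dl_bwp_mhz": 100, "p_max_dbm": 23,
            "tdd_pattern": "DDDSU"}
config_b = {"ssb_periodicity_ms": 40, "dl_bwp_mhz": 100, "p_max_dbm": 23,
            "tdd_pattern": "DDDSUUDDDD"}

for key in sorted(config_a.keys() | config_b.keys()):
    old, new = config_a.get(key), config_b.get(key)
    if old != new:
        print(f"{key}: {old} -> {new}")
```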

5G is moving fast. Mobile network operators and network equipment vendors must effectively overcome the challenges posed by the constant flow of new devices and the evolution of 3GPP release standards. Understanding 5G device and network equipment performance prior to deployment in a live production network is vital to ensure a timely and smooth 5G network rollout. Equipped with test solutions that enable rapid test configuration, execution, and in-depth analysis, engineers can overcome these challenges and accelerate time to market.

This article was originally published on EDN.

Jessy Cavazos oversees 5G solution marketing at Keysight Technologies.