Ubiquitous wireless intelligence requires engineering 6G technology to deliver much higher performance than 5G.
6G will bring new, exciting use cases above and beyond 5G. In addition to the next-level evolution in automated driving and smart manufacturing, 6G will enable innovative applications by combining sensing, imaging, and precise timing with mobility and truly leveraging artificial intelligence (AI) and intelligent networks. Further integrating communications technologies into society, 6G technology will bring mixed reality experiences and telepresence to life, while playing a pivotal role in achieving global sustainability, improving society, and increasing productivity across industries.
However, ubiquitous wireless intelligence requires 6G technology engineered to deliver much higher performance than 5G, with capabilities measured both in updates to existing key performance indicators (KPIs) and in new KPIs driven by the uniqueness of the 6G vision. Early 6G targets represent a 10-100× improvement over 5G in KPIs such as peak data rate, latency, and density. 6G also elevates the importance of KPIs related to jitter, link budget, and other technology aspects.
Figure 1: 6G targets represent an increase in key performance indicators like peak data rates, latency, and density.
The need for wider bandwidths will require the use of frequencies above 100 GHz to enable ultra-high data-rate short-range networks. Precise timing requirements will bring time-engineered networks and new applications, but require changing the way networks operate. Delivering on 6G targets will demand significant advances in computing architectures, chipset designs, and materials.
Privacy and security concerns will get increased focus on all network layers, including new focus on physical layer security. The use of AI will also be necessary to optimize network operation, fight attacks, and facilitate recovery. A digitized, data-driven society with instant and unlimited wireless connectivity will lead to continued exponential growth in data traffic and connections and demand a hyper-flexible network. These aspects will bring spatial spectral efficiency and connectivity challenges.
Extreme modulation bandwidths, even shorter wavelengths, and higher propagation and atmospheric losses in the sub-terahertz (THz) and THz spectrum will drive requirements for narrower beam widths and greater antenna and device integration. Radio resource optimization and intelligent networking will also require innovation in RF, baseband, and system design to reduce power consumption.
Significant technical challenges lie ahead for wireless design and test engineers. 6G is in its infancy and will require years of research, but wireless communications technology evolves fast; as we build 5G, we need to prepare for 6G. Understanding challenges associated with sub-THz and THz frequencies is particularly important.
6G design and test challenges for higher frequencies
Sub-THz and THz frequencies involve extreme information bandwidths. Optimizing the performance of sub-THz systems operating over wide or extreme bandwidths requires consideration of the following key parameters:
Optimizing SNR is essential to achieving the best error vector magnitude (EVM) performance. However, while maximizing signal power yields the highest SNR, the signal power must be backed off to avoid compressing components along the signal chain, because complex waveforms have high statistical peak-to-average power ratios. Noise contributions to SNR are also more problematic for wideband applications because the noise power is integrated over wider signal bandwidths.
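The bandwidth penalty can be sketched with simple link arithmetic: the thermal noise floor is -174 dBm/Hz, so the integrated noise power grows by 10·log10(B), costing 10 dB of SNR per decade of bandwidth. The -10 dBm signal level and 0 dB noise figure below are illustrative assumptions, not values from the article:

```python
import math

def thermal_noise_floor_dbm(bandwidth_hz, noise_figure_db=0.0):
    """Integrated thermal noise power: -174 dBm/Hz + NF + 10*log10(B)."""
    return -174.0 + noise_figure_db + 10.0 * math.log10(bandwidth_hz)

def snr_db(signal_power_dbm, bandwidth_hz, noise_figure_db=0.0):
    """SNR for a fixed signal power over the integrated noise floor."""
    return signal_power_dbm - thermal_noise_floor_dbm(bandwidth_hz, noise_figure_db)

# Same -10 dBm signal: a 100 MHz channel vs. a 10 GHz sub-THz channel.
narrow = snr_db(-10.0, 100e6)   # noise floor -94 dBm
wide = snr_db(-10.0, 10e9)      # noise floor -74 dBm, 20 dB less SNR
print(f"100 MHz: {narrow:.1f} dB SNR, 10 GHz: {wide:.1f} dB SNR")
```

The 100× wider channel costs 20 dB of SNR before any other impairment is considered, which is why signal power and compression headroom must be balanced so carefully.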
Upconversion and downconversion between an intermediate frequency (IF) and sub-THz frequencies involves frequency translation with local oscillator (LO) signal source(s) and frequency converter(s). Frequency multipliers, which are often used in the LO path rather than the signal path to avoid impacting the signal modulation characteristics, will increase the phase noise. The multiplier can also introduce additive phase noise that will further degrade the multiplied LO phase noise. Low residual EVM test system performance at sub-THz frequencies requires high-quality, low-phase-noise LO signal sources.
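An ideal frequency multiplier degrades phase noise by 20·log10(N), and the multiplier's own additive (residual) phase noise sums with that in power. A rough sketch, using illustrative phase-noise levels rather than any real instrument specification:

```python
import math

def multiplied_phase_noise_dbc(pn_dbc_hz, mult_factor, additive_dbc_hz=None):
    """Ideal Nx multiplication degrades phase noise by 20*log10(N); additive
    (residual) multiplier phase noise then sums in linear power on top."""
    ideal = pn_dbc_hz + 20.0 * math.log10(mult_factor)
    if additive_dbc_hz is None:
        return ideal
    total = 10.0 ** (ideal / 10.0) + 10.0 ** (additive_dbc_hz / 10.0)
    return 10.0 * math.log10(total)

# A -130 dBc/Hz LO through a 6x multiplier degrades by 20*log10(6) ~ 15.6 dB.
print(f"{multiplied_phase_noise_dbc(-130.0, 6):.1f} dBc/Hz")
# Additive multiplier noise of -120 dBc/Hz degrades the result further.
print(f"{multiplied_phase_noise_dbc(-130.0, 6, additive_dbc_hz=-120.0):.1f} dBc/Hz")
```

This is why a 6× multiplied LO chain, as in the example below, demands such a clean source: every dB of source phase noise passes straight through to the sub-THz carrier.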
To illustrate the importance of these parameters, let’s use a simple converter design with a modulation IF source set to a center frequency of 6 GHz.
In this example, the modulation could be set to quadrature phase shift keying (QPSK), 16 quadrature amplitude modulation (QAM), or 64 QAM. The symbol rate is set to 8.8 GHz with a root-raised cosine filter alpha of 0.22 and the modulated IF is upconverted to 144 GHz using a mixer with a low-side LO. The LO source frequency is set to 23 GHz, followed by a 6× multiplier. The mixer LO frequency is 138 GHz. With the 6 GHz IF, this yields an upconverted frequency of 144 GHz. LO phase noise is specified in terms of dBc/Hz at different frequency offsets. The phase noise is modeled with frequency offsets greater than 100 kHz. This example also uses a vector signal analysis (VSA) sink at the upconverter output to analyze the simulation results.
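The frequency plan above reduces to simple arithmetic, which can be checked directly (all values come from the example in the text; the image frequency is derived from the low-side mixing relationship):

```python
# Frequency plan for the example upconverter (values from the text):
# 6 GHz IF, 23 GHz LO source, 6x multiplier, low-side LO mixing.
if_freq = 6e9
lo_source = 23e9
mult = 6

lo_freq = lo_source * mult      # 138 GHz at the mixer LO port
rf_freq = lo_freq + if_freq     # low-side LO: RF = LO + IF = 144 GHz
image = lo_freq - if_freq       # image product at 132 GHz to filter out

# Occupied bandwidth of the 8.8 GHz symbol-rate signal with RRC alpha = 0.22
occupied_bw = 8.8e9 * (1 + 0.22)

print(f"LO = {lo_freq/1e9:.0f} GHz, RF = {rf_freq/1e9:.0f} GHz, "
      f"image = {image/1e9:.0f} GHz, BW = {occupied_bw/1e9:.1f} GHz")
```

The (1 + alpha) factor on the symbol rate gives roughly 10.7 GHz of occupied bandwidth, consistent with the spectrum described next.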
You can see the signal centered at 144 GHz, with approximately 10 GHz of occupied bandwidth. The 16 QAM constellation appears on the upper left. If you zoom in on one of the constellation states (circled in white), you see some minimal dispersion as a result of the LO phase noise. This minimal dispersion of the constellation states corresponds to the 1.56% EVM shown on the summary on the right.
If we increase the phase noise by 10 dB at the higher frequency offsets, the constellation states rotate and dispersion increases, raising the EVM to 4.07%.
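The scaling behind this result can be sketched with a small-angle approximation: the LO's EVM contribution is roughly the RMS phase jitter in radians, and a 10 dB higher phase-noise floor means √10 ≈ 3.2× more jitter. The flat -120 dBc/Hz floor and integration limits below are illustrative assumptions, not the simulation's actual phase-noise profile, so the numbers only land near the article's values:

```python
import math

def integrated_jitter_rad(pn_dbc_hz, f_lo_hz, f_hi_hz):
    """RMS phase jitter from a flat phase-noise floor L (dBc/Hz) integrated
    over [f_lo, f_hi]; doubled for both sidebands, small-angle assumption."""
    l_lin = 10.0 ** (pn_dbc_hz / 10.0)
    return math.sqrt(2.0 * l_lin * (f_hi_hz - f_lo_hz))

# Illustrative flat floor from 100 kHz to 100 MHz offset, then 10 dB worse.
base = integrated_jitter_rad(-120.0, 100e3, 100e6)
worse = integrated_jitter_rad(-110.0, 100e3, 100e6)

# EVM contribution (small angle): EVM_rms ~ RMS phase jitter in radians.
print(f"EVM ~ {base*100:.2f}% -> {worse*100:.2f}% ({worse/base:.2f}x for 10 dB)")
```

The simulated EVM rises by somewhat less than √10 because other error contributions, independent of the LO, stay fixed.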
Removing undesired image products, LO feedthrough, out-of-band spurious products and emissions, and other undesired spectral artifacts of nonlinear mixing often requires filters. Filters, as well as other components in a test system such as mixers and amplifiers, can introduce linear amplitude and phase error over extreme signal bandwidths. An adaptive equalizer helps to mitigate the linear amplitude and phase errors, similar to what might be implemented in a receiver. Typically, a receiver system needs some baseband equalization because the signals it receives from the source are never ideal, and include channel impairments.
In a wide or extreme bandwidth test system, the test equipment receiver (for example, IF digitizer) can use adaptive equalization to remove linear amplitude and phase impairments across the extreme signal bandwidth. However, the adaptive equalizer will operate only on the linear amplitude and phase error. Noise and nonlinear impairments will remain and will impact EVM, regardless of whether the equalizer is enabled. The adaptive equalizer cannot remove nonlinear impairments from compressed amplifiers in the test system signal path or LO phase noise, which may impact the millimeter-wave (mmWave) test system’s residual EVM.
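A minimal sketch of such an adaptive equalizer is a complex LMS FIR filter trained against known reference symbols. The channel taps, step size, and tap count below are invented for illustration; they stand in for the mild amplitude and phase ripple a filter might introduce:

```python
import random

def lms_equalize(rx, ref, num_taps=7, mu=0.01):
    """Complex LMS FIR equalizer trained against known reference symbols.
    It can only invert LINEAR amplitude/phase error, as noted in the text."""
    w = [0j] * num_taps
    w[num_taps // 2] = 1 + 0j                # center-spike initialization
    errs = []
    for n in range(num_taps, len(rx)):
        x = rx[n - num_taps:n][::-1]         # tap delay line, newest first
        y = sum(wi * xi for wi, xi in zip(w, x))
        e = ref[n - num_taps // 2 - 1] - y   # reference delayed to match lag
        w = [wi + mu * e * xi.conjugate() for wi, xi in zip(w, x)]
        errs.append(abs(e))
    return w, errs

random.seed(1)
tx = [complex(random.choice([-1, 1]), random.choice([-1, 1])) for _ in range(4000)]
h = [0.1j, 1.0, 0.2 - 0.1j]                  # illustrative linear channel ripple
rx = [sum(h[k] * tx[n - k] for k in range(len(h)) if n - k >= 0)
      for n in range(len(tx))]
w, errs = lms_equalize(rx, tx)
evm = (sum(e * e for e in errs[-500:]) / 500 / 2.0) ** 0.5  # QPSK power = 2
print(f"residual EVM after equalization ~ {evm * 100:.1f}%")
```

Replacing the linear channel with a compressing nonlinearity would leave the residual error largely unchanged, which is the limitation the next example demonstrates.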
Here is another practical example to illustrate this challenge with a bandpass filter centered at 144 GHz and a power amplifier (PA) added to our upconverter design. The amplifier has gain and an output 1 dB compression point specified. An output third-order intercept (TOI) point is specified for the mixer to model nonlinear characteristics.
EVM reaches 15.99% without the adaptive equalizer enabled, and you can see the associated dispersion in the constellation states. However, it is difficult to determine whether the dispersion comes from linear amplitude and phase error introduced by the bandpass filter or from nonlinear distortion in the PA or mixer.
Now, let’s turn on the adaptive equalizer. EVM is better than without equalization, but worse than what it would be without the bandpass filter and PA because of nonlinear impairments from the mixer and the PA. The adaptive equalizer only removes the linear amplitude and phase errors from the bandpass filter. The remaining nonlinear impairments increase the EVM result.
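To see why the equalizer cannot help with compression, consider a memoryless AM/AM nonlinearity. The Rapp model and its parameters below are illustrative assumptions (the article does not specify a PA model): outer 16-QAM constellation points compress more than inner ones, a level-dependent error that no single linear correction can undo:

```python
import math

def rapp(a, a_sat=1.2, p=2.0):
    """Rapp AM/AM model: smooth compression toward saturation amplitude a_sat.
    Parameters are illustrative, not a real PA characterization."""
    return a / (1.0 + (a / a_sat) ** (2 * p)) ** (1.0 / (2 * p))

# 16-QAM amplitude rings (minimum distance 2): inner, edge, corner points.
rings = [math.sqrt(2), math.sqrt(10), math.sqrt(18)]
norm = math.sqrt(10)  # normalize to unit average power

for a in rings:
    a_n = a / norm
    comp = (1 - rapp(a_n) / a_n) * 100
    print(f"in {a_n:.3f} -> out {rapp(a_n):.3f} (compression {comp:.1f}%)")
```

Because the gain error differs per ring, equalizing it away for the corner points would distort the inner points, leaving a residual EVM floor set by the nonlinearity.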
These simulations used single-carrier QAM waveforms, but you can model and simulate other waveforms to assess their performance through the sub-THz upconverter design. Remember that waveform definition is not finalized until the physical layer standards are defined. A sub-THz test system needs to provide the flexibility to test and demonstrate candidate waveforms that may be custom or even proprietary in nature. Moving toward a 1 Tb/s data rate requires rethinking traditional waveforms such as single-carrier QAM or orthogonal frequency division multiplexing (OFDM). System design simulation will play a key role in evaluating predicted system performance under a variety of simulated scenarios.
Jessy Cavazos is part of Keysight’s Industry Solutions Marketing team.