Channel sounding of 5G signals began years ago with the goal of learning how 5G signals behave in real-world environments. From those experiments, engineers have developed channel models. 5G New Radio is, however, much more complex than LTE because of beam steering and mmWave signals, which makes developing mathematical channel models and adapting them to modems far more difficult. Channel models matter because wireless PHYs must encode, transmit, receive, and decode data in a way that keeps bit-error rates (BERs) acceptably low. Artificial intelligence (AI), in the form of machine learning (ML), is becoming a tool for characterizing wireless channels in the digital domain.

At the 2019 Brooklyn 5G Summit, Prof. Tim O'Shea from Virginia Tech and Prof. Andrea Goldsmith from Stanford presented work in which they use ML to characterize entire signal chains that include encoders, decoders, and the wireless PHY between them; that is, the chain from the transmitter's DAC to the receiver's ADC (Figure 1). O'Shea and others have started a company called DeepSig and have developed products that use ML to characterize a wireless channel.

Figure 1. Machine learning can take transmitted and received baseband data and use the results to optimize wireless channel encoders.

Both O'Shea and Goldsmith have adapted ML techniques from other fields to wireless systems: O'Shea uses concepts developed for computer vision, while Goldsmith adapted a concept used for molecular communications.

Dr. Tim O'Shea

The optimizer in Figure 1, a neural network (NN), compares the data fed into the encoder with the data that comes out of the decoder. No RF knowledge is needed because the signal processing takes place at baseband and the entire signal chain is treated as a single entity. While it's possible to optimize encoders with no knowledge of the hardware or the RF channel, the learning process can run faster if it starts with some prior mathematical knowledge of the channel; 5G applications can start with a 3GPP channel model. (NYU Wireless maintains an extensive list of papers covering 5G channel models.)
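As a rough illustration of that idea, the snippet below trains an encoder network and a decoder network end to end through a simulated channel, using only the symbols that went in and the symbols that came out. It is a minimal sketch under assumed layer sizes with an AWGN stand-in for the real channel, not DeepSig's implementation.

```python
# Minimal sketch (not DeepSig's code) of an end-to-end "channel autoencoder":
# an encoder NN maps each message to baseband samples, a simulated AWGN
# channel stands in for the RF path, and a decoder NN recovers the message.
# Layer sizes, the alphabet size, and the AWGN stand-in are assumptions.
import torch
import torch.nn as nn

M = 4          # size of the message alphabet (4-QAM-like)
N_SAMPLES = 2  # baseband samples (I and Q) per message
SNR_DB = 10.0

encoder = nn.Sequential(nn.Linear(M, 16), nn.ReLU(), nn.Linear(16, N_SAMPLES))
decoder = nn.Sequential(nn.Linear(N_SAMPLES, 16), nn.ReLU(), nn.Linear(16, M))

def channel(x, snr_db):
    # Differentiable AWGN stand-in for the real RF path; a 3GPP or measured
    # channel model could replace it to give the learning process a head start.
    x = x / x.pow(2).mean().sqrt()               # normalize average power
    noise_std = 10 ** (-snr_db / 20)
    return x + noise_std * torch.randn_like(x)

opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    msgs = torch.randint(0, M, (256,))                 # random transmitted symbols
    onehot = nn.functional.one_hot(msgs, M).float()
    logits = decoder(channel(encoder(onehot), SNR_DB)) # whole chain, end to end
    loss = loss_fn(logits, msgs)                       # symbol-detection error
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the simulated channel is differentiable, the symbol-error loss at the decoder can back-propagate all the way into the encoder, which is what allows the whole chain to be treated as one entity.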

Through a training algorithm, the NN creates a model of the channel based on the probability of correctly detecting a symbol. The data fed into the ML algorithm can come from channel measurements (captured in the lab or in the field) or from theory; the more data the optimizer sees, the better it learns and the more accurate its coefficients become. The result is a set of weights (coefficients) that can then be incorporated into cellular modem encoders and decoders.
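A hedged sketch of that training-and-export step might look like the following. The toy "capture," the tensor shapes, and the file name are illustrative assumptions, not a description of any shipping modem.

```python
# Hedged sketch: train a symbol detector on captured baseband samples labeled
# with the symbols that were actually sent, using the negative log-probability
# of the correct symbol as the loss, then export the weights for a modem.
# The toy "capture," shapes, and file name below are illustrative assumptions.
import torch
import torch.nn as nn

# Stand-in for measured data: 4 symbols mapped to I/Q points, then distorted.
symbols = torch.randint(0, 4, (20_000,))
ideal_iq = torch.tensor([[1., 1.], [1., -1.], [-1., 1.], [-1., -1.]])[symbols]
captured_iq = 0.8 * ideal_iq + 0.2 * torch.randn_like(ideal_iq)

detector = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 4))
opt = torch.optim.Adam(detector.parameters(), lr=1e-3)

for epoch in range(50):
    logits = detector(captured_iq)
    # Cross-entropy is -log P(correct symbol): minimizing it maximizes the
    # probability of correctly detecting each symbol.
    loss = nn.functional.cross_entropy(logits, symbols)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The result is a set of weights (coefficients) that a modem could load.
torch.save(detector.state_dict(), "detector_weights.pt")
```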

The video below, from O'Shea's 2019 Brooklyn 5G Summit presentation, depicts how the ML algorithm improves a 4-QAM constellation diagram. You can see the clusters converge as the algorithm learns the channel characteristics and applies the results to the encoders and decoders.

Dr. Andrea Goldsmith

The job of any wired or wireless communications receiver is to detect transmitted symbols in the presence of noise and distortion; in digital communications, the receiver must recreate the transmitted symbols that represent bits. To accomplish that detection for 5G mmWave signals, Goldsmith described an ML technique called the sliding bidirectional recurrent neural network (SBRNN), which she and Nariman Farsad at Stanford developed when they couldn't find a suitable ML algorithm for optimizing wireless channels.
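For readers who want the detection task itself in concrete terms, here is a minimal toy (4-QAM over additive noise, an assumption for illustration, not Goldsmith's setup) showing the classic memoryless decision that a learned detector tries to improve upon.

```python
# Toy illustration of the detection task (4-QAM over additive noise; an
# assumption, not Goldsmith's setup): pick the constellation point nearest
# to each noisy received sample. This decision is memoryless; the SBRNN
# described next also learns from neighboring symbols.
import numpy as np

rng = np.random.default_rng(0)
constellation = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)  # 4-QAM

tx_idx = rng.integers(0, 4, size=1000)            # transmitted symbol indices
noise = 0.2 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
rx = constellation[tx_idx] + noise                # received samples

# Nearest-neighbor detection (maximum likelihood under additive white noise).
detected = np.argmin(np.abs(rx[:, None] - constellation[None, :]), axis=1)
print("symbol error rate:", np.mean(detected != tx_idx))
```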

The SBRNN estimates each symbol's probability over a sliding window of consecutive symbols: it averages the probability estimates from every window that covers a given symbol to decide that symbol's state. Think of it as akin to averaging amplitudes when digitizing an analog signal. As the algorithm captures a new symbol, it uses what it knows about previous symbols to calculate the probability of the new symbol's state (Figure 2). Such sliding-window processing is common in signal processing but, according to Goldsmith, is new to ML.

Figure 2. A sliding bidirectional recurrent neural network calculates the highest probability of a symbol, learning from previous symbols.
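The sliding-and-averaging step can be sketched as follows. The window length, layer sizes, and the untrained network below are assumptions for illustration, not Farsad and Goldsmith's code; the point is only to show overlapping windows whose per-symbol probability estimates get averaged.

```python
# Hedged sketch of the sliding bidirectional detection idea: a bidirectional
# RNN scores every symbol inside a window, the window slides one symbol at a
# time, and overlapping estimates are averaged for each symbol position.
import torch
import torch.nn as nn

NUM_SYMBOLS, WINDOW, M = 100, 10, 4
rx = torch.randn(NUM_SYMBOLS, 2)          # received I/Q samples, one row per symbol

rnn = nn.GRU(input_size=2, hidden_size=16, bidirectional=True, batch_first=True)
head = nn.Linear(2 * 16, M)               # per-position symbol probabilities

prob_sum = torch.zeros(NUM_SYMBOLS, M)    # accumulated estimates
counts = torch.zeros(NUM_SYMBOLS, 1)      # how many windows covered each symbol

with torch.no_grad():
    for start in range(NUM_SYMBOLS - WINDOW + 1):
        window = rx[start:start + WINDOW].unsqueeze(0)    # (1, WINDOW, 2)
        out, _ = rnn(window)                              # (1, WINDOW, 32)
        probs = torch.softmax(head(out), dim=-1).squeeze(0)
        prob_sum[start:start + WINDOW] += probs           # overlap and accumulate
        counts[start:start + WINDOW] += 1

avg_probs = prob_sum / counts             # averaged over all covering windows
detected = avg_probs.argmax(dim=1)        # most probable symbol at each position
```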

Goldsmith said the ML technique produced better results than traditional detection algorithms such as the Viterbi algorithm, except in certain cases (Figure 3).

Figure 3. SBRNN produces better results than the Viterbi algorithm in most cases.
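For context on the baseline in Figure 3, here is a compact sketch of a Viterbi (maximum-likelihood sequence) detector for BPSK over a known two-tap channel. The taps, noise level, and modulation are assumptions chosen to keep the example short, not the setup behind the figure.

```python
# Compact Viterbi detector for BPSK over a known two-tap ISI channel.
# The trellis state is the previous symbol; the branch metric is the
# squared error between the received sample and the predicted sample.
import numpy as np

rng = np.random.default_rng(1)
h = np.array([1.0, 0.5])                       # known two-tap channel
bits = rng.integers(0, 2, 200)
s = 2.0 * bits - 1.0                           # BPSK symbols, +1 or -1
r = np.convolve(s, h)[:len(s)] + 0.3 * rng.standard_normal(len(s))

states = np.array([-1.0, 1.0])                 # state = most recent symbol
cost = np.zeros(2)                             # accumulated path metrics
back = np.zeros((len(s), 2), dtype=int)        # backpointers for traceback

for k in range(len(s)):
    new_cost = np.full(2, np.inf)
    for cur in range(2):                       # hypothesis for symbol k
        for prev in range(2):                  # hypothesis for symbol k-1
            prev_sym = states[prev] if k > 0 else 0.0
            predicted = h[0] * states[cur] + h[1] * prev_sym
            c = cost[prev] + (r[k] - predicted) ** 2   # branch metric
            if c < new_cost[cur]:
                new_cost[cur] = c
                back[k, cur] = prev
    cost = new_cost

# Trace back the minimum-cost path to recover the symbol sequence.
path = np.zeros(len(s), dtype=int)
path[-1] = int(np.argmin(cost))
for k in range(len(s) - 1, 0, -1):
    path[k - 1] = back[k, path[k]]
detected_bits = (states[path] > 0).astype(int)
print("bit errors:", int(np.sum(detected_bits != bits)))
```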

While watching and listening to these two presentations focused on using ML for wireless channels, I couldn't help but wonder if ML could also be used for wired copper or fiber communications channels. After all, high-speed digital links use pre-emphasis, adaptive equalization, and error correction to improve BER.
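To make the wired-link parallel concrete, the following is a hedged sketch of a classic least-mean-squares (LMS) adaptive equalizer, the kind of adaptive filter a SerDes receiver runs, converging on a simple two-tap channel. The tap count, step size, and channel are illustrative assumptions.

```python
# LMS adaptive equalizer sketch: learn FIR taps that undo a two-tap channel
# by comparing the equalizer output against known training data.
import numpy as np

rng = np.random.default_rng(2)
tx = 2.0 * rng.integers(0, 2, 5000) - 1.0                  # NRZ training data
rx = np.convolve(tx, [1.0, 0.4])[:len(tx)] + 0.05 * rng.standard_normal(len(tx))

taps = np.zeros(5)        # FIR equalizer coefficients, learned on the fly
mu = 0.01                 # LMS step size

for n in range(len(taps), len(tx)):
    window = rx[n - len(taps) + 1:n + 1][::-1]   # newest sample first
    y = taps @ window                            # equalizer output
    e = tx[n] - y                                # error against known data
    taps += mu * e * window                      # LMS coefficient update

print("converged taps:", np.round(taps, 3))
```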

One paper on the topic is "High-Speed Channel Modeling With Machine Learning Methods for Signal Integrity Analysis," published in IEEE Transactions on Electromagnetic Compatibility. DesignCon 2019 held several sessions on ML, neural networks, and signal integrity, including a boot camp.

While ML can learn about a signal chain and produce results without prior knowledge, don't expect ML to replace the knowledge of RF engineers; at least, that's what O'Shea said when asked. Even so, there is some concern in the engineering community: the 2019 IEEE International Microwave Symposium will feature a panel session on Tuesday, June 4 titled "Will Artificial Intelligence (AI) and Machine Learning (ML) take away my job as an RF/Analog Designer?"

Martin Rowe covers test and measurement for EDN and EE Times. Contact him at martin.rowe@AspenCore.com.