New metrics such as channel operating margin and effective return loss have entered the measurement fray, driven by PAM4 and 112 Gbit/s data rates.
This year’s DesignCon “closed eye” panel focused on the problems that SerDes designers are currently fighting. Speed increases always bring new problems, and 112 Gbit/s PAM4 links are the latest iteration. Once again, engineers from the signal-integrity industry highlighted their latest problems and discoveries, with representatives from test and measurement explaining what’s needed to measure them. This year’s panel consisted of, in order of appearance: Stephens; Cathy Liu of Broadcom; Mark Marlett of Inphi; Mike Li of Intel; Greg LeCheminant of Keysight; and Pavel Zivny of Tektronix.
“In 2002, we couldn’t measure jitter, so we came up with random jitter (RJ), deterministic jitter (DJ), periodic jitter (PJ), sinusoidal jitter (SJ), duty-cycle distortion (DCD),” said Stephens. At that time, if you measured jitter on different test equipment, you’d get different answers. That’s no longer the case.
Then came PAM4 with its four levels and three eyes. Now the problems have shifted to noise, because PAM4’s smaller eyes make it far more noise sensitive. Today, we’re grappling with channel operating margin (COM), effective return loss (ERL), signal-to-noise and distortion ratio (SNDR), and transmitter and dispersion eye closure quaternary (TDECQ) measurements. “As we embrace these figures,” said Stephens, “can they give us diagnostic information? Yes.”
Broadcom’s Cathy Liu brought us up to date on die-to-die, chip-to-chip, and chip-to-optical/electrical-module links. “NRZ is for shorter reach; PAM4 is used in longer reach. PAM4 may become dominant from die-to-die up to backplane and through copper cables.” Liu noted that we will need different forms of equalization depending on the application. According to Liu, shorter reaches may need no continuous-time linear equalization (CTLE) and demand lower power; depending on channel loss and reflections, a link might need a decision-feedback equalizer (DFE) or feed-forward equalizer (FFE). At 112G, a single SerDes core might not cover multiple applications, especially shorter reaches, where power and die size become the limiting problems. We see a trend toward using different SerDes for different reaches.
“How long is ‘long’ for 112G,” asked Liu, “which has a Nyquist rate of 28 GHz?” She asserted that an insertion loss of 28 dB to 30 dB is no longer long reach, at least not for backplanes or copper cables, given a rule-of-thumb loss of 0.1 dB/in./GHz. A 10-in. trace has 28 dB of loss at 28 GHz, and you can easily add another 3 dB to 5 dB of insertion loss from packaging. “Remember that you have two packages (Tx and Rx). I don’t think 28 dB loss is good enough.” Liu described using flyover cables to reduce insertion loss and extend reach.
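To put numbers on Liu’s budget, here is a minimal sketch using the figures above. Reading her 3 dB to 5 dB figure as a per-package loss is our assumption, not her stated value:

```python
# Rule-of-thumb loss budget for a 112-Gbit/s PAM4 link (Nyquist = 28 GHz).
# Assumption: Liu's 3 dB to 5 dB figure is read as a per-package loss.
LOSS_PER_IN_PER_GHZ = 0.1   # dB/in./GHz rule of thumb quoted by Liu
NYQUIST_GHZ = 28.0          # 56 Gbaud PAM4 -> 28-GHz Nyquist rate
PKG_LOSS_DB = 4.0           # assumed midpoint of 3 dB to 5 dB per package

def link_loss_db(trace_length_in: float) -> float:
    """Total insertion loss at Nyquist: trace plus Tx and Rx packages."""
    trace_loss = LOSS_PER_IN_PER_GHZ * trace_length_in * NYQUIST_GHZ
    return trace_loss + 2 * PKG_LOSS_DB

print(link_loss_db(10.0))  # 10-in. trace: 28 dB trace + 8 dB packages = 36 dB
```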
“FEC (forward error correction) is no longer optional,” she continued. “IEEE 802.3bs and 802.3cd require KP4 FEC, which can relax designs by tolerating a raw bit-error ratio (BER) of 1E-6, 1E-5, or even 1E-4. At 112 Gbit/s, what kind of FEC will be needed as we double the baud rate? Can we relax designs to 1E-3? But FEC adds latency, and some applications may not tolerate it.”
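As a rough illustration of why FEC relaxes the raw-BER target: the KP4 code is a Reed-Solomon RS(544,514) code over 10-bit symbols that corrects up to 15 symbol errors per codeword. A minimal sketch, assuming independent bit errors (real links often violate this):

```python
# Post-FEC codeword error probability for KP4 RS(544,514), which corrects
# up to t = 15 ten-bit symbol errors. Assumes independent bit errors.
from math import comb

def codeword_error(raw_ber: float, n: int = 544, t: int = 15,
                   bits_per_sym: int = 10) -> float:
    p_sym = 1 - (1 - raw_ber) ** bits_per_sym      # symbol-error probability
    return sum(comb(n, k) * p_sym**k * (1 - p_sym)**(n - k)
               for k in range(t + 1, n + 1))       # > t errors: uncorrectable

for raw in (1e-4, 1e-3):
    print(f"raw BER {raw:.0e} -> codeword error {codeword_error(raw):.1e}")
```

Under these assumptions, a raw BER of 1E-4 leaves essentially no uncorrectable codewords, while 1E-3 starts to look marginal, which is why Liu frames 1E-3 as a question.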
Liu, an Optical Internetworking Forum (OIF) board member, noted that for compliance specifications, COM was proposed years ago and will remain the main compliance tool adopted by the OIF. “Why COM? We had defined the receiver, transmitter, and channel separately, but what about reflections, crosstalk, etc.? People proposed COM to address it all. Reference receivers need CTLE and DFE, which can be defined within COM. COM is useful for evaluating channel S-parameters, but it can do more. I want to know what causes a failure. We must decompose the impairments and recalculate each one’s COM contribution, then see which impairment is dominant.” Liu gave an ISI example. “Knowing that, what can I do to improve performance? Do I need more DFE or more shielding? If the link still fails, what is the next dominant parameter?”
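Liu’s decomposition idea can be sketched numerically. COM is defined as 20·log10(A_signal/A_noise); the toy below lumps impairment contributions together in RMS (a simplification of the real statistical calculation) and recomputes COM with each contributor removed. All amplitudes are invented for illustration:

```python
# Toy version of Liu's impairment decomposition. COM = 20*log10(As/An);
# here An combines RMS contributors, a simplification of the actual
# statistical method. All values below are invented for illustration.
from math import log10, sqrt

A_SIGNAL = 0.40                                   # V, available signal
NOISE_MV = {"ISI": 30.0, "reflections": 22.0,     # RMS contributors, mV
            "crosstalk": 12.0, "Rx noise": 8.0}

def com_db(noise_mv: dict) -> float:
    a_noise = sqrt(sum(v * v for v in noise_mv.values())) / 1000.0
    return 20 * log10(A_SIGNAL / a_noise)

baseline = com_db(NOISE_MV)
for imp in NOISE_MV:                              # which impairment dominates?
    reduced = {k: (0.0 if k == imp else v) for k, v in NOISE_MV.items()}
    print(f"remove {imp:12s}: COM {baseline:5.2f} -> {com_db(reduced):5.2f} dB")
```

The impairment whose removal buys back the most COM is the one to attack first, which is exactly the DFE-versus-shielding question Liu poses.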
Inphi’s Mark Marlett took to the podium to discuss the difficulty of tuning a channel for best performance. “If we tune a channel for BER and then tune it for another measurement such as TDECQ, we get discrepancies.” He then asked, “Can we use jitter decomposition to get a clue by correlating jitter to BER? It turns out, we can.”
To prove his point, Marlett used dual-Dirac analysis. After creating jitter histograms, he converted them into dual-Dirac models. Applying the dual-Dirac calculations, Marlett found that dual-Dirac deterministic jitter, which is really deterministic noise, can be used to tune channels.
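For reference, the dual-Dirac model reduces total jitter at a target BER to TJ(BER) = DJ(δδ) + 2·Q(BER)·σRJ, where Q(BER) is the Gaussian tail quantile. A minimal sketch with invented numbers, ignoring transition density:

```python
# Dual-Dirac total jitter: TJ(BER) = DJ(dd) + 2*Q(BER)*RJ_rms.
# Q(BER) is the Gaussian quantile; transition density is ignored here.
from statistics import NormalDist

def total_jitter_ps(dj_dd_ps: float, rj_rms_ps: float, ber: float) -> float:
    q = NormalDist().inv_cdf(1.0 - ber)   # Q(1e-12) ~= 7.03
    return dj_dd_ps + 2.0 * q * rj_rms_ps

# Invented example: 2 ps dual-Dirac DJ, 0.3 ps RMS RJ, BER target 1e-12.
print(f"TJ = {total_jitter_ps(2.0, 0.3, 1e-12):.2f} ps")   # ~6.22 ps
```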
Intel’s Mike Li continues to question the use of PAM4, as does Lee Ritchey in “Is PAM4 really necessary?” “In 2000, jitter and noise were big topics,” he told the audience. “We had tons of NRZ margin. PAM4 is more sensitive to noise, especially at the receiver’s eye.” Li brought up ERL, a new parameter discussed heavily at DesignCon. ERL represents the reflected energy seen at the receiver. He described ERL as a figure of merit for a DUT’s reflection impact. “It’s all about what the receiver sees and about interoperability.” [EDN will provide a detailed explanation of ERL in a future article.]
As one of two panelists from test-equipment companies this year, Keysight’s Greg LeCheminant took a system-level view of the 112 Gbit/s problem. “We need to take a system-level view of a transmit/receive channel. Whether it’s two chips a few millimeters apart or connected through 80 km of fiber, it’s still a complete system that must work with worst-case transmitters, receivers, and the channel between them.”
LeCheminant noted that a system’s components can come from a variety of sources. While multi-sourcing provides alternatives, it demands interoperability. “When testing transmitters,” he said, “we need to look at them from the receiver’s perspective. The problem, however, is interoperability, especially given that a PAM4 receiver is far more complicated than an NRZ receiver.”
He also noted that people don’t like to talk about what’s inside the receiver chip, which makes writing specs and designing test equipment difficult. How you define a measurement affects its uncertainty. The problem comes in how equalizers such as the DFE and CTLE are implemented in oscilloscopes. LeCheminant noted that such optimization is usually left to the test-equipment companies, much in the same way that different oscilloscopes once implemented jitter measurements differently. “Even worse,” LeCheminant continued, “is we lose margin in PAM4 and we don’t like to give up that margin in our measurements.” Today, measurement uncertainty is more than just hardware uncertainty.
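A small example of why those implementation choices matter: even a simple feed-forward equalizer (FFE) involves choices of tap count, spacing, and adaptation that differ between instruments. A minimal causal-FIR sketch with made-up taps and samples:

```python
# Minimal feed-forward equalizer: y[n] = sum_k taps[k] * x[n-k].
# Tap values and samples are invented; real instruments differ in tap
# count, spacing, and adaptation, which shifts the measured margin.
def ffe(samples, taps):
    out = []
    for n in range(len(samples)):
        acc = 0.0
        for k, c in enumerate(taps):
            if n - k >= 0:
                acc += c * samples[n - k]
        out.append(acc)
    return out

samples = [0.1, 0.8, 0.3, -0.2, 0.7, 0.1]       # sampled waveform (invented)
print(ffe(samples, taps=[1.0, -0.25, -0.05]))   # main cursor + two post-taps
```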
Every time we increase data rates, we unmask new problems. For example, the speed increase to 28 Gbaud, whether NRZ or PAM4, unmasked a skew problem in PCB laminates. Pavel Zivny from Tektronix noted that this time, the new problems are being cloaked rather than unmasked. “We have measurement bandwidths lower than in the past [because of PAM4]. We have to see what the receiver sees, and because the receiver has an equalizer, test equipment also needs an equalizer.” Zivny claimed that emulating a receiver is good, but it also means we don’t know what the signal looks like as it enters the receiver. You learn just as much from the eye as you do from the equalizer’s taps. And you don’t want to use a transmitter that makes the Rx’s job easier.
The bandwidth of measurement equipment affects the noise of a measurement. “With wideband real-time oscilloscopes, the measurement becomes too noisy, but with subtraction, we remove the noise,” said Zivny. “Oscilloscopes try to remove noise for compliance measurement. It’s a problem because the noise is still there and the oscilloscope masks the noise that the receiver sees. The challenge comes in that we have to be judicious. We need to know what’s limiting the design.”
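The subtraction Zivny refers to works in power terms: noise contributions add in quadrature, so the instrument can back out its own floor. A brief sketch with made-up values; Zivny’s point is that the receiver still sees the noise the scope removes from its report:

```python
# Noise powers add in quadrature, so a scope can back out its own floor:
# sigma_dut = sqrt(sigma_measured^2 - sigma_scope^2). Values are invented.
from math import sqrt

sigma_measured = 4.1e-3   # V RMS, total noise seen by the oscilloscope
sigma_scope    = 2.0e-3   # V RMS, instrument noise floor (calibrated)

sigma_dut = sqrt(sigma_measured**2 - sigma_scope**2)
print(f"DUT noise ~ {sigma_dut * 1e3:.2f} mV RMS")   # ~3.58 mV; Rx still sees it
```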
—Martin Rowe covers test and measurement for EDN and EE Times. Contact him at martin.rowe@AspenCore.com