While PAM4 and forward error correction may have solved the bandwidth problem (for now), they've created a new set of technical problems.
The 2017 edition of DesignCon's "Case of the closed eye" panel marks 15 years since the panel took shape. In recent years, it has become the PAM4 panel. Chris Loberg from Tektronix opened by stating, "PAM4 is catching on. In 2014, there was one standard, now there are 12, including one at 56Gbps. 112Gbps standards are coming." Even so, there are still problems to solve. While PAM4 and forward error correction (FEC) may have solved the bandwidth problem (for now), they've created a new set of technical problems. This year's panel discussion focused on some of those issues.
As he always does, Ransom Stephens opened the discussion by noting the real problem: "Nobody will pay for better PCB. We are all too cheap so we need ways to get around that." He continued, "A 56Gbps NRZ eye is closed, so what do we do? Channels can have 30dB to 50dB of loss." A 56Gbps PAM4 signal operates at a 14GHz Nyquist frequency (as opposed to 28GHz for NRZ), but at 12dB worse SNR, and that trade brings more complexity. FEC lets engineers relax the bit-error ratio (BER) requirement by a factor of a million, from 1E-12 to 1E-6 or even higher. "We only pay a small cost, a 3% to 6% overhead. But, it's not that easy. There are new issues to resolve such as bit patterns and eye masks."
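To put numbers on that trade-off, here's a minimal sketch of the arithmetic, assuming RS(544,514), the "KP4" code commonly used at these rates (an assumption; the panel quoted only the 3% to 6% overhead range):

```python
import math

bit_rate = 56e9                       # payload bits per second

# NRZ carries 1 bit per symbol; PAM4 carries 2, halving the symbol rate
nyquist_nrz = bit_rate / 1 / 2        # 28 GHz
nyquist_pam4 = bit_rate / 2 / 2       # 14 GHz

# PAM4 stacks three eyes into the swing of one NRZ eye, so each eye
# has one third the amplitude:
snr_penalty_db = 20 * math.log10(3)   # ~9.5 dB ideal; Stephens cited 12 dB
                                      # for real channels

# FEC overhead for RS(544,514) (assumed code, not named by the panel):
fec_overhead = 544 / 514 - 1          # ~5.8%, inside the quoted 3% to 6%

print(f"NRZ Nyquist:              {nyquist_nrz / 1e9:.0f} GHz")
print(f"PAM4 Nyquist:             {nyquist_pam4 / 1e9:.0f} GHz")
print(f"PAM4 SNR penalty (ideal): {snr_penalty_db:.1f} dB")
print(f"RS(544,514) overhead:     {fec_overhead:.1%}")
```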
Broadcom's Cathy Liu explained some of the issues her engineers encountered in getting a 56Gbps PAM4 system running over a backplane, but showed that FEC can work. Liu's team ran a 68-hour test with a raw BER of 1E-6. "After turning on the FEC, we achieved error-free operation over a long weekend," she said.
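As a rough sanity check on what "error-free over a long weekend" demonstrates, here's a sketch of the standard zero-error confidence bound, assuming a 56Gbps line rate and a 68-hour run (illustrative figures; the panel didn't state the exact parameters of the FEC run):

```python
import math

line_rate = 56e9                   # bits/s, assumed to match the link rate
hours = 68                         # illustrative; "a long weekend" is similar
n_bits = line_rate * 3600 * hours  # ~1.4e16 bits

# With zero observed errors, the 95%-confidence upper bound on the
# post-FEC BER is -ln(0.05)/N, roughly the "rule of three" (3/N):
ber_bound = -math.log(0.05) / n_bits

print(f"Bits transferred:    {n_bits:.2e}")
print(f"95% BER upper bound: {ber_bound:.1e}")   # ~2e-16
```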
In addition to achieving error-free operation, the FEC engine can report settings and provide an error signature, telling the engineers how many code words couldn't be corrected; in this test, there were none. "Using DSP, we can relax the raw BER to 1E-5 or even 1E-4," said Liu. But she noted that FEC can't always reduce the BER from 1E-5 to 1E-15; it depends on the error signature. Low-frequency jitter, compression, and tuning of the channel can all affect the probability of error. Note the two eye diagrams below: one is symmetrical, but the other isn't because of compression and nonlinearity. Thus, they have different error signatures.
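One way to see why the error signature matters: a Reed-Solomon decoder corrects only a fixed number of symbols per code word, so random errors and bursts behave very differently. A minimal sketch, assuming RS(544,514) with t = 15 (the panel discussed FEC generically, not a specific code):

```python
from math import comb

# RS(544,514) over 10-bit symbols corrects up to t = 15 bad symbols
# per code word (an assumed code for illustration):
n, t = 544, 15

def uncorrectable_rate(ser):
    """Probability that more than t of n symbols are in error, assuming
    independent symbol errors at rate ser (summing the tail directly)."""
    return sum(comb(n, k) * ser**k * (1 - ser)**(n - k)
               for k in range(t + 1, n + 1))

# Random errors at a raw symbol error rate of 1e-4 are easily corrected:
print(f"{uncorrectable_rate(1e-4):.1e}")   # astronomically small

# But a single burst that corrupts 16 or more symbols in one code word
# is uncorrectable no matter how good the *average* error rate looks,
# which is why the signature, not just the raw BER, decides the outcome.
```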
__Figure 1:__ *These two simulated PAM4 eye diagrams have different characteristics, especially in the middle eye.*
Next, longtime panelist Mike Li from Intel spoke about the positive and negative aspects of FEC. Because FEC lets you use a higher raw BER, it reduces test time: you don't have to wait as long to measure raw BER when FEC relaxes the requirement from 1E-10 or 1E-12 to as low as 1E-4. "Consistency of FECs is still poor," he said. "It's like we've moved back in time to when we started this panel, when jitter measurements made on different oscilloscopes didn't agree with each other."
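The test-time saving follows directly from how long you must wait to accumulate errors at a given BER. A rough illustration, assuming a 56Gbps line rate (the error-count target is arbitrary):

```python
line_rate = 56e9        # bits/s, assumed for illustration
target_errors = 100     # enough hits for a stable BER estimate

# Time to accumulate N errors scales as N / (BER * line_rate):
for ber in (1e-12, 1e-10, 1e-4):
    seconds = target_errors / (ber * line_rate)
    print(f"raw BER {ber:.0e}: ~{seconds:.3g} s to log {target_errors} errors")
```

At 1E-12 that's roughly half an hour per measurement; at 1E-4 it's under a microsecond, eight orders of magnitude faster.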
Inphi's Mark Marlett is trying to support customers who are developing 400Gbps optical systems. He's running into issues with 400G because the emerging optical standards are not yet stable. He's seeing some of the same issues that Cathy Liu mentioned, where eyes can take on different sizes and shapes depending on the metric used, and methods of evaluating eyes are still being debated. Thus, he faces a dilemma: "Do I tune my channels for standards or to the system being deployed?"
As we saw in the graphic above, a tuning that produces the best compliance may not be the best one for a given system. That's still to be worked out in the standards, a point reiterated several times during the panel session. "I get two different numbers and wonder if the reference equaliser was properly used," he said. Before turning the discussion to the test-equipment people on the panel, Marlett noted that some system-level engineers take shortcuts in their measurements, making them more difficult to support. The question is: do differences in measurement matter if the system works?
Marlett then passed the microphone to Keysight's Greg LeCheminant, who questioned whether we need to derive jitter components anymore. (That topic was the focus of a panel the next day.) He then asked if he was committing heresy by showing the audience an NRZ signal. He recalled problems from the NRZ days in trying to define eye quality, where some "bad" eyes had better BER than some "better" eyes. "FEC gives higher BER," said LeCheminant. "Errors are more frequent and that's changing the way we do things in test and measurement."
While he admitted that engineers still need to predict BER, he questioned whether we still need jitter derivation. He noted that at the chip makers, the focus is less on signal integrity and more on error correction: "FEC is a game changer." He also added, "We can do a direct measurement on waveform and predict BER by looking at the waveform on the oscilloscope." Referring to Cathy Liu's comment about error signatures, LeCheminant noted that because of the higher BER that FEC makes tolerable, engineers are going to shorter bit patterns. Stephens has been advocating shorter bit patterns for years and he just may get them. "With today's oscilloscopes and relaxed BER," said LeCheminant, "you can capture a waveform, look at the bits, find where the errors are, and look at the signal integrity."
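Predicting BER directly from the waveform is essentially the classic Q-factor estimate. Here's a minimal sketch with made-up PAM4 level statistics (the means and sigmas below are illustrative, not measured values):

```python
import math

def eye_ber(mu_lo, sig_lo, mu_hi, sig_hi):
    """Estimate the crossing probability of one eye from the level means
    and noise sigmas sampled at the eye centre, assuming Gaussian tails."""
    q = (mu_hi - mu_lo) / (sig_hi + sig_lo)
    return 0.5 * math.erfc(q / math.sqrt(2))

# Hypothetical PAM4 level statistics pulled from a scope histogram
# (normalised levels; sigmas invented for illustration):
levels = [(-3, 0.28), (-1, 0.30), (+1, 0.30), (+3, 0.28)]

# Each of PAM4's three stacked eyes contributes a crossing probability:
ser = sum(eye_ber(levels[i][0], levels[i][1],
                  levels[i + 1][0], levels[i + 1][1]) for i in range(3))
print(f"Estimated symbol error ratio: {ser:.1e}")
```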
Marty Miller from Teledyne LeCroy spoke next about test patterns, which he describes as one of his favourite subjects. He pointed to a conflict between T&M companies and chip companies, saying "Chip makers don't like complex bit patterns. While PRBS patterns are easier to use, they suffer because worst-case conditions are so rare that tests can miss them." Miller also noted that there are two different ways that patterns are used. One is to confirm that a system works. Another, he pointed out, is to stimulate a system so that engineers can learn what it is doing; he even claimed that you can predict an eye diagram from the BER. One pattern being proposed is SSPRQ, which Miller described as shorter but more complex than PRBS. Long PRBS patterns, he claims, are what chip makers dislike: "They add weight to designs and the circuits don't really work with long patterns."
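For readers who want to experiment with the kind of PAM4 PRBS patterns discussed next, here is a minimal sketch of PRBS13Q generation after the IEEE 802.3bs definition (polynomial x^13 + x^12 + x^2 + x + 1, bit pairs Gray-mapped to four levels); the seed and register conventions below are assumptions, not the normative procedure:

```python
GRAY = {(0, 0): 0, (0, 1): 1, (1, 1): 2, (1, 0): 3}  # bit pair -> PAM4 level

def prbs13q(n_symbols, seed=0x1FFF):
    """Sketch of PRBS13Q: a PRBS13 bit stream with consecutive bit pairs
    Gray-mapped onto the four PAM4 levels."""
    state = seed                       # 13-bit shift register, nonzero

    def next_bit():
        nonlocal state
        # recurrence s[n] = s[n-1] ^ s[n-2] ^ s[n-12] ^ s[n-13]
        new = ((state >> 12) ^ (state >> 11) ^ (state >> 1) ^ state) & 1
        out = state & 1                # emit the oldest register bit
        state = (state >> 1) | (new << 12)
        return out

    return [GRAY[(next_bit(), next_bit())] for _ in range(n_symbols)]

# The bit sequence repeats every 2^13 - 1 = 8191 bits; because that's odd,
# pairing bits walks through two bit periods before the PAM4 pattern
# repeats, giving the 8191-symbol length cited below:
pattern = prbs13q(8191)
assert pattern == prbs13q(2 * 8191)[8191:]
```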
Miller then showed test results using PRBS13Q (8191 symbols) and PRBS17Q (131,071 symbols) patterns, which produced different results. Compare the horizontal arrows to the lighter vertical bands in the figure below; in this case, the PRBS17Q pattern produced better eye openings. The vertical band has a width of 1/4 UI. It is effectively a mask, but only in the horizontal direction.
__Figure 2:__ *Marty Miller showed the differences between PRBS13Q and PRBS17Q patterns.*
Next: DesignCon 2017: SSPRQ patterns break everything »