DDR5: How faster memory speeds shape the future

Article By : Ben Miller, Keysight Technologies

Memory speed improvements introduce new design and test challenges for memory designers and manufacturers.

Why is DDR important?

Faster data processing requires faster memory. Double data rate synchronous dynamic random-access memory (DDR SDRAM) enables the world’s computers to work with the data held in memory. DDR is used everywhere: not just in servers, workstations, and desktops, but also embedded in consumer electronics, automobiles, and many other systems. DDR SDRAM runs applications and performs computations at blazing speeds, and the DDR standard ensures that memory providers all deliver the same level of quality and speed for reading from and writing to memory.

Each new generation of the DDR standard, defined by the Joint Electron Device Engineering Council (JEDEC), delivers significant improvements over the previous generation, including higher speeds, a smaller footprint, greater capacity, and better power efficiency. The latest standard, DDR5, was released in 2020. Developers are already eager to take advantage of the incredible data rates: up to 6400 megatransfers per second (MT/s), double that of the previous generation.
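To put that number in perspective, here is a rough, back-of-the-envelope Python calculation; it assumes a standard 64-bit DIMM data bus and ignores protocol overhead, so it is an upper bound rather than a measured figure.

```python
# Rough peak-bandwidth estimate for a DDR5 module (illustrative only):
# assumes a 64-bit DIMM data bus and ignores burst/protocol overhead.
transfers_per_second = 6400e6      # 6400 MT/s
bus_width_bytes = 64 // 8          # 64-bit data bus -> 8 bytes per transfer

peak_bandwidth_gb_s = transfers_per_second * bus_width_bytes / 1e9
print(f"Theoretical peak bandwidth: {peak_bandwidth_gb_s:.1f} GB/s")  # ~51.2 GB/s
```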

These memory speed improvements will enable technologies that rely heavily on near-instant data processing. However, they also introduce new design and test challenges for memory designers and manufacturers. Let’s look at how the DDR5 standard deals with these challenges, how developers are taking advantage of DDR5 in emerging technologies, and what the next generation of DDR might look like.

Faster, not wider

Since the origins of SDRAM, engineers have faced challenges with increasing memory speeds. DDR emerged as a faster, more efficient way to handle memory while providing a universal standard between chip designers and device manufacturers. As shown in Figure 1, a DDR memory system consists of a memory controller, which transmits clock, address, and control signals, and a series of DRAM chips, which store the data. In a write operation, the controller sends data and strobe signals to the DRAM; in a read operation, the DRAM sends data and strobe signals back over the same bidirectional line. DDR SDRAM became the standard in the late 1990s and has since been improved upon many times over.

Figure 1 DDR has both one-way control signals and bidirectional data buses.

Prior to DDR, memory speeds maxed out around 100 MT/s. The first generation of the DDR standard boosted the data transfer rate to between 200 and 400 MT/s, and each successive generation has generally doubled the bit rate of its predecessor. The first group of standards, DDR1 through DDR3, eventually ran into signal integrity issues as data bit rates climbed. Along the way, JEDEC developed related specifications optimized for mobile applications (Low-Power DDR, or LPDDR) and for computer graphics (GDDR). By the release of DDR4, speeds of 3200 MT/s were starting to cause bit error rate problems. Faster speeds complicated design and validation by making signal integrity even more of a priority. While the clock and strobe signals are differential and cancel out noise, the other signals, including the bidirectional data signals between the controller and the DRAM chip, are single-ended, making them susceptible to noise, crosstalk, and interference.
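To illustrate why the differential clock and strobe lines are more robust than the single-ended data lines, here is a minimal numerical sketch; the signal levels and noise amplitude are arbitrary values chosen only for demonstration, not DDR electrical parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 16)            # example bit pattern
signal = np.where(bits == 1, 1.0, -1.0)  # ideal transmitted levels

# Noise coupled equally onto both legs of a pair (common mode)
common_mode_noise = 0.4 * rng.standard_normal(len(signal))

# Single-ended line: the noise adds directly to the received signal.
single_ended_rx = signal + common_mode_noise

# Differential pair: the receiver takes the difference of the two legs,
# so noise common to both legs cancels out.
leg_p = signal + common_mode_noise
leg_n = -signal + common_mode_noise
differential_rx = (leg_p - leg_n) / 2

print("worst-case single-ended error :", np.max(np.abs(single_ended_rx - signal)))
print("worst-case differential error :", np.max(np.abs(differential_rx - signal)))  # ~0
```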

Higher DDR throughput is also possible by transmitting more bits at a time; unlike many modern communication protocols, DDR is a parallel bus, not a serial one. However, increasing the number of pins on a device beyond a certain point is impractical and expensive, as both the die and the package would need to grow to support the added channels. The pin count on memory increased in earlier generations but has since leveled out: DDR5 has the same number of pins as DDR4. Faster signaling alone is now the method of choice for increasing memory bit rates in next-generation devices.

With faster speed comes challenges

As mentioned previously, faster transmission speeds can create bit error rate problems. DRAM processes inherently resist higher speeds because the data bits are stored in charged capacitors. Channel and interconnect effects cause inter-symbol interference (ISI) in signals operating at high speed, which appears on an oscilloscope as an eye diagram that closes further and further as the bit rate rises above 3600 MT/s. Bit error rates climb because the receiver cannot resolve symbols in such a distorted signal. In other words, the data bits become indistinguishable at high speeds, as shown in the eye diagrams in Figure 2.

Figure 2 Inter-symbol interference increases as bit rate increases, resulting in a closed eye diagram.
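To make that trend concrete, the following sketch passes a random bit stream through a hypothetical first-order low-pass channel (an arbitrary stand-in for real interconnect loss) and estimates how the vertical eye opening shrinks as the data rate climbs; a value at or below zero means the eye has closed entirely. The time constant and sampling choices are illustrative assumptions, not DDR5 channel parameters.

```python
import numpy as np

def eye_opening(bit_rate_mts, samples_per_ui=32, n_bits=2000, tau_ps=120.0, seed=1):
    """Estimate the worst-case vertical eye opening of a random NRZ stream
    after a first-order low-pass channel (illustrative model only)."""
    rng = np.random.default_rng(seed)
    ui_ps = 1e6 / bit_rate_mts                      # unit interval in picoseconds
    dt = ui_ps / samples_per_ui
    bits = rng.integers(0, 2, n_bits)
    tx = np.repeat(np.where(bits == 1, 1.0, -1.0), samples_per_ui)

    # Channel impulse response h(t) = exp(-t / tau), normalized to unity DC gain
    t = np.arange(0.0, 8 * tau_ps, dt)
    h = np.exp(-t / tau_ps)
    h /= h.sum()
    rx = np.convolve(tx, h)[: len(tx)]

    # Sample each bit in the middle of its unit interval
    samples = rx[samples_per_ui // 2 :: samples_per_ui][:n_bits]
    ones, zeros = samples[bits == 1], samples[bits == 0]
    return ones.min() - zeros.max()                 # <= 0 means the eye is closed

for rate in (3200, 4800, 6400):                     # MT/s
    print(f"{rate} MT/s: eye opening ~ {eye_opening(rate):+.2f}")
```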

To preserve signal integrity, DDR5 uses Decision Feedback Equalization (DFE) to compensate for signal distortion and reopen the eye. High-speed serial links use equalization for a similar purpose, but DDR5 places the equalization both on the DDR controller side and inside the DRAM die, an element that was not present in DDR4. Notably, the PHY sits outside of the DRAM memory cell array itself because of potential issues with capacitance. Figure 3 shows a block diagram of where these elements are placed in a DDR5 memory system.

Figure 3 New additions to DDR include specifications inside the die, such as an equalizer.
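For readers unfamiliar with equalization, the sketch below implements a generic single-tap DFE, a simplified textbook model rather than the actual DDR5 circuit; the channel leakage, noise level, and tap weight are invented values. Each bit decision is fed back, scaled by the tap weight, and subtracted from the next sample to cancel the trailing ISI left by the previous symbol.

```python
import numpy as np

def dfe_1tap(rx_samples, tap_weight):
    """Generic single-tap decision feedback equalizer (simplified illustration).
    Each decision is fed back, scaled by tap_weight, and subtracted from the
    next sample to cancel ISI contributed by the previous symbol."""
    decisions = np.empty(len(rx_samples))
    prev = 0.0
    for i, sample in enumerate(rx_samples):
        corrected = sample - tap_weight * prev      # remove trailing ISI
        prev = 1.0 if corrected >= 0 else -1.0      # slicer decision (+1/-1)
        decisions[i] = prev
    return decisions

# Toy channel: each symbol leaks 90% of its amplitude into the next one
# (exaggerated so the benefit is visible), plus a little random noise.
rng = np.random.default_rng(2)
tx = np.where(rng.integers(0, 2, 1000) == 1, 1.0, -1.0)
rx = tx.copy()
rx[1:] += 0.9 * tx[:-1]
rx += 0.1 * rng.standard_normal(len(rx))

errors_raw = int(np.sum(np.sign(rx) != tx))
errors_dfe = int(np.sum(dfe_1tap(rx, tap_weight=0.9) != tx))
print(f"bit errors without DFE: {errors_raw}, with DFE: {errors_dfe}")
```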

Engineers at every step of the DDR memory lifecycle will need to pay attention to the increased requirements of DDR5. Device simulation, design, and validation require optimizing the transmitter, receiver, and channel for the most reliable data transfer while dealing with the increased design complexity and smaller timing margins inherent in the faster bit rates. Device, board, and system test engineers should keep the growing ISI and signal integrity issues in mind, especially when debugging DDR failures, a process complicated further by the equalizer in the data path. Ultimately, the device must be tested for conformance with the JEDEC DDR5 standard to ensure interoperability with other memory components. All this effort and complexity culminates in a more sophisticated, faster memory transfer system, allowing memory speeds to reach 6000 MT/s and beyond while maintaining a low bit error rate.

Faster memory enables the future

DDR5 is still in the early phases of adoption. JEDEC released the standard in July 2020, and the first CPU platform to support DDR5 arrived in early 2022. That platform still supports DDR4 as well, so it could be some time before the industry accepts DDR5 as the de facto memory standard and fully transitions to it. Over the next couple of years, more personal computers, servers, and embedded systems will take advantage of the higher speed, lower power, and greater memory capacity available with DDR5, opening many new possibilities for high-speed networking and data processing.

The Internet of Things (IoT) and Vehicle-to-Everything (V2X) communication for autonomous vehicles are just two of the many technologies being enabled by 5G and 400 gigabit Ethernet, which will connect billions of new devices to cloud services. The data storage infrastructure required to process all that data will need quicker and more efficient memory than anything we’ve seen yet, especially in time- and safety-critical applications like V2X. Outfitting servers with DDR5, alongside adopting faster wireless communication, will be a huge step toward enabling these new technologies. But as these applications become reality, will DDR5 be fast enough?

DDR6: Doubling the data rate of DDR5 

DDR5 is just starting to be adopted into the market, but JEDEC and early adopters are already looking forward to the next generation. DDR6 is likely to double the maximum bit rate of its predecessor, as all previous DDR generations have done. One memory manufacturer announced in late 2021 that it had begun development of DDR6 with JEDEC and that the standard could hit speeds greater than 12,000 MT/s while quadrupling memory capacity. But considering the speed challenges discussed previously, reaching that bit rate will require even more innovation to transmit signals faster without causing significant, data-corrupting distortion.
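As a quick sanity check on that doubling pattern, the snippet below extrapolates from the commonly cited top rates of earlier generations; the historical figures are approximate and the DDR6 value is a simple projection, not an official JEDEC number, though it lands comfortably above the 12,000 MT/s mark mentioned above.

```python
# Approximate top data rates by generation (MT/s); the DDR6 entry is a pure
# doubling extrapolation, not an official JEDEC figure.
rate = 400  # DDR1 top speed
for gen in range(1, 7):
    note = " (projected)" if gen == 6 else ""
    print(f"DDR{gen}: ~{rate:,} MT/s{note}")
    rate *= 2
```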

One possibility is for DDR6 to take after other standards like Wi-Fi and Ethernet by using pulse amplitude modulation (PAM) or even quadrature amplitude modulation (QAM) signaling. DDR5 already set the precedent for adding more signal processing to the standard, so it makes sense that more complex signaling methods may be used in the next generation to increase the bit rate further. According to the Shannon-Hartley theorem, which quantifies the maximum bit rate possible through a given channel as C = B log2(1 + S/N), DDR4 currently uses only about 10% of the capacity of a DIMM’s interconnects. Unlocking the other 90% between now and the arrival of DDR6, without losing data bits to distortion, is the primary challenge for JEDEC’s engineers. If they can solve the puzzle, they could lay the foundation for more data-driven technologies in the future.
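To see roughly where a figure like 10% could come from, here is a back-of-the-envelope Shannon-Hartley calculation; the channel bandwidth and SNR are arbitrary illustrative assumptions, not JEDEC or measured values, chosen only to land in that neighborhood.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley channel capacity: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# Arbitrary illustrative channel assumptions (not JEDEC or measured values)
capacity = shannon_capacity_bps(bandwidth_hz=4e9, snr_db=24)

for name, per_pin_bps in (("DDR4 @ 3200 MT/s", 3.2e9), ("DDR5 @ 6400 MT/s", 6.4e9)):
    print(f"{name}: {per_pin_bps / capacity:.0%} of an assumed "
          f"{capacity / 1e9:.0f} Gb/s per-lane capacity")
```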

The New Reality

Until DDR6 becomes a reality, developers will continue to move from DDR4 to the new DDR5 standard. Adapting products to updated standards takes time. The more complicated nature of the DDR5 standard, including the addition of equalization to the specification, means that simulation, design, validation, and compliance testing all become trickier and margins for error shrink further. But for many engineers, the faster bit rates are worth the changes. Their customers will likely agree, from competitive gamers and data center technicians to developers of AI and autonomous vehicles. Faster memory is key to faster data processing for many essential applications.

Keysight is heavily engaged with JEDEC and early adoption partners of the DDR5 standard.

About the Author

Ben Miller is a product marketer with experience in the semiconductor and test & measurement industries. He joined Keysight Technologies in 2022 as Product Marketing Manager for Digital Applications Software, promoting software solutions for Keysight’s oscilloscope, AWG, and BERT products. Ben has a bachelor’s degree in electrical engineering from the University of Texas at Austin.
