Why scaling analog for power integrity analysis is critical

Article by: Joseph C. Davis, Siemens Digital Industries Software

The EDA power integrity solution for the past 20 years has been post-layout SPICE simulation. That must change for larger analog systems.

We are finally in that future where computers actually interact with us and our environment to make our daily lives easier, safer, and more efficient. My car doesn’t yet drive itself, but it knows to slow down when the car in front does. My watch knows I fell and need help. Cameras can discern the focus of a person’s gaze. Smart cities. Industry 4.0. Autonomous driving. 5G networking. Smart Grids. The growth of applications in these areas is driving growth in semiconductor design starts well above the industry average (Figure 1).

Figure 1 Embedding intelligence into everyday life drives new growth areas in semiconductor design starts. Source: Semico Research

All these technologies combine unbelievable processing power with sensors and analog circuitry at a scale that could only be dreamed of in the past. However, to be successful in the market, these combined systems must also work with reasonable power consumption and be reliable over long periods of time, often in challenging conditions. These operational and market requirements influence what design companies must accomplish at the chip level during design, implementation, and verification.

Power integrity is one of many design goals under stress, and it is especially important in these combined systems, which contain large analog systems and sensors at a scale never envisioned before. Let’s take a look at the industry trends driving these chip-level challenges, and what that means for the power integrity and reliability of these systems.

New sensor order

As most design companies will tell you, the use of image sensors has exploded in recent years. Sensors are the front end that fuels the back-end computation growth. Smartphones created the initial volume demand for camera chips that resulted in the price drop, which makes it possible for virtually anyone to put a camera anywhere. Hunters can hide trail cams in the woods to collect data on game movements. Homeowners can put up a security camera in less time than it took to buy it. Cities are blanketing their streets with cameras to deter crime, monitor traffic, and track pedestrian flows.

The sensors in these products span a wide size range, from very small pixel counts up to wafer-sized sensor arrays. There are companies currently designing 300-Mpixel sensors, and there are surely more coming.

Figure 2 A 34% CAGR over the next five years for sensors connected to the Internet implies very rapid growth, not only for sensors, but also for communications networks and interfaces. Source: IC Insights

But sensors by themselves aren’t a complete solution. To enable systems to make sense of what these sensors “see,” they must be able to analyze the data. That means combining processing power with the sensors—either on chip or networked together. Automotive electronics need sensor systems that can detect and distinguish between other vehicles, pedestrians, and bicyclists. Fitness watches that monitor your heart rate, breathing, and sleep patterns are sensor systems. In fact, the most anticipated feature of the next Apple Watch is a sensor system.

This processing capability is rapidly evolving as well. Last year, Sony announced a combined sensor/artificial intelligence chip that enables on-chip object and person identification, so that only the metadata rather than the actual image is passed out of the chip. These applications drive increasing complexity on both the analog and digital portions of the chip design, with more power domains and larger analog/mixed-signal blocks in those designs (Figure 3).

Figure 3 Analog blocks and power domains are the fastest-growing chip design components, and this growth trend is expected to continue through the next five years. Source: Semico Research

Most sensor systems in use today rely on wireless communications to transmit data and commands. Whether that wireless communication is Bluetooth, Wi-Fi, or even cellular, it means an IC combining sensors, analog, and digital processing. Infineon’s Shawn Slusser cited a smart building sensor combining radar, a CO2 sensor, processing, and Wi-Fi/Bluetooth as one example of the types of products driving this growth, and of the synergies Infineon saw when purchasing Cypress Semiconductor. In fact, Infineon’s stated purpose for the acquisition was “linking the real with the digital world and shaping digitalization.”

Infineon isn’t the only company maneuvering for design capabilities to take advantage of the growing need for combined systems. There have been numerous mergers of big analog and digital companies in recent years. These mergers are not the typical swallowing of competitors for market share, but companies with diverse offerings coming together to build a broader capability for integrated offerings. For instance, Renesas, historically a more digitally focused company, has acquired IDT, Intersil, and most recently Dialog—all strong analog companies. Marvell has likewise moved to build both digital processing and interfaces with its acquisitions of Inphi, Aquantia, and Innovium. These companies are acquiring capabilities to provide more comprehensive and powerful chips for combined processing and interfaces.

Power integrity analysis for large analog designs

What does all this mean to the system-on-chip (SoC) and electronic design automation (EDA) industries? The size and complexity of these sensor systems drive the trends you hear about all the time—processing power, bandwidth, and networking. At the same time, that size and complexity have broken traditional analog design and verification flows. Traditional analog EDA tools simply aren’t as scalable as digital tools. Sensors can get big. What’s more, the analog systems being designed today combine dozens of blocks. Typical analog design and verification flows may work fine for those blocks, but what about the system, which can run into tens of millions of transistors? Digital design faced and overcame this scaling hurdle long ago; analog has, until now, managed the scalability problem rather than solved it. Why the difference?

Hierarchy is the standard approach to finding and resolving issues in circuit design. While analog circuits are indeed hierarchical, they don’t use the same sort of abstractions throughout the flow that have been developed for digital flows. In the continuous-waveform world, it is much harder to abstract circuits because you can’t reduce everything to zeros and ones, slew rates, and simplified models like you can in digital circuitry. In analog circuit analysis, SPICE is king. When your analog circuit gets too big for timely SPICE simulation, you have to apply less accurate tools or your own engineering judgment to simulate select sub-circuits.

Power integrity for large transistor-level circuits is itself one of those workarounds. For power integrity analysis, you must determine if your circuit, as implemented in silicon, will function as intended after you account for all those pesky voltage drops created by current being delivered from the system level down through the power grid to the individual transistors. For very small circuits, this is a simple matter of extracting the parasitics of the complete as-implemented circuit layout and doing a post-layout simulation. No problem.

As your circuit gets larger, though, the sum of those parasitics causes your SPICE simulation to bog down. The EDA power integrity solution for the last 20 years has been to run post-layout SPICE simulation with only the signal parasitics, then take those waveforms and pass them to a secondary simulation that includes the parasitics of the power grid. This approach works because the non-linear SPICE simulation is of reduced order without the power grid parasitics, while the secondary simulation, though very large, is linear. However, there is still a circuit size at which even the signals-only SPICE simulation is too much. Modern analog and sensor designs passed that limit long ago.
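The two-stage split can be illustrated with a toy calculation. The sketch below (all values hypothetical, using NumPy in place of a real grid simulator) stands in for the second, linear stage: sink currents, as if taken from a signals-only SPICE run, are injected into a small resistive power-grid model, and the static IR drop at each node falls out of a single linear solve of the nodal equations G·v = i.

```python
import numpy as np

# Toy power grid: a VDD pad feeds a chain of 4 grid nodes through
# equal resistive segments. Each node sinks a transistor current,
# here hypothetical peak values from a signals-only simulation.
VDD = 1.0          # supply voltage at the pad (V)
R_SEG = 0.5        # resistance of each grid segment (ohms)
I_SINK = np.array([1e-3, 2e-3, 1e-3, 3e-3])  # node sink currents (A)

n = len(I_SINK)
g = 1.0 / R_SEG

# Nodal-analysis conductance matrix G for the chain; the pad is an
# ideal source, so its segment is folded into node 0's equation.
G = np.zeros((n, n))
for k in range(n):
    G[k, k] += g            # segment from the previous node (or pad)
    if k + 1 < n:
        G[k, k] += g        # segment to the next node
        G[k, k + 1] -= g
        G[k + 1, k] -= g

# Right-hand side: pad injection at node 0 minus the sink currents.
i = -I_SINK.astype(float)
i[0] += g * VDD

v = np.linalg.solve(G, i)   # node voltages from one linear solve
ir_drop = VDD - v           # static IR drop at each node
worst = ir_drop.max()       # worst case is farthest from the pad
print(f"worst-case IR drop: {worst * 1e3:.2f} mV")
```

For the node farthest from the pad, this reduces to the familiar cumulative I·R sum along the supply path. Real power grids are meshes with millions of nodes, which is exactly why this stage must stay linear to remain tractable.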

Of course, people are designing and producing these designs today, so there must be a way they are getting the answers they need. Indeed, they are, but at a cost. That cost comprises both engineering time and over-design. When scale is the problem, there is always a path to an approximate solution. It just takes a lot of engineering effort and time to create a reduced-order analysis, or to remove parts of the circuit, or to do a detailed simulation of “the sensitive parts” of the circuit. Or perhaps simply to oversize your traces to minimize voltage drops, which has other costs, such as area and capacitance. That’s the situation design companies find themselves in today—they need those sensor systems in their products, but the lack of good tools is costing them time and money, and introducing additional risk into their products.

How do companies get their analog designs to tapeout in a timely fashion, while still ensuring power integrity and operational reliability? What’s needed is an EDA solution that provides design teams with fast, scalable, and accurate analysis of analog layouts, from the smallest blocks through the largest analog circuits, even up to full-chip designs. By providing analog design teams with the same level of scalability and performance as their digital counterparts, companies can ensure that their analog designs meet the power-related design goals and performance standards, as well as their tapeout schedules.

Full EM/IR analysis

At Siemens EDA, we’ve developed a new tool suite that will provide, for the first time, full electromigration (EM) and voltage drop (IR) analysis scalability to analog designers. Our mPower tool performs simulation-based EM/IR analysis for the largest, most complex blocks and chips to enable fast, accurate power integrity analysis of 5G, sensor, AI, multi-core, chiplet, machine learning, and other large and complex SoC systems (Figure 4). This scalable functionality provides the detailed analysis designers need to confidently sign off designs for manufacturing, while enabling faster overall turnaround times by building full-chip and array analyses from block-level SPICE simulations. It can also enable faster iterations early in the design cycle by using pre-layout SPICE simulations.

Figure 4 High-capacity dynamic analysis provides simulation-based EM/IR analysis on the largest, most complex blocks and chips. Source: Siemens EDA

To successfully satisfy the growing demand for products that incorporate sensor systems, analog design companies must ensure that their SoCs can provide the power integrity and operational reliability that the consumer expects and demands. To maintain and grow market share, they must also be able to deliver their SoCs on schedule. With the emergence of automated power integrity analysis tools for analog designs, those goals are now more realistic and attainable.

Joseph C. Davis is senior director of product management for Calibre Interfaces and mPower Power Integrity at Siemens Digital Industries Software.
