IC design: A short primer on the formal methods-based verification

Article By : Dr. Ashish Darbari, Axiomise

Design verification groups are using formal methods, though not consistently. Here is an insight on their practical advantages in IC design.

It’s no secret that hardware is the new currency in the chip world. It’s no longer the case that the semiconductor industry is in the hands of traditional semiconductor giants; an increasing number of software companies now have their own dedicated hardware development teams. With the advent of open-source architectures, developing silicon with open-source framework and tools is slowly becoming mainstream.

However, while designing has become easier, verification has not.

The 2020 Wilson Research Group study on functional verification points out that 83% of FPGA and 68% of ASIC designs fail on the first attempt. Also worth noting is that 68% of ASIC projects run behind schedule.

The survey reports that for processor development, the ratio of verification to design engineers is 5:1. Imagine having to hire five times more verification engineers than design engineers and still having to respin the chip.

Figure 1 The numerous semiconductor verification challenges range from rising costs to delayed project schedules and respins to headcount. Source: 2020 Wilson Research Group and Axiomise

The status quo of applying software updates to patch hardware flaws is not always a feasible option, especially when verification is estimated to cost about 70% of the overall project costs.

Challenge: It’s no longer functional verification

When the Intel Pentium FDIV bug happened in 1994, computers were primarily used as desktop PCs or servers. Today, computers are everywhere, from automotive and IoT devices to 5G mobile phones, with an ever-increasing footprint in defense and aerospace.

Functional verification is not the only challenge. With standards such as ISO 26262, DO-254 and DO-333, establishing liability is as important as establishing reliability. In the wake of the Meltdown and Spectre flaws, security is also gaining prominence alongside functional safety.

The scope of quality and reliability has increased with demand and competition in hardware development and time-to-market.

Simulation has been the traditional verification workhorse and is still the dominant technology used in design verification for digital and analog circuits. Emulation, essentially a much faster form of simulation, is increasingly adopted for system-level verification and, though it comes at an upfront cost, any company serious about silicon quality is using it.

Both simulation and emulation leverage test cases and use input sequences or programs as the primary form of testing the underlying hardware. If there is a mismatch between the expected behavior and the actual behavior of the design in these environments, it's flagged as a design or a test bug. Testing is only capable of showing the presence of bugs, not proving their absence.

Originally made by Edsger Dijkstra in the 1970s as a way of making the case for formal methods, this observation still very much holds in 2022. Verification is only as strong as its weakest link and, in the case of both simulation and emulation, the weakest link is the input sequence of the specific test case being used to validate a specific scenario.

Of course, modern simulation uses functional coverage to assess whether enough testing has been done, but there is no guarantee of exhaustive verification; in other words, no guarantee that all bugs have been flushed out by the testbench, given the way simulation and emulation work.

Formal methods: A proof-based approach for establishing bug absence

Formal methods use a proof-centric technique whereby a formal tool builds a mathematical model of the design and testbench and proves exhaustively that a given requirement, expressed as an assertion or a cover target, holds.

Formal methods can prove this because, at their root, formal tools use a combination of search-based techniques inspired by artificial intelligence (AI), applying logical inference and decision-making to a symbolic representation of the design.

Formal methods-based verification works on a symbolic model of the hardware. It automatically abstracts the exponential complexity of the design space to build a proof of an assertion or a cover property. If the proof cannot be built, the tool shows a counterexample waveform, just as in simulation, which can be used to fix the design bug. Because of the intelligent way in which the formal tool carries out its exhaustive search, corner-case bugs are often found much more quickly and, in many cases, far more of them are found than with a simulation or emulation testbench.
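To make the assertion and cover terminology concrete, here is a minimal sketch of a formal property file for a hypothetical FIFO; the module, signal and property names are illustrative assumptions rather than part of any specific methodology. A formal tool either proves each assertion exhaustively, shows a counterexample waveform, or reports whether each cover target is reachable.

// Minimal sketch of formal properties for a hypothetical FIFO.
// All names (clk, rst_n, push, pop, full, empty) are illustrative only.
module fifo_props (
  input logic clk,
  input logic rst_n,
  input logic push,
  input logic pop,
  input logic full,
  input logic empty
);

  // Assumption (environment constraint): no push into a full FIFO.
  asm_no_push_when_full: assume property (
    @(posedge clk) disable iff (!rst_n) full |-> !push
  );

  // Assertion: the FIFO must never be full and empty at the same time.
  ast_not_full_and_empty: assert property (
    @(posedge clk) disable iff (!rst_n) !(full && empty)
  );

  // Cover target: it must be possible to fill the FIFO completely.
  cov_fifo_full: cover property (
    @(posedge clk) disable iff (!rst_n) full
  );

endmodule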

For many verification problems, there is no alternative to formal methods. Examples include:

- Deadlock absence (a small liveness sketch follows this list)
- Unreachable code coverage
- Correctness of arithmetic-heavy designs such as fixed point and floating point
- Robust verification of microprocessors, especially out-of-order and superscalar ones
- Correctness of concurrent designs with deep counters, FIFOs and memory components
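Taking the deadlock-absence item as an example, a liveness property of the kind sketched below can only be established for all possible input sequences by a formal tool; no finite set of simulation tests can prove it. The request/grant handshake and its signal names are assumed purely for illustration.

// Illustrative liveness check for deadlock absence on a hypothetical
// request/grant handshake; the design and signal names are assumptions.
module handshake_props (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic gnt
);

  // Every request must eventually be granted; a proof of this property
  // rules out deadlock on the handshake for all reachable states.
  ast_no_deadlock: assert property (
    @(posedge clk) disable iff (!rst_n) req |-> s_eventually gnt
  );

endmodule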

So, formal methods have become the only choice for seasoned design verification groups for many of these problems.

Figure 2 The use of formal methods continues to be limited to extremes, either to simple problems or the end of a project cycle. Source: Axiomise

In most organizations, formal methods remain limited to solving either simple problems, such as lint early in project development, or connectivity checking on a system on chip (SoC) toward the end of the project. Yes, bugs are caught using lint and connectivity-checking apps from electronic design automation (EDA) vendors, but the bulk of functional verification is still carried out using simulation in most companies.

Beyond these niches, formal methods are not widely used. At best, a handful of formal gurus will take on one or two design blocks, with no consistency in adoption, methodology, or reuse. The true potential of formal methods in finding corner-case bugs and establishing proofs of bug absence throughout the project cycle remains vastly untapped.

The true potential for formal methods: Unlimited

In a typical project, simulation is the main technology used for verification. It is no secret that the development time of a reasonably complex universal verification methodology (UVM) testbench is of the order of several weeks or months, and this is simply to start testing a handful of interesting, directed test cases.

On the other hand, setting up formal verification is a matter of a few days for most design projects. And though the formal testbench is not entirely complete at that point, the assertions, when run in formal tools, provide a much wider scope of investigation due to the proof-centric state-space search and are able to identify bugs much more quickly and comprehensively.

Combining formal and simulation enables design verification groups to start picking up bugs in the first week of design using formal. By the time simulation testbenches are ready, formal will have flushed out the low-hanging fruit and the corner-case bugs that may have been challenging for simulation.

Figure 3 Formal verification can be deployed across a broad verification spectrum. Source: Axiomise

Formal methods can suffer from scalability issues, and this is when some of the hard work can be offloaded to simulation testbenches. The figure above illustrates how formal methods, when deployed on a project, can find bugs from the first five to the last five.

When combined with simulation, the two technologies provide a very powerful framework for bug hunting and establishing mathematical proof of bug absence, which is much needed for functional safety and security. By combining formal and simulation, design verification groups can continue to run assertions and cover properties in formal and reuse them in simulation environments. It is necessary to regress formal properties in simulation environments to validate them on interfaces, so any incorrect assumptions made in formal can be ruled out.
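As a sketch of this reuse, the same property module can be attached to the design with a SystemVerilog bind and then run both as proof targets in the formal tool and as runtime checkers in the UVM simulation regression. The arbiter design unit and its port names below are hypothetical.

// Hypothetical reuse of the handshake_props checker from the earlier sketch:
// the same bind works in both the formal and the simulation environments.
bind arbiter handshake_props u_handshake_props (
  .clk   (clk),
  .rst_n (rst_n),
  .req   (req),
  .gnt   (gnt)
);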

When simulation testbenches start running out of capacity on problems such as X-checking, CDC, register checking, connectivity, deadlocks and unreachable code-coverage analysis, formal apps can kick in typically toward the end of the project lifecycle. At this stage, design verification groups pick up their last five bugs by harnessing the power of automated formal apps.

Methodology: Making formal efficient, predictable and scalable

How does a design verification group run formal on large-scale project designs with 10- to 100-million flops? What is the secret needed to sign off those design blocks with formal? The short answer is a great methodology.

To enable formal property checking to find bugs faster and not suffer from tool capacity issues, some fundamental aspects of formal methodology must be mastered. Often, if the right question is posed, the tool can converge much faster without running into the notorious state-space explosion problem.

An abstraction-driven methodology provides a great way of addressing design complexity issues, and an agile flow is ideal for deploying formal verification on a project in a way that maximizes its true potential.
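One common abstraction, sketched below under the assumption of a parameterized FIFO design, is to formally verify a reduced configuration whose control logic is identical to the full-size design; the module and parameter names are hypothetical, and the reduction itself must be justified as part of the overall proof argument.

// Illustrative formal top that instantiates a hypothetical parameterized FIFO
// in a reduced configuration to keep the state space tractable for proofs.
module fifo_formal_top (
  input logic       clk,
  input logic       rst_n,
  input logic       push,
  input logic       pop,
  input logic [3:0] din
);
  logic [3:0] dout;
  logic       full, empty;

  // A 16-deep, 4-bit configuration exercises the same control logic as a
  // 1024-deep, 64-bit configuration, but with far fewer state bits.
  fifo #(.DEPTH(16), .WIDTH(4)) dut (
    .clk(clk), .rst_n(rst_n), .push(push), .pop(pop),
    .din(din), .dout(dout), .full(full), .empty(empty)
  );
endmodule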

Figure 4 Design complexity can be reduced by implementing an agile formal verification flow. Source: Axiomise

Formal verification should be used as soon as the first line of design is written. Below are the five steps needed to adopt formal successfully in a project captured as an ADEPT flow for formal verification.

  1. Avoid: Use a formal verification tool in the first hour of design development. As soon as the first line of code is written, verification engineers should run a formal tool to compile and elaborate (synthesize) it. They should check for low-hanging fruit such as dangling wires, mismatched signal types, undriven wires, reset/non-reset bugs, dead code and redundant code, and obtain a code coverage report (a small illustrative fragment follows this list). Yes, most formal tools can provide this information within seconds of running and identify the first defects. There is no point in building dead code into modules now, letting a simulation testbench find it several months later, and losing time to close code coverage.
  2. Detect: Use formal in an agile manner. By following the first step, every iteration of design change will catch low-hanging bugs. However, engineers can go a step further and bring in interface constraints to start finding issues related to X-propagation, deadlocks, FSM reachability and livelocks. Most of this analysis is automated in most tools (though not all), and engineers can start catching deadlocks and X-issues within minutes.
  3. Erase: By deploying formal in an agile manner using the automated capabilities of the formal tool, engineers can build design units that have no structural design defects (and no code coverage issues). Designs can then be handed over for functional verification to the verification groups, who can deploy assertion-based verification (ABV), also known as property verification, to scrub the bugs away (erase).
  4. Prove: Building proofs of bug absence in many cases happens naturally in a formal tool. When the formal tool cannot find bugs, it builds proof that the design will always work correctly against the specified requirements formalized as assertions and cover properties. However, in many cases, formal tools cannot automatically build a proof in a reasonable amount of time and run out of capacity. This task can be made predictable by following a structured process and methodology: deploying proof-centric techniques, leveraged by domain expertise for the given hardware design, enables exhaustive proofs as well as bug hunting at deeper bounds of the search space.
  5. Tape out: When individual design blocks have been formally verified for bug absence, they are stitched together into a bigger SoC toward a full chip. This is where formal can and should be used to validate functional interfaces between different design units using formal protocol checkers, identify clock-domain crossing and reset issues, validate registers, and check for Xs. As register transfer level (RTL) code gets synthesized to the gate-level netlist, RTL-to-gate-level equivalence checking can catch any issues introduced there as well.
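As an illustration of the kind of structural issues the Avoid step flags within seconds, the hypothetical fragment below contains an unused input, a dangling wire and a dead default branch; all of them would show up in the elaboration and code coverage reports long before any simulation testbench exists.

// Hypothetical RTL fragment with issues a formal lint/elaboration pass flags:
// an unused input, an undriven and unused wire, and an unreachable branch.
module decoder (
  input  logic       clk,   // unused input: flagged as dangling
  input  logic [1:0] sel,
  output logic [3:0] onehot
);
  logic spare;               // undriven and unused: flagged as dead logic

  always_comb begin
    case (sel)
      2'b00:   onehot = 4'b0001;
      2'b01:   onehot = 4'b0010;
      2'b10:   onehot = 4'b0100;
      2'b11:   onehot = 4'b1000;
      default: onehot = 4'b0000;  // unreachable: all values of sel are covered
    endcase
  end
endmodule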

The beauty of doing formal in an agile manner is that as engineers iterate through design changes, they pick up bugs in each iteration by following the flow. For example, to find deadlocks in a design, an engineer doesn't have to wait until the code is complete.

Coverage and sign-off with formal verification

One common theme in using formal in this agile manner is that at every step along the way, from the first hour of design development to full-chip testing, coverage information is available from the tools. This provides the much-needed view on code coverage essential for sign-off.

However, just as code coverage is not enough for sign-off with simulation, it's not enough for sign-off with formal either. In fact, every formal verification project should use a blend of techniques for ascertaining quality, ensuring that reliability doesn't take a hit with formal. This is where six-dimensional coverage comes in handy.

When is one done with formal verification? How much verification is enough? Some of these questions are answered in a six-dimensional coverage solution.

Figure 5 Six dimensions form a complete formal verification coverage flow. Source: Axiomise

The six dimensions enforce qualitative and quantitative perspectives on verification work. Everything starts from requirements that pave the way to develop a formal list of assertions and coverage targets.

The first dimension is qualitative and is driven by asking: what is the verification group going to verify, and what would be the coverage goals for sign-off? It's derived from requirements and formalized as target assertions and cover properties, which are then executed in the formal tool.

The second dimension determines how many assertions and covers were proven (Pass or Fail) in the formal tool. This quantity is the assertion coverage that provides a quantitative view of how much was achieved from a verification run. At this stage, a verification group doesn’t know whether checkers (asserts and covers) have done anything useful or exercised all the design code.

The third dimension provides insights into checker completeness. It’s a qualitative dimension that addresses the quality of checkers by conducting a thorough review of the testbench as well as by inserting some new bugs to test whether checkers are picking up these artificially-introduced bugs.

Often, with a good design and no naturally occurring bugs, formal tools build proofs and engineers are left wondering why no bugs were found. To address this, hand-inserted bugs provide a quick way of determining whether the checkers are sensible. This process also helps uncover any over-constraints that may have prevented legal, valid stimulus from being injected into the formal testbench. This is the fourth dimension, called over-constraint analysis, a much-needed qualitative assessment to uncover potentially critical testbench bugs. It paves the way for discovering design bugs previously hidden by the over-constraints.

The fifth dimension is a tool-driven automated way of discovering what aspect of design code was exercised and reached during property verification. It’s a push-button tool feature that is useful to uncover blind spots that don’t have checkers or constraints blocking code reachability. Standard code coverage issues are discovered at this stage.

The sixth dimension, called scenario coverage, provides a functional coverage-like quantitative view of how well the verification addressed known scenarios of functional execution. The design verification group populates a spreadsheet of desired scenarios, and the tool automatically synthesizes them into assertion and cover targets for those specific scenarios and tests them in the formal tools. It provides a visual, metric-oriented view of the cover properties and the proofs of those specific scenarios, showing that they were verified beyond doubt because each scenario was exercised exhaustively.
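As a sketch of how a scenario row might be formalized, assuming the FIFO signals from the earlier sketches, a scenario such as "two back-to-back pushes followed, some cycles later, by a pop while the FIFO is full" could be expressed as a cover property; the formal tool then produces a witness trace showing the scenario is reachable under all the environment constraints.

// Hypothetical scenario checker derived from one row of a scenario spreadsheet.
module fifo_scenario_covers (
  input logic clk,
  input logic rst_n,
  input logic push,
  input logic pop,
  input logic full
);
  // Two back-to-back pushes followed, any number of cycles later,
  // by a pop while the FIFO reports full.
  cov_b2b_push_then_pop_full: cover property (
    @(posedge clk) disable iff (!rst_n)
    push ##1 push ##[1:$] (pop && full)
  );
endmodule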

A practical avenue for IC design verification

Formal methods are a powerful technology with strong foundations in mathematical logic. Today, design verification groups are using them, though not in a consistent manner. This article provides insights into how formal can be applied in a practical manner for design verification.

This article was originally published on EDN.

Dr. Ashish Darbari, founder and CEO of Axiomise, has expertise in all aspects of formal methods, including theorem proving, property checking and equivalence checking.

 
