Logical equivalence checking (LEC) is an important phase in the IC design process in which the design is verified without the need to write test cases.
Designing a chip is a complex process. It starts with defining the architectural requirements, followed by microarchitecture development, RTL design, and functional verification. The design is then synthesized to obtain a netlist, which is handed off to the back-end team. In the back-end flow, the netlist goes through changes at various stages, be it an engineering change order (ECO) performed after a bug is identified, or modification of cells for timing closure. In a nutshell, the design is subjected to several changes before it reaches the final tapeout stage.
To validate the design after each change, it would be necessary to rerun simulations with test cases, but this is often impractical, both because complex designs require long runtimes and because the design changes frequently. Under such conditions, the design is subjected to logical equivalence checking (LEC), where the tool validates the design by injecting random vectors.
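The core idea of random-vector equivalence checking can be sketched in a few lines of Python. This is a conceptual illustration only, not how Conformal works internally; the mux cones below are hypothetical examples.

```python
import random

# Hypothetical "golden" logic cone: a 2:1 mux written as RTL-style Boolean logic.
def golden_mux(a, b, c):
    return (a & b) | ((a ^ 1) & c)

# Hypothetical "revised" cone: the same mux restructured into NAND gates,
# the way a synthesis tool might implement it.
def nand(x, y):
    return 1 - (x & y)

def revised_mux(a, b, c):
    return nand(nand(a, b), nand(1 - a, c))

def lec_random_vectors(gold, rev, n_inputs, trials=1000):
    """Inject random input vectors and flag any mismatch between the cones."""
    for _ in range(trials):
        vec = [random.randint(0, 1) for _ in range(n_inputs)]
        if gold(*vec) != rev(*vec):
            return False  # non-equivalent: a distinguishing vector was found
    return True  # no mismatch found: the cones are almost certainly equivalent

print(lec_random_vectors(golden_mux, revised_mux, 3))  # True
```

For cones this small the sweep is effectively exhaustive; on real designs the tool combines vector simulation with formal proof engines.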
There are numerous tools available in the industry to check logical equivalence, but the most widely used ones are Conformal from Cadence and Formality from Synopsys. Apart from LEC, these tools can also be used for other tasks, such as implementing ECOs. In this article, we will go through the Conformal LEC flow.
Figure 1 A typical Conformal LEC flow comprises a setup mode and an LEC mode.
A typical Conformal LEC flat run flow mainly consists of a setup mode followed by an LEC mode. The setup mode consists of the following steps: specifying blackboxes, reading the VHDL/Verilog designs and libraries, specifying design constraints, and specifying modeling directives.
The LEC mode consists of the mapping and compare processes, entered by switching the system mode.
1. Specification of blackbox
In a typical design, there will be analog blocks, memories, and digital blocks. Memories and analog blocks are provided by other teams and are to be treated as blackboxes when running LEC. If there are IPs present in the design, their verification is performed by the IP provider, so the user is not required to spend time verifying them; IPs can therefore also be treated as blackboxes.
A blackbox can be specified with the command shown on line 25 in Figure 2 below. This must be done before reading the design files/library so that only the I/O ports are read and not the entire design. Doing so cuts runtime significantly and is of great assistance when running LEC on large designs. When a block is specified as a blackbox, the Conformal tool verifies the connections, that is, the inputs and outputs of the blackbox, but it does not verify the logic inside it.
Figure 2 Blackbox can be specified with the command shown in this image.
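Conceptually, a blackboxed block behaves like an uninterpreted function: the checker verifies only that both designs drive and consume it identically, no matter what logic is inside. A rough Python sketch of this idea follows; all top-level and blackbox names are hypothetical.

```python
import random

# Hypothetical top-levels that instantiate the same blackbox 'bb'.
# Only the wiring style around the blackbox differs, not its connections.
def golden_top(a, b, c, bb):
    return bb(a & b, c) ^ a

def revised_top(a, b, c, bb):
    return bb(b & a, c) ^ a  # commuted AND: same blackbox connections

def check_around_blackbox(gold, rev, trials=200):
    """Check equivalence for many random blackbox behaviors: if the
    surrounding wiring matches, the result holds regardless of what
    is inside the box."""
    for _ in range(trials):
        table = [random.randint(0, 1) for _ in range(4)]  # random 2-input box
        bb = lambda x, y: table[(x << 1) | y]
        a, b, c = (random.randint(0, 1) for _ in range(3))
        if gold(a, b, c, bb) != rev(a, b, c, bb):
            return False
    return True

print(check_around_blackbox(golden_top, revised_top))  # True
```

Because the two tops feed the box identical values on identical pins, the check passes for every possible blackbox behavior, which is exactly why the tool can skip the internals.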
2. Reading VHDL/Verilog designs and libraries
Apart from reading Verilog and VHDL design files, the Conformal tool also supports reading Verilog standard simulation libraries and Liberty format libraries. This information is provided to the tool using the commands indicated on lines 37 and 40 in Figure 3 below. A search path and file list can also be passed to the tool using the appropriate commands.
Figure 3 The command shown above is for reading design files.
After the design is read in, the built-in Hardware Description Language (HDL) rule check plug-in in Conformal can be used to check for lint errors and warnings. These errors and warnings can be reported based on severity, and waivers can be added. Use the commands below to report error and warning messages in the golden and revised designs:
report rule check -design -error -warning -golden -verbose > rpt.rule.golden
report rule check -design -error -warning -revised -verbose > rpt.rule.revised
3. Specification of design constraints
After design-for-test (DFT) insertion, certain ports/pins such as scan_in, scan_out, scan_mode, and scan_enable are added to the design for test and debug purposes. When comparing the RTL against the DFT-inserted RTL, these additional ports would make the designs non-equivalent. So, it is necessary to put both the golden and revised designs in a common functional mode. This can be achieved by adding constraints to the design.
For example, in Figure 4 below, the flop on the left is a regular D flip-flop in the golden design, while the flop on the right is a scan-enabled D flip-flop in the revised design after scan insertion. The D input in the golden design is a function of Din, while the D input in the revised design is a function of Din, scan_in, and scan_enable. So, when the tool passes vectors to both flops, they will be flagged as non-equivalent. But in functional mode, scan_enable is always zero. So, if a constraint is added to tie scan_enable to 1'b0, the golden and revised designs will be equivalent in functional mode.
add pin constraints 0 scan_enable -revised
Figure 4 This is how golden DFF vs. revised DFF looks after scan insertion.
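The effect of the pin constraint can be seen in a small Python sketch of the two D inputs. The signal names are taken from the example above; the functions themselves are illustrative, not tool output.

```python
from itertools import product

# D input of the golden flop: just Din (scan pins do not exist in golden RTL,
# so they are accepted and ignored here to keep the signatures identical).
def golden_d(din, scan_in, scan_enable):
    return din

# D input of the revised flop after scan insertion: a mux on scan_enable.
def revised_d(din, scan_in, scan_enable):
    return scan_in if scan_enable else din

# Unconstrained: the tool sweeps all pins, including scan_enable = 1.
unconstrained = all(
    golden_d(d, s, e) == revised_d(d, s, e)
    for d, s, e in product((0, 1), repeat=3)
)

# Constrained: scan_enable tied to 1'b0, i.e. functional mode only.
constrained = all(
    golden_d(d, s, 0) == revised_d(d, s, 0)
    for d, s in product((0, 1), repeat=2)
)

print(unconstrained, constrained)  # False True
```

The unconstrained sweep finds the distinguishing vector (scan_enable = 1, scan_in != Din); with the constraint applied, every remaining vector agrees.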
4. Specification of modeling directives
After synthesis, the netlist output from the synthesis tool will be power optimized. Any clock-gating logic in the golden design will have been converted to latch-based clock gating, as indicated on the revised side of Figure 5 below. Many such optimizations are performed by different tools; it is necessary to review them carefully and add modeling directives so that both designs match. The command below adds the modeling directive for clock-gating optimization:
set flatten model -gated_clock
Figure 5 This is how golden DFF vs. revised DFF looks after power optimization.
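What the directive tells the tool to treat as equivalent can be illustrated with a small cycle-level simulation in Python. This is a conceptual sketch with hypothetical signal names, not the tool's internal model.

```python
import random

def simulate(cycles):
    """Compare an enable-mux flop (golden) against a latch-based
    gated-clock flop (revised) over random per-cycle stimulus."""
    q_gold = q_rev = 0
    for _ in range(cycles):
        en = random.randint(0, 1)
        din = random.randint(0, 1)
        # Golden: recirculating mux flop, clocked every cycle.
        q_gold = din if en else q_gold
        # Revised: integrated clock-gating (ICG) cell. The latch captures
        # 'en' while the clock is low, so the gated clock pulses this
        # cycle only when the latched enable is 1.
        latched_en = en
        if latched_en:          # gated-clock edge occurs
            q_rev = din
        if q_gold != q_rev:
            return False        # state mismatch: flops diverged
    return True

print(simulate(1000))  # True
```

With per-cycle stable stimulus, the two structures hold identical state on every cycle, which is why the directive lets the tool remodel one as the other instead of reporting a mismatch.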
5. Switch to LEC mode
After all the setup constraints and modeling directives have been added, the tool must be switched to LEC mode for key-point mapping and comparison. This is done with the following command:
set system mode lec
6. Mapping process
When the tool is set to LEC mode, it automatically maps key points such as primary inputs (PI), primary outputs (PO), DFFs, D-latches, blackboxes, Z-gates, and cut gates. First, the tool uses name-based mapping, followed by function-based mapping. As the name indicates, name-based mapping matches key points based on the names of nets/variables in the golden and revised designs. In function-based mapping, the tool analyzes the input logic cone of a key point and maps based on it. Name-based mapping takes less runtime than function-based mapping, hence the tool tries name-based mapping first before switching to function-based mapping.
It is imperative that all key points be mapped, but when running LEC for the first time, some key points typically will not be. This is because of optimizations done by the different tools used in the IC design process. For example, depending on its settings, the synthesis tool may rename the register /dma_reg in RTL to /dma_0_reg in the netlist. Name-based mapping will not be able to match this key point, while function-based mapping may or may not match it depending on the complexity of the logic cone. Under these conditions, a renaming rule should be provided to the tool so that these key points can be mapped. A renaming rule of the following form resolves such unmapped key points:
add renaming rule REG1 "%d_reg[%d]" "reg[@1][@2]" -revised
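The two mapping passes can be mimicked in Python: map by name first, then map the leftovers by comparing the truth tables of their logic cones. The register names and cones below are hypothetical, and Conformal's real mapping is far more sophisticated than this sketch.

```python
from itertools import product

# Hypothetical key points and their 2-input logic cones.
golden = {"/ctl_reg": lambda a, b: a | b, "/dma_reg": lambda a, b: a & b}
revised = {"/ctl_reg": lambda a, b: a | b, "/dma_0_reg": lambda a, b: b & a}

def truth_table(cone):
    """Exhaustive output vector of a small cone: its functional signature."""
    return tuple(cone(a, b) for a, b in product((0, 1), repeat=2))

mapping = {}

# Pass 1: name-based mapping (cheap).
for name in golden:
    if name in revised:
        mapping[name] = name

# Pass 2: function-based mapping on whatever is left (more expensive).
for g_name, g_cone in golden.items():
    if g_name in mapping:
        continue
    for r_name, r_cone in revised.items():
        if r_name not in mapping.values() and truth_table(g_cone) == truth_table(r_cone):
            mapping[g_name] = r_name
            break

print(mapping)  # {'/ctl_reg': '/ctl_reg', '/dma_reg': '/dma_0_reg'}
```

Here /ctl_reg maps by name, while the renamed /dma_reg only maps once its cone's function is compared, mirroring why a renaming rule is the cheaper fix for systematic renames.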
7. Compare process
Comparison is done only on mapped key points, so, as mentioned before, it is important that all key points be mapped. The tool tries to prove equivalence by passing combinations of inputs to the logic cone and checking the output behavior. After the comparison process, the tool categorizes key points as equivalent, inverted-equivalent, non-equivalent, or abort points. Every category apart from equivalent should be debugged. If all the compare points come out as equivalent, then we can conclude that the golden and revised designs match.
The comparison process can be invoked in LEC mode using the following commands, and a message like the one shown in Figure 6 will be displayed after comparison:
add compare points -all
compare
Figure 6 The shown LEC log summary is of a non-equivalent golden and revised design.
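The first three of those categories can be reproduced conceptually with an exhaustive sweep over a cone's inputs (abort points arise from proof-engine limits and are not modeled here). A Python sketch with hypothetical cones:

```python
from itertools import product

def classify(gold, rev, n_inputs):
    """Exhaustively compare two small logic cones and categorize the
    compare point the way an LEC report would."""
    pairs = [(gold(*v), rev(*v)) for v in product((0, 1), repeat=n_inputs)]
    if all(g == r for g, r in pairs):
        return "equivalent"
    if all(g != r for g, r in pairs):
        return "inverted-equivalent"
    return "non-equivalent"

print(classify(lambda a, b: a & b, lambda a, b: b & a, 2))        # equivalent
print(classify(lambda a, b: a & b, lambda a, b: 1 - (a & b), 2))  # inverted-equivalent
print(classify(lambda a, b: a & b, lambda a, b: a | b, 2))        # non-equivalent
```

An inverted-equivalent point often signals a polarity difference (for example, an optimized inverter pushed across a register boundary) rather than a functional bug, which is why it is reported separately from non-equivalent.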
As discussed above, running regressions and other complex simulations every time the design changes is not an efficient way to validate it. Using an LEC tool under such circumstances validates the design with much less runtime. Following the above-mentioned steps to perform LEC with Cadence Conformal will simplify the overall process and reduce debug time by a significant margin.
Disclaimer: The author does not have any association with Cadence Design Systems or Synopsys. Any specific product reference does not constitute an endorsement or recommendation. The views or opinions expressed by the author do not reflect the views of the author’s employer, Marvell Technology Inc.
This article was originally published on EDN.
Deekshith Krishnegowda is an IC design engineer at Marvell Technology’s Santa Clara office.