This document explains the recommended checklist items to review when transitioning from one Development Stage to another, for design, verification, and software device interface function (DIF) stages. It is expected that the items in each stage (D1, V1, S1, etc.) are completed.
For a transition from D0 to D1, the following items are expected to be completed.
Specification mostly (90%) complete, all features are defined. Specification is submitted into the repo as a markdown document. It is acceptable to make changes for further clarification or more details after the D1 stage.
The CSRs required to implement the primary programming model are defined. The Hjson file defining the CSRs is checked into the repository. It is acceptable to add or modify registers during the D2 stage in order to complete implementation.
Clock(s)/Reset(s) connected to all sub-modules.
The IP's top-level .sv file exists and meets comportability requirements.
Unit is able to be instantiated and bound in top level RTL. The design must compile and elaborate cleanly without errors. The unit must not break top level functionality such as propagating X through TL-UL interface, continuously asserting the interrupts, or creating undesired TL-UL transactions.
All expected memories identified, representative macros instantiated. All other physical elements (analog components, pads, etc) are identified and represented with a behavioral model. It is acceptable to make changes to these physical macros after the D1 stage as long as they do not have a large impact on the expected resulting area (roughly “80% accurate”).
Mainline functional path is implemented to allow for a basic functionality test by verification. (“Feature complete” is the target for D2 status.)
All the outputs of the IP have ASSERT_KNOWN assertions.
Lint run setup, compiles and runs. It is acceptable to have lint errors and warnings at this stage.
Any new features added since D1 are documented and reviewed with DV/SW/FPGA.
The GitHub Issue, Pull Request, or RFC where the feature was discussed should be linked in the Notes section.
Block diagrams updated.
Documented non-registered block interfaces.
Documented the missing functionalities.
Feature requests for this IP version are frozen at this time.
All features specified have been completed.
Area check completed either on FPGA or on Design Compiler.
100% of ports defined. Port list is frozen.
100% of architectural state exists (RAMs, CSRs, etc.).
Review and sign off TODOs.
Conforming to style guide regarding X usage.
Lint passes. Waiver reviewed.
CDC run set up. No must-fix errors; waiver file created.
Block is synthesized as part of continuous integration checks and meets timing there.
CDC sync flops use behavioral synchronization macros (e.g. prim_flop_2sync) rather than ad-hoc two-flop synchronizers.
Any appropriate security counter-measures documented and implemented. For redundantly encoded FSMs, the sparse-fsm-encode.py script must be used to generate the encoding.
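The intent of redundant sparse FSM encoding can be sketched in plain C (the state values and names below are hypothetical; real SystemVerilog encodings must come from the sparse-fsm-encode.py script): state values are chosen with a large pairwise Hamming distance, so that flipping a small number of bits can never turn one valid state into another valid state.

```c
#include <stdint.h>

/* Illustrative sketch only: sparse 6-bit state encodings chosen so that
 * any two states differ in at least 3 bits (values are hypothetical). */
enum { kStIdle = 0x07, kStBusy = 0x19, kStDone = 0x2A };

/* Hamming distance between two 8-bit values. */
int hamming8(uint8_t a, uint8_t b) {
  uint8_t x = a ^ b;
  int n = 0;
  while (x) { n += x & 1u; x >>= 1; }
  return n;
}

/* Minimum pairwise Hamming distance of the three encodings. With a
 * minimum distance of 3 or more, a single-bit fault always lands on an
 * invalid (detectable) state value. */
int min_state_distance(void) {
  uint8_t s[3] = {kStIdle, kStBusy, kStDone};
  int min = 8;
  for (int i = 0; i < 3; i++)
    for (int j = i + 1; j < 3; j++) {
      int d = hamming8(s[i], s[j]);
      if (d < min) min = d;
    }
  return min;
}
```

A default FSM branch that traps any state value outside this sparse set then provides the detection path for the countermeasure.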
A review of sensitive security-critical storage flops was completed. Where appropriate, non-reset flops are used to store secure material.
Shadow registers are implemented for all appropriate storage of critical control functions.
Compile-time random netlist constants (such as LFSR seeds or scrambling constants) are exposed to topgen via the randtype parameter mechanism in the comportable IP Hjson file.
Default random seeds and permutations for LFSRs can be generated with the gen-lfsr-seed.py script.
See also the related GitHub issue #2229.
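Why default seeds need care can be illustrated with a minimal Galois LFSR in C (a hedged sketch; the polynomial and seed below are textbook examples, not values produced by gen-lfsr-seed.py):

```c
#include <stdint.h>

/* Illustrative 8-bit Galois LFSR for the maximal polynomial
 * x^8 + x^6 + x^5 + x^4 + 1 (tap mask 0xB8). */
uint8_t lfsr_step(uint8_t state) {
  uint8_t lsb = state & 1u;
  state >>= 1;
  if (lsb) state ^= 0xB8u;
  return state;
}

/* Number of steps until the LFSR returns to its seed. For a maximal
 * polynomial this is 255 for any nonzero seed; the all-zero state is a
 * fixed point, which is why a default seed must never be zero. */
int lfsr_period(uint8_t seed) {
  uint8_t s = seed;
  int n = 0;
  do { s = lfsr_step(s); n++; } while (s != seed && n < 1000);
  return n;
}
```

The generated seeds and output permutations serve the same purpose at netlist scale: every instance starts from a valid, instance-specific point in the sequence.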
Any approved new features since D2 are documented and reviewed with DV/SW/FPGA.
Resolve all TODOs.
Lint clean. Lint waiver file reviewed and signed off by the technical steering committee.
CDC clean. CDC waiver file reviewed and signed off by the technical steering committee.
Hold Design Review: Hold an RTL smoke check review by an independent designer.
Hold Design Review: Sign off deleted flops list (one last time).
Review Design Change with SW: Review CSRs
Review Design Change with SW: Review Fatal Errors
Review Design Change with SW: Review other SW visible changes
Review Design Change with SW: Review known “Won’t Fix” bugs and “Errata”.
To transition from V0 to V1, the following items are expected to be completed. Prefix “SIM” is applicable for the simulation-based DV approach, whereas the prefix “FPV” is applicable for the formal property-based verification approach.
DV document drafted, indicating the overall DV goals, strategy, the testbench environment details with diagram(s) depicting the flow of data, UVCs, checkers, scoreboard, interfaces, assertions and the rationale for the chosen functional coverage plan. Details may be missing since most of these items are not expected to be fully understood at this stage.
Testplan completely written (in HJson format) indicating:
- Testpoints (a list of planned tests), each mapping to a design feature, with a description highlighting the goal of the test and optionally, the stimulus and the checking procedure.
- The functional coverage plan captured as a list of covergroups, with a description highlighting what feature is expected to be covered by each covergroup. It may optionally contain additional details such as coverpoints and crosses of individual aspects of the said feature that are covered.
Top level testbench with DUT instantiated and the following interfaces hooked up (as applicable): TileLink, clocks and resets, interrupts, and major DUT interfaces. Not all interfaces need to be hooked up at this point. Inputs for which interfaces have not yet been created are tied off to 0.
All available interface assertion monitors hooked up (example: tlul_assert).
UVM environment created with major interface agents and UVCs connected and instantiated. TLM port connections made from UVC monitors to the scoreboard.
RAL model is generated by using regtool and instantiated in UVM environment.
CSR check is generated by using regtool and bound in the TB environment.
Full testbench automation completed if applicable. This may be required for verifying multiple flavors of parameterized DUT designs.
Smoke test exercising a basic functionality of a major DUT datapath passing. What functionality to test and to what level may be driven by higher level (example: chip) integration requirements. These are captured when the testplan is reviewed with the key stakeholders, and the test(s) updated as necessary.
CSR test suite added for ALL interfaces (including, but not limited to, the DUT’s SW device access port, JTAG access port, etc., whichever are applicable) that have access to the system memory map:
- HW reset test (test all resets)
- CSR read/write
- Bit Bash
Memory test suite added for ALL interfaces that have access to the system memory map, if the DUT has memories:
- Mem walk
Ensure all these tests verify back-to-back access with zero delays, along with partial reads and partial writes.
Each input and each output of the module is part of at least one assertion. Assertions for main functional path are implemented and proven.
The smoke regression passes cleanly (with no warnings) with one additional tool apart from the primary tool selected for signoff.
Smoke regression set up for code health. A small suite of tests identified as the smoke regression suite. If the testbench has more than one build configurations, then each configuration has at least one test added to the smoke regression suite.
Nightly regression for running all constrained-random tests with multiple random seeds (iterations) set up. Directed, non-random tests need not be run with multiple iterations. Selecting the number of iterations depends on the coverage, the mean time between failures, and the available compute resources. For starters, it is recommended to set the number of iterations to 100 for each test. It may be trimmed down once the test has stabilized and the same level of coverage is achieved with fewer iterations. The nightly regression should finish overnight so that the results are available the next morning for triage.
FPV regression set up by adding the module to
Structural coverage collection model checked in.
This is a simulator-specific file (i.e. proprietary format) that captures which hierarchies and which types of coverage are collected.
For example, pre-verified sub-modules (including some prim components pre-verified thoroughly with FPV) can be black-boxed; it is sufficient to enable only the IO toggle coverage of their ports.
Functional coverage shell object created - this may not contain coverpoints or covergroups yet, but it is primed for development in post-V1.
- For DV testbench, an entry is expected to be added at
- For FPV testbench, an entry is expected to be added at
Sub-modules that are pre-verified with their own testbenches have already reached V1 or higher stage.
Design / micro-architecture specification reviewed and signed off. If a product requirements document (PRD) exists, then ensure that the design specification meets the product requirements.
Draft DV document (proposed testbench architecture) & the complete testplan reviewed with key stakeholders (as applicable):
- DUT designer(s)
- 1-2 peer DV engineers
- Software engineer (DIF developer)
- Chip architect / design lead
- Chip DV lead
- Security architect
Following categories of post-V1 tests focused on during the testplan review (as applicable):
- Security / leakage
- Error scenarios
V2 checklist reviewed to understand scope and estimate effort.
To transition from V1 to V2, the following items are expected to be completed.
It is possible for the design to have undergone some changes since the DV document and testplan were reviewed in the V0 stage. All design deltas are captured adequately and appropriately in the DV document and the testplan.
DV document is fully written.
Functional coverage plan fully implemented. All covergroups created and sampled in the reactive components of the testbench (passive interfaces, monitors, and scoreboards).
For simulations, interfaces are connected to all sidebands and exercised. For FPV, assertions are added for all interfaces, including sidebands.
All planned assertions written and enabled.
UVM environment fully developed with end-to-end checks in scoreboard enabled.
All tests in the testplan written and passing with at least one random seed.
All assertions implemented and proven: 90%. Each output of the module has at least one forward and one backward assertion check. FPV module converges within reasonable runtime.
All assumptions implemented and reviewed.
For chip-level, verified pieces of FW code (DIFs) in simulation.
Nightly regression with multiple random seeds passing: 90%
Code coverage requirements: line, toggle, fsm (state & transition), branch, assertion: 90%
Functional coverage requirements: 70%
Code coverage requirements: branch, statement, functional: 90%
COI coverage requirements: 75%
Lint for the testbench passes. Waiver reviewed.
Sub-modules that are pre-verified with their own testbenches have already reached V2 or higher stage.
All high priority (tagged P0 and P1) design bugs addressed and closed. If the bugs were found elsewhere, ensure that they are reproduced deterministically in DV (through additional tests or by tweaking existing tests as needed) and have the design fixes adequately verified.
All low priority (tagged P2 and P3) design bugs have been root-caused. They may be deferred to post V2 for closure.
Fully written DV document & the complete testplan reviewed with key stakeholders (as applicable):
- DUT designer(s)
- 1-2 peer DV engineers
- Chip architect / design lead
- Chip DV lead
- Security architect
This review will focus on the design deltas captured in the testplan since the last review. In addition, the fully implemented functional coverage plan, the observed coverage, and the coverage exclusions are expected to be scrutinized to ensure there are no verification holes or gaps in achieving the required stimulus quality before the work towards progressing to V3 can commence.
V3 checklist reviewed to understand scope and estimate effort.
To transition from V2 to V3, the following items are expected to be completed. Prefix “SIM” is applicable for the simulation-based DV approach only, while “FPV” is for the FPV approach only.
Although rare, it is still possible for the design to have undergone some more last-minute changes. All additional design deltas captured adequately and appropriately in the DV document and the testplan.
X Propagation Analysis complete
All assertions implemented and proven: 100%. There are no undetermined or unreachable properties.
Nightly regression with multiple random seeds passing: 100% (with 1 week minimum soak time)
Code coverage requirements: line, toggle, fsm (state & transition), branch, assertion: 100%
Functional coverage requirements: 100%
Code coverage requirements: branch, statement, functional: 100%
COI coverage requirements: 100%
There are no TODO items in the entire testbench code, including common components and UVCs.
There are no compile-time or run-time warnings thrown by the simulator.
Lint for the testbench is clean. Lint waiver file reviewed and signed off by the technical steering committee.
Sub-modules that are pre-verified with their own testbenches have already reached V3 stage.
All design and testbench bugs addressed and closed.
For a transition from S0 to S1, the following items are expected to be completed.
dif_<ip>.h and, optionally, dif_<ip>.c exist in
All existing non-production code in the tree which uses the device does so via the DIF or a production driver.
Software unit tests exist for the DIF in
Smoke tests exist for the DIF in
This should perform a basic test of the main datapath of the hardware module by the embedded core, via the DIF, and should be able to be run on all OpenTitan platforms (including FPGA, simulation, and DV). This test will be shared with DV.
Smoke tests are for diagnosing major issues in both software and hardware, and with this in mind, they should execute quickly.
Initially we expect this kind of test to be written by hardware designers for debugging issues during module development.
This happens long before a DIF is implemented, so there are no requirements on how these should work, though we suggest they are placed in sw/device/tests/<ip>/<ip>.c as this has been the convention until now.
Later, when a DIF is written, the DIF author is responsible for updating this test to use the DIF, and for moving this test into the aforementioned location.
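The shape such a DIF-based smoke test might take can be sketched as follows; the dif_myip_* names, the ENABLE bit, and the single-register model are hypothetical placeholders, not a real OpenTitan API:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical DIF for an imaginary "myip" block. */
typedef struct { volatile uint32_t *base; } dif_myip_t;
typedef enum { kDifMyipOk = 0, kDifMyipBadArg = 1 } dif_myip_result_t;

dif_myip_result_t dif_myip_init(dif_myip_t *myip, volatile uint32_t *base) {
  if (myip == NULL || base == NULL) return kDifMyipBadArg;
  myip->base = base;
  return kDifMyipOk;
}

dif_myip_result_t dif_myip_enable(dif_myip_t *myip) {
  if (myip == NULL) return kDifMyipBadArg;
  *myip->base |= 1u;  /* set the hypothetical ENABLE bit of CTRL */
  return kDifMyipOk;
}

/* The smoke test itself: bring the block up via the DIF and check one
 * observable effect on the main datapath, quickly. */
bool myip_smoke_test(volatile uint32_t *ctrl_reg) {
  dif_myip_t myip;
  if (dif_myip_init(&myip, ctrl_reg) != kDifMyipOk) return false;
  if (dif_myip_enable(&myip) != kDifMyipOk) return false;
  return (*ctrl_reg & 1u) != 0;
}
```

Because the test touches only the DIF and one observable datapath effect, the same source can run on FPGA, simulation, and DV without modification.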
For a transition from S1 to S2, the following items are expected to be completed.
DIF has functions to cover all specified hardware functionality.
The DIF’s usage of its respective IP device has been reviewed by the device’s hardware designer.
The DIF’s respective device IP is at least stage D2.
DIF uses automatically generated HW parameters and register definitions.
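What "automatically generated HW parameters and register definitions" buys the DIF can be sketched as follows; the MYIP_* constants mimic the style of a generated header but are illustrative, not real regtool output:

```c
#include <stdint.h>

/* Hypothetical excerpt of a regtool-style generated header: register
 * offsets and field indices as compile-time constants. */
#define MYIP_CTRL_REG_OFFSET 0x0u
#define MYIP_CTRL_ENABLE_BIT 0u
#define MYIP_STATUS_REG_OFFSET 0x4u

/* The DIF computes register indices from the generated offsets instead
 * of hard-coding magic numbers, so regenerating the header after a CSR
 * change updates the software automatically. */
void myip_enable(uint32_t *regs) {
  regs[MYIP_CTRL_REG_OFFSET / sizeof(uint32_t)] |= (1u << MYIP_CTRL_ENABLE_BIT);
}

uint32_t myip_status(const uint32_t *regs) {
  return regs[MYIP_STATUS_REG_OFFSET / sizeof(uint32_t)];
}
```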
HW IP Programmer’s guide references specific DIF APIs that can be used for operations.
DIF follows the DIF-specific guidelines in sw/device/lib/dif/README.md and the OpenTitan C style guidelines.
Chip-level DV testing for the IP using DIFs has been started.
For a transition from S2 to S3, the following items are expected to be completed.
The DIF’s respective device IP is at least stage D3.
The DIF’s respective device IP is at least stage V3.
Fully re-review the C interface and implementation, with a view to the interface not changing in the future.
Unit tests cover (at least):
- Device Initialisation
- All Device FIFOs (including when empty, full, and adding data)
- All Device Registers
- All DIF Functions
- All DIF return codes
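A return-code test along the lines of the last bullet might look like this minimal sketch (the dif_myip_init API is a hypothetical stand-in, not a real DIF):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical DIF under test. A unit test for return codes drives
 * every declared code, including each error path. */
typedef enum { kDifOk = 0, kDifBadArg = 1 } dif_result_t;
typedef struct { volatile uint32_t *base; } dif_myip_t;

dif_result_t dif_myip_init(dif_myip_t *myip, volatile uint32_t *base) {
  if (myip == NULL || base == NULL) return kDifBadArg;
  myip->base = base;
  return kDifOk;
}

/* Returns true iff every declared return code was observed. */
bool test_all_return_codes(void) {
  uint32_t reg = 0;
  dif_myip_t myip;
  bool saw_bad_arg = dif_myip_init(NULL, &reg) == kDifBadArg &&
                     dif_myip_init(&myip, NULL) == kDifBadArg;
  bool saw_ok = dif_myip_init(&myip, &reg) == kDifOk;
  return saw_bad_arg && saw_ok;
}
```

Exercising each NULL-argument combination separately, as above, is what distinguishes "all DIF return codes" coverage from merely hitting the success path.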
Ensure all DIF TODOs are complete.