This document explains the recommended checklist items to review when transitioning from one development stage to another, for the design (D), verification (V), and software device interface function (DIF, S) stages. It is expected that the items in each stage (D1, V1, S1, etc.) are completed.
For a transition from D0 to D1, the following items are expected to be completed.
Spec complete, feature set finalized.
CSRs defined and generated.
Clock(s)/Reset(s) connected to all sub-modules.
`.sv` exists and meets comportability requirements.
Unit is able to be instantiated and bound in top level RTL. The unit must not break top level functionality such as propagating X through TL-UL interface, continuously asserting the interrupts, or creating undesired TL-UL transactions.
80% of expected memories instantiated, using behavioral RAMs.
Main functional path implemented.
All the outputs of the IP have `ASSERT_KNOWN` assertions.
Lint run setup, compiles and runs. It is acceptable to have lint errors and warnings at this stage.
For a transition from D1 to D2, the following items are expected to be completed.
Any new features added since D1 are documented and reviewed with DV/SW/FPGA.
The GitHub Issue, Pull Request, or RFC where the feature was discussed should be linked in the
Block diagrams updated.
Documented non-registered block interfaces.
Documented the missing functionalities.
Feature requests for this IP version are frozen at this time.
All features specified have been completed.
Area sanity check completed either on FPGA or on Design Compiler.
100% of ports implemented. Port list is frozen.
100% of architectural state exists (RAMs, CSRs, etc.).
Review and sign off TODOs.
Conforming to style guide regarding X usage. TODO: Related GitHub issue.
Lint passes. Waiver reviewed.
CDC run set up. No must fix errors, waiver file created.
FPGA synthesis timing meets the (Fmax - 10%) target or better.
CDC sync flops use behavioral synchronization macros.
Any appropriate security counter-measures documented and implemented.
A review of sensitive security-critical storage flops was completed. Where appropriate, non-reset flops are used to store secure material.
Shadow registers are implemented for all appropriate storage of critical control functions.
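To make the shadow-register item concrete, below is a small behavioral model in C of the write-twice commit scheme; this is a conceptual sketch only (the struct and the exact mismatch handling are illustrative, not the RTL implementation):

```c
#include <stdbool.h>
#include <stdint.h>

// Behavioral model of a shadowed register: software must write the same
// value twice in a row before the committed value updates. Conceptual
// sketch only; real hardware would also raise an update error on mismatch.
typedef struct {
  uint32_t staged;        // value captured by the first write
  uint32_t committed;     // value actually used by the hardware
  bool first_write_done;  // phase tracker for the two-write protocol
} shadow_reg_t;

// Returns true if the write committed, false if it only staged the value
// or if the second write mismatched the first (which discards the update).
bool shadow_reg_write(shadow_reg_t *r, uint32_t value) {
  if (!r->first_write_done) {
    r->staged = value;
    r->first_write_done = true;
    return false;  // staged only; waiting for the confirming write
  }
  r->first_write_done = false;
  if (value != r->staged) {
    return false;  // mismatch: update discarded
  }
  r->committed = value;
  return true;
}
```

A matching pair of writes commits the value; a mismatched pair leaves the committed value untouched, which is what protects critical control state from a single glitched or faulted write.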
For a transition from D2 to D3, the following items are expected to be completed.
Any approved new features since D2 are documented and reviewed with DV/SW/FPGA.
Resolve all TODOs.
Lint clean. Lint waiver file reviewed and signed off by the technical steering committee.
CDC clean. CDC waiver file reviewed and signed off by the technical steering committee.
Hold Design Review: Hold an RTL sanity check review by an independent designer.
Hold Design Review: Sign off deleted flops list (one last time).
Review Design Change with SW: Review CSRs
Review Design Change with SW: Review Fatal Errors
Review Design Change with SW: Review other SW visible changes
Review Design Change with SW: Review known “Won’t Fix” bugs and “Errata”.
For a transition from V0 to V1, the following items are expected to be completed. Prefix “SIM” is applicable for the simulation-based DV approach only, while “FPV” is for the FPV approach only.
- DV Plan document drafted, indicating the overall DV strategy, intent and the testbench environment details with diagrams, details on TB, UVCs, checkers, scoreboard, interfaces, assertions, coverage objects (if applicable).
- Details may be missing since most items are not expected to be fully developed at this stage.
A fully completed Testplan written in Hjson, listing the planned tests with descriptions of each test’s goal and, optionally, details on the stimulus and checking procedure.
- Top level testbench with DUT instantiated and following interfaces hooked up (as applicable): TileLink, clocks and resets, interrupts and major DUT interfaces.
- Not all interfaces may be hooked up at this point. Inputs whose interfaces have not yet been created may be tied off to 0.
- All available interface assertion monitors hooked up (example: tlul_assert).
- UVM environment created with major interface agents and UVCs connected and instantiated.
- TLM connections made from UVC monitors to the scoreboard.
RAL model generated using regtool and instantiated in the UVM environment.
CSR check generated using regtool and bound in the TB environment.
Full testbench automation completed if applicable. This may be required for verifying multiple flavors of parameterized DUT designs.
- Sanity test exercising a basic functionality of a major DUT datapath passing.
- What functionality to test and to what level may be governed by higher level (example: chip) integration requirements. These are to be captured when the Testplan is reviewed with the key stakeholders.
CSR test suite added for ALL interfaces that have access to system memory map (JTAG, TL, etc.):
- HW reset test (test all resets)
- CSR read/write
- Bit Bash
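As an illustration of the bit-bash item above, here is a minimal sketch in C; the CSR is modeled as a plain variable and the RW mask is a parameter, whereas a real test would drive bus transactions through the RAL model:

```c
#include <stdbool.h>
#include <stdint.h>

// Model of a 32-bit CSR; a real test drives bus transactions instead.
static uint32_t csr_model;
static uint32_t csr_read(void) { return csr_model; }
static void csr_write(uint32_t value) { csr_model = value; }

// Bit-bash: walk a one and a zero through every writable bit, checking
// that each bit is individually settable and clearable per the RW mask.
bool csr_bit_bash(uint32_t rw_mask) {
  for (int i = 0; i < 32; ++i) {
    uint32_t bit = 1u << i;
    if ((rw_mask & bit) == 0) {
      continue;  // skip read-only / reserved bits
    }
    csr_write(bit);  // walking one
    if ((csr_read() & rw_mask) != bit) {
      return false;
    }
    csr_write(~bit);  // walking zero
    if ((csr_read() & rw_mask) != (~bit & rw_mask)) {
      return false;
    }
  }
  return true;
}
```

The mask lets the same routine skip read-only, write-one-to-clear, or reserved fields, which is how the UVM built-in sequence handles mixed-access registers.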
Memory test suite added for ALL interfaces that have access to system memory map (JTAG, TL, etc.) if the DUT has memories:
- Mem walk
Ensure all these tests verify back-to-back accesses with zero delays, along with partial reads and partial writes.
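A memory-walk check can be sketched similarly; the memory is modeled as a C array here (a real test issues TL-UL reads and writes), and the address-dependent pattern is one simple choice for catching stuck address bits:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

// Memory model; a real test would issue TL-UL transactions to the DUT.
#define MEM_WORDS 64u
static uint32_t mem[MEM_WORDS];

// Address-dependent pattern so that aliased addresses produce mismatches.
static uint32_t walk_pattern(size_t index) {
  return (uint32_t)index * 0x01010101u;
}

// Mem-walk: write every word first, then read everything back, which
// catches stuck or swapped address bits as well as basic datapath faults.
bool mem_walk(void) {
  for (size_t i = 0; i < MEM_WORDS; ++i) {
    mem[i] = walk_pattern(i);
  }
  for (size_t i = 0; i < MEM_WORDS; ++i) {
    if (mem[i] != walk_pattern(i)) {
      return false;
    }
  }
  return true;
}
```

Writing the whole memory before any readback matters: a read immediately after each write would not detect two addresses aliasing onto the same physical word.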
- Each input and each output of the module is part of at least one assertion.
- Assertions for main functional path are implemented and proven.
Verify that the sanity test passes cleanly (with no warnings) with one additional tool apart from the primary tool selected for signoff.
Sanity regression set up for code health. A small suite of tests identified for running the sanity regression on. If the testbench has more than one compile-time configuration, then a sanity test for each configuration should be ideally selected.
Nightly regression for running all tests with multiple random seeds (iterations) set up. Selecting the number of iterations for each test is subjective: it depends on the failure rate and available compute resources. For starters, it is recommended to set iterations to 100 for each test; this may be trimmed down if the compute load is too high. The goal should be for the nightly regression to finish overnight, so that the results are available the next morning for triage.
Set up FPV regression by adding the module to
- Structural coverage collection model checked in. This specifies which hierarchies and which types of coverage to collect. For example, pre-verified sub-modules (including some `prim` components pre-verified thoroughly with FPV) can be black-boxed, with only IO toggle coverage set up on those sub-modules for coverage collection.
- Functional coverage shell object created - this may not contain coverpoints or covergroups yet, but it is primed for use in post-V2 test development.
Sub-modules that are pre-verified with their own testbenches have already reached V1 or higher stage.
RTL (uArch) specification reviewed and signed off.
DV Plan & Testplan reviewed with key stakeholders - designer, design lead, DV lead, architects, higher level (chip) design and DV leads.
Following categories of post-V1 tests focused at in the Testplan review (as applicable):
Reviewed V2 checklist to understand scope and estimate effort.
For a transition from V1 to V2, the following items are expected to be completed. Prefix “SIM” is applicable for the simulation-based DV approach only, while “FPV” is for the FPV approach only.
It is possible for the design to have undergone some changes since the DV plan and Testplan were reviewed prior to the V1 stage. Please ensure that those deltas have been captured adequately in the DV Plan and the Testplan documents.
DV Plan is fully completed in terms of content.
- For simulation based DV, all interfaces including sidebands have been connected and exercised.
- For the FPV approach, all interfaces including sidebands are asserted.
All planned assertions have been written and enabled.
UVM environment fully developed with end-2-end checks in scoreboard enabled.
All tests in the Testplan written and passing with at least one random seed.
- All assertions are implemented and above 90% proven.
- Each output of the module contains at least one forward and one backward assertion check.
- FPV module converges within reasonable runtime.
All assumptions are implemented and reviewed.
For chip-level DV, pieces of FW code (DIFs) are verified in simulation.
Nightly regression with multiple random seeds passing: 90%
Code coverage requirements: line, toggle, fsm (state & transition), branch, assertion: 90%
Functional coverage requirements: coverpoints: 100%, crosses: 75%
Code coverage requirements: branch, statement, functional: 90%
COI coverage requirements: 75%
Ensure that all high priority (tagged P0 and P1) design bugs have been addressed and closed. If the bugs were found elsewhere, ensure that they are reproduced deterministically in DV (through additional tests or by tweaking existing tests as needed) and the fixes are adequately verified.
Ensure that all low priority (tagged P2 and P3) design bugs have been root-caused. They may be deferred to post D2V2 for closure.
Sub-modules that are pre-verified with their own testbenches have already reached V2 or higher stage.
Reviewed V3 checklist to understand scope and estimate effort.
For a transition from V2 to V3, the following items are expected to be completed. Prefix “SIM” is applicable for the simulation-based DV approach only, while “FPV” is for the FPV approach only.
It is possible for the design to undergo changes even at this stage (when it is expected to be mature). Please ensure that those new deltas have been captured adequately in the DV Plan and the Testplan documents.
Ensure that the complete testbench code is free from TODOs.
X Propagation Analysis complete
- Assertion proven requirement: 100% of properties proven
- No undetermined or unreachable properties
Nightly regression with multiple random seeds passing: 100% (with 1 week minimum soak time)
Code coverage requirements: line, toggle, fsm (state & transition), branch, assertion: 100%
Functional coverage requirements: coverpoints: 100%, crosses: 100%
Code coverage requirements: branch, statement, functional: 100%
COI coverage requirements: 100%
Ensure that all design bugs have been addressed and closed.
Clean up all compile-time and run-time warnings thrown by the simulator.
Sub-modules that are pre-verified with their own testbenches have already reached V3 stage.
For a transition from S0 to S1, the following items are expected to be completed.
`dif_<ip>.h` and, optionally, `dif_<ip>.c` exist in `sw/device/lib/dif`.
All existing in-tree code that uses the device does so via the DIF. There is no remaining driver code that uses the device directly, outside of the DIF code.
Software unit tests exist for the DIF in
Sanity tests exist for the DIF in
This should perform a basic test of the main datapath of the hardware module by the embedded core, via the DIF, and should be able to be run on all OpenTitan platforms (including FPGA, simulation, and DV). This test will be shared with DV.
Sanity tests are for diagnosing major issues in both software and hardware, and with this in mind, they should execute quickly.
Initially we expect this kind of test to be written by hardware designers for debugging issues during module development.
This happens long before a DIF is implemented, so there are no requirements on how these should work, though we suggest they are placed in `sw/device/tests/<ip>/<ip>.c`, as this has been the convention until now.
Later, when a DIF is written, the DIF author is responsible for updating this test to use the DIF, and for moving this test into the aforementioned location.
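To illustrate the shape of such a DIF-based sanity test, here is a hypothetical sketch in C; the `dif_demo_*` names and register offsets are invented for illustration and are not a real OpenTitan API:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

// Hypothetical DIF handle; base points at the device's register window.
typedef struct dif_demo {
  volatile uint32_t *base;
} dif_demo_t;

// Initialize the handle, rejecting NULL arguments as a DIF would.
bool dif_demo_init(dif_demo_t *demo, volatile uint32_t *base) {
  if (demo == NULL || base == NULL) {
    return false;
  }
  demo->base = base;
  return true;
}

// Basic datapath check: write a scratch register and read it back.
bool dif_demo_scratch_check(const dif_demo_t *demo, uint32_t value) {
  demo->base[0] = value;  // hypothetical SCRATCH register at offset 0
  return demo->base[0] == value;
}

// A sanity test reduces to a few quick, deterministic steps that can run
// identically on FPGA, simulation, and DV targets.
bool demo_sanity_test(void) {
  static uint32_t fake_regs[4];  // stand-in for the device MMIO window
  dif_demo_t demo;
  return dif_demo_init(&demo, fake_regs) &&
         dif_demo_scratch_check(&demo, 0xA5A5A5A5u);
}
```

Keeping each step short and deterministic is what makes the same test useful both for quick software bring-up and as a shared smoke test in DV.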
For a transition from S1 to S2, the following items are expected to be completed.
DIF has functions to cover all specified hardware functionality.
The DIF’s usage of its respective IP device has been reviewed by the device’s hardware designer.
The DIF’s respective device IP is at least stage D2.
DIF uses automatically generated HW parameters and register definitions.
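As a sketch of what “uses automatically generated HW parameters and register definitions” means in practice: regtool emits a header with register offsets and field bit positions, and the DIF composes those constants instead of hard-coding magic numbers. The macro names below are hypothetical stand-ins for the generated output:

```c
#include <stdint.h>

// Hypothetical constants standing in for regtool-generated output
// (in a real DIF these would come from the generated <ip>_regs.h).
#define DEMO_CTRL_REG_OFFSET 0x0u
#define DEMO_CTRL_ENABLE_BIT 0u

// The DIF builds register values from the generated bit positions,
// so a register layout change only requires regenerating the header.
static inline uint32_t demo_ctrl_enable(uint32_t ctrl) {
  return ctrl | (1u << DEMO_CTRL_ENABLE_BIT);
}
```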
HW IP Programmer’s guide references specific DIF APIs that can be used for operations.
DIF follows the DIF-specific guidelines in `sw/device/lib/dif/README.md` and the OpenTitan C style guidelines.
Chip-level DV testing for the IP using DIFs has been started.
DIF has initial interface for use from Tock.
For a transition from S2 to S3, the following items are expected to be completed.
The DIF’s respective device IP is at least stage D3.
The DIF’s respective device IP is at least stage V3.
Fully re-review C interface and implementation, with a view to the interface not changing in future.
Unit tests cover (at least):
- Device Initialisation
- All Device FIFOs (including when empty, full, and adding data)
- All Device Registers
- All DIF Functions
- All DIF return codes
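A unit test along these lines typically mocks the register reads and checks both the happy path and the DIF return codes; the sketch below is hypothetical (the result enum, function name, and STATUS bit layout are invented for illustration):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

// Hypothetical DIF result type and FIFO-status query; in a real unit
// test the register read would go through a mocked MMIO interface.
typedef enum { kDifOk = 0, kDifBadArg = 1 } dif_result_t;

static uint32_t fake_status_reg;  // mock of the device STATUS register

dif_result_t dif_demo_fifo_is_full(bool *is_full) {
  if (is_full == NULL) {
    return kDifBadArg;  // exercises the "all DIF return codes" item
  }
  *is_full = (fake_status_reg & 0x1u) != 0;  // hypothetical FULL bit 0
  return kDifOk;
}
```

A test then sets `fake_status_reg`, calls the function, and asserts on both the result code and the decoded flag, covering the register, the function, and its return codes in one place.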
Ensure all DIF TODOs are complete.
Fully re-review Tock interface, with a view to the interface not changing in future.