# Signoff Checklist

This document explains the recommended checklist items to review when transitioning from one Development Stage to another, for design, verification, and software device interface function (DIF) stages. It is expected that the items in each stage (D1, V1, S1, etc.) are completed.

## D1

For a transition from D0 to D1, the following items are expected to be completed.

### SPEC_COMPLETE

The specification is 90% complete; all features are defined. The specification is checked into the repository as a markdown document. It is acceptable to make changes for further clarification or additional detail after the D1 stage.

### CSR_DEFINED

The CSRs required to implement the primary programming model are defined. The Hjson file defining the CSRs is checked into the repository. It is acceptable to add or modify registers during the D2 stage in order to complete implementation.

### CLKRST_CONNECTED

Clock(s) and reset(s) are connected to all sub-modules.

### IP_TOP

The unit .sv exists and meets comportability requirements.

### IP_INSTANTIABLE

The unit is able to be instantiated and connected in top level RTL files. The design must compile and elaborate cleanly without errors. The unit must not break top level functionality such as propagating X through TL-UL interfaces, continuously asserting alerts or interrupts, or creating undesired TL-UL transactions. To that end, the unit must fulfill the following V1 checklist requirements:

- TB_TOP_CREATED
- SIM_RAL_MODEL_GEN_AUTOMATED
- CSR_CHECK_GEN_AUTOMATED
- SIM_CSR_MEM_TEST_SUITE_PASSING

### PHYSICAL_MACROS_DEFINED_80

All expected memories have been identified and representative macros instantiated. All other physical elements (analog components, pads, etc.) are identified and represented with a behavioral model. It is acceptable to make changes to these physical macros after the D1 stage as long as they do not have a large impact on the expected resulting area (roughly “80% accurate”).

### FUNC_IMPLEMENTED

The mainline functional path is implemented to allow for a basic functionality test by verification. (“Feature complete” is the target for D2 status.)

All the outputs of the IP have ASSERT_KNOWN assertions.
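As an illustration, such assertions are typically placed at the bottom of the IP's top-level module using the `ASSERT_KNOWN macro from prim_assert.sv. This is a sketch only; the signal and assertion names below are hypothetical:

```systemverilog
// Sketch only: signal and assertion names are hypothetical.
`include "prim_assert.sv"

// One `ASSERT_KNOWN per top-level output (clocked by the default
// clk_i / rst_ni unless overridden).
`ASSERT_KNOWN(TlODValidKnown_A, tl_o.d_valid)
`ASSERT_KNOWN(IntrDoneKnown_A, intr_done_o)
`ASSERT_KNOWN(AlertTxKnown_A, alert_tx_o)
```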

### LINT_SETUP

A lint flow is set up which compiles and runs. It is acceptable to have lint warnings at this stage.

## D2

### NEW_FEATURES

Any new features added since D1 are documented and reviewed with DV/SW/FPGA. The GitHub Issue, Pull Request, or RFC where the feature was discussed should be linked in the Notes section.

### BLOCK_DIAGRAM

Block diagrams have been updated to reflect the current design.

### DOC_INTERFACE

All IP block interfaces that are not autogenerated are documented.

### DOC_INTEGRATION_GUIDE

Any integration specifics that are not captured by the comportability specification have been documented. Examples include special synthesis constraints or clock requirements for this IP.

### MISSING_FUNC

Any missing functionality is documented.

### FEATURE_FROZEN

Feature requests for this IP version are frozen at this time.

### FEATURE_COMPLETE

All features specified are implemented.

### PORT_FROZEN

All ports are implemented and their specification is frozen, except for security / countermeasure related ports that will not affect functionality or architectural behavior (the addition of such ports can be delayed only until D2S).

### ARCHITECTURE_FROZEN

All architectural state (RAMs, CSRs, etc.) is implemented and the specification frozen.

### REVIEW_TODO

All TODOs have been reviewed and signed off.

### STYLE_X

The IP block conforms to the style guide regarding X usage.

### CDC_SYNCMACRO

All CDC synchronization flops use behavioral synchronization macros (e.g. prim_flop_2sync), not manually created flops.
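For example, a single-bit signal crossing into a destination clock domain would be synchronized with an instance of prim_flop_2sync rather than hand-coded flops. The signal names in this sketch are illustrative:

```systemverilog
// Sketch only: signal names are illustrative.
// Two-stage synchronizer primitive instead of manually coded flops.
prim_flop_2sync #(
  .Width(1)
) u_ack_sync (
  .clk_i (clk_dst_i),   // destination clock domain
  .rst_ni(rst_dst_ni),
  .d_i   (ack_src),     // signal launched from the source domain
  .q_o   (ack_dst)      // synchronized signal, safe to consume
);
```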

### LINT_PASS

The lint flow passes cleanly. Any lint waiver files have been reviewed.

### CDC_SETUP

A CDC checking run has been set up (if tooling is available). The CDC checking run shows no must-fix errors, and a waiver file has been created.

If there is no CDC flow at the block-level, this requirement can be waived.

### RDC_SETUP

An RDC checking run has been set up (if tooling is available). The RDC checking run shows no must-fix errors, and a waiver file has been created.

If there is no RDC flow at the block-level, this requirement can be waived.

### AREA_CHECK

An area check has been completed either with an FPGA or ASIC synthesis flow.

### TIMING_CHECK

A timing check has been completed either with an FPGA or ASIC synthesis flow.

### SEC_CM_DOCUMENTED

Any custom security countermeasures other than the standardized countermeasures listed under SEC_CM_IMPLEMENTED have been identified and documented, and their implementation has been planned. The actual implementation can be delayed until D2S.

Where the area impact of countermeasures can be reliably estimated, it is recommended to insert dummy logic at D2 in order to better reflect the final area complexity of the design.

## D2S

### SEC_CM_ASSETS_LISTED

List the assets and corresponding countermeasures in canonical format in the IP Hjson (the canonical naming is checked by the reggen tool for correctness).

This list could look, for example, as follows:

```hjson
# Inside the rstmgr.hjson
countermeasures: [
  { name: "BUS.INTEGRITY",
    desc: "Bus integrity check."
  },
  { name: "RST.INTEGRITY",
    desc: "Reset integrity checks."
  },
  { name: "...",
    desc: "Shadow reset logic."
  },
  # ...
]
```

For a full list of permitted asset and countermeasure types, see the countermeasure.py script that implements the name checks.

Note the SEC_CM_DOCUMENTED item in the D2 checklist, which is a precursor to this step.

### SEC_CM_IMPLEMENTED

Any appropriate security countermeasures are implemented. Implementations must follow the OpenTitan Secure Hardware Design Guidelines.

In particular, note the requirements captured in the items below.

### SEC_CM_RND_CNST

Compile-time random netlist constants (such as LFSR seeds or scrambling constants) are exposed to topgen via the randtype parameter mechanism in the comportable IP Hjson file. Default random seeds and permutations for LFSRs can be generated with the gen-lfsr-seed.py script. See also the related GitHub issue #2229.

### SEC_CM_NON_RESET_FLOPS

A review of sensitive security-critical storage flops was completed. Where appropriate, non-reset flops are used to store secure material.

Shadow registers are implemented for all appropriate storage of critical control functions.

### SEC_CM_RTL_REVIEWED

If deemed necessary by the security council, an offline review of the RTL code sections pertaining to the assets and countermeasures listed in SEC_CM_ASSETS_LISTED has been performed.

### SEC_CM_COUNCIL_REVIEWED

Security council has reviewed the asset list and associated documentation (SEC_CM_ASSETS_LISTED, SEC_CM_DOCUMENTED) and deems the defenses implemented appropriate.

The security council decides whether an additional RTL review of the relevant code sections is necessary (SEC_CM_RTL_REVIEWED).

## D3

### NEW_FEATURES_D3

Any approved new features since D2 have been documented and reviewed with DV/SW/FPGA.

### TODO_COMPLETE

All TODOs are resolved. Deferred TODOs have been marked as “iceboxed” in the code and have an issue attached, as follows: // ICEBOX(#issue-nr).

### LINT_COMPLETE

The lint checking flow is clean. Any lint waiver files have been reviewed and signed off by the technical steering committee.

### CDC_COMPLETE

The CDC checking flow is clean. CDC waiver files have been reviewed and understood. If there is no CDC flow at the block-level, this requirement can be waived.

### RDC_COMPLETE

The RDC checking flow is clean. RDC waiver files have been reviewed and understood. If there is no RDC flow at the block-level, this requirement can be waived.

### REVIEW_RTL

A simple design review has been conducted by an independent designer.

### REVIEW_DELETED_FF

Any deleted flops have been reviewed and signed off. If there is no synthesis flow at the block-level, this requirement can be waived.

### REVIEW_SW_CHANGE

Any software-visible design changes have been reviewed by the software team.

### REVIEW_SW_ERRATA

All known “Won’t Fix” bugs and “Errata” have been reviewed by the software team.

## V1

To transition from V0 to V1, the following items are expected to be completed. The prefix “SIM” is applicable for simulation-based DV approaches, whereas the prefix “FPV” is applicable for formal property-based verification approaches.

### DV_DOC_DRAFT_COMPLETED

A DV document has been drafted, indicating the overall DV goals, strategy, the testbench environment details with diagram(s) depicting the flow of data, UVCs, checkers, scoreboard, interfaces, assertions and the rationale for the chosen functional coverage plan. Details may be missing since most of these items are not expected to be fully understood at this stage.

### TESTPLAN_COMPLETED

A testplan has been written (in Hjson format) indicating:

- Testpoints (a list of planned tests), each mapping to a design feature, with a description highlighting the goal of the test and, optionally, the stimulus and checking procedure.
- The functional coverage plan, captured as a list of covergroups, with a description highlighting which feature is expected to be covered by each covergroup. It may optionally contain additional details such as coverpoints and crosses for individual aspects of the covered feature.

If the DUT has a CPU for which a ROM firmware is developed (burnt-in during manufacturing):

- A detailed ROM firmware testplan has been written to adequately verify all functions of the ROM firmware in a pre-silicon simulation environment (i.e. a DV testbench).
- The testing framework may validate each functionality of the ROM firmware discretely as an individual unit test.
- Depending on the ROM firmware development progress, this may be postponed for V2.

### TB_TOP_CREATED

A top level testbench has been created with the DUT instantiated. The following interfaces are connected (as applicable): TileLink, clocks and resets, interrupts and alerts. Other interfaces may not be connected at this point (connecting these is part of SIM_TB_ENV_CREATED). Inputs for which interfaces have not yet been created are tied off to the default value.

All available interface assertion monitors are connected (example: tlul_assert).

### SIM_TB_ENV_CREATED

A UVM environment has been created with major interface agents and UVCs connected and instantiated. TLM port connections have been made from UVC monitors to the scoreboard.

### SIM_RAL_MODEL_GEN_AUTOMATED

A RAL model is generated using regtool and instantiated in the UVM environment.

### CSR_CHECK_GEN_AUTOMATED

A CSR check is generated using regtool and bound in the TB environment.

### TB_GEN_AUTOMATED

Full testbench automation has been completed if applicable. This may be required for verifying multiple flavors of parameterized designs.

### SIM_SMOKE_TEST_PASSING

A smoke test exercising the basic functionality of the main DUT datapath is passing. The functionality to test (and to what level) may be driven by higher level (e.g. chip) integration requirements. These requirements are captured when the testplan is reviewed by the key stakeholders, and the test(s) updated as necessary.

### SIM_CSR_MEM_TEST_SUITE_PASSING

CSR test suites have been added for ALL interfaces (including, but not limited to, the DUT’s SW device access port, JTAG access port, etc.) that have access to the system memory map:

- HW reset test (test all resets)
- Bit Bash
- Aliasing

Memory test suites have been added for ALL interfaces that have access to the system memory map if the DUT has memories:

- Mem walk

All these tests should verify back to back accesses with zero delays, along with partial reads and partial writes.

### FPV_MAIN_ASSERTIONS_PROVEN

Each input and each output of the module is part of at least one assertion. Assertions for the main functional path are implemented and proven.

### SIM_ALT_TOOL_SETUP

The smoke regression passes cleanly (with no warnings) with one additional tool apart from the primary tool selected for signoff.

### SIM_SMOKE_REGRESSION_SETUP

A small suite of tests has been identified as the smoke regression suite and is run regularly to check code health. If the testbench has more than one build configuration, then each configuration has at least one test added to the smoke regression suite.

### SIM_NIGHTLY_REGRESSION_SETUP

A nightly regression for running all constrained-random tests with multiple random seeds (iterations) has been set up. Directed, non-random tests need not be run with multiple iterations. Selecting the number of iterations depends on the coverage, the mean time between failures and the available compute resources. For starters, it is recommended to set the number of iterations to 100 for each test. It may be trimmed down once the test has stabilized and the same level of coverage is achieved with fewer iterations. The nightly regression should finish overnight so that the results are available the next morning for triage.

### FPV_REGRESSION_SETUP

An FPV regression has been set up by adding the module to the hw/top_earlgrey/formal/top_earlgrey_fpv_cfgs.hjson file.

A structural coverage collection model has been checked in. This is a simulator-specific file (i.e. proprietary format) that captures which hierarchies and what types of coverage are collected. For example, pre-verified sub-modules (including some prim components pre-verified thoroughly with FPV) can be black-boxed; it is sufficient to enable only the IO toggle coverage of their ports. A functional coverage shell object has been created; it may not contain coverpoints or covergroups yet, but it is primed for development post-V1.

### TB_LINT_SETUP

VeribleLint for the testbench is set up to run in nightly regression, with appropriate waivers.

- For a constrained random testbench, an entry has been added to hw/<top-level-design>/lint/<top-level-design>_dv_lint_cfgs.hjson
- For an FPV testbench, an entry has been added to hw/<top-level-design>/lint/<top-level-design>_fpv_lint_cfgs.hjson

### PRE_VERIFIED_SUB_MODULES_V1

Sub-modules that are pre-verified with their own testbenches have already reached V1 or a higher stage. The coverage level of the pre-verified sub-modules that are not tracked (i.e., not taken through the verification stages), meets or exceeds the V1 stage requirement. They are clearly cited in the DV document and the coverage of these sub-modules can be excluded in the IP-level testbench.

### DESIGN_SPEC_REVIEWED

The design / micro-architecture specification has been reviewed and signed off. If a product requirements document (PRD) exists, then ensure that the design specification meets the product requirements.

### TESTPLAN_REVIEWED

The draft DV document (proposed testbench architecture) and the complete testplan have been reviewed with key stakeholders (as applicable):

- DUT designer(s)
- 1-2 peer DV engineers
- Software engineer (DIF developer)
- Chip architect / design lead
- Chip DV lead
- Security architect

### STD_TEST_CATEGORIES_PLANNED

The following categories of post-V1 tests have been focused on during the testplan review (as applicable):

- Security / leakage
- Error scenarios
- Power
- Performance
- Debug
- Stress

### V2_CHECKLIST_SCOPED

The V2 checklist has been reviewed to understand the scope and estimate effort.

## V2

To transition from V1 to V2, the following items are expected to be completed. The prefix “SIM” is applicable for simulation-based DV approaches, whereas the prefix “FPV” is applicable for formal property-based verification approaches.

### DESIGN_DELTAS_CAPTURED_V2

It is possible for the design to have undergone some changes since the DV document and testplan were reviewed in the V0 stage. All design deltas have been captured adequately and appropriately in the DV document and the testplan.

### DV_DOC_COMPLETED

The DV document is fully complete.

### FUNCTIONAL_COVERAGE_IMPLEMENTED

The functional coverage plan is fully implemented. All covergroups have been created and sampled in the reactive components of the testbench (passive interfaces, monitors and scoreboards).

### ALL_INTERFACES_EXERCISED

For simulations, interfaces are connected to all ports of the DUT and are exercised. For an FPV testbench, assertions have been added for all interfaces including sidebands.

All planned assertions have been written and enabled.

### SIM_TB_ENV_COMPLETED

A UVM environment has been fully developed with end-to-end checks in the scoreboard enabled.

### SIM_ALL_TESTS_PASSING

All tests in the testplan have been written and are passing with at least one random seed.

### FPV_ALL_ASSERTIONS_WRITTEN

All assertions (except security countermeasure assertions) are implemented and are 90% proven. Each output of the module has at least one forward and one backward assertion check. The FPV proof run converges within reasonable runtime.
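A forward check constrains an output from its driving inputs, while a backward check infers from an observed output that the corresponding input must have occurred. A hedged sketch for a simple request/grant handshake, using the `ASSERT macro from prim_assert.sv (signal names and the latency window are hypothetical):

```systemverilog
// Sketch only: signals and the [1:4] latency window are hypothetical.
// Forward: every request is granted within 4 cycles.
`ASSERT(ReqGntFwd_A, req_i |-> ##[1:4] gnt_o)
// Backward: a grant only ever follows a request in the previous cycle.
`ASSERT(GntImpliesReqBwd_A, gnt_o |-> $past(req_i))
```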

### FPV_ALL_ASSUMPTIONS_REVIEWED

All assumptions have been implemented and reviewed.

### SIM_FW_SIMULATED

If the DUT has a CPU for which a ROM firmware is developed (burnt-in during manufacturing):

- The ROM firmware testplan is fully written.
- The SIM_ALL_TESTS_PASSING checklist item is met, including these tests.

This checklist item is marked N.A. if the DUT does not have a CPU.

### SIM_NIGHTLY_REGRESSION_V2

A nightly regression with multiple random seeds is 90% passing.

### SIM_CODE_COVERAGE_V2

Line, toggle, fsm (state & transition), branch and assertion code coverage has reached 90%. Toggle coverage of the ports of the DUT and all pre-verified sub-modules has individually reached 90% in both directions (1->0 and 0->1).

### SIM_FUNCTIONAL_COVERAGE_V2

Functional coverage has reached 90%.

### FPV_CODE_COVERAGE_V2

Branch, statement and functional code coverage for FPV testbenches has reached 90%.

### FPV_COI_COVERAGE_V2

COI coverage for FPV testbenches has reached 75%.

### PRE_VERIFIED_SUB_MODULES_V2

Sub-modules that are pre-verified with their own testbenches have already reached V2 or a higher stage. The coverage level of the pre-verified sub-modules that are not tracked (i.e., not taken through the verification stages), meets or exceeds the V2 stage requirement.

### SEC_CM_PLANNED

Security countermeasures are planned and documented.

- Common countermeasure features (such as shadowed registers, hardened counters, etc.) can be tested by importing common sec_cm testplans and tests, and adding the bind file cm_sec_bind.
- Additional checks and sequences may be needed to verify those features. Document them in the individual testplan.
- Create a testplan for non-common countermeasures.

### NO_HIGH_PRIORITY_ISSUES_PENDING

All high priority (tagged P0 and P1) design bugs have been addressed and closed. If the bugs were found elsewhere, ensure that they are reproduced deterministically in DV (through additional tests or by tweaking existing tests as needed) and have the design fixes adequately verified.

### ALL_LOW_PRIORITY_ISSUES_ROOT_CAUSED

All low priority (tagged P2 and P3) design bugs have been root-caused. They may be deferred to post V2 for closure.

### DV_DOC_TESTPLAN_REVIEWED

The DV document and testplan are complete and have been reviewed by key stakeholders (as applicable):

- DUT designer(s)
- 1-2 peer DV engineers
- Chip architect / design lead
- Chip DV lead
- Security architect

This review will focus on the design deltas captured in the testplan since the last review. In addition, the fully implemented functional coverage plan, the observed coverage and the coverage exclusions are expected to be scrutinized to ensure there are no verification holes or gaps in achieving the required stimulus quality before work towards V3 can commence.

### V3_CHECKLIST_SCOPED

The V3 checklist has been reviewed to understand the scope and estimate effort.

## V2S

### SEC_CM_TESTPLAN_COMPLETED

The testplan has been updated with the necessary testpoints and covergroups to adequately verify all security countermeasures implemented in the DUT. These countermeasures are listed in the comportable IP Hjson file located at hw/ip/<ip>/data/<ip>.hjson (or equivalent).

On OpenTitan, a security countermeasures testplan is auto-generated (the first time) by the reggen tool for each DUT, and is placed at hw/ip/<ip>/data/<ip>_sec_cm_testplan.hjson (or equivalent). This testplan has been imported into the main testplan written for the DUT. Tests implemented to verify the security countermeasures have been mapped to these testpoints.

Common countermeasures can be fully verified or partially handled by cip_lib. Follow this document to enable them. Make sure to import the applicable common sec_cm tests and testplans.

### FPV_SEC_CM_PROVEN

All security countermeasure assertions are proven in FPV. The required assertions for each countermeasure are defined in the Security Countermeasure Verification Framework. Follow this document to set up the FPV sec_cm testbench.

### SIM_SEC_CM_VERIFIED

All security countermeasures are verified in simulation. Common countermeasures can be fully verified or partially handled by cip_lib. Refer to the cip_lib document for details.

### SIM_COVERAGE_REVIEWED

Security countermeasure blocks may have been excluded in order to satisfy the V2 sign-off criteria. If so, these exclusions should be removed.

If UNR exclusion has been generated, it needs to be re-generated and reviewed after all security countermeasure tests have been implemented, as fault injection can exercise countermeasures which are deemed unreachable code. The V2S coverage requirement is the same as V2 (90% code coverage and 90% functional coverage).

### SEC_CM_DV_REVIEWED

The security countermeasures testplan and the overall DV effort has been reviewed by key stakeholders (as applicable):

- DUT designer(s)
- 1-2 peer DV engineers
- Security architect (optional)

This review may be waived if not deemed necessary.

## V3

To transition from V2 to V3, the following items are expected to be completed. The prefix “SIM” is applicable for simulation-based DV approaches, whereas the prefix “FPV” is applicable for formal property-based verification approaches.

### DESIGN_DELTAS_CAPTURED_V3

Although rare, it is possible for the design to have undergone some last-minute changes since V2. All additional design deltas have been captured adequately and appropriately in the DV document and the testplan.

### X_PROP_ANALYSIS_COMPLETED

X Propagation analysis has been completed.

### FPV_ASSERTIONS_PROVEN_AT_V3

All assertions are implemented and 100% proven. There are no undetermined or unreachable properties.

### SIM_NIGHTLY_REGRESSION_AT_V3

A nightly regression with multiple random seeds is 100% passing (with 1 week minimum soak time).

### SIM_CODE_COVERAGE_AT_100

Line, toggle, fsm (state & transition), branch and assertion code coverage has reached 100%.

### SIM_FUNCTIONAL_COVERAGE_AT_100

Functional coverage has reached 100%.

### FPV_CODE_COVERAGE_AT_100

Branch, statement and functional code coverage for FPV testbenches has reached 100%.

### FPV_COI_COVERAGE_AT_100

COI coverage for FPV testbenches has reached 100%.

### ALL_TODOS_RESOLVED

There are no remaining TODO items anywhere in the testbench code, including common components and UVCs.

### NO_TOOL_WARNINGS_THROWN

There are no compile-time or run-time warnings thrown by the simulator.

### TB_LINT_COMPLETE

The lint flow for the testbench is clean. Any lint waiver files have been reviewed and signed off by the technical steering committee.

### PRE_VERIFIED_SUB_MODULES_V3

Sub-modules that are pre-verified with their own testbenches have already reached the V3 stage. The coverage level of the pre-verified sub-modules that are not tracked (i.e., not taken through the verification stages), meets the V3 stage requirement.

### NO_ISSUES_PENDING

All design and testbench bugs have been addressed and closed.

## S1

For a transition from S0 to S1, the following items are expected to be completed.

### DIF_EXISTS

Autogenerated IRQ and Alert DIFs have been created with the util/make_new_dif.py tool, and exist in sw/device/lib/dif/autogen/. Additionally, a header file, dif_<ip>.h and, optionally, dif_<ip>.c exist in sw/device/lib/dif/.

### DIF_USED_IN_TREE

All existing non-production code in the tree which uses the device does so via the DIF or a production driver.

### DIF_TEST_ON_DEVICE

An on-device test exists (in sw/device/tests) that uses the DIF.

This test should exercise the main datapath of the hardware module via the DIF, and should be able to run on at least one OpenTitan platform (either on FPGA or in simulation).

## S2

For a transition from S1 to S2, the following items are expected to be completed.

### DIF_HW_FEATURE_COMPLETE

The DIF’s respective device IP is at least stage D2.

### DIF_FEATURES

The DIF has functions to cover all specified hardware functionality.

## S3

For a transition from S2 to S3, the following items are expected to be completed.

### DIF_HW_DESIGN_COMPLETE

The DIF’s respective device IP is at least stage D3.

### DIF_HW_VERIFICATION_COMPLETE

The DIF’s respective device IP is at least stage V3.

### DIF_DOC_HW

The HW IP Programmer’s guide references specific DIF APIs that can be used for operations.

### DIF_CODE_STYLE

The DIF follows DIF-specific guidelines in sw/device/lib/dif and the OpenTitan C style guidelines.

### DIF_TEST_UNIT

Software unit tests exist for the DIF in sw/device/tests/dif named dif_<ip>_unittest.cc. Unit tests exist to cover (at least):

- Device Initialisation
- All Device FIFOs (including when empty, full, and adding data)
- All Device Registers
- All DIF Functions
- All DIF return codes

### DIF_TODO_COMPLETE

All DIF TODOs are complete.