Monday, April 5, 2010

Reasons for Improved Productivity with VMM for RTL Engineers

When considering a move to an advanced verification methodology such as VMM, a natural question for an RTL verification engineer is whether such a move will be productive. Is it worth the cost of learning a new programming paradigm? Are there significant productivity gains to be had?

These questions can be answered if we take a closer look at what VMM provides to an RTL verification engineer that is different from a traditional verification environment created at the RTL level. The productivity gains are the result of years of software development work that has gone into the VMM infrastructure, which streamlines the overall verification task by: i) separating the task of creating the verification environment from the task of writing testcases, ii) building the verification environment itself in a layered fashion, iii) allowing simulation to be controlled in a standardized fashion, and iv) enabling reuse of the verification work as a by-product. Let's take a look at each of these points in a bit more detail.

Separation of the verification environment & testcases:

This separation adds to overall productivity in the complex task of achieving design verification goals in much the same way as hierarchical design adds to productivity in creating DUTs. A well-designed verification environment provides simple controls over verification parameters, making it easy to create testcases that accomplish the goals of verification. At a macro level, the verification hierarchy consists of a verification environment that interacts with the DUT and a set of testcases that interact with the verification environment.
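
As a rough sketch of this split (the dut_env and test_basic names are made up for illustration), the environment is written once as a vmm_env subclass, and each testcase is a small program that simply uses it:

`include "vmm.sv"

class dut_env extends vmm_env;
   // transactors, generators, monitors and the scoreboard live here,
   // behind a handful of knobs that testcases control
endclass

program test_basic;
   initial begin
      dut_env env = new();
      // a testcase only steers the environment; it never re-creates
      // drivers, monitors or checkers
      env.run();
   end
endprogram

The body of dut_env is fleshed out further below, under standardized simulation control.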

Standardized approach to the creation of verification environment:

The verification environment itself can be created in a hierarchical fashion using a set of standard layers that would be needed in a typical design verification task.

At the lowest level, a verification environment provides the means for signal-level interaction with the DUT.
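
For example (the rx_if name and its signals are illustrative), the signal layer is typically just a SystemVerilog interface with a clocking block that bundles the DUT pins:

interface rx_if (input bit clk);
   logic       valid;
   logic [7:0] data;
   logic       ready;

   // synchronous access point used by the command layer
   clocking cb @(posedge clk);
      output valid, data;
      input  ready;
   endclocking
endinterface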

These signals can be driven, monitored, and checked via a set of commands that work with higher-level data structures such as packets and translate them into signal-level activity.
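
A minimal command-layer sketch, continuing the illustrative rx_if interface: a packet class, the channel created for it by the VMM macro, and a transactor that pulls packets from the channel and drives them pin by pin. The rx_pkt and rx_driver names are made up; vmm_data, `vmm_channel and vmm_xactor are the standard VMM building blocks.

class rx_pkt extends vmm_data;
   static vmm_log log = new("rx_pkt", "class");
   rand byte unsigned payload[];

   function new();
      super.new(this.log);
   endfunction
endclass
`vmm_channel(rx_pkt)

class rx_driver extends vmm_xactor;
   virtual rx_if   sigs;
   rx_pkt_channel  in_chan;

   function new(string inst, virtual rx_if sigs, rx_pkt_channel in_chan);
      super.new("rx_driver", inst);
      this.sigs    = sigs;
      this.in_chan = in_chan;
   endfunction

   virtual protected task main();
      super.main();
      forever begin
         rx_pkt tr;
         this.wait_if_stopped_or_empty(this.in_chan);
         this.in_chan.get(tr);                 // next packet-level command
         foreach (tr.payload[i]) begin
            @(this.sigs.cb);
            this.sigs.cb.valid <= 1'b1;
            this.sigs.cb.data  <= tr.payload[i];
         end
         @(this.sigs.cb);
         this.sigs.cb.valid <= 1'b0;
      end
   endtask
endclass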

The DUT provides capabilities that enable higher-level operations, and functional transactions are abstractions of these higher-level operations performed by the DUT. A higher-level operation may result in a set of packets being driven onto some interfaces, with results monitored and checked on the same or other interfaces. The functional layer is thus built by combining commands in a sequential and/or hierarchical fashion.

Transactions at the functional layer can be generated randomly, or in a coordinated fashion to produce a specific sequence of random transactions. In effect, you are generating various scenarios that can be used by testcases to target specific verification goals.
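
VMM provides macro-generated atomic and scenario generators for this. As a rough sketch, continuing the rx_pkt example and assuming the scenario class produced by `vmm_scenario_gen exposes its usual scenario_kind, length and items members, a scenario that forces four short packets back to back might look like this (b2b_scenario is an illustrative name):

`vmm_scenario_gen(rx_pkt, "RX packet")

class b2b_scenario extends rx_pkt_scenario;
   int unsigned B2B;

   function new();
      this.B2B = this.define_scenario("Back-to-back short packets", 4);
   endfunction

   constraint b2b_shape {
      if (scenario_kind == B2B) {
         length == 4;
         foreach (items[i]) items[i].payload.size() inside {[1:8]};
      }
   }
endclass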

A testcase then modifies constraints on the transaction generators, defines new scenarios, and synchronizes the various transactions to accomplish its goals. As a result of this layering, a testcase communicates with the verification environment at the transaction level rather than the signal level, resulting in productivity gains in testcase creation.
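
For example, continuing the rx_pkt sketch and assuming the environment contains an atomic generator (created with `vmm_atomic_gen) exposed as env.gen, a testcase can tighten the generated traffic simply by substituting a derived blueprint into the generator's randomized_obj, without touching the environment code:

class short_rx_pkt extends rx_pkt;
   constraint short_only { payload.size() inside {[1:4]}; }
endclass

program test_short_packets;
   initial begin
      dut_env      env       = new();
      short_rx_pkt blueprint = new();
      env.build();                          // construct generators and transactors
      env.gen.randomized_obj = blueprint;   // every generated packet now obeys short_only
      env.run();                            // execute the remaining standard steps
   end
endprogram

This blueprint substitution is what keeps the environment generic while letting each testcase bias the traffic it needs.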

Standardized simulation control:

Standardized simulation control becomes possible once the verification environment is cleanly separated from the testcases and built with the standard layers. The standard steps in simulation control for a testcase, after its configuration, involve building the verification environment, configuring the DUT after reset, starting all associated transactors and generators, detecting the end-of-test condition, stopping the transactors and generators, and reporting the results.
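
In VMM these steps map onto the predefined methods of vmm_env, which vmm_env::run() executes in order, so a testcase that calls env.run() gets the whole sequence for free. Here is the earlier dut_env sketch with the standard steps filled in; the bodies are only placeholder comments:

class dut_env extends vmm_env;
   virtual function void build();
      super.build();
      // construct generators, transactors, monitors, scoreboard; connect channels
   endfunction

   virtual task reset_dut();
      super.reset_dut();
      // wiggle the reset pins through the signal-layer interface
   endtask

   virtual task cfg_dut();
      super.cfg_dut();
      // program DUT registers to match the randomized configuration
   endtask

   virtual task start();
      super.start();
      // start_xactor() on every generator and transactor
   endtask

   virtual task wait_for_end();
      super.wait_for_end();
      // e.g. wait for the generator's DONE notification or a scoreboard drain
   endtask

   virtual task stop();
      super.stop();
      // stop_xactor() on generators first, then let in-flight traffic drain
   endtask

   virtual task report();
      super.report();
      // summarize scoreboard and coverage results
   endtask
endclass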

Testbench Reuse:

Testbench reuse is a natural outcome of hierarchical creation of the verification environment with standard layers.
• Testcases are created at the very top level, and decoupling them from the environment makes them easy to write. The bulk of the code goes into the environment, which can be reused not only across generations of the same design but also with different designs that use common physical-level interfaces.
• The use of standardized and higher-level communication between components enables reuse.
• Standardized transactions via scenario generators enable reuse.
• Layering of the verification environment contributes to reuse at each level and across levels: the signal layer can be reused on its own, the signal and command layers can be reused together, and so on.
• Even analysis and reporting components can be reused due to their modular construction.

So, for an RTL verification engineer, VMM provides a methodology and the associated capabilities for efficiently developing a verification environment that is likely to be highly reusable. The productivity gains are the result of years of software development work that has gone into the VMM infrastructure, which streamlines overall verification through modular and flexible development of the verification environment, making testcases easy to create.