Sunday, August 29, 2010

Some Key Developments in Universal Verification Methodology

In past blog articles, we focused on examples using VMM to discuss the creation of a comprehensive verification methodology using SystemVerilog. The need for a common verification methodology has led to the creation of the Universal Verification Methodology (UVM), and an Early Adopter (EA) release was announced during the last DAC. This release can be downloaded from the Accellera website. Going forward, we will look into some of the work and advances in UVM as it moves from the EA release to UVM 1.0, driven by an Accellera committee comprising representatives from verification user companies such as Cisco, Freescale, Intel, and IBM, along with EDA companies including Cadence, Mentor, and Synopsys.

Some of the major areas of ongoing work include defining time-consuming phases in simulation, a register abstraction layer, and TLM2 modeling.

Backward compatibility requires that any additional time-consuming phases run in parallel with the run() phase defined in the EA release. SoC projects require a common test flow to ensure that module-level environments integrate easily into the full-chip environment and that simulation moves from one phase to the next only when all tasks (including all outstanding transactions) of the previous phase have completed. Running these additional phases in parallel with the single run() phase of the EA release is the suggested way to make this enhancement.
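As a sketch of how a component might participate in such a parallel run-time phase (the phase names and objection calls shown here reflect the direction of the committee proposals, not a finalized API, and drive_traffic is a hypothetical task):

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// Sketch only: reset_phase/main_phase names and the objection mechanism
// follow the proposals being discussed for UVM 1.0 and may change.
class my_driver extends uvm_component;
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  // Runs in parallel with run(); the phase cannot end until every
  // component that raised an objection has dropped it, which is how
  // outstanding transactions are drained before the next phase starts.
  task main_phase(uvm_phase phase);
    phase.raise_objection(this);
    drive_traffic();            // hypothetical stimulus task
    phase.drop_objection(this);
  endtask

  task drive_traffic();
    // drive and complete all transactions for this phase
  endtask
endclass
```

The objection mechanism is what lets the whole environment agree that a phase is done: each transactor holds the phase open only as long as it has work outstanding.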

Synopsys’ UVM Register Abstraction Layer (RAL) is an application package that can be used to automate the creation of object-oriented models of registers and memories inside a design. Janick Bergeron of Synopsys is leading the effort here, which brings in years of earlier work done at Synopsys on the SystemVerilog VMM methodology.
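To give a flavor of what a register model looks like (the register and field names here are hypothetical, and the API shown follows the style of the UVM register layer that grew out of this RAL work, so treat it as a sketch rather than the final interface):

```systemverilog
import uvm_pkg::*;

// Hypothetical control register: one enable bit and a 4-bit mode field.
class ctrl_reg extends uvm_reg;
  uvm_reg_field enable;
  uvm_reg_field mode;

  function new(string name = "ctrl_reg");
    super.new(name, 32, UVM_NO_COVERAGE);  // 32-bit register, no field coverage
  endfunction

  virtual function void build();
    enable = uvm_reg_field::type_id::create("enable");
    // configure(parent, size, lsb_pos, access, volatile, reset, has_reset,
    //           is_rand, individually_accessible)
    enable.configure(this, 1, 0, "RW", 0, 1'b0, 1, 1, 0);
    mode = uvm_reg_field::type_id::create("mode");
    mode.configure(this, 4, 1, "RW", 0, 4'h0, 1, 1, 0);
  endfunction
endclass
```

With such a model, testcases read and write registers by name and the abstraction layer maps the accesses onto bus transactions, keeping the tests independent of the physical bus protocol.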

The SystemC TLM2 standard is targeted at enabling the development of interoperable, high-speed transaction-level models for virtual platforms in SystemC. These models need to model memory-mapped bus-based systems at high enough speed that software can run on a virtual platform at near real-time speed. TLM2 is a complex standard and is difficult to map to languages other than SystemC. Some user-level requirements for a TLM2 implementation in UVM 1.0 have been defined; they focus on a TLM2 subset that includes generic payloads, blocking and non-blocking transport, and base protocols, but no SystemC support.
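As an illustration of the generic-payload/blocking-transport subset, an initiator issuing a write might be sketched as follows. The socket and payload class names mirror the SystemC TLM2 naming that the UVM mapping follows; the exact SystemVerilog API was still being defined at the time, so take this as an assumption-laden sketch:

```systemverilog
import uvm_pkg::*;

// Sketch: an initiator sending a 4-byte write through a blocking
// transport socket. Names mirror SystemC TLM2 and the proposed UVM
// mapping; details were still in flux at UVM EA time.
class mem_initiator extends uvm_component;
  uvm_tlm_b_initiator_socket sock;

  function new(string name, uvm_component parent);
    super.new(name, parent);
    sock = new("sock", this);
  endfunction

  task do_write();
    uvm_tlm_generic_payload gp    = new("gp");
    uvm_tlm_time            delay = new("delay");
    byte unsigned data[] = '{8'hde, 8'had, 8'hbe, 8'hef};
    gp.set_command(UVM_TLM_WRITE_COMMAND);
    gp.set_address('h1000);
    gp.set_data(data);
    gp.set_data_length(4);
    gp.set_streaming_width(4);
    sock.b_transport(gp, delay);  // blocks until the target completes
  endtask
endclass
```

The blocking transport call returns only when the target has finished the transaction, which is what makes this style simple enough to map outside SystemC.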

We will look into some of these developments in more detail as the work on UVM advances at Accellera.

Monday, April 5, 2010

Reasons for Improved Productivity with VMM for RTL Engineers

When looking at a possible move to the use of advanced verification methodologies such as VMM, a natural question that comes to the mind of an RTL verification engineer is whether such a move will be productive. Is it worth the cost of learning a new programming paradigm? Are there significant productivity gains to be had with such a move?

These questions can be answered by taking a closer look at what VMM provides to an RTL verification engineer that is different from a verification environment created at RTL. Productivity gains are the result of years of software development work that has produced a VMM infrastructure which streamlines the overall verification task by: i) separating the task of creating the verification environment from the task of writing testcases, ii) creating the verification environment itself in a layered fashion, iii) allowing control of simulation in a standardized fashion, and iv) enabling reuse of the verification work as a by-product. Let’s take a look at each of these points in a bit more detail.

Separation of the verification environment & testcases:

This separation adds to overall productivity in the complex task of achieving design-verification goals in much the same way that hierarchical design adds to productivity in creating DUTs. A well-designed verification environment provides simple controls for verification parameters, easing the creation of testcases that accomplish the goals of verification. The verification hierarchy, at a macro level, consists of a verification environment that interacts with the DUT and a set of testcases that interact with the verification environment.
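As a small sketch of this separation (dma_env, its cfg object, and num_packets are hypothetical names), a VMM-style testcase touches only the environment's knobs and never the DUT's signals:

```systemverilog
// Hypothetical example: the testcase configures and runs the environment;
// all signal-level work happens inside dma_env and its transactors.
program test_basic;
  dma_env env = new();   // dma_env extends vmm_env (hypothetical)

  initial begin
    env.gen_cfg();             // randomize the testbench configuration
    env.cfg.num_packets = 100; // then override one knob for this test
    env.run();                 // build, reset, run, check, report
  end
endprogram
```

The testcase stays a few lines long because the bulk of the code lives in the environment, which is shared by every test.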

Standardized approach to the creation of verification environment:

The verification environment itself can be created in a hierarchical fashion using a set of standard layers that would be needed in a typical design verification task.

A verification environment provides means for signal-level interaction with the DUT at the lowest level.

These signals can be driven, monitored, and checked via a set of commands that work with higher-level data structures, such as packets, and drive the signal-level interactions.

The DUT provides capabilities that enable higher-level operations, and functional transactions are abstractions of these higher-level operations performed by the DUT. These operations can lead to a set of packets being driven onto some interfaces, with results monitored and checked on the same or other interfaces. The functional layer results from combining commands in a sequential and/or hierarchical fashion.
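The currency of these layers is a transaction class. As a sketch (eth_pkt and its fields are hypothetical), a VMM transaction derives from vmm_data, and the `vmm_channel macro creates a typed channel for passing it between layers:

```systemverilog
// Hypothetical Ethernet-style packet as the functional-layer abstraction.
class eth_pkt extends vmm_data;
  static vmm_log log = new("eth_pkt", "class");

  rand bit [47:0] dst, src;
  rand byte       payload[];

  constraint c_len { payload.size() inside {[46:1500]}; }

  function new();
    super.new(log);
  endfunction
endclass

// Creates eth_pkt_channel, used to connect generators, transactors, and
// scoreboards at the transaction level.
`vmm_channel(eth_pkt)
```

Because every layer exchanges eth_pkt objects over channels, a driver at the command layer is the only place that ever touches the DUT's pins.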

The transactions at the functional layer can be generated randomly, or in a coordinated fashion to produce a specific sequence of random transactions. In effect, you are generating various scenarios that testcases can use to target specific verification goals.

A testcase then modifies constraints on the transaction generators, defines new scenarios, and synchronizes the various transactions to accomplish its goals. As a result of this layering, communication between the DUT and the verification environment occurs at the transaction level rather than the signal level, resulting in productivity gains in testcase creation.
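For example (building on a hypothetical eth_pkt transaction and an environment with an atomic generator called gen), a testcase can tighten generation by layering an extra constraint onto the transaction and swapping it in as the generator's blueprint:

```systemverilog
// Hypothetical: restrict the atomic generator to minimum-size packets.
class short_pkt extends eth_pkt;
  constraint c_short { payload.size() == 46; }
endclass

program test_short_pkts;
  dma_env env = new();   // hypothetical vmm_env subclass with a .gen generator

  initial begin
    short_pkt blueprint = new();
    env.gen_cfg();
    env.build();
    // VMM atomic generators randomize a blueprint object; replacing it
    // changes what every generated transaction looks like.
    env.gen.randomized_obj = blueprint;
    env.run();
  end
endprogram
```

No generator or driver code changes; the testcase only swaps the object being randomized.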

Standardized simulation control:

Standardized simulation control is possible because of the clean separation of the verification environment and the testcase, along with the standard layering of the verification environment. The standard steps in simulation control for a testcase, after its configuration, involve building the verification environment, configuring the DUT after reset, starting all associated transactors and generators, detecting the end-of-test condition, stopping the transactors and generators, and reporting the results.
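These steps map directly onto the predefined virtual methods of vmm_env, which a project extends; the dma_env name and the member details below are hypothetical:

```systemverilog
// vmm_env predefines gen_cfg(), build(), reset_dut(), cfg_dut(), start(),
// wait_for_end(), stop(), cleanup(), and report(); run() sequences them.
class dma_env extends vmm_env;   // hypothetical environment
  // ... transactors, generators, scoreboard declared here ...

  virtual function void build();
    super.build();
    // construct transactors/generators and connect their channels
  endfunction

  virtual task reset_dut();
    super.reset_dut();
    // wiggle the reset pins
  endtask

  virtual task cfg_dut();
    super.cfg_dut();
    // program DUT registers to match the randomized configuration
  endtask

  virtual task wait_for_end();
    super.wait_for_end();
    // e.g. block until the scoreboard has seen all expected packets
  endtask
endclass
```

Because every environment follows the same step sequence, any testcase can run any environment with a single env.run() call, or invoke individual steps to insert test-specific actions between them.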

Testbench Reuse:

Testbench reuse is a natural outcome of hierarchical creation of the verification environment with standard layers.
• Testcases are created at the very top level, and decoupling them from the environment itself allows for ease of creation. The bulk of the code is written to create the environment, which can be reused not only over generations of the same design but also with different designs that use some common physical-level interfaces.
• The use of standardized and higher-level communication between components enables reuse.
• Standardized transactions via scenario generators enable reuse.
• Layering of the verification environment contributes to reuse at each level and across levels: the signal layer can be reused on its own, the signal and command layers together, and so on.
• Even analysis and reporting components can be reused due to their modular construction.

So for an RTL verification engineer, VMM provides a methodology and associated capabilities for efficiently developing a verification environment that is likely to be highly reusable. The productivity gains come from years of software development work behind the VMM infrastructure, which streamlines overall verification through modular and flexible development of the verification environment, easing the creation of testcases.