Paper Submitted by David Kelf, Vice President and Chief Marketing Officer, Breker Verification Systems

Announcements of new chips for artificial intelligence and automotive electronics are welcome news for the semiconductor industry, opening up fresh end-user opportunities.

Naturally, these bigger, more powerful chips require formidable, robust verification methodologies. Traditional simulation-based verification is not enough. Even hardware emulation and formal verification environments need fortification. New thinking is required to meet the demands of these complex applications, whose requirements differ markedly from those of the past.

With so many changes happening in fast succession, some industry watchers believe we’re entering the Verification 3.0 era, which will deliver a new verification methodology.

Back in the late nineties, the late Gary Smith, then of Dataquest, predicted the advent of the “Intelligent Testbench,” a methodology that would include an executable specification synthesized into reactive, IP-laden test content. This would drive multiple engines appropriate to different phases of the verification process.

At last, we are seeing these ideas come to fruition. SoC designs implement functionality in both hardware and software, are built predominantly from IP blocks drawn from a variety of sources, and carry verification requirements that go beyond functional test to include design integrity and platform infrastructure. Verification is performed on ingenious new execution mechanisms that leverage now-accepted cloud-based platforms, with workflows driven by a coverage-directed, intent-based specification synthesized into test content across the entire verification flow. Every aspect of verification is changing, bringing productivity leaps just in time to tackle next-generation semiconductors.

A Review of Verification 1.0 and Verification 2.0

A look back at Verification 1.0 highlights the move to simulators running on individual workstations, disrupted by emerging hardware description language (HDL) methodologies that replaced schematic capture. Design capture moved from the gate level to the register transfer level (RTL), with directed tests written in HDLs. Design synthesis, tools portable across compute platforms, application-specific IC (ASIC) devices and interpreted simulation (such as Verilog-XL) were all entering mainstream design flows. The integrated circuit (IC) engineering discipline was transformed as large circuit blocks were implemented on ASIC platforms.

Around the time of the new millennium, processors started to be included on devices large enough to require accelerated verification performance. Different testbench approaches, and customer-owned place-and-route tools tied to synthesis, became prevalent. Verification 2.0 commenced with compiled simulation running on regression server farms, followed by the advent of hardware emulation and formal verification. Constrained-random testbenches dramatically changed engineering roles. Specialized verification engineers who understood object-oriented, dynamic testbenches leveraged tools such as Specman from Verisity (now Cadence) and later the Accellera standard SystemVerilog (originally the SUPERLOG language from Co-Design Automation, now Synopsys), governed by the Universal Verification Methodology (UVM).

Introducing Verification 3.0

Design requirements are changing again. Enormous SoC devices include multi-core processor clusters performing many functions within an IP-based design methodology. Design structures are growing more diverse and complex to maintain differentiation as Moore’s Law slows. Platform infrastructure (connectivity, coherency and power-domain switching, for example) must now be verified alongside functionality. Design requirements are also driving more validation in areas such as safety and security, or “design integrity.” Together these demands create a discontinuity in verification methodologies.

Verification 3.0 rests on five legs: the merging of hardware and software in an IP-based environment, the expanding role of verification into integrity and infrastructure, the continuum of verification engines, the intelligent abstract testbench, and cloud-based execution solutions. Each has multiple facets, all currently in various stages of evolution.


Execution Engine Continuum in the Cloud

Early examples of Verification 3.0 include hybrid verification platforms that combine simulation and hardware emulation with virtual platforms, with formal verification accelerating aspects of the flow. Formalized FPGA prototyping systems, replacing ad hoc FPGA circuits, are now part of the continuum of verification engines, which changes the way these technologies are combined and used. This range of engines targeting specific points in the verification process makes the entire continuum far more effective, with the right tool applied to each phase.

Company-owned server farms are giving way to cloud-based infrastructure as the cost of maintaining these resources outweighs old concerns about off-premises compute security. With the cloud come new workflows that automate regressions by leveraging Agile ideas, and new business models with on-demand pricing that matches the way verification solutions are actually used through a development process.

Intelligent Testbench Merges Hardware and Software

Another significant trend is the shift in SoC verification from hardware-driven to software-driven testing, where tests take the form of C code running on the design’s own processors. A typical chip contains numerous deeply embedded processors that rely on firmware to provide their functionality. System integration pulls many of these firmware-driven engines together with general-purpose CPUs, GPUs and, increasingly, FPGA resources into a single platform. All must work together to provide complex functionality.

Portable Stimulus-based test suite synthesis, the first real commercialization of the “Executable Specification” idea, has a large role to play here. Verification at this level is about hardware execution, including cache coherency, as well as software functionality, and tests for these need to be driven from processor software rather than from I/O transactions. Portable Stimulus makes it possible to synthesize the required tests into C programs, bus transactions and other formats, allowing a single testbench to operate across the verification continuum and ensuring an effective multi-engine flow.
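
As a rough illustration, a synthesized C test for a coherency-style scenario might resemble the sketch below. This is a hand-written, host-compilable approximation, not actual tool output; the buffer, pattern and core assignments are hypothetical, and in a real SoC test the routines would target shared memory or device registers and be scheduled onto different processor cores by the generated test framework.

    /*
     * Hypothetical sketch of a software-driven SoC test, compiled here as a
     * host program for illustration only. In a real flow, shared_buf would
     * map to shared DDR or device registers and the two routines would run
     * on different processor cores under the synthesized test harness.
     */
    #include <stdint.h>
    #include <stdio.h>

    #define BUF_WORDS 64
    static volatile uint32_t shared_buf[BUF_WORDS];  /* stands in for shared memory */

    /* Producer step: one core fills the buffer with a known pattern. */
    static void core0_write_pattern(uint32_t seed) {
        for (uint32_t i = 0; i < BUF_WORDS; i++)
            shared_buf[i] = seed ^ (i * 0x9E3779B9u);
    }

    /* Consumer step: a second core reads the buffer back and checks the
     * pattern, exercising the path between the two cores. */
    static int core1_check_pattern(uint32_t seed) {
        int errors = 0;
        for (uint32_t i = 0; i < BUF_WORDS; i++)
            if (shared_buf[i] != (seed ^ (i * 0x9E3779B9u)))
                errors++;
        return errors;
    }

    int main(void) {
        uint32_t seed = 0xA5A5A5A5u;  /* a tool would randomize this per scenario */
        core0_write_pattern(seed);
        int errors = core1_check_pattern(seed);
        if (errors)
            printf("TEST FAIL: %d mismatches\n", errors);
        else
            printf("TEST PASS\n");
        return errors ? 1 : 0;
    }

A Portable Stimulus tool would typically generate many such scenarios automatically, varying the cores, address ranges and ordering, and retargeting the same intent as transactions for simulation or emulation.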

The idea of synthesizing a test suite from a design specification has been the holy grail of verification since Verification 1.0, and now we are seeing it in new verification flows. Switching test content production from the perspective of “what tests do I need to activate design functions?” to “let’s synthesize the tests from a comprehensible design specification” is a significant leap, and it drives efficiencies into the entire process that we are only starting to understand.

This is as significant a leap in verification as Design Synthesis was to design.

Verification’s Expanding Role

Until Verification 3.0, most design and verification groups concerned themselves only with functionality. That may now be the less onerous concern, as power verification, performance verification and, increasingly, safety and security considerations demand attention. Systematic requirements tracking and reliability analysis are essential verification tasks for many designs. Path tracing through the design is a core verification necessity, as is formal verification. Design integrity is paramount in many industry segments, and new techniques such as fault analysis are required to meet rigorous standards in these areas.

In addition, verification tools need technology to offload tasks from dynamic execution engines and prune the state space. Design debug, profiling and multi-run analysis are moving up in abstraction, leveraging artificial intelligence on large data sets to reduce manual effort. Sophisticated verification management techniques are starting to employ coverage-driven test synthesis, using Portable Stimulus-described intent to go beyond after-the-fact coverage analysis and provide greater direction to the entire process.

Conclusions

No one company can tackle the entirety of the Verification 3.0 challenge, and the industry does not yet have all the answers, though some of the underpinnings are being put in place. Emerging technologies, including machine learning, can assist with debug, verification analysis, bug triage and other aspects of the verification flow.

Traditional EDA companies have a role to play, particularly in foundational technologies such as hardware emulation and FPGA prototyping platforms, which represent capital investments too expensive for startups. Startups, in turn, have innovative ideas and the ability to respond quickly to market and business-model needs, making them more likely to set the technology direction for Verification 3.0.

 ___________________________

About Dave Kelf

Dave Kelf is vice president and chief marketing officer at Breker Verification Systems responsible for all aspects of Breker’s marketing activities, strategic programs and channel management. He most recently served as vice president of worldwide marketing solutions at formal verification provider OneSpin Solutions. Earlier, Kelf was president and CEO of Sigmatix, Inc. He worked in sales and marketing at Cadence Design Systems and was responsible for the Verilog and VHDL verification product lines. As vice president of marketing at Co-Design Automation and then Synopsys, Kelf oversaw the successful introduction and growth of the SystemVerilog language, before running marketing for Novas Software, noted for the Verdi product line, which became Springsoft and is now part of Synopsys. Kelf holds a Bachelor of Science degree in Electronic Computer Systems from the University of Salford and a Master of Science degree in Microelectronics from Brunel University, both in the U.K., and an MBA from Boston University.