Big Data requires massive amounts of verification
September 03, 2014
As someone who’s been in the verification field for many years, when I hear big data, I don’t immediately think of massive amounts of personal data being collected in the cloud. Not by a long shot. When it comes to big data, I think system-on-chip (SoC) design. The complexity of these designs and the amount of embedded software code needed to make them come alive surely trump all else in the cloud.
Hardware design is only getting more complicated, and the addition of embedded software to the mix creates large data sets that demand substantial verification resources to analyze and debug. Verifying the combination of hardware and embedded software is estimated to consume about 70 percent of the project cycle.
Meanwhile, like a juggler keeping several balls in the air at once, companies are shuffling their human and financial resources to preserve their technological edge while meeting time-to-market delivery schedules. All the while, they’re balancing cost against design size, maximum performance and low-power considerations, revenue, and profitability.
Traditional verification tools, such as the reliable but aging hardware description language (HDL) simulator, are coming up short in managing design size and verification complexity. Hardware emulation, a formerly overlooked verification tool used only for the largest and most difficult microprocessor and graphics chip designs, has become the cornerstone of today’s verification strategies. That’s because hardware emulation has the ability to debug big data, including embedded software, in a manner other verification tools can’t. It offers an accurate representation of the design before silicon availability, since it is based on an actual silicon implementation, something that appeals to project teams.
Hardware emulation’s speed of execution can be up to six orders of magnitude faster than a simulator’s, the kind of speed embedded software debug requires if it is not to be constrained by design size. Further, it enables real-life scenarios that are impossible to exercise with other verification tools … and that means more data sets.
One project team that designs storage and networking chips pointed out that the big data it was checking included vast amounts of waveforms, registers, and memory content. Designs such as these can range from hundreds of millions of gates to one billion gates or more, a great deal of data that needs to be sifted through and analyzed. No other verification tool can manage designs larger than a hundred million gates or thereabouts.
Hardware emulation can be a shared resource between hardware and software teams to accelerate hardware/software integration ahead of first silicon, as noted above. This co-verification process can improve product quality and shave months off a project development schedule.
Another project team, which designs image display devices, implemented hardware emulation in its design and verification flow with satisfactory results. The team uses the emulation system to start software development earlier, co-verify hardware with the embedded software, perform hardware/software trade-off analysis, and correct functional, architectural, and performance problems before tapeout. The ultimate goal is to accelerate the product’s time to market. According to the project team, many design bugs have been discovered using long tests and randomized scenarios.
As the project team found, hardware emulation is a versatile verification tool that can be used in a variety of applications, including system-level prototyping, software debug, and simulation testbench acceleration. It can provide power estimation and can be used to analyze and debug all kinds of chips destined for all sorts of market segments. The only exception is analog circuitry, because the tool needs a digital representation of the design.
One more project team, made up of both hardware verification engineers and software developers, adopted hardware emulation for simulation acceleration. The effectiveness of the tool gave team members the confidence to move on to full SoC debug and analysis, complete with embedded software validation. They, too, believe they derive good value from the tool.
An emerging trend comes in the form of transaction-based verification, a way to link the power and performance of hardware emulation with the flexibility of simulation, and to move verification to a higher level of abstraction. In many design and verification flows it is replacing hardware emulation’s in-circuit emulation (ICE) mode. With ICE, the design under test (DUT) mapped inside the hardware emulator is connected to the target system where the taped-out chip will eventually reside. Many project teams believe a transaction-based verification methodology is the future, though ICE continues to be popular.
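To make the abstraction shift concrete, here is a minimal C++ sketch of what a host-side, transaction-level test might look like. The BusTxn struct, Transactor interface, and StubTransactor class are hypothetical stand-ins invented for illustration; a real flow would move transactions into the emulator through a vendor API or a SCE-MI-style channel, not the in-memory stub used here.

```cpp
// Illustrative sketch only: the testbench reasons about whole transactions
// (reads and writes), not clock-by-clock pin activity.
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

// A simple bus transaction as the test sees it.
struct BusTxn {
    uint64_t addr;
    std::vector<uint8_t> data;
    bool is_write;
};

// Hypothetical transactor interface. In a real flow this would be backed
// by a channel into the emulator, where a synthesizable transactor turns
// each transaction into the required bus cycles.
class Transactor {
public:
    virtual ~Transactor() = default;
    virtual void send(const BusTxn& txn) = 0;
    virtual std::vector<uint8_t> read(uint64_t addr, std::size_t len) = 0;
};

// Stub standing in for the emulator link, so the sketch runs stand-alone.
class StubTransactor : public Transactor {
    std::vector<uint8_t> mem_ = std::vector<uint8_t>(1024, 0);
public:
    void send(const BusTxn& txn) override {
        for (std::size_t i = 0; i < txn.data.size(); ++i)
            mem_[txn.addr + i] = txn.data[i];
    }
    std::vector<uint8_t> read(uint64_t addr, std::size_t len) override {
        return {mem_.begin() + addr, mem_.begin() + addr + len};
    }
};

int main() {
    StubTransactor xactor;
    // The test states intent (write a block, read it back) and leaves
    // cycle-accurate signaling to the transactor.
    xactor.send({0x40, {0xDE, 0xAD, 0xBE, 0xEF}, true});
    auto readback = xactor.read(0x40, 4);
    for (auto b : readback) std::cout << std::hex << int(b) << ' ';
    std::cout << '\n';
    return 0;
}
```

The appeal of this style is that a single host-side transaction can expand into many clocked bus cycles inside the emulator, so the test stays short and readable while the emulator does the heavy lifting at hardware speed.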
And, by the way, what do you think drives the cloud’s ability to host massive amounts of data? Why, big data centers chock-full of powerful machines with multicore SoCs at their foundation, of course. And the intricate integration and complexity of SoCs like these demand thorough verification and debug resources that only hardware emulation can provide.
Dr. Lauro Rizzatti is a verification consultant. He was formerly general manager of EVE-USA and its vice president of marketing before Synopsys’ acquisition of EVE. Previously, he held positions in management, product marketing, technical marketing, and engineering.