Michael Hsiao believes that formal verification tools and a new focus in engineering education will help companies reduce the cost of testing their designs. Companies report that 70-90 percent of development costs go to verification, testing, and debugging.
As computing systems grow more complex and more deeply integrated into everyday life, verification, testing, and debugging have become the largest part of product design.
Most system houses and chip design firms report that 70 percent of their total cost comes from checking that their designs are correct, according to verification expert Michael Hsiao, a computer engineering professor. For large software projects, verification can account for 90 percent of the cost, he added.
“Verification and debugging are difficult problems with today’s complex systems,” Hsiao said. He pointed to the many hardware and software products that require continual modifications both before and after being released. He believes the answers lie in better tools — specifically formal verification algorithms — and a new focus in engineering education.
He is taking action on both fronts. He has built one of the university’s largest research teams, with 15 graduate students (14 of them Ph.D. candidates), that recently improved the performance and capacity of verification tools by one to two orders of magnitude. Building on these breakthroughs, he is leading a faculty team developing course modules and a curriculum to integrate verification throughout ECE education.
“The most naïve way to check a large, complex design is through simulation,” he said. “People use it because it is easy to understand, but most systems are too complex for simulation to be effective. The search space is just too big.” He described how simulating 100 million input patterns still covers far less than one percent of the possibilities in many designs. “Some companies spend months, simulating 24/7. After a time, when nothing bad happens, they think it is probably OK for release. But they have no guarantee.”
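The arithmetic behind that coverage claim is easy to sketch. The short Python fragment below is an illustration, not a simulation tool, and the 64-input block is a made-up example; it simply shows why even 100 million patterns barely scratch a design’s input space:

```python
# A design with n binary inputs has 2**n possible input patterns, so even
# 100 million simulated patterns cover a vanishing fraction of the space.
def coverage(num_patterns: int, num_inputs: int) -> float:
    """Fraction of the 2**num_inputs input space hit by num_patterns
    distinct simulation vectors (assuming no repeats)."""
    return num_patterns / (2 ** num_inputs)

frac = coverage(100_000_000, 64)   # a modest, hypothetical 64-input block
print(f"{frac:.3e}")               # far below one percent of the space
```

Doubling the number of inputs squares the size of the space, which is why brute-force simulation falls further behind as designs grow.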
More effective tools come from formal techniques, what Hsiao called “the opposite of simulation.” Formal techniques are algorithms grounded in theorems and proofs. “Like other methods, in formal techniques, you are trying to prove that a design holds a property or that two different designs hold the same property,” he explained. He gave the simple example of a traffic control system being verified to ensure that the lights are never green in all directions at the same time. Comparing two designs for the same properties, known as an equivalence check, is useful when a newer or improved version is developed. “For instance, whatever the Pentium 3 was able to perform, the Pentium 4 should be able to do.”
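Hsiao’s traffic-light example can be sketched as a toy safety check: exhaustively explore every reachable state of a controller (the alternating two-phase controller below is invented for illustration) and confirm that the never-green-in-all-directions property holds in each one. Real model checkers handle vastly larger state spaces, but the idea is the same:

```python
# Toy safety check: enumerate all reachable states of a hypothetical
# two-phase traffic controller and verify that the north-south and
# east-west lights are never green at the same time.

# States are (ns_light, ew_light); transitions of a simple alternating cycle.
TRANSITIONS = {
    ("green", "red"):   ("yellow", "red"),
    ("yellow", "red"):  ("red", "green"),
    ("red", "green"):   ("red", "yellow"),
    ("red", "yellow"):  ("green", "red"),
}

def check_never_all_green(initial=("green", "red")):
    """Explore every reachable state; return True if the safety property
    'not (ns green and ew green)' holds in all of them."""
    seen, frontier = set(), [initial]
    while frontier:
        state = frontier.pop()
        if state in seen:
            continue
        seen.add(state)
        ns, ew = state
        if ns == "green" and ew == "green":
            return False          # property violated in a reachable state
        frontier.append(TRANSITIONS[state])
    return True

print(check_never_all_green())    # True for this controller
```

Unlike simulation, this visits every reachable state exactly once, so a True result is a proof for this model rather than a statistical hope.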
Expanding the bottleneck
Although formal techniques have been researched for decades, no good method exists for large, complex problems, according to Hsiao. The research follows two general tracks: binary-decision-diagram (BDD) methods and satisfiability-based methods. Both try to search the billions of possible solutions very quickly. “In general, BDD tries to exploit memory space: if I have infinite memory, I can solve this.” The satisfiability-based track, which includes automatic test pattern generation (ATPG), takes the opposite position: “For these methods, I don’t need infinite space, but if I have infinite time, I can solve this.”
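The space-versus-time trade-off Hsiao describes can be caricatured in a few lines of Python. Neither function below is a real BDD or SAT engine, and the three-variable formula is invented; the sketch only contrasts enumerating assignments one at a time (little memory, potentially exponential time) with tabulating the whole truth table up front (exponential memory, fast answers):

```python
from itertools import product

def formula(a, b, c):
    # Hypothetical property to check: (a or b) and (not a or c)
    return (a or b) and ((not a) or c)

# Time-heavy, space-light: walk assignments until one satisfies the formula.
def sat_by_search(n=3):
    return any(formula(*bits) for bits in product([False, True], repeat=n))

# Space-heavy, time-light: store all 2**n entries, then answer by lookup.
TABLE = {bits: formula(*bits) for bits in product([False, True], repeat=3)}

def sat_by_table():
    return any(TABLE.values())

print(sat_by_search(), sat_by_table())   # both report satisfiability
```

Real BDD packages compress the table by sharing structure, and real SAT solvers prune the search; the exponential cost is moved around rather than eliminated.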
Without infinite time or space, the question becomes how to reduce both the space requirement and the time requirement. Hsiao’s answer is to enhance the tools’ learning capabilities. “Our breakthrough was in formal learning,” he explained. “If we solve certain parts of the problem, can we use that knowledge to help with other unsolved parts? The answer is ‘Yes’. Essentially, if we have found 1 million solutions so far and want to reach 1 billion, we don’t have to spend 1,000 times the effort; we can take advantage of what we have solved so far.” The Virginia Tech technique, called “generic term static and dynamic learning,” allows formal verification of much greater complexity than ever before, according to Hsiao.
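One way to picture this kind of learning, loosely in the spirit of the clause learning used in satisfiability solvers (and not Hsiao’s actual algorithm), is a search that remembers which partial assignments it has already proven to be dead ends, so later queries over the same design prune them immediately instead of re-deriving the failure:

```python
# Toy sketch of "learning" in search-based verification: partial
# assignments proven unsatisfiable are remembered, so later searches
# over the same formula skip them. A simplification for illustration only.

learned_conflicts = set()   # partial assignments known to be dead ends

def search(formula, n, assignment=()):
    """DFS over assignments to n Boolean variables, reusing learned
    conflicts; returns a satisfying assignment or None."""
    if assignment in learned_conflicts:
        return None                       # pruned by earlier learning
    if len(assignment) == n:
        return assignment if formula(assignment) else None
    for bit in (False, True):
        result = search(formula, n, assignment + (bit,))
        if result is not None:
            return result
    # No extension works: record this partial assignment as a conflict.
    learned_conflicts.add(assignment)
    return None
```

For example, `search(lambda a: a[0] and not a[1], 2)` first proves the `(False,)` branch hopeless, learns it, and any later query on the same formula skips that whole subtree.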
Even with his recent advances, formal verification has a long way to go, he said. “Now, we need to develop better learning mechanisms. How do we reason about a complex system? How do we learn from a given set of facts? How do we accelerate the learning process? How do we build the algorithms that learn most efficiently?”
Advancing the verification tools is only part of the answer in reducing time and costs of product development, according to Hsiao. He is interested in changing the design culture to incorporate verification, starting with the education of engineers. Funded by the National Science Foundation, Hsiao, Sandeep Shukla, Dong Ha, and Joseph Tront are building course modules and a curriculum to integrate verification at all levels of ECE education. Instead of “design conception and implementation…becoming mere preludes to the main activity of verification,” as described in a recent industry report, the team wants to instill verification-centric design and introduce formal verification skills.
The goal is for students to be able to envision the verification tasks before the design begins, write properties and transactions to be checked, and know how to embed design-for-verifiability into the process. After the design stage, students would be able to select the appropriate verification approaches for a design task, and apply formal and semi-formal verification techniques. The project complements Hsiao’s other efforts. He has been funded by six separate NSF projects, along with industrial support from companies such as Intel, Fujitsu Labs, and nVidia.
“The companies want to reduce this verification cost,” said Hsiao. “Now they need designers who know how to verify and debug.”