Jina Kim demonstrates the size and function of an all-digital health monitor for structures.
The Virginia Tech VLSI for Telecommunications (VTVT) group has developed a prototype system to monitor the health of infrastructures, from buildings and railroads to spacecraft. The prototype, the first such system with all-digital signal processing, reduces power consumption by 80 percent compared with its predecessors and paves the way for a self-contained monitoring system.
"We want to be able to diagnose damage to infrastructures and verify the repair efficacy," said ECE's Dong Ha, director of the VTVT group. "The single most important aspect of developing a self-contained monitoring system is low-power dissipation. The system should operate on energy harvested from ambient sources such as solar, thermal, or vibration energy. We need to squeeze out every drop of power savings, from the DSP algorithms to the interface between a sensor and a digital signal processing chip."
Ha explained that, in typical monitoring systems, sinusoidal tones are used to excite the structure, then the responses in the form of electrical impedance are measured and processed. His team, including graduate students Jina Kim (CPE) and Ben Grisso (ME), reduced power consumption by using digital pulse trains rather than sinusoidal tones and by observing only the polarity of the response signals. Using digital signals on both the structural excitation and the sensing allowed the team to eliminate both power-hungry digital-to-analog and analog-to-digital converters. It also simplified signal processing of the response signals, Ha said.
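The polarity-only idea can be sketched in a few lines. This is an illustrative toy, not the VTVT hardware: the signal model, function names, and the agreement metric below are invented for the example, and a real system would implement the comparison in on-chip digital logic rather than software.

```python
import math

# Illustrative sketch: reduce a structural response to +/-1 polarity
# samples (as a 1-bit comparator would, with no ADC), then compare a
# baseline signature against a new measurement.

def polarity(samples):
    """Keep only the sign of each sample, as a 1-bit comparator would."""
    return [1 if s >= 0 else -1 for s in samples]

def polarity_agreement(baseline, current):
    """Fraction of samples whose polarity matches (1.0 = identical)."""
    matches = sum(1 for b, c in zip(baseline, current) if b == c)
    return matches / len(baseline)

# Toy responses: a "healthy" tone and a phase-shifted "damaged" one.
healthy = [math.sin(2 * math.pi * 5 * t / 100) for t in range(100)]
damaged = [math.sin(2 * math.pi * 5 * t / 100 + 0.8) for t in range(100)]

score_same = polarity_agreement(polarity(healthy), polarity(healthy))
score_diff = polarity_agreement(polarity(healthy), polarity(damaged))
# score_same is 1.0; score_diff drops below 1.0, flagging a change.
```

Because only signs are compared, the arithmetic reduces to bit operations, which is what allows the converters to be dropped and the processing to stay cheap.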
The effort, which teams the VTVT lab with mechanical engineers from the Center for Intelligent Material Systems and Structures, is supported by NSF. VTVT is part of the Center for Embedded Systems for Critical Applications (CESCA).
Computer engineering researchers, led by Michael Hsiao and Jung-Min Park, are applying their verification, testing, and network security expertise to a system that can help protect the online privacy of children.
The system uses a trusted third-party server, where parents register and grant different levels of access. The computer engineering team is developing the prototype and verifying that the overall system is secure enough to protect children's online privacy. They are working with researchers in the business college, who are providing legal and focus-group input and assessing the system with users.
The verification challenges in the system spring from the very large state spaces involved, said Hsiao. "One of the key steps is representation of the system during the state space traversal to avoid state space explosion." The team is developing methods that are less vulnerable to memory explosion, such as automatic test pattern generation and satisfiability solvers.
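The traversal Hsiao describes can be illustrated with an explicit-state reachability check on a toy system. This is a sketch only: production verifiers use symbolic representations, ATPG, or SAT solvers precisely because the explicit approach below blows up on realistic state spaces; the transition system here is invented for the example.

```python
from collections import deque

# Illustrative sketch: breadth-first state space traversal. The visited
# set prevents re-expanding states, but for real systems it is exactly
# this set that explodes in memory, motivating symbolic methods.

def reachable(initial, successors, is_bad):
    """Return True if a state satisfying is_bad can be reached."""
    seen = {initial}
    frontier = deque([initial])
    while frontier:
        state = frontier.popleft()
        if is_bad(state):
            return True
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# Toy transition relation: a mod-8 counter that skips state 5.
def successors(s):
    nxt = (s + 1) % 8
    return [nxt if nxt != 5 else 6]

assert reachable(0, successors, lambda s: s == 7)      # 7 is reachable
assert not reachable(0, successors, lambda s: s == 5)  # 5 never occurs
```

SAT-based methods sidestep the visited set entirely by encoding "is a bad state reachable in k steps?" as a Boolean formula, which is one way the team's techniques reduce vulnerability to memory explosion.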
For more information, visit the FERMAT website.
While scientists and engineers see the promise of greater performance, efficiency, and smaller size from nano-scale electronics, a group of computer engineers sees the challenges: higher manufacturing defect rates, more frequent faults, and greater susceptibility to variational effects. They are investigating nano computing: how to build complex processors or application-specific computing architectures from nano-scale components and materials that may be fault-prone or defective.
"People are building quantum dot cellular automata, molecular switches and fabrics and nano tube-based transistors/interconnects and discovering there are issues with reliability, quality, and performance," says Sandeep Shukla, an assistant professor of computer engineering. "Silicon devices reaching 45nm and beyond are experiencing variational effects, which lead to unpredictable timing, capacitance, resistance and other factors. Manufacturing techniques are imprecise at that scale and we are seeing high defect rates. Moreover, with very thin oxide layers, these devices are susceptible to atmospheric radiation," he explains.
"How can we efficiently map logic onto an array of quantum dot cellular automata so the logic uses minimum resources, bypasses defective cells, and works correctly in the presence of radiation that might inadvertently change the states of the quantum cells involved?" he asks.
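The defect-bypassing part of Shukla's question can be illustrated with a toy placement routine. Everything here, the grid model, the greedy row-major strategy, and the function names, is a hypothetical sketch, not an actual QCA mapping tool; real placement must also respect timing and adjacency constraints that this ignores.

```python
# Illustrative sketch: greedily assign logic gates to the first available
# non-defective cells of a cell array, bypassing a known defect map.

def map_gates(gates, rows, cols, defective):
    """Place each gate on the next working cell, scanning row-major."""
    placement = {}
    cells = ((r, c) for r in range(rows) for c in range(cols))
    working = (cell for cell in cells if cell not in defective)
    for gate in gates:
        try:
            placement[gate] = next(working)
        except StopIteration:
            raise RuntimeError("not enough working cells to place " + gate)
    return placement

# A 2x3 array with two defective cells; three gates to place.
defects = {(0, 1), (1, 0)}
layout = map_gates(["AND", "OR", "NOT"], 2, 3, defects)
# AND -> (0, 0), OR -> (0, 2), NOT -> (1, 1): defects are skipped.
```

Radiation tolerance, the other half of the question, would further require redundant placements and voting, which is where the resource-minimization and fault-tolerance goals start to pull against each other.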
Molecular fabrics introduce issues as well. "Many researchers believe that in the near future, hybrid structures will dominate computing. How do we interface CMOS with nanofabric, or micro structures with nanostructures? There will be more nanowires in the memory than micro addressing lines and, hence, a problem with interfacing," he adds.
Nanotube-based fabrics will also affect software and may require spatial programming rather than today's temporal programming model, he says. Probabilistic computing may also replace today's deterministic computing. "How do you write code that can harness the massive parallelism afforded by billions of switches?"
Some of the technologies under development are speculative and may not prove feasible, but the industry must be prepared to deal with the issues involved, he says. The topic's growing importance is evidenced by the increasing number of workshops studying nano computing.
This spring, Shukla co-chaired a workshop on nano computing at Europe's biggest conference on Design Automation and Test (DATE 07). He is also organizing a workshop on nano electronics at the NSF in September to help NSF assess the importance of investing in the area. Last May, Shukla organized a workshop for researchers from Virginia sponsored by NSF, Virginia Tech, Virginia Commonwealth University, and the IEEE Joint Chapter on Computer/Industrial Electronics and Control.