Houston, TX 77005
4:00 p.m. Thursday, March 7, 2013
On Campus | Alumni
Well over a decade ago, many believed that an engine of growth driving the semiconductor and computing industries, captured nicely by Gordon Moore's remarkable prophecy (Moore's law), was speeding towards a dangerous cliff edge. Predictions ranged from expressions of concern to doomsday scenarios, and the exact time when serious hurdles would beset us varied quite a bit, with some of the more optimistic warnings giving Moore's law until 2020. Needless to say, many people have spent time and effort, with great success, finding ways to substantially postpone the dreaded cliff edge, if not avoid it altogether. When faced with this issue, I decided to consider a different approach: one that treats falling off the metaphorical cliff as a deliberate design choice, made in a controlled manner. This results in devices that switch and produce bits that are correct, namely have the intended value, only with a probabilistic guarantee; individual results can therefore be incorrect. Such devices and the associated circuits and computing structures are now broadly referred to as inexact designs, circuits and architectures.

In this talk, I will start with the beginnings of this idea in 2002, one that Technology Review labeled heretical in its TR10 citation, and give an overview of the range of ideas that my students and other groups around the world have been developing since, which embody inexact computing today. Despite being probabilistic, inexact designs can be significantly more efficient in the energy they consume, their speed of execution and their area needs, which makes them attractive for resilient applications that can tolerate error. I will also contrast this style of design with traditional approaches, with a rich history, aimed at realizing reliable computing from unreliable elements, starting with von Neumann's influential lectures and elegantly developed further by Shannon-Weaver and others.
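To make the notion of bits that "have the intended value only with a probabilistic guarantee" concrete, here is a minimal illustrative sketch (not from the talk): a toy adder whose output bits each flip with some probability `p_flip`, so correctness of the word-level result is probabilistic rather than guaranteed. The function name and error model are assumptions chosen for illustration.

```python
import random

def inexact_add(a, b, p_flip, bits=8, rng=random):
    """Add two integers modulo 2**bits, then flip each output bit
    with probability p_flip.

    A loose model of an inexact adder: each bit is correct only with
    probability 1 - p_flip, and p_flip = 0 recovers exact addition.
    """
    exact = (a + b) % (1 << bits)
    noisy = 0
    for i in range(bits):
        bit = (exact >> i) & 1
        if rng.random() < p_flip:
            bit ^= 1  # this bit takes the wrong value
        noisy |= bit << i
    return noisy

if __name__ == "__main__":
    rng = random.Random(0)
    trials = 10_000
    wrong = sum(inexact_add(100, 27, 0.01, rng=rng) != 127
                for _ in range(trials))
    # With 8 bits each flipping independently at 1%, roughly 8% of
    # additions come out wrong; a resilient application would absorb this.
    print(f"word-level error rate at p_flip=0.01: {wrong / trials:.3f}")
```

In a real inexact design the flip probability would not be uniform: low-order bits, whose errors matter least, are allowed to be less reliable in exchange for energy savings.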