Our daily experiences with life and silicon constantly remind us that computation requires energy. Prima facie, an attractive analogy with steam engines suggests itself: perhaps computers are machines that convert energy into "computational work," and perhaps there are laws of physics that limit their efficiency. Careful analysis by Charles Bennett in the 1970s led him to suggest that computation can be done for arbitrarily little energy per step, but this holds only in the limit of infinite time. Our finite-time reformulation of the problem identifies and encapsulates certain semantic aspects of the computational process, as well as certain physical aspects of the control protocol, that were absent from previous formulations. It draws on non-equilibrium thermodynamics, optimal control, and computational complexity theory. Here we model the simplest case, that of switching a single bit, and conjecture that there exists a technology-independent lower bound on the energy required.
Manoj Gopalkrishnan has been a Ramanujan Fellow and faculty member in the School of Technology and Computer Science at TIFR Mumbai since 2009. He spent a semester as a Research Assistant Professor in the Mathematics Department at Duke University. He received his PhD in Computer Science in 2008 from the University of Southern California under the supervision of Professor Leonard Adleman, and a BTech in Computer Science and Engineering from IIT Kharagpur in 2003. His research interests are in molecular programming, reaction networks, and the thermodynamics of computation.