Rahul Sarpeshkar, head of the Analog Circuits and Biological Systems Group at MIT, describes himself as an “amphibian” researcher, designing circuits in two domains: the “wet” world of DNA-protein molecular circuits in living cells and the “dry” world of electronic circuits on supercomputing chips.
The ability to do both wet and dry work at an advanced level is not typical. Sarpeshkar, an associate professor of electrical engineering and computer science, can do it because of a fundamental discovery he made: the same thermodynamic laws that govern electron flow in transistors also govern molecular flow in chemical reactions. “I can literally take some advanced electronic analog circuits and then map them into DNA-protein circuits, and vice versa,” he says. “It gives us a very unifying way to synthesize and analyze circuits in both the wet and the dry domains.”
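The mapping works because the same Boltzmann exponential appears in both settings. As a rough sketch of the correspondence (textbook forms, not Sarpeshkar’s specific derivation): a transistor operated in its subthreshold regime passes a current that is exponential in its gate voltage, while a chemical reaction proceeds at a rate that is exponential in its free-energy barrier, so voltages and currents in the one map onto chemical potentials and molecular fluxes in the other.

```latex
% Subthreshold MOS transistor: current exponential in gate voltage
% (U_T = kT/q is the thermal voltage, about 26 mV at room temperature)
I_{DS} \approx I_0 \, e^{\kappa V_{GS}/U_T}, \qquad U_T = \frac{kT}{q}

% Chemical reaction: rate exponential in the free-energy barrier
% (Arrhenius/Boltzmann form)
r \propto e^{-\Delta G^{\ddagger}/kT}
```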
Sarpeshkar’s discovery opens up possibilities for engineering cells and for improving molecular sensing, processing, and synthesis in the pharmaceutical, energy, and food industries. For example, immune cells could be engineered to both detect and kill cancer cells. Highly parallel computational modeling could be used to study genes implicated in cancer and diabetes, to aid drug design, and to evaluate cancer treatments. Supercomputing chips could be used to efficiently design biological circuits in cells.
Underlying all of Sarpeshkar’s work is the idea of energy efficiency. One project promises to enable ultra-low-power, implantable medical devices through the creation of a fuel cell that harvests energy from glucose in the body. An implant thus powered would be antenna-free and battery-free, and it could last for the patient’s entire lifetime, drawing its power solely from the food the patient eats. “As soon as people power themselves, they power the implant,” Sarpeshkar says. Glucose-powered medical devices could revolutionize the field of medical implants, including brain implants and pacemakers.
The Power of Analog Circuits and Analog Computation
One major reason Sarpeshkar concentrates on analog circuits and biological systems is that they are far more energy-efficient than digital systems. In a 1998 paper, Sarpeshkar pointed out that duplicating the human brain’s complex computations digitally would require at least seven orders of magnitude more energy than the brain actually consumes. Biological cells cannot afford, energetically, to break everything down into zeros and ones; instead, they have to use grayscale, analog signals.
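For scale: the human brain is commonly estimated to dissipate on the order of 20 W (a figure not given in the article), so a seven-orders-of-magnitude penalty implies a digital equivalent drawing hundreds of megawatts.

```latex
P_{\text{digital}} \gtrsim 10^{7} \times P_{\text{brain}} \approx 10^{7} \times 20\ \text{W} = 2 \times 10^{8}\ \text{W} \approx 200\ \text{MW}
```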
Sarpeshkar’s group has shown that analog computation is practical. One group member recently engineered a bacterium to compute a square root and other analog functions using only two genetic parts. Others who tried to do the same computation digitally had to use 130 molecular parts. And unlike purely digital synthetic biology, Sarpeshkar says, “Analog synthetic biology has the promise of scaling.”
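The article does not explain how the genetic circuit computes a square root, but the flavor of analog computation is easy to sketch: let a quantity be produced at a rate proportional to the input and removed at a rate proportional to its own square, and its steady state settles at the square root of the input. The toy Python model below is purely illustrative and is not the group’s circuit.

```python
import math

def analog_sqrt(u, k=1.0, dt=0.01, steps=20000):
    """Toy 'analog' square-root computer.

    Integrates dy/dt = k*u - k*y**2. At steady state the production term
    k*u balances the quadratic removal term k*y**2, so y settles at sqrt(u).
    """
    y = 0.0
    for _ in range(steps):
        y += dt * (k * u - k * y * y)
    return y

if __name__ == "__main__":
    for u in (2.0, 4.0, 9.0):
        print(f"input {u:4.1f} -> analog output {analog_sqrt(u):.4f} "
              f"(exact {math.sqrt(u):.4f})")
```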
The group has also built revolutionary electronic circuits inspired by biology. A few years ago, they created a radio-frequency chip that analyzes wireless signals over a wide spectrum extremely quickly by mimicking how the human ear analyzes the spectrum of sound signals. Sarpeshkar is currently investigating circuits inspired by cell biology that perform massively parallel collective analog computation that is robust to noise and reliable despite unreliable parts.
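The ear performs its spectral analysis with a bank of filters whose center frequencies are spaced roughly logarithmically along the cochlea, each with a bandwidth proportional to its center frequency (constant Q); the RF chip applies the same organizing principle to radio spectrum. The sketch below, which assumes scipy and is only a software caricature of that idea rather than the chip’s architecture, splits a signal into log-spaced constant-Q bands and reports the energy in each.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def cochlea_like_filterbank(x, fs, f_lo=100.0, f_hi=8000.0, n_channels=16, q=4.0):
    """Split a signal into log-spaced, constant-Q bands, cochlea-style.

    Returns (center frequencies, per-channel RMS energies).
    """
    centers = np.geomspace(f_lo, f_hi, n_channels)  # logarithmic spacing, as along the cochlea
    energies = []
    for fc in centers:
        bw = fc / q                                 # constant Q: bandwidth scales with frequency
        sos = butter(2, [fc - bw / 2, fc + bw / 2], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, x)                  # zero-phase band-pass filtering
        energies.append(np.sqrt(np.mean(band ** 2)))
    return centers, np.array(energies)

if __name__ == "__main__":
    fs = 44100
    t = np.arange(0, 0.5, 1 / fs)
    x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)  # two test tones
    for fc, e in zip(*cochlea_like_filterbank(x, fs)):
        print(f"{fc:7.1f} Hz : {e:.3f}")
```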
Speaking Different Languages
Sarpeshkar’s ability to bridge seemingly disparate fields finds application in his work outside the university setting. He finds that engaging with the private sector grounds his research in the real world. Small details that might at first seem unimportant to him turn out to be essential when he works with industry. “That which is the most fundamental is also the most practical,” says Sarpeshkar, quoting Ludwig Boltzmann, whose laws of physics he exploited to unify biological and electronic circuit design.
And despite the “synergistic positive feedback loop” in his own lab, Sarpeshkar knows that interdisciplinarity does not always happen automatically when a team of experts is assembled. It is difficult to bring together a biologist and engineer and expect them to speak the same language or to communicate deeply. But when the two worlds are bridged in one person or in one unified way of thinking, the possibilities increase. “If you’re an amphibian,” he says, “you really can understand both the commonalities and the differences in both fields deeply, and work with high creativity and discipline.”