
In the Media

Media Outlet:
Forbes
Description:

A new paper from graduate students in EECS details a newly developed chip that allows neural networks to run offline while drastically reducing power consumption. “That means smartphones and even appliances and smaller Internet of Things devices could run neural networks locally,” writes Eric Mack for Forbes.

Related News

MIT researchers have developed a special-purpose chip that increases the speed of neural-network computations by three to seven times over its predecessors, while reducing power consumption 93 to 96 percent. That could make it practical to run neural networks locally on smartphones or even to embed them in household appliances.

Neural networks everywhere

New chip reduces neural networks’ power consumption by up to 95 percent, making them practical for battery-powered devices.