In response to a reader’s question about self-driving cars, Mercury News reporter Gary Richards describes new technology in the works by MIT researchers to allow “driverless cars to change lanes more like human drivers do.”
To demonstrate that the data used to train machine learning algorithms can greatly influence their behavior, MIT researchers fed gruesome and violent content into an AI algorithm, writes Benjamin Fearnow for Newsweek. The result is “Norman,” an AI system in which “empathy logic simply failed to turn on,” explains Fearnow.
In this video, BBC Click spotlights VirtualHome, a simulator developed by CSAIL researchers that could be used to teach robots to perform household chores. The researchers hope the system could one day enable seamless human-robot collaboration by allowing robots to “cooperate with [humans] in finishing their activity,” explains graduate student Xavier Puig.
CSAIL researchers have developed a new system that could be used to train machines to complete tasks, writes Patrick Lucas Austin for Gizmodo. The researchers hope the system could eventually be used to “teach robots how to accomplish tasks simply by showing them actual instructional videos,” Austin explains.
MIT researchers have created a system that aims to teach robots how to perform household chores by breaking down activities into simple steps, reports Sean Captain for Fast Company. Captain explains that in order to simplify each chore, the researchers “identified sub-tasks to describe thousands of duties in settings such as kitchens, dining rooms, and home offices.”
Wired reporter Matt Simon writes that CSAIL researchers have developed a new virtual system that could eventually be used to teach robots how to perform household chores. Researchers hope the system could one day help robots “learn to anticipate future actions and be able to change the environment for the human,” explains PhD student Xavier Puig.
MIT researchers have developed a LiDAR-based navigation system that allows autonomous vehicles to drive on rural roads with “no detailed, three-dimensional map for the vehicle to reference,” reports Rob Verger of Popular Science. “The solution for urban mapping really doesn’t scale very well to a huge portion of the country,” explains graduate student Teddy Ort.
CSAIL researchers have developed a system that uses LiDAR and GPS to allow self-driving cars to navigate rural roads without detailed maps, writes Tracey Lindeman of Motherboard. Autonomous ride-hailing or car-sharing is important in rural communities because “the carless in these areas have few transportation options; many small communities don’t even have public buses,” notes Lindeman.
Eric Mack writes for Forbes about a new system from MIT researchers that uses GPS in conjunction with LiDAR and IMU sensors to power self-driving vehicle navigation. Graduate student Teddy Ort says the system “shows the potential of self-driving cars being able to actually handle roads beyond the small number that tech companies have mapped.”
MapLite, a new system developed by CSAIL, aims to help autonomous vehicles navigate uncharted areas, writes Jesus Diaz for Co.Design. “[I]f autonomous cars can reach the millions of people who live beyond the city and are unable to pilot their own vehicles,” said graduate student Teddy Ort, “they will be uniquely capable of providing mobility to those who have very few alternatives.”
Writing for The Verge, Angela Chen highlights advances in AI that are allowing researchers to discover and understand new materials at a rapid pace. Chen cites a study co-authored by Assistant Prof. Elsa Olivetti, who “developed a machine-learning system that scans academic papers to figure out which ones include instructions for making certain materials.”
MIT researchers have developed “the first artificial system to mimic the way the brain interprets sound – and it rivals humans in its accuracy,” reports Samia Bouzik for NOVA Next. “The research offers a tantalizing new way to study the brain…[and] could boost some neuroscience research into the fast track,” writes Bouzik.
After several years of experimentation, graduate student Arnav Kapur developed AlterEgo, a device that interprets subvocalizations and can be used to control digital applications. Describing the implications as “exciting,” Katharine Schwab at Co.Design writes, “The technology would enable a new way of thinking about how we interact with computers, one that doesn’t require a screen but that still preserves the privacy of our thoughts.”
AlterEgo, a device developed by Media Lab graduate student Arnav Kapur, “can transcribe words that wearers verbalise internally but do not say out loud, using electrodes attached to the skin,” writes Samuel Gibbs of The Guardian. “Kapur and team are currently working on collecting data to improve recognition and widen the number of words AlterEgo can detect.”
Researchers at the Media Lab have developed a device, known as “AlterEgo,” which allows an individual to discreetly query the internet and control devices by using a headset “where a handful of electrodes pick up the miniscule electrical signals generated by the subtle internal muscle motions that occur when you silently talk to yourself,” writes Rob Verger for Popular Science.