Topic

Computer vision


Displaying 76–90 of 179 news clips related to this topic.

Wired

Wired reporter Matt Simon writes that MIT researchers have developed a new system that allows robots to visually inspect and then pick up new objects, all without human guidance. Graduate student Lucas Manuelli explains that the system is “all about letting the robot supervise itself, rather than humans going in and doing annotations.”

Quanta Magazine

Quanta Magazine reporter Natalie Wolchover spotlights how the work of Profs. William Freeman, Antonio Torralba and Ramesh Raskar is shedding light on how visual signals can be used to uncover information on hidden objects. Freeman explains that he is thrilled by the idea that “the world is rich with lots of things yet to be discovered.”

Gizmodo

CSAIL researchers have created a deep learning system that lets users isolate individual musical instruments in a video by clicking on the specific instrument, writes Andrew Liszewski for Gizmodo. The researchers suggest the system “could be a vital tool when it comes to remixing and remastering older performances where the original recordings no longer exist,” explains Liszewski.

BBC News

In this video, BBC Click spotlights VirtualHome, a simulator developed by CSAIL researchers that could be used to teach robots to perform household chores. The researchers hope the system could one day allow for seamless human-robot collaboration by allowing robots to “cooperate with [humans] in finishing their activity,” explains graduate student Xavier Puig.

Gizmodo

CSAIL researchers have developed a new system that could be used to train machines to complete tasks, writes Patrick Lucas Austin for Gizmodo. The researchers hope the system could eventually be used to “teach robots how to accomplish tasks simply by showing them actual instructional videos,” Austin explains.

Fast Company

MIT researchers have created a system that aims to teach robots how to perform household chores by breaking down activities into simple steps, reports Sean Captain for Fast Company. Captain explains that in order to simplify each chore, the researchers “identified sub-tasks to describe thousands of duties in settings such as kitchens, dining rooms, and home offices.”

Wired

Wired reporter Matt Simon writes that CSAIL researchers have developed a new virtual system that could eventually be used to teach robots how to perform household chores. Researchers hope the system could one day help robots “learn to anticipate future actions and be able to change the environment for the human,” explains PhD student Xavier Puig.

Salon

MIT researchers have developed a virtual reality system that can train drones to fly faster while also avoiding obstacles, reports Lauren Barack for Salon. Barack explains that the “researchers are programming the drones so they think they're in a living room or bedroom while they fly. They virtually see obstacles around them, but those impediments aren't really there.”

Popular Science

Using LiDAR sensors, MIT researchers have developed an autonomous vehicle navigation system for rural roads with “no detailed, three-dimensional map for the vehicle to reference,” reports Rob Verger of Popular Science. “The solution for urban mapping really doesn’t scale very well to a huge portion of the country,” explains graduate student Teddy Ort.

Motherboard

CSAIL researchers have developed a system that uses LIDAR and GPS to allow self-driving cars to navigate rural roads without detailed maps, writes Tracey Lindeman of Motherboard. Autonomous ride-hailing or car-sharing is important in rural communities because “the carless in these areas have few transportation options; many small communities don’t even have public buses,” notes Lindeman.

Forbes

Eric Mack writes for Forbes about a new system from MIT researchers that uses GPS in conjunction with LIDAR and IMU sensors to power self-driving vehicle navigation. Graduate student Teddy Ort says the system “shows the potential of self-driving cars being able to actually handle roads beyond the small number that tech companies have mapped.”

Co.Design

MapLite, a new system developed by CSAIL, aims to help autonomous vehicles navigate uncharted areas, writes Jesus Diaz for Co.Design. “[I]f autonomous cars can reach the millions of people who live beyond the city and are unable to pilot their own vehicles,” said graduate student Teddy Ort, “they will be uniquely capable of providing mobility to those who have very few alternatives.”

Smithsonian Magazine

Emily Matchar of Smithsonian details research out of the Media Lab, which seeks to help both autonomous and standard vehicles avoid obstacles in heavy fog conditions. “You’d see the road in front of you as if there was no fog,” says graduate student and lead researcher Guy Satat. “[O]r the car would create warning messages that there’s an object in front of you.”

CNBC

MIT Media Lab researchers have created a system that can detect obstacles through fog that are not visible to the human eye, writes Darren Weaver for CNBC. “The goal is to integrate the technology into self-driving cars so that even in bad weather, the vehicles can avoid obstacles,” explains Weaver.

Gizmodo

MIT researchers have developed a new imaging system that could allow autonomous vehicles to see through dense fog, writes Andrew Liszewski of Gizmodo. The laser-based system, which uses a new processing algorithm, was able “to clearly see objects 21 centimeters further away than human eyes could discern,” Liszewski writes.