By moving their hands and fingers, users can direct a robot to play piano or shoot a basketball, or they can manipulate objects in a virtual environment.
Mosquitoes' flight patterns change in response to different sensory cues, a new study finds. The work could lead to more effective traps and mosquito control strategies.