Robots are always an interesting topic to read and learn about, and the field of robotics is advancing rapidly. The MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) has made some remarkable advances in the sector, from origami robots that transform themselves to artificial intelligence that can sense people through walls.
CSAIL's newest project lets you supervise a robot just by watching it and correct its mistakes with simple hand gestures. The team demonstrated the research in a short video, in which a human supervises a robot that is drilling a hole in a piece of wood.
The video shows that brain sensors can quickly detect when a person notices the robot is about to make a mistake. The person then uses hand movements to instruct the robot on the correct action to perform. CSAIL Director Daniela Rus said the two sensors working in tandem enable an almost instantaneous response.
"This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we've been able to do before using only EEG feedback," Rus said. "By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity."
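To make the idea concrete, here is a minimal, purely illustrative sketch of how the two signal streams described above might be combined in a supervision loop. This is not CSAIL's actual system; the thresholds, signal scores, and function names are all hypothetical assumptions for illustration.

```python
# Hypothetical sketch: combining an EEG "error" signal with an EMG gesture
# signal to supervise a robot's choice. All names and thresholds are invented.

ERROR_THRESHOLD = 0.5  # hypothetical score above which the EEG flags a mistake


def detect_error(eeg_score):
    """Flag a mistake when the EEG error-potential score crosses the threshold."""
    return eeg_score > ERROR_THRESHOLD


def classify_gesture(emg_left, emg_right):
    """Pick a correction direction from (hypothetical) muscle activations."""
    if emg_left > emg_right:
        return "left"
    if emg_right > emg_left:
        return "right"
    return "none"


def supervise(robot_choice, eeg_score, emg_left, emg_right):
    """Keep the robot's own choice unless the observer's brain signal flags an
    error; in that case, use the hand-gesture (EMG) signal to redirect it."""
    if not detect_error(eeg_score):
        return robot_choice
    gesture = classify_gesture(emg_left, emg_right)
    return gesture if gesture != "none" else robot_choice


# Example: the robot picks the right-hand drill spot, the observer's EEG
# flags an error, and their gesture points left, so the choice is corrected.
print(supervise("right", eeg_score=0.8, emg_left=0.9, emg_right=0.1))  # left
```

The point of the sketch is the division of labour the researchers describe: the EEG stream answers the yes/no question "did the human notice a mistake?", while the EMG stream supplies the spatial correction.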
Controlling a robot with your brain alone typically requires learning to think in a particular way so that the sensors can transcribe your commands correctly.
"What's great about this approach is that there's no need to train users to think in a prescribed way," Joseph DelPreto, lead author of a paper on the research, said. "The machine adapts to you, and not the other way around."
"By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong," DelPreto said. "This helps make communicating with a robot more like communicating with another person."
According to the study, the robot chose the correct drill spot 70 percent of the time on its own; with human supervision, that figure rose to 97 percent, close to perfect accuracy.