MIT researchers develop a robot controlled by the human brain
The MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) has done something remarkable in robotics. Its newest project lets you supervise a robot using just your brain.
Robots are always an interesting topic to read and learn about, and the field is advancing rapidly. CSAIL has been behind many of those advances, building everything from origami robots that transform themselves to artificial intelligence that can sense people through walls.
The newest project lets a person supervise a robot simply by watching it and correct its mistakes with simple hand gestures. The team demonstrated the research in a short video clip showing a human supervising a robot as it drills holes into a piece of wood. The interface works by reading the supervisor's brain and muscle signals.
The video shows that the brain sensors can quickly detect when a person notices the robot is about to make a mistake; the person then uses hand movements to show the robot the correct action to perform. CSAIL Director Daniela Rus said the two sensors working in tandem enable an almost instantaneous response.
"This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications that we've been able to do before using only EEG feedback," Rus said. "By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity."
Typically, controlling a robot with your brain requires you to learn how to think in a certain way so that the sensors can transcribe your commands correctly to the robot. This system takes the opposite approach.
"What's great about this approach is that there's no need to train users to think in a prescribed way," Joseph DelPreto, lead author of a paper on the research, said."The machine adapts to you, and not the other way around."
"By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong," DelPreto said. "This helps make communicating with a robot more like communicating with another person."
According to the study, the robot chose the correct drill target 70 percent of the time on its own; with human supervision, that figure rose to about 97 percent.