Vietnamese-Australian entrepreneur Tan Le tells us that our communication with machines has always involved explicit commands. Human communication is much subtler – we communicate expressively, through our faces and bodies. Her project – and her company, Emotiv – aims to let computers respond to our facial expressions and emotional experiences by interpreting the signals from our brains.
We can measure brain signals by mapping the electrical impulses the brain produces. But brain structure varies with how the cortex folds – even in identical twins, the same signal can come from different places. So the first breakthrough is an algorithm that can "unfold" the cortex and identify where a signal is actually coming from.
The second challenge is building sensors that can pick up those signals. Traditionally, this requires a "hairnet" of electrodes that is expensive ($10,000+) and awkward to wear, since each sensor needs conductive gel.
A preselected (bald) volunteer puts on a headset designed to interpret his brain signals. Tan Le creates a new user in her system – the EPOC headset – and starts training on a "neutral" signal, the normal state of his relaxed brain. Then he spends a few seconds imagining pulling a virtual cube on the screen toward him, and the system trains on that data. Before Tan Le can even announce that he's going to try to move the cube with his mind, the cube slides toward him, drawing applause from the audience.
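The demo runs on Emotiv's own software, which isn't shown in the talk. As a rough illustration of the kind of training loop described above – record a neutral baseline, record a "pull" thought, fit a classifier – here is a minimal sketch. The signal is simulated, and the feature choice (per-channel band power) and classifier are my assumptions, not Emotiv's method.

```python
# Sketch of a "train on neutral, then train on a command" loop.
# NOT Emotiv's API: the EEG here is simulated, and the features/classifier
# are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 128            # sample rate in Hz (the EPOC reportedly samples at 128 Hz)
CHANNELS = 14       # the EPOC headset has 14 electrodes
WINDOW = FS         # one-second analysis windows

def band_power(window, lo, hi):
    """Average spectral power per channel in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(window, axis=1)) ** 2
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / FS)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[:, mask].mean(axis=1)

def features(window):
    # Concatenate alpha (8-12 Hz) and beta (13-30 Hz) power across channels.
    return np.concatenate([band_power(window, 8, 12), band_power(window, 13, 30)])

def record(seconds, bias=0.0):
    """Stand-in for reading raw EEG: yields one-second windows of noise."""
    rng = np.random.default_rng()
    return [rng.normal(bias, 1.0, size=(CHANNELS, WINDOW)) for _ in range(seconds)]

# "Neutral" baseline: a few seconds of the volunteer relaxing.
neutral = [features(w) for w in record(8)]
# "Pull" command: a few seconds of imagining the cube moving toward him.
pull = [features(w) for w in record(8, bias=0.2)]

X = np.vstack(neutral + pull)
y = np.array([0] * len(neutral) + [1] * len(pull))
clf = LinearDiscriminantAnalysis().fit(X, y)

# At runtime each new window is classified; a "pull" prediction drives the cube.
live = features(record(1, bias=0.2)[0])
print("detected:", "pull" if clf.predict([live])[0] == 1 else "neutral")
```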
Now, she asks him to imagine the cube slowly fading out. Again, he’s able to dim it and bring it back with nothing but his thoughts. At first, he’s only able to do it for a few seconds – after about a minute, he makes it disappear completely and the crowd cheers.
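A gradual effect like the fade presumably comes from treating the detection as a graded value rather than a binary label. Continuing the sketch above (and assuming a classifier trained the same way on the "disappear" thought rather than "pull"; the smoothing constant is an arbitrary choice):

```python
# Turn per-window detections into a smooth fade. Assumes `clf`, `features`,
# and `record` from the previous snippet; ALPHA is arbitrary, not anything
# Emotiv documents.
opacity = 1.0
ALPHA = 0.2  # exponential smoothing factor

for window in record(10, bias=0.2):            # ten one-second windows
    p_disappear = clf.predict_proba([features(window)])[0, 1]
    target = 1.0 - p_disappear                 # stronger detection -> more faded
    opacity += ALPHA * (target - opacity)      # ease toward the target
    print(f"cube opacity: {opacity:.2f}")
```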
We see a demo of how this technology might interface with robots – a helicopter that lifts off, a robot that moves around based on facial expressions. We see someone opening and closing the curtains in her smart house with her mind (it looks like more work than getting up and closing them, to be frank), but other applications make much more sense – using the interface to control an electric wheelchair, which could be profoundly useful for someone who is seriously disabled.