Tag Archives: Brain-Controlled Interface

Creating a brain-controlled TV remote with Arduino


All you need to do is think about changing the channel. Couch potatoes, rejoice! 


If you’ve ever wished you could turn on the TV or switch channels simply by thinking about it, you’re in luck. As recent projects have demonstrated, such sci-fi-like magic is well on its way not only to becoming a reality, but to becoming accessible enough for DIYers to tap into. In fact, Maker Daniel Davis, who runs the website “Tinkernut,” has developed a homemade mind-controlled TV remote using an old Star Wars Force Trainer game and an Arduino.

For those who may recall, the brain-computer interface toy was released back in 2009 as part of Uncle Milton Industries’ Star Wars Science line. It included a headset capable of detecting the brain’s electrical activity (much like an EEG) and relaying those signals to a tube in which a fan levitated a ball. The harder the user focused, the harder the fan blew and the higher the ball rose.

Upon tearing down the game, Tinkernut discovered a NeuroSky EEG chip embedded inside the headset, which he decided to connect to an Arduino Uno (ATmega328) to collect the raw EEG data and relay it to a computer. After scavenging an IR LED and receiver from an old VCR, the hardware was just about complete, and so the Maker went on to create an IR remote.
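
Tinkernut’s exact wiring and firmware live in his project log, but a minimal sketch of the EEG-reading step might look like the following, assuming the headset’s NeuroSky serial output is wired into the Uno’s hardware RX pin and the Arduino Brain Library mentioned below is installed:

```cpp
#include <Brain.h>  // Arduino Brain Library: parses the NeuroSky packet stream

// The headset's serial output feeds the Uno's hardware RX pin,
// so the Brain object simply reads from Serial.
Brain brain(Serial);

void setup() {
  Serial.begin(9600);  // NeuroSky chips stream at 9600 baud
}

void loop() {
  // update() returns true whenever a fresh, complete packet has arrived
  if (brain.update()) {
    // One CSV row: signal quality, attention, meditation and the EEG power bands
    Serial.println(brain.readCSV());
  }
}
```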

“Once this program is uploaded to your Arduino, open up the serial monitor, point a remote at the IR receiver and start pressing buttons. You should see a response in the results in the serial monitor screen. This will be the button code. You want to write down this code for later use,” the Maker writes.
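
In other words, the receiver half of the build is a capture tool. A sketch in the spirit of the classic IRremote 2.x receive demo would do the job; the receiver’s data pin on pin 11 is an assumption here:

```cpp
#include <IRremote.h>  // classic (2.x) IRremote API

const int RECV_PIN = 11;   // IR receiver data pin (assumed wiring)
IRrecv irrecv(RECV_PIN);
decode_results results;

void setup() {
  Serial.begin(9600);
  irrecv.enableIRIn();     // start listening for IR pulses
}

void loop() {
  if (irrecv.decode(&results)) {
    Serial.println(results.value, HEX);  // this is the button code to jot down
    irrecv.resume();       // get ready for the next button press
  }
}
```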

With just a little coding, Tinkernut was able to combine both BCI and remote functionalities into one mind-blowing project, enabling the headset to switch a TV on and off simply by concentrating.
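
That glue logic can be surprisingly compact. As a rough illustration (not Tinkernut’s exact code), the loop below fires a previously captured IR code whenever the headset reports a clean signal and an attention level above a threshold; the code value, protocol and threshold are all placeholders to tune for your own TV:

```cpp
#include <Brain.h>
#include <IRremote.h>

Brain brain(Serial);
IRsend irsend;  // IRremote 2.x transmits on pin 3 of an Uno

const byte ATTENTION_THRESHOLD = 60;             // attention runs 0-100; tune to taste
const unsigned long TV_POWER_CODE = 0xE0E040BF;  // placeholder: use the code you captured

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (brain.update()) {
    // A signal quality of 0 means a good connection; only act on clean reads
    if (brain.readSignalQuality() == 0 &&
        brain.readAttention() > ATTENTION_THRESHOLD) {
      irsend.sendNEC(TV_POWER_CODE, 32);  // assumes an NEC-protocol TV
      delay(2000);  // crude debounce: one burst of focus = one button press
    }
  }
}
```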

Intrigued? Head over to the Maker’s official project log, and check out his Arduino Brain Library on GitHub.

Controlling a robotic arm with your brain


Mind = Blown.


When it comes to brain-controlled interfaces, the field has come a long way since the earliest days of research at UCLA in the 1970s. That work, conducted under a grant from the National Science Foundation and followed by a contract from DARPA, produced the papers in which the expression “brain-computer interface” first appeared in scientific literature. Fast forward nearly 40 years, and Makers are exploring a wide range of possibilities, from EEG beanies that read a wearer’s mood and change color to match, to prostheses that amputees can instruct to restore movement to their arms.

(Source: MAKE)

Writing for MAKE: Magazine, Nathan Hurst highlights a recent project from Cognitive Technology: its Make it Move interactive display in San Francisco. The device was simply plugged into a computer and screen, as well as into a two-jointed robotic arm.

To bring the exhibit to life, the team adapted an EEG board from OpenBCI. The ATmega328-based platform measures brain activity in both hemispheres and records that data on eight channels. However, it normally requires electrodes to be pasted onto the scalp, something that wouldn’t work for a public exhibit. Instead, the Makers opted for a soft neoprene cap with dry electrodes, held on the head by a Velcro strap under the chin, which reads brainwaves and outputs them to the OpenBCI board.
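
For the curious, OpenBCI’s boards of that era documented a 33-byte binary streaming format: a 0xA0 header byte, a sample counter, eight 24-bit big-endian channel values, six bytes of aux data and a 0xC0 stop byte. Purely as an illustration of that documented layout (not code from the exhibit), decoding one sample might look like:

```cpp
#include <stdint.h>

// Sign-extend a 24-bit big-endian two's-complement value to 32 bits
int32_t decode24(const uint8_t *p) {
  int32_t v = ((int32_t)p[0] << 16) | ((int32_t)p[1] << 8) | p[2];
  if (v & 0x00800000L) v |= (int32_t)0xFF000000;
  return v;
}

// Returns true and fills channels[] if the 33-byte packet frames correctly
bool parsePacket(const uint8_t packet[33], int32_t channels[8]) {
  if (packet[0] != 0xA0 || packet[32] != 0xC0) return false;  // framing check
  for (int i = 0; i < 8; i++) {
    channels[i] = decode24(&packet[2 + 3 * i]);  // channel data starts at byte 2
  }
  return true;
}
```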

“I think if this technology advances more, it will help a lot of disabled people who can’t move their arms,” Jisoo Kim tells MAKE. “Since everything is open-source, people can build it themselves, so I think it will advance a lot more.”

Maker Tomas Vega, a University of California student and Cog Tech member, explains that an EEG device reads brain activity in the form of electrical signals on the scalp. Those signals are then processed, filtered and analyzed into a more digestible form of feedback: software interprets the information from the EEG and helps process the signals into useful output. However, the signals can come in rather noisy, so the program must employ some machine learning to sort them out.
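
Cog Tech’s actual pipeline isn’t published in the article, but the smooth-then-classify idea can be sketched in a few lines: an exponential moving average tames the noisy reading before a simple threshold (standing in here for the real trained classifier) turns it into a command:

```cpp
// Toy illustration only: real BCI pipelines extract band-power features
// and feed them to a trained classifier rather than a fixed threshold.
float smoothed = 0.0;
const float ALPHA = 0.1;  // smoothing factor: smaller = smoother but laggier

// Map a normalized (0.0-1.0) feature value to a movement command
int classify(float rawSample) {
  smoothed = ALPHA * rawSample + (1.0 - ALPHA) * smoothed;
  if (smoothed > 0.7) return  1;  // interpret as "wave right"
  if (smoothed < 0.3) return -1;  // interpret as "wave left"
  return 0;                       // dead zone: hold still
}
```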

(Source: MAKE)

As MAKE points out, EEG interpretation faces a bit of skepticism from the academic community, and in the exhibit’s setting, the team faced the additional barrier of teaching visitors to actually control the interface. While brain-controlled interfaces have been used primarily for science, the Cog Tech crew hopes that new tools will help spur further research and, more importantly, address more practical problems, including assisting those who are paralyzed.

For the exhibit, Cog Tech is harnessing the powers of BCI to command a robotic arm that Jon Ferran devised using an Arduino Mega (ATmega2560) along with some parts from an old bartending arm. At the moment, motion is limited to waving left and right.
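
On the Arduino side, that left-and-right wave is straightforward hobby-servo territory. A minimal sketch, with the pin number and angle limits as assumptions, could look like this; in the real exhibit, the command driving moveArm() would come from the EEG classifier:

```cpp
#include <Servo.h>

Servo baseJoint;  // hypothetical servo at the arm's base

void setup() {
  baseJoint.attach(9);  // assumed PWM pin
  baseJoint.write(90);  // start centered
}

// Nudge the arm a few degrees left (-1) or right (+1), within safe limits
void moveArm(int command) {
  int angle = baseJoint.read();  // last commanded angle
  if (command > 0)      angle = min(angle + 5, 150);
  else if (command < 0) angle = max(angle - 5, 30);
  baseJoint.write(angle);
}

void loop() {
  // Demo motion: wave side to side. In the exhibit, the sign of the
  // command would come from the EEG pipeline instead of this loop.
  for (int i = 0; i < 12; i++) { moveArm(+1); delay(50); }
  for (int i = 0; i < 12; i++) { moveArm(-1); delay(50); }
}
```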

Kim explains to MAKE that after just a few hours of training, she could already feel herself getting better at controlling the arm, something the team hopes others will one day have the chance to experience. “It was pretty difficult. The most difficult part was to think the way that can control the arm; imagining moving my left or right arm is different from moving it.”

(Source: MAKE)

While BCI boasts several possible applications in basic computer control, such as replacing mice and keyboards, some have a more personal goal as well. “I want to be a cyborg. That’s my long-term goal,” Vega concludes. “I’m going to work all my life to make this a reality. There’s nothing that makes my heart beat faster than this dream of being enhanced by technology. This dream of being augmented, and augmenting my capabilities as a human, and trying to push the boundary.”

Interested in learning more? You can read the entire feature in MAKE: Magazine here.