Tag Archives: Brain-Controlled Robot

Controlling a robotic arm with your brain


Mind = Blown.


When it comes to brain-controlled interfaces, the field has come a long way since its earliest days of research at UCLA in the 1970s. Conducted under a grant from the National Science Foundation and later a contract from DARPA, that work produced the papers in which the expression BCI first appeared in scientific literature. Fast forward nearly 40 years, and Makers are exploring a wide range of possibilities, from EEG beanies that read a wearer's mood and change color accordingly to amputees controlling prosthetic arms with their thoughts.

(Source: MAKE)


Writing for MAKE: Magazine, Nathan Hurst highlights a project from Cognitive Technology featured in its Make it Move interactive display in San Francisco. The device plugs into a computer and screen, and into a two-jointed robotic arm.

To bring the exhibit to life, the team adapted an EEG board from OpenBCI. The ATmega328-based platform measures brain activity in both hemispheres and records that data on eight channels. However, it normally requires electrodes to be pasted onto the scalp, something that wouldn't work for a public exhibit. Instead, the Makers opted for a soft neoprene cap with dry electrodes, held in place by a velcro strap under the chin, which reads brainwaves and outputs them to the OpenBCI board.


“I think if this technology advances more, it will help a lot of disabled people who can’t move their arms,” Jisoo Kim tells MAKE. “Since everything is open-source, people can build it themselves, so I think it will advance a lot more.”

Maker Tomas Vega, a University of California student and Cog Tech member, explains that an EEG device reads brain activity in the form of electrical signals on the scalp. Those signals are then processed, filtered and analyzed into a more digestible form of feedback: software interprets the data from the EEG and helps turn the signals into useful output. However, these signals can be rather noisy, so the program must employ some machine learning to sort them out.
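The filtering step Vega describes can be sketched in a few lines. The snippet below is a minimal illustration, not the exhibit's actual code; the 250 Hz sample rate and the 8-13 Hz alpha band are assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sample rate in Hz

def bandpass(signal, lo=8.0, hi=13.0, fs=FS, order=4):
    """Butterworth band-pass filter, here tuned to the alpha band (8-13 Hz)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)  # zero-phase filtering

def band_power(signal, fs=FS):
    """Mean squared amplitude of the filtered signal: one simple EEG feature."""
    return float(np.mean(bandpass(signal, fs=fs) ** 2))

# Synthetic check: a 10 Hz tone (inside the band) should retain far more
# power after filtering than a 40 Hz tone (outside it).
t = np.arange(0, 4, 1 / FS)
in_band = np.sin(2 * np.pi * 10 * t)
out_of_band = np.sin(2 * np.pi * 40 * t)
```

A real pipeline would feed features like `band_power` into a classifier, which is where the machine learning Vega mentions comes in.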

(Source: MAKE)


As MAKE points out, EEG interpretation faces a bit of skepticism from the academic community, and in the exhibit's setting, the team faced the additional challenge of teaching visitors to actually control the interface. While brain-controlled interfaces have been used primarily for science, the Cog Tech crew hopes that new tools will help spur further research and, more importantly, address practical problems such as assisting those who are paralyzed.

For the exhibit, Cog Tech is harnessing the power of BCI to command a robotic arm that Jon Ferran devised using an Arduino Mega (ATmega2560) along with some parts from an old bartending arm. At the moment, motion is limited to waving left and right.

Kim explains to MAKE that after just a few hours of training, she could already feel herself getting better at controlling the arm, something the team hopes others will one day have the chance to experience. “It was pretty difficult. The most difficult part was to think the way that can control the arm; imagining moving my left or right arm is different from moving it.”

(Source: MAKE)


While BCI boasts several possible applications in basic computer control, such as replacing mice and keyboards, some have a more personal goal as well. “I want to be a cyborg. That’s my long-term goal,” Vega concludes. “I’m going to work all my life to make this a reality. There’s nothing that makes my heart beat faster than this dream of being enhanced by technology. This dream of being augmented, and augmenting my capabilities as a human, and trying to push the boundary.”

Interested in learning more? You can read the entire feature in MAKE: Magazine here. 

Take over the world with this $500 mind-controlled robot

Have you ever thought of controlling your own legion of robots with nothing but your mind? Chip Audette has made that fantasy a reality.


Using OpenBCI, a low-cost programmable open-source EEG platform that gives Makers easy access to their brainwaves, Audette has been able to use just his mind to control a Hexbug Spider.

When he closes his eyes, the robot moves forward; when he focuses on specific flashing images, the robot turns left or right. Generally, there are two images on a computer screen, each flashing at a different frequency. As the Maker stares at one image, the brainwave reader can detect how quickly that image is flashing and therefore determine which direction to turn.
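The flashing-image trick relies on the brain's visual response locking onto the flicker frequency, which can be picked out of the EEG spectrum with a simple FFT. The sketch below is an illustrative reconstruction, not Audette's code; the sample rate and the 7 Hz/12 Hz flicker frequencies are made up for the example.

```python
import numpy as np

FS = 250  # assumed sample rate in Hz
TARGETS = {"left": 7.0, "right": 12.0}  # hypothetical flicker frequencies

def classify_gaze(eeg, fs=FS, targets=TARGETS):
    """Pick the target whose flicker frequency shows the strongest
    spectral peak in a single EEG channel."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1 / fs)

    def peak(f):
        # magnitude at the FFT bin nearest the flicker frequency
        return spectrum[np.argmin(np.abs(freqs - f))]

    return max(targets, key=lambda name: peak(targets[name]))

# Synthetic check: a signal dominated by a 12 Hz component plus noise
# should be classified as a "right" gaze.
t = np.arange(0, 4, 1 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.3 * rng.standard_normal(t.size)
```

Longer recording windows give finer frequency resolution, which is one reason these systems respond with a noticeable lag.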


As with many prototypical designs, there are some glitches, but the fact that Audette has created any sort of functionality for this low cost is impressive. The Maker used OpenBCI’s EEG electrodes and custom brain-signal-processing board, all connected to an Arduino Uno (ATmega328), which serves as the interface between the Hexbug and his computer.

“The PC processes the EEG data looking for the Alpha waves or the visually-entrained waves. If any are detected, it decides what commands to give the robot. The commands are conveyed back to the Arduino, which then drives the remote control, which the Hexbug receives over its usual IR link,” Audette noted in his blog.
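The decision step Audette describes, detecting a feature and then mapping it to a robot command, can be sketched as a small dispatcher. The thresholds and command names below are hypothetical, chosen only to illustrate the flow he outlines; in the real build the chosen command is relayed to the Arduino, which drives the remote over the Hexbug's IR link.

```python
def decide_command(alpha_power, left_peak, right_peak,
                   alpha_thresh=2.0, ssvep_thresh=1.5):
    """Map EEG features to a Hexbug command (thresholds are illustrative).

    Strong alpha power (eyes closed) wins first; otherwise the stronger
    of the two flicker responses picks a turn direction.
    """
    if alpha_power > alpha_thresh:
        return "FORWARD"
    if max(left_peak, right_peak) > ssvep_thresh:
        return "LEFT" if left_peak > right_peak else "RIGHT"
    return "STOP"  # nothing detected: hold still

# In the real system the result would be written over serial to the
# Arduino, e.g.:
# serial_port.write((decide_command(a, l, r) + "\n").encode())
```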


Though the current system is limited by the simplicity of its technology, the Maker says, “Ideally, I’d just think the word ‘Fire!’ and the robot would respond. Unfortunately, those kinds of brain waves are too hard to detect.”

As Wired’s Robert McMillan writes, scientific-grade electroencephalography (EEG) monitors can cost thousands of dollars, but thanks to devices such as the Emotiv, there’s been a mini-boom in low-cost brain-hacking gear. OpenBCI wants to be the open-source player in this space. Its kit comes with its own mini-computer and sensors that you jack into a black helmet-like device, called the “Spider Claw 3000,” that you make on a 3D printer.

“What we really want to do is just provide the hardware to let people begin learning,” explains Conor Russomanno, one of OpenBCI’s creators.

Brain-computer interfacing remains a relatively new field of science that offers a wide range of potential uses. For instance, medical-grade BCIs are often used to help individuals with damaged cognitive or sensory-motor functions, while more affordable BCIs are being designed to address various neurotherapy applications.

Though accessible technologies like OpenBCI are focused more on education than world domination, there is no telling what the future holds!