Tag Archives: brain-computer interface

Wearable cap lets amputees grasp objects with their mind

Researchers at the University of Houston have built a brain-machine interface to control prosthetic hands.

Brain-computer interfaces have come a long way since the earliest days of research at UCLA in the 1970s. That work, funded by a grant from the National Science Foundation and followed by a contract from DARPA, produced the papers in which the expression BCI first appeared in scientific literature. Fast forward nearly 40 years, and scientists are opening up a wide range of possibilities, including enabling amputees to command robotic limbs with their minds.


That’s exactly what one team from the University of Houston has done. The researchers developed an algorithm that allowed a man to grasp a bottle and other objects with a prosthetic hand, powered merely by his thoughts. Instead of implants, this non-invasive method uses a wearable EEG cap that monitors brain activity externally through the scalp. During its demonstration, a 56-year-old man whose right hand had been amputated successfully clutched selected items 80% of the time, including a water bottle, a small coin, a credit card and even a screwdriver.

While the ability to command prosthetics through brainwaves is not new, earlier studies centered on either surgically implanted electrodes or myoelectric control, which relies upon electrical signals from muscles in the arm. Beyond demonstrating that prosthetic control is possible using non-invasive EEG, the researchers said the study offers a new understanding of the neuroscience of grasping and will be applicable to rehabilitation for other types of injuries, including stroke and spinal cord injury.

Reportedly, the work marked the first time an amputee has been able to use EEG-based BMI control of a multi-fingered prosthetic hand to grab objects, and could potentially lead to the development of improved artificial limbs. Interested in learning more? The team’s findings were recently published in the journal Frontiers in Neuroscience. You can also check out the study’s official page here.

Playing the game of Labyrinth using your brain

One group’s project is bringing a much more literal meaning to the term ‘mind game.’

First launched in 1946, Labyrinth is a skill game consisting of a box with a maze, holes, and a steel marble. The object of the game is to try to tilt the playfield to guide the marble to the end of the maze, without letting the ball fall into any of the holes. While versions of the game featured a suspended maze surface that rotates on two axes using a knob, other handheld versions have included an entirely closed transparent cover on top. However, none have ever been controlled by the human mind. That was, at least, until now.


As part of Autodesk’s neuroscience-themed hackathon event, BrainiHack 2015, a team of Makers going by the name Blue GSD — Daniel Harari, Gal Weinstock, and Maxim Altshul — created their own iteration of the classic game, all powered through brainwaves. The contraption was entirely 3D-printed and based on the OpenBCI open-source platform (ATmega328P).

To start, the game’s movement was enabled through a pair of micro servo motors, each controlled with an Arduino Uno (ATmega328). Meanwhile, the mechanism consisted of three nested frames anchored in various places to achieve two degrees of freedom: roll and pitch. Given the limited time to complete the project, the motors and motor arms were attached to the frame using zip ties, while some nuts and screws kept the frames in place.


For those who may not know, OpenBCI offers a GUI that lets users visualize and analyze data quickly and easily. The interface provides time-domain and frequency-domain (FFT) graphs, as well as a map of the head showing electrode activity. OpenBCI allowed the team to attach electrodes wherever they wanted and to experiment with various methods and brain waves.

“Once the data is captured with OpenBCI, it is transferred to the computer for analysis, the computer runs a Processing program that computes the Fourier Transform of the signal over a defined interval of time, filters the spectrum to look at relevant frequencies and finds the most powerful frequency in the range,” the team writes. “If the peaked frequency is the one we are looking for, a command is sent to an Arduino board via serial port. The Arduino then controls the servos according to the command received.”
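The team’s analysis ran as a Processing program, but the peak-frequency search it describes can be sketched in a few lines of Python. Everything below is illustrative rather than the team’s actual code: the function name, the 250 Hz sampling rate, and the pure-Python DFT are assumptions.

```python
import math

def band_peak_frequency(samples, fs, f_lo, f_hi):
    """Return the frequency (Hz) with the largest DFT magnitude in [f_lo, f_hi].

    samples: list of EEG voltage samples over one analysis window
    fs: sampling rate in Hz
    """
    n = len(samples)
    best_f, best_mag = None, -1.0
    # Examine only the DFT bins that fall inside the band of interest.
    k_lo = max(1, int(math.ceil(f_lo * n / fs)))
    k_hi = min(n // 2, int(f_hi * n / fs))
    for k in range(k_lo, k_hi + 1):
        re = sum(x * math.cos(-2 * math.pi * k * i / n) for i, x in enumerate(samples))
        im = sum(x * math.sin(-2 * math.pi * k * i / n) for i, x in enumerate(samples))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_f, best_mag = k * fs / n, mag
    return best_f

# A 10 Hz test tone sampled at 250 Hz (a common EEG rate) should peak near 10 Hz.
fs = 250
window = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]  # 1-second window
print(band_peak_frequency(window, fs, 5, 30))  # → 10.0
```

In the real pipeline, the result of this search is what decides whether a command gets sent down the serial port to the Arduino.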


However, the problem with brain-reading technology is that it can be on the slow side. Given the real-time nature of the Labyrinth game, any sort of delay can cause the ball to fall through a hole. As a result, the team decided to simplify the game into a basic maze with two different signals to study — the left-right position toggle was controlled via Alpha waves, while up-down positioning was driven by SSVEP (steady-state visually evoked potentials). By combining Alpha and SSVEP, the team had two types of waves capable of control and anticipation, which allowed a single person to control the game.
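A minimal Python sketch of that two-signal scheme might look like this. The toggle behavior, the servo angles, and the class name are all assumptions for illustration, not the team’s code.

```python
class MazeController:
    """Toggle-style control: an alpha burst flips the roll direction,
    an SSVEP detection flips the pitch direction. Angles are servo degrees."""

    def __init__(self, tilt=10):
        self.tilt = tilt
        self.roll_dir = 1    # +1 tilts right, -1 tilts left
        self.pitch_dir = 1   # +1 tilts forward, -1 tilts back

    def update(self, alpha_detected, ssvep_detected):
        if alpha_detected:
            self.roll_dir *= -1      # eyes closed: toggle left/right
        if ssvep_detected:
            self.pitch_dir *= -1     # stared at flashing target: toggle up/down
        # Servo angles around a 90-degree neutral position
        return 90 + self.roll_dir * self.tilt, 90 + self.pitch_dir * self.tilt

ctrl = MazeController()
print(ctrl.update(alpha_detected=True, ssvep_detected=False))  # → (80, 100)
print(ctrl.update(alpha_detected=False, ssvep_detected=True))  # → (80, 80)
```

The returned angle pair would then be written out to the two micro servos driving the nested frames.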

As it turns out, the team, which admits to having absolutely no background in neuroscience, ended up winning the OpenBCI prize for the best project in the open-source category. Those interested can head over to its official page to read more. Meanwhile, the project’s files are available on Thingiverse so that Makers can download and create their own Labyrinth game.

Thinking about the future with brain computer interfaces

What’s cooler than controlling the world around you with your mind? Nothing! According to OpenBCI’s Conor Russomanno, this dream is now coming closer to reality with the help of Makers.


As previously discussed on Bits & Pieces, brain-computer interfaces have made great progress as of late, thanks in part to companies like OpenBCI, whose co-founder recently shared his thoughts on the surging BCI movement with MAKE Magazine.

“Though BCI is in an embryonic state — with a definition that evolves by the day — it’s typically a system that enables direct communication between a brain and a computer, and one that will inevitably have a major impact on the future of humanity,” Russomanno writes.

The Maker notes that devices from Emotiv, NeuroSky, and Not Impossible Labs are innovative, yet he still has a strong desire to further utilize “Brain-Computer Interface technology to create a comprehensive communication system for patients with ALS and other neurodegenerative disorders, which inhibit motor function and the ability to speak.”


BCIs encompass a wide range of technologies that vary in terms of invasiveness, ease of use, functionality, cost, and real-world practicality. They include fMRI, cochlear implants, and EEG, Russomanno explains.

He expresses measured excitement about the future of BCI, saying, “Each day it gets easier to leverage technology to expand the capabilities of that squishy thing inside our heads. Real-world BCI will be vital in reverse-engineering and further understanding the human brain.”

The OpenBCI co-founder was first introduced to mind-controlled technology just two and a half years ago and is astounded by the growth of the community in that time span. He cites one catalyst for the movement’s prosperity: Makers! He believes, “While these devices have opened up BCI to innovators, there’s still a huge void waiting to be filled by those of us who like to explore the inner workings of our gadgets.”


Russomanno describes his and his partner Joel Murphy’s creation, OpenBCI, as “a powerful, customizable tool that would enable innovators with varied backgrounds and skill levels to collaborate on the countless subchallenges of interfacing the brain and body.” The platform is based upon an Arduino shield prototype and sports an Atmel ATmega328 chip onboard. The design has even evolved to include the world’s first 3D-printed electroencephalography (EEG) headset.


“In the next 5 to 10 years we will see more widespread use of BCIs, from thought-controlled keyboards and mice to wheelchairs to new-age, immersive video games that respond to biosignals,” the Maker predicts. While some products similar to these have already hit the market, he reveals, “They’re not ready; we still need makers, who’ll hack and build and experiment, to use them to change the world.”

Right on, Conor! The Maker community is always up for a good challenge.


‘Telepathic’ communication achieved for first time

Alright, so maybe it’s not entirely “mental telepathy,” but an international group of researchers is reporting that they have successfully achieved brain-to-brain communication. The scientists, from the United States, France and Spain, leveraged several technologies, including computers and the Internet, to relay information between test subjects separated by approximately 5,000 miles without carrying out any invasive procedures on the subjects.


Words such as “hola” and “ciao” were telepathically transmitted from a location in India to a location in France using an Internet-connected electroencephalogram (EEG) and robot-assisted and image-guided transcranial magnetic stimulation (TMS) technologies. When one study participant merely thought of a greeting, the recipient thousands of miles away was aware of the thought occurring, according to the report published in PLOS One.

“We wanted to find out if one could communicate directly between two people by reading out the brain activity from one person and injecting brain activity into the second person, and do so across great physical distances by leveraging existing communication pathways,” revealed Alvaro Pascual-Leone, a Harvard Medical School neurology professor.


Generally speaking, previous studies on EEG-based brain-computer interaction (BCI) have used communication between a human brain and computer. In these studies, electrodes attached to a person’s scalp record electrical currents in the brain as a person realizes an action-thought, such as consciously thinking about moving the arm or leg. The computer then interprets that signal and translates it to a control output, such as a robot or wheelchair, the study explains.

However, in this new study, the research team, composed of Pascual-Leone, Giulio Ruffini and Carles Grau of Starlab Barcelona, Spain, and Michel Berg, leading a group from Axilum Robotics in Strasbourg, France, added a second human brain on the other end of the system. Four healthy participants, aged 28 to 50, took part in the study.

“One of the four subjects was assigned to the brain-computer interface (BCI) branch and was the sender of the words; the other three were assigned to the computer-brain interface (CBI) branch of the experiments and received the messages and had to understand them.”

By utilizing an electroencephalogram connected to the Internet and transcranial magnetic stimulation, in which electromagnetic induction is used to externally stimulate a brain, the researchers proved that it was possible to communicate information from one human brain to another. In order to facilitate this, a computer translated simplistic words into digital binary code, presented by a series of 1s or 0s. Then, the message was emailed from India to France, and delivered via robot to the receiver, who through non-invasive brain stimulation could see flashes of light in their peripheral vision. The subjects receiving the message did not hear or see the words themselves, but were correctly able to report the flashes of light that corresponded to the message.
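As a rough illustration of that encoding step, here is one simple way to turn a short word into a stream of 1s and 0s, five bits per letter. This is only a sketch: the study used its own coding scheme, and the function names here are hypothetical.

```python
def encode_word(word):
    """Encode a lowercase word as a string of 0s and 1s, 5 bits per letter
    (a=00000 ... z=11001)."""
    bits = ""
    for ch in word.lower():
        idx = ord(ch) - ord("a")      # letter position in the alphabet
        bits += format(idx, "05b")    # fixed-width 5-bit binary
    return bits

def decode_bits(bits):
    """Reverse the encoding: split into 5-bit groups and map back to letters."""
    groups = [bits[i:i + 5] for i in range(0, len(bits), 5)]
    return "".join(chr(int(g, 2) + ord("a")) for g in groups)

msg = encode_word("hola")
print(msg)               # → 00111011100101100000
print(decode_bits(msg))  # → hola
```

In the experiment, each bit of such a stream was delivered to the receiver as the presence or absence of a light flash induced by TMS, and the receiver’s reports were reassembled into the word.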


This experiment suggests the possibility of supplementing or bypassing the traditional methods of language-based or motor-based communication, and could have a number of applications. “We hope that in the longer term this could radically change the way we communicate with each other,” the researchers concluded.

Over the past several years, there have been a number of studies surrounding brain-controlled activity, in which researchers have used these signals to control everything from drones to prosthetics. As we’ve explored on Bits & Pieces, Atmel chips (like the ATmega328) have been at the heart of several brain-computer interfacing (BCI) innovations. With the emergence of new technologies and a passionate Maker crowd, we can only imagine the endless possibilities the future holds.

Take over the world with this $500 mind-controlled robot

Have you ever thought of controlling your own legion of robots with nothing but your mind? Chip Audette has made that fantasy a reality.


Using OpenBCI, a low-cost programmable open-source EEG platform that gives Makers easy access to their brainwaves, Audette has been able to use just his mind to control a Hexbug Spider.

When he closes his eyes, the robot moves forward; when he focuses on specific flashing images, the robot turns left or right. Generally, there are two images on a computer screen, each flashing at a different frequency. As the Maker stares at one image, the brainwave reader can assess how quickly the image is flashing and therefore determine which direction to turn.


As with many prototypical designs, there are some glitches, but the fact that Audette has created any sort of functionality for this low cost is impressive. The Maker used OpenBCI’s EEG electrodes and custom brain-signal-processing board, all connected to an Arduino Uno (ATmega328), which serves as the interface between the Hexbug and his computer.

“The PC processes the EEG data looking for the Alpha waves or the visually-entrained waves. If any are detected, it decides what commands to give the robot. The commands are conveyed back to the Arduino, which then drives the remote control, which the Hexbug receives over its usual IR link,” Audette noted in his blog.
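The decision step Audette describes, mapping the dominant EEG frequency to a robot command, might look something like the following Python sketch. The flicker rates, thresholds, and command names are all made up for illustration; the real values depend on the electrode placement and the Hexbug’s remote protocol.

```python
# All values here are hypothetical; real flicker rates and thresholds would be
# tuned to the electrode setup and the stimulus display.
ALPHA_LO, ALPHA_HI = 8.0, 12.0   # eyes-closed alpha band (Hz)
LEFT_FLASH_HZ = 7.0              # flicker rate of the "turn left" image
RIGHT_FLASH_HZ = 13.5            # flicker rate of the "turn right" image
TOLERANCE = 0.5                  # Hz window around each flicker rate

def choose_command(peak_hz, peak_power, power_threshold=50.0):
    """Map the strongest EEG frequency to a robot command string."""
    if peak_power < power_threshold:
        return "STOP"                                # no clear signal
    if abs(peak_hz - LEFT_FLASH_HZ) <= TOLERANCE:
        return "LEFT"                                # entrained to left image
    if abs(peak_hz - RIGHT_FLASH_HZ) <= TOLERANCE:
        return "RIGHT"                               # entrained to right image
    if ALPHA_LO <= peak_hz <= ALPHA_HI:
        return "FORWARD"                             # eyes closed, strong alpha
    return "STOP"

print(choose_command(10.0, 80.0))  # → FORWARD
print(choose_command(13.3, 80.0))  # → RIGHT
```

In Audette’s setup, the chosen command would then be written to the serial port for the Arduino to relay to the Hexbug over its IR link.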


Though the current system is limited by the simplicity of its technology, the Maker says, “Ideally, I’d just think the word ‘Fire!’ and the robot would respond. Unfortunately, those kinds of brain waves are too hard to detect.”

As Wired’s Robert McMillan writes, scientific-grade electroencephalography (EEG) monitors can cost thousands of dollars, but thanks to devices such as the Emotiv, there’s been a mini-boom in low-cost brain-hacking gear. OpenBCI wants to be the open-source player in this space. Its kit comes with its own mini-computer and sensors that jack into a black helmet-like device, called “Spider Claw 3000,” that you make on a 3D printer.

“What we really want to do is just provide the hardware to let people begin learning,” explains Conor Russomanno, one of OpenBCI’s creators.

Brain-computer interfacing remains a relatively new field of science that offers a wide range of potential uses. For instance, medical grade BCIs are often used to help individuals with damaged cognitive or sensory-motor functions, while more affordable BCIs are being designed to address various neurotherapy applications.

Though these accessible technologies like OpenBCI are more focused upon education, rather than world domination, there is no telling what the future holds!

OpenBCI is a brain-computer interface for Makers

OpenBCI – designed by Joel Murphy & Conor Russomanno – is a low-cost programmable open-source EEG platform that gives Makers easy access to their brainwaves.

“Our vision is to realize the potential of the open-source movement to accelerate innovation in brain science through collaborative hardware and software development,” the duo wrote in a recent Kickstarter post.

“Behind the many lines of code and circuit diagrams, OpenBCI has a growing community of scientists, engineers, designers, makers, and a whole bunch of other people who are interested in furthering our understanding of the brain.”

Brain-computer interfacing (BCI) is a relatively new field of science that offers a wide range of potential applications. For example, medical grade BCIs are often used to help individuals with damaged cognitive or sensory-motor functions. In addition, more affordable BCIs are being designed to address various neurotherapy applications.

“Both neurofeedback and biofeedback are starting to be used more frequently by artists, musicians, dancers, and other creative individuals who want to find new ways of connecting people with the world around them, making more immersive experiences,” the two explained. “There’s great potential for research in psychology and behavior studies with portable EEG devices that can record brain activity in real-world environments.”

In addition to an ADS1299 IC, the OpenBCI is equipped with Atmel’s ATmega328 (+ Arduino’s latest bootloader). Murphy and Russomanno have thoughtfully broken out all the Arduino pins, allowing Makers to blink lights or drive motors. In addition, Version 3 of the OpenBCI board uses Bluetooth Low Energy (BTLE) for data transmission and programming of the ATmega controller.
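The ADS1299 delivers each channel sample as a 24-bit two’s-complement value, so host-side software has to sign-extend the three raw bytes before scaling them to microvolts. The Python sketch below shows that conversion; the function names are ours, and the gain of 24 with a 4.5 V reference reflects the chip’s common configuration rather than anything specific to a particular board revision.

```python
def ads1299_to_counts(b):
    """Sign-extend a 3-byte big-endian two's-complement sample to a Python int."""
    raw = (b[0] << 16) | (b[1] << 8) | b[2]
    if raw & 0x800000:          # sign bit set, so the value is negative
        raw -= 1 << 24
    return raw

def counts_to_microvolts(counts, gain=24, vref=4.5):
    """Scale raw ADC counts to microvolts (vref in volts)."""
    volts_per_count = vref / gain / ((1 << 23) - 1)
    return counts * volts_per_count * 1e6

print(ads1299_to_counts([0x00, 0x00, 0x01]))  # → 1
print(ads1299_to_counts([0xFF, 0xFF, 0xFF]))  # → -1
```

Conversions like these are what the Processing, Python, and other host-side examples perform before handing samples to the visualization or analysis code.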

On the software side, OpenBCI includes code examples written in Arduino, Processing, Python and openFrameworks.

“We have no intention of reinventing the wheel, so we are actively working to make the hardware data accessible to all commonly used open-source EEG signal processing applications, such as BrainBay, OpenVibe and more,” Murphy and Russomanno added. “Because you have direct access to the data on the hardware side, making it portable to any existing EEG software is as easy as structuring the way the data is formatted and related.”

Interested in learning more about OpenBCI? You can check out the project’s official Kickstarter page here.