Tag Archives: BCI

This lower-limb exoskeleton is controlled by staring at flickering LEDs

Scientists have developed a brain-computer interface for controlling a lower limb exoskeleton.

As recent experiments have shown, exoskeletons hold great promise in helping those who have lost the use of their legs to walk again. However, for those who are quadriplegic, have been diagnosed with a motor neuron disease or have suffered a spinal cord injury, hand control is not an option. To overcome this barrier, researchers at Korea University and TU Berlin have developed a brain-computer interface that can command a lower limb exoskeleton by decoding specific signals from within the user’s mind.


This is achieved by wearing an electroencephalogram (EEG) cap, which enables the user to move forward, turn left and right, sit and stand simply by staring at one of five flickering LEDs, each representing a different action. Each of the lights flickers at a different frequency, and when the user focuses their attention on a specific LED, that frequency is reflected in the EEG readout. This signal is then identified and used to control the exoskeleton.

The exoskeleton control system consists of a few parts: the exoskeleton itself, an ATmega128 MCU-powered visual stimulus generator and a signal processing unit. As the team notes, a PC receives EEG data from the wireless EEG interface, analyzes the frequency information, and provides the instructions to the robotic exoskeleton.
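To make the decoding step concrete, here is a minimal Python sketch of SSVEP frequency detection. The five flicker frequencies and their command mapping below are hypothetical (the paper's actual values may differ); each candidate frequency's power is measured with a single-bin DFT and the strongest one wins:

```python
import math

def band_power(samples, fs, freq):
    """Correlate the signal with a sinusoid at `freq` Hz (a single-bin DFT)."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(samples))
    return (re * re + im * im) / n

# Hypothetical flicker frequencies (Hz) mapped to exoskeleton commands.
COMMANDS = {9.0: "forward", 11.0: "left", 13.0: "right", 15.0: "sit", 17.0: "stand"}

def decode_command(samples, fs):
    """Pick the flicker frequency with the most power in the EEG window."""
    best = max(COMMANDS, key=lambda f: band_power(samples, fs, f))
    return COMMANDS[best]

# Simulated EEG: a clean 13 Hz oscillation (user staring at the "right" LED).
fs = 256
window = [math.sin(2 * math.pi * 13.0 * i / fs) for i in range(fs * 2)]
print(decode_command(window, fs))  # -> right
```

A real system would first band-pass filter and window the occipital-channel EEG; this sketch only shows the frequency-voting idea.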


This method is suitable even for those with no capacity for voluntary body control apart from eye movements, who otherwise would not be able to operate a standard exoskeleton. The researchers believe that their system offers a much better signal-to-noise ratio by separating the brain control signals from the surrounding noise of ordinary brain activity, allowing for more accurate exoskeleton operation.

“Exoskeletons create lots of electrical ‘noise,’” explains Professor Klaus Muller, an author on the paper that has been published in the Journal of Neural Engineering. “The EEG signal gets buried under all this noise — but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal.”


The control system could serve as a technically simple and feasible add-on to other devices, with EEG caps and hardware now emerging on the consumer market. According to the researchers, it only took volunteers a few minutes to get the hang of using the exoskeleton. Because of the flickering LEDs, participants were carefully screened and those suffering from epilepsy were excluded from the study. The team is now working to reduce the ‘visual fatigue’ associated with long-term use.

“We were driven to assist disabled people, and our study shows that this brain control interface can easily and intuitively control an exoskeleton system — despite the highly challenging artefacts from the exoskeleton itself,” Muller concludes.

Those wishing to learn more can read the entire paper here, or watch the brain-controlled exoskeleton in action below.

[Images: Korea University / TU Berlin]

Playing the game of Labyrinth using your brain

One group’s project is bringing a much more literal meaning to the term ‘mind game.’

First launched in 1946, Labyrinth is a skill game consisting of a box with a maze, holes, and a steel marble. The object of the game is to tilt the playfield to guide the marble to the end of the maze without letting the ball fall into any of the holes. While some versions of the game feature a suspended maze surface that rotates on two axes using knobs, other handheld versions enclose the maze beneath a transparent cover. However, none have ever been controlled by the human mind. That was, at least, until now.


As part of Autodesk’s neuroscience-themed hackathon event, BrainiHack 2015, a team of Makers going by the name Blue GSD — Daniel Harari, Gal Weinstock, and Maxim Altshul — created their own iteration of the classic game, all powered through brainwaves. The contraption was entirely 3D-printed and based on the OpenBCI open-source platform (ATmega328P).

To start, the game’s movement was enabled through a pair of micro servo motors, each controlled by an Arduino Uno (ATmega328). Meanwhile, the mechanism consisted of three nested frames, anchored in various places to achieve two degrees of freedom – roll and pitch. Given the limited amount of time to complete the project, the motors and motor arms were all attached to the frame using zip ties, while some nuts and screws were employed to keep the frames in place.


For those who may not know, OpenBCI offers a GUI that lets users visualize and analyze data more easily and efficiently. The interface provides time-domain and frequency-domain (FFT) graphs, as well as a map of the head showing electrode activity. OpenBCI allowed the team to attach electrodes wherever they wanted and carry out experiments with various methods and brain waves.

“Once the data is captured with OpenBCI, it is transferred to the computer for analysis, the computer runs a Processing program that computes the Fourier Transform of the signal over a defined interval of time, filters the spectrum to look at relevant frequencies and finds the most powerful frequency in the range,” the team writes. “If the peaked frequency is the one we are looking for, a command is sent to an Arduino board via serial port. The Arduino then controls the servos according to the command received.”
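That loop can be approximated in a few lines of Python rather than Processing. This is an illustrative mock-up, not the team's code: a naive DFT stands in for the FFT, and the serial write to the Arduino is replaced by a returned command byte (the target frequency and the 'T' byte are invented for the example):

```python
import cmath
import math

def fft_peak_in_band(samples, fs, lo, hi):
    """Naive DFT; return the frequency (Hz) with the most power in [lo, hi]."""
    n = len(samples)
    best_f, best_p = None, -1.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            x = sum(s * cmath.exp(-2j * cmath.pi * k * i / n) for i, s in enumerate(samples))
            p = abs(x) ** 2
            if p > best_p:
                best_f, best_p = f, p
    return best_f

def command_for(peak_hz, target_hz, tolerance=0.5):
    """If the dominant frequency is the one we are looking for, emit a one-byte
    command (returned here; the team wrote it to the Arduino's serial port)."""
    if peak_hz is not None and abs(peak_hz - target_hz) <= tolerance:
        return b"T"  # hypothetical 'tilt' command byte
    return None

# Demo: a pure 10 Hz component should be picked out of the 5-30 Hz band.
window = [math.sin(2 * math.pi * 10.0 * i / 128) for i in range(256)]
print(fft_peak_in_band(window, 128, 5, 30))  # -> 10.0
```

In practice an FFT (rather than this O(n²) DFT) and a real serial library would be used on the PC side, with the Arduino interpreting each received byte as a servo command.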


However, the problem with brain-reading technology is that it can be on the slow side. Given the real-time nature of the Labyrinth game, any sort of delay can cause a lapse in judgement and send the ball through a hole. As a result, the team decided to simplify the game into a basic maze with two different signals to study — the left-right position toggle was controlled via Alpha waves, while up-down positioning was driven by SSVEP. By combining Alpha and SSVEP, the team had two types of waves capable of control and anticipation, which allowed the game to be controlled by just one person.
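The resulting division of labor can be sketched as a tiny state machine. In the Python sketch below, the alpha and SSVEP detectors are assumed to deliver one boolean each per update; the toggle-versus-drive behavior illustrates the scheme rather than reproducing the team's implementation:

```python
class MazeController:
    """Two-signal control: an alpha-wave event toggles the left/right tilt,
    while SSVEP detection drives the up/down tilt. Axis names and the
    toggle convention here are illustrative, not the team's actual choices."""

    def __init__(self):
        self.horizontal = "left"
        self.vertical = "down"

    def update(self, alpha_detected, ssvep_detected):
        if alpha_detected:  # e.g., an eyes-closed alpha burst flips the tilt
            self.horizontal = "right" if self.horizontal == "left" else "left"
        self.vertical = "up" if ssvep_detected else "down"
        return (self.horizontal, self.vertical)

ctrl = MazeController()
print(ctrl.update(alpha_detected=True, ssvep_detected=False))  # -> ('right', 'down')
print(ctrl.update(alpha_detected=False, ssvep_detected=True))  # -> ('right', 'up')
```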

As it turns out, the team, which admits to having absolutely no background in neuroscience, ended up winning the OpenBCI prize for the best project in the open-source category. Those interested can head over to its official page to read more. Meanwhile, the project’s files are available on Thingiverse so that Makers can download and create their own Labyrinth game.

Video: Using your brain and visual stimuli to play music

This biotronic art installation creates a unique musical experience based on thoughts and emotion.

What if moving your eyes from left to right or up and down could trigger lights, play music and control other devices? That’s what digital artist Fèlix Vinyals has set out to accomplish with his latest project, entitled Torval. Well, sort of, at least. In collaboration with EEG and BCI researcher Oscar Portolés, the Maker has designed a hybrid brain-computer machine interface (BCMI) installation that allows him to create music and control the lighting while on stage, all through the reading of his brain’s electrical potential and visual stimuli.


The project combines two independent BCI systems. The first makes use of the steady state visual evoked potential (SSVEP) technique to enable the musician to switch a set of music tracks from a MIDI sampler on and off. Meanwhile, the other determines the musician’s index of relaxation, read through the alpha rhythm, to alter the illumination of the installation. The communication between the BCMI, the MIDI sampler and the set of floodlights (via DMX protocol) is handled by an Atmel-based Arduino.
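The second system's alpha-to-lighting mapping is simple enough to sketch. The Python below is a guess at how a relaxation index could be translated into the red-to-blue shade the installation projects; the calibration bounds and the (R, G, B) channel layout are assumptions:

```python
def relaxation_to_dmx(alpha_power, lo, hi):
    """Map alpha-band power onto a red-to-blue shade as (R, G, B) DMX channel
    values (0-255): low alpha (engaged) -> red, high alpha (calm) -> blue."""
    x = max(0.0, min(1.0, (alpha_power - lo) / (hi - lo)))  # normalise to [0, 1]
    return (round(255 * (1 - x)), 0, round(255 * x))

# lo/hi would be calibrated per performer; 0-10 is an arbitrary example range.
print(relaxation_to_dmx(0.0, 0.0, 10.0))   # -> (255, 0, 0)  fully engaged: red
print(relaxation_to_dmx(10.0, 0.0, 10.0))  # -> (0, 0, 255)  fully relaxed: blue
```

The Arduino would then stream these three channel values out over the DMX line each frame.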


Beyond that, Torval is comprised of six main modules: the visual stimuli tool, the EEG signal acquisition unit, the signal processing algorithms for both BCI systems, the output control box (Bebop), the music sampler, and the illumination system.

“On one hand, the visual stimuli tool elicits a SSVEP in the user visual cortex when he gazes at one of the six flickering stimuli. Then, the signal-processing algorithm searches the EEG data in real time for a SSVEP. When a SSVEP is found at a frequency coincident with one of the flickering stimulus units, the outputs control box will send a MIDI command to switch on or off the musical loop associated with the particular flickering stimulus unit,” Vinyals explains.


“On the other hand, a signal-processing algorithm constantly monitors the level of relaxation of the artist – the power within the alpha rhythm of the occipital cortex. Continuously and smoothly, Bebop modifies the illumination of the stage through DMX protocol in correlation with the relaxation of the user; a shade from the color spectrum that ranges from red to blue is projected onto the stage. Therefore, the user can actively control the color of the stage. Yet, as he fully engages in the performance, he loses his ability to self-control his level of relaxation and mental load; turning the stage illumination into a genuine portrait of both physiological states.”

What’s unique about this project is that it relies only upon imagination and emotion, enabling the wearer to create a unique, irreproducible musical experience. As the video alludes to, there are eyes that speak and there are other eyes that can perform… Trust us, you’ll want to see this!

Puzzlebox Orbit takes flight on All-American Makers

It was the most mind-blowing (or mind-controlling) episode yet!

Given the ubiquity of DIY culture today, it’s no surprise that the Maker Movement has hit primetime with the debut of Science Channel’s new series All-American Makers. In case you haven’t had the chance to tune in yet, the premise of the show is to give innovators and entrepreneurs an opportunity to pitch their ideas in return for funding and help being brought to market.


The ABC Shark Tank-like show for the engineering and Maker savvy, which stars Printrbot founder Brook Drumm, roboticist Brian Roe and venture capitalist Marc Portney, airs on Wednesday nights at 10 p.m. ET. Most recently, the panelists were presented with the Puzzlebox Orbit — a mind-controlled, ATmega328 powered helicopter — by Joshua Macias and Steven Castellotti, both of whom you may have seen in our Maker Faire Bay Area booth in previous years. If you recall, the Bay Area-based company also took to Kickstarter way back in 2012, where it successfully garnered over $74,000.

While you may see neighborhood kids, or even adults for that matter, playing with remote-controlled helicopters in their yards today, the remote control that operates that toy may soon take a back seat — thanks to Puzzlebox. The startup recently created a toy ‘copter they call Orbit that is capable of being controlled through brainwaves via an electroencephalography (EEG) headset that reads electrical activity along the scalp and communicates to the device over Bluetooth. The company’s software then extracts and visualizes those brainwaves in real-time, issuing command signals to the Orbit via an infrared adapter.


“Just weeks into its freshman run on the Science Channel, the network’s new series All-American Makers has highlighted some pretty fun tech, but Wednesday’s episode might have some of the coolest yet,” Mashable’s Sandra Gonzalez writes. And, having been able to play around with Orbit, we must agree. It’s mind-blowing!

If you missed the show’s latest episode, you can check out the team’s pitch here. Meanwhile, you can also read a recent blog post from Castellotti on Bits & Pieces. 

Control magnetic liquid with your mind

While we have featured some brilliant brain-controlled projects before, none may have been as mystifying as the Solaris endeavor from ::vtol::. Gaining inspiration from the swirling scenery of the 2002 film of the same name, this team has developed a system that allows the user to control magnetic liquids with merely their minds.


With the help of an Emotiv Epoc, the team enables a user to control a motorized magnet below a pool of gleaming green liquid. Within the liquid lies a magnetic substance that is purely governed by thought. Just don’t drink that green goo, or you might wake up as a Ninja Turtle!

Once the Epoc reads the wearer’s thoughts, a computer deciphers the focus and strength of the brainwaves, and an Arduino Uno (ATmega328) then translates these levels into movement of the mobile magnet.
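That last hop, from a decoded focus level to a magnet position, can be illustrated in a few lines of Python. The smoothing constant and travel range below are assumptions, not ::vtol::'s values; raw EEG-derived focus readings are noisy, so some smoothing keeps the magnet from jittering:

```python
def smooth_focus(levels, alpha=0.3):
    """Exponentially smooth a stream of 0-100 focus readings
    (alpha is an illustrative smoothing constant)."""
    out, s = [], float(levels[0])
    for x in levels:
        s = alpha * x + (1 - alpha) * s
        out.append(s)
    return out

def focus_to_position(level, travel_mm=200):
    """Scale a 0-100 focus level onto the magnet's linear travel (clamped)."""
    return travel_mm * max(0, min(100, level)) / 100

print(focus_to_position(50))  # -> 100.0
```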

The team surveyed various social groups and generations to observe who would maintain the strongest control over the viscous liquid. People who spent plenty of time with the object managed to influence the dynamics and direction of the liquid at an unconscious level, the Makers noted.

The unity between mind and control that the team discovered was intriguing. ::vtol:: writes that the whole experiment “visualizes the temperament of the person. The object copies your mental organization and echoes it on the liquid’s surface. The object becomes a part of the participant.”

Check out the video below to gain some insight into the minds of the team behind the Solaris. Still can’t get enough? Head over to the official Solaris blog to find out more about the inspiration behind this pool of green goo!

‘Telepathic’ communication achieved for first time

Alright, so maybe it’s not entirely “mental telepathy,” but an international group of researchers is reporting that they have successfully achieved brain-to-brain communication. The scientists, from the United States, France and Spain, leveraged several technologies, including computers and the Internet, to relay information between test subjects separated by approximately 5,000 miles, without carrying out any invasive procedures on the subjects.


Words such as “hola” and “ciao” were telepathically transmitted from a location in India to a location in France using an Internet-connected electroencephalogram (EEG) and robot-assisted and image-guided transcranial magnetic stimulation (TMS) technologies. When one study participant merely thought of a greeting, the recipient thousands of miles away was aware of the thought occurring, according to the report published in PLOS One.

“We wanted to find out if one could communicate directly between two people by reading out the brain activity from one person and injecting brain activity into the second person, and do so across great physical distances by leveraging existing communication pathways,” revealed Alvaro Pascual-Leone, a Harvard Medical School neurology professor.


Generally speaking, previous studies on EEG-based brain-computer interaction (BCI) have used communication between a human brain and computer. In these studies, electrodes attached to a person’s scalp record electrical currents in the brain as a person realizes an action-thought, such as consciously thinking about moving the arm or leg. The computer then interprets that signal and translates it to a control output, such as a robot or wheelchair, the study explains.

However, in this new study, the research team, comprising Pascual-Leone, Giulio Ruffini and Carles Grau of Starlab Barcelona, Spain, and Michel Berg, leading a group from Axilum Robotics in Strasbourg, France, added a second human brain on the other end of the system. Four healthy participants, aged 28 to 50, took part in the study.

“One of the four subjects was assigned to the brain-computer interface (BCI) branch and was the sender of the words; the other three were assigned to the computer-brain interface (CBI) branch of the experiments and received the messages and had to understand them.”

By utilizing an electroencephalogram connected to the Internet and transcranial magnetic stimulation, in which electromagnetic induction is used to externally stimulate a brain, the researchers proved that it was possible to communicate information from one human brain to another. To facilitate this, a computer translated simple words into binary code, represented as a series of 1s and 0s. The message was then emailed from India to France and delivered via robot to the receiver, who, through non-invasive brain stimulation, perceived flashes of light in their peripheral vision. The subjects receiving the message did not hear or see the words themselves, but were able to correctly report the flashes of light that corresponded to the message.
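A toy version of that encoding round trip is easy to write down. The Python below uses plain 8-bit ASCII per letter purely for illustration; the actual study used its own coding scheme, and each bit took real stimulation time to deliver:

```python
def word_to_bits(word):
    """Sender side: turn a word into a flat list of bits (8-bit ASCII)."""
    return [int(b) for c in word for b in format(ord(c), "08b")]

def bits_to_flashes(bits):
    """Each bit is delivered to the receiver as a phosphene flash or its absence."""
    return ["flash" if b else "no flash" for b in bits]

def flashes_to_word(flashes):
    """Receiver side: report the flashes and reassemble the word."""
    bits = "".join("1" if f == "flash" else "0" for f in flashes)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

sent = bits_to_flashes(word_to_bits("hola"))
print(flashes_to_word(sent))  # -> hola
```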



This experiment suggests the possibility of supplementing or bypassing the traditional methods of language-based or motor-based communication, and could have a number of applications. “We hope that in the longer term this could radically change the way we communicate with each other,” the researchers concluded.

Over the past several years, there have been a number of studies surrounding brain-controlled activity, with researchers using these signals to control everything from drones to prosthetics. As we’ve explored on Bits & Pieces, Atmel chips (like the ATmega328) have been at the heart of several brain-computer interfacing (BCI) innovations. With the emergence of new technologies and a passionate Maker crowd, we can only imagine the endless possibilities the future holds.

Open Sauce

By Steve Castellotti

CTO, Puzzlebox

North Beach, San Francisco’s Italian neighborhood, is famous for the quality and wide variety of its many restaurants. From colorful marquees scattered up and down Columbus to the hushed, more dimly lit grottos hidden down side streets and back alleys, there is no lack of choice for the curious patron.

Imagine then, having chosen from all these options, you sit down and order your favorite dish. When the plate arrives the waiter places next to it a finely embossed card printed on thick stock. A closer examination reveals the complete recipe for your meal, including hand-written notations made by the chef. Tips for preparation and the rationale for selecting certain ingredients over others are cheerfully included.

Flipping the card over reveals a simple message:

“Thank you for dining with us this evening. Please accept this recipe with our regards. You may use it when cooking for friends and family, or just to practice your own culinary skills. You may even open your own restaurant and offer this very same dish. We only ask that you include this card with each meal served, and include any changes or improvements you make.”

Sharing the “Secret” Sauce

Having been raised in an Italian family myself, I can assure you that there is no more closely guarded secret than the recipe for our pasta gravy (the sauce). But I can’t help but wonder how such open sharing might affect the landscape of a place like North Beach. If every chef were obliged to share their techniques and methods, surely each would learn from the others? Customers would benefit from this atmosphere of collaboration in the taste and quality of their dinners.

These many restaurants, packed so tightly together as they are, would still be forced to compete on terms of the dining experience. The service of their wait-staff, the ambience, and cost would count for everything.

For the majority of customers, knowledge of the recipe would simply be a novelty. In most cases they would still seek a professional chef to prepare it for them. But to the aspiring amateur, this information would contribute to their education. A new dish could be added to their repertoire.

An experienced restaurateur could no doubt correct me on any number of points as to why such a scenario would be a poor business model and never could or should be attempted. But just across town, throughout Silicon Valley and indeed across the globe, in the realm of technology, this exact model has been thriving for decades.

Open Source in the Software World

In the software world, developers have been sharing their source code (the recipe for the programs they write) under licenses similar to the one outlined above on a grand scale and to great success. The Internet itself was largely constructed using open platforms and tools. Mobile phones running Google’s Android operating system are now the most popular in the world, with complete source material available online. And in 2012 Red Hat became the first open source company to achieve a billion dollars in revenue, with customers from IBM to Disney and Pixar among their roster.

The benefits are many. Developers can leverage each other’s work to gain knowledge and save time. If you want to build a new web site, there’s no need to write the web server or common routines such as user management from scratch. You can take open versions and start from there. Even better, if you have questions or run into trouble, more likely than not someone else has, too, and the answer is only a search away. Most importantly, if the problem you found indicates a flaw in the software (a bug), then a capable coder is empowered to examine the source and fix it himself or herself. And the result can be shared with the entire community.

There are parallels here to several fields. Similar principles form the basis of the scientific method. Without the sharing of procedures and data, independent verification of results would be impossible. And many discoveries result from iterating on proven techniques. A burgeoning do-it-yourself community, a veritable Maker Movement, has grown around magazines like Make and websites such as Instructables.com. New inventions and modifications to popular products are often documented in meticulous detail, permitting even casual hardware hackers to follow along. Electronics kits and prototyping boards from companies like Arduino are based on Atmel microcontrollers plus open circuit designs, and are often used to power such projects.

Puzzlebox Brain Controlled Helicopter in Flight

Brain-Controlled Helicopter

Recently, our company, Puzzlebox, released the Orbit, a brain-controlled helicopter. The user begins by setting a display panel to the desired level of concentration and/or mental relaxation they wish to achieve. A mobile device or our custom Pyramid peripheral processes data collected by a NeuroSky EEG headset. When that target is detected in the user’s brainwaves, flight commands are issued to the Orbit using infrared light. One can practice maintaining focus or a clarity of thought using visual and physical feedback.
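The "target detected in the brainwaves" step amounts to threshold logic. The Python sketch below adds a lower release threshold (hysteresis) so that flight doesn't stutter when attention wavers near the target; both threshold values are illustrative, not Puzzlebox's actual settings:

```python
def flight_commands(attention_levels, target=70, release=60):
    """Issue 'fly' while the attention level holds above the target, dropping
    out only below a lower release threshold to avoid rapid on/off flicker."""
    commands, flying = [], False
    for level in attention_levels:
        if not flying and level >= target:
            flying = True
        elif flying and level < release:
            flying = False
        commands.append("fly" if flying else "hover/land")
    return commands

print(flight_commands([50, 72, 68, 65, 55, 80]))
# -> ['hover/land', 'fly', 'fly', 'fly', 'hover/land', 'fly']
```

Each resulting command would then be translated into the Orbit's infrared pulse sequence.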

Puzzlebox Brain-Controlled Helicopter with Atmel AVR


Beyond novelty, however, lies the true purpose of the Puzzlebox Orbit. All source code, hardware designs, schematics, and 3D models are published freely online. Step-by-step guides for hacking the software and electronics are included. Methods for decoding infrared signals and extending mechanisms to operate additional toys and devices are shared. Creative modification is encouraged. The goal is to promote the product as a teaching aid for middle and high school science classes and in university-level programming and electrical engineering courses.

Puzzlebox forging Classroom and Early Adoption of Technology for Education

This business model is itself a bit of an experiment, much like the restaurant described above. There is little preventing a competitor from producing a knock-off and leveraging our own recipes to do it. They might even open their doors just across the street from ours. We’ll need to work hard to keep our customers coming back for seconds. But so long as everyone abides by the rules, openly publishing any modifications or improvements made on our recipe, we’re not afraid to share the secrets of our sauce. We only ask that they include the original material with each dish they serve, and include any changes or improvements made along the way. We’re willing to compete on cost and dining experience. In this way we hope to improve the quality and flavor for everyone.

Puzzlebox with Arduino and Atmel AVR


Puzzlebox Software IDE Interface

Openness and The Internet of Things

Today, communities such as Kickstarter and others tapping into the power of openness and crowd-sourcing are fueling a lot of technological innovation. The next era for enterprise revolves around The Internet of Things (#IoT), machine-to-machine (#M2M) communications and even the Industrial Internet (#IndustrialInternet).

One strong proponent of innovation and thought, Chris Anderson, is renowned for having his fingerprints and vision on trends as they bloom into movements. Anderson is committed and energized in this Make-infused world. His latest book, “Makers: The New Industrial Revolution,” eloquently captures the “right now” moment with Makers: “Hardware is the new software,” he argues, putting us on the brink of the next age of the Internet, where devices and machines become connected. Cloud, agile apps, and embedded hardware (systems on chips, microcontrollers, and smart devices) are converging, paving the way for the next generation of integrated products across the fabric of devices.

“The real revolution here is not in the creation of the technology, but the democratization of the technology. It’s when you basically give it to a huge expanded group of people who come up with new applications, and you harness the ideas and the creativity and the energy of everybody. That’s what really makes a revolution.

…What we’re seeing here with the third industrial revolution is the combination of the two [technology and manufacturing]. It’s the computer meets manufacturing, and it’s at everybody’s desktop.”

Excerpt from Chris Anderson’s “Makers: The New Industrial Revolution”

With that said, we enter the next age, where hardware is the new software.