Tag Archives: LEDs

Clara is a smart lamp that helps you stay focused

Working on a project? Cramming for an exam? This brain-sensing, environment-augmenting lamp uses EEG technology to tell how focused you are and to block out distractions.

We’ve all been there: It’s late at night and you’re cramming for an exam when suddenly you’re interrupted by the simplest thing. How cool would it be to have a desktop accessory that could give you a kick in the right direction and increase your intensity as you try to finish your studying? Thanks to a group of Makers from the School of Visual Arts, that will soon be a reality.


The brainchild of developers Mejía Cobo, Belen Tenorio, and Josh Sucher, Clara is a brain-sensing lamp that employs EEG technology to tell how focused you are on the task at hand. Embedded with a speaker and LEDs, the scene-augmenting device responds to changes in brainwaves, reacting to your level of concentration by raising the volume of the ambient music and shifting the light levels.

To bring this idea to fruition, the team combined an Arduino Uno (ATmega328), an MP3 shield, several Adafruit NeoPixels, a SparkFun Bluetooth modem and a Neurosky MindWave Mobile EEG headset to wirelessly measure your “attention” and map it to the lamp’s color temperature, thereby subtly altering your environment.


As you begin homing in on a specific idea, the light will become crisper and cooler as the volume of the ambient noise emitted from the speaker slowly rises. This helps to enhance your ninja-like focus and block out other distractions.

“The basic structure of the Arduino code is straightforward. The NeoPixel strip is instantiated, then the Music Maker shield is instantiated, then we take advantage of interrupts to listen for, receive and act on Bluetooth serial data while the music is playing,” its creators reveal. “When the MindWave detects ‘activity’ (a number from 0-100 generated via some proprietary algorithm on the Neurosky chip), we initiate the ‘fade’ of the music and the light.”
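The fade logic the creators describe amounts to a simple mapping: the MindWave’s 0-100 “attention” value goes in, a light color temperature and music volume come out. Here is a minimal Python sketch of one way such a mapping could work; the 2700K-6500K range and the linear ramp are illustrative guesses, not Clara’s actual constants:

```python
def attention_to_output(attention):
    """Map a MindWave 'attention' value (0-100) to a color temperature
    in Kelvin and a music volume (0-100). The Kelvin endpoints and the
    linear ramp are assumptions, not Clara's real values."""
    attention = max(0, min(100, attention))          # clamp to the sensor's range
    kelvin = 2700 + (6500 - 2700) * attention // 100  # warmer when idle, cooler when focused
    volume = attention                                # ambient music swells as focus rises
    return kelvin, volume
```

With this mapping, a fully distracted reading yields warm, quiet ambience, while full focus produces the crisp, cool light and louder soundscape the post describes.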


Looking ahead, don’t be too surprised if you see Clara on Kickstarter in the coming months. Plus, the team hints that they may even migrate to an Arduino Mega (ATmega2560) for the next iteration. Until then, check out this rather unique project on its page here.

This LED map tracks the MBTA in real-time

Maker uses an Arduino, Raspberry Pi and LEDs to create a real-time map that keeps tabs on Boston’s trains.

Inspired by his love for making and public transit, MIT student Ian Reynolds has built an MBTA map into the wall of his fraternity room to show real-time locations of vehicles using bright LEDs.


The Maker employed a few meters of NeoPixels, driven by an Arduino Uno (ATmega328) that takes orders from a Python script running on a Raspberry Pi lying on his floor. The colors of the LEDs were chosen to match those of each transit line (e.g. red line, blue line, green line, orange line, etc.). Every 10 to 15 seconds, the system pulls data from the MBTA’s API, which, in turn, causes the respective lights to flash based on the trains’ approximate GPS locations throughout Boston.


“It maps those to some LEDs, decides which ones actually need to be changed, and then sends that information to the Arduino, which does the bit pushing,” Reynolds explains. “In addition, I’m writing a tiny web app that lets me change visualizations and adjust the brightness for when I need to sleep.”
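The “maps those to some LEDs” step can be sketched in a few lines of Python. The strip layout below is hypothetical; Reynolds’ real script works from the MBTA API’s stop and vehicle coordinates rather than a simple 0-1 progress value:

```python
def train_to_led(progress, first_led, led_count):
    """Map a train's progress along a line (0.0 at one terminus,
    1.0 at the other) to an index on that line's LED segment."""
    progress = max(0.0, min(1.0, progress))
    return first_led + int(progress * (led_count - 1))

# Hypothetical strip layout: (first LED index, number of LEDs) per line.
SEGMENTS = {"red": (0, 20), "orange": (20, 15), "blue": (35, 12)}

def update(vehicles):
    """Turn (line, progress) pairs into the LED indices to light up."""
    return sorted(train_to_led(p, *SEGMENTS[line]) for line, p in vehicles)
```

Computing the full target frame on the Pi and shipping only the changed indices to the Arduino keeps the serial traffic small, which matches the division of labor the Maker describes.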

Intrigued? The Maker has put together an elaborate blog post that breaks down his entire project, from the hardware to the headaches. You can also get a glimpse of it all below!

Disney researchers found a way for devices to communicate using LEDs

Visible Light Communication enables the interaction between objects using only LEDs.

If devices are going to communicate with one another, more often than not it’s going to be done through Bluetooth or Wi-Fi. However, wireless networks aren’t always available and Bluetooth can drain battery life. Knowing this, a Disney Research team has come up with an alternative way for Internet of Things objects to ‘talk.’ How, you ask? Through LED lights.


Unlike incandescent or fluorescent bulbs, the brightness of LEDs can be controlled with extreme precision. This means they can be turned on and off at very high frequencies, faster than the human eye can detect. Aside from that, LEDs can even be used as receivers, just like photodiodes.

Similar to how two ships passing in the night can communicate via Morse code, a couple of IoT gadgets can now secretly converse through the visible light generated by an LED — a method that the team calls Visible Light Communication, or VLC. Not only can it illuminate a room, but the MCU inside each bulb is capable of transmitting and receiving data.
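At its simplest, the Morse-code analogy is on-off keying: bits become rapid on/off states of the LED, and the receiving device thresholds the sensed intensity back into bits. Disney’s actual VLC protocol adds synchronization and ambient-light handling, so treat this Python round-trip as a bare-bones illustration of the idea only:

```python
def ook_encode(bits):
    """Encode a bit string as LED states: 1 = on, 0 = off."""
    return [1 if b == "1" else 0 for b in bits]

def ook_decode(samples, threshold=0.5):
    """Threshold sensed light intensities back into a bit string.
    Real receivers must first estimate the ambient light floor."""
    return "".join("1" if s > threshold else "0" for s in samples)
```

Because the thresholding tolerates noisy readings, a string like `[0.9, 0.1, 0.8]` decodes the same as a clean `[1, 0, 1]`, which is what lets an ordinary LED double as a crude photodiode on the receiving end.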


“VLC creates opportunities for low-cost, safe, and environmentally friendly wireless communication solutions. We focus on connected toys and light bulb networks,” the team writes. “Our work targets a full system design that spans from hardware prototypes to communication protocols, and applications.”

Though the concept of “Li-Fi” has been around for a while, it would appear that many of VLC’s initial examples are focused primarily on toys. (It is Disney, after all!) Among them are a toy car that can turn on its own lights and come to life when placed near a lamp, as well as a princess dress whose embedded LEDs are activated whenever a wand with its own light comes near.

“LED-to-LED Visible Light Communication allows interaction between toys by only using LEDs. No dedicated hardware is required. When multiple devices are networked with each other, we organize the communication with our software protocols,” the researchers add.


The technology has other potential applications as well. An adapter connected to the headphone jack of a smartphone or tablet can receive signals from overhead lights flickering at rates unnoticeable by the human eye. This, for instance, opens the door for LED emitters to be placed around a store to beam notifications to the smartphones of shoppers.

Using a simple mobile app on the device, the lightbulb data can be used to tell a story and visualize both pictures and text. When off, no data is transmitted. When switched back on, the storytelling continues.

As you can see in the photo above, the researchers employed various Arduino Uno boards (ATmega328) as part of the study’s testbed. Read all about the project here.

Sparks are weather-proof, sound-responsive LEDs

Sparks are ruggedized, weather-proof, addressable LEDs that can be added anywhere — from the home to the office to the club. 

Their kitchen. Their sofa. Their office. Their jackets. Their vehicles. These are just some of the places that the folks at San Diego startup JuiceJuice have affixed multicolor LEDs. In doing so, they found that although the lights were easy to install, they were often quite difficult to program. As a result, the team decided to create an app that would communicate with their rainbow LEDs over Bluetooth, enabling them to change patterns, adjust the speed and brightness, and sync multiple units together in real-time.


The aptly named Sparks are equipped with WS2812B modules running at 5V and a novel three-LED PCB format. With an Arduino at its core, its control unit (also known as the Brain) is embedded with Bluetooth connectivity, a real-time clock, and an SD slot for custom patterns — though it will come with at least 30 pre-programmed sequences. They’ve even added a microphone so the lights can respond to sound.

Aside from all that, Sparks feature a high-wattage 12V barrel plug for wall-socket use and for installations of more than 100 lights. Otherwise, you can run up to 100 lights off USB, portable battery packs, car adapters and even your laptop.


“Our lights have lit our campsites at Burning Man, projects at dozens of festivals, DJ lighting in bars and nightclubs around San Diego, and vehicles from skateboards to yachts. They’ve survived hot tub parties in Mexico, the staircase of the ‘Charlie the Unicorn’ art-car, they accent shelves and desks in our homes and offices, and we even have them under my truck,” company co-founder Leeward Bound writes.

The Sparks were designed to withstand weather and to be flexible enough for installation in a wide range of settings — whether that’s decorating the outside of your home for the holidays or illuminating your bike for a mesmerizing nighttime ride.

For its launch, backers can choose from one of two different kits. A basic set includes a pair of five-foot strands with 36 LEDs (12 three-inch segments of three lights), along with one Brain that can drive up to 500 LEDs and the necessary power supply. Meanwhile, a mega kit boasts twice as many LEDs for those with more elaborate ideas in mind.


“Want green explosions in your bookshelf, purple waves down your staircase and red lava around the tub? No problem. Switch your bed frame to red heartbeats, put on some Barry White, BOOM! Date night. And on a Saturday night, sound-activated rainbows turn absolutely any occasion into a dance party,” Bound adds.

Have a bright idea you’d like to get started on? Want to be the light of the party? Head over to Sparks’ Indiegogo campaign, where the JuiceJuice team is currently seeking $75,000. Provided all goes to plan, delivery is expected to begin in January 2016.

These lights will let you control your smart devices through gestures

LiSense uses shadows created by the human body from blocked light and reconstructs 3D human skeleton postures in real-time.

As our homes become increasingly smarter, what if we could use the light around us for more than just illumination? In other words, imagine if the light in your room could sense you waving your hand as you enter, or was able to trigger your smart coffee machine, unlock the door and turn on your entertainment center. While it sounds like something straight out of a sci-fi novel, it may soon all be possible thanks to a new project from researchers at Dartmouth College.


The team is looking to transform ubiquitous light into a medium that integrates communication with human sensing. LiSense works by decoding information made from visible light to turn everyday lighting into sensors that can then recognize and respond to what we do. This is achieved through visible light communication (VLC), which encodes data into light intensity changes at a high frequency invisible to the human eye.

Not only does LiSense use light to sense people’s movements, but it also allows them to control devices in their environment with simple gestures, employing light to transmit the information. The hope is that you will be able to gesture and engage with objects in a room via nothing more than light, similar to how you’d use a Kinect or Wii gaming system to interact with your TV.

For LiSense to track a person’s movements, the researchers built a three-meter by three-meter light-sensing testbed with five off-the-shelf Cree LEDs in the ceiling and 324 photodiodes on the floor. A total of 29 microcontrollers, Arduino Due (SAM3X8E) and Uno (ATmega328), were embedded as well. The system uses the shadows created by a person standing on the testbed to reconstruct their 3D human skeletal posture in real-time (at 60 Hz).

To get their shadow-based human sensing to work, the researchers had to overcome two critical challenges. Since multiple ceiling lights lead to diminished and complex shadow patterns on the floor, they had to devise light beacons to separate light rays from individual LEDs and ambient light. Additionally, they came up with an algorithm capable of taking the collected limited resolution, 2D shadow maps from the photodiodes in the floor and reconstructing a person’s posture in 3D.
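The light-beacon trick — giving each LED its own modulation frequency so a photodiode can tell the sources apart — can be demonstrated with a small FFT. The frequencies and amplitudes below are invented for illustration; the paper’s actual beacon design differs:

```python
import numpy as np

def beacon_strengths(signal, beacon_freqs, sample_rate):
    """Estimate each LED beacon's contribution to one photodiode's
    reading from the FFT magnitude at that beacon's frequency."""
    n = len(signal)
    spectrum = 2 * np.abs(np.fft.rfft(signal)) / n   # amplitude spectrum
    bins = np.fft.rfftfreq(n, d=1.0 / sample_rate)
    return [spectrum[np.argmin(np.abs(bins - f))] for f in beacon_freqs]
```

Run over every photodiode in the floor grid, this kind of separation yields one shadow map per ceiling LED, which is exactly the per-light information the reconstruction algorithm needs.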


By waving your hand, LiSense lets you freely control things, play games and track behavior without the need for cameras or on-body devices. One day, the team says it may even respond to your feelings. Compared to existing methods that use wireless radio signals such as Wi-Fi to track user gestures, VLC has several appealing properties and advantages. For starters, light-based sensing is secure, doesn’t penetrate walls, and isn’t limited to classifying a pre-defined set of gestures and activities. On top of that, it’s energy efficient, operates at a bandwidth 10,000 times greater than the radio frequency spectrum, and reuses existing lighting infrastructure.

“Light is everywhere and we are making light very smart,” says Xia Zhou, lead author and researcher on the project. “Imagine a future where light knows and responds to what we do. We can naturally interact with surrounding smart objects such as drones and smart appliances and play games, using purely the light around us. It can also enable a new, passive health and behavioral monitoring paradigm to foster healthy lifestyles or identify early symptoms of certain diseases. The possibilities are unlimited.”

Sounds intriguing, right? See it all in action below, and be sure to read the team’s entire paper here.

Maker creates a Tron and Star Wars-inspired control panel for his computer

This fully-functional, overhead control panel will be the most awesome thing you see today.

Most of us rely on a keyboard and mouse to perform tasks on our computers. Not Reddit user “smashcuts.” Instead, the Maker has built a fully-functional overhead control panel for his PC, complete with 100 programmable buttons and switches that trigger all kinds of actions, from the useful to the absurd.


As you can imagine, constructing such a complex device was no easy task. To make this a reality, the Maker employed the combination of a USB hub and controllers, LEDs for the backlighting, an Arduino Mega (ATmega2560) for the blinking lights and a HAL unit from Think Geek. These electronics are all housed inside an enclosure made from a metal junction box and laser-etched acrylic panels.


While the project itself was a pretty elaborate endeavor with some serious functionality, it was all done in good humor. There’s a green ‘Main Systems’ section which turns on his most frequently used programs, such as Chrome, Photoshop, Premiere, After Effects and iTunes. Meanwhile, a central unit controls all of his main OS shortcuts like open, save and close.


He’s also included a category that he calls ‘Panic Control,’ with three toggles for stress management. According to smashcuts, ‘Don’t Panic’ cues a Hitchhiker’s Guide YouTube video, ‘Serenity Now’ cues a Firefly clip, and ‘Hold Steady’ plays songs from his favorite band. As if that wasn’t enough, there’s a ‘Wave Collider’ panel that allows him to activate various iTunes playlists and choose ‘More Rock’ or ‘Less Rock’ depending on his mood.


Beyond that, buttons in the bottom left-hand corner type a variety of laughter into open chat windows, including the common ‘HA,’ ‘HAHA,’ or ‘HAHAHA’ for extremely funny moments. There’s even a ‘Weapons System,’ which emits humorous sound effects. Despite some of its comedic features, this was surely an impressive build!

If you’ve ever dreamt of using a Star Wars/Tron-like control panel, you’ll want to check out the Maker’s project in its entirety here.

This interactive light installation breathes with you

PRANA is an immersive, interactive light installation controlled by the viewer’s breath.

The term “prana” refers to the cosmic life force that originates in the sun and enters the body as we inhale. In the act of breathing, we form a connection between ourselves and the universe. To better explore the relationship between breath and light, B-Reel Creative developed an installation of 13,221 LEDs that gave onlookers a visual representation of this connection.


The creative firm’s exhibit PRANA was comprised of a 12’ x 12’ sphere suspended from the ceiling, which visitors were prompted to step into and stand before a XeThru respiration sensor (ATSAM4E16E) that detected their breath. With every inhale and exhale, the XeThru data was fed into a custom JavaScript library, triggering color shifts and animations to make it appear as if the installation was breathing with them.
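Though the team’s pipeline runs through a custom JavaScript library, the core mapping — breathing phase in, color and intensity out — can be sketched simply. The cosine envelope and the blue-leaning palette here are assumptions for illustration, not PRANA’s actual animation code:

```python
import math

def breath_to_color(phase):
    """Map a breathing phase (0.0 = start of inhale, 0.5 = peak,
    1.0 = end of exhale) to an RGB triple that swells and fades.
    The palette is a hypothetical stand-in for PRANA's animations."""
    intensity = 0.5 - 0.5 * math.cos(2 * math.pi * phase)  # 0 -> 1 -> 0
    return (int(40 * intensity), int(120 * intensity), int(255 * intensity))
```

Feeding the respiration sensor’s phase estimate through a function like this every frame is what makes the sphere appear to inhale and exhale in step with the visitor.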


Sounds by One Thousand Birds were emitted to enhance its effect, seamlessly transitioning between each phase of the experience. The team open sourced the code to enable other artists and developers to create custom animations that can later be incorporated into the piece.

PRANA was designed and built entirely in-house over the course of a year and had been on display at the Fridman Gallery in New York City up until earlier this month. However, you can see it for yourself below!

This LED installation mimics the movements of fireflies

This 2,000-plus LED installation reacts to the movement of its visitors, placing them inside a colorful 3D environment.

Austrian arts collective Neon Golden recently created an immersive light installation designed to mimic the movements of fireflies. The project, aptly named SWARM, consists of over 2,000 LEDs that are suspended at various heights from an overhead metal grid and arranged in a series of 40 modules throughout a dark room.


The lights use motion-sensing technology, controlled by a Raspberry Pi and an Arduino running Processing, to replicate the motion of lightning bugs. The hanging LEDs change position horizontally in response to the movements of nearby visitors. The team also employed Cinema 4D to generate SWARM’s advanced 3D effects.

“Through the movement of the visitors within the installation the LEDs are lightening up and the static, chaotic structure transforms into a vibrant, three-dimensional swarm one can visually but also acoustically experience,” Neon Golden explains.


According to its creators, SWARM is adaptable to meet different space requirements, as the configuration of light modules can be adjusted to fit smaller or larger areas. The piece made its debut back at the Olympus Photography Playground in Vienna in February 2015.


You can see it for yourself in the video below as dancer Máté Czakó makes his way through the luminescent creatures, revealing the LEDs’ reactivity.

[h/t Dezeen]

This lower-limb exoskeleton is controlled by staring at flickering LEDs

Scientists have developed a brain-computer interface for controlling a lower limb exoskeleton.

As recent experiments have shown, exoskeletons hold great promise in assisting those who have lost the use of their legs to walk again. However, for those who are quadriplegic, diagnosed with a motor neuron disease or have suffered a spinal cord injury, hand control is not an option. To overcome this barrier, researchers at Korea University and TU Berlin have developed a brain-computer interface that can command a lower limb exoskeleton by decoding specific signals from within the user’s mind.


This is achieved by wearing an electroencephalogram (EEG) cap, which enables a user to move forward, turn left and right, sit and stand simply by staring at one of five flickering LEDs, each representing a different action. Each of the lights flickers at a different frequency, and when the user focuses their attention on a specific LED, this frequency is reflected in the EEG readout. This signal is then identified and used to control the exoskeleton.

The exoskeleton control system consists of a few parts: the exoskeleton itself, an ATmega128 MCU-powered visual stimulus generator and a signal processing unit. As the team notes, a PC receives EEG data from the wireless EEG interface, analyzes the frequency information, and provides the instructions to the robotic exoskeleton.
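The frequency analysis step can be illustrated with a short sketch: take the spectrum of an EEG window and pick whichever candidate flicker frequency shows the strongest response. The five flicker rates and their command labels below are assumptions, since the exact values aren’t given here:

```python
import numpy as np

# Hypothetical flicker rate (Hz) -> exoskeleton command mapping.
COMMANDS = {9.0: "forward", 11.0: "turn left", 13.0: "turn right",
            15.0: "sit", 17.0: "stand"}

def classify_ssvep(eeg, sample_rate):
    """Return the command whose flicker frequency dominates the EEG window."""
    spectrum = np.abs(np.fft.rfft(eeg))
    bins = np.fft.rfftfreq(len(eeg), d=1.0 / sample_rate)
    power = {f: spectrum[np.argmin(np.abs(bins - f))] for f in COMMANDS}
    return COMMANDS[max(power, key=power.get)]
```

The real system additionally has to suppress the exoskeleton’s own electrical noise before this comparison, which is the hard part the researchers highlight below.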


This method is suitable for even those with no capacity for voluntary body control, apart from eye movements, who otherwise would not be able to control a standard exoskeleton. The researchers believe that their system offers a much better signal-to-noise ratio by separating the brain control signals from the surrounding noise of ordinary brain signals for more accurate exoskeleton operation.

“Exoskeletons create lots of electrical ‘noise,’” explains Professor Klaus Muller, an author on the paper that has been published in the Journal of Neural Engineering. “The EEG signal gets buried under all this noise — but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal.”


The control system could serve as a technically simple and feasible add-on to other devices, with EEG caps and hardware now emerging on the consumer market. According to the researchers, it only took volunteers a few minutes to get the hang of using the exoskeleton. Because of the flickering LEDs, participants were carefully screened and those suffering from epilepsy were excluded from the study. The team is now working to reduce the ‘visual fatigue’ associated with long-term use.

“We were driven to assist disabled people, and our study shows that this brain control interface can easily and intuitively control an exoskeleton system — despite the highly challenging artefacts from the exoskeleton itself,” Muller concludes.

Those wishing to learn more can read the entire paper here, or watch the brain-controlled exoskeleton in action below.

[Images: Korea University / TU Berlin]

Capturing the movement of musicians through light painting

One Waterloo artist uses LEDs and long-exposure photography to reveal the hidden patterns of musicians. 

Stephen Orlando has come up with an innovative way to capture music in photos. By attaching LED lights to the bows of violin, viola and cello players, the Waterloo-based photographer is able to snap a visual representation of the sounds being created with the help of a long-exposure camera.


Orlando can track these movements through space, following the arms and bows with vibrantly lit bands. We would assume that, like his other projects from the Motion Exposure series, he uses an Arduino Mega (ATmega2560) to program the set of LEDs to change colors, as a way to convey a sense of time.


“The progression of time is from left to right in the viola and violin photos and from top to bottom in the cello photos. Each photo is a single exposure and the light trails have not been manipulated in post processing,” Orlando explains.

Orlando reveals to Colossal that he drew inspiration from light painter Gjon Mili, who experimented with violin paintings back in 1952.


This isn’t the first time that he has employed the help of LEDs and long exposure photography to tell the story of movement either. If you recall, Orlando has captured the “invisible” patterns of outdoor activities, such as kayaking, paddle boarding and skiing. You can see all those incredible images here.

[h/t Colossal]