LiSense uses the shadows the human body creates by blocking light to reconstruct 3D human skeleton postures in real time.
As our homes become increasingly smarter, what if we could use the light around us for more than just illumination? In other words, imagine if the light in your room could sense you waving your hand as you enter, trigger your smart coffee machine, unlock the door, and turn on your entertainment center. While it sounds like something straight out of a sci-fi novel, it may soon all be possible thanks to a new project from researchers at Dartmouth College.
The team is looking to transform ubiquitous light into a medium that integrates communication with human sensing. LiSense turns everyday lighting into a sensor that can recognize and respond to what we do. It builds on visible light communication (VLC), which encodes data into light intensity changes that flicker at frequencies too high for the human eye to perceive.
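To get a feel for the VLC side of this, here is a rough Arduino-style sketch of a single ceiling LED acting as a transmitter. The pin number and the 2 kHz beacon frequency are illustrative assumptions, not values from the LiSense paper.

```cpp
// Rough sketch of a VLC-style transmitter: toggle an LED at a rate far above
// the flicker the human eye can perceive. The pin and the 2 kHz beacon rate
// are assumptions for illustration, not values from the LiSense paper.

const uint8_t  LED_PIN        = 9;                            // LED driver pin (assumed)
const uint32_t BEACON_HZ      = 2000;                         // toggle rate, invisible to the eye (assumed)
const uint32_t HALF_PERIOD_US = 1000000UL / (2 * BEACON_HZ);

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // A steady square wave: to a person the light simply looks on, but a fast
  // photodiode can pick out this frequency and tell the LEDs apart.
  digitalWrite(LED_PIN, HIGH);
  delayMicroseconds(HALF_PERIOD_US);
  digitalWrite(LED_PIN, LOW);
  delayMicroseconds(HALF_PERIOD_US);
}
```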
Not only does LiSense use light to sense people’s movements, but it also allows them to control devices in their environment with simple gestures, employing light to transmit the information. The hope is that you will be able to gesture and engage with objects in a room via nothing more than light, similar to how you’d use a Kinect or Wii gaming system to interact with your TV.
For LiSense to track a person’s movements, the researchers built a three-meter by three-meter light-sensing testbed with five off-the-shelf Cree LEDs in the ceiling and 324 photodiodes embedded in the floor, all tied together by 29 microcontrollers, a mix of Arduino Due (SAM3X8E) and Uno (ATmega328) boards. The system uses the shadows cast by a person standing on the testbed to reconstruct their 3D skeleton posture in real time, at 60 Hz.
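On the receiving end, each floor microcontroller mainly has to keep reading its bank of photodiodes and pass the light-intensity readings along for reconstruction. A minimal sketch of that job might look like the following, assuming a Due with photodiodes on A0 through A11 and a simple comma-separated serial format (both assumptions, not the testbed’s actual firmware).

```cpp
// Rough sketch of one floor node: sample a bank of photodiodes and stream the
// raw light-intensity readings over serial at roughly the 60 Hz frame rate the
// system reconstructs at. Pin count, wiring and serial format are assumptions.

const uint8_t  NUM_PHOTODIODES   = 12;   // photodiodes on A0..A11 of a Due (assumed)
const uint32_t FRAME_INTERVAL_MS = 16;   // ~60 Hz frames

void setup() {
  Serial.begin(115200);
}

void loop() {
  // One comma-separated frame per line, for a host to assemble into a
  // floor-wide shadow map.
  for (uint8_t i = 0; i < NUM_PHOTODIODES; i++) {
    Serial.print(analogRead(A0 + i));
    Serial.print(i < NUM_PHOTODIODES - 1 ? ',' : '\n');
  }
  delay(FRAME_INTERVAL_MS);
}
```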
To get their shadow-based human sensing to work, the researchers had to overcome two critical challenges. First, multiple ceiling lights overlap and produce diminished, complex shadow patterns on the floor, so they designed light beacons that allow the light rays from individual LEDs to be separated from one another and from ambient light. Second, they developed an algorithm that reconstructs a person’s posture in 3D from the limited-resolution 2D shadow maps collected by the photodiodes in the floor.
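The beacon idea can be sketched in code as well. If each of the five LEDs flickers at its own frequency, a single photodiode’s sample buffer can be split into per-LED light levels with a narrowband filter, and a dip in one channel suggests the person’s shadow is blocking that particular LED. The sketch below uses a Goertzel filter for this; the beacon frequencies, sampling rate and block size are assumptions, and it stands in for, rather than reproduces, the separation scheme described in the paper.

```cpp
// Rough sketch of separating the light contributed by each ceiling LED at one
// photodiode. Each LED is assumed to flicker at its own frequency; a Goertzel
// filter then measures how strongly each frequency shows up in a block of
// samples. Frequencies, sample rate and block size are illustrative only.

#include <math.h>

const int   NUM_LEDS         = 5;
const float LED_HZ[NUM_LEDS] = {1000, 1500, 2000, 2500, 3000};  // assumed beacon frequencies
const int   BLOCK_SIZE       = 128;                             // samples per analysis block (assumed)
const float SAMPLE_HZ        = 10000.0f;                        // photodiode sampling rate (assumed)
const unsigned int SAMPLE_PERIOD_US = (unsigned int)(1000000.0f / SAMPLE_HZ);

int samples[BLOCK_SIZE];

// Amplitude of one target frequency within a block of samples (Goertzel filter).
float goertzelAmplitude(const int *x, int n, float targetHz, float sampleHz) {
  const float k = 2.0f * cosf(2.0f * 3.14159265f * targetHz / sampleHz);
  float s1 = 0.0f, s2 = 0.0f;
  for (int i = 0; i < n; i++) {
    float s0 = x[i] + k * s1 - s2;
    s2 = s1;
    s1 = s0;
  }
  return sqrtf(s1 * s1 + s2 * s2 - k * s1 * s2);
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Grab one block of readings from a photodiode on A0 (assumed).
  for (int i = 0; i < BLOCK_SIZE; i++) {
    samples[i] = analogRead(A0);
    delayMicroseconds(SAMPLE_PERIOD_US);
  }
  // Per-LED light level at this spot: a dip in one value points to that LED
  // being blocked, even while the others still reach the photodiode.
  for (int led = 0; led < NUM_LEDS; led++) {
    Serial.print(goertzelAmplitude(samples, BLOCK_SIZE, LED_HZ[led], SAMPLE_HZ));
    Serial.print(led < NUM_LEDS - 1 ? ',' : '\n');
  }
}
```

Recovering per-LED light levels at every photodiode effectively gives the floor one coarse shadow map per LED, which, roughly speaking, is what the reconstruction algorithm then has to work from.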
By waving your hand, LiSense lets you freely control things, play games and track behavior without the need for cameras or on-body devices. One day, the team says, it may even respond to your feelings. Compared to existing methods that track user gestures with wireless radio signals such as Wi-Fi, VLC has several appealing properties. For starters, light-based sensing is more secure because light doesn’t penetrate walls, and it isn’t limited to classifying a pre-defined set of gestures and activities. On top of that, it’s energy efficient, the visible light spectrum offers roughly 10,000 times more bandwidth than the radio frequency spectrum, and it reuses existing lighting infrastructure.
“Light is everywhere and we are making light very smart,” says Xia Zhou, lead author and researcher on the project. “Imagine a future where light knows and responds to what we do. We can naturally interact with surrounding smart objects such as drones and smart appliances and play games, using purely the light around us. It can also enable a new, passive health and behavioral monitoring paradigm to foster healthy lifestyles or identify early symptoms of certain diseases. The possibilities are unlimited.”
Sounds intriguing, right? See it all in action below, and be sure to read the team’s entire paper here.