Hitachi researcher Kawamoto Ken recently debuted Myra, a robotic platform capable of autonomously optimizing lighting conditions. According to Ken, the motivation behind Myra is simple – freeing people from the constraints of a conventional “fixed” lighting system.
“By sensing what you’re doing (e.g. reading, sleeping, eating, etc.) using depth sensors, it changes the orientations of the lights to ensure you always have the perfect lighting, no fumbling around with switches needed,” he explained in a recent blog post.
“Conventional room lighting is static (or only mildly flexible). For example, many rooms (especially in Europe) are designed with a dark ambient, with some spots of light. This is aesthetically fine, but what if you decide that you want to read a book in the middle of the room?”
In contrast to traditional lighting arrays, Myra automatically configures itself by recognizing various activities within a residence or office space.
The platform currently consists of three primary components:
- The Myra Light – A robotic arm fitted with an LED; several are placed strategically around a room and controlled by a central PC.
- An RGBD sensor – Microsoft Kinect or Asus Xtion.
- PC – Responsible for analyzing sensor readings and regulating individual Myra lights.
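Ken has not yet published Myra’s code, but the division of labor above suggests a simple PC-side control loop: read a frame, locate people, and re-aim every light. The sketch below is purely illustrative – the `MyraLight` class and `step` function are hypothetical stand-ins for the serial-driven hardware, not Myra’s actual API.

```python
# A minimal sketch of the PC-side control loop. All names here are
# hypothetical; the real Myra code has not been released.

class MyraLight:
    """Stand-in for one robotic light; real units are driven over serial."""
    def __init__(self):
        self.target = None

    def aim(self, point):
        self.target = point  # real hardware would move servo joints here


def step(frame_people, lights):
    """One control iteration: aim every light at the first detected person.

    `frame_people` is a list of (x, y, z) positions that a real system
    would extract from the depth sensor's point cloud.
    """
    for light in lights:
        if frame_people:
            light.aim(frame_people[0])


lights = [MyraLight(), MyraLight()]
step([(1.0, 0.5, 2.0)], lights)
print(lights[0].target)  # (1.0, 0.5, 2.0) – both lights aim at the person
```

A real implementation would run this at the sensor’s frame rate and pick per-light targets rather than pointing everything at one person, but the data flow – sensor in, light commands out – matches the three components listed above.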
The Myra light is controlled by a stand-alone Atmel ATmega microcontroller (an MCU on a breadboard).
Meanwhile, the LED is fitted with a lens that focuses its output into a fairly strong beam over a 15° arc. On the software side, Myra uses NiTE to “extract people” from the point cloud, tagging individuals and their features.
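That narrow lens is what makes aiming worthwhile. As a rough geometric check (not from the post), a 15° cone at distance d illuminates a spot of diameter 2·d·tan(7.5°) – well under a metre across at typical room distances, so an unaimed light would miss most activities:

```python
import math

def spot_diameter(distance_m, beam_angle_deg=15.0):
    """Diameter of the illuminated spot for a conical beam of light.

    Simple cone geometry: the spot radius at distance d is
    d * tan(half-angle), so the diameter is twice that.
    """
    half = math.radians(beam_angle_deg / 2.0)
    return 2.0 * distance_m * math.tan(half)

print(round(spot_diameter(3.0), 2))  # ~0.79 m across at 3 m
```

At 3 m the 15° beam lights a circle only about 0.8 m wide, which explains why each light needs to track what the occupant is doing.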
“This is where much of the hard work happens,” Ken continued.
“Myra first classifies the state of each person into 5 different activities: reading, watching TV, standing, walking, sleeping. Then, ‘lighting targets’ are set according to [individual] activity and postures.”
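The two-stage pipeline Ken describes – classify the activity, then derive a lighting target from it – could be sketched as a simple lookup. The five activity names below come from the post; the offsets and the `lighting_target` function are invented for illustration only:

```python
# Hedged sketch of the "lighting target" step: map each classified
# activity to where a light should point relative to the person.
# Activity names are from Ken's post; the offsets are assumptions.

TARGET_OFFSETS = {
    "reading":     (0.0, -0.25, 0.25),  # aim toward a book below the face
    "watching_tv": None,                # don't shine light at the viewer
    "standing":    (0.0, 0.0, 0.0),     # general illumination at the person
    "walking":     (0.0, -0.5, 0.5),    # light the floor ahead
    "sleeping":    None,                # lights off
}

def lighting_target(position, activity):
    """Return a 3-D aim point for one light, or None to switch it off."""
    offset = TARGET_OFFSETS.get(activity)
    if offset is None:
        return None
    x, y, z = position
    dx, dy, dz = offset
    return (x + dx, y + dy, z + dz)

print(lighting_target((1.0, 1.5, 2.0), "reading"))   # (1.0, 1.25, 2.25)
print(lighting_target((1.0, 1.5, 2.0), "sleeping"))  # None
```

The interesting engineering is presumably in the classifier itself – distinguishing reading from watching TV from depth data alone – which is why Ken calls this the part “where much of the hard work happens.”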
Although Myra is still very much in development, Ken says he ultimately plans to make the project open source by releasing all schematics and code.
Interested in learning more? You can check out Ken’s full blog post here.