Queen's University researchers are developing a real-life AR system that will enable users to physically interact with data through different types of drones.
Get ready to file this recent project from researchers at Queen’s University’s Human Media Lab under the “What the…” category. That’s because the team is developing a human-computer interface that employs a swarm of tiny drones as flying pixels in an immersive 3D display. The hope is that BitDrones can one day revolutionize the way people interact with virtual reality. These itsy bitsy flying apparatuses will enable users to explore virtual 3D information by engaging with physical, self-levitating building blocks. In other words, they’re turning drones into holograms that people can actually touch.
According to Queen’s professor Roel Vertegaal and his team, BitDrones will be the first step towards creating interactive self-levitating programmable matter — materials capable of changing their 3D shape in a programmable fashion — using swarms of nano quadcopters. The work highlights many possible applications for the new technology, including real-reality 3D modeling, gaming, molecular modeling, medical imaging, robotics and online information visualization.
“BitDrones brings flying programmable matter, such as featured in the futuristic Disney movie Big Hero 6, closer to reality. It is a first step towards allowing people to interact with virtual 3D objects as real physical objects,” Dr. Vertegaal explains.
The team has already built three types of BitDrones: First, PixelDrones are equipped with one LED and a small dot matrix display. Next, ShapeDrones are augmented with a lightweight mesh and a 3D-printed geometric frame and serve as building blocks for complex 3D models. Meanwhile, DisplayDrones are fitted with a curved, flexible, high-resolution touchscreen, a forward-facing video camera and an Android smartphone board. All three models carry reflective markers, which allow them to be individually tracked and positioned in real-time via motion capture technology. The system can also detect a user’s hand motion and touch, letting users manipulate the pixels in midair as if they were standing inside a 3D display.
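To get a feel for how that kind of tracking might fit together, here is a minimal sketch in Python. It is purely illustrative, not the lab's actual code: the `TrackedDrone` class, the distance threshold, and the helper names are all assumptions standing in for whatever the motion-capture pipeline really reports.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedDrone:
    """Pose of one drone as reported by a motion-capture system (hypothetical)."""
    drone_id: str
    x: float
    y: float
    z: float

def nearest_drone(hand, drones):
    """Return the drone closest to the tracked hand position (x, y, z in metres)."""
    return min(drones, key=lambda d: math.dist(hand, (d.x, d.y, d.z)))

def is_touching(hand, drone, threshold=0.05):
    """Treat a drone as 'touched' when the hand is within threshold metres of it."""
    return math.dist(hand, (drone.x, drone.y, drone.z)) <= threshold
```

With per-frame poses like these, the controller can decide each frame which pixel the user's hand is closest to and whether a midair touch has occurred.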
But that’s not all: it gets even cooler. Since the program that commands the drones knows where each drone is, it can tell when someone has moved a drone around in space. So what can the technology be used for, you ask? Thus far, the team has demonstrated browsing through files by simply swiping drones left and right to reveal their contents. The operator was able to open an architectural drawing, and the ShapeDrones then formed the basic outline of the building in 3D. From there, users can drag drones to adjust the orientation of the building, and even modify parameters of a ShapeDrone using the touchscreen.
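Conceptually, the swipe-to-browse step above amounts to classifying a drone's horizontal trajectory and mapping the result to a file listing. The following sketch shows one way that could work; the displacement threshold and function names are illustrative assumptions, not details from the project.

```python
def detect_swipe(x_positions, min_displacement=0.15):
    """Classify a drone's horizontal trajectory as a swipe gesture.

    x_positions: successive x coordinates (metres) of one drone over a
    short time window. Returns 'right', 'left', or None.
    """
    if len(x_positions) < 2:
        return None
    delta = x_positions[-1] - x_positions[0]
    if delta >= min_displacement:
        return "right"
    if delta <= -min_displacement:
        return "left"
    return None

def browse(files, index, swipe):
    """Advance or rewind the currently shown file in response to a swipe."""
    if swipe == "right":
        index = min(index + 1, len(files) - 1)
    elif swipe == "left":
        index = max(index - 1, 0)
    return index
```

Small jitters in the drone's position fall below the threshold and are ignored, so only a deliberate push registers as a gesture.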
Aside from that, the BitDrone platform can be used for telepresence by letting remote users move around a local space through a Skype-equipped DisplayDrone. In this scenario, the DisplayDrone automatically tracks and replicates all of the remote user’s head movements, giving a remote person the ability to virtually inspect a location and making it easier for the local user to understand the other individual’s actions.
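Replicating head movements like this is essentially a tracking-control problem: each update, steer the drone's orientation toward the remote user's head orientation. A toy sketch of that idea, with an assumed gain and per-update step limit rather than the lab's actual tuning, might look like:

```python
def mirror_head_pose(remote_yaw_deg, drone_yaw_deg, gain=1.0, max_step=10.0):
    """Step the DisplayDrone's yaw toward the remote user's head yaw.

    Clamping each update to max_step degrees keeps the drone's motion
    smooth even if the remote user turns their head abruptly. The gain
    and step limit are hypothetical tuning values.
    """
    error = (remote_yaw_deg - drone_yaw_deg) * gain
    step = max(-max_step, min(max_step, error))
    return drone_yaw_deg + step
```

Run once per tracking update, the drone converges on the remote user's heading without snapping there in a single jump.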
While the platform currently supports only a dozen comparatively large drones measuring 2.5 to 5 inches, the team at the Human Media Lab is working hard to scale BitDrones so that it can support thousands of ‘copters. These future flying machines would measure no more than half an inch across, giving users the ability to render higher-resolution, programmable holograms. More importantly, it opens the door to countless new interactions. Until then, you can check out the project on its official page, or see it all in action below!