This app lets you program objects by drawing lines

Like something out of science fiction, the Reality Editor lets you connect and manipulate the functionality of physical objects. 

Back in 2013, a team from MIT Media Lab’s Fluid Interfaces Group developed a method for spatially-aware, embodied manipulation of actuated objects through augmented reality. The project was an effort to extend a user’s touchscreen interactions into the real world. Earlier this year, the crew released libraries and examples so that others can do the same. With Open Hybrid, you can directly map a digital interface onto a physical thing and program hybrid objects using Arduino and other popular hardware/software environments.


Now, the researchers have taken the project to a whole new level. The Reality Editor is a futuristic tool that empowers you to connect and manipulate the functionality of any gizmo or gadget. Just point your smartphone camera at an item and an overlay of its invisible capabilities will appear on the screen for you to edit. Drag a virtual line from one object to another and form a new relationship between the two.

Although the ultimate goal of the IoT is to make the ordinary objects in our lives smart, most things are still pretty ‘dumb.’ They don’t communicate with one another, and most are capable of only one function. Take a smart bulb, for instance: it can dim and brighten, but it can’t change the channel on your TV. This is where the Fluid Interfaces Group’s app comes in.


The Reality Editor lets you define simple actions, change the functionality of objects around you, and remix how things work and interact. Essentially, the app gives you the power to turn something that is virtual into something that is physical and vice versa. The best part? It’s as easy as connecting dots.

“That light switch in your bedroom you always need to stand up in order to turn off — just point the Reality Editor at an object next to your bed and draw a line to the light. You have just customized your home to serve your convenience,” the team writes. “From now on you will use your spatial coordination and muscle memory to easily operate the object next to your bed as a tool for controlling the light.”


What’s more, you can ‘borrow’ functionalities from one object and use them on another. For example, you could employ your TV’s sleep timer as a way to switch your lights on and off, or even have the air conditioning at your house adjust the temperature when you hop into your car to head home. The possibilities are endless.
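The “connect the dots” model described above can be thought of as an event routing table: each object exposes named outputs and inputs, and drawing a line simply records a link between an output on one object and an input on another. The sketch below illustrates that idea only; all class and method names here are hypothetical and are not the Open Hybrid API.

```python
# Illustrative sketch of the Reality Editor's linking model.
# All names here are hypothetical, not the actual Open Hybrid API.

class HybridObject:
    def __init__(self, name):
        self.name = name
        self.links = {}   # output name -> list of (target object, input name)
        self.inputs = {}  # input name -> handler function

    def on_input(self, input_name, handler):
        """Register a handler for one of this object's inputs."""
        self.inputs[input_name] = handler

    def link(self, output_name, target, input_name):
        """Drawing a line in the editor records a link like this."""
        self.links.setdefault(output_name, []).append((target, input_name))

    def emit(self, output_name, value):
        """When an output fires, forward the value to every linked input."""
        for target, input_name in self.links.get(output_name, []):
            target.inputs[input_name](value)


# Example: 'borrow' the TV's sleep timer to switch a lamp off.
tv = HybridObject("tv")
lamp = HybridObject("lamp")
lamp.on_input("power", lambda on: print(f"lamp power -> {on}"))

tv.link("sleep_timer_expired", lamp, "power")
tv.emit("sleep_timer_expired", False)  # prints "lamp power -> False"
```

The point of the model is that neither object knows about the other in advance; the link itself is the only new piece of state, which is why remixing behaviors is as easy as connecting dots.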

At the moment, the Reality Editor uses QR-like codes to identify smart devices. Scanning a code loads an HTML webpage that overlays a particular object’s functionalities onto the smartphone screen so you can program it. Soon, however, the app will be able to recognize objects simply by looking at them.
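The marker-based identification step amounts to a lookup: a scanned code yields an identifier, which resolves to the URL of that object’s HTML interface. A minimal sketch of that lookup, with entirely hypothetical marker IDs and URLs:

```python
# Hypothetical sketch of marker-to-interface resolution (not the app's code).
# Scanning a QR-like code yields a marker ID; the app then fetches the
# matching object's HTML interface and overlays it on the camera view.

OBJECT_REGISTRY = {
    "marker-1234": "http://lamp.local/interface.html",
    "marker-5678": "http://tv.local/interface.html",
}

def resolve_marker(marker_id):
    """Map a scanned marker ID to the URL of the object's interface page."""
    url = OBJECT_REGISTRY.get(marker_id)
    if url is None:
        raise KeyError(f"unknown object marker: {marker_id}")
    return url

print(resolve_marker("marker-1234"))  # http://lamp.local/interface.html
```

Replacing this table with visual object recognition, as the team plans, would change only how the identifier is obtained; the overlay step stays the same.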

The Reality Editor can be downloaded and used along with the group’s open-source platform Open Hybrid to build a new generation of Hybrid Objects. It isn’t geared solely toward designers and engineers, but also toward Makers and other high-tech enthusiasts. Safe to say, a Minority Report-like future is quickly approaching.