The brainchild of MIT Media Lab’s Fluid Interfaces Group, Open Hybrid is an augmented reality platform for physical computing and the Internet of Things.
The Xerox Star was the first commercially available computer to feature a Graphical User Interface (GUI). Since its debut in 1981, many of the concepts it introduced have remained largely unchanged, especially with regard to how we interact with our digital world: a pointing device for input, some sort of keyboard for commands and a GUI for interaction. However, with so many of today’s physical objects becoming connected to the Internet, Valentin Heun of MIT Media Lab’s Fluid Interfaces Group believes that the GUI has hit its limit when it comes to extending its reach beyond the borders of the screen.
This problem is nothing new, though. Dating back to the days of text-only command lines, interface designers have always been challenged by the imbalance between the countless commands a computer can interpret and the handful a person can hold in their head at any one time.
As Heun points out, physical things have been crafted and shaped by designers over centuries to fit the human body. Because of their shape and appearance, we can access and control them intuitively. So wouldn’t an ideal solution be one in which the digital and physical worlds come together seamlessly? That’s the idea behind what he and his MIT Media Lab collaborators call Open Hybrid. The project enables users to map a digital interface directly onto a physical item, so you would never need to memorize a drop-down menu or app again.
Think about it: using these so-called smart objects isn’t all that easy. Take a smart light bulb, for instance, which might have millions of color options, thousands of brightness settings and various hue-changing patterns to select from. But in order to adjust the light, you first need to take your phone out of your pocket, enter a passcode to unlock it, open an app and search for the bulb within its main menu before finally reaching its controls. A task that once required tapping a wall switch now takes multiple steps. And the more objects one has throughout a home or office, the harder it becomes to find each of them in an app’s drop-down menu.
In an effort to solve this conundrum, Heun has developed the Reality Editor, which offers designers a simple solution for creating connected objects by using web standards and Arduino, in addition to a streamlined way to customize the objects’ behavior with an augmented-reality interface that eliminates complicated, and often unnecessary, steps.
“The amount of apps and drop-down menus in your phone will become so numerous that it will become impossible for you to memorize what app and what menu name is connected with each device. In this case, you might find yourself standing in the kitchen and all you want to do is switch on a light in front of you,” he writes.
These new tangible things are known as Hybrid Objects, as they combine the best characteristics of virtual and physical UIs: a virtual interface for occasionally modifying, connecting and learning about them, and a physical interface for everyday operation. In effect, the system turns the physical world into a transparent window, while the smartphone in your pocket acts as a magnifying glass that can be used to edit reality when necessary.
How it works is pretty straightforward: hold your phone up so the camera points at the object, whether it’s a drone, a lamp, a kitchen appliance, a radio or even an entertainment system, and the app displays a virtual control panel hovering over the item, its settings and other menu options appearing as if by magic.
You’ll also see nodes corresponding to the physical controls the gadget offers, and you can then create interactions between devices by drawing a line from the origin I/O to the destination I/O. And voilà!
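To make the linking idea concrete, here is a minimal sketch in JavaScript of how such node-to-node connections might be modeled. The function and field names are illustrative assumptions, not the actual Open Hybrid API:

```javascript
// Illustrative sketch only; names are hypothetical, not the real Open Hybrid API.

// Each object exposes named I/O nodes whose values are numbers in [0.0, 1.0].
function createObject(name, nodeNames) {
  const nodes = {};
  for (const n of nodeNames) {
    nodes[n] = { value: 0.0, listeners: [] };
  }
  return { name, nodes };
}

// "Drawing a line" from an origin I/O to a destination I/O just means
// forwarding the origin node's value to the destination node.
function link(origin, originNode, destination, destNode) {
  origin.nodes[originNode].listeners.push(
    value => { destination.nodes[destNode].value = value; }
  );
}

// Writing a value to a node notifies every linked destination.
function write(obj, node, value) {
  const n = obj.nodes[node];
  n.value = value;
  n.listeners.forEach(fn => fn(value));
}

// Example: a dial linked to a lamp's brightness.
const dial = createObject('dial', ['position']);
const lamp = createObject('lamp', ['brightness']);
link(dial, 'position', lamp, 'brightness');
write(dial, 'position', 0.75); // lamp's brightness node is now 0.75
```

Because every node carries nothing but a number, any output can be wired to any input without the two objects needing to know anything about each other.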
“Traditionally, you would create some kind of standard that knows every possible representation of the relevant objects so that every interface can be defined. For example, say you have two objects, a toaster and a food processor, and now you would need to create a standard that knows how to connect these two objects.”
With Open Hybrid you have a visual representation of your object’s functionalities augmented onto the physical object. Where before an abstract standard needed to be devised, you can now just visually break down an object into all its components. Using the same example from above, the toaster now consists of a heating element, a setup button, a push slider and a timing rotation dial. Each of these elements is represented by a number between 0.0 and 1.0. This same simple representation applies to the food processor. If you want to connect two things, you are really only pairing the numbers associated with each item, never the objects themselves.
“This is the power of Open Hybrid. Now that the interface allows you to break down every object to its components, you only need to deal with the smallest entity of a message: a number. As such, Open Hybrid is compatible with every Hybrid Object that has been created, and any object that will be built,” Heun adds.
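As a rough sketch of that shared-number idea (the ranges and helper names below are assumptions for illustration, not values from the platform), raw hardware readings can be mapped into the common 0.0–1.0 range and back out on the receiving end:

```javascript
// Illustrative only: normalizing raw hardware values into the shared
// 0.0-1.0 range, so any two nodes become compatible.
function normalize(raw, min, max) {
  return Math.min(1, Math.max(0, (raw - min) / (max - min)));
}

function denormalize(value, min, max) {
  return min + value * (max - min);
}

// Hypothetical example: a toaster's timing dial (0-300 seconds) driving
// a food processor's speed (0-10000 RPM). Only the number travels
// between the two objects.
const dialSeconds = 150;                   // raw dial reading
const v = normalize(dialSeconds, 0, 300);  // 0.5 crosses the link
const rpm = denormalize(v, 0, 10000);      // 5000 on the receiving side
```

This is why any two Hybrid Objects can interoperate: neither side ever sees the other’s units, only a dimensionless value.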
What’s nice is that all of the data about the interfaces and connections is stored on the object itself, and each object communicates directly with handheld devices or with other objects, so there’s never a need for a centralized hub or cloud server.
“Wherever you can run node.js you can run the Hybrid Object platform. We have successfully experimented with MIPS, ARM, x86 and x64 systems on Windows, Linux and OSX,” Heun notes. “If you have the latest head-mounted, projected or holographic interfaces, feel free to compile the code for your platform and share your findings with the community.”
Safe to say, it’s always exciting to see new projects come out of MIT’s Fluid Interfaces Group. While we’ve seen several attempts at bridging the gap between the physical and digital worlds before, this one is certainly among the most distinctive. Intrigued? Head over to Open Hybrid’s detailed page here to learn more, or watch Heun’s recent Solid 2015 presentation below.