Tag Archives: Interactive Display

MIT Media Lab’s morphing table has Atmel under the hood


Tangible Media Group has created a shapeshifting display that lets users interact with digital information in a tangible way. 


As previously shared on Bits & Pieces, MIT Media Lab’s Tangible Media Group has devised a morphing table with several ATmega2560 MCUs under the hood. The installation was recently exhibited at the Cooper-Hewitt Smithsonian Design Museum in New York.


inFORM is described by its creators as a dynamic shape display that can render 3D content physically, so users can interact with digital information in a tangible way. To make that a reality, the table’s surface is made up of 900 individually actuated white polystyrene pins arranged in a 30 x 30 array. The interactive piece can display 3D information in real-time, in a more accurate and interactive manner than the flat renderings of typical computer user interfaces.


This is all accomplished by tasking a Kinect sensor with capturing 3D data, which is then processed on a computer and relayed to the display, enabling the system to remotely manipulate a physical ball. Aside from producing a controlled physical environment for the ball, the pins can detect touch, pressing down and pulling.
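To get a feel for that pipeline, here is a minimal C++ sketch of how a depth frame might be boiled down to a 30 x 30 height map like inFORM’s pin grid. The frame size, depth window and function names are illustrative assumptions, not the team’s actual code.

```cpp
#include <algorithm>
#include <array>
#include <cstdint>

// Illustrative only: reduce a 640x480 Kinect depth frame (values in mm)
// to a 30x30 grid of pin heights within the display's 100mm travel range.
constexpr int kFrameW = 640, kFrameH = 480;
constexpr int kGrid = 30;             // 30 x 30 = 900 pins
constexpr float kNearMm = 800.0f;     // assumed usable depth window
constexpr float kFarMm = 1200.0f;
constexpr float kTravelMm = 100.0f;   // per-pin actuation range

using HeightMap = std::array<float, kGrid * kGrid>;

HeightMap depthToHeights(const uint16_t* depth) {
    HeightMap heights{};
    const int cellW = kFrameW / kGrid, cellH = kFrameH / kGrid;
    for (int gy = 0; gy < kGrid; ++gy) {
        for (int gx = 0; gx < kGrid; ++gx) {
            // Average every valid depth sample that falls in this grid cell.
            float sum = 0.0f;
            int count = 0;
            for (int y = gy * cellH; y < (gy + 1) * cellH; ++y)
                for (int x = gx * cellW; x < (gx + 1) * cellW; ++x)
                    if (uint16_t d = depth[y * kFrameW + x]) {  // 0 = no reading
                        sum += d;
                        ++count;
                    }
            const float avg = count ? sum / count : kFarMm;
            // Nearer surfaces push pins higher; clamp to the travel range.
            const float t = (kFarMm - std::clamp(avg, kNearMm, kFarMm)) / (kFarMm - kNearMm);
            heights[gy * kGrid + gx] = t * kTravelMm;
        }
    }
    return heights;
}
```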


An overhead projector provides visual guidance for the system, and each pin can actuate over 100mm of travel while exerting a force of up to 1.08 Newtons. Actuation is achieved via push-pull rods, which preserve the dense pin arrangement by making the display independent of the size of the actuators beneath it. The table is driven by 150 ATmega2560-based Arduino PCBs arranged in 15 rows of vertical panels, each with 5 x 2 boards. The boards communicate with a PC over five RS485 buses bridged to USB, while graphics are rendered using OpenGL and openFrameworks.
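The team’s wire protocol isn’t spelled out, but the arithmetic is suggestive: 900 pins across 150 boards works out to six pins per board. Below is a hedged sketch of what each board’s firmware could look like; the frame format, baud rate, addresses and setPinTarget() helper are all assumptions for illustration.

```cpp
// Hypothetical firmware for one ATmega2560 panel board on the shared bus.
// Assumed frame: [0xAA][board address][6 height bytes][checksum], where each
// height byte is a pin target in mm (0-100).
const uint8_t MY_ADDR = 7;            // assumed unique address per board
const uint8_t PINS_PER_BOARD = 6;     // 900 pins / 150 boards

void setPinTarget(uint8_t pin, uint8_t heightMm) {
    // Placeholder for the real closed-loop actuator control.
}

void setup() {
    Serial1.begin(115200);            // UART bridged onto the RS485 bus (assumed rate)
}

void loop() {
    // Wait for a full 9-byte frame: start + address + 6 heights + checksum.
    if (Serial1.available() < 3 + PINS_PER_BOARD) return;
    if (Serial1.read() != 0xAA) return;        // resync on the start byte
    uint8_t addr = Serial1.read();
    uint8_t heights[PINS_PER_BOARD];
    uint8_t sum = addr;
    for (uint8_t i = 0; i < PINS_PER_BOARD; ++i) {
        heights[i] = Serial1.read();
        sum += heights[i];
    }
    uint8_t checksum = Serial1.read();
    if (addr == MY_ADDR && checksum == sum) {  // ignore other boards' frames
        for (uint8_t i = 0; i < PINS_PER_BOARD; ++i)
            setPinTarget(i, heights[i]);
    }
}
```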

“One area we are working on is Geospatial data, such as maps, GIS, terrain models and architectural models. Urban planners and architects can view 3D designs physically and better understand, share and discuss their designs,” the team writes. “Cross sections through Volumetric Data such as medical imaging CT scans can be viewed in 3D physically and interacted with. We would like to explore medical or surgical simulations. We are also very intrigued by the possibilities of remotely manipulating objects on the table.”


Its creators are hoping to spark collaborations with everyone from urban planners and architects to designers, modelers, doctors and surgeons. The display could serve as an alternative to 3D printing low-resolution prototypes, as well as a way to render 3D data, ranging from construction plans to CT scans, that users can interact with by physically molding the pins.

Interested? A detailed paper on the project can be found here.

“Sounds of NYC” is an interactive map powered by Arduino

A collaboration between Los Angeles-based Sonos Studio and Stockholm creative agency Perfect Fools has developed a giant, interactive Lite-Brite-like display that plays music. The moving map of the New York metropolitan area, aptly dubbed “Sounds of NYC,” was recently on display during a weeklong event at NeueHouse near Madison Square Park.

The gigantic exhibit comprised 300 Sonos Play:1 speaker shells and four Sonos Sub subwoofers, while the morphing map of sound used 180 of the Play:1 shells, each equipped with LEDs to create the display’s colored lighting effects. Driven by motors, the shells could move toward or away from whoever was interacting with the wall, while the color of each could change to create images.

Writing for Wired, Tim Moynihan notes that those light-up speakers acted purely as “pixels,” as they did not actually produce sound. To handle the audio, 120 black Play:1 units sat on either side of the display.


While the hardware was assembled by VolvoxLabs, the location-based music playlist was curated by Wolf+Lamb. A second playlist of ambient noises recorded at the actual locations featured in the map was created by Big Noble.

(Source: Verge)


When a viewer stood before the wall, the interactive display greeted them with a “YO!” The wall would then morph into a map of the five New York City boroughs. Using a Microsoft Kinect camera as an input device, viewers could select different “areas” on the map via hand gestures, which would elicit music or ambient sounds characteristic of that particular neighborhood.
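The selection logic hasn’t been published, but conceptually it boils down to snapping a tracked hand position onto a cell of the shell grid. Here is a minimal C++ sketch under that assumption; the grid dimensions and the dwell-to-select rule are both invented for illustration.

```cpp
#include <cmath>

// Assumed wall layout; the real grid of shells isn't specified.
constexpr int kCols = 20, kRows = 15;

struct Cell { int col; int row; };

// Snap a hand position, normalized to 0..1 across the interaction zone,
// to the nearest cell on the shell grid.
Cell handToCell(float nx, float ny) {
    nx = std::fmin(std::fmax(nx, 0.0f), 1.0f);
    ny = std::fmin(std::fmax(ny, 0.0f), 1.0f);
    return { static_cast<int>(nx * (kCols - 1) + 0.5f),
             static_cast<int>(ny * (kRows - 1) + 0.5f) };
}

// Invented selection rule: holding the hand over one cell for ~1s selects it.
bool dwellSelect(Cell current, Cell& last, float& dwellSeconds, float dt) {
    if (current.col == last.col && current.row == last.row) {
        dwellSeconds += dt;
        if (dwellSeconds >= 1.0f) { dwellSeconds = 0.0f; return true; }
    } else {
        last = current;
        dwellSeconds = 0.0f;
    }
    return false;
}
```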


As the visitor navigated the map, white pixels would mark the selectable locations; once a location was selected, it would pop out of the wall thanks to the [Atmel based] Arduino boards that served as the brains of the project. A woman’s voice announced the chosen location and artist, and the system would continue on to play a song tied to the area, ranging from soundbites of Manhattan’s Metropolitan Opera House to Brooklyn-born hip-hop artists Jay-Z and Nas.
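The firmware behind that pop-out effect isn’t documented either. The following is a hedged Arduino sketch of what one shell’s board might do, with pin numbers, command bytes and limit-switch wiring all assumed.

```cpp
// Hypothetical sketch for one motorized shell: a one-byte serial command
// ('O' = out, 'I' = in) drives the motor until an end stop is hit.
const int MOTOR_FWD = 5, MOTOR_REV = 6;   // H-bridge inputs (assumed pins)
const int LIMIT_OUT = 2, LIMIT_IN = 3;    // end-stop switches (assumed pins)

void driveUntil(int motorPin, int limitPin) {
    digitalWrite(motorPin, HIGH);
    while (digitalRead(limitPin) == HIGH) {}  // switch pulls low when reached
    digitalWrite(motorPin, LOW);
}

void setup() {
    Serial.begin(9600);
    pinMode(MOTOR_FWD, OUTPUT);
    pinMode(MOTOR_REV, OUTPUT);
    pinMode(LIMIT_OUT, INPUT_PULLUP);
    pinMode(LIMIT_IN, INPUT_PULLUP);
}

void loop() {
    if (!Serial.available()) return;
    switch (Serial.read()) {
        case 'O': driveUntil(MOTOR_FWD, LIMIT_OUT); break;  // pop the shell out
        case 'I': driveUntil(MOTOR_REV, LIMIT_IN);  break;  // pull it back in
    }
}
```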


“All the songs are streamed from Google Play,” Brad Wolf, Senior Director of Brand Innovation at Sonos, tells Wired. “The back-end program is built using Flash, which controls the movement and the light of the units while also grabbing the songs from Google Play.”

Each of the playlists, which are rotated regularly to keep things fresh, was initially planned out using Google Maps, with one or more song suggestions for each location. Though Sonos Studio NYC has come to an end, its creators say that the “Sounds of NYC” hardware will continue on with new light shows and sounds adapted to its next location.

“The beauty of the installation is that it is a flexible canvas,” Wolf concludes. “It can musically bring any city or region to life — whether that’s LA, London, New Orleans, or Beijing.”