Tag Archives: NYU ITP

This robotic hand will swipe left or right on Tinder for you


The True Love Tinder Robot will “find you love, guaranteed.”


Are you an active user of popular social media dating apps? Have you made some poor decisions lately? Well, fear no more. Nicole He, a graduate student at NYU’s Interactive Telecommunications Program, has developed a robot that reads your body’s reaction as you browse through Tinder profiles, and then swipes right or left based on your skin’s response. In fact, she promises the bot will “find you love, guaranteed” merely by reading the change in your galvanic skin response over a period of time. (Meaning, how sweaty your palms get.)


As simple as today’s apps make finding a potential suitor, if weighing age, location and looks still requires too much thought, the True Love Tinder Robot can be your perfect wingman. The system itself is powered by an Arduino and includes a pair of servos to move the hand, some LEDs, a text-to-speech module, a bunch of wires, a speaker and a couple of sheets of metal that act as a skin sensor. There is also an indentation for your palms.

With Tinder open, you put your smartphone down in front of the rubber hand. Once you’ve placed your hands down on the sensors, a robotic voice (inspired by the villain GLaDOS from Portal 2) guides you through the process and questions your feelings. As you look at each profile, the True Love Tinder Robot will read your true heart’s desire through the sensors and decide whether or not you are a good match with that person based on how your body reacts.

For instance, it’ll ask things such as “Do you see yourself spending the rest of your life with this person?” If it determines that you’re attracted to that person, it will swipe right. If not, it will swipe left. Throughout the process, it will make commentary on your involuntary decisions. Although galvanic skin response may not be the most precise measurement, it is often used by Scientologists for spiritual auditing and by law enforcement as part of polygraph tests.


The first prototype of the bot actually attempted to incorporate facial recognition, but that approach was later swapped out for galvanic skin response. The idea behind GSR is pretty straightforward: when you see or experience something stimulating, your sweat glands react, producing an electrodermal response. As your skin gets a little wetter, it becomes more conductive to electricity. GSR measures that physiological feedback through skin conductance.
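The swipe decision can be boiled down to comparing a resting baseline against readings taken while viewing a profile. He’s actual Arduino code is on GitHub; the snippet below is just a minimal Python sketch of the idea, with the sample values and the 15% threshold chosen purely for illustration.

```python
import statistics

def decide_swipe(baseline, samples, threshold=1.15):
    """Decide a swipe from galvanic skin response readings.

    baseline: resting skin-conductance readings (arbitrary ADC units)
    samples:  readings taken while the user views a profile
    Sweatier palms mean wetter, more conductive skin, so a clear rise
    above the resting level counts as attraction.
    """
    rest = statistics.mean(baseline)
    viewing = statistics.mean(samples)
    return "right" if viewing > rest * threshold else "left"

# Calm while viewing a profile -> swipe left
print(decide_swipe([510, 512, 508], [515, 514, 516]))  # left
# Noticeably sweatier palms -> swipe right
print(decide_swipe([510, 512, 508], [620, 640, 655]))  # right
```

In the real installation the same comparison would drive the servos that physically move the rubber hand across the phone.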

“In a time when it’s very normal for couples to meet online, we trust that algorithms on dating sites can find us suitable partners. Simultaneously, we use consumer biometric devices to tell us what’s going on with our bodies and what we should do to be healthy and happy. Maybe it’s not a stretch to consider what happens when we combine these things,” He explains.

The premise is that a computer may actually know you better than you know yourself, so why not let it pick you a date? While chances are the installation may not choose your future hubby or wifey, it’s still a pretty nifty project nevertheless.

“I want this project to be sort of amusing, kind of creepy and slightly embarrassing. I want the user to feel a tension between the robot assuring you that it knows best and not being sure whether or not to trust it. I want the user to question whether or not we should let a computer make intimate decisions for us,” He writes.

He has provided a detailed overview of the project and has made it entirely open source with all of its code available on GitHub.

 

This LED t-shirt visualizes your body movements


Digi-Weirdo is a wearable project that explores clothing as both an expression of identity and a channel for communication.


Zhen Liu just loves data. So much so that it has inspired several innovations, namely her latest project, Digi-Weirdo. Created as part of a class assignment at NYU’s Interactive Telecommunications Program, the interactive t-shirt was designed to give clothing roles beyond merely covering your body and establishing your personal identity. Instead, the Maker hopes that one day such garments can convey real-time emotions and enhance communication between people by visualizing body movements through an LED matrix.


Built around an Adafruit FLORA MCU (ATmega32U4), the t-shirt is fitted with a battery for power as well as an accelerometer for analyzing body motions and translating them into a series of illuminated patterns. The LED matrix is embedded inside an inverted triangle that is sewn on the front of the shirt.
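The core mapping is simple: accelerometer readings become a frame of on/off LEDs. The project’s actual FLORA code isn’t shown here; this is a hypothetical Python sketch of one way tilt could pick which cell of a small matrix to light, with the matrix size and mapping chosen for illustration.

```python
def motion_to_pattern(ax, ay, rows=5, cols=5):
    """Map accelerometer tilt (-1..1 g on each axis) to a lit cell
    on a small LED matrix, returned as a grid of 0/1 values.
    A garment like Digi-Weirdo would push this frame to its LEDs;
    here we just build and return it.
    """
    clamp = lambda v: max(-1.0, min(1.0, v))
    # Scale each axis from [-1, 1] to a matrix coordinate
    col = round((clamp(ax) + 1) / 2 * (cols - 1))
    row = round((clamp(ay) + 1) / 2 * (rows - 1))
    return [[1 if (r == row and c == col) else 0 for c in range(cols)]
            for r in range(rows)]

frame = motion_to_pattern(0.0, 0.0)  # level shirt -> center LED lit
print(frame[2])  # [0, 0, 1, 0, 0]
```

Swapping in different mappings (trails, ripples, brightness ramps) is how a wearer could build the “visual language” Adafruit describes.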


As Adafruit puts it, what may be most interesting about this project is that through some simple programming, a wearer can create a visual language of their own. See it in action below!

This machine reveals moon phases based on inputted dates


A Maker duo has devised a project that lets moon phases become both tangible and poetic. 


The moon has phases because it orbits the Earth, which causes the portion we see illuminated to change. And while the moon actually takes 27.3 days to complete an orbit, the lunar phase cycle is 29.5 days. As a way to better visualize new, quarter and full moons, Makers Yingjie Bei and Yifan Hu at NYU ITP have developed an interactive installation that they call Moon Phases. The aptly-dubbed device, which resembles an old-school turntable, lets users simply input a date and see its corresponding moon phase — from the northern hemisphere’s perspective.


“The idea started from my very first Processing sketch, which is a 2D drawing of moon phases. From there, I started to expand and approach it from different perspectives. The moon phases machine is the ultimate work throughout the whole journey,” Bei writes.

The project’s structure was inspired by the orrery, a mechanical model of the solar system that predicts the relative positions and motions of the planets, as well as by the simplicity of changing numbers on a thermostat. The result not only provides viewers with a new way to experience new, quarter, full and even crescent moons, but does so in a more tangible and poetic manner. Stories about the selected phase are simultaneously displayed on the beautifully-crafted machine’s built-in screen.


How it works is relatively simple. A user selects a date — whether it’s their birthday, a historical event or even hundreds of years into the future — by turning three knobs, representing the year, month and day, respectively. An Arduino Mega (ATmega2560) embedded inside the device works together with a Processing sketch to calculate and identify the correct phase. From there, the Arduino controls a servo located beneath the machine to rotate the turntable and accurately position the light, which is projected onto the mini cement moon.
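The phase calculation itself can be approximated by counting days since a known new moon, modulo the 29.5-day synodic month mentioned above. The duo’s Processing code may do this differently; the following Python sketch just illustrates the arithmetic, with the phase boundaries rounded for simplicity.

```python
from datetime import date

SYNODIC_MONTH = 29.530588          # days in one lunar phase cycle
KNOWN_NEW_MOON = date(2000, 1, 6)  # a well-documented new moon

def moon_phase(d):
    """Approximate the moon's phase on a given date by its 'age':
    days elapsed since a reference new moon, modulo the synodic
    month. Accurate only to within a day or so.
    """
    age = (d - KNOWN_NEW_MOON).days % SYNODIC_MONTH
    if age < 1.85 or age > 27.68:
        return "new"
    if age < 12.91:
        return "waxing"  # crescent through gibbous
    if age < 16.61:
        return "full"
    return "waning"

print(moon_phase(date(2000, 1, 21)))  # ~15 days after new moon -> full
```

Once a phase (or age) is known, positioning the light is just a matter of mapping that age onto the servo’s rotation of the turntable.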

Intrigued? You can find a detailed breakdown of the build here, and see it in action below.

This installation makes it rain data from the cloud


Where is the cloud? What does it look like? And, what exactly is the big data that we store there?


Let’s face it, not a day goes by that you don’t hear words like “big data” and “the cloud.” These ambient terms have become embedded in our modern-day vocabulary, but in many ways these buzzphrases still remain distant and abstract. Jingwen Zhu, a Master’s student at NYU ITP, started asking herself those questions upon hearing various lecturers discuss how big data is affecting our lives. In an exploration of what big data in the cloud would actually look like if it were tangible, the Maker crafted her vision as an interactive data exhibition. The aptly dubbed Big Data Cloud obtains data from users, and gives data back to them.


“In this installation, people are not only encouraged to interact with the cloud, but also to interact with the data,” Zhu explains. “In our daily life, we interact with big data every day. We provide our data to the cloud, and get data back from it. Yet this repeated occurrence falls into the background because we use big data so often that it goes unnoticed. By creating the Big Data Cloud, I provide people with a visible and tangible experience of interacting with big data, and let them rethink how big data affects our lives.”


How it works is simple: When a user comes under the cloud, a mobile device drops down from the cloud with a question displayed on the screen. Once the user types an answer to the question, the phone “uploads” the information back into the cloud. After some thunder and lightning, the cloud begins to “rain” just as it would in a summer night’s storm. However, the big data rain takes the form of a printed roll of paper with the users’ answers to the question. What’s more, the most frequently repeated words are also projected as “puddles” on the ground. Users can play with the projected raindrops, or read all the answers on the receipt.

In order to make this concept a reality, the Maker designed a 3D polygonal cloud comprised of folded paper to enclose her device and suspended it from the ceiling. Embedded within the paper cloud are a stepper motor connected to an ATmega328 board, a projector and a thermal printer.


The stepper motor reels the phone up and down from the cloud on a piece of fishing wire, which enables the phone to be drawn back up after each answer. A Processing sketch using a Temboo Google Spreadsheets Choreo acquires the data the user entered, allowing the newly-acquired answers to be both projected onto the ground at the user’s feet and printed from the thermal printer. In Processing, Zhu was able to create the cloud’s visual effect and count word frequency, arranging and displaying the terms in different sizes. Meanwhile, an ultrasonic sensor within the printer detects when someone puts their hand above it, triggering the machine to print, with the amount of content decided by the distance.
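The word-frequency step behind the projected “puddles” amounts to counting words across all answers and scaling a display size by each count. Zhu’s Processing sketch isn’t reproduced here; this Python snippet is a minimal illustration, with the base size and step values made up for the example.

```python
from collections import Counter

def word_sizes(answers, base=12, step=8):
    """Count word frequency across users' answers and assign each
    word a display size, so more frequent words would be projected
    as larger 'puddles'. Sizes are arbitrary point values.
    """
    words = [w.lower().strip(".,!?") for a in answers for w in a.split()]
    counts = Counter(words)
    return {w: base + step * (n - 1) for w, n in counts.items()}

sizes = word_sizes(["rain and data", "big data rain", "data everywhere"])
print(sizes["data"])  # appears 3 times -> 12 + 8*2 = 28
```

The resulting word-to-size mapping is exactly what a projection layer needs to draw each term at the user’s feet.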


When all is said and done, this impressive project is a great physical representation of how we send and retrieve data from “the cloud.” Interested in learning more? You can learn all about the project as well as access a step-by-step breakdown of the build here.

Maker turns hip-hop lyrics into grillz

Within hip-hop culture, “grillz” are a type of jewelry worn over the teeth. Typically comprised of metals like gold or platinum, they have become symbolic of a wearer’s assumed over-the-top wealth, as exhibited in a number of music videos and on-stage performances from Nelly to Lil Wayne. While musicians in the early ‘80s began to sport them, it wasn’t until the mid-2000s that they became mainstream due to the rise of Southern hip-hop and its coinciding cultural status.


Though advancements in 3D printing technology have yielded new creations from organs and prosthetics to cars and pancakes, one Maker has decided to extend that spectrum of things to include printed grillz. New York-based artist Roopa Vasudevan has taken lyrics from famous rappers — such as Notorious B.I.G., Jay Z, Puff Daddy (P.Diddy or whatever he goes by these days), and Rick Ross — and algorithmically analyzed them for mentions of money and income to create a set of 3D-printed, polished gold steel “grillz.”


The lyrics were processed through Pygenius in Python to create 3D shapes. With the assistance of the Geomerative and Modelbuilder libraries, these 3D models were then finalized in polished gold steel.


“Mentions in each category are scored according to relative distance from words of the opposite polarity, and the resulting landscape formed is extruded into a 3D shape and printed as wearable grills: jewelry designed to fit over one’s teeth, and which have become inextricably linked to hip-hop culture over the years as a symbol of over-the-top, ostentatious wealth,” Vasudevan writes.
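The scoring Vasudevan describes — ranking each mention by its distance from words of the opposite polarity — can be sketched in a few lines. The word lists and the fallback score below are purely illustrative, not the artist’s actual lexicon or algorithm.

```python
def score_mentions(tokens, wealth, poverty):
    """Score each wealth-related word in a lyric by its distance
    (in words) to the nearest poverty-related word, echoing the
    'distance from opposite polarity' idea. Returns a map of
    token position -> (word, distance).
    """
    opp_positions = [i for i, t in enumerate(tokens) if t in poverty]
    scores = {}
    for i, t in enumerate(tokens):
        if t in wealth:
            # No opposite-polarity word at all -> maximal distance
            dist = (min(abs(i - j) for j in opp_positions)
                    if opp_positions else len(tokens))
            scores[i] = (t, dist)
    return scores

tokens = "started broke now we count money and gold".split()
print(score_mentions(tokens, {"money", "gold"}, {"broke"}))
# {5: ('money', 4), 7: ('gold', 6)}
```

A score landscape like this is what would then be extruded into the 3D profile of each grill.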


Vasudevan’s incredibly innovative project has drawn quite a bit of attention at recent exhibitions, ranging from Brooklyn’s Dumbo Arts Festival to the upcoming MAKE: Wearables on the Runway Show. The Maker is an alum and adjunct at NYU’s Interactive Telecommunications Program (ITP), which has produced such innovations as the Smart Hoodie.

… Don’t forget to pair these grillz with some LED stunna shades!
