Tag Archives: Processing

Lazy Pen combines word processing with the personal touch of handwriting

This project distorts your typeface as you write using moving palettes placed beneath your palms.

With the advent of digital technology, it’s safe to say that cursive writing has become a lost art — no longer used by most adults and rarely taught to children. That’s a shame, as there’s just something about the emotional aspect of putting pen to paper.


After receiving several notes from his grandmother, Maker Nicolas Nahornyj decided to combine the practical side of computer-based word processing with the personal touch of a good ol’ handwritten letter. To accomplish this, he developed a keyboard extension that allowed him to modulate his writing and produce his own typography in real-time.

Created at ECAL, the aptly named Lazy Pen enables him to distort the typeface as he writes using a set of moving palettes placed beneath his palms that transform “vertical and horizontal motions into kneecap movements.” The project consists of two parts: a removable recessed block for the keyboard, and a desk with two trestles and a drawer for his MacBook. Meanwhile, a pair of joysticks salvaged from a remote-control plane accurately logs the data.


“When you move your finger left or right on this keyboard, all the keys move like the Ondes Martenot music instrument,” the Maker explains.

In order to define the basic makeup of each letter, he devised a Processing application that allowed him to manipulate and modify their shapes. He then connected the app to an Arduino board tasked with collating and converting the raw analog data from the joysticks into digital information that could be interpreted by the “Adobe Illustrator-like software.”
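Nahornyj’s actual firmware isn’t published, but the analog-to-digital step the Arduino performs can be sketched roughly as follows. The function name, dead zone and offset range below are illustrative assumptions, with Python standing in for the real code:

```python
def joystick_to_offset(raw, dead_zone=8, max_offset=20.0):
    """Map a 10-bit ADC reading (0-1023) from one joystick axis to a
    signed glyph-distortion offset, ignoring small tremors near rest."""
    centered = raw - 512              # center the reading on zero
    if abs(centered) < dead_zone:     # dead zone: palms at rest
        return 0.0
    return max_offset * centered / 512.0

# A full push to one side yields the maximum distortion:
joystick_to_offset(0)  # -20.0
```

Each palette axis would feed one such offset into the letter-shaping application, so resting hands leave the typeface untouched while deliberate movements bend it.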

Intrigued? Check out the Maker’s project page here, or see it in action below!

This LED installation mimics the movements of fireflies

This 2,000-plus LED installation reacts to the movement of its visitors, placing them inside a colorful 3D environment.

Austrian arts collective Neon Golden recently created an immersive light installation designed to mimic the movements of fireflies. The project, aptly named SWARM, consists of over 2,000 LEDs suspended at various heights from an overhead metal grid and arranged in a series of 40 modules throughout a dark room.


The lights use motion-sensing technology, controlled by a Raspberry Pi and an Arduino running Processing, to replicate the motion of lightning bugs. The hanging LEDs change position horizontally in response to the movements of nearby visitors. The team also employed Cinema 4D to generate SWARM’s advanced 3D effects.

“Through the movement of the visitors within the installation the LEDs are lightening up and the static, chaotic structure transforms into a vibrant, three-dimensional swarm one can visually but also acoustically experience,” Neon Golden explains.


According to its creators, SWARM is adaptable to meet different space requirements, as the configuration of light modules can be adjusted to fit smaller or larger areas. The piece made its debut at the Olympus Photography Playground in Vienna in February 2015.


You can see it for yourself in the video below as dancer Máté Czakó makes his way through the luminescent creatures, revealing the LEDs’ reactivity.

[h/t Dezeen]

This cloud-like installation reacts to sound with light

This interactive project uses light and sound to mimic the unpredictability of lightning. 

If you’re a frequent visitor to the Bits & Pieces blog, then you’re no stranger to suspended, interactive installations meant to resemble naturally occurring events, such as weather patterns. Take designer Richard Clarkson’s aptly named Cloud, for instance. This responsive, Arduino-driven lamp and speaker system acts as a semi-immersive experience mimicking a thunderstorm. Similarly, New York design studio SOFTlab’s latest work, Cumulus, is centered around a cell-like structure that emulates the unpredictability of lightning by reacting to the footsteps and conversations of its viewers.


“Much like sponge the structure relies on redundancies and connections that cannot be achieved from a grid, giving it a soft cloud like shape. The irregularity of the network like structure imbues the piece with a playful personality as it reacts in unpredictable ways to environmental sound,” Cumulus’ creators reveal.

These reactions are achieved through a series of physical and digital systems working in unison. The structure itself is composed of over 200 acrylic segments held together by 100-plus 3D-printed joints, with nearly 230 feet of individually addressable LED strands embedded inside. The storm-inspired behaviors were programmed in Processing: whenever ambient sound is detected, signals are relayed to the LED segments via three Atmel-based Arduino boards, transforming the piece into a visual barometer for the mood of the room and its occupants.


“Each time sound in the space reaches a certain volume the piece activates. The most interesting behavior was similar to our initial intent, lightning. This behavior seeks a path through one of the next connecting segments. The duration of the path is dictated by the volume of the sound that activates it. This simple algorithm creates a wide range of effects from long lightning like strands created through sporadic low frequency sounds to a staticky chatter when people are talking underneath it,” the SOFTlab crew adds.
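SOFTlab hasn’t released its Processing source, but the path-seeking behavior the team describes can be approximated in a few lines. The segment graph, threshold and scaling below are illustrative assumptions, with Python standing in for the actual code:

```python
import random

def lightning_path(graph, start, volume, threshold=0.3, scale=10):
    """Trace a random path through connected LED segments; louder
    sounds (volume in 0..1) produce longer paths, quieter ones none."""
    if volume < threshold:
        return []
    steps = max(1, int(volume * scale))
    path, node = [start], start
    for _ in range(steps):
        options = [n for n in graph[node] if n not in path]
        if not options:
            break
        node = random.choice(options)
        path.append(node)
    return path

# A tiny 4-segment network: each key lists its neighboring segments
segments = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
```

A sporadic loud sound would light a long strand through the network, while quiet chatter stays below the threshold or triggers only short, static-like flickers.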

The project was on display in the RAB showroom in Chelsea, New York through July 3rd. You can watch the cloud in action below.

This device visualizes breathing data using soap bubbles

As a study into the quantified self, this Maker duo set out to raise awareness around subconscious breathing habits.

Philips Design Labs recently commissioned the help of two Dutch design students as part of the company’s exploration into the quantified self movement. Makers Amy Whittle and Willem Kempers developed a project that focused on the physiological act of respiration required to sustain life. Their goal was to raise awareness around our subconscious breathing habits by making data more accessible and easier to understand.


Breathing is something most of us rarely think about. It is, however, something we can control, which comes in handy when trying to effectively manage stress and anxiety.

“As anyone can testify, taking a deep breath before a nerve-racking experience can calm that anxiety,” the Makers explain. “But to what further extent can controlled breathing benefit the human body?”

To accomplish this feat, the duo devised a suspended installation that represented a pair of lungs and interacted with measured breathing data. Temperature sensors were placed inside their noses, and real-time information was acquired as they engaged in various daily activities, such as cooking and meditating.


The sensors were able to detect the fluctuation in temperature between the air being inhaled and exhaled. This data was then relayed to a fan tasked with blowing air into a giant soap bubble. Why a bubble, you ask? Like human lungs, bubbles expand and contract, allowing the breathing patterns to be easily visualized. The bubbles were highlighted using integrated LEDs, while the machine itself was controlled by an Atmel-based Arduino paired with the Processing programming language, along with an Adafruit Motor Shield and a few stepper motors.
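The published description stops at the hardware, but the inhale/exhale detection those temperature sensors enable can be sketched with a simple hysteresis rule. The threshold and readings below are illustrative, with Python standing in for the actual code:

```python
def classify_breath(temps, hysteresis=0.2):
    """Turn successive nasal-temperature readings (deg C) into breath
    events: exhaled air warms the sensor, inhaled air cools it."""
    events, prev = [], temps[0]
    for t in temps[1:]:
        if t - prev > hysteresis:     # sensor warming: breathing out
            events.append("exhale")
        elif prev - t > hysteresis:   # sensor cooling: breathing in
            events.append("inhale")
        prev = t
    return events

classify_breath([30.0, 31.2, 30.1])  # ['exhale', 'inhale']
```

Each detected exhale could then drive the fan that inflates the bubble, with the hysteresis band filtering out sensor noise between breaths.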

This machine reveals moon phases based on inputted dates

A Maker duo has devised a project that lets moon phases become both tangible and poetic. 

The moon has phases because it orbits Earth, which causes the portion we see illuminated to change. And while the moon actually takes 27.3 days to complete an orbit, the lunar phase cycle is 29.5 days. As a way to better visualize new, quarter and full moons, Makers Yingjie Bei and Yifan Hu at NYU’s ITP Program have developed an interactive installation that they call Moon Phases. The aptly-dubbed device, which resembles an old-school turntable, lets users simply input a date and see its corresponding moon phase — from the northern hemisphere’s perspective.


“The idea started from my very first processing sketch which is a 2D drawing for moon phases. From there, I started to expand and approach it from different perspective. The moon phases machine is the ultimate work through out the whole journey,” Bei writes.

The project’s structure was inspired by the orrery, a mechanical model of the solar system that predicts the relative positions and motions of the planets, as well as by the simplicity of changing numbers on a thermostat. This not only provides viewers with a new way to experience new, quarter, full and even crescent moons, but does so in a more tangible and poetic manner. Stories about the selected phase are simultaneously displayed on the beautifully crafted machine’s built-in screen.


How it works is relatively simple. A user selects a date — whether it’s their birthday, a historical event or even hundreds of years into the future — by turning three knobs representing the year, month and day, respectively. An Arduino Mega (ATmega2560) embedded inside the device works with Processing to calculate and identify the correct phase. From there, the Arduino controls a servo located beneath the machine to rotate the turntable and accurately position the light, which is projected onto the mini cement moon.
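The pair’s exact phase arithmetic isn’t documented, but the standard calculation is short: count the days since a known new moon and reduce modulo the synodic month. A sketch of that approach (the epoch below is the new moon of January 6, 2000; Python stands in for the actual code):

```python
from datetime import datetime

SYNODIC = 29.530588853                    # mean lunation length, days
NEW_MOON = datetime(2000, 1, 6, 18, 14)   # a known new moon (UTC)

def moon_age(date):
    """Days elapsed in the current lunar cycle (0 = new moon)."""
    days = (date - NEW_MOON).total_seconds() / 86400.0
    return days % SYNODIC

def phase_name(date):
    """Bucket the lunar age into one of the eight traditional phases."""
    names = ["new", "waxing crescent", "first quarter", "waxing gibbous",
             "full", "waning gibbous", "last quarter", "waning crescent"]
    return names[int(moon_age(date) / SYNODIC * 8 + 0.5) % 8]

phase_name(datetime(2000, 1, 21))  # 'full'
```

The computed age can then be mapped directly to a servo angle, so the turntable rotates the light to the matching position over the cement moon.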

Intrigued? You can find a detailed breakdown of the build here, and see it in action below.

Controlling the behavior of light with a physics simulator

This installation is like a rollercoaster of lights.

Created by Madrid creative studio Espadaysantacruz, Light Kinetics is an interactive installation where light behaves as matter under the laws of mechanics.


The project is made up of 78 tungsten bulbs connected to a rack of 20 four-channel DMX dimmers, all of which are controlled by a physics simulator built with Unity3D. An Atmel-based Arduino captures the impulses, while Processing interfaces between Unity3D and the dimmers.


How it works is relatively simple. A piezoelectric sensor situated in the first bulb captures the force of a tap, generating a light particle that moves along the loop. The initial impulse is regulated by the strength of the tap, creating a very natural interaction. Adding more strength behind the touch causes the light to move faster and can overcome the force of gravity. Conversely, when one presses the bulb softly, light falls slowly along the loop.
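Espadaysantacruz built their simulator in Unity3D, but the core mechanic (an impulse fighting gravity around a closed loop) reduces to a few lines of numerical integration. Apart from the 78-bulb count, all constants below are illustrative, with Python standing in for the actual simulator:

```python
import math

def simulate(impulse, n_bulbs=78, gravity=9.8, dt=0.02, steps=400):
    """Integrate a light 'particle' around a vertical loop of bulbs.
    Position is in bulb units; gravity slows the particle as it climbs
    toward the top of the loop and speeds it up on the way down."""
    pos, vel = 0.0, impulse
    lit = []
    for _ in range(steps):
        angle = 2 * math.pi * pos / n_bulbs    # where on the loop we are
        vel -= gravity * math.sin(angle) * dt  # tangential pull of gravity
        pos = (pos + vel * dt) % n_bulbs
        lit.append(int(pos))                   # which bulb to light now
    return lit
```

A soft tap (small impulse) leaves the particle oscillating among the bulbs near the bottom of the loop, while a hard tap carries it over the top and around the full circle, which matches the behavior described above.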


“Light, as we usually see it, is an element that lacks mass, to treat it under the laws of gravity is somehow magical. The laws that describe the behaviour of light are hardly understandable because it neither behaves as body or as a wave,” the studio writes. “As Einstein wrote concerning the wave-particle duality: ‘We have two contradictory pictures of reality; separately neither of them fully explains the phenomena of light, but together they do.’ In this project, we have built a computer simulator that reduces this extraordinary phenomenon to the simple classical mechanical laws.”

Interested in learning more? Head over to the project’s official page. In the meantime, you can watch it in action below.

Exploring digital fabrication through an Arduino-based foam structure

This student assignment explores the relationship between digital design and material-based fabrication.

A team of students from the Media and Design Laboratory (LDM) at EPFL / SINLAB recently designed a project exploring the relationship between digital design and material-based fabrication: an n-hedron structure of soap that is blown, with a mixture of air and helium, into a foam structure.


Inspired by direct observations of nature’s beauty in the form of thin films, the installation — dubbed Transient Materialization — aspired to take architecture beyond the creation of static forms and into the design of dynamic, transformable and ephemeral material experimental processes.


According to its creators, the project questions a structure’s materiality by examining its physical performance and ephemeral characteristics. To do this, the group presented various configurations of dynamic and transformable foam structures. The fabrication was driven by an algorithm controlling the mixture of air and helium (regulated by pneumatic valves) along with additive chemical substances and thickening agents. The machine itself was powered by an Atmel-based Arduino and Processing. Watch it in action below.

Interested in learning more? You can read all about the project on its official page here.

Homebrewing a DIY pulse monitor

A 15-year-old Maker by the name of Angelo has designed a homebrew pulse monitor using an Atmel-based Arduino board, a grippy clothes hanger, a clear/bright red LED and a light-dependent resistor (LDR).

The project — which can be found on Instructables — was inspired by MAKE Magazine’s homemade pulse monitor.

“Movies look cool with those EKG (electrocardiogram), the one that beeps and detects heart activities. A few months ago, we had to shoot a hospital scene for our school project. We needed an EKG instrument,” Angelo explains.

“To keep the movie authentic, we didn’t want to fake the readings so we made the next best thing, a pulse monitor. This project works and can actually monitor your pulse. [However], due to the lack of research and experimentation, the homebrew pulse monitor cannot be used for medical purposes.”

Have a friend or foe who continuously tells fibs? Good news! According to Angelo, the homebrew device can even be used as a rudimentary lie detector.

“When a person lies, you’ll notice a sudden change on the [pulse] graph,” he said.

On the software side, Angelo employs Processing 2 for graphing, along with a specially coded Arduino IDE sketch. Both are required to run the homebrew project.
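The sketch streams raw LDR readings to Processing over serial for graphing; a beat detector on top of such a stream can be sketched with a threshold-plus-refractory rule. The numbers below are made up for illustration and are not taken from Angelo’s code, with Python standing in for the Arduino/Processing pair:

```python
def count_beats(samples, threshold=550, refractory=10):
    """Count heartbeats in a stream of LDR readings: each beat is a
    rising crossing of the threshold, followed by a refractory period
    so a single pulse isn't counted twice."""
    beats, cooldown, prev = 0, 0, samples[0]
    for s in samples[1:]:
        if cooldown > 0:
            cooldown -= 1                  # still inside the last beat
        elif prev < threshold <= s:        # rising edge: new beat
            beats += 1
            cooldown = refractory
        prev = s
    return beats

# Two simulated pulses in an otherwise flat signal:
signal = [500] * 5 + [600] * 3 + [500] * 15 + [600] * 3 + [500] * 5
count_beats(signal)  # 2
```

Dividing the beat count by the sampling window then gives a rough beats-per-minute figure for the on-screen graph.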

Interested in learning more? You can check out the project’s official page here.

Video: Tangible Orchestra plays for the masses

Tangible Orchestra – which was recently featured on the official Arduino blog – combines electronic and classical music in a three-dimensional space.

Designed by Rebecca Gischel and Sebastian Walter, the installation is equipped with 112 ultrasonic sensors controlled by a single Atmel-based Arduino Mega (ATmega1280 MCU).

“Human interaction within Tangible Orchestra is made possible by 16 ultrasonic sensors on the inside of each cylinder, granting a 360 degree field of view. The sensors are run by one integrated microprocessor per cylinder, evaluating and comparing the readings of all sensors making very accurate assessments,” Gischel and Walter explained.

“To avoid interference between ultra sonic waves of different cylinders, the microprocessors run consecutively rather than simultaneously. All microprocessors are controlled, assessed and coordinated by one Arduino Mega.”

On the software side, Processing is used to communicate with Arduino and the microprocessors in each cylinder.

“It is programmed to coordinate the microprocessors, so that their sensors cast their rays consecutively as with 112 ultrasonic sensors operating at the same time, there would be a substantial risk of interference and acoustic shadow misreading. It also assesses the data coming from Arduino and, after verification, generates the output,” the duo continued.

“If a person [is] detected within the bubble of a cylinder, Processing receives the digital information as an input from Arduino and stops muting the respective instrument which then joins into the melody. Processing also reads the values of each instrumental track to calculate the digital signals for the LEDs and controls the LED stripes inside of the cylinder.”
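The consecutive-firing scheme described above is easy to picture as a time-slot table: with 16 sensors on each of the seven cylinders (112 in total), every ping gets its own start time so no two ultrasonic bursts are ever in the air at once. A rough sketch, where the 6 ms burst length is an assumed figure and Python stands in for the actual Processing code:

```python
def fire_schedule(n_cylinders=7, sensors_per=16, burst_ms=6):
    """Give every sensor its own start time within one sweep, cycling
    through the cylinders consecutively rather than simultaneously."""
    slots, t = [], 0
    for c in range(n_cylinders):
        for s in range(sensors_per):
            slots.append((c, s, t))   # (cylinder, sensor, start time)
            t += burst_ms
    return slots

schedule = fire_schedule()
# 112 slots; a full sweep of every sensor takes 112 * 6 = 672 ms
```

Staggering the pings this way trades a slightly slower sweep for readings free of cross-talk and acoustic-shadow misreadings.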

According to Gischel and Walter, each instrument is played by a separate speaker located in the base of each cylinder – with multiple sound outputs enabled via several external sound cards paired with the minim library by Damien Di Fede.

“When an instrument plays, the beats of the audible track are analyzed and consequently values are calculated to create an equalizer-like light beam,” the two concluded.

“The outcome is transferred via Arduino to a transformer, which converts the 5V Arduino signal into a 230V output operating 192 LEDs per cylinder. Another transformer converts 5V Arduino signals into 12V output powering LED stripes inside of each cylinder as soon as they are activated.”

Interested in learning more? You can check out the project’s official page here.

This robot was once an antique vacuum cleaner

Successfully maintaining a public FabLab, MakerSpace or HackSpace can be an expensive endeavor, so donations are almost always appreciated.

The GarageLab, a small FabLab in the German city of Düsseldorf, decided to encourage donations from its patrons by replacing a small plastic frog with the aptly named “Donation Robot,” which the team meticulously fashioned out of an antique Miele vacuum cleaner.

Key project components include:


  • Atmel-based Arduino Uno (ATmega328)
  • Standard Processing and standard libraries
  • VLSI VS1000 audio module (+ custom firmware)
  • HC-SR04 distance sensor
  • Four LED stripes (two RGB on the backside)
  • Six power LEDs for the top
  • Servo for moving the top, servo for moving the bill-mouth
  • Three distance sensors for bill and coin detection
  • Switch for muting the audio module
  • Reset button

“The work took about one year to construct, print and integrate all 3D-printed parts, wiring and software development with the Arduino Uno,” Holgar Prang told the official Arduino blog.

“Software development was the minor part, although parallel processing on the Arduino in order to run every component simultaneously required a small trick.”
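Prang doesn’t reveal what the trick was, but the standard way to run several components “in parallel” on a single Arduino is the non-blocking millis() pattern, where each task fires when its own interval elapses instead of the loop calling a blocking delay(). A minimal Python rendering of the idea (the task names and intervals are illustrative):

```python
class Task:
    """Cooperative task: fires its action whenever its interval has
    elapsed, mirroring Arduino's millis()-based scheduling pattern."""
    def __init__(self, interval_ms, action):
        self.interval = interval_ms
        self.action = action
        self.last = None          # time of the last firing, if any

    def tick(self, now_ms):
        if self.last is None or now_ms - self.last >= self.interval:
            self.last = now_ms
            self.action()

# Blink LEDs and poll the distance sensor at independent rates,
# with no blocking delay() anywhere in the loop:
ticks = []
blink = Task(100, lambda: ticks.append("blink"))
poll = Task(30, lambda: ticks.append("poll"))
for now in range(0, 200, 10):     # simulated clock, 10 ms steps
    blink.tick(now)
    poll.tick(now)
```

In the real sketch, each of the components listed above (servos, sensors, LED stripes, audio module) would get its own task and interval, all serviced from a single fast loop.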

Interested in learning more? You can check out the project’s official page here.