Steven Kemper studied music composition and computer technology at the University of Virginia. Unsurprisingly, he has long been fascinated by robotic instruments that can be programmed to play music, respond to human musicians, and even improvise.
So Kemper, along with colleagues Scott Barton and Troy Rogers, went on to found Expressive Machines Musical Instruments (EMMI) and design the Poly-tangent Automatic (multi)Monochord, also known as “PAM.”
As TechNewsWorld’s Vivian Wagner notes, the stringed instrument’s pitches are controlled by tangents – the equivalent of fingers – each of which is driven by a solenoid. Messages are sent from a computer via USB to an [Atmel-powered] Arduino board, which switches the solenoids on and off.
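To picture how such a setup might work, here is a minimal Arduino-style sketch in which the host computer sends one-byte commands over USB serial and each command engages or releases one tangent solenoid. The pin assignments and message format are assumptions made for illustration, not EMMI’s actual protocol.

```cpp
// Minimal sketch: host computer -> USB serial -> Arduino -> tangent solenoids.
// Pin numbers and the one-byte message format below are hypothetical.

const int NUM_TANGENTS = 4;                            // assumed number of tangents
const int solenoidPins[NUM_TANGENTS] = {2, 3, 4, 5};   // hypothetical pin mapping

void setup() {
  Serial.begin(9600);                                  // USB serial link to the host
  for (int i = 0; i < NUM_TANGENTS; i++) {
    pinMode(solenoidPins[i], OUTPUT);
    digitalWrite(solenoidPins[i], LOW);                // all tangents released at start
  }
}

void loop() {
  if (Serial.available() > 0) {
    int msg = Serial.read();                           // one byte per command
    int tangent = msg & 0x0F;                          // low nibble: which tangent
    bool engage = msg & 0x10;                          // one bit: press or release
    if (tangent < NUM_TANGENTS) {
      digitalWrite(solenoidPins[tangent], engage ? HIGH : LOW);
    }
  }
}
```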
PAM is also capable of receiving data from musical and gestural input devices – such as a MIDI keyboard, joystick or mouse – or from environmental sensors, allowing the platform to improvise its own music based on the programmer’s parameters and instructions.
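A similarly hedged sketch of the sensor-driven side: here a hypothetical analog sensor reading is mapped onto a programmer-defined range of tangents, a rough illustration of improvising within set parameters rather than EMMI’s own algorithm.

```cpp
// Hypothetical example: an analog sensor on pin A0 chooses which tangent to
// fire, constrained to a range the programmer allows. Purely illustrative.

const int NUM_TANGENTS = 4;
const int solenoidPins[NUM_TANGENTS] = {2, 3, 4, 5};   // hypothetical pins
const int SENSOR_PIN = A0;                             // assumed sensor input
const int lowestAllowed = 1;                           // programmer-set limits on
const int highestAllowed = 3;                          // which tangents may be chosen

void setup() {
  for (int i = 0; i < NUM_TANGENTS; i++) {
    pinMode(solenoidPins[i], OUTPUT);
  }
}

void loop() {
  int reading = analogRead(SENSOR_PIN);                // raw value, 0-1023
  int tangent = map(reading, 0, 1023, lowestAllowed, highestAllowed);
  digitalWrite(solenoidPins[tangent], HIGH);           // press the chosen tangent
  delay(150);
  digitalWrite(solenoidPins[tangent], LOW);            // release it
  delay(150);
}
```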
“These instruments are not superior to human performers,” Kemper, now an assistant professor of music technology at Rutgers University, told TechNewsWorld. “They just provide some different possibilities.”
In addition to PAM, EMMI has created a variety of instruments, all of which can be programmed to play in multiple genres and settings.
“These instruments can improvise based on structures we determine or by listening to what performers are playing,” Kemper added. “We work with the free improv aesthetic and [our instruments] don’t fit into a particular musical genre. It’s improvising based on any decisions the performers make.”