Tag Archives: MIT

SensorTape is a sensor network in the form factor of masking tape


Sensor deployment made as simple as cutting and attaching strips of tape.


Developed by students from MIT Media Lab’s Responsive Environments group, SensorTape is a sensor network in the form factor of masking tape. Inspired by the emergence of modular platforms throughout the Maker community, it consists of interconnected and programmable sensor nodes on a flexible electronics substrate. In other words, it’s pretty much a roll of circuits that can be cut, rejoined and affixed to various surfaces.


And what’s even cooler is that it’s a completely self-aware network, capable of feeling itself bend and twist. It can automatically determine the location of each of its nodes and the length of the tape, as it is cut and reattached.

As the neighboring nodes talk to one another, they can use their information to assemble an accurate, real-time 3D model of their assumed shape. Tapes with different sensors can also be connected for mixed functionality.

SensorTape’s architecture is made up of daisy-chained slave nodes and a master. The master is concerned with coordinating the communication and shuttling data to a computer, while each slave node features an ATmega328P, three on-board sensors (an ambient light sensor, an accelerometer and a time-of-flight distance sensor), two voltage regulators and LEDs. The master contains the same AVR MCU, as well as a serial-to-USB converter and a Bluetooth transceiver. The tape can be clipped to the master without soldering using a flexible circuit connector.


In terms of communication protocol, the team chose a combination of I²C and peer-to-peer serial. Whereas I²C handles most of the data transmission between the master and slaves, addresses are assigned dynamically over peer-to-peer serial. This yields a 100 kHz I²C transfer rate with an initialization sequence that accommodates chains of various lengths, up to 128 units long. (For testing, the MIT Media Lab crew developed a 2.3-meter prototype with 66 sensor nodes.)
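To make the enumeration scheme concrete, here is a minimal sketch of how a daisy-chained node might claim an I²C address from its upstream neighbor over serial and hand the next address downstream. This is an illustrative guess, not the team’s firmware: the baud rate, sensor readout and single shared UART are all assumptions (the real tape presumably uses separate upstream and downstream links).

```cpp
// Hypothetical SensorTape-style enumeration: each node waits for an
// address byte from upstream, claims it, forwards address + 1, then
// joins the shared I2C bus as a slave at the claimed address.
#include <Wire.h>

uint8_t myAddress = 0;

void sendSensorData() {
  // Reply to the master's I2C poll; a real node would report its
  // accelerometer, light and distance readings here.
  Wire.write((uint8_t)(analogRead(A0) >> 2));
}

void setup() {
  Serial.begin(115200);          // assumed point-to-point serial link
  while (Serial.available() == 0) { }
  myAddress = Serial.read();     // address assigned by upstream node
  Serial.write(myAddress + 1);   // tell the next node its address
  Wire.begin(myAddress);         // join the I2C bus at that address
  Wire.onRequest(sendSensorData);
}

void loop() { }
```

Because addresses are handed out at power-up rather than baked in, the same firmware would keep working however the strip is cut and rejoined: whichever node ends up first in the chain simply receives the first address.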

Aside from its hardware, SensorTape has black lines that indicate where it’s okay to cut and break the circuits using a pair of scissors. Cuts can be made either in a straight line or on a diagonal, which allows you to piece together the tape into 2D shapes just as you would when forming a picture frame.

Although still in its infancy, sample use cases of SensorTape include everything from posture-monitoring wearables to inventory tracking to home activity sensing. What’s more, the team has created an intuitive graphical interface for programming the futuristic tape, and it’s all Arduino-friendly, so Makers will surely love getting their hands on it and letting their imaginations run wild. You can read all about the project in the MIT group’s paper, as well as on Fast Company.

Will drones become the furniture of tomorrow?


L’evolved is a project that turns everyday objects into “flying smart agents.”


If it’s up to two researchers from MIT Media Lab’s Fluid Interfaces Group, the furniture of tomorrow will fly, react and respond to your everyday needs. In their latest project, Harshit Agrawal and Sang-Won Leigh are exploring how to transform once ordinary objects into “flying smart agents.”


For starters, L’evolved features a drone that acts as a floating desk capable of switching positions, changing heights and flying along as you move. It will even auto-eject if you try to use the wrong pen while completing an assignment or filling out paperwork, and leaves when you’re all done (or in need of a break after working too hard).

The MIT duo has also developed a smart lamp drone that hovers above you to let you read in the dark. By tracking and following its user, the gadget can impressively adapt to different places and postures. What’s more, it can help remotely locate a misplaced book with only a press of a button.

“We’re exploring a future where objects become more humanized, rather than becoming dumber or a dehumanized element of our existence. We want to see more of this inter-relational reaction between humans and objects so that they’re not just being subordinated by our orders,” Leigh recently told Motherboard. “If you think about it it’s really magical, it’s like the world that you imagine in the Harry Potter novels, where everything can fly and come to you.”


L’evolved consists of two parts: a ground control tower for tracking and fixing the drone’s position and an IR motion capture system. A camera helps keep tabs on everything in the room, including the user and the drone, which receives commands from the computer via Wi-Fi. PID control enables the flying agent to move towards a goal position and provides additional stability. Meanwhile, power is fed through a wall socket, though admittedly this is one aspect of the project that the Makers are looking to improve.
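To sketch what that PID loop might look like in code: below is a minimal, per-axis controller of the kind the ground station could run on the motion-capture error signal. The gains, loop rate and output handling are illustrative assumptions, not details from the project.

```cpp
// Minimal PID controller sketch: one instance per axis (x, y, z),
// fed the position error from the motion-capture system and
// producing a thrust/attitude correction. Gains are illustrative.
struct Pid {
  float kp, ki, kd;
  float integral = 0.0f;
  float prevError = 0.0f;

  float update(float error, float dt) {
    integral += error * dt;                    // accumulate steady-state error
    float derivative = (error - prevError) / dt; // damp fast changes
    prevError = error;
    return kp * error + ki * integral + kd * derivative;
  }
};

int main() {
  Pid altitude{1.2f, 0.05f, 0.4f};  // assumed gains
  float error = 0.5f;               // drone is 0.5 m below its goal
  float command = altitude.update(error, 0.02f); // assumed 50 Hz loop
  (void)command; // would be sent to the drone over Wi-Fi
}
```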

Agrawal reveals to Motherboard that in the future, the team hopes to optimize steadiness by replacing a hovering desk with one that parks in front of users whenever it’s needed and then clears itself off when the user has finished the task at hand.

“On the technological side, we hear a lot about a dystopian future — drones always monitoring you and taking away people’s jobs. But, in an equally possible future, we seek a more desirable synergy between man and machine,” the Makers conclude. “L’evolved objects don’t entirely change the way we go about daily tasks: desks are still desks, lamps are still lamps. They don’t substitute or subordinate human activities.”

Intrigued? Head over to L’evolved’s official page to learn more, and see it in action below!

This LED map tracks the MBTA in real-time


Maker uses an Arduino, Raspberry Pi and LEDs to create a real-time map that keeps tabs on Boston’s trains.


Inspired by his love for making and public transit, MIT student Ian Reynolds has built an MBTA map into the wall of his fraternity room to show real-time locations of vehicles using bright LEDs.


The Maker employed a few meters of NeoPixels, driven by an Arduino Uno (ATmega328) that takes orders from a Python script running on a Raspberry Pi lying on his floor. The colors of the LEDs were chosen to match those of each transit line (red line, blue line, green line, orange line, etc.). Every 10 to 15 seconds, the system pulls data from the MBTA’s API, which in turn causes the respective lights to flash based on the trains’ approximate GPS locations throughout Boston.


“It maps those to some LEDs, decides which ones actually need to be changed, and then sends that information to the Arduino, which does the bit pushing,” Reynolds explains. “In addition, I’m writing a tiny web app that lets me change visualizations and adjust the brightness for when I need to sleep.”
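The Arduino’s share of that pipeline can be tiny. Here’s a hedged sketch of the “bit pushing” side, assuming the Pi streams simple four-byte frames (LED index plus R, G, B) over USB serial; the data pin, strip length and frame format are guesses rather than details from Reynolds’ write-up.

```cpp
// Hypothetical Arduino side of the map: read [index, R, G, B]
// frames from the Pi and push them to the NeoPixel strip.
#include <Adafruit_NeoPixel.h>

const int DATA_PIN = 6;    // assumed strip data pin
const int NUM_LEDS = 120;  // assumed strip length
Adafruit_NeoPixel strip(NUM_LEDS, DATA_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  Serial.begin(115200);
  strip.begin();
  strip.show(); // start with all pixels off
}

void loop() {
  if (Serial.available() >= 4) {
    uint8_t idx = Serial.read();
    uint8_t r = Serial.read(), g = Serial.read(), b = Serial.read();
    if (idx < NUM_LEDS) strip.setPixelColor(idx, strip.Color(r, g, b));
    strip.show();
  }
  // A real protocol would add a framing byte so the stream can
  // resynchronize after a dropped byte.
}
```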

Intrigued? The Maker has put together an elaborate blog post that breaks down his entire project, from the hardware to the headaches. You can also get a glimpse of it all below!

MIT researchers have created a 3D printer for molten glass


Think of G3DP as the next generation of glassblowing. 


Remember the days when 3D printers were only capable of using plastic filament? Well, the times have changed. Chocolate, ceramics, metal, living tissue — these are just some of the materials now being spit out to make an assortment of things, from the practical to the absurd. Next on that ever-growing list? Glass, thanks to a team of researchers at MIT’s Mediated Matter Group.


That’s because the group has developed an unbelievable 3D printer that can print glass objects. The device, called G3DP, consists of two heated chambers. The upper chamber is a crucible kiln that operates at a temperature of around 1900°F and funnels the molten material through an alumina-zircon-silica nozzle, while the bottom chamber works to anneal the structures.

The machine doesn’t create glass from scratch, but instead works with the preexisting substance, layering and building out beautifully-constructed geometric shapes according to designs drawn up in a 3D CAD program. This printing method shares many of the same principles as fused deposition modeling (FDM), which is commonly employed by most 3D printers today; the difference is that it operates at much higher temperatures and uses molten glass as the medium, as opposed to plastic filament.

How does it all work, you ask? The glass is first melted at an extremely high temperature over a period of roughly four hours. For another two hours, it undergoes a fining process, in which helium may be introduced to the molten material to enlarge and carry small bubbles to the surface, eliminating them. During this stage, the extruder has to be kept cool so that the glass doesn’t begin flowing. Once fining is complete, the crucible and nozzle are set to temperatures of 1904°F and 1850°F, respectively, and the extrusion process begins. The G3DP is controlled by three independent stepper motors, along with an Arduino (presumably ATmega2560 based) and a RAMPS 1.4 shield.
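For a flavor of that motion-control stack, here is a bare-bones sketch that jogs a single axis through a RAMPS 1.4 shield, using the shield’s standard X-axis pin mapping. It’s a toy example under those assumptions, not the G3DP firmware, which would coordinate all three axes from sliced CAD paths.

```cpp
// Jog one axis on a RAMPS 1.4 stack using its standard
// X-axis STEP/DIR/ENABLE pin assignments.
const int X_STEP = 54, X_DIR = 55, X_EN = 38;

void setup() {
  pinMode(X_STEP, OUTPUT);
  pinMode(X_DIR, OUTPUT);
  pinMode(X_EN, OUTPUT);
  digitalWrite(X_EN, LOW);   // enable the stepper driver (active low)
  digitalWrite(X_DIR, HIGH); // choose travel direction
}

void loop() {
  // One step pulse; the period sets the axis speed.
  digitalWrite(X_STEP, HIGH);
  delayMicroseconds(500);
  digitalWrite(X_STEP, LOW);
  delayMicroseconds(500);
}
```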


At this time, the researchers have used G3DP to craft things like vases, prisms, and other small decorations, some of which will be on display at the Cooper Hewitt, Smithsonian Design Museum next year.

“Two trends in additive manufacturing highlight the value we expect from additive manufacturing of molten glass. First, the freedom that this process provides in terms of the forms that can be created in glass,” its creators explain. “Second, bespoke creation of glass objects provides the opportunity for complex scaffolds, fluidics and labware custom made for individual applications. Moving forward, the simultaneous development of the printer and the design of the printed glass objects will yield both a higher performance system and increasingly complex novel objects.”

As impressive as this may sound, it’s even more mesmerizing to watch it in action. It will surely be interesting to see how the G3DP will influence art, architecture and product design in the future. Intrigued? You can read the team’s entire paper here.

[Images: MIT’s Mediated Matter Group]

NailO turns your thumb into a mini wireless trackpad


This wearable input device from MIT’s Media Lab takes the form of a nail art sticker.


You’ve been there before: Your arms are full and the phone rings. You put everything down only to find out that it was a telemarketer. Or, while in the middle of preparing dinner, you need to scroll down the recipe page on your tablet. With your hands a mess, you first have to wipe them off before proceeding with the instructions. Fortunately, situations like these may be a thing of the past thanks to a new project from MIT Media Lab. Led by Cindy Hsin-Liu Kao, a team of researchers has developed a new wearable device, called NailO, that turns a user’s thumbnail into a miniature wireless trackpad.


Resembling one of those stick-on nail accessories, NailO works as a shrunken-down trackpad that connects to a mobile device. This enables a wearer to perform various functions on a paired phone or PC through different gestures. And for the fashion-conscious, its creators envision a future with detachable decorative top membranes that are completely customizable to better coordinate with a wearer’s individual style.

Along with its use in hands-full activities like cooking or doing repairs, another potential application for the quarter-sized trackpad includes discreetly sending a quick text message in settings where whipping out a smartphone would be rude. After all, running a finger over a thumbnail is a natural occurrence, so a majority of folks would hardly notice this as a deliberate action to control a gadget.

“Fingernails are an easily accessible location, so they have great potential to serve as an additional input surface for mobile and wearable devices.”


Crammed within the small package of the NailO lie a LiPo battery, a matrix of sensing electrodes, a Bluetooth Low Energy module, a capacitive-sensing controller, and an ATmega328 MCU. With an average current draw of 4.86 mA, the device can wirelessly transmit data for at least two hours — an ample amount of time for those in a meeting, in class, in a movie theater, or working around the house.

In order to get started, wearers must first power it up by maintaining finger contact with it for two or three seconds. From there, users can move their index finger up-and-down or left-and-right across its surface, guiding the cursor on the synced device. To select something onscreen, simply press down as you would on a mouse button or touchscreen.
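A rough sketch of that hold-to-wake logic appears below, assuming the capacitive controller’s output has been reduced to a simple finger-present reading; the analog pin, threshold and two-second window are illustrative assumptions, not NailO’s actual firmware.

```cpp
// Hypothetical NailO-style activation: wake the trackpad only after
// ~2 s of continuous finger contact, so brushing the nail doesn't
// trigger accidental input.
const unsigned long HOLD_MS = 2000;
unsigned long contactStart = 0;
bool active = false;

bool fingerPresent() {
  // Stand-in for the capacitive-sensing controller's readout.
  return analogRead(A0) > 512; // assumed threshold
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (fingerPresent()) {
    if (contactStart == 0) contactStart = millis();
    if (!active && millis() - contactStart >= HOLD_MS) {
      active = true;
      Serial.println("trackpad active"); // would start streaming over BLE
    }
  } else {
    contactStart = 0; // contact broken; restart the hold timer
  }
}
```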

“As the site for a wearable input device, however, the thumbnail has other advantages: It’s a hard surface with no nerve endings, so a device affixed to it wouldn’t impair movement or cause discomfort. And it’s easily accessed by the other fingers — even when the user is holding something in his or her hand,” the team writes.


For their initial prototype, the researchers built their sensors by printing copper electrodes on sheets of flexible polyester, which allowed them to experiment with a range of different electrode layouts. But in future experiments, the team notes that they will be using off-the-shelf sheets of electrodes like those found in some trackpads.

The Media Lab crew has also been in discussion with several Shenzhen-based battery manufacturers and has identified a technology that they think could yield a battery that fits in the space of a thumbnail — yet is only 0.5mm thick. To further shrink the device toward the size of a nail art sticker, the team also worked with flexible PCB factories on a slimmer, bendable prototype that could conform to the curvature of a fingernail.

We’ll have to go out on a limb and say it: looks like this project ’nailed’ it! Want to learn more? Head over to the project’s official page here, as well as read MIT Technology Review’s latest piece on finger-mounted input devices.

12 projects that are redefining storytelling


In honor of World Book Day, here are some Maker innovations that are redefining storytelling…


They say stories can come to life, and well, these projects have taken that saying to an entirely new level.

This isn’t your typical coffee table book


Jonathan Zufi’s coffee table book entitled “ICONIC: A Photographic Tribute to Apple Innovation” is the ultimate must-have for any Apple aficionado. The hardcover recounts the past 30 years of Apple design, exploring some of the most visually appealing and significant products ever created by the Cupertino-based company. The commemorative piece features a special white clamshell case along with a custom PCB configured to pulse embedded LEDs when moved — like those of a sleeping older-generation Apple notebook — all controlled by an Atmel 8-bit AVR RISC-based MCU.

This magical device will add augmented reality to storybooks 


The brainchild of Disney Research, HideOut explores how mobile projectors can enable new forms of interaction with digital content projected on everyday objects such as books, walls, game boards, tables, and many others. The smartphone-sized device enables seamless interaction between the digital and physical world using specially formulated infrared-absorbing markers – hidden from the human eye, but visible to a camera embedded in a compact mobile projection device. Digital imagery directly augments and responds to the physical objects it is projected on, such as an animated character interacting with printed graphics in a storybook.

This interactive piece of art tells a narrative


Created by Fabio Lattanzi Antinori, Dataflags is a narrative series of artworks that explores the financial troubles of corporations as they head towards bankruptcy, while highlighting the pivotal role data plays in today’s society. The piece — which was originally displayed in London’s Victoria & Albert Museum back in September 2014 — was powered by Bare Conductive’s incredibly popular Touch Board (ATmega32U4) and some Electric Paint. The printed sensors were concealed by a layer of black ink and, when touched, triggered a selection of financial trading data theatrically sung by an opera performer.

This book judges you with its cover


Have you ever judged a book by its cover? Well, Amsterdam creative studio Moore is turning the tables on the old-school idiom by designing a sleeve equipped with an integrated camera and facial-recognition technology that scans the face of whoever comes near. The idea behind the aptly named Cover That Judges You was to build a hi-tech book cover that is nonetheless human and approachable. If someone conveys too much emotion — whether overexcitement or under-enthusiasm — the book will remain locked. However, if their expression is free of judgement, the system will send an audio pulse to an Arduino Uno (ATmega328) and the book will unlock itself.

The built-in camera is positioned at the top of the book’s sleeve, above a screen that feeds back the image when it detects a face in close proximity. Artwork featuring abstract facial features is displayed on the cover so that the user can line up their eyes, nose and mouth in the optimum position. Once the correct alignment is obtained, the screen turns green and a signal is relayed to the Arduino that opens the metal lock.
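The unlock path lends itself to a very small sketch. The version below assumes the laptop signals approval with an audio pulse into an analog pin and that a servo throws the metal lock; the pin assignments, threshold and the servo itself are guesses on our part, not Moore’s published design.

```cpp
// Hypothetical unlock logic: watch an analog input for the approval
// pulse, then swing a servo-driven lock open for a while.
#include <Servo.h>

const int AUDIO_PIN = A0;
const int THRESHOLD = 600; // assumed pulse amplitude
Servo lockServo;

void setup() {
  lockServo.attach(9); // assumed servo pin
  lockServo.write(0);  // locked position
}

void loop() {
  if (analogRead(AUDIO_PIN) > THRESHOLD) {
    lockServo.write(90); // release the lock
    delay(10000);        // stay open for ten seconds
    lockServo.write(0);  // re-lock
  }
}
```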

This interactive book lets you feel characters’ emotions


A team of MIT students unveiled a wearable book that uses networked sensors and actuators to create a sort of cyberpunk-like Neverending Story, blurring the line between the bodies of a reader and protagonist. The sensory fiction project — which is built around James Tiptree Jr.’s “The Girl Who Was Plugged In” — was designed by Felix Heibeck, Alexis Hope, Julie Legault and Sophia Brueckner in the context of MIT’s Science Fiction To Science Fabrication class. The “augmented book” portrays the scenery and sets the mood, while its companion vest enables the reader to experience the protagonist’s physiological emotions like never before. The wearable — controlled by an [Atmel based] Arduino board — swells, contracts, vibrates, heats up or cools down as the pages of the book are turned. Aside from 150 programmable LEDs that create ambient light based on the changing setting and mood, the book/wearable combo supports a number of outputs, including sound, a personal heating device to change skin temperature, vibration to influence heart rate, and a compression system to convey tightness or loosening through pressurized airbags.

This storytelling tree reads with you


In an effort to bring more interaction to story time, the Northwoods Children’s Museum in Wisconsin created a storytelling tree capable of reading along with you. The old computers inside the museum display were retrofitted with a Touch Board (ATmega32U4) from Bare Conductive. In fact, this was a welcomed replacement, as one staff member said that the computers “broke constantly and hogged power, keeping us from updating sounds files periodically throughout the year.” Unlike its embedded predecessor, the MCU allowed sound files to be changed in an expedited manner, and was slim enough to nestle neatly into the trunk’s design. And what would a treehouse-like exhibit be without a makeshift walkie-talkie comprised of cans strung together? Creatively, a set of headphones was also placed inside the can to make listening to the story more exciting for participants.

This book blends the analog and digital worlds


Makers Israel Diaz and Ingrid Ocana were on a mission to find new ways to bring children closer to the vast universe of reading. In doing so, the duo figured out a way to enhance a traditional book with basic electronic components and some Arduino Uno (ATmega328) programming, allowing it to respond to a reader’s intervention through simple built-in sensors, AC motors, LEDs and speakers.

This tale is told with the turn of a music box handle


Night Sun is an interactive audiovisual installation that tells a story with the turn of a music box handle, powered by an ATmega32U4 MCU. In order to bring his idea to fruition, the Maker employed an Arduino Micro to control the exhibit. The Arduino was instructed to send a ‘play’ command to a computer when it sensed the touch of a passerby. Once the wired music box handle was turned, the window would light up and a pre-recorded sound would begin playing… and just like that, the story unfolds.

This pop-up book is made for the digital age 


A Maker by the name of Antonella Nonnis recently devised a unique interactive electronic book powered by two ATmega168 based boards. The book, titled “Music, Math, Art and Science,” was inspired by the work of Munari, Montessori and Antonella’s very own mother. The book contains movable parts and uses the electrical capacitance of the human body to activate sounds and lights, along with other sensors such as a button for the math page. Comprised of recycled materials, the book is powered by a pair of Arduino Diecimila boards: one controls the paper pop-up piano, while the other handles the arts and science pages.

These soft puppets are recreating fables for kids and parents


Footprints — which was prototyped using an Arduino Uno (ATmega328) — can best be described as a network of interactive soft puppets that help create and share illustrated stories. Designed by Simone Capano, the project links various aspects of a child’s life, including school and family, by collecting and storing relevant data in the cloud. Footprints is typically initiated by a parent. Using a smartphone, the parent can record a little vocal story, add some images proposed by Footprints about the story that was just told (like the story’s characters or other objects related to it), and then send it all to the child’s puppet. The child can then listen to the story by placing the puppet on the tablet and playing with the images he or she has received to create a drawing about the story. Once the drawing is complete, Footprints sends it back to the parent, who can then track the path of the stories shared with the child via the smartphone app.

This book really sets the scene


Created by Bertrand Lanthiez, Hvísl is described as “an invitation to both a visual and audible journey.” Pre-recorded sounds from Icelandic atmospheres are emitted with the help of electronic sensors hidden in some pages connected to a MaKey MaKey board (ATmega32U4). These effects accompany the reading and the contemplation of pictures from the country’s landscape.

This bookmark makes sure you never miss a part


Tired of having to reread pages because you forgot which paragraph you left off on? Devised by 7Electrons, the aptly named eBookmark is envisioned to serve as a bridge between the analog and digital worlds. The device — which is based on an 8-bit AVR MCU, various Adafruit components, 16 tiny LEDs and a resistive touch strip — allows the reader to save his or her place on the page and, with a switch, also select the left or right page. The top portion of the eBookmark extends for use with larger books.

This fiction machine lets you create your own narrative


Who could forget those ‘Choose Your Own Adventure’ books that became popular in the ‘80s and ‘90s? In that series of children’s gamebooks, each story is written from a second-person point of view, with the reader assuming the role of the protagonist and making choices that determine the main character’s actions and the plot’s outcome. In a similar vein, software developer Jerry Belich has created an interactive arcade machine that works on the same premise. The Choosatron is an interactive fiction machine that lets users select the story, while it prints out a transcript of the chosen story paths. In essence, the machine is a cardboard box with a small thermal printer, a coin acceptor, a keypad, an SD memory card and an Arduino-compatible board.

Arduino in research and biotech


Arduino’s acceptance into the biotech research community is evident from its increasing mentions in high-profile science and engineering journals. Mentions of Arduino in these journals alone have gone from zero to more than 150 in just the last two years.


While it may be best known as a staple for hobbyists, Makers, and hackers who build on their own time, Arduino and Atmel have a strong and rapidly growing following among professional engineers and researchers.

For biotech researchers like myself, experimental setups often require highly specific instruments with strict design rules for parameters such as timing, temperature, motion, force/pressure, and light. Such specific instruments would be time-consuming and expensive to have custom built, as the desired experimental conditions often change as we investigate different samples, cell types, etc. Here, Atmel chips and Arduino boards find a nice niche for making your own affordable, custom setups that are repeatable, precise, and automated. Arduino and Atmel provide microcontrollers in a myriad of form factors, I/O options, and connectivity that are available from a number of vendors. Meanwhile, freeware Arduino code and hardware drivers are also available with many sensors and actuators to go with your board. Best of all, Arduino is designed for a wide audience and range of experiences, making it easy to use for a variety of projects and complexities. So as experimental conditions or goals change, your hardware can easily be re-purposed and re-programmed according to specifications.

Arduino’s acceptance into the biotech research community is evident from its increasing mentions in high-profile journals in science and engineering, including Nature Methods, Proceedings of the National Academy of Sciences, Lab on a Chip, Cell, Analytical Chemistry, and the Public Library of Science (PLOS). Mentions of Arduino in these journals alone have gone from zero to more than 150 in just the last two years.

In recent years, Arduino-powered methods have started to appear in a variety of cutting-edge biotechnology applications. One prominent example is optogenetics, a field in which engineered sequences of genes can be turned on and off using light. Using Arduino-based electronic control over lights and motors, researchers have constructed tools to measure how the presence or absence of these gene sequences can produce different behaviors in human neurons [1][6][7] or in bacterial cells [2]. Light and motor control has also allowed for rapid sorting of cells and gene sequences marked with fluorescent dyes, which can be detected by measuring the light emitted onto photodiodes. While the biology driving this research is richly complex and unexplored, the engineering behind the tools required to observe and measure these phenomena is now simple to use and well-characterized.
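The light-control half of such rigs is often no more complicated than a timed pulse train. Here is a minimal, hypothetical example that drives an LED (or LED driver) with 10 ms pulses at 20 Hz; real optogenetics protocols tune these numbers to the light-gated channel being studied, and the pin choice here is an assumption.

```cpp
// Hypothetical optogenetic stimulation: pulse an LED at a fixed
// frequency and pulse width to drive light-gated channels.
const int LED_PIN = 9;
const unsigned long PULSE_MS = 10;  // assumed 10 ms pulse width
const unsigned long PERIOD_MS = 50; // assumed 20 Hz stimulation

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);
  delay(PULSE_MS);
  digitalWrite(LED_PIN, LOW);
  delay(PERIOD_MS - PULSE_MS);
}
```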

Neuroscientists Voights, Sanders, and Newman at the Open Ephys project provide walkthroughs and add-ons for using Arduino to help them create tools for probing cells. From left to right: Arduino-based hardware for creating custom electrodes, providing multi-channel input to neurons, and controlling optogenetic lighting circuits. [6], [7]

Arduino-based automation can supplant a number of traditional laboratory techniques, including control of temperature, humidity, and/or pressure during cell culture; monitoring of cultures through automated sampling and optical density measurements over time; sending and receiving electrochemical signals to and from neurons; light control and filtration in fluorescence measurements; and measurement of solution salinity. This kind of consistent, automated handling of cells is a key part of producing reliable results for research in cell engineering and synthetic biology.
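As one concrete example, optical density monitoring — the heart of a turbidostat like the one pictured below — reduces to shining an LED through the culture and sampling a photodiode on the far side. The sketch below illustrates the idea; the pins, settle time and blank calibration constant are assumptions, not values from the cited builds.

```cpp
// Hypothetical optical-density monitor: pulse an LED through the
// culture, read the transmitted light, and report OD once a minute.
const int LED_PIN = 8;
const int PHOTODIODE_PIN = A0;
const float BLANK_READING = 900.0; // assumed reading through clear media

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);
  delay(50);                          // let the photodiode settle
  float raw = analogRead(PHOTODIODE_PIN);
  digitalWrite(LED_PIN, LOW);
  // Optical density = log10(incident / transmitted intensity).
  float od = log10(BLANK_READING / raw);
  Serial.println(od, 3);
  delay(60000UL);                     // sample once a minute
}
```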

Synthetic biologists Sauls et al. provide open-source schematics for creating an Arduino-powered turbidostat to automate the culturing of cells with recombinant genes. [5]

Arduino has also found an excellent fit in the microfluidics community. Microfluidics is the miniaturization of fluid-handling technologies, comparable to the miniaturization of electronic components. The development of microfluidic technologies has enabled a myriad of technical innovations, including DNA screening microchips, inkjet printers, and the screening and testing of biological samples in compact and affordable formats (often called “lab on a chip” diagnostics) [3]. Their use often requires precise regulation of valves, motors, pressure, timing, and optics, all of which can be achieved using Arduino. Additionally, the compact footprint of the controller allows it to be easily integrated into prototypes for use in medical laboratories or at the point of care. Recent work by the Collins and Yin research groups at MIT has produced prototypes for rapid, point-of-care Ebola detection using paper microfluidics and an Arduino-powered detection system [4].

Microfluidic devices made from paper (left) or polymers (right) have been used with Arduino to create powerful, compact medical diagnostics. (Left: Ebola diagnostic from Pardee et al. [4]; right: platelet function diagnostic from Li et al. [9])

Finally, another persistent issue in running biological experiments is continued monitoring and control over conditions, as in long-term time-lapse experiments or cell culture. But what happens when things go wrong? Often this can require researchers to stay near the lab to check in on their experiments. However, researchers now have access to on-board Wi-Fi control boards [8] that can send notifications via email or text when their experiments are completed or need special attention. This means fewer interruptions, better instruments, and less time spent worrying about your setup.

The compact Arduino Yun microcontroller combines the easy IDE of Arduino with the accessibility of built-in Wi-Fi to help you take care of your experiments remotely. [8]
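As a sketch of how such a notification might work on the Yun, the Bridge library’s HttpClient can hit a web endpoint when a run finishes; the webhook URL below is hypothetical, and turning the request into an email or text would happen on the server side.

```cpp
// Hypothetical Yun notification: when the experiment finishes,
// call a webhook over the board's built-in Wi-Fi.
#include <Bridge.h>
#include <HttpClient.h>

void notifyDone() {
  HttpClient client;
  // Hypothetical notification endpoint.
  client.get("http://example.com/notify?experiment=done");
}

void setup() {
  Bridge.begin(); // bring up the Yun's Linux/Wi-Fi side
}

void loop() {
  // ... run and monitor the experiment here, then:
  notifyDone();
  while (true) { } // halt after notifying
}
```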

True to Arduino’s open-source roots, the building, use, and troubleshooting of the Arduino-based tools themselves are also available in active freeware communities online [5]–[7].

Simply put, Arduino is a tool whose ease of use, myriad applications, and open-source learning tools have provided it with a wide and growing user base in the biotech community.


Melissa Li is a postdoctoral researcher in Bioengineering who has worked on biotechnology projects at UC Berkeley, the Scripps Research Institute, the Massachusetts Institute of Technology, Georgia Institute of Technology, and the University of Washington. She’s used Arduino routinely in customized applications in optical, flow, and motion regulation, including a prototype microfluidic blood screening diagnostic for measuring the protective effects of anti-thrombosis medications [9], [10]. The opinions expressed in this article are solely her own and do not reflect those of her institutions of research.

[1] L. J. Bugaj, A. T. Choksi, C. K. Mesuda, R. S. Kane, and D. V. Schaffer, “Optogenetic protein clustering and signaling activation in mammalian cells,” Nat. Methods, vol. 10, no. 3, pp. 249–252, Mar. 2013.

[2] E. J. Olson, L. A. Hartsough, B. P. Landry, R. Shroff, and J. J. Tabor, “Characterizing bacterial gene circuit dynamics with optically programmed gene expression signals,” Nat. Methods, vol. 11, no. 4, pp. 449–455, Apr. 2014.

[3] E. K. Sackmann, A. L. Fulton, and D. J. Beebe, “The present and future role of microfluidics in biomedical research,” Nature, vol. 507, no. 7491, pp. 181–189, Mar. 2014.

[4] K. Pardee, A. A. Green, T. Ferrante, D. E. Cameron, A. DaleyKeyser, P. Yin, and J. J. Collins, “Paper-Based Synthetic Gene Networks,” Cell.

[5] “Evolvinator – OpenWetWare.” [Online]. Available: http://openwetware.org/wiki/Evolvinator. [Accessed: 12-Jan-2015].

[6] “Open Ephys,” Open Ephys. [Online]. Available: http://www.open-ephys.org/. [Accessed: 12-Jan-2015].

[7] E. Boyden, “Very simple off-the-shelf systems for in-vivo optogenetics.” [Online]. Available: http://syntheticneurobiology.org/protocols/protocoldetail/35/9. [Accessed: 12-Jan-2015].

[8] “Arduino Yun.” [Online]. Available: http://arduino.cc/en/Guide/ArduinoYun. [Accessed: 12-Jan-2015].

[9] “Can aspirin prevent heart attacks? This device may know the answer,” CNET. [Online]. Available: http://www.cnet.com/news/can-aspirin-prevent-heart-attacks-this-device-may-know-the-answer/. [Accessed: 12-Jan-2015].

[10] M. Li, N. A. Hotaling, D. N. Ku, and C. R. Forest, “Microfluidic thrombosis under multiple shear rates and antiplatelet therapy doses,” PLoS One, vol. 9, no. 1, 2014.

 

Robot Garden hopes to make coding more accessible for everyone


This robotic garden demonstrates distributed algorithms with more than 100 origami robots that can crawl, swim and blossom.


Created by MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) and the Department of Mechanical Engineering, the aptly named Robot Garden is defined as “a system that functions as a visual embodiment of distributed algorithms, as well as an aesthetically appealing way to get more young students, and particularly girls, interested in programming.”


At its core, the project is a tablet-operated system that illustrates MIT’s cutting-edge research on distributed algorithms. It does so with robotic sheep created through traditional print-and-fold origami techniques; origami flowers (including lilies, tulips and birds of paradise) embedded with printable motors that enable them to ‘blossom’ and change colors; and magnet-powered robotic ducks that fold into shape when heated in an oven.

“Students can see their commands running in a physical environment, which tangibly links their coding efforts to the real world. It’s meant to be a launchpad for schools to demonstrate basic concepts about algorithms and programming,” explains Lindsay Sanneman, a lead author on the recently accepted paper for the 2015 International Conference on Robotics and Automation.

The project comprises 16 tiles, each connected to an Atmel based Arduino board and programmed using search algorithms that explore the space in different ways. The garden itself can be controlled from any Bluetooth-enabled device, either by clicking on flowers individually or through a more advanced “control by code” feature that lets users add their own commands and execute sequences in real-time. In fact, users can interact with the garden through a computer interface, allowing them to select a tile and inflate/deflate the flower or change the color of its petals.
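For a taste of what such a search looks like, here is a small, hypothetical breadth-first “bloom” that spreads from a chosen tile to its neighbors one ring at a time, in the spirit of the demos the article describes. The 4x4 layout and the logging stand-in for actual tile commands are assumptions.

```cpp
// Hypothetical breadth-first "bloom" over a grid of garden tiles:
// each tile blooms one step after its nearest already-bloomed neighbor.
#include <cstdio>
#include <queue>
#include <utility>
#include <vector>

const int SIDE = 4; // 16 tiles, assumed to be arranged 4x4

void bfsBloom(int startRow, int startCol) {
  std::vector<std::vector<int>> dist(SIDE, std::vector<int>(SIDE, -1));
  std::queue<std::pair<int, int>> q;
  dist[startRow][startCol] = 0;
  q.push({startRow, startCol});
  const int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
  while (!q.empty()) {
    auto [r, c] = q.front();
    q.pop();
    // A real tile would command its flowers here; we just log.
    std::printf("tile (%d,%d) blooms at step %d\n", r, c, dist[r][c]);
    for (int i = 0; i < 4; i++) {
      int nr = r + dr[i], nc = c + dc[i];
      if (nr >= 0 && nr < SIDE && nc >= 0 && nc < SIDE && dist[nr][nc] < 0) {
        dist[nr][nc] = dist[r][c] + 1;
        q.push({nr, nc});
      }
    }
  }
}

int main() { bfsBloom(0, 0); }
```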


“The garden tests distributed algorithms for over 100 distinct robots, which gives us a very large-scale platform for experimentation,” says CSAIL Director Daniela Rus, who is also a co-author of the paper. “At the same time, we hope that it also helps introduce students to topics like graph theory and networking in a way that’s both beautiful and engaging.”

The project was recently displayed at CSAIL’s “Hour of Code” back in December, where it surely did its part in inspiring kids to get interested in STEM-related disciplines. In the near future, the researchers hope to make the garden operable by multiple devices simultaneously, and may even experiment with interactive auditory components such as microphones and music that would sync to movements.

Interested? Head over to MIT’s official page here, and be sure to watch the garden in action below.

This wearable robot can zip up your jacket for you


Could a robot be coming to your fly or gym bag zipper in the near future? 


Sartorial robotics can best be defined as a method of merging fashion theory and robotics through the design and development of robotic systems. These systems look to facilitate interaction and mimic the materiality, aesthetics and construction techniques of textiles and other apparel. Unlike other robots, these bots have one objective and one objective only: to enhance the social aspects of the human-robot dynamic using clothes. In other words, cyberclothes.


Ultimately, researchers hope that it will one day lend a helping hand in how we include robotics in our everyday lives. And, while self-tying laces seem to get all the attention as of late, self-zippering has emerged. Created by Adam Whiton of MIT’s Personal Robots Group, the aptly-dubbed Zipperbot is exactly what its name implies: an autonomously-controlled, continuous closure for joining the edges of fabric.

“Clothing is a uniquely human pursuit and is nearly universal in its adoption and use. It plays a prominent role in our individual cultures transmitting a mixture of social signals and meanings through the semiotics of fashion. It is through this performance of assemblage of fabric surfaces we reconfigure ourselves and our identities,” Whiton explains.

(Source: Adam Whiton / Mashable)

The wearable mechanism does more than merely fasten your jacket. In fact, Zipperbot uses optical sensors to properly mesh the zipper teeth and motion sensors to open and close at the right time. While Whiton doesn’t go into the details of how Zipperbot was built, it appears to comprise a zipper head, a stepper motor and two wires. In its current iteration, the tethered bot is likely connected to an Arduino or a similar microcontroller that enables it to glide up and down the chain.
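In that speculative spirit, a sketch of the closing loop might look like the following: step the slider forward only while an optical sensor reports the teeth meshing cleanly. Every pin, the threshold and the use of the stock Stepper library are assumptions on our part, since Whiton hasn’t published the internals.

```cpp
// Speculative Zipperbot-style closing loop: advance the slider
// while an optical sensor indicates the teeth are meshing.
#include <Stepper.h>

const int STEPS_PER_REV = 200;
Stepper zipMotor(STEPS_PER_REV, 8, 9, 10, 11); // assumed driver pins
const int OPTICAL_PIN = A0;

bool teethMeshed() {
  // Assumed: reflectance drops when the teeth close correctly.
  return analogRead(OPTICAL_PIN) < 300;
}

void setup() {
  zipMotor.setSpeed(60); // rpm
}

void loop() {
  if (teethMeshed()) {
    zipMotor.step(1); // keep closing while the teeth align cleanly
  }
  // Otherwise hold position until the fabric settles.
}
```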

While it may not yet be tiny enough to zip up your fly, nor ready to be severed from its wires, the innovative project does currently work on sleeves, jackets and other garments with attached zippers. Nevertheless, this could be the start of something wonderful. Aside from helping the absent-minded, the device could play an integral role in situations where touching any part of clothing could be detrimental to one’s health, such as in a medical or biohazard setting, as well as aid those wearing gloves who are looking to bundle up on a cold winter’s night. More importantly, the gadget has tremendous potential to spur “assistive clothing” for those with disabilities.

Along with the Zipperbot, Whiton has also devised a number of other innovations in the past, which range from a wearable defensive jacket geared towards women to thwart off violence to an open-source, ATmega32 based MCU platform. Interested in learning more about this next-gen robotic accessory? Head over to the project’s official page here.

MIT Media Lab’s morphing table has Atmel under the hood


Tangible Media Group has created a shapeshifting display that lets users interact with digital information in a tangible way. 


As previously shared on Bits & Pieces, MIT Media Lab’s Tangible Media Group has devised a morphing table with several ATmega2560 MCUs under the hood. The installation was recently exhibited at the Cooper-Hewitt Smithsonian Design Museum in New York, and can be seen in action below!


inFORM is described by its creators as a dynamic shape display that can render 3D content physically, so users can interact with digital information in a tangible way. In order to make that a reality, the table is equipped with 900 individually actuated white polystyrene pins that make up the surface in an array of 30 x 30 pixels. The interactive piece can display 3D information in real-time, in a more accurate and interactive manner than the flat rendering typically created by a computer user interface.


This was all accomplished by tasking a Kinect sensor to capture 3D data. This information was then processed with a computer and relayed over to a display, enabling the system to remotely manipulate a physical ball. Aside from being able to produce a controlled physical environment for the ball, the pins are able to detect touch, pressing down and pulling.


An overhead projector provides visual guidance for the system, with each pin capable of actuating 100mm and exerting a force of up to 1.08 Newtons. Actuation is achieved via push-pull rods that are utilized to maximize the dense pin arrangement — making the display independent of the size of the actuators. The table is driven by 150 ATmega2560 based Arduino PCBs arranged in 15 rows of vertical panels, each with 5×2 boards. The boards then communicate with a PC over five RS485 buses bridged to USB. Meanwhile, graphics are rendered using OpenGL and openFrameworks software.
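To picture one panel board’s role in that chain, here is a hedged sketch that assumes simple three-byte frames (board address, pin index, target height) arriving over the RS485 bus; the addressing scheme, baud rate and frame format are our assumptions, not the published design.

```cpp
// Hypothetical inFORM-style panel driver: listen on the shared
// RS485 bus and act only on frames addressed to this board.
const uint8_t MY_ADDRESS = 7; // assumed board ID on this bus

void setPinHeight(uint8_t pin, uint8_t height) {
  // A real board would close a feedback loop on the motorized
  // push-pull rod; here we just accept the target.
  (void)pin;
  (void)height;
}

void setup() {
  Serial.begin(250000); // assumed baud via the RS485 transceiver
}

void loop() {
  if (Serial.available() >= 3) {
    uint8_t addr = Serial.read();
    uint8_t pin = Serial.read();
    uint8_t height = Serial.read();
    if (addr == MY_ADDRESS) setPinHeight(pin, height);
  }
}
```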

“One area we are working on is Geospatial data, such as maps, GIS, terrain models and architectural models. Urban planners and architects can view 3D designs physically and better understand, share and discuss their designs,” the team writes. “Cross sections through Volumetric Data such as medical imaging CT scans can be viewed in 3D physically and interacted with. We would like to explore medical or surgical simulations. We are also very intrigued by the possibilities of remotely manipulating objects on the table.”


Its creators are hoping to spark several collaborations with everyone from urban planners and architects, to designers and modelers, to doctors and surgeons. The display could be used as an alternative to 3D printing low-resolution prototypes as well as rendering 3D data — ranging from construction plans and CT scans — that a user will be able to interact with by physically molding the pins.

Interested? A detailed paper on the project can be found here.