Tag Archives: UC Berkeley

Wearable sweat sensors provide real-time analysis of the body

UC Berkeley engineers have developed new wearable sensors that can measure skin temperature, as well as glucose, lactate, sodium and potassium in sweat.

As it turns out, future wearable devices may be less interested in your activities than in the sweat produced during them. That’s because engineers at UC Berkeley have developed a flexible sensor system capable of measuring metabolites and electrolytes in sweat and sending the results to a smartphone in real time.


According to researchers, these bendable plastic patches can be easily incorporated into bands for the wrist and head, and provide early warnings of health problems such as fatigue and dangerously high temperatures.

“Human sweat contains physiologically rich information, thus making it an attractive body fluid for non-invasive wearable sensors,” explained Ali Javey, a UC Berkeley professor of electrical engineering and computer sciences.

The prototype consists of five sensors and a flexible circuit with a microcontroller (what would appear to be an Atmel MCU) and a Bluetooth transceiver. This board measures the concentration of various chemicals in sweat along with skin temperature, calibrates the readings and then sends them to an accompanying mobile app.
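The calibration step mentioned above, converting a raw sensor reading into a concentration and correcting for skin temperature, can be sketched in a few lines. This is an illustrative mock-up, not the Berkeley team’s firmware: the slope, intercept and temperature coefficient below are made-up placeholders standing in for values a real bench calibration would supply.

```python
def adc_to_voltage(raw: int, vref: float = 3.3, bits: int = 10) -> float:
    """Convert a raw ADC count to a voltage (10-bit converter assumed)."""
    return raw * vref / (2 ** bits - 1)

def calibrate_sodium(voltage: float, temp_c: float) -> float:
    """Apply a linear calibration with a simple temperature correction.
    All coefficients are hypothetical placeholders."""
    slope, intercept = 25.0, -10.0   # mM per volt (placeholder)
    temp_coeff = 0.02                # 2% per degree C away from 25 C (placeholder)
    concentration = slope * voltage + intercept
    return concentration * (1 + temp_coeff * (temp_c - 25.0))

# A raw count of 512 at 30 C skin temperature:
reading = calibrate_sodium(adc_to_voltage(512), temp_c=30.0)
print(round(reading, 2))
```

In the actual device this computation would run on the board before the result is pushed over Bluetooth to the phone.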


To test their proof of concept, the engineers put the device and more than two dozen volunteers through various indoor and outdoor exercises, such as riding stationary bikes and running trails. In doing so, the team kept tabs on sodium, potassium, glucose and lactate. Monitoring electrolytes like sodium and potassium may help track a wearer’s condition and can ultimately be utilized to assess their state of health.

“When studying the effects of exercise on human physiology, we typically take blood samples. With this non-invasive technology, someday it may be possible to know what’s going on physiologically without needle sticks or attaching little, disposable cups on you,” added physiologist George Brooks, a UC Berkeley professor of integrative biology.

Intrigued? Learn all about the wearable sweat sensor here, or watch the team’s video below!

Meet the world’s first DIY origami robot

With Kamigami, engineering is for everyone. Build your own bug bot and then control it with your phone. 

STEM education has been a growing venture in schools across the country, with even the President himself making it a priority to encourage students as young as grade school to pursue the science, technology, engineering and math disciplines. After all, these fields are rapidly changing the world in the areas of innovation, economic growth and employment. But let’s face it: these subjects don’t come easy to everyone, so how do we instill STEM in kids? A team of UC Berkeley graduates found a way to pique children’s interests, while also inspiring the next generation of Makers. Meet Kamigami, an origami-style robot you can build and program by yourself — no engineering degree necessary.


Kamigami is the brainchild of Dash Robotics, a startup founded by Berkeley engineers Nick Kohut and Andrew Gillies. The company firmly believes in STEM education, and that the power of innovating is for everyone. Kamigami was created with this belief in mind, and it has proven to be an educational and affordable way for kids to get an early start in robotics, engineering and biology.

Now live on Kickstarter, these robots come in a DIY kit consisting of laser-cut body components, a motor, a transmission, a rechargeable battery, a microUSB port, and plug-and-play electronics. Assembly takes less than an hour, and instructional videos online show how everything comes together. Plus, the robot’s behavior can be programmed and controlled through Kamigami’s accompanying mobile app (for iOS and Android).


Each Kamigami can be configured with a unique set of behaviors and characteristics through a drag-and-drop interface, opening up a range of possible modes that take advantage of the robot’s integrated sensors and functions. So what type of games is it capable of? For starters, sumo wrestling (first to fall off a table loses), relay races (one robot can’t run until it’s tagged by another), tank battles (take turns trying to get into firing position) and IR laser tag, to name just a few.

And unlike other DIY robotic kits before it, biology comes into play in the design of each Kamigami. In fact, the bots mimic animal locomotion through their built-in linkages and motors. The robot’s chassis is made of a patent-pending material (an extremely durable plastic composite) that allows it to fold up through an origami-like process. This material doesn’t fatigue or wear, which makes for a more durable robot.


The mechanics of the robot itself are custom designed, and packed with processing power and sensors. The main microprocessor features a Cortex-M0 core and a Bluetooth Smart radio. Plus, the cockroach-ish unit is packed with an array of sensors including ambient light, infrared detectors and an emitter, a gyroscope and an accelerometer. The electronics also include motor drivers, charging circuitry and an accessory header for expandability. The infrared emitter and detectors enable each bot to send and receive signals from its mobile app, as well as communicate with other Kamigamis. The gadget runs on a rechargeable battery, with about 30-45 minutes of play time.

Sound like a bug bot you’d love to have? Crawl over to its Kickstarter campaign, where Dash Robotics is currently seeking $50,000. Delivery is expected to get underway in March 2016.

These Arduino-powered shoulder pads make Wi-Fi visible

Hertzian Armor is a piece of shoulder armor that visually illustrates the ubiquity of Wi-Fi networks. 

In today’s constantly connected world, countless wireless signals are being sent to and from the gizmos and gadgets around us. However, they cannot be seen. As a way to better observe these invisible interactions, UC Berkeley design students Anthony Dunne and Fiona Raby have created what they call Hertzian Armor — a wearable device that visualizes the ubiquity of Wi-Fi.


The duo first coined the term “Hertzian space,” as a way to best describe the interfacing between electromagnetic waves and human experiences, which served as the basis for the project.

“Our initial approach to this assignment was to create an object that allows us to see the unseen. In this way we could begin to explore how we interact with the invisible world around us, and start a conversation about something we may come in contact with everyday, but not fully understand,” the Makers write. “We initially started looking at alcohol sensors and pollution sensors, two things we are affected by but never see. While brainstorming how to implement this technology in the wearable, we stumbled on a larger goal, how can we make Wi-Fi visible?”

The wearable itself consists of cyberpunkish shoulder pads embedded with an Adafruit Wi-Fi breakout module with an on-board antenna attached to a LilyPad Arduino (ATmega328P) tasked with scanning for nearby networks. Aside from that, the piece of armor is powered by a 2000mAh polymer lithium-ion battery, while a LilyPad LiPower supply converts the 3.7V from the battery to the necessary 5V to juice up the entire unit.


Meanwhile, a few overlapping pieces of neoprene are equipped with NeoPixel strips underneath each flap that are used to signify the strength of the received wireless signals. The color-changing RGB LED output represents the security or openness of each particular network: red for highly secure, restricted networks (WPA2), green for less secure networks (WPA, WEP), and blue for open hotspots.
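The color-mapping logic described above is simple enough to sketch. A minimal Python mock-up follows (the real project runs Arduino C on the LilyPad, and the encryption-type strings here are illustrative labels, not the actual values the Wi-Fi module reports):

```python
def network_color(encryption: str) -> tuple:
    """Map a network's security type to an RGB color for the NeoPixels,
    following the scheme described in the article."""
    if encryption == "WPA2":
        return (255, 0, 0)    # red: highly secure, restricted network
    if encryption in ("WPA", "WEP"):
        return (0, 255, 0)    # green: less secure network
    return (0, 0, 255)        # blue: open hotspot

for net in ("WPA2", "WEP", "OPEN"):
    print(net, network_color(net))
```

On the actual hardware, this color would then be written to the NeoPixel strip, with brightness scaled by signal strength.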

“We decided on creating shoulder armor because we wanted a wearable that would be bold enough to display at Burning Man or an event like Silicon Valley Fashion Week, but also simple enough to be worn around Berkeley,” Dunne and Raby explain.

Well, mission accomplished! Intrigued by this wearable project? Head over to its official page on Hackster.io to learn more, and be sure to watch it in action as the prototype lights up in red, green and blue while its wearer wanders through campus, turning heads along the way.

This 3D-printed smart cap can sense spoiled milk

Researchers have 3D-printed a smart cap for a milk carton that detects signs of spoilage using embedded sensors.

3D printing has grown by leaps and bounds in recent years, producing everything from affordable prosthetics and medical implants to on-demand toys and cars. However, as a group of UC Berkeley engineers has pointed out, one thing missing up until now was the ability to produce sensitive electronic components. So, in collaboration with researchers at Taiwan’s National Chiao Tung University, the team set out to expand the already impressive portfolio of 3D printing technology to include electrical components, like resistors, inductors, capacitors and integrated wireless electrical sensing systems. In order to put this advancement to the test, they printed a wireless smart milk carton cap capable of detecting signs of spoilage using embedded sensors.

(Source: Sung-Yueh Wu)

“Our paper describes the first demonstration of 3D printing for working basic electrical components, as well as a working wireless sensor,” explained Liwei Lin, a professor of mechanical engineering and co-director of the Berkeley Sensor and Actuator Center. The findings were published in “Microsystems & Nanoengineering,” a new open-access journal from the Nature Publishing Group. “One day, people may simply download 3D-printing files from the Internet with customized shapes and colors and print out useful devices at home.”

While polymers are typically used in 3D printing given their ability to be flexed into a variety of shapes, they are poor conductors of electricity. To get around this, the researchers devised a system using both polymers and wax. They removed the wax, leaving hollow tubes into which liquid metal was injected and then cured. The team used silver in their latest experiments.

The shape and design of the metal determined the function of different electrical components. For instance, thin wires acted as resistors, and flat plates were made into capacitors. The electronic component was then embedded into a plastic cap to detect signs of spoilage in a milk carton. A capacitor and inductor were added to the smart cap to form a resonant circuit. The engineers flipped the carton to allow a bit of milk into the capacitor, and left the carton unopened for 36 hours at room temperature.

(Source: Sung-Yueh Wu)

From there, the circuit sensed the changes in electrical signals that accompany increased levels of bacteria. These changes were monitored with a wireless radio-frequency probe at the start of the experiment and every 12 hours thereafter. Upon completion, the smart cap found that the peak vibration frequency of the room-temperature milk dropped by 4.3% after 36 hours. In comparison, a carton of milk kept at 4°C saw a relatively minor 0.12% shift in frequency over the same time period.
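The frequency shift reported above follows from the resonant circuit described earlier: as bacteria alter the milk’s electrical properties, the capacitance changes, which moves the peak frequency f = 1/(2π√(LC)). The short Python sketch below illustrates the relationship using placeholder component values (not figures from the paper):

```python
import math

def resonant_freq(inductance_h: float, capacitance_f: float) -> float:
    """Resonant frequency of an LC circuit: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

def percent_shift(f_start: float, f_end: float) -> float:
    """Percentage drop from the starting frequency."""
    return 100.0 * (f_start - f_end) / f_start

# Placeholder values: 1 uH inductor, 1 pF capacitor.
f0 = resonant_freq(1e-6, 1e-12)
# The 4.3% peak-frequency drop reported for room-temperature milk:
f36 = f0 * (1 - 0.043)
print(round(percent_shift(f0, f36), 1))
```

Because capacitance sits under a square root, a given change in the milk’s permittivity produces a smaller fractional change in frequency, which is why a sensitive probe is needed to resolve the 0.12% shift seen in refrigerated milk.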

“This 3D-printing technology could eventually make electronic circuits cheap enough to be added to packaging to provide food safety alerts for consumers,” Lin added. “You could imagine a scenario where you can use your cellphone to check the freshness of food while it’s still on the store shelves.”

Looking ahead, the researchers are hoping to further develop this technology for use in health applications, such as implantable devices with embedded transducers that can monitor blood pressure, muscle strain and drug concentrations.

Interested? Read more about the study here.

This cockroach-inspired robot can make its way through obstacles

These cockroach-like robots could be used for everything from monitoring fields to search and rescue missions.

Inspired by discoid cockroaches, researchers at UC Berkeley have created a robot that can use its body shape to slip through tight spaces using natural parkour moves. Equipped with the insect’s characteristic rounded shell, the running robot can successfully complete a grass-like obstacle course, without the need for additional sensors or motors.

(Source: UC Berkeley)

The Berkeley team, led by postdoctoral researcher Chen Li, hopes that the robot will one day inspire the design of future terrestrial robots that can be used in any number of applications, ranging from search and rescue operations to monitoring the environment. While many terrestrial robots have been developed with the ability to avoid obstacles in the past, very few have ever actually traversed them.

“The majority of robotics studies have been solving the problem of obstacles by avoiding them, which largely depends on using sensors to map out the environment and algorithms that plan a path to go around obstacles,” Li explains. “However, when the terrain becomes densely cluttered, especially as gaps between obstacles become comparable or even smaller than robot size, this approach starts to run into problems as a clear path cannot be mapped.”

(Source: UC Berkeley)

Whereas many robots are able to work on flat surfaces with a few obstacles, in nature, cockroaches and other small animals often have to navigate environments cluttered with shrubs, leaf litter, tree trunks and fungi. So for their study, the researchers employed high-speed cameras to film the movement of the discoid cockroaches through an artificial course comprised of tall, grass-like beams with limited spacing. The cockroaches were fitted with three different artificial shells to observe how their movement was affected by various body shapes, including an oval cone, a flat oval and a flat rectangle.

When the cockroaches were left unmodified, the researchers discovered that, although the insects sometimes pushed through or climbed over the fake grass, they most frequently used fast and effective natural parkour moves to slip by the obstacles. In these situations, the cockroaches rolled their bodies so that their thin sides could slide through the gaps and their legs could push off the beams to help them maneuver.

(Source: UC Berkeley)

They found that with the flat oval and rectangular bodies, the robot often could not traverse the beams and frequently collided with the objects in its way, often becoming stuck. Conversely, when fitted with the cockroach-esque rounded shell, the six-legged robot was able to successfully get through the course using a roll maneuver similar to the cockroaches’. This adaptive behavior came about with no change to the robot’s programming, showing that the behavior came from the shell alone.

Looking ahead, the researchers hope to follow up this discovery by searching for other shapes in nature that could enhance the robots’ ability to advance through difficult terrain.

“There may be other shapes besides the thin, rounded one that are good for other purposes, such as climbing up and over obstacles of other types. Our next steps will be to study a diversity of terrain and animal shapes to discover more terradynamic shapes, and even morphing shapes. These new concepts will enable terrestrial robots to go through various cluttered environments with minimal sensors and simple controls,” Li adds.

The first results of the robot’s performance were shared in IOP Publishing’s journal Bioinspiration & Biomimetics.

This smartphone microscope is saving lives in Africa

UC Berkeley engineers develop a new smartphone microscope that can detect infection by parasitic worms.

Access to a hematologist is uncommon in many parts of Africa. That’s why a research team led by engineers at the University of California, Berkeley has developed a new mobile phone microscope that uses video to automatically detect and quantify infection by parasitic worms in a drop of blood. The latest iteration of UC Berkeley’s CellScope technology could potentially revive efforts to eradicate debilitating diseases in Africa, such as river blindness and lymphatic filariasis, by offering critical information to health providers in the field more accurately and efficiently. This would allow workers to make potentially life-saving treatment decisions right on the spot.


River blindness, which is the second-leading cause of infectious blindness worldwide, is typically transmitted through the bite of blackflies. Meanwhile, the second-leading cause of disability worldwide, lymphatic filariasis, is spread by mosquitoes and leads to elephantiasis — a condition marked by painful, disfiguring swelling. Both are endemic in certain regions in Africa.

Treatment often revolves around the drug ivermectin, or IVM. Yet, public health campaigns to administer the medication have been halted, and rightfully so, due to potentially fatal side effects for patients co-infected with Loa loa — the parasite behind African eye worm. When there are high circulating levels of microscopic Loa loa worms in a patient, treatment with IVM can ultimately lead to severe or fatal neurologic damage.

The standard method of screening for levels of Loa loa involves trained technicians manually counting the worms in a blood smear using conventional laboratory microscopes, making the process impractical for use in field settings and in mass campaigns to administer IVM. That’s why the team of UC Berkeley engineers joined forces with Dr. Thomas Nutman from the National Institute of Allergy and Infectious Diseases, and collaborators from Cameroon and France to develop the incredible, Arduino-based gadget.

For their most recent version of a mobile phone microscope, the aptly named CellScope Loa, the researchers paired a smartphone with a 3D-printed plastic base where the sample of blood is positioned. Fortunately, the parts housed within its base were relatively easy to source. These include an Atmel-powered Arduino board, a Bluetooth module, LED lights, a USB port, as well as some gears and circuitry.


As the researchers explain, control of the device is automated through a custom app that was designed solely for this purpose. With just a touch of the screen by a healthcare worker in the field, the phone wirelessly communicates over Bluetooth to controllers in the base to process and analyze the sample of blood. Its gears move the sample in front of the camera, and an algorithm instantly analyzes the telltale “wriggling” motion of the worms captured in the video by the phone. From there, the worm count is displayed on the screen.
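Motion-based detection of this kind usually starts from frame differencing: comparing consecutive video frames and flagging pixels that changed. The real CellScope Loa algorithm is far more sophisticated, but the toy Python sketch below (with tiny hand-made "frames") illustrates the basic idea:

```python
def moving_pixels(frame_a, frame_b, threshold: int = 10) -> int:
    """Count pixels whose intensity changed by more than `threshold`
    between two frames, a crude proxy for motion in the field of view."""
    count = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pa, pb in zip(row_a, row_b):
            if abs(pa - pb) > threshold:
                count += 1
    return count

# Two consecutive 2x3 grayscale "frames"; two pixels change noticeably.
frame1 = [[0, 0, 0], [0, 0, 0]]
frame2 = [[0, 50, 0], [0, 0, 60]]
print(moving_pixels(frame1, frame2))
```

A production pipeline would go further, grouping changed pixels into blobs and tracking their characteristic wriggling over many frames before counting a worm.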

Impressively, the entire procedure takes under two minutes, starting from when the sample is inserted to displaying its results. According to UC Berkeley associate chair Daniel Fletcher, this processing time enables health workers to quickly determine whether or not it is safe to administer IVM on site.

“The availability of a point-of-care test prior to drug treatment is a major advance in the control of these debilitating diseases,” added fellow UC Berkeley professor Vincent Resh. “The research offering a phone based app is ingenious, practical and highly needed.”

At the moment, the engineers are looking to expand the trial of the hardware to around 40,000 people in Cameroon. If successful, there’s a hope that the kit could one day be used to screen out those infected with Loa loa and assist countless others who would otherwise suffer.

Intrigued? You can read all about the project here, or watch its demonstration below.

UC Berkeley 3D prints an entire 9-foot-tall pavilion

Researchers have just 3D-printed the first and largest powder-based cement structure.

A team of researchers from UC Berkeley’s College of Environmental Design has unveiled what they’re calling the “first and largest powdered cement-based, 3D-printed structure.”


To be clear, this isn’t the first 3D-printed building. If you recall, a Chinese company recently constructed 10 homes in less than a day and finished an entire apartment block back in January using 3D-printed parts. Aside from that, a Dutch design firm has already devised a canal house in Amsterdam, a New York architect planned an entire estate and a Minnesota Maker created a castle all through additive manufacturing. However, what sets this project apart is that it was constructed using dry powdered cement, whereas its predecessors were made by extruding wet cement through a nozzle.

The pavilion, which goes by the name Bloom, is 9′ tall, 12′ wide and 12′ deep, and dons a traditional Thai floral motif design on its exterior to allow for natural light to shine through its interior in daylight and glow like a luminary at night. It is composed of 840 custom-printed blocks, each comprised of an iron oxide-free Portland cement polymer, and fabricated using 11 3D Systems printers.


“This project is the genesis of a realistic, marketable process with the potential to transform the way we think about building a structure,” explained Ronald Rael, Associate Professor of Architecture at UC Berkeley.

What really sets this unique system apart from existing methods of extrusion is that, by using an iron oxide-free Portland cement polymer formulation, Bloom is able to overcome many of the previous limitations to 3D-printed architecture. These constraints include the speed and cost of production, as well as aesthetics and practical applications.

Undoubtedly, 3D printing has transcended well beyond mere plastic figurines, with today’s advanced printers — many of which are powered by Atmel | SMART and AVR microcontrollers — capable of producing everything from functional tools in space to automobiles to entire buildings. After its official unveiling, the Bloom Pavilion was disassembled and shipped to Siam Research and Innovation in Thailand, where it will be exhibited for a few months before touring the world. Those wishing to learn more can head over to the project’s official page here.

Arduino in research and biotech

Arduino’s acceptance into the biotech research community is evident from its increasing mentions in high-profile science and engineering journals. Mentions of Arduino in these journals alone have gone from zero to more than 150 in just the last two years.

While it may be best known as a staple for hobbyists, Makers, and hackers who build on their own time, Arduino and Atmel have a strong and rapidly growing following among professional engineers and researchers.

For biotech researchers like myself, experimental setups often require highly specific instruments with strict design rules for parameters such as timing, temperature, motion, force/pressure, and light. Such instruments would be time-consuming and expensive to have custom built, as the desired experimental conditions often change as we investigate different samples, cell types, etc.

Here, Atmel chips and Arduino boards find a nice niche for making your own affordable, custom setups that are repeatable, precise, and automated. Arduino and Atmel provide microcontrollers in a myriad of form factors, I/O options, and connectivity options, available from a number of vendors. Meanwhile, free Arduino code and hardware drivers are available for many of the sensors and actuators you might pair with your board. Best of all, Arduino is designed for a wide audience and range of experience levels, making it easy to use for projects of varying complexity. So as experimental conditions or goals change, your hardware can easily be re-purposed and re-programmed according to specifications.

Arduino’s acceptance into the biotech research community is evident from its increasing mentions in high-profile science and engineering journals, including Nature Methods, Proceedings of the National Academy of Sciences, Lab on a Chip, Cell, Analytical Chemistry, and the Public Library of Science (PLOS). Mentions of Arduino in these journals alone have gone from zero to more than 150 in just the last two years.

In recent years, Arduino-powered methods have started to appear in a variety of cutting-edge biotechnology applications. One prominent example is optogenetics, a field in which engineered sequences of genes can be turned on and off using light. Using Arduino-based electronic control over lights and motors, researchers have constructed tools to measure how the presence or absence of these gene sequences can produce different behaviors in human neurons [1][6][7] or in bacterial cells [2]. Light and motor control has also allowed for rapid sorting of cells and gene sequences marked with fluorescent dyes, which can be detected by measuring light emitted onto photodiodes. While the biology driving this research is richly complex and unexplored, the engineering behind the tools required to observe and measure these phenomena is now simple to use and well-characterized.
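The timed light control at the heart of such optogenetics rigs is conceptually simple: the controller repeats an on/off pulse cycle with programmable durations. A hedged Python sketch of that scheduling logic follows (on an Arduino the result would drive a digital output pin; the 2 s on / 8 s off durations are arbitrary examples, not values from the cited studies):

```python
def led_on(t_s: float, on_s: float = 2.0, off_s: float = 8.0) -> bool:
    """Return True while time t_s falls inside the 'on' window of a
    repeating pulse cycle of length on_s + off_s seconds."""
    return (t_s % (on_s + off_s)) < on_s

# Sample the schedule at a few time points across one cycle boundary:
print([led_on(t) for t in (0, 1, 3, 9, 10, 11)])
```

Varying `on_s` and `off_s` per experiment is exactly the kind of re-programming flexibility the article describes: the hardware stays the same while the stimulus schedule changes.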

Neuroscientists Voights, Sanders, and Newman at the Open Ephys project provide walkthroughs and add-ons for using Arduino to help them create tools for probing cells. From left to right, Arduino-based hardware for creating custom electrodes, providing multi-channel input to neurons, and for control over optogenetic lighting circuits. [6],[7]

Arduino-based automation can supplant a number of traditional laboratory techniques, including control of temperature, humidity, and/or pressure in cell culture conditions; monitoring cell cultures through automated sampling and optical density measurements over time; sending and receiving electrochemical signals to and from neurons; light control and filtration in fluorescence measurements; or measurement of solution salinity. This kind of consistent, automated handling of cells is a key part of producing reliable results for research in cell engineering and synthetic biology.
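The temperature-control case above is typically handled with a simple bang-bang (on/off) controller with a hysteresis band to avoid relay chatter. This minimal Python sketch illustrates the decision logic only; the 37 °C setpoint and 0.5 °C hysteresis are illustrative, and on real hardware the return value would switch a heater relay:

```python
def heater_command(temp_c: float, setpoint: float = 37.0,
                   hysteresis: float = 0.5) -> str:
    """Bang-bang control with a deadband: heat below the band,
    stop above it, and hold the current state in between."""
    if temp_c < setpoint - hysteresis:
        return "on"
    if temp_c > setpoint + hysteresis:
        return "off"
    return "hold"   # inside the deadband: keep whatever the heater is doing

print([heater_command(t) for t in (35.0, 37.0, 38.0)])
```

The deadband is the important design choice: without it, sensor noise near the setpoint would toggle the heater rapidly and wear out the switching hardware.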

Synthetic biologists Sauls et al. provide open-source schematics for creating an Arduino-powered turbidostat to automate the culturing of cells with recombinant genes. [5]

Arduino has also found an excellent fit in the microfluidics community. Microfluidics is the miniaturization of fluid-handling technologies—comparable to the miniaturization of electronic components. The development of microfluidic technologies has enabled a myriad of technical innovations, including DNA screening microchips, inkjet printers, and the screening and testing of biological samples in compact and affordable formats (often called “lab on a chip” diagnostics) [3]. Their use often requires precise regulation of valves, motors, pressure, timing, and optics, all of which can be achieved using Arduino. Additionally, the compact footprint of the controller allows it to be easily integrated into prototypes for use in medical laboratories or at the point of care. Recent work by the Collins and Yin research groups at MIT has produced prototypes for rapid, point-of-care Ebola detection using paper microfluidics and an Arduino-powered detection system [4].

Microfluidic devices made from paper (left) or using polymers (right) have been used with Arduino to create powerful, compact medical diagnostics. (Left: Ebola diagnostic from Pardee et al. [4]; right: platelet function diagnostic from Li et al. [9])

Finally, another persistent issue in running biological experiments is continued monitoring and control of conditions, such as in long-term time-lapse experiments or cell culture. But what happens when things go wrong? Often this can require researchers to stay near the lab to check in on their experiments. However, researchers now have access to control boards with on-board Wi-Fi [8] that can send notifications via email or text when their experiments are completed or need special attention. This means fewer interruptions, better instruments, and less time spent worrying about your setup.
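The notify-on-event pattern described here boils down to watching a measurement stream and firing an alert when a value crosses a threshold. A small Python sketch of that pattern follows; on an Arduino Yún the `notify` callback would be wired to the board’s email or HTTP facilities, while here it simply collects messages in a list (the readings and threshold are made-up examples):

```python
def watch(readings, limit: float, notify) -> None:
    """Scan a sequence of measurements and call `notify` with a message
    for each reading that exceeds the given limit."""
    for i, value in enumerate(readings):
        if value > limit:
            notify(f"reading {i} exceeded {limit}: {value}")

alerts = []
watch([0.2, 0.4, 1.3, 0.5], limit=1.0, notify=alerts.append)
print(alerts)
```

Decoupling detection (`watch`) from delivery (`notify`) is what makes the same logic reusable whether the alert goes to a serial console, an email, or a text message.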

The compact Arduino Yún combines the easy IDE of Arduino with built-in Wi-Fi to help you take care of your experiments remotely. [8]

True to Arduino’s open-source roots, the building, use, and troubleshooting of the Arduino-based tools themselves are also available in active freeware communities online [5]–[7].

Simply put, Arduino is a tool whose ease of use, myriad applications, and open-source learning tools have provided it with a wide and growing user base in the biotech community.

Melissa Li is a postdoctoral researcher in Bioengineering who has worked on biotechnology projects at UC Berkeley, the Scripps Research Institute, the Massachusetts Institute of Technology, Georgia Institute of Technology, and the University of Washington. She’s used Arduino routinely in customized applications in optical, flow, and motion regulation, including a prototype microfluidic blood screening diagnostic for measuring the protective effects of anti-thrombosis medications [9], [10]. The opinions expressed in this article are solely her own and do not reflect those of her institutions of research.

[1]       L. J. Bugaj, A. T. Choksi, C. K. Mesuda, R. S. Kane, and D. V. Schaffer, “Optogenetic protein clustering and signaling activation in mammalian cells,” Nat. Methods, vol. 10, no. 3, pp. 249–252, Mar. 2013.

[2]       E. J. Olson, L. A. Hartsough, B. P. Landry, R. Shroff, and J. J. Tabor, “Characterizing bacterial gene circuit dynamics with optically programmed gene expression signals,” Nat. Methods, vol. 11, no. 4, pp. 449–455, Apr. 2014.

[3]       E. K. Sackmann, A. L. Fulton, and D. J. Beebe, “The present and future role of microfluidics in biomedical research,” Nature, vol. 507, no. 7491, pp. 181–189, Mar. 2014.

[4]       K. Pardee, A. A. Green, T. Ferrante, D. E. Cameron, A. DaleyKeyser, P. Yin, and J. J. Collins, “Paper-Based Synthetic Gene Networks,” Cell.

[5]       “Evolvinator – OpenWetWare.” [Online]. Available: http://openwetware.org/wiki/Evolvinator. [Accessed: 12-Jan-2015].

[6]       “Open Ephys,” Open Ephys. [Online]. Available: http://www.open-ephys.org/. [Accessed: 12-Jan-2015].

[7]       Boyden, E. “Very simple off-the-shelf systems for in-vivo optogenetics”. http://syntheticneurobiology.org/protocols/protocoldetail/35/9 [Accessed: 12-Jan-2015].

[8]       “Arduino Yun”. http://arduino.cc/en/Guide/ArduinoYun [Accessed: 12-Jan-2015].

[9]       “Can aspirin prevent heart attacks? This device may know the answer,” CNET. [Online]. Available: http://www.cnet.com/news/can-aspirin-prevent-heart-attacks-this-device-may-know-the-answer/. [Accessed: 12-Jan-2015].

[10]       M. Li, N. A. Hotaling, D. N. Ku, and C. R. Forest, “Microfluidic thrombosis under multiple shear rates and antiplatelet therapy doses.,” PloS One, vol. 9, no. 1, 2014.