Tag Archives: Arduino

Have your Arduino let you know when your package arrives


How to program your Arduino to query the FedEx API every time someone comes to your door in order to determine whether that person was delivering a package.


If you’re expecting a package and can’t be bothered to go to the door to actually check who is bothering you, Adafruit has your solution. The company has developed a guide that will teach you “how to program your Arduino to query the FedEx API every time someone comes to your door in order to determine whether that person was delivering a package. Then, you’ll program the board to use the Zendesk API to alert you if a package was delivered.”


Physically, this task is fairly straightforward, involving only an Arduino Uno (ATmega328) with a Wi-Fi shield (AT32UC3) for communication and an infrared sensor to detect whether or not someone is at your door. Setting up the software, as you might suspect, is somewhat more involved: you’ll need to create Temboo and Zendesk accounts and obtain FedEx developer keys.
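To give a sense of the trigger logic, here’s a minimal sketch (not the code from Adafruit’s guide) that watches a digital IR sensor and calls a placeholder function where the Temboo/FedEx query and Zendesk alert would go. The pin assignment and the checkForDelivery() helper are assumptions for illustration only.

// Minimal sketch: run a delivery check whenever the IR sensor trips.
// The pin number and checkForDelivery() are illustrative placeholders,
// not the actual code from the Adafruit guide.
const int IR_PIN = 2;          // digital output of an IR proximity sensor
bool lastState = LOW;

void checkForDelivery() {
  // In the real project, this is where the Temboo Choreos would run:
  // query the FedEx tracking API, then post an alert through Zendesk.
  Serial.println("Someone is at the door -- checking FedEx...");
}

void setup() {
  pinMode(IR_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  bool state = digitalRead(IR_PIN);
  if (state == HIGH && lastState == LOW) {   // rising edge: a new visitor
    checkForDelivery();
    delay(5000);                             // simple debounce/cooldown
  }
  lastState = state;
}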

If you’re thinking about doing this project, it’s much easier to obtain the FedEx keys than you might suspect, and what you need to do to set everything up is laid out in a step-by-step procedure. On the other hand, if you’re expecting something from UPS or the U.S. Postal Service, you might still need to actually go to the door and see what it is. Besides, you’ll have to get the package eventually!

For another idea on how to interface devices in your house with the Internet, why not check out this Amazon Echo controlled wheelchair experiment?

This 3D-printed, Arduino-powered robotic mower will take care of your lawn for you


Build your own Ardumower for less than $300.


Mowing the lawn: for some, it’s a nice slice of solitude and exercise; for others, an arduous task to be avoided at all costs. If you fall into that second category, then the Ardumower might be for you. According to its description, “With this download project you can build your own robotic lawn mower at a fraction of the cost that one would have to apply for a commercial one.”


The mower itself is an interesting build, with a nicely sloped canopy and driving wheels that resemble something found inside of a clock. Housed inside is an Arduino Uno (ATmega328) and a motor driver board for control. Two 12V electrical motors are used for locomotion around a yard, while another motor turns the cutting blade.

The robo-mower is kept within your yard using a boundary wire fence to tell it when it has reached the limits of its domain. As seen in the video below, it also has some obstacle avoidance capability, though it would likely be best to keep it in an area free from animals, children, and irresponsible adults!
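As a rough illustration of that reverse-and-turn behavior (not the actual Ardumower firmware), the sketch below assumes an H-bridge motor driver with one direction pin and one PWM pin per drive motor, plus a single digital “stop” signal from a bumper or boundary-wire detector.

// Illustrative obstacle/boundary reaction -- not the real Ardumower code.
// Assumes an H-bridge with a direction pin and a PWM pin per motor, and a
// digital trigger (bumper or boundary-wire detector) on TRIGGER_PIN.
const int TRIGGER_PIN = 7;
const int L_DIR = 4, L_PWM = 5;    // left drive motor
const int R_DIR = 8, R_PWM = 6;    // right drive motor

void drive(bool leftFwd, bool rightFwd, int speed) {
  digitalWrite(L_DIR, leftFwd);
  digitalWrite(R_DIR, rightFwd);
  analogWrite(L_PWM, speed);
  analogWrite(R_PWM, speed);
}

void setup() {
  pinMode(TRIGGER_PIN, INPUT_PULLUP);
  pinMode(L_DIR, OUTPUT); pinMode(L_PWM, OUTPUT);
  pinMode(R_DIR, OUTPUT); pinMode(R_PWM, OUTPUT);
}

void loop() {
  if (digitalRead(TRIGGER_PIN) == LOW) {   // hit the wire or an obstacle
    drive(false, false, 180);              // back up
    delay(800);
    drive(true, false, 180);               // pivot away
    delay(600);
  }
  drive(true, true, 200);                  // otherwise keep mowing ahead
}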

If you want to assemble one yourself, you can do so for about $250-$300 — a fraction of the cost of its commercial counterparts. A manual, which is available for $12.16, claims to give step-by-step directions to build your own Ardumower (or maybe two for larger lawns!), as well as info on how to create the boundary fence.

This machine can print pictures using drops of coffee, wine and other liquids


Just when you thought you’ve seen it all…


If you’ve ever been to a Maker Faire, then chances are you’ve stumbled upon the PancakeBot, a CNC machine that extrudes delicious art out of batter. A few years ago, RIT Assistant Professor Ted Kinsman decided that he wanted to print using something other than ink as well. His choice? Coffee, or any other material with low viscosity.


The machine itself is an xy-axis printer equipped with a solenoid liquid valve, stepper motors for positioning and an Arduino, which can store images of approximately 80×100 pixels. However, despite its mediocre resolution, it does plot human faces fairly well. The drip size, the nozzle distance and the paper that the beads of coffee extract fall onto can all be changed.

“For many years I have thought about building a machine that could paint for me,” he explains. “Since I always have leftover coffee, I thought it would be a fun medium to play with.”

What it lacks in resolution, it surely makes up for in cost — Kinsman says that it’s super inexpensive to create images. To begin, the professor snaps a picture, heightens the contrast and converts that into a PGM file that the Arduino can read. The sketch then prints a test grid, which can be modified by dropping in a PGM image and adjusting the space between drops. As MAKE: notes, the grayscale is converted to an array of dots whose darkness corresponds to the length of time that the valve of the pipette opens to release a coffee drop.


“Each of the pixels is turned into a number from 0 (no coffee) to 256 (the largest drip size). The size of each pixel is controlled by determining how long to open the drip valve for — the largest drop (and darkest pixel) requires the valve to be open for 63 milliseconds. In this way, the machine currently can do 53 different shades of coffee,” according to PetaPixel.
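Here’s a minimal sketch of that drip-timing idea (not Kinsman’s actual code), assuming the grayscale values arrive one byte at a time over serial and that the solenoid valve is switched from a single output pin; pixel darkness is mapped to valve-open time, capped around the 63-millisecond figure quoted above.

// Illustrative drip-valve timing -- not Kinsman's actual sketch.
// Assumes pixel darkness values (0-255) arrive one byte at a time over
// serial, and that the solenoid valve is driven from VALVE_PIN.
const int VALVE_PIN   = 9;
const int MAX_OPEN_MS = 63;   // darkest pixel = longest drip, per the article

void setup() {
  pinMode(VALVE_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  if (Serial.available() > 0) {
    int darkness = Serial.read();                    // 0 = no coffee
    int openTime = map(darkness, 0, 255, 0, MAX_OPEN_MS);
    if (openTime > 0) {
      digitalWrite(VALVE_PIN, HIGH);                 // release a drop
      delay(openTime);
      digitalWrite(VALVE_PIN, LOW);
    }
    delay(100);                                      // let the drop land
  }
}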

A Mariotte’s siphon is employed to ensure that the depth of the coffee in the reservoir won’t affect the pressure, which in turn could influence the size of the drops. Each print requires about an hour from start to finish, but takes roughly a day to fully dry.

Looking ahead, Kinsman would like to explore the possibility of adding another stepper motor so that he can make spirographs or use a syringe that would enable him to print with thicker liquids. But until then, you can watch it in action below (note that the machine is using blue ink) and read more about the project here.

 

Bring the weather forecast to your Chucks


Hack a pair of Converse using an Adafruit FLORA, NeoPixels and a Bluetooth LE module that relays weather data from your phone.


San Francisco-based creative studio Chapter, in collaboration with Converse, has hacked a pair of Chuck Taylors to bring the forecast to your feet.


The Converse Beacon consists of an Adafruit FLORA board (ATmega32U4), a Bluefruit LE module and a NeoPixel ring, which together, can alert you to custom weather conditions through IFTTT. In other words, your sneaks can let you know when rain is coming, when the surf is just right, or when conditions are perfect to take a stroll outside. Talk about walkin’ on sunshine!

What’s more, you’re not just limited to weather. Once you’ve connected IFTTT to the Adafruit channel, you open the door to hundreds of possible recipes that link various inputs to your NeoPixels.
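As a flavor of how the ring might respond to such a trigger, here’s a minimal sketch using the Adafruit_NeoPixel library; the 16-pixel count, pin 6 and the rainAlert flag (which would be set by the Bluefruit LE/IFTTT side of the project) are assumptions for illustration, not Chapter’s production code.

// Illustrative NeoPixel alert animation -- not Chapter's actual code.
// Assumes a 16-pixel ring on pin 6; rainAlert stands in for a trigger
// arriving over the Bluefruit LE link from IFTTT.
#include <Adafruit_NeoPixel.h>

#define RING_PIN   6
#define NUM_PIXELS 16

Adafruit_NeoPixel ring(NUM_PIXELS, RING_PIN, NEO_GRB + NEO_KHZ800);
bool rainAlert = false;   // placeholder flag for an incoming weather alert

void setup() {
  ring.begin();
  ring.show();            // start with every pixel off
}

void loop() {
  if (rainAlert) {
    for (int i = 0; i < NUM_PIXELS; i++) {        // sweep blue around the ring
      ring.setPixelColor(i, ring.Color(0, 0, 80));
      ring.show();
      delay(50);
    }
  } else {
    ring.clear();
    ring.show();
  }
}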


Think you want to relay data from your smartphone to create stylish alerts on your Chucks? Then check out Chapter’s full project write-up on Hackster.io.

The Ski Buddy is a FLORA-powered coat that teaches you to ski


A DIY wearable system that can make learning to ski fun for kids.


As anyone who has ever hit the slopes will tell you, learning to ski can be quite challenging — especially for youngsters. Tired of seeing children be screamed at by parents trying to teach them to ski, Maker “Mkarpawich2001” decided to develop a wearable system that would make the process much more enjoyable for kids.


The Ski Buddy is an electronic jacket that helps novice skiers through the use of lights. Based on an Adafruit FLORA (ATmega32U4), the coat is equipped with an accelerometer, a AAA battery pack, and conductive thread that connects to LED sequins.

“Knowing that childhood memories can unintentionally affect our adult lives, I sought out to come up with a tool to help make the process of learning to ski fun for kids at young ages,” the Maker writes. “Of course, all children love light-up toys, so why not transfer that love to learning? With changeable settings, you can use this coat for a variety of lessons.”


According to Mkarpawich2001, the Ski Buddy can be used to teach linking turns, parallel skiing, hockey stops and even gradual pizza stopping (the act of pointing your skis together and pushing your heels out to form what looks like a slice of pizza).

The lights flash once to show that they’re working, then guide the user along the desired path, including direction, speed and stops. While on the slopes, instruction is provided via the LEDs, depending on the particular lesson. For instance, alternating lights can tell the skier to slow down, while lights that are switched off can mean they’re heading the right way.
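Here’s a rough sketch of what a linked-turns drill could look like in code (not the original Ski Buddy program); the analog pin for the accelerometer axis, the sequin pins and the lean thresholds are all assumptions for illustration.

// Illustrative turn-cue logic -- not the original Ski Buddy code.
// Assumes one accelerometer axis wired to an analog pin and two LED
// sequins on digital pins; the lean thresholds are made up for this demo.
const int TILT_AXIS   = A9;   // analog output of an accelerometer axis
const int LEFT_LED    = 9;
const int RIGHT_LED   = 10;
const int CENTER      = 512;  // reading when the skier is upright
const int LEAN_AMOUNT = 120;  // how far the reading moves on a solid lean

void setup() {
  pinMode(LEFT_LED, OUTPUT);
  pinMode(RIGHT_LED, OUTPUT);
}

void loop() {
  static bool turnLeft = true;            // which turn is being cued next
  int tilt = analogRead(TILT_AXIS);

  // Light the sequin on the side the skier should turn toward.
  digitalWrite(LEFT_LED,  turnLeft);
  digitalWrite(RIGHT_LED, !turnLeft);

  // Once the accelerometer sees a solid lean in the cued direction,
  // switch the cue to the other side -- a simple linked-turns drill.
  if (turnLeft  && tilt < CENTER - LEAN_AMOUNT) turnLeft = false;
  if (!turnLeft && tilt > CENTER + LEAN_AMOUNT) turnLeft = true;
  delay(50);
}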

You can see it in action below, and head over to its page here. Those looking for a more commercial solution should check out Carv.

 

 

Trojan 77 is a gamified simulation of the Trojan virus


Inspired by the classic labyrinth game, this project highlights the most significant effects of the Trojan virus.


Developed by a team of students at the Copenhagen Institute of Interaction Design, Trojan 77 is a gamified simulation of the infamous Trojan virus — a malware that provides unauthorized remote access to a user’s computer. The game, which was originally devised as a tech museum exhibit, aims to shed light on the most important effects of the virus.


Much like the labyrinth game you played growing up, Trojan 77 simulates a few key effects of the virus, such as passwords leaking out and files being deleted, culminating in a system failure. To help explain the intricacies of the malware, the team built the project on the metaphor of a maze with players having the perspective of the hacker.

As you can see in the video below, the ball represents the Trojan virus. The player must get the ball to stop at certain touchpoints throughout the maze by tilting the structure back and forth. Each touchpoint holds valuable data, like passwords and pictures. Once a touchpoint is hit, the data can then be ‘accessed’ by the hacker. If successful, the virus will crash the system once the final touchpoint is reached.


“The idea of designing something analog to explain a digital construct was an exciting challenge to undertake. The way that computer viruses operate can be very complicated and hard to explain without overloading people with detailed information,” the team writes. “Making this information visual via animated projections helped to communicate the effects in a fun and memorable way. It also enabled us to communicate the same information to children without any negative connotations, but simply educational.”

Housed inside the wooden structure are an Arduino Uno (ATmega328) and two servo motors, controlled by a joystick that enables the tilting.
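A minimal version of that joystick-to-tilt mapping might look like the sketch below (not the team’s actual firmware); the analog pins, servo pins and tilt range are assumptions for illustration.

// Illustrative joystick-to-tilt mapping -- not the team's actual firmware.
// Assumes a two-axis analog joystick on A0/A1 and two maze-tilting servos
// on pins 9 and 10.
#include <Servo.h>

Servo tiltX;
Servo tiltY;

void setup() {
  tiltX.attach(9);
  tiltY.attach(10);
}

void loop() {
  // Map each joystick axis (0-1023) onto a gentle tilt range so the ball
  // rolls slowly enough to stop on the touchpoints.
  int angleX = map(analogRead(A0), 0, 1023, 60, 120);
  int angleY = map(analogRead(A1), 0, 1023, 60, 120);
  tiltX.write(angleX);
  tiltY.write(angleY);
  delay(15);
}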

 

Hate clapping? Simone Giertz’s latest machine is for you


Let’s give this project a round of applause! 


Guess who’s back with another robotic solution to yet another problem. Simone Giertz, of course! Any of us who’ve ever had to sit through a graduation ceremony, an hour-long presentation, a tennis match, a ballet recital or a political debate know all too well how tiresome constant applause can be.


So, as part of her aptly named “There Must Be A Better Way” series, the frequent YouTuber and Maker has developed an automated applause machine. Why? Because “clapping your own hands is tiresome and a cruel practice.”

For the mechanism itself, Giertz employed a pair of kitchen tongs and attached a metal spring below the grippers, then put an oval-shaped DC motor between the two arms. This way, when the motor spins, it forces the tongs to open and close, creating a clapping motion.

“For the machine’s hands, I wanted to find a pair that would create the most realistic clapping sound possible. So I bought four different types of plastic hands from a party-supply store. After some experimentation, I decided that hollow hands made of rigid plastic created the best noise. I fastened them to the tongs’ grippers with small bolts,” the Maker explains.

The machine was brought to life using none other than an Arduino Uno (ATmega328) connected to a MOSFET, housed inside a laser-cut base. What’s more, a slider was added to the front of the device to control the speed. According to Giertz, she can now gradually adjust the applause from a “snarky slow clap” to a “breakneck 330 claps per minute.”
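In essence, that control scheme boils down to reading the slider and writing a PWM duty cycle to the MOSFET gate, roughly like the sketch below (not Giertz’s actual code); the slider on A0 and the gate on pin 3 are assumptions.

// Illustrative clap-speed control -- not Giertz's actual sketch.
// Assumes the slider is a potentiometer on A0 and the MOSFET gate that
// drives the DC motor sits on PWM pin 3.
const int SLIDER_PIN = A0;
const int MOTOR_PIN  = 3;

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  int slider = analogRead(SLIDER_PIN);          // 0-1023 from the slider
  int speed  = map(slider, 0, 1023, 0, 255);    // slow clap ... full speed
  analogWrite(MOTOR_PIN, speed);                // PWM through the MOSFET
  delay(20);
}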

Admittedly, this may be one of her best, most practical and well-polished projects yet. We love it! Now how ‘bout a round of applause for Giertz?! You can watch the future of clapping hands below, as well as read her recent write-up in Popular Science here.

HydroMorph turns splashing water into an interactive display


This MIT team has created what they call a water “membrane” that can shift shapes instantly.


A team from MIT’s Tangible Media Group has discovered a new way to turn splashing water into an interactive display, exploiting the same phenomenon you’ve experienced if you’ve ever run a spoon under a faucet.


Using a series of actuators and sensors placed under a stream of water, HydroMorph is able to change the shapes that result whenever water splashes onto the surface of the device, creating what they call a “dynamic spatial water membrane” that can shift from a flower to a flapping bird to an interactive countdown timer.

“HydroMorph gives a life to water, giving it a voice through its shape change. We envision a world filled with living water that conveys information, supports daily life, and captivates us,” the team writes.

Aside from the water-shaping device, the system consists of a computer, a camera, an Arduino (ATmega328), and a water source. As the stream hits the device, various shapes are created based on the actuation data sent from software on the computer through the MCU. The camera, which is mounted above the system, detects physical objects and human hands around the device by distinguishing their colors.

HydroMorph itself consists of a flat circular surface and an array of 10 arrow-like modules, each composed of an actuated block, a linkage mechanism and an Arduino-controlled servo motor. These arrows are arranged in a circle, pointing upward toward the stream.


As a stream of water hits the flat surface, a membrane is formed and each module blocks the membrane to manipulate the particular shape. Using the linkage mechanism to convert the rotary motion to linear motion, servo motors enable a vertical displacement of the blocks. The software, built using Processing, generates the shapes based on the way water reacts to the height of each blocker.
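The Arduino side of that pipeline could be as simple as the sketch below (not the HydroMorph team’s actual firmware), which assumes Processing sends one frame of 10 bytes over serial, each byte a blocker height, and that the 10 servos sit on pins 2 through 11.

// Illustrative Processing-to-servo bridge -- not the actual HydroMorph code.
// Assumes each serial frame is 10 bytes, one height (0-255) per module,
// and that the module servos are attached to pins 2 through 11.
#include <Servo.h>

const int NUM_MODULES = 10;
Servo blockers[NUM_MODULES];
byte frame[NUM_MODULES];

void setup() {
  Serial.begin(115200);
  for (int i = 0; i < NUM_MODULES; i++) {
    blockers[i].attach(2 + i);
  }
}

void loop() {
  // Wait for a full frame, then convert each height into a servo angle;
  // the linkage turns that rotation into vertical blocker travel.
  if (Serial.available() >= NUM_MODULES) {
    Serial.readBytes(frame, NUM_MODULES);
    for (int i = 0; i < NUM_MODULES; i++) {
      blockers[i].write(map(frame[i], 0, 255, 0, 180));
    }
  }
}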

“Imagining this device applied in daily life or in public spaces would give, on a practical level, a more responsive and sensitive way to interact with water. On a conceptual level, HydroMorph expands the vocabulary of interactions with this everyday medium of water,” the group adds.

Some of the use cases include notifying you whether or not water is safe to drink by revealing a full-bloomed or wilted flower, extending the functionality of a faucet by filling one or more cups by directing streams of water into them, as well as revealing the weather forecast by showing the iconic shape of an umbrella or sun.

Intrigued? Head over to the project’s paper, or watch it in action below.

Hear the sound of 300 stars with Arduino


Artist Francesco Fabris created a sonic representation of stars and constellations through a dedicated interface.


Despite what some science fiction movies would have you believe, there is no sound in space. With this fact in mind, Francesco Fabris created Stellar. This interactive art installation was designed to be “a sonic representation of stars and constellations through a dedicated interface.”


This project takes the form of a cylinder with several important constellations represented below its transparent cover. Beneath this cover are two robotic arms, which are controlled by hand motions via non-contact sensors and an Arduino Uno (ATmega328). These arms are used to select the star that is seen and heard.

Once selected, several aspects of that star are analyzed, including temperature, brightness (as seen from Earth), distance (from Earth), frequency, amplitude and duration. These statistics are then represented and displayed as a sound and color. The video below shows the installation in action, or you can check out the “making of” video at the end for more insight into this project.
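As a toy example of that kind of mapping (not Fabris’s actual Arduino/Max7 patch), the sketch below turns a star’s apparent magnitude and distance into a pitch and note length with Arduino’s tone() function; the speaker pin, the mapping ranges and the two hard-coded stars are assumptions for illustration.

// Illustrative star-to-sound mapping -- not the actual Stellar/Max7 patch.
// Brighter stars (lower apparent magnitude) get higher pitches; nearer
// stars get longer notes. Pins and ranges are chosen for illustration.
const int SPEAKER_PIN = 8;     // piezo or small speaker

void playStar(float magnitude, float distanceLy) {
  // Apparent magnitude runs roughly from -1 (very bright) to 6 (barely visible).
  int pitch    = map((int)(magnitude * 10), -15, 60, 1200, 200);
  int duration = constrain((int)(300000.0 / distanceLy), 100, 2000);
  tone(SPEAKER_PIN, pitch, duration);
  delay(duration + 50);
}

void setup() {}

void loop() {
  playStar(-1.46, 8.6);        // Sirius
  playStar(0.42, 640.0);       // Betelgeuse (approximate values)
  delay(1000);
}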


“The project has been developed using Arduino and Max7 software,” Fabris explains. “Data of more than 300 stars and 44 constellations have been stored from the open-source software Stellarium.org, and coded to interact with the robotic arms.”

In addition to Fabris, several other people helped make Stellar a reality: Patrycja Maksylewicz, Przemysław Koleszka and Eloy Diez Polo. It looks like this was a huge undertaking, involving quite a bit of programming, and a lot of work at the project’s location to get everything set up.

This modified laser cutter can print complex 3D objects from powder


Rice University researchers have modified a commercial-grade CO2 laser cutter to create OpenSLS, an open source SLS platform.


Engineers at Rice University have modified a commercial-grade CO2 laser cutter to create OpenSLS, an open source selective laser sintering platform that can print complicated 3D objects from powdered plastics and biomaterials.

0222_SINTER-Osls-lg-28ae8kd-1.jpg

As impressive as that may be, what really sets this system apart is its cost. OpenSLS can be built for under $10,000, compared to other SLS platforms typically priced in the ballpark of $400,000 and up. (That’s at least 40 times less than its commercial counterparts.) To make this a reality, this DIY device is equipped with low-cost hardware and electronics, including Arduino and RAMBo boards. The Rice team provides more detail around specs and performance in PLOS ONE.

“SLS technology is perfect for creating some of the complex shapes we use in our work, like the vascular networks of the liver and other organs,” explains Jordan Miller, an assistant professor of bioengineering and the study’s co-author. He adds that commercial SLS machines generally don’t allow users to fabricate objects with their own powdered materials, which is something that’s particularly important for researchers who want to experiment with biomaterials for regenerative medicine and other biomedical applications.

To test their concept, the team demonstrated that OpenSLS is capable of printing a series of intricate objects from both nylon powder — a commonly used material for high-resolution 3D sintering — and from PCL, a nontoxic polymer that’s typically used to make templates for studies on engineered bone.

0222_SINTER-clo-lg-2g0odvn.jpg

It should be noted, however, that OpenSLS works differently than most traditional desktop 3D printers, which create objects by extruding melted plastic through a nozzle as they trace out two-dimensional patterns, building up 3D objects from successive 2D layers. By contrast, an SLS laser shines down onto a flat bed of plastic powder. Wherever the laser touches powder, it melts or sinters the powder at the laser’s focal point to form a small volume of solid material. By tracing the laser in 2D, the printer can fabricate a single layer of the final part. After each layer is complete, a new one is laid down and the laser is reactivated to trace the next layer.

The best way to think of this process, says Miller, is to think of “finishing a creme brulee, when a chef sprinkles out a layer of powdered sugar and then heats the surface with a torch to melt powder grains together and form a solid layer. Here, we have powdered biomaterials, and our heat source is a focused laser beam.”

The professor, who happens to be an active participant in the burgeoning Maker Movement, first identified commercial CO2 laser cutters as prime candidates for a low-cost, versatile SLS machine three years ago. According to Miller, that’s because the cutter’s laser already possessed the right wavelength and perfectly suitable hardware for controlling power and its axes with precision.

Intrigued? You’ll want to see it in action below, and then head over to the team’s Wiki page and GitHub repository to delve a bit deeper.

[Images: Rice University]