Tag Archives: robots

This Maker built his own robot drinking buddy

Bot-toms up!

Let’s face it, there’s no fun in drinking alone. This is what inspired South Korean Maker Eunchan Park to develop a robot that can literally go shot for shot with him, albeit never actually consuming the alcohol. Although he may not be able to chat like some of your best buds, the slick device can accompany you if you feel like throwing back a few when no one else is around.


While there have been plenty of bots capable of preparing and mixing cocktails for you in the past, we’re not sure if we’ve ever seen one that actually drinks with you instead. Not only can the aptly named Robot Drinky cheers your glass, his cheeks emit a red light with every chug and he can even signal for a refill as well.

The idea for such a companion was conceived after experiencing a lonely holiday a few years back. As Park explains:

On Christmas in 2012, I drank Soju (Korean alcohol) alone because I had no girlfriend at that time. Drinking alone was definitely terrible! So I couldn’t drink anymore.
Lastly, I put an extra glass in front of me and poured Soju into it. And then, I cheered by myself with the glass of Soju, as though there was someone in front of me. Surprisingly, after that, the taste became totally to be changed!!!!!! WOW!!!

So, I could finally find the secret of taste of alcohol totally depends on existence of partner. This is why I made this robot.

There’s no word yet on whether the Maker has any future plans for Drinky, but we wouldn’t be surprised to find it on Kickstarter or at CES in the near future. See him in action below!

This autonomous robot feeds on filthy water

The Row-bot is a self-powered robot that can eliminate pollutants and contaminants from water.

Don’t expect to find the tiny robot pictured below swimming in any bathtub or pool anytime soon; in fact, you probably won’t find it in any clean body of water. That’s right, the Row-bot thrives on pollution — the more, the merrier.


Inspired by beetles and other insects like the water boatman bug, which feeds off nutrients found in the dirty water it swims in, researchers at the Bristol Robotics Laboratory have developed an autonomous machine with hopes of eliminating pollutants and other dangerous contaminants.

When it is hungry, the Row-bot opens its soft robotic mouth and rows forward to fill its microbial fuel cell (MFC) stomach with nutrient-rich dirty water. It then closes its mouth and slowly digests the nutrients, using the bio-degradation of organic matter to generate electricity via bio-inspired mechanisms. That same electrical energy propels the Row-bot to a new location for another gulp of H2O.
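That gulp-digest-row loop only works if each mouthful yields more energy than a full cycle consumes. A toy energy-budget model makes the idea concrete (all numbers here are illustrative assumptions, not measurements from the Bristol team):

```python
# Toy model of the Row-bot's feed/digest/move cycle.
# All energy figures are illustrative assumptions, not measured values.

def simulate_rowbot(cycles, gulp_energy_mj=2.0, digest_cost_mj=0.2,
                    row_cost_mj=1.0, start_energy_mj=1.5):
    """Return stored energy (mJ) after a number of feed/move cycles."""
    energy = start_energy_mj
    for _ in range(cycles):
        energy -= row_cost_mj          # row to a fresh patch of dirty water
        if energy < 0:
            return 0.0                 # stranded: ran out before refuelling
        energy += gulp_energy_mj       # MFC converts organic matter to charge
        energy -= digest_cost_mj       # overhead of opening/closing the mouth
    return energy
```

With these made-up numbers the net gain is 2.0 - 0.2 - 1.0 = 0.8 mJ per cycle, which is the self-sustaining property the researchers are after; shrink the gulp or raise the rowing cost and the robot eventually strands.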

In order to produce the most efficient movement possible, the researchers tried to mimic the water boatman, whose legs are covered with swimming hairs that span laterally to maximize drag during the power stroke and collapse to minimize drag during the recovery stroke. But whereas the insect has hair-covered legs, the Row-bot’s propulsion mechanism consists of a 3D-printed paddle powered by a tiny 0.75W brushed DC motor.


Row-bot consists of a 3D-printed composite structure with a rigid frame supporting an elastic membrane — each paddle is stretched out to increase the paddle surface area during the power stroke. The membrane has a hinge that changes the angle of attack on the part of the paddle that remains underwater during the recovery stroke to reduce its frontal area, and therefore, its drag.
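The propulsion trick is that drag scales with the paddle’s frontal area, so a large area on the power stroke and a small one on the recovery stroke yields net thrust. A quick sketch using the quadratic drag law F = 0.5 * rho * Cd * A * v^2 shows the asymmetry (areas, speed and drag coefficient below are guesses for illustration, not values from the paper):

```python
# Quadratic drag on the paddle: F = 0.5 * rho * Cd * A * v^2.
# Areas, speed and drag coefficient are illustrative guesses,
# not the paper's measurements.

def drag_force(area_m2, speed_ms, rho=1000.0, cd=1.2):
    """Drag force (N) on a flat paddle moving through water."""
    return 0.5 * rho * cd * area_m2 * speed_ms ** 2

power_stroke = drag_force(area_m2=0.0020, speed_ms=0.3)  # membrane stretched out
recovery     = drag_force(area_m2=0.0005, speed_ms=0.3)  # hinge folds the paddle

net_thrust = power_stroke - recovery  # the asymmetry that propels the Row-bot
```

Because the assumed stretched area is four times the folded one, the power stroke produces four times the drag of the recovery stroke, leaving a net forward impulse each cycle.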

This robot has plenty of practical applications, such as remote sensing and environmental monitoring. The Row-bot can be used in any kind of water, from fresh to salt to waste water. For instance, a fleet of them could be thrown into a polluted pond and rove for months, feeding on the filth and cleaning as they go.

“The work shows a crucial step in the development of autonomous robots capable of long-term self-power. Most robots require re-charging or refuelling, often requiring human involvement,” explains Jonathan Rossiter, Professor of Robotics at the University of Bristol and BRL.

Just think of the possibilities… Head over to the Row-bot’s official paper here to read more.

Phiro is a LEGO-compatible robot for kids

Phiro is a LEGO-compatible robotics toy that kids can play with, code and customize in various ways.

Research shows that one of the most effective ways for kids to learn problem-solving is through robotics and coding. This is an area in which sisters Deepti Suchindran and Aditi Prasad, founders of Boston-based startup Robotix, hope Phiro can play an integral role. The LEGO-compatible kit will enable the future generation to program and solve challenges in a more engaging and interactive manner, whether that means making a movie or cleaning their room.


Robotix has acquired many years of experience teaching coding and robotics at several K-12 schools. Along the way, the team discovered that such gadgets are usually expensive, use proprietary programming languages and are not much fun for their young user base. Robotix is looking to change that with an affordable robotics toy that will help kids learn to code and develop computational thinking skills. Young Makers will be able to enhance their coding skills in five different ways, either without a computer or with open source programming languages.

With Phiro, children can play music, create games, flash lights, detect faces and much more. The combination of programming and playing with such a toy will empower the next generation to pursue STEM-related disciplines and to become the innovators of tomorrow.


And so, Robotix has launched a pair of ATmega2560-powered robots for two different age groups: Phiro Unplugged and Phiro Pro. Both units come fully assembled and are ready for use right out of the box. Phiro Unplugged is designed for those between the ages of four and eight, and is an excellent instructional tool for sequential programming and binary coding. The best part is that it can all be achieved without a computer. Meanwhile, Phiro Pro shares many of the same qualities as the Unplugged and then some.
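Sequential programming of the Unplugged sort boils down to executing a fixed list of movement commands in order. Here’s a minimal, hypothetical interpreter for such a program (the command names are invented for illustration, not Phiro’s actual instruction set):

```python
# Toy interpreter for a Phiro-style sequential program on a grid.
# The command names are hypothetical, not Phiro's real instruction set.

def run_program(commands, start=(0, 0), heading=(0, 1)):
    """Execute FORWARD / LEFT / RIGHT commands; return the final position."""
    x, y = start
    dx, dy = heading
    for cmd in commands:
        if cmd == "FORWARD":
            x, y = x + dx, y + dy
        elif cmd == "LEFT":              # rotate heading 90 degrees counter-clockwise
            dx, dy = -dy, dx
        elif cmd == "RIGHT":             # rotate heading 90 degrees clockwise
            dx, dy = dy, -dx
        else:
            raise ValueError(f"unknown command: {cmd}")
    return (x, y)
```

A child laying out cards reading FORWARD, RIGHT, FORWARD is, in effect, writing this list of commands; the robot simply walks the list.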

Geared toward Makers aged nine to 18, Phiro Pro can be programmed from a computer, tablet or smartphone, connecting wirelessly over Bluetooth to an assortment of programming languages: Scratch 2.0 (MIT), Snap4Arduino (UC Berkeley/Citilab) and the Pocket Code mobile apps (Graz University of Technology). Learners can also link to an online community that encourages collaboration, sharing, and of course, more education.

Perhaps one of its greatest selling points is its LEGO compatibility. For instance, Phiro lets you transform your robot into a bulldozer or snow plow with LEGO attachments, and command it to navigate your room and clean your things!


“Be endlessly creative and transform Phiro into an animal, alien, car, join your tea party, or anything you imagine with Phiro’s LEGO-compatible connector. Kids can personalize their own Phiro robots,” the Robotix crew writes. “Want speed? Create code for a remote control in Scratch 2.0, Snap4Arduino, Pocket Code mobile app’s and gear up Phiro with LEGO parts and watch your race car go!”

Want an awesome bot of your own? Head over to Phiro’s Kickstarter campaign, where Robotix is currently seeking $50,000. Delivery is slated for May 2016.

CellRobot is the world’s first smart modular robot

Assemble and control your own robot with the shapes you imagine and the functions you want.

When it comes to the Maker Movement, modularity is king. Think littleBits. Think Microduino. Think Modulo. Think RePhone. But why stop at DIY devices? Sharing many of the same principles, Chinese startup KEYi Technology has introduced a customizable, reconfigurable robot that encourages users to devise their own creation, piece by piece.


As its name would suggest, CellRobot is comprised of various spherical “cells” that can be assembled together to make a wide variety of robots. Once connected, they can be manipulated into different shapes and programmed to perform an assortment of functions. Each building block ball has a 360-degree angle sensor, a servo motor and an MCU, and can attach to any other cell at any angle thanks to its convenient snap joint system.
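One way to picture an assembly like this is as a graph of cells, each carrying its own joint angle and a list of neighbors it snaps onto. The sketch below is an assumed model for illustration, not KEYi’s actual software:

```python
# Sketch of a CellRobot assembly as a graph of cells.
# This is an assumed model for illustration, not KEYi's firmware.

class Cell:
    def __init__(self, cell_id):
        self.cell_id = cell_id
        self.angle = 0.0          # reported by the 360-degree angle sensor
        self.neighbors = []       # cells attached via the snap joints

def connect(a, b):
    """Snap two cells together (attachment is symmetric)."""
    a.neighbors.append(b)
    b.neighbors.append(a)

def cell_count(heart):
    """Count cells reachable from the heart, e.g. to size the app's 3D view."""
    seen, stack = set(), [heart]
    while stack:
        cell = stack.pop()
        if cell.cell_id in seen:
            continue
        seen.add(cell.cell_id)
        stack.extend(cell.neighbors)
    return len(seen)
```

Walking the graph from the heart is one plausible way the companion app could discover the robot’s current shape before rendering it in 3D.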

At the core of every pieced-together project is the “heart,” which supplies power to all of the other modules and wirelessly syncs the robot to its accompanying app. It features a charge port, a tiny speaker, a power indicator, a small screen and standard snap-twist connectors. Additionally, the heart houses a Bluetooth 4.0 module for communicating with your phone or tablet, as well as Wi-Fi and ZigBee for linking to other cells.


CellRobot also offers what the team calls “X-cells,” add-on modules that expand the functionality of the cells with wheels, connectors, spotlights and even a camera. All of the cells and supplemental parts are controlled through a companion iOS and Android app, which comes with two different modes — one that guides you through the building process and another that lets you create any shape or command — as well as a library of preexisting shapes to choose from. Whichever mode you’re using, the app recognizes the state and shape of your robot and visualizes it in 3D on your mobile device.

The question is: why only have one robot when you can have 100? Head over to CellRobot’s Kickstarter campaign, where KEYi Technology is currently seeking $75,000.

This 3D-printed robot is powered by an ATtiny85

Canbot is an Ollie-like robot that can autonomously drive itself. 

Maxmillian Kern has created an adorable, 3D-printed robot that rolls its way across hard surfaces. The Sphero Ollie-like device, aptly named Canbot, is based on an ATtiny85 MCU and comprised of only four 3D-printed parts for the body and wheels, all connected by screws. The “heavier” components, including the servos and 3.7V battery, are embedded in the lower half to help it remain balanced.


Not only can it autonomously drive itself around via its ultrasonic sensor, but the Maker can also steer it with a modified TV remote, with commands received from the infrared signal of the old clicker. As for power, he originally considered adding some kind of plug to program and charge the robot, but settled on a switch due to space constraints.

“There still is a problem with the weight distribution. I put a piece of lead in the front but that didn’t make it much better,” Kern writes. “It needs some kind of stabilization. But that’s difficult with an ATtiny that only has 5 [technically 6] I/O pins. You would have to sacrifice the ultrasonic sensor with a gyro board. There are lots of possible improvements. The first thing would be nice geared motors instead of servos.”
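The pin crunch Kern describes is easy to verify with back-of-the-envelope arithmetic (the per-peripheral pin counts below are typical assumptions for such a build, not his exact wiring):

```python
# Rough I/O budget for an ATtiny85 Canbot-style build.
# Pin counts per peripheral are typical assumptions, not Kern's exact wiring.

ATTINY85_IO_PINS = 6   # technically 6; only 5 are freely usable if the
                       # reset pin is left configured as reset

peripherals = {
    "ultrasonic sensor": 2,   # trigger + echo
    "left servo": 1,
    "right servo": 1,
    "IR receiver": 1,
}

used = sum(peripherals.values())
free = ATTINY85_IO_PINS - used
```

With five pins already spoken for, only one remains, which is why adding an I2C gyro board (two pins) would mean sacrificing the two-pin ultrasonic sensor, exactly the trade-off Kern laments.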

Interested in building your own? You can find all of Canbot’s files on Thingiverse here.

RoboCup 2015 kicks off in China

An American team beats an Iranian squad 5-4 in this year’s RoboCup final. 

Hot on the heels of its Women’s World Cup victory, the United States has another piece of soccer hardware to add to its collection this year. That’s because a team of American humanoid robots has defeated an Iranian squad in the finals of the 2015 RoboCup, held in Hefei, China.


For those unfamiliar with the event, RoboCup is an annual international robotics competition that was first proposed in 1995 and founded in 1997, bringing together engineers, students and Makers alike as they compete against one another via humanoids on a small-scale, artificially-turfed indoor football pitch. The ultimate goal is that one day maybe, just maybe, they will be skilled enough to beat an actual soccer team.

The “players,” which vary in shape and size, compete in one of eight divisions. Some require the competitors to develop robots with the same hardware but different software, while others call for the machines to be built from scratch.

When robots initially began playing soccer, it was a feat in itself just to have them see the ball, let alone stay upright and kick. Nowadays, these bots are capable of autonomously running up and down the field, scoring goals and even sensing when the ball is nearby. However, as you can see from the video below, they’ve still got some ways to go before they get past Tim Howard or Hope Solo.

This year’s showdown attracted over 300 teams from 47 countries, with a group from the University of Pennsylvania (named THORwIn) beating a team from Iran in a hard-fought 5-4 battle between adult-sized bots. What’s more, RoboCup’s opening ceremony featured various robot technologies, while two banners for the occasion were held by drones.

[Image: IB Times]

Why do drones love the Atmel SAM E70?

Eric Esteve explains why the latest Cortex-M7 MCU series will open up countless capabilities for drones other than just flying. 

By nature, avionics is a mature market requiring validated system solutions: safety is an absolute requirement, and innovative systems must pass a stringent qualification phase. That’s why the very fast adoption of drones as an alternative to human-piloted planes is impressive. It took only 10 or so years for drones to become widely developed and employed for various applications, ranging from war to entertainment, with prices spanning from a few hundred dollars to several hundred thousand. But even if we consider consumer-oriented, inexpensive drones, the required processing calls not only for a high-performance MCU but a versatile one as well, capable of managing a built-in gyroscope, accelerometer, geomagnetic sensor, GPS, rotational station, four- to six-axis control, optical flow and so on.


When I was designing for avionics, namely the electronic control of the CFM56 engine (jointly developed by GE in the U.S. and Snecma in France, and equipping Boeing and Airbus planes), the CPU was a multi-hundred-dollar Motorola 68020, leading to a $20-per-MIPS cost! While I may not know the Atmel | SMART SAM E70 price precisely (I would guess it costs a few dollars), what I do know is that the MCU offers in excess of 600 DMIPS. Aside from its high performance, this series boasts a rather large on-chip memory of up to 384KB of SRAM and 2MB of Flash, just one of many pivotal reasons this MCU has been selected to support the “drone with integrated navigation control to avoid obstacle and improve stability.”
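It’s worth making that cost comparison explicit. Since the SAM E70 price here is only a guess, treat the result as an order-of-magnitude estimate rather than a precise figure:

```python
# Cost-per-DMIPS comparison from the article.
# The 68020 price and the SAM E70 price are rough/guessed figures.

mc68020_price_usd = 400.0    # a "multi-hundred dollar" CPU of the CFM56 era
mc68020_mips = 20.0          # implied by the quoted $20-per-MIPS cost

same70_price_usd = 5.0       # "a few dollars" -- the author's guess
same70_dmips = 600.0         # "in excess of 600 DMIPS"

old_cost_per_mips = mc68020_price_usd / mc68020_mips      # $20 per MIPS
new_cost_per_dmips = same70_price_usd / same70_dmips      # under a cent
improvement = old_cost_per_mips / new_cost_per_dmips
```

Even with these crude assumptions, the cost per (D)MIPS has dropped by a factor of a few thousand since the 68020 days.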

In fact, the key design requirements for this application were: 600+ DMIPS, a camera sensor interface, dual ADC and PWM for motor control and dual CAN, all bundled up in a small package. Looking at the block diagram below helps link the MCU features with the various application capabilities: gyroscope (SPI), accelerometer (SPI x2), geomagnetic sensor (I2C x2), GPS (UART), one- or two-channel rotational station (UART x2), four- or six-axis control communication (CAN x2), voltage/current (ADC), analog sensor (ADC), optical flow sensor (through the image sensor interface, or ISI) and pulse width modulation (PWM x8) to support the rotational station and four- or six-axis speed PWM control.
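Tallying those requirements gives a quick sanity check of how much a single MCU has to provide (the counts simply follow the block-diagram description above):

```python
# Tally the drone's peripheral needs from the article against what one
# SAM E70 must provide. Counts follow the block-diagram description.

required = {
    "SPI":  1 + 2,   # gyroscope + two accelerometer channels
    "I2C":  2,       # geomagnetic sensor
    "UART": 1 + 2,   # GPS + one/two-channel rotational station
    "CAN":  2,       # four/six-axis control communication
    "ADC":  2,       # voltage/current + analog sensor
    "PWM":  8,       # rotational station + axis speed control
    "ISI":  1,       # optical-flow camera sensor interface
}

total_interfaces = sum(required.values())
```

Twenty-one distinct interfaces on one small-package chip is exactly the kind of peripheral density the author is praising.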

For those of you who may not know, the SAM E70 is based on the ARM Cortex-M7, a core that combines superior performance with extensive peripheral sets supporting multi-threaded processes. It’s this multi-thread support that will surely open up countless capabilities for drones beyond simply flying.

Atmel | SMART ARM Cortex M7 SAM E70

Today’s drones already possess the ability to soar through the air or stay stationary, snapping pictures or capturing HD footage. It’s already very impressive to see sub-kilogram devices offering such capabilities! However, the drone market is looking ahead and preparing for the future, with the desire to get more application stacks into the UAVs so they can take on automation, routing, cloud connectivity (when available), 4G/5G and other wireless functionality to enhance data pulling and posting.

For instance, imagine a small town of a few thousand inhabitants where, for a couple of days or weeks per year, a special event or holiday brings a hundred thousand people storming into the area. These folks want to feed their smartphones with multimedia or share live experiences by sending movies or photos, most of them at the same time. The 4G/5G and cloud infrastructure is not tailored for such a crowd, so the communication system may break down. Yet, this problem could be fixed by simply calling in drone backup to reinforce the communication infrastructure for that period of time.

While this may be just one example of what could be achieved with the advanced usage of drones, each of these innovative applications will be characterized by a common set of requirements: high processing performance, large SRAM and Flash memory capacity, and extensive peripheral sets supporting multi-threaded processes. In this case, the ARM Cortex-M7 based SAM E70 MCU is an ideal choice, with processing power in excess of 640 DMIPS and large on-chip SRAM (up to 384KB) and Flash (up to 2MB) for managing all sorts of sensors, navigation, automation, servos, motors, routing, adjustments, video/audio and more.

Intrigued? You’ll want to check out some of the products and design kits below:

This post has been republished with permission from SemiWiki.com, where Eric Esteve is a principal blogger as well as one of the four founding members of SemiWiki.com. This blog first appeared on SemiWiki on July 18, 2015.

This cockroach-inspired robot can make its way through obstacles

These cockroach-like robots could be used for everything from monitoring fields to search and rescue missions.

Inspired by discoid cockroaches, researchers at UC Berkeley have created a robot that can use its body shape to slip through tight spaces using natural parkour moves. Equipped with the insect’s characteristic rounded shell, the running robot can successfully complete a grass-like obstacle course, without the need for additional sensors or motors.

(Source: UC Berkeley)

The Berkeley team, led by postdoctoral researcher Chen Li, hopes that the robot will one day inspire the design of future terrestrial robots that can be used in any number of applications, ranging from search and rescue operations to monitoring the environment. While many terrestrial robots have been developed with the ability to avoid obstacles in the past, very few have ever actually traversed them.

“The majority of robotics studies have been solving the problem of obstacles by avoiding them, which largely depends on using sensors to map out the environment and algorithms that plan a path to go around obstacles,” Li explains. “However, when the terrain becomes densely cluttered, especially as gaps between obstacles become comparable or even smaller than robot size, this approach starts to run into problems as a clear path cannot be mapped.”

(Source: UC Berkeley)

Whereas many robots are able to work on flat surfaces with a few obstacles, in nature, cockroaches and other small animals often have to navigate environments cluttered with shrubs, leaf litter, tree trunks and fungi. So for their study, the researchers employed high-speed cameras to film the movement of the discoid cockroaches through an artificial course comprised of tall, grass-like beams with limited spacing. The cockroaches were fitted with three different artificial shells to observe how their movement was affected by various body shapes, including an oval cone, a flat oval and a flat rectangle.

When the cockroaches were left unmodified, the researchers discovered that, although they sometimes pushed through or climbed over the fake grass, they most frequently used fast and effective natural parkour moves to slip by the obstacles. In these situations, the insects rolled their bodies so that their thin sides could slide through the gaps and their legs could push off the beams to help them maneuver.

(Source: UC Berkeley)

They found that with the flat oval and flat rectangular bodies, the robot often could not traverse the beams and frequently collided with the objects in its way, often becoming stuck. Conversely, when fitted with the cockroach-esque rounded shell, the six-legged robot was able to successfully get through the course using a roll maneuver similar to the cockroaches’. This adaptive behavior came about with no change to the robot’s programming, showing that the behavior came from the shell alone.

Looking ahead, the researchers hope to follow up this discovery by searching for other shapes in nature that could enhance the robots’ ability to advance through difficult terrain.

“There may be other shapes besides the thin, rounded one that are good for other purposes, such as climbing up and over obstacles of other types. Our next steps will be to study a diversity of terrain and animal shapes to discover more terradynamic shapes, and even morphing shapes. These new concepts will enable terrestrial robots to go through various cluttered environments with minimal sensors and simple controls,” Li adds.

The first results of the robot’s performance were shared in IOP Publishing’s journal Bioinspiration & Biomimetics.

Build a walking robot with credit cards and an ATmega328

Unlike some POS terminals, this robot takes Visa, Mastercard and Discover.

Writing for MAKE: Magazine, Jeremy Cook has revealed another way that your credit card may wander off, other than pickpocketing, of course. The brainchild of the Maker “Roger’s Home,” Monster Chan is a wallet-sized, AVR-based robot that is actually capable of walking away on its own.


The body of the DIY device is comprised of two expired credit cards along with a set of electronic components. An ultrasonic sensor attached to a servo is employed as its head and tasked with navigating the terrain with its paperclip legs.


Between the pair of plastic pieces lie an Arduino-compatible VISduino Uno board (ATmega328) and a sensor shield serving as its brains, a battery box for its power supply, an IR sensor for remote commands and six servos.


A set of middle servos seems to handle the movement and turning of the budget-friendly robot, as it makes its way left and right and propels itself forward with the aid of its other legs.


If somehow your credit cards vanish, not to worry. Cook jokes, “It looks like it would be very hard to use in a reader.” You can see it in action below!

Control your own swarm of robots with a swipe of your finger

Using a tablet and a red beam of light, researchers have created a system that enables people to control an army of robots with the swipe of a finger.

Have you always wanted to control your own swarm of tiny robots? Thanks to Georgia Tech researchers, not only is it possible, it’s as easy as swiping your finger across a tablet. While commanding an army of Terminator-like machines may be cool and all, these bots were designed to work in unison to accomplish a common objective in industrial and agricultural settings, as well as in disaster recovery missions.


As the team reveals, leading the mob of robots is pretty straightforward, only calling for a mobile device and a red beam of light. An operator taps the tablet to control where a beam of light appears on a floor. The swarm then rolls toward the illumination, constantly communicating with each other and deciding how to evenly cover the lit area. When the user swipes the tablet to drag the light across the floor, the robots follow. If the operator puts two fingers in different locations on the tablet, the machines will split up to cover both areas and repeat the process.

“It’s not possible for a person to control a thousand or a million robots by individually programming each one where to go,” explained Magnus Egerstedt, Schlumberger Professor in Georgia Tech’s School of Electrical and Computer Engineering. “Instead, the operator controls an area that needs to be explored. Then the robots work together to determine the best ways to accomplish the job.”
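Coverage problems like this are often solved with Lloyd-style iterations: each robot repeatedly moves toward the centroid of the lit points closest to it, so the swarm spreads evenly over the illuminated area. The sketch below shows one such iteration; it is a generic illustration of the technique, not Georgia Tech’s actual code:

```python
# One Lloyd-style coverage step: each robot moves part-way toward the
# centroid of the lit floor points closest to it. Generic sketch of the
# technique, not the Georgia Tech lab's actual algorithm.

def coverage_step(robots, lit_points, gain=0.5):
    """Return updated robot positions after one coverage iteration."""
    assignments = {i: [] for i in range(len(robots))}
    for p in lit_points:
        nearest = min(range(len(robots)),
                      key=lambda i: (robots[i][0] - p[0]) ** 2 +
                                    (robots[i][1] - p[1]) ** 2)
        assignments[nearest].append(p)      # p belongs to this robot's region

    new_positions = []
    for i, (x, y) in enumerate(robots):
        pts = assignments[i]
        if pts:                              # move toward "my" region's centroid
            cx = sum(p[0] for p in pts) / len(pts)
            cy = sum(p[1] for p in pts) / len(pts)
            x, y = x + gain * (cx - x), y + gain * (cy - y)
        new_positions.append((x, y))
    return new_positions
```

Splitting the light into two spots (the two-finger gesture) simply changes `lit_points`; the same update rule then pulls the swarm apart into two groups with no per-robot reprogramming.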

Professor Egerstedt envisions various use cases for such bots, including tsunami-ravaged regions. For instance, the robots could search for survivors, dividing themselves into equal sections. If the help of machines were suddenly required in a new area, a single person could instantly redeploy them. Another prime example is employing the robots for agriculture. Given that the technology is simple to use, any farmer could now streamline the crop-checking process without having to physically walk down each field.

Impressively, what sets the Georgia Tech algorithm apart from others is that the robots can change their minds. In other words, if a user sends them to one area, that operator can quickly change their path with just a swipe to send them somewhere else. Egerstedt adds, “The field of swarm robotics gets difficult when you expect teams of robots to be as dynamic and adaptive as humans. People can quickly adapt to changing circumstances, make new decisions and act. Robots typically can’t. It’s hard for them to talk and form plans when everything is changing around them.”

What’s more, the tablet-based control system is geared toward everyone — even those without robotics experience. Now, if only we could deploy a similar army of bots to clean the house… Intrigued? Read more from the Georgia Tech team here, and head over to its official page to keep up to date with the latest projects from the GritsLab.