Tag Archives: robotics

Students design a hybrid exploration robot with Arduino

A team of mechanical engineering students at the University of Pennsylvania has developed a search and rescue robot that overcomes many of the limitations seen in modern designs, using an ATmega32U4-based Arduino unit. The Hybrid Exploration Robot for Air and Land Deployment (H.E.R.A.L.D.) combines a quadrotor and a robotic snake to enable movement both through and over obstacles while also surveying the terrain from the skies.


With guidance from their professor, Dr. Mark Yim, the team set out to build a search and rescue robot that would “be able to traverse uneven and unstable terrain, avoid damaging obstacles and fit through narrow spaces,” all while communicating with the user and remaining light enough for the average human to carry. With saving lives a major goal, the robot also needed to move at a speed that would not hinder the search and rescue process.

The team integrated the two robotic designs to limit the flaws of each individual construct. While quadrotors are known to have short battery life, the team’s system “allows for the quadrotor to be carried by two snakes while not in use, providing increased battery life without sacrificing mobility.” The snake itself would also be limited as to what kind of terrain it could climb, therefore the quadrotor is equipped to carry the snake over large obstacles or debris.


In order to achieve this high degree of maneuverability, the snake was designed with seven degrees of freedom: two vertically actuated (pitch) servos, two horizontally actuated (yaw) servos, and three drive motors. As its creators reveal, these motors are incorporated into a mainly 3D-printed design that aims to optimize structural integrity while minimizing weight.

“Integrated treads on the wheel rims prevent excessive slip and provide edge-catching capability for obstacle clearance. The servo coupling arm acts as a bracket between the two steering actuators while providing a docking interface between the snake and quadrotor.”

A custom-made PCB, designed in Eagle, commands the snake via an Arduino Micro (ATmega32U4) and communicates wirelessly with the user over an XBee radio. As for software, the team writes, “On each snake robot, we have an Arduino microcontroller running custom-written software in C++.” The team further details, “This low-level embedded software takes motor commands from a serial packet and outputs to the snake’s motors.”
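The team’s actual packet format isn’t published, but the idea of pulling per-motor commands out of a serial packet can be sketched in plain C++. The layout below is entirely hypothetical: a header byte, one command byte per degree of freedom (two pitch servos, two yaw servos, three drive motors), and an XOR checksum.

```cpp
#include <array>
#include <cstdint>
#include <optional>

// Hypothetical packet layout (the team's real format is not published):
// [0]    0xA5 header byte
// [1..7] one command byte per degree of freedom
// [8]    checksum: XOR of bytes 1..7
constexpr uint8_t kHeader = 0xA5;
constexpr int kDof = 7;

std::optional<std::array<uint8_t, kDof>> parsePacket(const uint8_t* buf, int len) {
    if (len != kDof + 2 || buf[0] != kHeader) return std::nullopt;
    uint8_t sum = 0;
    for (int i = 1; i <= kDof; ++i) sum ^= buf[i];
    if (sum != buf[kDof + 1]) return std::nullopt;  // corrupted over the radio link
    std::array<uint8_t, kDof> cmds{};
    for (int i = 0; i < kDof; ++i) cmds[i] = buf[i + 1];
    return cmds;
}
```

On the real hardware the validated command bytes would then be written out to the servos and drive motors; rejecting malformed packets matters because the XBee link can drop or garble bytes.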


The quadrotor itself runs on ArduPilot, an open-source Arduino-based system for operating DIY flying vehicles. After tweaking a few aspects of the software, the team was able to get their desired flight time of approximately 20 minutes. Using a series of magnets, the quadrotor can also carry the snakes for up to 10 minutes.

The team will continue to develop the search and rescue capabilities of the H.E.R.A.L.D., but this combination already proves that we have barely scratched the surface of robotic design possibilities. Interested in learning more? The UPenn students’ entire project breakdown can be found here.

Pinokio is an animatronic lamp that can track people

As kids filled with thoughts from Disney movies, we all imagined that our household items might one day magically come alive and interact with us. Makers Shanshan Zhou, Adam Ben-Dor, and Joss Doggett have now made that dream a reality with their face-tracking lamp, appropriately dubbed Pinokio.


While it may appear to be an everyday desktop item, this “attention-seeking robot” sports a webcam where traditional lamps usually have a bulb. The webcam tracks the motion of individuals within the room and seeks out human faces. Once it finds one, the Pinokio is programmed to lock onto the face and follow its movement.

According to its creators, customized computer code and electronic circuit design “imbues Pinokio with the ability to be aware of its environment, especially people, and to express a dynamic range of behavior.”


In the event that someone hides their face behind their hands, or, say, turns around, the Pinokio will curiously seek out another face within the room. Think of this high-tech lamp as having the same mentality as an eager puppy! Just like a puppy, if the Pinokio loses track of a face, a few quick claps will snap it right back to attention.

Zhou says that the Pinokio “compares so greatly to interacting with a real personality, interacting with a real animal, rather than a semi-intelligent toy.” This lifelike personality comes from the onboard [Atmel-based] Arduino that is programmed to procedurally manipulate six servo motors. The lamp can even be toggled into “introvert” or “extrovert” modes, which will drastically alter the movement decision-making process.
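The makers haven’t released Pinokio’s control code, but the core of procedural face-following is a simple proportional loop: nudge each servo toward the face’s offset from the frame center. A minimal sketch for a single pan servo follows; the gain constant and the 0-180 degree travel are invented for illustration, not the makers’ values.

```cpp
#include <algorithm>

// Minimal proportional face-tracker for one pan servo (illustrative only).
struct PanServo {
    double angleDeg = 90.0;                // start centered
    static constexpr double kGain = 0.05;  // degrees per pixel of error (assumed)

    // faceOffsetPx: the face's horizontal offset from the frame center,
    // as reported by whatever face-detection code runs on the host.
    void track(int faceOffsetPx) {
        // Move proportionally toward the face, clamped to the servo's travel.
        angleDeg = std::clamp(angleDeg + kGain * faceOffsetPx, 0.0, 180.0);
    }
};
```

In the real lamp, six such loops (one per servo) running together would produce the coordinated, lifelike motion described above; the “introvert”/“extrovert” toggle could plausibly be implemented by scaling the gain.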


While the Pinokio may represent just an early step in human-robot interaction, Zhou notes, “I do believe with future robots or human-machine-interaction, we should look into our natural interaction with something that is alive, such as animals and children. I believe getting artists and designers involved in mechanics and electronics creations will give machines of tomorrow a bit more humanity.”

“In the end we may ask: Is Pinokio only a lamp? A useful machine? Perhaps we should put the book aside and meet a new friend.”

Watch out for those snake robots!

Every engineer loves robots; robotics is one of the few disciplines that mechanical, electrical, and software engineers all admire. There is a class of machines called snake robots, so named because their means of locomotion resembles the way a snake moves. One such robot, Wheeko, was recently unveiled by the folks at NTNU, the Norwegian University of Science and Technology, the self-same place that Vegard Wollan, co-inventor of the AVR microcontroller chip, attended before starting at Atmel.


Wheeko, a snake robot developed at the Norwegian University of Science and Technology.

When I asked a Norwegian co-worker if Wheeko might have Atmel microcontrollers in it, he was not sure about Wheeko, but pointed out an earlier robot at NTNU, the Anna Konda, which was run by eleven ATmega128 AVR chips.

The Anna Konda was intended as a fire-fighting robot that could crawl through burning or collapsed buildings. There are other applications as well, anywhere that a robot has to work in confined spaces.

So whether Wheeko goes to Mars or his little sister crawls through your veins, you can bet there will be a snake robot in your future.

Humanoid robot Nao can drive its own car

We already know that robot cars will be taking over the highways and byways in the coming years, but what about robot drivers? Aldebaran Robotics and RobotsLab have recently partnered to unveil a version of the NAO robot that can autonomously drive a miniature BMW Z4.


The vehicle has an integrated laser range finder linked to an onboard Atmel based Arduino, which analyzes the vehicle’s surroundings and then relays steering inputs to the NAO unit in the driver’s seat. Additionally, the robot features a two-camera computer vision system, a sonar distance sensor, two infrared emitters and receivers, nine tactile and eight pressure sensors.
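The exact control logic linking the laser range finder to the steering inputs hasn’t been detailed, but one plausible Arduino-side rule is to steer toward the bearing with the greatest clearance in the scan. The sketch below is an assumption for illustration, not Aldebaran’s or RobotsLab’s actual code, including the normalization of the steering output.

```cpp
#include <cstddef>
#include <vector>

// Illustrative steering rule: given laser range readings sweeping from
// full left (index 0) to full right, steer toward the bearing with the
// greatest clearance.
// Returns a command in [-1, 1]: -1 = hard left, 0 = straight, +1 = hard right.
double steerTowardClearance(const std::vector<double>& rangesM) {
    if (rangesM.size() < 2) return 0.0;  // not enough readings to compare
    std::size_t best = 0;
    for (std::size_t i = 1; i < rangesM.size(); ++i)
        if (rangesM[i] > rangesM[best]) best = i;
    const double mid = (rangesM.size() - 1) / 2.0;
    return (best - mid) / mid;  // normalized offset from straight ahead
}
```

A real implementation would also smooth the scan and rate-limit the command before relaying it to the NAO at the wheel, but the clearance-seeking idea is the essential part.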

While the robot and mini sports car work just fine out of the box, the entire platform is designed to be open source. “Bring your craziest ideas to life, send your robot to do things for you, connect him to the internet and share his adventures with the world, create robot-apps and publish them on the NAO app store. There’s no limit to what you can do with that,” the team at RobotsLab writes.


The robot and ride pair will cost around $10,000, but if you are purchasing directly from RobotsLab, you can enter the code “TechCrunchie2014” at checkout to drop $2,000 from the price. What a steal! No matter the price, the open-source platform makes this Atmel-powered robotic tool ideal for implementation in STEM curriculums in high schools or universities.

For more information about this project, check out this overview from RobotsLab.

Flames and acid can’t stop this soft robot

This robot may not look all that intimidating at first glance, but beware: it is as resilient as they come. A group from Harvard University’s Whitesides Research Group has unleashed its latest indestructible design upon the world: a pneumatically powered, fully untethered mobile soft robot. In other words, a quadruped that can stand up and walk away from its designers.

(Source: Harvard)


In the original design, the robot relied on a tethered air compressor, while the newest iteration of the crawler possesses an internal compressor that inflates the silicone skin. With more parts being incorporated inside the design’s silicone shell, there is little that can slow down this amorphous android.

The prototype’s design encompassed a complete set of functional elements — including body, power source, control system, and sensors.

“The body of our soft robot consists of four legs connected to a central body, each of which is actuated by a Pneu-Net, in a configuration identical to our previous, tethered quadrupedal soft robot design. In order to increase the rate of actuation of the larger untethered robot, we used a Pneu-Net design that allows for actuation at lower pressures, and with less volumetric flow of gas into the Pneu-Nets, than our prior design,” the researchers note. “The spine of the robot is actuated by two parallel Pneu-Nets with space between them to accommodate the power supply, control board, and two air compressors.”

(Source: Harvard)


The soft robot measures in at 25.6 inches, significantly larger than its predecessor. The silicone shell housing the unit has been tested at sub-zero temperatures, in 40 km/h winds, and against 3,000-kelvin flames for up to 50 seconds. That’s almost 5,000°F! Researchers have even set the robot down in snow, submerged it in water, walked it through flames, and run it over with a car. Oh, and this little fellow is resistant to acid, in case you thought you had a bright idea on how to discourage him. After every single experiment, it emerged unscathed.

According to the report, a custom, lightweight board was designed to control the mini air compressors and solenoid valves that actuate the soft robot. An ATmega168 MCU on the controller board with an Arduino bootloader was used for uploading, storing, and executing programs to control the soft robot. Control programs were stored in the onboard memory of the controller.

“These programs, written and uploaded using the Arduino interface, consisted of sequences of commands to the control valves and air compressors. The extent of actuation of a Pneu-Net was controlled by the duration that the valve connecting it to the source of pressurized gas was opened,” the report states.
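The controller firmware itself isn’t reproduced in the report, but the control scheme it describes, where the extent of actuation is set by valve-open duration, can be modeled in a few lines of plain C++. The Pneu-Net count below matches the four legs plus two spine Pneu-Nets described earlier; the inflation-rate constant is an invented stand-in for the real pneumatics.

```cpp
#include <array>
#include <vector>

// Illustrative model of the report's control scheme (not the actual
// ATmega168 firmware): each Pneu-Net's actuation extent is set by how
// long its solenoid valve stays open to the pressurized-gas source.
struct ValveCommand {
    int pneuNet;  // index: four leg Pneu-Nets plus two spine Pneu-Nets
    int openMs;   // how long the valve is held open, in milliseconds
};

constexpr int kNumPneuNets = 6;
constexpr double kInflationPerMs = 0.01;  // assumed inflation rate per ms open

// Replay a command sequence and return each Pneu-Net's final actuation level.
std::array<double, kNumPneuNets> runSequence(const std::vector<ValveCommand>& seq) {
    std::array<double, kNumPneuNets> level{};
    for (const auto& c : seq)
        level[c.pneuNet] += kInflationPerMs * c.openMs;
    return level;
}
```

On the real board, each `ValveCommand` would translate to energizing a solenoid for `openMs` milliseconds; a gait is then just a stored sequence of such commands, which matches the report’s description of control programs held in onboard memory.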

What does this device mean for the technology community as a whole? This innovation by the team at Harvard is ushering in a new era of autonomous robots. Could you imagine strapping a GPS locator and camera to this crawling unit and exploring areas that are uninhabitable by humans? As with all Atmel-powered gadgets, the possibilities are truly endless. As the materials become stronger and production becomes further internalized and streamlined, an army of these soft robots could be in the field in the coming years.

“Earlier versions of soft robots were all tethered, which works fine in some applications, but what we wanted to do was challenge people’s concept of what a robot has to look like,” said Michael Tolley, co-author of the report and a Research Associate at the Wyss Institute. “We think the reason people have settled on using metal and rigid materials for robots is because they’re easier to model and control. This work is very inspired by nature, and we wanted to demonstrate that soft materials can also be the basis for robots.”

Read the Harvard team’s entire publication here.

A night at the museum – with robots!

For the past couple of days, there have been four robots roaming around the Tate Britain museum in London, streaming video to the world as part of a new project. If it’s not cool enough to have robots making their way around a museum in the dark, it gets even cooler, as people from all around the world are controlling their movements from their computers.


The museum held a contest to promote the use of digital technology while exploring the Tate’s notable history. A digital design team, The Workers, developed an idea to install robot curators in the Tate’s knowledge-rich halls. In a program called After Dark, The Workers enabled four robots to be fully controlled by curious individuals over the Internet for a select few evenings. This means that anyone, anywhere with access to the Internet can become a curator.

Built in collaboration with RAL Space, the four nocturnal tour guides each feature an on-board Wi-Fi receiver, an Arduino unit, a Raspberry Pi computer, lights, sonar sensors, a powerful electric motor and, of course, video streaming technology. The units, housed in custom 3D-printed enclosures, can navigate the grounds autonomously using their sonar sensors.


People can control the robots using the on-screen buttons or the arrow keys on their keyboard, enabling the embedded curators to turn, move forward and look up or down. Though, if some controllers get a little too overzealous with their inputs, there is a failsafe built into the design: if a robot gets too close to an object, it will not move any closer and will notify the driver through the control interface. According to Wired, “The Tate consulted with conservationist and health and safety experts to triple-check that the robots wouldn’t knock over or damage the art—some of which dates back 500 years. The robots use sonic sensors to ping signals out, and measure proximity to other objects. They also come with bumpers, as added protection.”
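The Workers haven’t published their firmware, but the failsafe described here reduces to a simple rule: refuse a forward command when the sonar reading falls inside a safety margin, and flag the refusal so the web interface can notify the driver. A minimal sketch, with an assumed half-meter margin:

```cpp
// Illustrative proximity failsafe (an assumption, not The Workers' code):
// ignore a remote "forward" command when the latest sonar reading is
// inside the safety margin, and flag the refusal for the web interface.
struct DriveDecision {
    bool move;        // whether to execute the forward command
    bool notifyUser;  // whether to report the refusal to the driver
};

DriveDecision checkForward(double sonarDistanceM, double safetyMarginM = 0.5) {
    if (sonarDistanceM <= safetyMarginM)
        return {false, true};  // too close to an artwork: stop and report
    return {true, false};
}
```

Filtering commands on the robot itself, rather than trusting the remote driver, is what lets the museum hand the controls to strangers on the Internet without risking 500-year-old art.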

This installation could be the spark of a new trend and may allow users to experience the wonders of a museum halfway across the globe just by the click of a button. As Wired mentions, the real appeal of it all is the nocturnal element. “The darkness is part of the mystery and excitement, you encounter art works in the shadows, the lights from the robots throw pools of light and you can see details and things look different. It has a twist, it’s mysterious, it’s fun.”

Interested in learning more about the After Dark project? You can find more details on its official website here.

Hitchhiking robot finishes cross-Canada trip

Three weeks and 3,700 miles later, the hitchhiking robot appropriately named hitchBOT has completed its journey across Canada, having relied entirely on the kindness of strangers and its tablet-and-Arduino brains. The robot’s adventure, which began in Halifax on July 27, ended in Victoria on Saturday.

Despite the journey having taken only 21 days, it has been an exhausting expedition, even for a robot. hitchBOT sustained minor injuries, including a cracked LED shield protector, and its speech is “a little bit more random than it was at the start of the trip.” Nevertheless, the team was elated to report that the robot’s journey went off without serious incident and that it even made countless friends along the way.

“I’m on a boat,” hitchBOT tweeted Saturday night from a British Columbia ferry with a photo showing some fellow passengers. “I’m on my way,” he shared with followers.


“We’re elated. It’s been really great fun and to me it seems like it [has] brought people together in a really interesting way,” explained co-creator David Smith, a professor at McMaster University.


The initial goal of the project, as explained by its creators in a recent news release, was to test how comfortable humans are when traveling with robots, while also seeing how a robot would react to an unpredictable situation. Every 30 minutes or so, hitchBOT would snap a photo and send it to headquarters and its social media accounts via its 3G wireless connectivity. Based on the photos people have been tweeting and sharing on social media, it appears a vast majority of the public has grown to love the adorable fellow. To make picking up hitchBOT a bit easier, the gadget came equipped with a car seat attached to its torso so it could be easily strapped into cars, and a GPS system so that researchers could track its travels. In addition, it has speech recognition software and can answer simple questions.

Anne Saulnier watches as her husband Brian buckles up the anthropomorphic robot named hitchBOT near Halifax

Trekking coast-to-coast can be a daunting task, and certainly energy draining to say the least. When hitchBOT was running low on battery, it would ask its driver to plug it into an outlet or the car’s cigarette lighter. As previously discussed on Bits & Pieces, the hitchhiking gizmo merely consisted of a tablet and Atmel-based electronics for a brain, a bucket for a torso, blue swimming-pool noodles for limbs, and a smiling LED panel for a face, which was protected by a cake saver. It wore yellow gloves on its hands and rubber boots on its feet. Together, all the parts set the Makers back only about $1,000; however, the experience of picking up this friendly robot… priceless.

So what’s next for the two-foot-tall bot? Well, unfortunately, robots can’t get driver’s licenses… yet.

The latest high-tech hotel employee rolls into action

One hotel in Silicon Valley is attempting to appeal to its most tech-savvy guests by utilizing robots to deliver towels and other necessities. The trendy Aloft Hotel in Cupertino will begin testing a robotic butler dubbed a Botlr — in lieu of humans — in the near future. The first prototype, named A.L.O., will be rolled out on August 20th.


“It’s something that’s very Silicon Valley. It’s very novel and I think it’s the future,” Steve Cousins, CEO of Savioke, told CBS San Francisco.

Devised by the brilliant minds at Savioke, Botlr will be able to navigate the halls of the hotel, as well as communicate with the front desk and guests via Wi-Fi. “As you can imagine, hiring for this particular position was a challenge as we were seeking a very specific set of automated skills, and one that could work — literally — around the clock,” the team at Aloft explains to PSFK.


So, when a guest calls the front desk asking for an extra towel, a late-night snack or even a roll of toilet paper, the hotel employees simply load up the robot’s basket, program in the room number and send the R2-D2-inspired gadget on its way. An LCD touchscreen allows for the input of information, while his movement system is quiet enough that it will not startle guests. Even though you might expect A.L.O. to hold his hand out for a Bitcoin or two after he completes a task, he works for nothing but… tweets as tips!


“This is currently a pilot at Aloft Cupertino and is under consideration, though not yet confirmed, to be implemented at Aloft Sunnyvale when it opens at the end of this year,” Aloft Hotels’ Brian McGuinness tells TechCrunch. “Based on the success of the pilot, we will look to roll out at our nearly 100 hotels around the world in 2015 and beyond.”

The Aloft is just the latest Bay Area hotel to experiment with futuristic technology. In March, the local Personality Hotels announced a program that enables guests to use their smartphones to unlock their hotel room doors.

Though the cost of the robot has yet to be revealed, the location of the launch is quite intriguing. The Aloft Hotel that will launch A.L.O. next week just happens to sit across from Apple’s corporate campus. Talk about a high-tech community!

1,024 tiny robots assemble into shapes like intelligent insects

Researchers in Harvard’s Self-Organizing Systems Research Group have introduced Kilobots — a 1,024-strong swarm of decentralized cooperating robots that can assemble themselves into complex shapes with very little human input.


A team made up of Michael Rubenstein, Alejandro Cornejo, and Professor Radhika Nagpal has described the 1,024-robot swarm in a detailed study published in Science. “Each robot has the basic capabilities required for a swarm robot, but is made with low-cost parts, and is mostly assembled by an automated process. In addition, the system design allows a single user to easily and scalably operate a large Kilobot collective, such as programming, powering on, and charging all robots systems,” the researchers explain.

The thousand-plus bots are each embedded with an Atmel microcontroller, two vibrating motors powering rigid legs that allow them to skitter across smooth surfaces, and an infrared emitter-sensor pair to receive commands and communicate wirelessly. They can transform into a variety of shapes, including a starfish and the letter K (as seen below).


What makes this piece of work so exceptional is that, before the Kilobot, most collectives were limited to fewer than 100 robots. Exceeding previous limitations required completely rethinking how the robots were designed. To do this, the team of researchers created a coin-sized robot that could move on three stick legs using two vibrating motors. It could then communicate with neighboring robots using the aforementioned infrared light, signal its state by changing a color LED, and sense ambient light.


In current robotics research, there is a vast body of work on algorithms and control methods for groups of decentralized cooperating robots, called a swarm or collective. “These algorithms are generally meant to control collectives of hundreds or even thousands of robots; however, for reasons of cost, time, or complexity, they are generally validated in simulation only, or on a group of a few 10s of robots,” the study reveals.

With the robots ready, the team developed an algorithm which could guarantee that large numbers of robots, with limited capabilities and local communication, could cooperatively self-assemble into user-specified shapes. Four “seed” robots kick off the process, generating a domino effect of signals that propagate through the rest of the swarm. How each Kilobot positions itself depends on the distance between itself and its nearby bots. As IEEE Spectrum explains, swarms in biological systems can organize and control themselves based on a set of very simple rules; the algorithm the Kilobots use to create shapes is based on a similarly simple set of capabilities:

  • Edge-following, where a robot can move along the edge of a group by measuring distances from robots on the edge
  • Gradient formation, where a source robot can generate a gradient value message that increments as it propagates through the swarm, giving each robot a geodesic distance from the source
  • Localization, where the robots can form a local coordinate system using communication with, and measured distances to, neighbors
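Of these, gradient formation is the easiest to sketch: the seed robot holds value 0, and every other robot repeatedly takes one more than the smallest value heard from neighbors within communication range. The synchronous relaxation loop below stands in for the robots’ asynchronous infrared message passing and is illustrative only, not the actual Kilobot firmware (which is available from Harvard’s project page).

```cpp
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// One robot in the (simulated) swarm: a 2D position plus its gradient value.
struct Bot { double x, y; int gradient; };

// Gradient formation: the seed (bots[0]) holds 0; every other robot ends up
// with its hop distance from the seed through the communication graph.
void formGradient(std::vector<Bot>& bots, double commRange) {
    const int kInf = std::numeric_limits<int>::max() - 1;
    for (auto& b : bots) b.gradient = kInf;
    bots[0].gradient = 0;  // the seed robot
    bool changed = true;
    while (changed) {      // relax until no robot's value improves
        changed = false;
        for (std::size_t i = 0; i < bots.size(); ++i) {
            for (std::size_t j = 0; j < bots.size(); ++j) {
                if (i == j) continue;
                double d = std::hypot(bots[i].x - bots[j].x,
                                      bots[i].y - bots[j].y);
                if (d <= commRange && bots[j].gradient + 1 < bots[i].gradient) {
                    bots[i].gradient = bots[j].gradient + 1;
                    changed = true;
                }
            }
        }
    }
}
```

The resulting gradient gives each robot a geodesic distance from the source, exactly the quantity the second bullet describes, which the robots then combine with localization to decide where they sit inside the target shape.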

“Increasingly, we’re going to see large numbers of robots working together, whether it’s hundreds of robots co-operating to achieve environmental clean up or a quick disaster response, or millions of self-driving cars on our highways. Understanding how to design ‘good’ systems at that scale will be critical,” said Professor Radhika Nagpal.

For those interested in making, buying or programming their own Kilobot swarm, you can check out Harvard’s official project page here.

Half human, half machine: Cyborgs are upon us

According to Webster’s Dictionary, a cyborg is defined as “a person whose body contains mechanical or electrical devices and whose abilities are greater than the abilities of normal humans.” With the field of biomedical science growing at a rapid pace, there is a rising trend for willing individuals to embed technology into, or onto, their bodies.


Like countless other people on the planet, Neil Harbisson is colorblind. Instead of simply dealing with the disability, the Maker turned to robotics. According to CNN, he has had an antenna surgically implanted into his skull, thus enabling him to “hear” colors. The installed device converts the frequencies for different colors into the frequencies for different sounds.

Neil simply “didn’t want to wear technology, [he] wanted this to be an integral part of [him].” With the antenna installed, Neil can now overcome his condition with the help of modern technology. This sort of situation is not without ethical concerns, though. Neil notes that he needed to find a discreet doctor who would carry out the procedure anonymously because of bioethical committees that “don’t really agree with the unions between humans and technology.”

Only time will tell if implants like Neil’s become the norm, but other, more understated devices may drastically improve our daily lives. As NBC News reports, a vast majority of cyborgs get the technology embedded in their fingers or hands, where the skin is thin enough for the devices to interact with external objects. Take, for instance, Amal Graafstra, creator of Dangerous Things, who recently implanted a tiny RFID chip in his hand that now allows him to gain access to his car, his home and his personal safe. Then there’s cyborg Zoe Quinn, a well-known developer in the independent gaming world, who installed a magnet and chip in her fingers.

While these cyborgs may be no RoboCop, they still fit the definition and may be ushering in a new bionic trend in modern-day technology. And, as we take the leap from today’s wearable technologies to tomorrow’s implantable ones, many will likely be used for detecting and preventing disease. Some of the most recent “firsts” include the first bionic eye from California’s Second Sight and the first bionic body suits from companies like ReWalk and Ekso Bionics, while groundbreaking research from BrainGate in Massachusetts is finding ways for those unable to move or speak to communicate via brain waves.