Tag Archives: robotics

Video: Watch this little robot ski down a hill


Who said humans should have all the fun in the snow? 


This post may be a little premature for our usual Futuristic Friday series, but we couldn’t help ourselves. As our friends in Northern California, Colorado or New England hit the slopes, they may soon be joined by a few friendly, pint-sized robots. That’s because the University of Manitoba’s Autonomous Agents Laboratory just taught a humanoid how to ski.


Jennifer — who has demonstrated her athletic ability before, having played both hockey and soccer — was equipped with a pair of custom wooden skis and two poles. While the open-source robot has already proven capable of climbing walls, running and conquering an obstacle course, her latest challenge was alpine and cross-country skiing as part of the lab’s project for the 2015 DARwIn-OP Humanoid Application Challenge.

The team took to the snow to test out the humanoid’s skills in both cross-country and alpine skiing. According to the lab, alpine skiing control was their primary focus, along with improving the cross-country gait. After all, different kinds of snow have different effects on cross-country skiing.


“This is the latest extension of our work furthering our research into dynamic balancing and walking under realistic conditions. The changing nature of snowy ground, and the rapid control response required by alpine skiing, present significant challenges to gait design and dynamic balancing in Humanoid Robots, as does the challenge of operating this equipment in cold weather,” the team writes.

In addition, the team set out to have the robot dynamically switch from cross-country to alpine skiing when it detected a change in inclination. In doing so, the robot must deal with uneven surfaces and the gravity pulling it rapidly downhill, while also figuring out how to properly react to these forces. Not to mention, the snow can wreak havoc with Jennifer’s vision-based systems.
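The post doesn’t include the lab’s control code, but the inclination-triggered mode switch it describes can be sketched roughly as follows. This is a hypothetical illustration: the pitch thresholds, sign convention and hysteresis band are assumptions, not the team’s actual values.

```python
# Hypothetical sketch of an inclination-based gait switch.
# Assumptions: pitch comes from an onboard IMU, negative pitch means
# nose-down, and a hysteresis band keeps the robot from flip-flopping
# between gaits when the slope hovers near the threshold.

def select_gait(pitch_deg, current_mode, enter_alpine=-10.0, exit_alpine=-5.0):
    """Return 'alpine' on a steep downhill pitch, else 'cross-country'."""
    if current_mode == "cross-country" and pitch_deg <= enter_alpine:
        return "alpine"
    if current_mode == "alpine" and pitch_deg >= exit_alpine:
        return "cross-country"
    return current_mode
```

The hysteresis matters on snow: sensor noise and bumpy terrain would otherwise toggle the gait many times per second whenever the slope sits right at the threshold.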

The 2015 DARwIn-OP Humanoid Application Challenge will be held in Seattle this May. Until then, be sure to watch the robot traverse the snowy terrain and down a little bunny slope below. Winter Olympics 2018, anyone?

Drawbot is a wireless pen plotter robot


This free-roaming artist on wheels has no work area limitations.


As reported on Bits & Pieces, a number of artistic robots have emerged on the Maker scene as of late. However, unlike some of its predecessors, the newly-revealed Drawbot is a wireless pen plotter that is not bound to a defined work area.


Having originally conceived of a free-roaming, wireless drawing machine, Matthew Lim decided to dig a bit deeper into pen plotters — which are based on very similar logic to that of a 3D printer, just with less sophisticated Z-axis movement.

“Upon some research in both the open source community and in the commercial sector, I realized that I was making something new. All pen plotters have limited work areas because of how they work: they all move within a specifically-defined space in order to get precision. I decided to make one that is wireless and free-roaming,” the Maker writes.

The tetherless digital fabrication tool is driven by an Arduino Uno (ATmega328), while its wheels and caster were 3D-printed on a MakerBot. The remaining parts of the frame were made of laser-cut Masonite. On the software side, the Drawbot is based on the open-source code for TinyCNC by Makerblock. However, since the Drawbot moves differently than the TinyCNC, Lim needed to significantly modify its Arduino program.
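Since a free-roaming plotter has no gantry, each straight toolpath segment has to become a rotate-then-drive maneuver for the wheeled base. The sketch below is not Lim’s firmware, just a minimal illustration of that idea (the starting heading and units are assumptions):

```python
import math

def toolpath_to_moves(points):
    """Convert (x, y) waypoints into (turn_radians, drive_distance) pairs.

    Assumes the robot starts at points[0] facing the +X axis; each move
    rotates in place, then drives straight to the next waypoint.
    """
    moves = []
    heading = 0.0
    x, y = points[0]
    for nx, ny in points[1:]:
        target = math.atan2(ny - y, nx - x)
        # Wrap the turn into [-pi, pi) so the robot takes the shorter way around
        turn = (target - heading + math.pi) % (2 * math.pi) - math.pi
        moves.append((turn, math.hypot(nx - x, ny - y)))
        heading = target
        x, y = nx, ny
    return moves
```

This difference is presumably part of why the TinyCNC code needed heavy modification: a gantry interpolates X and Y simultaneously, while a wheeled plotter only ever turns or drives.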


“I am currently developing a second version chassis. The new frame has fully integrated assembly, which means that minimal hardware is required to put the robot together. I will also be making Drawbot open source by creating an instructable to share the platform and have others participate in its development,” Lim concludes.

Want a pen plotting robot of your own? Head over to the project’s official page to get started. In the meantime, check out some of its latest creations below.

3D printing robots will soon build structures anywhere


The future has arrived. These autonomous 3D printing robots act like a colony of ants to create structures with materials they find.


Led by Jason Kelly Johnson and Michael Shiloh, a team of students at California College of the Arts (CCA) in San Francisco has developed autonomous, Arduino-powered robots capable of 3D printing in hostile environments. The two-month-long project was conducted in the college’s Creative Architecture Machines studio, which was designed to help aspiring architects bring their ideas to life, rather than simply relying on pre-existing CAD software and other technologies.


Aptly named Swarmscapers, the small bots are equipped to traverse rough terrain, while working solely with on-site materials to build inhabitable structures — something that will certainly come in handy when traditional construction equipment may not be readily available or in a setting where it would have trouble operating.

“Extreme heat and the abundance of raw materials in the desert make it an ideal testing bed for the robotic swarm to operate, creating emergent seed buildings for future habitations that are ready for human occupancy over the course of multiple decades,” its creators write.


Each member of the “swarm” is programmed with a rule-set to complete one specific task while working in unison with the others. The Swarmscapers also come loaded with a binding agent, which allows them to turn nearly any granular material — like sand, salt, rice and sawdust (which was used in tests conducted at CCA) — into intricate shapes.

“The robot works by driving on top of the sawdust based on a tool-path defined in the computer, and dropping a binding agent on the material, hardening it in place. It does this repeatedly, layer by layer until the object is complete.”
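In pseudocode form, that layer-by-layer loop is straightforward. This is an illustrative sketch, not the CCA team’s code; `drive_to` and `drop_binder` are stand-ins for the robot’s motion and pump control:

```python
# Schematic of the deposition process quoted above: follow the toolpath
# for each layer, hardening the granular material one binder drop at a
# time, then repeat on the next layer until the object is complete.

def print_object(layers, drive_to, drop_binder):
    """layers: one toolpath per layer; a toolpath is a list of (x, y)."""
    drops = 0
    for toolpath in layers:
        for point in toolpath:
            drive_to(point)   # position the robot over the material
            drop_binder()     # release binding agent to harden this spot
            drops += 1
    return drops
```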


When devising the robots, the team 3D-printed each part right down to the cogs for the wheels. The chassis and frame had to be assembled using a number of metal parts, washers and nuts, along with some aluminum sleeves and zip ties. On the hardware side, there are two stacks: a power module that supplies 7V to the drive motors and the pump motor, and a control module responsible for driving the motors and communicating with the computer.

Based on an Arduino Uno (ATmega328), the latter stack consists of an Adafruit battery shield and LiPo battery, two XBee 802.15.4 units, an XBee shield and a USB adapter, which enables the robots to be controlled via PC. In addition, an H-bridge motor controller and a MOSFET transistor were employed to power the peristaltic pump.


“We believe that the potential of autonomous mobile 3D printing is enormous, and with enough time and research, that this is a viable method for 3d printing actual buildings in the future. There is of course, much more work to be done,” the team concludes. “The concept of autonomous machines constructing architecture in bottom up ways will require a huge amount of research into sensory systems, communication systems, advanced machine vision, as well as machine learning.”

Interested in learning more? You can head over to the project’s official page here, or watch it in action below.

Can’t draw? This machine will show you how


Have you always wished you had some sort of artistic abilities? Well, thanks to one Maker, a tiny machine can help. 


Da Vinci, van Gogh, Rembrandt, Monet, Picasso. Those are just some of the names responsible for pioneering art as we have come to know and love it. Fast forward to today, and the Maker Movement is ushering in a new era of visionaries who aspire to revolutionize the scene in a similar fashion by granting the everyday Joe (or Jane) the ability to create their own masterpieces. Doing just that, Copenhagen Institute of Interaction Design student Saurabh Datta recently developed a wearable robotic device that can teach you how to draw.


Aptly named Teacher, the exoskeleton-like gadget gently forces your arm into the repetitive motions required for sketching simple shapes using force feedback and haptic response systems. Once strapped to the hand, the wearable directs a user’s wrist and fingers to the necessary positions, while the machine itself records the movement. It then repeats the motion and guides the hand back to those previous positions, thereby creating a rhythm between user and device.
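That record-then-repeat cycle is essentially a motion record-and-replay loop. Below is a minimal sketch, assuming the device simply samples joint positions and plays them back; the function names are hypothetical, not taken from Datta’s project.

```python
# Record-and-replay of joint positions, the pattern described above.

def record(read_position, samples):
    """Capture a fixed number of position samples from an encoder."""
    return [read_position() for _ in range(samples)]

def replay(trajectory, write_position):
    """Drive the actuator back through the recorded positions in order."""
    for position in trajectory:
        write_position(position)
```

In the real device the replay step runs through the force-feedback actuators, so the wearer feels the recorded stroke rather than just seeing it.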

Before focusing his efforts on the drawing experience, Datta explored the use of feedback mechanisms as a way to give piano lessons. While the first contraption controlled a single finger, the second took care of controlling the learner’s wrist, making it capable of modulating the hand’s movement over the whole keyboard.


“The whole notion is to understand when machines start knowing more about you and they start showing that to you as feedback — sometimes which may appear against our will, how do you act upon it. On one hand it can act as a teacher and on the other it might appear as machines are operating us,” Datta writes.


In order to bring this creation to life, the Maker salvaged 3D printer components and reused their encoders along with (what appears to be) an Arduino and a few EMG nodes. So far, there have been three iterations of the Teacher prototype, each of which demonstrates the potential of machine-led instruction. For Datta, however, the ideal scenario would incorporate both learning and teaching from robotics.

For initial purposes, Datta had employed an Arduino Yún (ATmega32U4), which was later shrunken down to an Arduino Pro Mini (ATmega328) for the final, more compact prototype.


“We can be better in designing an enabling system rather than just service robots, systems that allow us to do things ourselves better or making us better in certain things rather than doing it for us all the time.”

As to what inspired Datta to pursue this idea, the Maker shares, “I remember when I started first learning alphabets my teachers used to hold my hand with the pen and trace on the paper multiple times, the letters. After letting me go I would do it over and over again and finally it achieved a muscle memory and I could do it by myself. I’m taking this metaphor of the importance of holding hands when learning a new skill.”

Intrigued? Those wishing to learn more can watch Datta’s entire thesis below, as well as access technical details on its official project page here.


A look back at this week’s top robotics stories


From cleaning to cooking, it looks like The Jetsons were right. 


Fresh on the heels of CES 2015 where the presence of robotic devices grew nearly 25% from last year’s show, this week has seen a number of advancements in the space. In fact, Engadget revealed that robotic-based hardware startups have already raised more than $51.9 million this year alone, with consumer-level droids on the rise thanks to recent crowdfunding campaigns and the burgeoning Maker Movement.

Meet the newest member of your family

Robotbase has set out to create a smart, all-in-one AI robot that can serve as a personal assistant, photographer, telepresence device and connected home automation system.

ATLAS becomes more human

DARPA revealed upgrades to its ATLAS robot with a sleeker look and improved functionality. The update represents a 75% enhancement in parts over the robot’s previous version.

New karaoke kings?

Developed by UK-based Engineered Arts, RoboThespians are life-sized humanoid robots that not only serve as museum guides and dish out jokes as comedians, but can sing their little hearts out as well.

Care-o-bot gets older and smarter

Designed as an affordable service robot for personal and professional use, the newly-announced Care-o-bot 4 is a more modular, agile and personable device than its previous iteration, which was introduced six years ago.

Robots learn to cook by watching YouTube

Researchers at UMIACS are exploring autonomy in robotics that includes action recognition. After watching how-to cooking videos, robots are able to learn the complicated series of grasping and manipulation motions required to become a master chef, simply by observing what humans do on the Internet.

This is so much cooler than Purell

Futuristic technology has come to the aid of an 8-month-old boy with a congenital heart defect, who got a germ-free home courtesy of a robot.

Uploading a worm’s mind into a LEGO robot

In the Open Worm Project, researchers are looking to recreate the behavior of the common roundworm in a machine.

Meccano enters a new era of DIY

With the Meccano Meccanoid, the classic Erector set evolves into an arm-waving, fast-talking and programmable robot for children.

DALER is a bio-inspired robot that can both fly and walk


Inspired by bats, researchers hope this robot may one day find victims in dangerous areas.


As we’ve discussed on Bits & Pieces, drones offer a number of advantages that would have otherwise been inconceivable in previous years, with one area in particular being search-and-rescue. Natural disasters and other emergencies call for timely distribution of medication and aid. Fortunately, unmanned aerial vehicles can make this more efficient. In an effort to prove just that, the robotics division of Switzerland-based National Centre of Competence in Research (NCCR) has recently developed DALER, a bio-inspired robot capable of both flying and walking.


DALER, short for Deployable Air-Land Exploration Robot, uses adaptive morphology inspired by the common “vampire bat,” meaning that the wings have been actuated using a foldable skeleton mechanism covered with a soft, flexible fabric, enabling it to be used both as wings and as legs, or whegs.

“In order to design the robot, the team had to first designate the primary mode of locomotion — in this case flight, as the DALER will cover the longest distances this way. With this in mind, a method of using the wings also for walking was devised in a way that does not give extra weight,” Ludovic Daler writes.


The robot is equipped with triangular, multi-use wingerons that rotate when it is on the ground to push the bio-inspired robot forward and maneuver through the air. This dual-mode locomotion gives DALER the ability to fly long distances to survey large spaces in a short timespan, and then to traverse the terrain in dangerous or inaccessible areas, such as a damaged building to locate victims.

According to the research team, future developments of the robot will include the possibility to hover and to take off autonomously from the ground in order to allow DALER to return to the air and come back to base after the mission. Interested in learning more? Head on over to its official page here.

Wigl is an education robot with a musical ear


A toy robot that teaches kids basic programming and music skills at once.


With the emergence of the Maker Movement, we’ve seen a number of low-cost, easy-to-use kits seeking to make building robots a more enjoyable experience. Instead of generating commands using a smartphone or PC, a company by the name of Wigl is looking to make learning as simple as picking up an instrument and hitting the right note.


Wigl is equipped with a microphone, some motors for movement and what we believe is an Arduino Uno (ATmega328) for its brain. (An Uno had been used for prototyping.) How it works is relatively simple: in auto mode, when the device’s built-in microphone registers a recognized note, the bot responds by lighting its LEDs and moving in a specific way. The note A played on a recorder, guitar or fiddle, for example, might move it forward, a C could result in a right turn, and a D might put it in reverse.

Meanwhile, in programming mode, the bot sits still and listens to the notes being played, storing them in its memory. Every note played is memorized, like lines of code in a computer program. In order for an aspiring Maker to run their Wigl program, they must play a special “ENTER” note. Different notes result in different actions, and planning the order of those notes makes Wigl move in various ways.
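The two modes lend themselves to a tiny interpreter. Here is a rough sketch of the programming mode; the note-to-action table and the choice of “E” as the ENTER note are invented for illustration, since Wigl’s real mappings aren’t published in this post:

```python
# Illustrative note interpreter for the programming mode described above.
NOTE_ACTIONS = {"A": "forward", "C": "turn_right", "D": "reverse"}
ENTER_NOTE = "E"  # hypothetical choice of the special "run" note

def program_mode(notes):
    """Buffer notes like lines of code; run them when ENTER is heard."""
    buffer = []
    for note in notes:
        if note == ENTER_NOTE:
            # "Run" the program: translate each stored note into an action
            return [NOTE_ACTIONS[n] for n in buffer if n in NOTE_ACTIONS]
        buffer.append(note)
    return []  # ENTER never played, so nothing runs
```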

Electrical engineer Vivek Mano developed the first prototype back in July 2013 before beginning to test the proof-of-concept at a Portland, Oregon school. Now, he’s working on creating content for schools to complement the robot in a two-month course, targeted toward alternative elementary education establishments, such as Waldorf and Montessori.


“I want to effectively alter the way kids approach learning,” Mano told Gizmag. “Seeing a child’s eyes light up when they realize that sound that they’re making (via musical instrument) can control something is powerful. It’s not something they’re used to. That gets them curious as to what else is possible and (hopefully) will lead them down that rabbit hole.”

As the company continues to generate more exposure and financial support, Wigl as a whole can go one of two ways: open or closed-source. Mano reveals that it can be sold as a standalone, pre-built robot geared more towards the consumer and musical education programs, or as a ready-to-assemble kit incorporating the Arduino bootloader for ease.

“Arduino code is very similar to C code (almost interchangeable at some points) and is a highly marketable skill to learn,” Mano explained to Gizmag.

Interested in learning more? Head over to Wigl’s official page here, and watch it in action below!


Charlie and Billy are cute, smartphone-controlled bots


An Israeli engineer designs a pair of bio-inspired, 3D-printed hexapod robots. 


If you’ve ever stopped by one of our Maker Faire booths, then you surely know our love for hexapod robots. Just ask “Wizard of Make” Bob Martin. Inspired by UC-Berkeley’s recent STAR project, Israel-based Maker Jonathan Spitz recently created a 3D-printed, blue beetle-like bot named Billy.


The proof-of-concept not only consists of 3D-printed parts, but is powered by an Arduino Leonardo (ATmega32U4), a pair of LiPo batteries and dual DC motors. Billy can be controlled using a joystick smartphone app via a built-in Bluetooth module, while his two different sets of legs — straight and spiral — allow him to navigate any terrain.

Shortly after the success of his first build, Spitz decided to develop a second working prototype. Charlie is a 3D-printed, cricket-esque hexapod robot that can also be controlled via a mobile device. Impressively, Billy’s smaller and smarter sibling is capable of walking upside down (if he ever flips), climbing over objects his own size, as well as maneuvering up slopes as steep as 45 degrees. The latest iteration of the bot is driven by an Arduino Micro, which receives commands through Bluetooth. The ATmega32U4-based board relays signals to two “baby orangutan” microcontrollers that control Charlie’s four motors, which, of course, are used for strolling and sprawling.
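The post doesn’t detail how the joystick input is translated into motor commands, but a typical approach for a differential-drive bot like Billy or Charlie is an “arcade mix”: one joystick axis for speed, one for turning. This is an illustrative sketch, not Spitz’s firmware:

```python
# Arcade-style joystick mixing for a differential-drive robot.
# Inputs are joystick axes in [-1, 1]; outputs are left/right motor power.

def arcade_mix(throttle, steer):
    left = max(-1.0, min(1.0, throttle + steer))
    right = max(-1.0, min(1.0, throttle - steer))
    return left, right
```

Full throttle with no steer drives both sides equally, while steering alone spins the robot in place.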

While Billy consists of 20 parts, Charlie’s 38 different components will require a little longer to assemble. Interested in learning more about the bot brothers? Head on over to their official page here. Meanwhile, watch them in action above!

8 trends shaping the future of making


Our friends at Autodesk explore the significant design and technology trends for 2015. 


Mass personalization will march toward the mainstream

Normal allows its customers to take a few pictures of their ears and uses them to create personalized 3D-printed headphones that fit perfectly. Normal CEO Nikki Kaufman describes it best as “personalized, customized products built for you and your body.” In the last few years, we’ve seen companies offer customers the ability to customize their products by allowing them to select from pre-defined options. Diego Tamburini, Manufacturing Industry Strategist at Autodesk, predicts that customers will demand products that are uniquely tailored to their needs, tastes and bodies.

(Source: Normal)


Big data will inform our urban landscapes

The design and construction of buildings, infrastructure and the cities they reside in are far too complex to rely on the wooden scale models of old. Architects, engineers and city planners are able to do things that were not possible in the past. As Phil Bernstein, VP of Strategic Industry Relations at Autodesk, put it, “Scale models, however beautifully made, are hardly up to the job of understanding how a building operates in the context of a city.”

Thanks to advances in laser scanning, sensors and cloud-based software, cities are now being digitized into 3D models that can be viewed from every angle, changed and analyzed at a moment’s notice.

Cities like Los Angeles, Chicago, Singapore, Tokyo and Boston are working to digitize not just the shapes and locations of the buildings but create a data-rich, living model of the city itself — complete with simulated pedestrian traffic, energy use, carbon footprint, water distribution, transportation, even the movement of infectious diseases.

(Source: Autodesk)


Our relationship with robots will be redefined

In the future, humans and robots will collaborate and learn from each other. Today, robots receive data and use machine learning techniques to make sense of the world and provide actionable analytics for themselves and humans. Nevertheless, robots are not artists, and they will need inspiration and guidance from us for the foreseeable future. In the words of Autodesk Technology Futurist Jordan Brandt, “A robot is no more a craftsman than an algorithm is a designer.”

(Source: Autodesk Gallery France Pop-Up)


Designs will “grow”

When Lightning Motorcycles wanted to develop a next-generation swing arm for their electric motorcycle, they adopted a new Autodesk approach for the project: a computer-aided design (CAD) system called Project Dreamcatcher that automatically generates tens, hundreds, or even thousands of designs that all meet your specific design criteria.

Software like Autodesk’s Project Dreamcatcher is ushering in a new era of design, best described by Autodesk CTO Jeff Kowalski: “We’ll start to see more intensely complex forms, that could appear very organic, or very mathematic.”

(Source: Lightning Motorcycles)


Manufacturing in space

Made In Space is focused on one thing: making and manufacturing in space. With more than 30,000 hours of 3D printing technology testing behind it, Made In Space has developed the first 3D printers designed and built for use on the International Space Station. As Made In Space CTO Jason Dunn explains, “2015 will be the year of space manufacturing. No longer do engineers need to design around the burdens of launch — instead, in 2015 we will begin designing space systems that are actually built in the space environment. This opens an entirely new book on space system design, a book where complex 3D printed structures that could only exist in zero-gravity become possible.”

(Source: Made in Space)


Live materials will be integrated into our buildings

Today, buildings are dead, but new materials and technology are enabling living structures. For example, David Benjamin, founding principal of the design and research studio The Living, is collaborating with plant biologists at the University of Cambridge in England to grow new composite materials from bacteria, a process that uses renewable sugars as a raw material rather than non-renewable petroleum used for plastics. In 2014, The Living delivered Hy-Fi, a “living” installation for the Museum of Modern Art and MoMA PS1’s Young Architects Program competition. The temporary installation involved a 40-foot-tall tower with 10,000 bricks grown entirely from compostable materials — corn stalks and mushrooms — and developed in collaboration with innovative materials company Ecovative. That building was disassembled at the end of the summer and all of the bricks have been composted, returning to grade A soil.

(Source: The Living)


Virtual and augmented reality will be integrated into everyday apps

New virtual devices like the Oculus Rift and augmented reality applications will require an innovative generation of spatial designers. According to Autodesk Technology Futurist Jordan Brandt, current touchscreen interaction will give way to ‘Immersion Design’ that leverages the spatial dimensions offered through emerging augmented and virtual reality platforms.

There’s a bright future for architecture students, game designers and multi-dimensional talent to join app development teams.

(Source: Autodesk and Neoscape)


The amount of 3D data will rapidly increase

“With the ability to create 3D models on mobile devices through apps like 123D Catch or the Structure sensor, virtually anyone can begin to capture the spatial world around them. Coupled with the broader adoption of WebGL technology and 3D printing, we can expect an explosion in the amount of 3D data available in 2015. Responding to user demand, social platforms will enable direct sharing of 3D data and start to provide immersive, collaborative experiences.” — Autodesk Technology Futurist, Jordan Brandt

(Source: 123D Catch)


This article written by the Autodesk team originally appeared on Medium.


LocoRobo is an IoT bot inspiring the next generation of Makers


LocoRobo offers a modern, cutting-edge robotics kit and a technology-rigorous learning experience.


Drexel University professor Pramod Abichandani and a team of three undergraduate students have developed LocoRobo, a low-cost robot capable of being wirelessly programmed with minimal to no effort. Born out of his own frustrations with bots, Abichandani aspires to advance programming and robotics education for everyone — from first-graders to experienced Makers — by combining a world-class programming ecosystem with a high-quality device.


Abichandani hopes that educators and students alike will be able to utilize the ATmega32U4-based LocoRobo to increase awareness and excitement around STEM. While younger Makers can wirelessly control their robot through the companion mobile app, experienced developers can use various programming languages.

“We have developed open-source application programming interfaces (APIs) in C, Python, MATLAB and Node.js, which will allow you to dive into programming LocoRobo beyond the apps. Using these languages, you will realize a higher level of control of the LocoRobo robot. While working with our APIs, you will be exposed to several robotics exercises and concepts, including multi-robot motion planning and multi-sensor fusion.”


As seen inside the Atmel CES booth, the little WALL-E-like gadget is equipped with two wheels, sensors for eyes and antennas in the form of ears. Recently launched on Indiegogo, the Arduino-compatible LocoRobo comes in two separate models: the LocoBasiX and the LocoXtreme. While both possess the same custom main board, status LEDs, differential drive, ultrasonic sensors, lithium-ion battery and BLE, the LocoXtreme boasts a number of additional features, such as motor encoders, an on-board accelerometer and a gyroscopic sensor, for those seeking more sophisticated movement.

Abichandani hopes that every school throughout America (and the world) will one day have a solid robotics program. And, LocoRobo may be able to make that dream possible.