Tag Archives: Wired

This $11 robot can teach kids how to program

A group of Harvard University researchers — Michael Rubenstein, Bo Cimino, and Radhika Nagpal — have developed an $11 tool to educate young Makers on the fundamentals of robotics. Dubbed AERobot (short for Affordable Education Robot), the low-cost platform will, the team hopes, one day help inspire more kids to explore STEM disciplines.


Fueled by the recent emergence of the Maker Movement, robots are becoming increasingly popular throughout schools in an effort to spur interest in programming and artificial intelligence among students.

The idea behind this particular project was conceived following the 2014 AFRON challenge, which encouraged researchers to design low-cost robotic systems for education in developing countries. As Wired’s Davey Alba notes, Rubenstein’s vast experience in swarm robotics led him to mod one of his existing systems to construct the so-called AERobot. While it may not be a swarm bot, the single machine shares a number of the same inexpensive components.

So, what is the AERobot capable of doing?

  • Moving forward and backward on flat, smooth surfaces
  • Turning in place in both directions
  • Detecting the direction of incoming light
  • Identifying distances using reflected infrared light
  • Following lines and edges

A megaAVR 8-bit microcontroller serves as the robot’s brain. To keep costs down, the team assembled most of the other electronic parts with a pick-and-place machine, used vibration motors for locomotion and omitted a chassis altogether. Unlike a number of other bots, AERobot also sports a built-in USB plug, so it can be inserted directly into any computer with a USB port.

“Using this USB connection, it can recharge its lithium-ion battery and be reprogrammed all without any additional hardware. AERobot has holonomic 2D motion; using two low-cost vibration motors, it can move forward, backwards, and turn in place on a flat, smooth surface such as a table or whiteboard. It also has three pairs of outward-pointing infrared transmitters and phototransistors, allowing it to detect distance to obstacles using reflected infrared light, and passively detect light sources using just the phototransistors.”


In addition, the bot features one downward-pointing infrared transmitter along with a trio of infrared receivers to detect the reflectivity of the surface below, which is useful for line following. To aid in learning programming and debugging, AERobot also boasts an RGB LED.
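For readers who want a feel for how that reflectivity sensing works in practice, here is a minimal sketch in Arduino-flavored C. It is only an illustration under assumed pin numbers and thresholds, not the AERobot’s actual firmware: it drives a downward-pointing IR emitter, reads the reflected light on an analog pin, and lights an LED when a dark line is detected.

```c
/* Hypothetical pin map and threshold -- for illustration only,
 * not the real AERobot firmware. */
#define IR_EMITTER_PIN   8     /* downward-pointing IR transmitter     */
#define IR_RECEIVER_PIN  A0    /* phototransistor / IR receiver        */
#define LED_PIN          13    /* status LED as simple visual feedback */
#define LINE_THRESHOLD   300   /* ADC counts; dark lines reflect less  */

void setup(void) {
  pinMode(IR_EMITTER_PIN, OUTPUT);
  pinMode(LED_PIN, OUTPUT);
}

void loop(void) {
  digitalWrite(IR_EMITTER_PIN, HIGH);          /* light up the surface      */
  delay(2);                                    /* let the receiver settle   */
  int reflected = analogRead(IR_RECEIVER_PIN); /* more light = higher value */
  digitalWrite(IR_EMITTER_PIN, LOW);

  /* A dark line absorbs IR, so the reading drops below the threshold. */
  digitalWrite(LED_PIN, reflected < LINE_THRESHOLD ? HIGH : LOW);
  delay(20);
}
```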

On the software side, AERobot uses a graphical programming environment, which makes reprogramming easy for beginners. Because the team built it by modifying the minibloq programming language, Rubenstein says, you don’t really need to type code; instead, you just drag pictures. He went on to tell Wired, “Say I wanted an LED on the robot to turn green. I would just drag over an image of an LED, and pick the green color.”
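For comparison, a rough hand-written equivalent of that drag-and-drop block is only a few lines of Arduino-flavored C. The pin number below is a hypothetical placeholder rather than AERobot’s real pin mapping:

```c
#define GREEN_PIN 9   /* hypothetical pin for the RGB LED's green channel */

void setup(void) {
  pinMode(GREEN_PIN, OUTPUT);
}

void loop(void) {
  digitalWrite(GREEN_PIN, HIGH);   /* turn the LED green and leave it on */
}
```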

Interested in learning more? You can scroll on over to the project’s official page or read its entire Wired feature here.

 

Wearable Knitgadget controls your (musical) devices

Royal College of Art student Yen Chen Chang recently debuted the Knitgadget, a wearable glove that allows users to control various devices, musical or otherwise.

As Engadget‘s Mariella Moon reports, the glove is made of conductive yarn that’s 80% polyester and 20% stainless steel (and 100% pure awesomeness). Chang knit and crocheted a series of objects that control devices when rubbed, pulled or stroked. When the textile is manipulated, the overlapping metal fibers change its conductivity, a change that is measured by an Atmel-powered Arduino and relayed to the gadgets.
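As a rough sketch of how such a textile sensor might be read, the snippet below treats the conductive yarn as the variable half of a voltage divider and maps the reading to an audible pitch. The wiring, pin choices and mapping are assumptions for illustration; Chang’s actual electronics are not detailed in this post.

```c
/* Hypothetical setup: the conductive textile forms one leg of a voltage
 * divider on analog pin A0, with a small speaker/piezo on pin 8. */
#define TEXTILE_PIN  A0
#define SPEAKER_PIN  8

void setup(void) {
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop(void) {
  int reading = analogRead(TEXTILE_PIN);        /* 0-1023; shifts as the yarn
                                                   is stretched or pressed   */
  int pitch = map(reading, 0, 1023, 120, 1500); /* convert to a frequency    */
  tone(SPEAKER_PIN, pitch, 30);                 /* play a short 30 ms blip   */
  delay(30);
}
```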

“[The glove is] wired so that it [also] functions as a wearable musical instrument that’s both a keyboard and a guitar. This glove is but one of Chang’s unusual creations designed to control devices without the use of buttons and touchscreens,” Moon writes.


The objects’ sensitivity to changes in conductivity is contingent upon how the textiles are constructed. For instance, a knit, which is looser because of its looping, is better suited to stretching, giving an object a bigger range of motion. A weave, which Chang used to produce this pair of conductive gloves, packs the fibers much more tightly together, limiting the range of resistance and giving the textile a more precise function.

“The differences of sensing abilities in each textile sensor are determined by the level of skill that constructed them, and are suitable for different kinds of motion sensing. This forms a new way to control and interact with electronic objects, and encourage people to reimagine how we use them,” the Maker explains.

This glove is but one of the Maker’s recent projects designed to control devices without buttons and touchscreens — ranging from a ball that can power a juicer to a mat that turns on a small electric fan when you pet it.


Chang believes his yarn could potentially revolutionize wearable computing, and he dreams of one day working with clothing companies in the knitted footwear space.

Interested in learning more? You can check out Wired’s coverage of the Knitgadget, Engadget’s write-up, as well as the project’s official page. Not a big fan of reading? Watch the glove in action below.

A night at the museum – with robots!

For the past couple of days, four robots have been roaming the Tate Britain museum in London, streaming video to the world as part of a new project. As if robots making their way around a museum in the dark weren’t cool enough, people from all around the world are controlling their movements from their computers.


The museum held a contest to promote the use of digital technology while exploring the Tate’s notable history. A digital design team, The Workers, developed an idea to install robot curators in the Tate’s knowledge-rich halls. In a program called After Dark, The Workers let curious individuals fully control four robots over the Internet for a select few evenings. What this means is that anyone, anywhere with access to the Internet, can become a curator.

Built in collaboration with RAL Space, the four nocturnal tour guides each feature an on-board Wi-Fi receiver, an Arduino unit, a Raspberry Pi computer, lights, sonar sensors, a powerful electric motor and, of course, video streaming technology, all housed in a custom 3D-printed enclosure. Using their sonar sensors, the units can also navigate the grounds autonomously.


People can control the robots using on-screen buttons or the arrow keys on their keyboard, enabling the embedded curators to turn, move forward and look up or down. Should a controller get a little too overzealous with their inputs, there is a failsafe built into the design: if a robot gets too close to an object, it will not move any closer and will notify you through the control interface. According to Wired, “The Tate consulted with conservationist and health and safety experts to triple-check that the robots wouldn’t knock over or damage the art—some of which dates back 500 years. The robots use sonic sensors to ping signals out, and measure proximity to other objects. They also come with bumpers, as added protection.”
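A minimal sketch of that kind of sonar failsafe, written in Arduino-flavored C, might look like the following. The pin assignments, the HC-SR04-style ping sensor and the 30 cm cutoff are assumptions chosen for illustration; the After Dark robots’ real control code has not been published here.

```c
/* Hypothetical wiring: an HC-SR04-style sonar on pins 6/7 and a motor
 * driver enable pin on 5. Not the actual After Dark firmware. */
#define TRIG_PIN        6
#define ECHO_PIN        7
#define MOTOR_EN_PIN    5
#define MIN_DISTANCE_CM 30

long read_distance_cm(void) {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);                    /* 10 us trigger pulse */
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long echo_us = pulseIn(ECHO_PIN, HIGH, 30000UL); /* time of flight      */
  return echo_us / 58;                             /* ~58 us per cm       */
}

void setup(void) {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(MOTOR_EN_PIN, OUTPUT);
}

void loop(void) {
  long distance = read_distance_cm();
  /* Failsafe: refuse to drive forward when an obstacle is too close,
   * regardless of what the remote operator is asking for. */
  if (distance > 0 && distance < MIN_DISTANCE_CM) {
    digitalWrite(MOTOR_EN_PIN, LOW);   /* stop; the web UI would be notified */
  } else {
    digitalWrite(MOTOR_EN_PIN, HIGH);  /* allow the operator's drive command */
  }
  delay(50);
}
```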

This installation could be the spark of a new trend and may allow users to experience the wonders of a museum halfway across the globe with just the click of a button. As Wired mentions, the real appeal of it all is the nocturnal element. “The darkness is part of the mystery and excitement, you encounter art works in the shadows, the lights from the robots throw pools of light and you can see details and things look different. It has a twist, it’s mysterious, it’s fun.”

Interested in learning more about the After Dark project? You can find more details on its official website here.

Exoskeleton helps paraplegics walk again

Dr. Amit Goffer, Founder of Argo Medical Technologies, has embarked on a quest to change the way the world thinks of ambulatory aids. The inventor, who himself became a quadriplegic after an ATV accident in 1997, has been tinkering with the ambulatory device’s schematics on various levels for years. The project — which hopes to enable those suffering from spinal cord injuries to walk upright once again — saw its first test runs in Goffer’s garage in 2004 and eventually reached clinical trials by 2006. Just this June, ReWalk became FDA approved.

ReWalker Radi – Rehacare 2012, Düsseldorf, Germany

The ReWalk is the first exoskeleton to be approved by the government regulatory organization for personal use. While the device carries a hefty $70,000 price tag, the benefits it can provide the wearer are priceless. Argo CEO Larry Jasinski hopes to convince insurance companies to cover the cost of the ReWalk in the future, as the device’s positive gains are innumerable.

As Wired’s Issie Lapowsky points out, most of today’s basic wheelchair design is the same as it was when it was drawn on ancient Chinese vases centuries ago. By strapping braces to a user’s legs and using computers and motion sensors to control movements, ReWalk aspires to revolutionize the outdated market.


Gene Laureano, an Army veteran who has been paralyzed since 2001, often works with Argo Medical Technologies to demonstrate the benefits of the ReWalk. Having regained his mobility thanks to the device, he happily posed the question to Wired, “Does it get any better?” The motorized system uses two crutches and an electronically aided leg system to help paraplegics prove wrong those who told them they would never walk again.

Dr. Ann Spungen, an initial skeptic of the ReWalk, has also endorsed the widespread adoption of the device. Dr. Spungen is the Associate Director of the National Center of Excellence for the Medical Consequences of Spinal Cord Injury at the James J. Peters Center, where six ReWalk units are responsible for getting 14 paraplegic patients back on their feet. She has noticed sweeping improvements in patients who use the ReWalk, from muscle gains to an improved self-image.

ReWalk is designed for a top speed of about 1.3 miles per hour, fast enough to make it safely across an intersection before the light changes. Moving forward, Dr. Spungen thinks the ReWalk’s speed should be increased, as a conventional wheelchair is still much faster.

In all, the fact that an exoskeleton has been approved for personal use may shift the path medical technology takes in the future. Since Dr. Goffer and Argo have proven the benefits of this kind of design, the possibilities for medical aid are seemingly endless. Dr. Goffer, a quadriplegic himself, cannot reap the benefits of his own invention… just yet. “My time will come, I’m patient enough for that,” he says.

 

Open source IoT with Contiki

Contiki – an open source OS for the IoT – is developed by a worldwide team of developers, with contributions from a number of prominent companies and institutions such as Atmel, Cisco, ETH, Redwire LLC, SAP and Thingsquare.


Essentially, Contiki provides powerful low-power Internet communication, supporting fully standard IPv6 and IPv4, along with recent low-power wireless standards: 6LoWPAN, RPL and CoAP.

With Contiki’s ContikiMAC and sleepy routers, even wireless routers can be battery-operated. 

Contiki facilitates intuitive, rapid development, as apps are written in standard C. Using the Cooja simulator, Contiki networks can be emulated before being burned into hardware, while Instant Contiki provides an entire development environment in a single download.
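To give a flavor of what "standard C" development looks like in Contiki, here is a minimal process along the lines of the classic Contiki hello-world example, using an event timer to print a message every ten seconds. Treat it as a sketch rather than a drop-in application for any particular board.

```c
#include "contiki.h"
#include <stdio.h>

PROCESS(hello_world_process, "Hello world process");
AUTOSTART_PROCESSES(&hello_world_process);

PROCESS_THREAD(hello_world_process, ev, data)
{
  static struct etimer timer;          /* static: survives across yields */

  PROCESS_BEGIN();

  etimer_set(&timer, CLOCK_SECOND * 10);

  while(1) {
    PROCESS_WAIT_EVENT_UNTIL(etimer_expired(&timer));
    printf("Hello, Internet of Things\n");
    etimer_reset(&timer);              /* re-arm for the next interval */
  }

  PROCESS_END();
}
```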

Recently, the open source Contiki was featured by Wired’s Klint Finley, who describes the versatile OS as the go-to operating system for hackers, academics and companies building network-connected devices like sensors, trackers and web-based automation systems.

“Developers love it because it’s lightweight, it’s free, and it’s mature. It provides a foundation for developers and entrepreneurs eager to bring us all the internet-connected gadgets the internet of things promises, without having to develop the underlying operating system those gadgets will need,” he writes.


“Perhaps the biggest thing Contiki has going for it is that it’s small. Really small. While Linux requires one megabyte of RAM, Contiki needs just a few kilobytes to run. Its inventor, Adam Dunkels, has managed to fit an entire operating system, including a graphical user interface, networking software, and a web browser into less than 30 kilobytes of space.”

Unsurprisingly, consumer technology companies are beginning to embrace Contiki as well. To help support the burgeoning commercial usage of Contiki, OS founder Adam Dunkels ultimately left his job at the Swedish Institute of Computer Science and founded Thingsquare, a startup focused on providing a cloud-based back-end for Contiki devices.

“The idea is to make it easy for developers to connect their hardware devices with smartphones and the web,” added Finley.


“Thingsquare manages the servers, and provides all the software necessary to manage a device over the web.”

It should be noted that Thingsquare recently showcased various Internet of Things (IoT) applications at Embedded World 2014 in Nuremberg, Germany.

Indeed, a number of Thingsquare’s demonstrations were powered by Atmel’s recently launched SAM R21 Xplained PRO evaluation board – illustrating the seamless integration of Thingsquare’s software stack with Atmel’s new SAM R21 ultra-low power wireless microcontroller (MCU).

Interested in learning more? You can check out Contiki’s official page here and read about Thingsquare’s use of Atmel tech here.

ATmega328P-based TinkerBots hit Wired’s Gadget Lab

TinkerBots is an Atmel-powered (ATmega328P MCU) building set that enables Makers and hobbyists of all ages to easily create an endless number of toy robots that can be brought to life without wiring, soldering or programming.

http://vimeo.com/91590326

Indeed, TinkerBots’ specialized “Power Brain” and kinetic modules twist and snap together with other TinkerBots pieces – and even LEGO bricks – adding movement and interest to whatever sort of robot a Maker can imagine and build.

The centerpiece of the TinkerBots building set is a square, red “Power Brain” module (approximately 1.5”x1.5”x1.5”) that contains Atmel’s ATmega328P microcontroller. This module is tasked with providing wireless power and data transmission to kinetic modules such as motors, twisters, pivots and grabbers.

Kinematics launched its official TinkerBots Indiegogo campaign a few weeks ago, with the building set garnering coverage from a number of prominent publications, including Wired’s Gadget Lab.

“Once you snap together a contraption, you can program it in a few different ways. By pressing the ‘record’ button on the Powerbrain brick and twisting the robot’s motorized parts, it will remember those movements and replicate them when you hit the ‘play’ button. And if you want to step it up and write your own code, you can also program your robots via the Arduino IDE,” writes Wired’s Tim Moynihan.
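For those curious about the record-and-play idea, the sketch below shows one way such a loop could be structured in Arduino-flavored C: sample a joint’s position while a record button is held, then replay the stored positions to a motor output. It is a generic illustration with made-up pin numbers, not the TinkerBots firmware or its official Arduino library.

```c
/* Generic record-and-replay illustration -- not TinkerBots' actual code. */
#define RECORD_BTN_PIN  2     /* hold to record                         */
#define PLAY_BTN_PIN    3     /* press to replay                        */
#define JOINT_IN_PIN    A0    /* position feedback (e.g. potentiometer) */
#define MOTOR_OUT_PIN   9     /* PWM output driving the joint           */
#define MAX_SAMPLES     200

int samples[MAX_SAMPLES];
int sample_count = 0;

void setup(void) {
  pinMode(RECORD_BTN_PIN, INPUT_PULLUP);
  pinMode(PLAY_BTN_PIN, INPUT_PULLUP);
  pinMode(MOTOR_OUT_PIN, OUTPUT);
}

void loop(void) {
  if (digitalRead(RECORD_BTN_PIN) == LOW) {          /* button held: record */
    sample_count = 0;
    while (digitalRead(RECORD_BTN_PIN) == LOW && sample_count < MAX_SAMPLES) {
      samples[sample_count++] = analogRead(JOINT_IN_PIN);
      delay(50);                                     /* 20 samples/second   */
    }
  }

  if (digitalRead(PLAY_BTN_PIN) == LOW) {            /* replay the motion   */
    for (int i = 0; i < sample_count; i++) {
      analogWrite(MOTOR_OUT_PIN, samples[i] / 4);    /* 10-bit -> 8-bit PWM */
      delay(50);                                     /* same pace as record */
    }
  }
}
```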

“TinkerBots started out as an Indiegogo campaign, and it blew past its $100,000 goal in less than a week; its funding now is nearly double that amount, with about a month left to go in its campaign. You can preorder various kits now, and prices vary depending on the number and type of pieces in each set. For $160, you get a basic car-building set with the Powerbrain, motors, wheels, a twister joint and some other bricks. There’s an animal-themed set for $230, a grabber claw set for $400 and $500 gets you a fully loaded kit with bricks to build anything.”

Interested in learning more? You can check out the official Indiegogo TinkerBots page here.

Why Makers are the new Industrial Revolution



Writing for OpenSource.com, Luis Ibanez offers a succinct review of “Makers: The New Industrial Revolution” by Chris Anderson. 

As Ibanez notes, Anderson is a former Editor in Chief of Wired and no stranger to the economic paradoxes of peer-production and open source. He is also the CEO and co-founder of 3D Robotics, a company dedicated to producing kits for the DIY drone community.



“In his most recent book, Anderson examines the historical parallels between the Maker movement and the second Industrial Revolution [which] took place between 1850 and the end of World War I,” writes Ibanez.


“While the first Industrial Revolution (1760-1840) was based on large factories and expensive means of production, the second was characterized by the development of small machines (in particular the spinning wheel and the sewing machine) that democratized the means of production, leading to the proliferation of home-based micro business and cottage industries.”

Anderson then explains how the advent of the Maker Movement and the 3D printing ecosystem will prompt a new Industrial Revolution, one expected to unfold at the speed of the information age. More specifically, Anderson discusses the Atmel-powered MakerBot 3D printer, noting that the platform is not just a tool, but rather:

  • A plaything
  • Revolutionary act
  • Kinetic sculpture
  • Political statement
  • Thrillingly cool

Of course, the above description applies to other 3D printers as well, like RepRap, along with the rest of the DIY Maker Movement.

“Open source is not just an efficient innovation method—it’s a belief system as powerful as democracy or capitalism for its adherents,” Anderson emphasizes.

The author also offers a closer look at a number of Maker-related business stories, including Local Motors, SparkFun, Kickstarter, Etsy, MFG and OpenPCR.

“This book, Makers, helps us put into perspective the impact that the maker culture will have in the following years on the renaissance of manufacturing, while showing us how we can apply to the new revolution, the lessons that we’ve learned from the second Industrial Revolution of 1850 and the lessons from the more recent emergence of desktop computers in the 1980’s,” adds Ibanez.

Interested in learning more? You can pick up Makers: The New Industrial Revolution for $11.84 on Amazon Kindle here.

Neuroscience goes open source at MIT & Brown

Josh Siegle, a doctoral student at MIT’s Wilson Lab, recently told Wired that today’s neuroscientists are expected to be accomplished hardware engineers, fully capable of designing new tools for analyzing the brain and collecting relevant data.

“There are many off-the-shelf commercial instruments that help you do such things, but they’re usually expensive and hard to customize,” Siegle explained. 

“Neuroscience tends to have a pretty hacker-oriented culture. A lot of people have a very specific idea of how an experiment needs to be done, so they build their own tools.”

The problem? As Wired’s Klint Finley notes, few neuroscientists actually share the tools they create, which often lack design principles such as modularity, meaning project-specific devices and platforms can’t be reused for other experiments.

That is precisely why MIT’s Siegle and Jakob Voigts of Moore Lab at Brown University founded Open Ephys, a project for sharing open source neuroscience hardware designs.

“We don’t necessarily want people to use our tools specifically,” Siegle clarified. “We just want to build awareness of how open source eliminates redundancy, reduces costs and increases productivity.”

Open Ephys officially kicked off three years ago as part of a research project tracking hippocampus and cortex activity in mice.

“We spent about half a year looking for the perfect commercial data acquisition tool to use for our experiment recording electrical signals from brains,” said Siegle. “We looked at all of the commercial systems and all of them were inadequate in some way.”

Rather than MacGyver yet another platform, the duo decided to adopt a more modular approach by moving the creative process online. In addition, the two chose many of the same tools used by hackers and modders, including Arduino boards.

“We like Arduinos because lots of people know how to use them, and they’re easy to get your hands on,” Siegle added.

Interested in learning more? You can check out Wired’s full write up here and the Open Ephys gallery here.

Will cyborg plants monitor our world?

Writing for Wired, Klint Finley says the world could soon see cyborg plants that tell us when they need more water, what chemicals they’ve been exposed to and what parasites are chomping away at their roots.

“These half-organic, half-electronic creations may even tell us how much pollution is in the air,” writes Finley. “And yes, they’ll plug into the network. That’s right: We’re on our way to the Internet of Plants.”

Indeed, Andrea Vitaletti, who heads a research group in Italy, is working on a project known as PLEASED, an acronym for “Plants Employed as Sensing Devices.” Although the initiative is still in a nascent stage, Vitaletti believes plants could ultimately serve as sophisticated sensors tasked with monitoring our environment.

“Plants have millions of years of evolution. They are robust. They want to survive,” Vitaletti told Wired. “There’s evidence that plants react to damages, parasites, pollutants, chemicals, acids, and high temperature. But what’s not known is whether it’s possible to look into the signal and see what generated the event.”

To be sure, Vitaletti acknowledges that it may be somewhat difficult to definitively analyze and interpret the signals.

“In some ways, this is easier than doing research on humans, because the signals are simpler,” he concludes.

Nevertheless, Vitaletti and other scientists already are working to connect various species with Atmel-based Arduino boards capable of recording and transmitting information. Ultimately, cyborg plants could detect parasites and pollutants in crops, or play a critical role in precision agriculture by automatically requesting water and nutrients.
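As a rough illustration of the data-logging side of such an experiment, the sketch below samples a plant electrode on an Arduino analog input and streams timestamped readings over the serial port. The electrode wiring, pin choice and sample rate are assumptions for illustration; this is not the PLEASED project’s actual acquisition code.

```c
/* Hypothetical plant-signal logger -- not the PLEASED acquisition code.
 * A plant electrode feeds analog pin A0 through suitable signal conditioning. */
#define ELECTRODE_PIN    A0
#define SAMPLE_PERIOD_MS 100   /* 10 samples per second */

unsigned long last_sample = 0;

void setup(void) {
  Serial.begin(115200);        /* stream readings to a PC or radio bridge */
}

void loop(void) {
  unsigned long now = millis();
  if (now - last_sample >= SAMPLE_PERIOD_MS) {
    last_sample = now;
    int reading = analogRead(ELECTRODE_PIN);   /* 0-1023 ADC counts */
    Serial.print(now);                         /* timestamp in ms   */
    Serial.print(',');
    Serial.println(reading);                   /* CSV: time,value   */
  }
}
```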

Interested in learning more? You can check out the full Wired article here and “The Internet of Things, Stalk by Stalk,” written by Atmel’s very own Paul Rako here.

Arduino-powered Colour Chasers offer music for all

A London-based sound artist and designer named Yuri Suzuki has designed a robot that allows people of all ages and abilities to write music.

“I have passion to make and play music. I used to learn piano, trombone, guitar, however reading musical score is the biggest wall for me,” Suzuki told Wired (UK).

“I used to play trombone in a Ska music band. We had been working together for seven years. However, I got fired because I cannot read musical score. So I dreamed to create new musical notation [to give] dyslexic people [easy access].”

As Wired’s Liat Clark notes, Colour Chasers are aptly named, as the small train-like robots literally chase colors drawn onto a black line, playing a different sound for each different color or shape they meet.

“Each robot is fitted with two sensors that run on [an Atmel-powered] Arduino – one is programmed to follow black lines made by marker pens, and the other to detect different colors,” Clark explained. “There are a total of five different car robots that each read colors differently, producing different sounds, from drums to chords.”
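A stripped-down sketch of that two-sensor arrangement in Arduino-flavored C might look like the following: one reflectance sensor keeps the robot on the black line while a color reading is mapped to a note played through a small speaker. The pin numbers, thresholds and simplified single-channel color sensing are illustrative assumptions, not Suzuki’s actual firmware.

```c
/* Illustrative two-sensor Colour Chaser -- not Yuri Suzuki's actual code. */
#define LINE_SENSOR_PIN   A0   /* IR reflectance sensor aimed at the track  */
#define COLOR_SENSOR_PIN  A1   /* simplified: one analog channel of a color
                                  sensor; real designs read full RGB        */
#define LEFT_MOTOR_PIN    5    /* PWM outputs to a motor driver             */
#define RIGHT_MOTOR_PIN   6
#define SPEAKER_PIN       8
#define LINE_THRESHOLD    500

void setup(void) {
  pinMode(LEFT_MOTOR_PIN, OUTPUT);
  pinMode(RIGHT_MOTOR_PIN, OUTPUT);
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop(void) {
  /* Crude line following: drive straight while the sensor sees the dark
     line, otherwise arc back toward it. */
  if (analogRead(LINE_SENSOR_PIN) < LINE_THRESHOLD) {
    analogWrite(LEFT_MOTOR_PIN, 150);
    analogWrite(RIGHT_MOTOR_PIN, 150);
  } else {
    analogWrite(LEFT_MOTOR_PIN, 150);
    analogWrite(RIGHT_MOTOR_PIN, 60);
  }

  /* Map the color reading to a pitch and play a short note. */
  int color = analogRead(COLOR_SENSOR_PIN);
  if (color > 100) {                             /* something colorful below */
    int pitch = map(color, 100, 1023, 220, 880); /* roughly A3 to A5         */
    tone(SPEAKER_PIN, pitch, 40);
  }
  delay(40);
}
```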

The Colour Chasers were exhibited as a public audiovisual installation “Looks Like Music” this summer at Mudam Luxembourg.

“I [wanted] to show the potential of music and sound. This installation is based on basic musical logic and people understand how the process works,” Suzuki added.

So what’s next for the talented musician? Well, Suzuki and his R&D consultancy Dentaku are currently working on a synthesizer board dubbed Ototo that transforms saucepans into drum kits and makes origami sculptures sing when touched, using accompanying sensors, inputs and touch-pads. Ototo is slated to make its debut at the Abandon Normal Devices Festival Fair on Saturday, October 5th.