Tag Archives: Arduino Due

The Linux Foundation is building an RTOS for the Internet of Things


The Zephyr Project will offer a modular, connected operating system to support IoT devices.


The Linux Foundation recently introduced the Zephyr Project, an open source collaborative effort to build a real-time operating system (RTOS) for the Internet of Things. Announced just days before Embedded World 2016, the project aims to bring vendors and developers together under a single OS, which could make the development of connected devices a simpler, less expensive process.


Industrial and consumer IoT devices require software that is scalable, secure and enables seamless connectivity. Developers also need the ability to innovate on top of a highly modular platform that easily integrates with embedded devices regardless of architecture.

While Linux has proven to be a wildly successful operating system for embedded development, some smart gadgets require an RTOS that addresses the very smallest memory footprints. This complements real-time Linux, which excels at data acquisition systems, manufacturing plants and other time-sensitive instruments and machines that provide the critical infrastructure for some of the world’s most complex computing systems.

If all goes to plan, the Zephyr Project has the potential to become a significant step in creating an established ecosystem in which vendors subscribe to the same basic communication protocols and security settings.

With modularity and security in mind, the Zephyr Project provides the freedom to use the RTOS as is or to tailor a solution. The initiative’s focus on security includes plans for a dedicated working group and a delegated security maintainer. Broad communications and networking support is also addressed and will initially include Bluetooth, BLE and IEEE 802.15.4, with more to follow.

The Zephyr Project aims to incorporate input from the open source and embedded developer communities and to encourage collaboration on the RTOS. Additionally, the project will include powerful developer tools to help advance the Zephyr RTOS as a best-in-breed embedded technology for IoT. The following platforms will initially be supported:

  • Arduino Due (Atmel | SMART SAM3X8E ARM Cortex-M3 MCU)
  • Arduino 101
  • Intel Galileo Gen 2
  • NXP FRDM-K64F Freedom board (ARM Cortex-M4 MCU)

Intrigued? Head over to the Zephyr Project’s official site to learn more.

This Arduino-powered machine turns tweets into cocktails


Who knew you could get drunk on data? 


You’ve most likely read a tweet, and you’ve probably even heard one read aloud, but chances are you’ve never tasted one. That may all soon change, because Clément Gault and Koi Koi Design have developed Data Cocktail, an Arduino-powered machine that whips up cocktails based on, you guessed it, Twitter activity.


Data Cocktail works by scouring the web for the five latest posts mentioning keywords that are linked to available ingredients, represented by differently colored bulbs. (The system accepts words, hashtags and mentions.) These messages are then used to define the composition of the drink and fill the glass accordingly. The result is an original, crowdsourced concoction whose recipe can be printed out.
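As a rough illustration of the idea, the tweet-to-recipe step could be sketched as follows. The keyword-to-ingredient table and the 200 ml glass size here are invented for illustration; Data Cocktail's actual mapping is configured per event and is not published.

```python
from collections import Counter

# Hypothetical keyword-to-ingredient table; the real machine's mapping
# is reconfigured for each event.
INGREDIENTS = {"#sun": "orange juice", "#snow": "coconut milk",
               "#night": "cola", "#rain": "tonic", "#fire": "grenadine"}

def mix_from_tweets(tweets, glass_ml=200.0):
    """Count ingredient keywords across the tweets, then split the glass
    proportionally among the ingredients that were actually mentioned."""
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            if word in INGREDIENTS:
                counts[INGREDIENTS[word]] += 1
    total = sum(counts.values())
    if total == 0:
        return {}               # no matches: an empty (and sober) glass
    return {ing: glass_ml * n / total for ing, n in counts.items()}
```

Five tweets mentioning `#sun` three times, `#rain` twice and `#night` once would split the glass 3:2:1 among orange juice, tonic and cola.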

“If you’re wondering whether a tweet about Santa Claus in Winnipeg, Canada can take part in generating a cocktail in Nantes, we say yes! Data Cocktail is a machine but it doesn’t exclude a minimum of politeness,” its creators reveal. “Once the cocktail mix is realized, Data Cocktail will thank the tweeters who have, without knowing it, helped at realizing it.”


Its creators reveal that they can easily change the keywords, ingredients and proportions to suit specific events. Meaning, the robotic bartender can make drinks based on everything from election coverage (whether you’re experiencing a Trumpertantrum or feeling the Bern) to what’s trending at any particular moment.

In terms of software, Data Cocktail uses the Processing and Arduino programming languages. A first application, developed in Processing, pilots the device. The requests are performed using the Twitter4J library, while the app processes the data and commands the robotic gadget.


As for its electronics, Data Cocktail consists of a robot, solenoid valves and LEDs. The robot is built around a modified Pololu Zumo chassis with a motor shield, a Bluetooth module and an Arduino Pro (ATmega328). Meanwhile, the valves and lights are controlled by an Arduino Due (SAM3X8E) connected via USB.
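The Processing app and the Due-side valve controller need some command format over that USB link. The actual wire protocol isn't published, but a minimal sketch might look like this, with a command syntax that is entirely hypothetical (e.g. "V2:350" meaning open valve 2 for 350 ms):

```python
# Hypothetical serial command format for the valve controller.
# Data Cocktail's real protocol is not published.
def parse_valve_command(line):
    """Parse 'V<index>:<milliseconds>' into a (valve, open_ms) tuple."""
    if not line.startswith("V") or ":" not in line:
        raise ValueError("malformed command: %r" % line)
    valve, open_ms = line[1:].split(":", 1)
    return int(valve), int(open_ms)

# On the microcontroller side, a dispatcher would then pulse the matching
# solenoid valve for the requested duration.
```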

Intrigued? Head over to the project’s page here, or watch it in action below.

Build your own spider-like robot with STEMI


This DIY kit lets kids make their own nature-inspired robot while learning electronics, programming and more. 


What’s better than a bio-inspired, crawling robot? A spider bot that you can build yourself, that’s what. Locomotion mimicking nature has been around for a little while, but up until now has only been available to university researchers. That’s all going to change. In an effort to entice more young Makers to pursue STEM fields, one Croatian startup has developed a DIY smartphone-controlled hexapod.


STEMI, a play on the acronym STEM, ships in the form of a DIY kit along with a series of multimedia tutorials that instruct its teenage Makers to piece together their gadget and bring it to life. More than just a robot, however, STEMI is designed to be a learning experience for users ages 13 and up as they explore the basics of 3D modeling, electronics, Arduino and programming. In the near future, they’ll also be able to create their own 3D-printable custom covers, ranging from Batman to a Walking Dead-like zombie.

Although primarily targeted at the younger generation, there’s nothing that says kids at heart can’t get in on the fun as well. STEMI is capable of performing complex movements, adjusting its height, walking in three different ways and dancing. The best part? Using a smartphone’s built-in gyroscopic sensor, Makers can completely control the robot’s movement by simply tilting their handheld device.
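A tilt-to-walk mapping of the kind described could work along these lines. The dead zone and saturation values are invented for illustration; STEMI's actual control code may map tilt differently.

```python
import math

# Hypothetical mapping from phone pitch/roll (degrees) to a hexapod
# walk command. Thresholds are invented; STEMI's app may differ.
DEAD_ZONE_DEG = 5.0    # ignore tiny tilts so a level phone means "stand still"
MAX_TILT_DEG = 30.0    # tilt at which walking speed saturates

def tilt_to_command(pitch_deg, roll_deg):
    """Return (heading_deg, speed) with speed in [0, 1]."""
    magnitude = math.hypot(pitch_deg, roll_deg)
    if magnitude < DEAD_ZONE_DEG:
        return 0.0, 0.0
    heading = math.degrees(math.atan2(roll_deg, pitch_deg)) % 360.0
    speed = min(magnitude / MAX_TILT_DEG, 1.0)
    return heading, speed
```

Tilting the phone forward walks the robot forward at a speed proportional to the tilt; tilting sideways rotates the heading accordingly.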


Making it even cooler is the fact that STEMI is fully open source, meaning anyone can freely modify its code, blueprints, 3D models and more. The robot itself is built around an Arduino Due-compatible (SAM3X8E) board and a custom PCB packed with an Arduino shield, a Bluetooth module, a USB battery charger, voltage regulators and LED indicators. Aside from that, the kit comes with 18 servo motors, a rechargeable battery pack, aluminum body parts, rubber leg caps, as well as various nuts and spacers.

So, are you ready to begin assembling your own spider bot? Then crawl over to its Indiegogo campaign, where the STEMI team is currently seeking $16,000.

ANDBOT is a C-3PO-like robot for your home


ANDBOT is a humanoid designed to be less of a robot and more of a companion to you and your family.


If there’s one thing that recent crowdfunding projects have demonstrated, it’s that social robots will soon be making their way into our homes. And that’s not necessarily a bad thing, either. Think about it: You’d never have to worry about all those tedious chores. No more sweeping. No more laundry. No more doing the dishes. Heck, no more arguments with your significant other for failing to do something!


Developed by the team over at Advanced Robotics, ANDBOT is a C-3PO-esque humanoid that can handle daily activities and protect your household. The robot boasts an impressive range of hand/arm motions that allows it to perform complex tasks with human-like precision. For instance, you can have it hand-deliver breakfast to you in bed, or, should you get locked out of your home, simply text ANDBOT and it will unlock the front door for you.

Its creators also designed ANDBOT with expandability in mind, meaning you will have the option of adding various modules, whether it’s a vacuum attachment for spring cleaning, a beer holder for your Monday Night Football party, or even a rim for some indoor NERF basketball action.

“Robots have been around for some time, but not many can offer the human likeness as ANDBOT, where its arms can move 90 degrees up/down and hands can rotate 360 degrees. With its full range of motion, imagine the possibilities,” the startup explains. “Not all robots are created equal. With ANDBOT, it is as close to human as you can get, with a full range of hand and arm motions. What we do with our hands, so can ANDBOT, holding, pulling, opening, pushing, etc.”


ANDBOT is equipped with facial recognition, which enables it to distinguish between your family members and unwelcome guests. And just like us, the humanoid is capable of sensing different emotions and reacting with appropriate responses. Having a bad day? Your social bot will always be there for you, especially when no one else is around.

So, what functionalities does ANDBOT possess? For starters, it can serve as your personal assistant with up-to-the-minute reminders and information, your security guard with remote monitoring for intruders and sensors for dangerous gases, your smart home controller with light, thermostat and media center integration, your workout buddy, your own chef, or simply your favorite bedtime storyteller. The hope is that it will become less of a robot and more of a companion to you and your family.

The humanoid is driven by an Arduino Due (SAM3X8E) along with a pair of Arduino Uno (ATmega328) boards. ANDBOT relies on Wi-Fi and Bluetooth connectivity for communication and a 12V lithium-ion battery for power, and runs on the ROS/Android platform. In terms of its electronics, the bot features two HD speakers, LEDs for eyes, a multitude of sensors (ultrasonic, sound, bumper, humidity, carbon monoxide, air quality and temperature), an accelerometer and gyroscope, a LIDAR-Lite laser, a PIR motion detector, motor drivers, a camera and a 10.1″ touch display on its belly.


What’s more, ANDBOT will send notifications to your smartphone via its accompanying app whenever its embedded sensors are triggered. There’s even a built-in portable oxygen system for the elderly or those living in fire prone areas. The better question is, what can’t this social robot do?

Lastly, as we’ve seen with other platforms, ANDBOT will be open source with an SDK that can be used to help expand and improve its capabilities. And more importantly, you will have access to an extensive developer community to further the advancement of the robot.

Phew, that was a lot… Sound like a companion you’d like to have in your household? Head over to ANDBOT’s Indiegogo campaign, where the Advanced Robotics team is currently seeking $150,000. You’ll have to sit tight, though, as delivery isn’t expected to get underway until April 2016.

7Bot is a desktop robot arm that can see, think and learn


This desktop robot can play chess, tic-tac-toe and ping pong against a human.


While industrial robots may not be anything new, the vast majority of them start at around $50,000, not to mention require an engineering background to program. But what if there were a much smaller, IRB 2400-like unit that packed the same punch as its counterparts for a fraction of the cost? That’s the idea behind 7Bot, a desktop robot arm that can see, think and learn.


Designed with aspirations of making robots more accessible for everyone, 7Bot boasts an aluminum body with six high-torque servos and an optimized control algorithm for enhanced accuracy, stability and agility. Its creators tell us that the arm is embedded with an Arduino Due (SAM3X8E).

But that’s not all. 7Bot is equipped with artificial intelligence and will learn as it goes. Looking for someone to play chess against? Need some help doing your homework? Whatever it is, this robotic arm is up for the task! Using the team’s computer vision sample codes, you can adjust the parameters to build an automated assembly line right on your desk. And should you have two 7Bot arms, you can combine them to make your very own humanoid.


In terms of controlling the arm, any common human interface device will do the trick. This includes everything from a traditional PC mouse to a keyboard, as well as gestures using Leap Motion and Kinect sensors. Additionally, custom-built servos with feedback enable you to teach the robotic arm to accomplish tasks without coding.

“You can simply drag each joint of the robot to a series of desired waypoints. The movements will be recorded, and could be replayed in an optimized path. Using teaching mode, you can easily guide your 7Bot arm performing some tasks,” the team writes. “With our embedded inverse-kinematics algorithm, the 7Bot arm can be precisely controlled using coordinates. And we have made web controlling application by using a Raspberry Pi as the host and with real-time feedback.”
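The inverse-kinematics idea mentioned in the quote can be illustrated with the textbook two-link planar case: given a target point, solve for the two joint angles that reach it. The link lengths here are invented, and 7Bot's real solver handles six joints in 3D, so treat this purely as a sketch of the principle.

```python
import math

L1, L2 = 10.0, 10.0   # hypothetical link lengths (cm); 7Bot's differ

def ik_2link(x, y):
    """Return (shoulder, elbow) angles in radians reaching point (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)  # law of cosines
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

def fk_2link(shoulder, elbow):
    """Forward kinematics, handy for checking an IK solution."""
    x = L1 * math.cos(shoulder) + L2 * math.cos(shoulder + elbow)
    y = L1 * math.sin(shoulder) + L2 * math.sin(shoulder + elbow)
    return x, y
```

Running the forward kinematics on an IK solution should land back on the requested target, which is the standard sanity check for a solver like this.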

They have also provided 3D visualization software for programming, which allows you to manipulate the arm intuitively. With this application, you can set and read the position of each joint separately through a real-time graphic interface and then interact with the 3D model using a mouse and keyboard.

“The robot can follow the movement in real-time. Or on the other side, you can perform simulation first, and generate waypoints with the software, and then download the optimized moving path to your 7Bot arm. This is well suited for many algorithms that need lots of iterations in simulation, like reinforcement learning. You can get rid of any low-level coding for the robot.”

As for coding, 7Bot is compatible with Scratch, while more advanced developers have access to a wide range of open source APIs in C and C++.


7Bot is super flexible and can impressively mimic a real human limb. But just in case six degrees of freedom aren’t enough, you can always add a sliding mechanism to gain a seventh. Or, for a roving robot, simply throw it on an omni-directional mobile platform and roll around on its four Mecanum wheels.

The arm comes with a number of accessories too, such as a 3D-printed, dual-finger claw or an air vacuum gripper that can pick up and hold any two-pound object with a smooth exterior. It’s also easy to control with two digital signals, meaning you can use an Arduino, Raspberry Pi or any other microcontroller.

Interested? Head over to its Kickstarter page, where the 7Bot crew is currently seeking $50,000. Delivery is slated for January 2016.

These lights will let you control your smart devices through gestures


LiSense uses the shadows cast by the human body as it blocks light to reconstruct 3D human skeleton postures in real time.


As our homes become increasingly smarter, what if we could use the light around us for more than just illumination? In other words, imagine if the light in your room could sense you waving your hand as you enter, or was able to trigger your smart coffee machine, unlock the door and turn on your entertainment center. While it sounds like something straight out of a sci-fi novel, it may soon all be possible thanks to a new project from researchers at Dartmouth College.


The team is looking to transform ubiquitous light into a medium that integrates communication with human sensing. LiSense works by decoding information made from visible light to turn everyday lighting into sensors that can then recognize and respond to what we do. This is achieved through visible light communication (VLC), which encodes data into light intensity changes at a high frequency invisible to the human eye.
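A toy version of that encoding step might use Manchester coding, where each data bit becomes a pair of intensity levels toggled far faster than the eye can follow. This is a generic illustration of encoding data into intensity changes, not LiSense's actual modulation scheme.

```python
# Generic Manchester coding sketch: each bit maps to a pair of light
# intensity levels. LiSense's real PHY may differ.
def manchester_encode(bits):
    """Bit 1 -> high,low ; bit 0 -> low,high (intensity levels)."""
    out = []
    for b in bits:
        out += [1, 0] if b else [0, 1]
    return out

def manchester_decode(levels):
    """Invert the encoding; assumes the input is symbol-aligned."""
    return [1 if levels[i] == 1 else 0 for i in range(0, len(levels), 2)]
```

Because every bit produces exactly one high and one low level, the average brightness stays constant regardless of the data, which is why the flicker is invisible to the eye.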

Not only does LiSense use light to sense people’s movements, but it also allows them to control devices in their environment with simple gestures, employing light to transmit the information. The hope is that you will be able to gesture and engage with objects in a room via nothing more than light, similar to how you’d use a Kinect or Wii gaming system to interact with your TV.

For LiSense to track a person’s movements, the researchers built a three-meter by three-meter light-sensing testbed with five off-the-shelf Cree LEDs in the ceiling and 324 photodiodes on the floor. A total of 29 Arduino Due (SAM3X8E) and Uno (ATmega328) microcontrollers were embedded as well. The system uses the shadows created by a person standing on the testbed to reconstruct their 3D human skeletal posture in real-time (at 60 Hz).

To get their shadow-based human sensing to work, the researchers had to overcome two critical challenges. Since multiple ceiling lights lead to diminished and complex shadow patterns on the floor, they had to devise light beacons to separate the light rays from individual LEDs and ambient light. Additionally, they came up with an algorithm capable of taking the limited-resolution 2D shadow maps collected by the photodiodes in the floor and reconstructing a person’s posture in 3D.
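The beacon idea can be sketched by giving each LED a distinct flicker frequency and matching a photodiode's observed transition rate against the known set. The frequencies, sample rate and decision rule below are invented for illustration; the paper's actual beacon design is more sophisticated.

```python
# Hypothetical light-beacon sketch: each LED flickers at its own
# frequency so a photodiode reading can be attributed to one LED.
def make_beacon(freq_hz, sample_rate_hz, n_samples):
    """Square-wave intensity samples for one LED at freq_hz."""
    period = sample_rate_hz / freq_hz
    return [1 if (i % period) < period / 2 else 0 for i in range(n_samples)]

def identify_beacon(samples, candidate_freqs_hz, sample_rate_hz):
    """Pick the candidate frequency whose rate best matches the samples.

    A square wave at f Hz makes about 2*f transitions per second, so the
    transition count over the window estimates the flicker frequency."""
    transitions = sum(a != b for a, b in zip(samples, samples[1:]))
    observed_hz = transitions * sample_rate_hz / (2 * len(samples))
    return min(candidate_freqs_hz, key=lambda f: abs(f - observed_hz))
```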


By waving your hand, LiSense lets you freely control things, play games and track behavior without the need for cameras or on-body devices. One day, the team says it may even respond to your feelings. Compared to existing methods that use wireless radio signals such as Wi-Fi to track user gestures, VLC has several appealing properties and advantages. For starters, light-based sensing is secure, doesn’t penetrate walls, and isn’t limited to classifying a pre-defined set of gestures and activities. On top of that, it’s energy efficient, operates at a bandwidth 10,000 times greater than the radio frequency spectrum, and reuses existing lighting infrastructure.

“Light is everywhere and we are making light very smart,” says Xia Zhou, lead author and researcher on the project. “Imagine a future where light knows and responds to what we do. We can naturally interact with surrounding smart objects such as drones and smart appliances and play games, using purely the light around us. It can also enable a new, passive health and behavioral monitoring paradigm to foster healthy lifestyles or identify early symptoms of certain diseases. The possibilities are unlimited.”

Sounds intriguing, right? See it all in action below, and be sure to read the team’s entire paper here.

RepRapPro launches a $300 Delta 3D printer


The Fisher Delta 3D printer is an easy-to-assemble and even easier-to-afford machine for Makers of any level.


Safe to say that the adoption of 3D printing will rely heavily upon both affordability and accessibility to Makers. And one of the companies continuing to lead the way is RepRapPro, who has debuted yet another open source machine for the DIY community. Recently unveiled during 3D Printshow London, Fisher is an easy-to-assemble, Delta style 3D printer that is expected to cost around $300 — quite the wallet-friendly price compared to many other devices on the market today.


“In order to achieve the low price, a Delta configuration was chosen, utilizing mainly parts and processes which can be found in our other RepRap kits,” its team reveals. “Although in this configuration the machine lacks a heated bed, many great features are included, such as an automatic bed probing and new compact all metal hot-end, which all combine to give the same great print quality as all our other RepRap 3D printer designs.”

One of its other notable features is RepRapPro’s Arduino-compatible, 32-bit controller. Based on an Atmel | SMART SAM3X8E Cortex-M3 MCU, the Duet board is equipped with four stepper motor controllers, an SD card slot, as well as USB and Ethernet ports. Makers can drive the printer with a conventional RepRap app like Pronterface or command it via a standard web server. What’s more, an expansion board offers an additional four stepper motor controllers, allowing for a total of five extruders and up to eight axis controls.
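Whichever route the commands take, the board ultimately executes G-code line by line. A minimal parser for such lines, as a generic sketch rather than the Duet firmware's actual code, could look like this:

```python
# Generic G-code line parser sketch; not the Duet firmware's parser.
def parse_gcode(line):
    """Split 'G1 X10 Y-5 F3000' into ('G1', {'X': 10.0, 'Y': -5.0, 'F': 3000.0})."""
    line = line.split(";", 1)[0].strip()      # drop any trailing comment
    if not line:
        return None                            # blank or comment-only line
    words = line.split()
    params = {w[0].upper(): float(w[1:]) for w in words[1:]}
    return words[0].upper(), params
```

A firmware loop would read each line from USB, the network or the SD card, parse it this way, and dispatch on the command word (`G1` for a move, `M104` to set hot-end temperature, and so on).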


Key specs of the Fisher:

  • Build volume: 150mm diameter, 180mm height
  • Nozzle diameter: 0.4mm
  • Resolution: 12.5 µm in all axes
  • Print bed: Removable
  • Extruder: Direct drive extruder with an all-metal stainless steel nozzle
  • Connectivity: Ethernet and USB interface
  • Storage: On-board microSD
  • Software: Prints G-code files provided by Slic3r and other open-source slicing programs

At the moment, the design is in its beta stage, as the team gathers feedback from end users throughout the open source community. Meanwhile, upgrades are already in the works which include a heated bed and color touchscreen kits. Interested? Head over to its official page here.
