Tag Archives: megaAVR

The first-ever Rad Tolerant megaAVR is out of this world!


With billions of AVR chips already deployed throughout the world, it’s now time to take them into space!


This news may come as one small step for boards, one giant leap for Maker-kind: the ATmegaS128 has launched! Not only does Atmel’s first µC Rad Tolerant device share the popular features of the megaAVR family, this out-of-this-world MCU delivers full wafer lot traceability, a 64-lead ceramic package (CQFP), space screening, space qualification according to QML and ESCC flows, and total ionizing dose tolerance up to 30 krad(Si) for space applications. What’s more, the ATmegaS128 is latch-up immune thanks to a dedicated silicon process: SEL LET > 62.5 MeV at 125°C, 8MHz/3.3V. SEU sensitivity to heavy ions is estimated at 10⁻³ errors/device/day for low Earth orbit applications.

With billions of commercial AVR chips widely deployed throughout the world, the new space-grade AVR family benefits from the support of the Atmel Studio ecosystem and lets aerospace developers use the industrial version of the ATmega128 to prototype their applications for a fraction of the cost. The new device comes in a hermetic ceramic package and is pin-to-pin and drop-in compatible with existing ATmega128 MCUs, allowing flexibility between commercial and qualified devices, enabling faster time to market and minimizing development costs. With this cost-effective approach and a plastic HiRel-qualified version, the ATmegaS128 can also be considered for more general aerospace applications, including Class A and B avionics-critical cases where radiation tolerance is also a key requirement.

“With nearly three decades of aerospace experience, we are thrilled to bring one of our most popular MCU cores to space — the AVR MCU,” explained Patrick Sauvage, General Manager of Atmel’s Aerospace Business Unit. “By improving radiation performance with our proven Atmel AVR cores and ecosystem, the new ATmegaS128 provides developers targeting space applications a smaller footprint, lower power and full analog integration such as motor and sensor control along with data handling functions for payload and platform. We look forward to putting more Atmel solutions into space.”

Among its notable features, the space-ready MCU boasts high-endurance non-volatile memory, robust peripherals (including 8- and 16-bit timers/counters, six PWM channels, an 8-channel 10-bit ADC, TWI/USART/SPI serial interfaces, a programmable watchdog timer and an on-chip analog comparator), power-on reset and programmable brown-out detection, an internal calibrated RC oscillator, external and internal interrupt sources, and six sleep modes including power-down, standby and extended standby.

The STK600 starter kit and development system for the ATmegaS128 will give users a quick start in developing code on the AVR, with advanced features for prototyping and testing new designs. The recently-revealed AVRs are supported by the proven Atmel Studio IDP for developing and debugging Atmel | SMART ARM-based and AVR MCU applications, along with the Atmel Software Framework. Intrigued? Check out the µC Rad Tolerant device here.

Eedu is an easy-to-use drone kit for young Makers


Assemble. Code. Fly. It’s as simple as that.


According to Mary Meeker’s 2015 “State of the Internet” presentation, drone shipments are estimated to hit 4.3 million units this year, with consumer drone usage expected to jump 167%. Combine those figures with the hundreds of thousands of Makers looking to begin tinkering with their next DIY project, and well, you have yourself quite the market. Much like a number of educational robotic kits that have been introduced to provide children with basic electronics and programming principles over the years, one Las Vegas startup is looking to take that education from the ground and into the skies.

Inspired by the hands-on learning that goes on inside classrooms, Skyworks Aerial Systems has launched Eedu, an easy-to-use drone kit that allows young Makers, educators and hobbyists starting out to devise new applications beyond just flying cameras. In order to make this a reality, the team has developed an intuitive platform that gives Makers the canvas they need to design their own UAV. The airborne apparatuses can be quickly pieced together using nothing more than the included parts, and are completely compatible with Arduino shields and other open hardware (littleBits and Seeed Studio).

Once assembled, the drone can be paired with its special robotic development environment (RDE) called Forge. This cloud-based, community-driven software lets users code their vision into a reality, while offering ground control, community interaction and various programming capabilities. What’s nice is that, being open source, Makers can build from existing code. As soon as an app is completed and compiled onto their Eedu, the DIY copter is ready for the skies.

The drone itself is based on an Intel Edison, which enables programs to be easily created on a full Linux OS and boasts enough processing power to develop more advanced apps, and employs an ARM Cortex-M4 running an RTOS for sensor processing, main flight control and interfacing with the Edison. Eedu also comes with a set of four brushless motors with standard trapezoidal drive, each driven by a megaAVR MCU. What’s more, the machine features a sensor mounting platform, an Arduino shield port and a quick-release battery pack. Crafted with safety in mind, the propellers are extremely lightweight and made of soft plastic, alongside intelligent speed controllers that automatically disable the rotors whenever something gets in the way.

Beyond that, the team has unveiled LUCI, a highly-advanced, adaptable flight controller driven by an Atmel | SMART Cortex-M7 MCU. Equipped with all of the electronics required for a drone to take to the sky, LUCI includes four built-in 20A brushless speed controllers, an Intel Edison expansion port, a DSMX-compatible radio receiver, an optical flow position sensor, GPS and Arduino shield capability. Impressively, she can even be integrated into a number of consumer 250mm-sized drones, giving Makers the ability to produce their own LUCI- and Forge-powered UAV.

With hopes of granting future Makers and engineers access to the necessary tools for innovation, the team has given its crowdfunding backers the option to purchase a kit for students or entire classrooms.

“More than ever, schools are having a hard time acquiring technology. We passionately believe that students’ accessibility to technology should not be hindered! As such, we are creating a donation fund that will allow us to distribute drones to schools across the nation.”

Intrigued? Fly on over to Eedu’s Kickstarter page, where Skyworks Aerial Systems is currently seeking $100,000. Delivery is expected to begin in December 2015.

Detect air pollution levels in your city with this helmet


This sensorial wearable prosthesis provides a new human sense.


From handheld devices that mapped air pollution, to smart umbrellas that sensed it, to creations that turned offensive air into enticing art, we thought we’d seen it all when it came to Makers and their surrounding environment. That was before coming across this wearable project by Maker Susanna Hertrich. Living with poor air quality seems to be what most of us are doing these days, particularly those of us who happen to reside in metropolitan areas such as Beijing or New York City that are filled with exhaust, smoke and an omnipresent haze that never seems to fade.

Cognizant of this, Hertrich has devised what she calls the Jacobson’s Fabulous Olfactometer (JFO), a head-mounted contraption that offers sensory augmentation for the human olfactory system under the extreme living conditions of polluted cities. While the device may not resemble other wearable devices on the market (and appears better suited for steampunk attire or medieval times, for that matter), the JFO enables its user to directly sense chemicals in the air and, as a warning signal, modifies the wearer’s face in a manner similar to the ‘Flehmen response.’ (This refers to the way in which cats, horses, donkeys, cattle and a whole slew of other animals curl their upper lip back on itself, open their mouths and lift their heads to the sky.)

The device isn’t designed to help you entertain the crowd with funny faces, but rather, to detect the levels of air pollution in your immediate vicinity at a far higher level of accuracy. In fact, Hertrich says that it is “an accelerated human evolution driven by means of existing technologies — with the goal to help us cope with extreme environments. The device utilizes off-shelf-technology to fill a gap in human evolution and provide us with a new sense.”

Embedded into the forehead of the prosthesis are chemical sensors, which are capable of collecting air data and detecting carbon dioxide levels. This data is then fed to a megaAVR-based Arduino board, which deciphers whether CO2 levels are high enough to be harmful. If so, motors activate gears that pull the wearer’s upper lip upwards, simulating the aforementioned “Flehmen response” whenever the danger threshold is exceeded.
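The decision step itself is simple enough to sketch. Below is a minimal Python simulation of that logic; the threshold value and function name are illustrative assumptions, since Hertrich hasn’t published the firmware:

```python
# Minimal sketch of the JFO's decision step.
# CO2_DANGER_PPM is a hypothetical threshold, not a figure from Hertrich's build;
# the real device runs this logic on a megaAVR-based Arduino board.
CO2_DANGER_PPM = 1000

def motors_should_engage(co2_ppm):
    """Return True when the lip-pulling gear motors should activate."""
    return co2_ppm >= CO2_DANGER_PPM

# Simulated sensor samples: clean air, moderate, and harmful levels
samples = [420, 800, 1500]
actions = [motors_should_engage(ppm) for ppm in samples]
print(actions)  # [False, False, True]
```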

“Can we accelerate human evolution by means of existing technologies to cope with extreme living environments? What if we extend our sensorial abilities to ‘smell’ airborne chemicals?” Hertrich asks. Whether or not this is the solution, the device blends both futuristic tech with inherent traits of animals to solve an all-too-real problem. Intrigued? Head over to the Maker’s official page to learn all about the sensorial project.

Building an ATmega1284P prototyping board

For his latest vintage CPU/MCU mashup, Maker Dave Cheney recently decided to replace an Arduino Mega board with a bare megaAVR microcontroller to create a two chip solution — just the Atmel and the 6502, no glue logic or external support chips.

“While toying around with the project, [Cheney] found the microcontroller he was using, the ATMega1284P, was actually pretty cool. It has eight times the RAM as the ever-popular 328P, and twice as much RAM as the ATMega2560P found in the Arduino Mega,” Hackaday’s Brian Benchoff writes.

The minimal design was laid out in Fritzing along with a crystal, load capacitors, an ISP connector, and pins for a serial connector. “The trickiest piece was fitting the crystal and load capacitors into the design without disrupting too many of the other traces. It worked out well so I decided to add ICSP and FTDI headers,” Cheney notes.

Since the ATmega1284P MCUs that he ordered were unprogrammed, all the bootloading was done through Manicbug’s Mighty1284 Arduino Support Package. Though the package only supported Arduino 1.0, the Maker still had a nifty little prototyping board on hand.

“I’m smitten with the ‘1284P. It feels like the right compromise between the pin starved ‘328 and the unfriendly ‘2560 series. The 1284P supports more SRAM than either of its counterparts and ships in a package large enough that you get a full 24 pins of I/O.”

Interested in learning more about Cheney’s build? You can get a detailed breakdown of the prototyping board here.

Creating an open-source, yearlong time-lapse camera

At first, all Maker “val3tra” wanted was an RF-accessible camera, capable of snapping some photos, saving them onto a microSD card and, on occasion, relaying them to a computer via an RF link. Well, the project has now evolved into an open-source device capable of capturing a year-long time-lapse video.

With the idea of leaving the camera “in a nice spot and coming back next year, without worrying about getting power there,” the build first began with a $20 JPEG camera from eBay that was modded for 3.3V, along with a $4 RF module, a megaAVR MCU and some batteries. The camera captured 640×480 frames averaging only 48KB each, while the assembled components drew nearly 100 joules of energy per hour.

Since a D-cell holds about 60,000 joules, the Maker estimated that four of them would provide enough run time for about 200 days. As Hackaday’s Brian Benchoff notes, “This build was then improved, bringing the total battery consumption down to about 3.5-4 Joules per frame, or at one frame every 10 minutes, about 24 Joules an hour. That’s impressive, and getting this camera to run longer than a dozen or so months raises some interesting challenges. The self-discharge of the battery must be taken into account, and environmental concerns – especially when leaving this camera to run in a Moscow winter, seen in the video below – are significant.”
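Those figures are easy to sanity-check. A rough Python calculation using the article’s numbers (the quoted 3.5-4 J per frame, one frame every 10 minutes, four D-cells at roughly 60,000 J each) lands well past the one-year mark before self-discharge and cold-weather losses are considered:

```python
# Rough runtime check using the improved consumption figures from the article
joules_per_frame = 3.7           # mid-point of the quoted 3.5-4 J range
frames_per_hour = 6              # one frame every 10 minutes
draw_j_per_hour = joules_per_frame * frames_per_hour   # ~22 J/h ("about 24")

battery_joules = 4 * 60_000      # four alkaline D-cells, ~60,000 J each
runtime_days = battery_joules / draw_j_per_hour / 24
print(round(runtime_days))       # ~450 days, ignoring self-discharge
```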

Power was supplied from a 4.8V+ battery pack through a 3.3V LDO, so four alkaline batteries were ideal. “I thought of using a switching regulator to increase efficiency, but it just isn’t worth it on this scale — at best you can get a 20% increase in run time,” val3tra adds.

Now, capturing a frame takes three seconds at 100mA, plus another seven seconds at 60mA to write the picture to the card. Between frames, the Maker says, the camera stays in deep sleep, consuming just 91µA.
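Those current figures line up neatly with the quoted 3.5-4 J per frame, assuming the energy is drawn from the ~4.8V pack (with a linear LDO, battery current roughly equals load current):

```python
# Energy per frame from the quoted currents, measured on the battery side
# of the 3.3V LDO (4.8V pack voltage is an approximation)
v_batt = 4.8
capture_j = 3 * 0.100 * v_batt     # 3 s at 100 mA  -> ~1.44 J
write_j   = 7 * 0.060 * v_batt     # 7 s at 60 mA   -> ~2.02 J
sleep_j   = 600 * 91e-6 * v_batt   # 10 min of deep sleep at 91 uA -> ~0.26 J

frame_j = capture_j + write_j + sleep_j
print(round(frame_j, 2))           # ~3.72 J, inside the quoted 3.5-4 J range
```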

Interested in learning more about this megaAVR based design? You can read the Maker’s entire log here, while also watching the time-lapse in action below.

This $11 robot can teach kids how to program

A group of Harvard University researchers — Michael Rubenstein, Bo Cimino, and Radhika Nagpal — have developed an $11 tool to educate young Makers on the fundamentals of robotics. Dubbed AERobot (short for Affordable Education Robot), the team hopes that it will one day help inspire more kids to explore STEM disciplines.

Fueled by the recent emergence of the Maker Movement, robots are becoming increasingly popular throughout schools in an effort to spur interest in programming and artificial intelligence among students.

The idea behind this particular project was conceived following the 2014 AFRON Challenge, which encouraged researchers to design low-cost robotic systems for education in developing countries. As Wired’s Davey Alba notes, Rubenstein’s vast experience in swarm robotics led him to mod one of his existing systems to construct the so-called AERobot. While it may not be a swarm bot, the single machine possesses a number of the same inexpensive components.

So, what is the AERobot capable of doing?

  • Moving forward and backward on flat, smooth surfaces
  • Turning in place in both directions
  • Detecting the direction of incoming light
  • Identifying distances using reflected infrared light
  • Following lines and edges

With a megaAVR 8-bit microcontroller as its brains, the team assembled most of the other electronic parts with a pick-and-place machine and, to reduce costs some more, used vibration motors for locomotion and omitted a chassis. Unlike a number of other bots, AERobot is equipped with a built-in USB plug that enables it to be inserted directly into any computer with a USB port.

“Using this USB connection, it can recharge its lithium-ion battery and be reprogrammed all without any additional hardware. AERobot has holonomic 2D motion; using two low-cost vibration motors, it can move forward, backwards, and turn in place on a flat, smooth surface such as a table or whiteboard. It also has three pairs of outward-pointing infrared transmitters and phototransistors, allowing it to detect distance to obstacles using reflected infrared light, and passively detect light sources using just the phototransistors.”

In addition, the bot features one downward-pointing infrared transmitter along with a trio of infrared receivers to detect the reflectivity of the surface below, which is useful for line following. To aid in learning programs and debugging, AERobot also boasts an RGB LED.
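The line-following behavior described above reduces to a simple threshold decision over the downward-facing receivers. Here’s a hedged Python sketch of that idea; the normalized readings, threshold and action names are illustrative assumptions, not AERobot’s actual firmware:

```python
# Illustrative line-follower decision from three downward reflectivity readings,
# normalized to 0..1 where low values mean a dark line under the sensor.
# The threshold and action names are assumptions, not AERobot's real code.
LINE_THRESHOLD = 0.3

def steer(left, center, right):
    """Choose a motion action from three reflectivity readings."""
    if center < LINE_THRESHOLD:
        return "forward"      # line is under the middle receiver
    if left < LINE_THRESHOLD:
        return "turn_left"    # line drifted toward the left sensor
    if right < LINE_THRESHOLD:
        return "turn_right"   # line drifted toward the right sensor
    return "search"           # line lost; rotate until it reappears

print(steer(0.8, 0.1, 0.9))   # forward
```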

On the software side, AERobot uses a graphical programming environment, which makes reprogramming easy for beginners. By modifying the Minibloq programming language, Rubenstein says, you don’t really need to type code; instead, you just drag pictures. He went on to tell Wired, “Say I wanted an LED on the robot to turn green. I would just drag over an image of an LED, and pick the green color.”

Interested in learning more? You can scroll on over to the project’s official page or read its entire Wired feature here.

This portable device takes air monitoring into its own hands

Designed by the Brooklyn-based HabitatMap team, AirBeam is a portable, palm-sized system for mapping, graphing and crowdsourcing air pollution in real-time as you make your way around city streets. While the wearable instrument may not purify the air, it does enable you to monitor what you are breathing, thereby increasing your awareness of the growing issue. As its creators note, pollution is among the leading causes of chronic illness, as well as a contributor to a number of terminal illnesses.

In an effort to share and improve the atmosphere, the device is powered by an Atmel ATmega32U4 and based on the Arduino Leonardo bootloader. AirBeam uses a light-scattering method to take regular measurements of fine particulate matter (also known as PM2.5), converts the data into a more digestible form and routes it to its companion smartphone app via Bluetooth. PM2.5 is just one of the six air pollutants the EPA regulates.

Since its founding in 2011, the AirCasting team has been working diligently to create a wearable device that would not only increase the amount of data collected, but improve its accuracy as well. Up until its Kickstarter campaign, HabitatMap had used a series of hacked-together third-party devices to measure air quality.

The Android app then maps and logs the data in real-time. Those wishing to share their findings can also add to HabitatMap’s crowdsourced map of air quality readings, which indicates where PM2.5 concentrations are the highest and lowest.

AirBeam is just one component of the open-source AirCasting platform — which consists of the mobile app, online platform and the megaAVR embedded wearable — enabling so-called AirCasters to individually and accurately collect and broadcast their surrounding air quality data. As the team points out, at its core, AirCasting is a DIY air monitoring movement that informs and empowers citizen scientists to take “matters into their own hands.” After all, the more cognizant we are about the air we breathe in, the better!

Here’s a surprise: The air in and around the New York City subway is downright disgusting. In fact, “You’re breathing in diesel exhaust, steel particles, sulfur dioxide. It’s well above the EPA’s standard,” HabitatMap founder Michael Heimbinder tells Wired.

The mobile app has been well-received, having already been downloaded over 10,000 times with thousands of active changemakers currently using the AirCasting platform. Pledges garnered through the Kickstarter will further enable HabitatMap to run more programs at schools and in communities and do so cheaper, faster, and with more devices.

This initiative is just part of an ever-evolving, emerging trend focused on gathering ambient data throughout today’s urban spaces. While a number of cities are embedding sensors and other technologies to attain information on noise levels and air quality, as Wired points out, what sets AirBeam apart is the idea that everyday citizens can contribute to this rise in data themselves.

Interested in learning more? Head over to its official Kickstarter page, which successfully completed its funding round moments ago.