
Are conductive temporary tattoos the future of wearables?


Time to get skintimate with Tech Tats.


Although there’s already an abundance of activity monitoring wearables on the market today, mobile development studio Chaotic Moon is exploring a new frontier in the industry. The Austin-based firm has decided to go beyond just a fitness tracker with a collection of biosensors that affix to your skin like a temporary tattoo.


In one of its use cases, the aptly named Tech Tats consist of an ATtiny85 that stores and receives body data from sensors via Bare Conductive’s Electric Paint. This combination of basic components and conductive ink comes together to create a circuit that essentially turns you into a cyborg. There’s even some room for an ambient light sensor that illuminates LEDs whenever it’s dark. And unlike most wellness devices, the temporary tattoo can be worn in places other than the wrist, all while remaining unnoticeable.
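Chaotic Moon hasn’t published the firmware, but the light-sensing behavior is easy to picture in Arduino terms. Below is a minimal, hypothetical sketch for an ATtiny85 running an Arduino-compatible core that lights an LED when a photoresistor reports darkness; the pin assignments and threshold are assumptions for illustration only.

```cpp
// Hypothetical sketch: light an LED when ambient light drops.
// Assumes an ATtiny85 with an Arduino-compatible core, a photoresistor
// voltage divider on analog pin A3 and an LED on digital pin 0.

const int LIGHT_SENSOR = A3;    // photoresistor divider (assumed wiring)
const int LED_PIN = 0;          // indicator LED (assumed wiring)
const int DARK_THRESHOLD = 200; // tune for your divider; 0-1023 scale

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int light = analogRead(LIGHT_SENSOR);  // higher reading = brighter
  digitalWrite(LED_PIN, light < DARK_THRESHOLD ? HIGH : LOW);
  delay(100);                            // simple polling interval
}
```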

Tech Tats boast various applications, with health and mobile payments being two of them. For one, the biosensors can be stuck on the skin once a year in place of an annual physical, keeping tabs on all of the vitals a doctor would normally check. The information can then be sent to the doctor, who will notify you only if there is an issue. This can also come in handy following surgery to better track a patient’s progress.

According to Chaotic Moon, the temporary tattoo can read body temperature as well as sense if someone is stressed based on sweat, heart rate and hydration levels. Throw on a BLE module and data can be wirelessly transferred to an accompanying smartphone app, or uploaded through location-based low-frequency mesh networks.
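The post doesn’t detail the firmware side of that transfer, but streaming readings to a serial BLE bridge is straightforward on an AVR. Here’s a hedged sketch assuming an HM-10-style BLE-to-UART module wired to two spare pins via SoftwareSerial; the pins and the sensor on A0 are illustrative assumptions, not Chaotic Moon’s design.

```cpp
#include <SoftwareSerial.h>

// Assumed wiring: a serial BLE bridge (HM-10 style) on pins 3 (RX) and 4 (TX),
// and an analog biosensor on A0. All pin assignments are illustrative.
SoftwareSerial ble(3, 4); // RX, TX

void setup() {
  ble.begin(9600);        // common default baud for serial BLE modules
}

void loop() {
  int reading = analogRead(A0);  // raw sensor value, 0-1023
  ble.print("sensor=");
  ble.println(reading);          // the paired smartphone app parses these lines
  delay(1000);                   // one sample per second
}
```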


Beyond the medical field, Tech Tats can find a home in the banking industry, too. Instead of carrying a wallet with all of your most personal information in your back pocket, these conductive patches can be employed to authorize payments in a similar fashion to Apple Pay.

Aside from that, Chaotic Moon’s bio-wearable can even play a role in the military setting by detecting poisons in the air, pathogens in a soldier’s body or identifying when they’re injured or stressed.

Could temporary tattoos be the future of wearable technology? Only time will tell. But until then, you can watch Chaotic Moon explain their innovation in the video below!

 

maXTouch U family opens up a world of possibilities for next-gen devices


This new controller family will make touchscreen devices less frustrating and more enjoyable to use.


It’s safe to say that touchscreens have surely come a long way since Dr. Samuel C. Hurst at the University of Kentucky debuted the first electronic touch interface back in 1971. Despite their ubiquity today in just about every device, the technology doesn’t always work as well as it should given recent advancements. As VentureBeat’s Dean Takahashi points out, displays remain frustratingly unresponsive to finger taps, consume a lot of power, and quite frankly, are still pretty bulky — until now.


That’s because Atmel has launched a next generation of sensor chips that will pave the way to much better (and more delightful) tactile experiences for gadgets ranging from 1.2” smartwatch screens to 10.1” tablet displays. Following in the footsteps of its older siblings, the new maXTouch U family will enable optimal performance, lower power consumption leveraging picoPower technology, and of course, thinner screens.

More apparent than ever before, the use of touch-enabled machinery has exploded over the past five years. As a result, there has been an ever-growing need to develop touchscreens with extremely high touch performance, ultra-low power and more sophisticated industrial designs with thinner screens. Not to mention, the anticipated surge in wearables has also created a demand for extremely small touchscreen controllers with ultra-low power consumption in tiny packaging. Luckily, this is now all possible thanks to the maXTouch U family which crams pure awesomeness in a 2.5-millimeter by 2.6-millimeter space (WLCSP).


Designers can now build extremely innovative thin and flexible touchscreen designs using single-layer, on-cell and hybrid in-cell touchscreens with intelligent wake-up gestures and buttons. What this means is that the technology can support entry-level smartphones, slick wearable gizmos, super tablets and everything in between on a full range of stack-ups.

The most notable features of the U family include low power modes down to 10µW in deep sleep for wearables such as smartwatches, active stylus support, 1.0-millimeter passive stylus support (so users can write with things like pencils on a touchscreen), as well as up to a 20-millimeter hover distance (so that a user can answer a phone call with a wet hand). What’s more, the touch controllers can sense water and reject it as a touch action, and work with multiple fingers — even if someone is wearing gloves.
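The maXTouch U parts are driven by a host device’s touch stack rather than by hobbyist sketches, so their register interface isn’t something to improvise here. Still, the underlying principle of capacitive sensing is easy to demonstrate on any AVR board with the community CapacitiveSensor Arduino library; the pin choices below are assumptions.

```cpp
#include <CapacitiveSensor.h>

// Demonstrates the capacitive sensing principle (not the maXTouch interface):
// a ~1 MΩ resistor sits between send pin 4 and receive pin 2, with a foil pad
// or wire on pin 2 acting as the electrode. Pin choices are assumptions.
CapacitiveSensor pad(4, 2);

void setup() {
  Serial.begin(9600);
}

void loop() {
  long raw = pad.capacitiveSensor(30); // 30 samples; larger value = finger near
  Serial.println(raw);                 // watch readings rise on touch or hover
  delay(100);
}
```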

Binay Bajaj, Atmel Senior Director of Touch Marketing, explains that the recently-revealed series provides all the necessary building blocks for futuristic mobile gadgetry. The chips are available in samples today, while production versions will be ready in the third and fourth quarters.

“Our expertise in ultra-low power MCUs and innovative touch engineering have allowed us to bring a superior series of devices to market that is truly an innovative collection to drive next-generation touchscreens. We are a leading provider of touchscreen devices to a variety of markets adopting capacitive touchscreens,” Bajaj adds.

Let’s take a closer look at the six new maXTouch U devices:

  • mXT735U is the perfect device for the entry-level tablet, delivering robust moisture support and excellent noise immunity for touchscreens up to 10.1″.
  • mXT640U supports touchscreens up to 6 inches, offering 1mm passive stylus support, thin stack support including a 0.4mm cover lens for GFF stacks, up to 25mm hover detection and moisture resistance.
  • mXT416U delivers extremely high touch performance including 2.5mm passive stylus, excellent moisture support, noise immunity and up to 30mm large finger touch detection.
  • mXT336U is targeted for mid-range smartphone applications, delivering a perfect balance between performance and form factor.
  • mXT308U is geared towards low-end smartphone applications emphasizing simplicity and robustness.
  • mXT144U is designed specifically for wearable applications. The mXT144U features picoPower with 10µW in deep sleep mode and is the smallest hybrid sensing touchscreen controller, packaged in a 2.5mm x 2.6mm WLCSP. This device is the ideal solution for today’s and tomorrow’s wearable devices.

Introducing the new Atmel | SMART SAM C family


Atmel unveils an innovative 5V Cortex-M0+ MCU series with integrated peripheral touch controller.


Say hello to the Atmel | SMART SAM C family, the world’s first full 5V ARM Cortex-M0+-based MCU series with an integrated peripheral touch controller (PTC). The newest batch of MCUs innovatively combines 5V, DMA performance and a PTC with excellent moisture tolerance. Beyond that, the devices integrate advanced analog capability and offer EMI and ESD protection, making them ideal for the rapidly expanding smart appliance and industrial markets.


Atmel | SMART microcontrollers with PTC are currently in mass production at leading appliance manufacturers worldwide. By adding full 5V functionality on an ARM Cortex-M0+ based core, along with upcoming support for the IEC 60730 Class B Safety Library, the SAM C lineup — including the SAM C20 and SAM C21 — is the perfect solution for partnering with industrial and white goods companies to power next-generation applications for the burgeoning Internet of Things.

Leveraging over two decades of MCU success, the latest series incorporates Atmel’s proprietary smart peripherals and Event System, and is also pin- and code-compatible with the SAM D and SAM L families. The SAM C is fully supported by Atmel Studio, Atmel’s free integrated development environment, and program examples and drivers for all peripherals are available through the Atmel Software Framework.

“Atmel leverages its leadership position in both MCU and touch with the new SAM C series,” explained Reza Kazerounian, Atmel SVP and GM, Microcontroller Business Unit. “The SAM C series uniquely combines support for 5V on a Cortex-M0+ based MCU with an integrated PTC, bringing an industry-first product to market for next-generation industrial and appliance applications.”

Among the notable features of the SAM C:

  • Extends the ARM Cortex-M0+ based MCU with a hardware divide and square root accelerator, running at 48MHz
  • Large memories with SRAM up to 32KB and embedded Flash up to 256KB
  • Supports 2.7V to 5.5V operating voltage
  • Integrates the Atmel QTouch Peripheral Touch Controller
  • Incorporates Atmel’s proprietary DMA with SleepWalking, Event System and SERCOM
  • Dual 12-bit ADCs and a 16-bit Sigma Delta ADC
  • Dual CAN 2.0 with FD support

To help accelerate a designer’s development, the SAM C21 Xplained Pro is now selling for just $39. These boards include an embedded debugger and programmer and have a wide range of compatible extension units. Standalone programmer/debugger solutions supporting the SAM C family are also available from both Atmel and third parties.

Video Diary: A look back at Embedded World 2015


Weren’t able to join us in Nuremberg? 


With another Embedded World in the books, here’s a look back at some of Atmel’s latest smart and securely connected solutions that are ready to power next-generation Internet of Things (IoT) applications.

Andreas von Hofen shows off the new automotive grade ARM Cortex-M0+-based SAM DA1. The recently-revealed family of MCUs feature an integrated peripheral touch controller (PTC) for capacitive touch applications.

Geir Kjosavik demonstrates a QTouch-based water level sensing application that highlights its advanced HMI and sensing capabilities. Notable uses for this solution include automotive liquid containers and coffee machines.

Dr. Atta Römer explores the latest advancements in phase measurement by exhibiting various localization applications based on 802.15.4 transceivers. Among those examples is Agilion, who showed off its latest e-ink display ID badge based on an Atmel transceiver that is capable of tracking employees in emergency situations, transmitting data and managing access.

Ingolf Leidert addresses Atmel’s newest development kit for ZigBee Light Link solutions using a pair of SAMR21ZLL-EK boards. In this particular demonstration, one board served as a ZigBee Light Link remote, while the other acted as a light.

Controllino is an open-source programmable logic controller (PLC) built around ATmega328 and ATmega2560 microcontrollers. The startup’s CEO Marco Riedesser went 1:1 with Artie Beavis to delve deeper into the Arduino-compatible PLC that enables Makers and designers to produce and control a wide range of IoT projects, ranging from industrial to home automation applications.

Lionel Perdigon introduces the newest series in the Atmel | SMART ARM Cortex-M portfolio, the SAM E70 and the SAM S70. These Cortex-M7-based MCUs are ideal for connectivity and general purpose industrial applications, while the auto-grade SAM V70 and SAM V71 are perfectly suited for in-vehicle infotainment, audio amplifiers, telematics and head unit control.

The Internet of Things requires a system-level solution encompassing the whole system, from the smallest edge/sensing node devices to the cloud. That is why Atmel has partnered with best-in-class cloud partners — including PubNub, Proximetry and Arrayent — that can support a variety of applications for both Tier-1 OEMs and smaller companies. As Ramzi Al-Harayeri explains, Atmel has integrated the partners’ technologies into its cloud solutions framework, seamlessly adding cloud platform functionality to all of the company’s wireless MCU offerings.

Thomas Wenzel showcases the latest version of Atmel’s connected car solution, AvantCar 2.0. Focusing on user requirements for next-generation vehicles, this futuristic center console concept delivers an advanced human machine interface (HMI). Beyond that, the new centerstack includes curved touchscreens highlighting HMI in upcoming automobiles using Atmel technologies including XSense, maXTouch, AVR MCUs and local interconnect network.

Bosch Sensortec’s Fabio Governale and Divya Thukkaram unveil the latest extension board for the incredibly popular Xplained platform. Featuring a BNO055 intelligent 9-axis absolute orientation sensor, the next-gen device connects directly to Atmel’s Xplained board, making it ideal for prototyping projects for the Internet of Things, wearables and gaming markets, as well as for applications like personal health and fitness, indoor navigation, and others requiring context awareness and augmented reality for a more immersive experience.

David Lindstrom of Percepio takes us through some of the innovative features of Atmel Studio 6.2, including the MTB support available on the new SAM D21 board. As the demo reveals, it’s super easy to get started, enable Trace View and run the system using the all-in-one collaborative environment for embedded design.

Sankaranarayanan Kitchiah delves deeper into Atmel’s BLDC motor control development platform using a SAM D21 MCU and the Atmel Data Visualizer (ADV) application.

Arduino and Adafruit unveil the Arduino Gemma

During his Maker Faire Rome presentation, Arduino Co-Founder Massimo Banzi offered attendees a preview of the company’s new collaboration with Adafruit — the Arduino Gemma, a tiny wearable MCU board packed in a 1-inch (27mm) diameter package.


Similar to the original Adafruit Gemma, the mini yet powerful wearable platform board is powered by the versatile ATtiny85. The board will be default-supported in the Arduino IDE, equipped with an on/off switch and a microUSB connector. Since it is programmable with the Arduino IDE over USB, all Makers will have the ability to easily create wearable projects with all the advantages of being part of the Arduino family.


“We wanted to design a microcontroller board that was small enough to fit into any project, and low cost enough to use without hesitation,” Adafruit’s Limor Fried (aka LadyAda) explained in a blog post last September. “Gemma is perfect for when you don’t want to give up your Flora and aren’t willing to take apart the project you worked so hard to design. It’s our lowest-cost sewable controller.”

Ideal for small and simple projects sewn with conductive thread, the [tinyAVR based] Arduino Gemma fits the needs of nearly every entry-level wearable creation — ranging from reading sensors to driving addressable LED pixels.
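As a taste of the addressable-LED use case, here’s a minimal sketch using Adafruit’s NeoPixel library, which supports the ATtiny85-based Gemma; the data pin and pixel count are assumptions for a small sewn project.

```cpp
#include <Adafruit_NeoPixel.h>

#define PIXEL_PIN   1  // Gemma pin wired to the pixel data line (assumed)
#define PIXEL_COUNT 4  // a short chain suits the ATtiny85's 0.5KB of SRAM

Adafruit_NeoPixel strip(PIXEL_COUNT, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.show();  // initialize all pixels to off
}

void loop() {
  // Chase a single dim red pixel along the chain to conserve battery.
  for (int i = 0; i < PIXEL_COUNT; i++) {
    strip.clear();
    strip.setPixelColor(i, strip.Color(50, 0, 0));
    strip.show();
    delay(150);
  }
}
```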

To better visualize just how small we are talking, consider this image of an earlier version of the Adafruit Gemma:

[Image: an earlier version of the Adafruit Gemma, shown to scale]

“The ATtiny85 is a great processor because despite being so small, it has 8K of flash and 5 I/O pins, including analog inputs and PWM ‘analog’ outputs. It was designed with a USB bootloader so you can plug it into any computer and reprogram it over a USB port (it uses 2 of the 5 I/O pins, leaving you with 3),” Arduino noted in its announcement.
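To make those pin constraints concrete, a blink sketch is all it takes to smoke-test a board like this. On the Adafruit Gemma the onboard LED sits on digital pin 1, and it’s reasonable to assume the Arduino version follows suit, though that mapping is an assumption here.

```cpp
// Minimal smoke test for a Gemma-class ATtiny85 board.
// Pin 1 drives the onboard LED on the Adafruit Gemma (assumed to carry over).
const int LED_PIN = 1;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);  // LED on
  delay(500);
  digitalWrite(LED_PIN, LOW);   // LED off
  delay(500);
}
```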

In addition to the ATtiny85 MCU, other key hardware specs include:

  • Operating Voltage: 3.3V
  • Input Voltage (recommended): 4-16V via battery port
  • Input Voltage (limits): 3-18V
  • Digital I/O Pins: 3
  • PWM Channels: 2
  • Analog Input Channels: 1
  • DC Current per I/O Pin: 40 mA
  • DC Current for 3.3V Pin: 150 mA
  • Flash Memory: 8 KB (ATtiny85) of which 2.5 KB used by bootloader
  • SRAM: 0.5 KB (ATtiny85)
  • EEPROM: 0.5 KB (ATtiny85)
  • Clock Speed: 8 MHz
  • MicroUSB for USB Bootloader
  • JST 2-PH for external battery

For those seeking to use an Arduino Gemma in their next DIY wearable project, the board will be available for purchase on the Arduino Store and Adafruit Industries beginning late Fall 2014.

Atmel strengthens its IoT leadership

Atmel today announced a definitive agreement to acquire Newport Media, Inc., a leading provider of high performance, low power Wi-Fi and Bluetooth solutions. The acquisition will enable Atmel to offer designers and Makers the industry’s most complete wireless portfolio of smart, connected devices for the Internet of Things (IoT).

“This acquisition immediately adds 802.11n Wi-Fi and Bluetooth to our offerings and will accelerate our introduction of low-energy Bluetooth products,” explains Atmel CEO Steve Laub. “Combined with our existing Wi-Fi and ZigBee solutions and industry leading microcontroller portfolio, Atmel is positioned for substantial growth in the Internet of Things marketplace.”


Expanding Atmel’s already broad SmartConnect™ wireless portfolio, NMI’s 802.11n Wi-Fi and Bluetooth certified products offer innovative, highly integrated solutions that will accelerate seamless communication and connectivity for the Internet of Things. NMI’s products combined with Atmel’s ultra-low power microcontrollers (MCUs) are designed for a broad spectrum of applications including industrial, home and building automation, and consumer products requiring smaller form factors and longer battery life.

Analysts at IDC recently confirmed the arrival of a connected future as the worldwide market for IoT solutions is expected to increase from $1.9 trillion in 2013 to a staggering $7.1 trillion in 2020.

As we’ve previously discussed on Bits & Pieces, Atmel is well-positioned to benefit from the rapidly evolving Internet of Things. According to Oppenheimer & Co. analyst Andrew Uerkwitz, Atmel is one of a handful of companies that makes MCUs that will increasingly be in demand, with today’s announcement further bolstering its leadership position in the IoT market.

Interested in learning more about the IoT? You’ll want to check out our extensive Bits & Pieces IoT article archive here.

A closer look at Atmel’s smart energy platform (Part 2)

In part one of this series, Bits & Pieces introduced Atmel’s recently launched SAM4C series of products, with a spotlight on the SAM4C16 and SAM4C8. Designed for smart energy applications, these system-on-chip solutions are built around two high performance 32-bit ARM Cortex-M4 RISC processors. The devices operate at a maximum speed of 100 MHz and feature up to 2MB of embedded Flash, 304KB of SRAM and on-chip cache for each core.


The dual ARM Cortex-M4 architecture facilitates the integration of various layers, including application, communications and metrology functions in a single device. It also offers options for integrated software metrology or external hardware metrology AFE (analog front end), as well as an integrated or an external power-line carrier (PLC) physical layer solution. Essentially, this is a modular approach that is sure to meet various design needs.

In part two of this series, we’ll be taking a closer look at the software and hardware metrology of the SAM4Cx. Specifically, Atmel’s software metrology library provides a comprehensive level of performance, scalability and flexibility which supports the integration of proprietary advanced metrology and signal processing algorithms.

“Atmel’s standard library enables residential, commercial, and industrial meter design up to class 0.2 accuracy, dynamic range of 3000:1, and are compliant with IEC 62052-11, 62053-22/23, ANSI C12.1, C12.20 and MID,” an Atmel engineering rep told Bits & Pieces.


“Meanwhile, software metrology front-end electronics is comprised of ATSENSE-301 and ATSENSE-101 multi-channel (up to 7) simultaneously-sampled Sigma-Delta A/D converters at 16sps, high precision voltage reference with up to 10 ppm/°C temperature stability, programmable current signal amplification, temperature sensor and SPI interface.”

Additional SAM4Cx features include:

  • Poly-phase energy metering analog front end for Atmel’s MCUs and Metrology library.
  • Compliant with Class 0.2 standards (ANSI C12.20-2002 and IEC 62053-22).
  • Up to 7 Sigma Delta ADC measurement channels: 3 Voltages, 4 Currents, 102 dB Dynamic Range.
  • Current Channels with Pre-Gain (x1, x2, x4, x8).
  • Supports shunt, current transformer and Rogowski coils.
  • 3.0V to 3.6V operation, Ultra Low Power: < 22 mW typ (device fully active @ 3.3V).
  • Precision voltage reference.
  • Temperature drift: 50ppm typ (ATSENSE-301) and 10ppm typ (ATSENSE-301H).
  • Factory measured temperature drift and die temperature sensor to perform software correction.
  • 8 MHz Serial Peripheral Interface (SPI) compatible mode 1 (8-bit) for ADC data and AFE controls.
  • Interrupt Output Line signaling ADCs’ end of conversion, under-run and over-run.
  • Package: 32-lead TQFP, 7 x 7 x 1.4 mm.


In terms of hardware metrology (AFE), Atmel offers out-of-the-box solutions for basic metering that support up to class 0.2 accuracy, exceed IEC and ANSI standards, and offer best-in-class temperature drift.

Additional specs include:

  • A dynamic range up to 6000:1
  • Optimizes performance
  • Reduces OEM’s cost of manufacturing
  • Great fit with SAM4L
  • picoPower Technology
  • Active mode @ 90μA/MHz
  • Full RAM retention @1.5μA
  • SleepWalking
  • 4×40 Segment LCD Controller
  • Hardware Crypto block

Interested in learning more about Atmel’s new comprehensive smart energy platform? Be sure to check out our official product page here, part one of our deep dive here and part three here.

Designing an open source baby monitor

Earlier this year, a team of researchers from FabLab Pisa and the University of Pisa’s Center for Bioengineering and Robotics kicked off an exciting new project known as OS4BME, or Open Source for Biomedical Engineering.

The project’s goal? Introducing the medical device world to a DIY & Makers philosophy. Indeed, OS4BME wants to help facilitate the development of simple, low-cost and high-impact biomedical devices such as neonatal baby monitors.

The course took place at Kenyatta University (Nairobi) and involved a number of staggered tracks, including configuring a 3D printing system, developing a neonatal monitoring device, using open source tools and designing solar-powered electronics based on the Atmel-powered Arduino platform.

In July, Arduino announced its official support for the project, sending the research team a number of UNO boards (ATmega328), along with Wi-Fi and GSM shields used during the course. The components were subsequently donated to the Kenyatta University and Fablab Nairobi.

Arti Ahluwalia (Professor of Bioengineering), Daniele Mazzei and Carmelo De Maria (Biomedical Engineers, co-founders of FabLab Pisa and researchers at the Center) have since returned to Italy where they were recently interviewed by Arduino’s Zoe Romano.

“We decided to use open source tools to design and prototype the baby monitor because we believe economic barriers can’t stop the creative process. Our results will be the starting point for future projects, following the open source philosophy,” the FabLab Pisa team told Romano.

“[Our] baby monitor [was] composed of a 3D-printed mechanical frame, an electronic board and control software. Thus, in order, we used FreeCAD for mechanical design, MeshLab to analyze the quality of the mesh, Slic3r to generate the machine code, and Pronterface to send commands to a Prusa Mendel RepRap. The brain of the baby monitor, electronics and software, is based on Arduino.”
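The team’s actual firmware isn’t published in this post, but the Arduino “brain” reduces to a sense-and-alarm loop. Here’s an illustrative sketch of that idea for an UNO, assuming an LM35 temperature sensor on A0 and a piezo buzzer on pin 8; the thresholds are placeholders, not clinical values.

```cpp
// Illustrative neonatal-monitor loop (not the OS4BME team's actual firmware).
// Assumed wiring: LM35 temperature sensor on A0, piezo buzzer on pin 8.

const int TEMP_PIN = A0;
const int BUZZER_PIN = 8;
const float LOW_C = 36.0;   // placeholder alarm thresholds;
const float HIGH_C = 37.5;  // not clinically validated

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // The LM35 outputs 10 mV/°C; convert the 10-bit ADC reading to Celsius.
  float celsius = analogRead(TEMP_PIN) * (5.0 / 1023.0) * 100.0;
  Serial.println(celsius);

  if (celsius < LOW_C || celsius > HIGH_C) {
    tone(BUZZER_PIN, 2000, 500);  // audible alert on out-of-range reading
  }
  delay(1000);
}
```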

According to FabLab Pisa, the project was an “immediate” success, even if most students and staff were initially unaware of the existence of tools such as Arduino, FreeCAD, Slic3r and MediaWiki.

“The course was instrumental in bringing this knowledge to the participants, and their keen interest throughout the introductory part, particularly on 3D printing and rapid prototyping was apparent,” the FabLab team added. “[Currently], the University of Pisa is working with the ABEC and Boston University to raise funds for further courses and student and staff exchange.”

1:1 interview with Geoffrey Barrows of ArduEye

ArduEye is a project by Centeye, Inc. to develop open source hardware for a smart machine vision sensor. All software and hardware (chips and PCBs) for this project were developed either from pre-existing open source designs or from Centeye’s own 100% IR&D efforts. In the interview below, Atmel discusses the above-mentioned technology with Maker Geoffrey Barrows, founder of ArduEye and CEO of Centeye.

Tom Vu: What can you do with ArduEye?

Geoffrey Barrows:  Here are some things people have actually done with an ArduEye, powered by just the ATmega328 type processor used in basic Arduinos:

  • Eye tracking: A group of students at BCIT made an eye-tracking device for people paralyzed with ALS (“Lou Gehrig’s disease”) that allows them to operate a computer using their eyes.
  • Internet-connected traffic counter: I aimed an ArduEye out at the street in front of my house and programmed it to count the number of cars driving northbound. Every 5 minutes, it would upload the count to Xively, allowing the whole world to see the change in traffic levels throughout the day.
  • Camera trigger: One company used an ArduEye to make a camera trigger at the base of a water slide at a water park. When someone riding the slide reached the bottom, the camera took a picture of the rider and then sent it to him or her via SMS!
  • Control a robotic bee: My colleagues at Harvard University working on the “RoboBee” project mounted one of our camera chips on their 2cm robotic bee platform. The chip was connected to an Arduino Mega (obviously not on the bee), which ran a program to compute visually, using optical flow, how high the bee climbed. A controller could then cause the bee to climb to a desired height and hold a position. This was a very cool demonstration.
  • Control a drone: My colleagues at the U. Penn GRASP Lab (who produced the famous swarming quadcopter video) used two ArduEyes to control one of their nano quadcopters to hover in place using vision.
  • FIRST robotics: The New Jersey-based “Landroids” FIRST robotics team uses ArduEyes on their robots to do things like detect objects and other robots.

These are just some examples. You can also do things like count people walking through a doorway, make a line-following robot, detect bright lights in a room, and so forth. I could spend hours dreaming up uses for an ArduEye. Of course, an ArduEye doesn’t do any of those things “out of the box”; you have to program it.
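To give a flavor of what that programming looks like, here is a deliberately simplified sketch of the bright-light-detection idea. Centeye’s ArduEye libraries define the real acquisition API; the readPixel() helper below is purely hypothetical, standing in for whatever routine fetches a pixel from the sensor.

```cpp
// Simplified bright-light detector (illustrative only).
// readPixel(row, col) is a hypothetical stand-in for the ArduEye
// libraries' actual pixel acquisition routine.

const int ROWS = 16;
const int COLS = 16;

int readPixel(int row, int col) {
  // Placeholder so the sketch compiles; swap in the real sensor call.
  return analogRead(A0);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  int bestVal = -1, bestRow = 0, bestCol = 0;

  // Scan the array and remember the brightest pixel.
  for (int r = 0; r < ROWS; r++) {
    for (int c = 0; c < COLS; c++) {
      int v = readPixel(r, c);
      if (v > bestVal) { bestVal = v; bestRow = r; bestCol = c; }
    }
  }

  Serial.print("brightest at ");
  Serial.print(bestRow);
  Serial.print(",");
  Serial.print(bestCol);
  Serial.print(" = ");
  Serial.println(bestVal);
  delay(200);
}
```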

TV:  Explain the methodology and the approach? What is your general rule of thumb when it comes to resolution and design margins?

GB:  My design philosophy is a combination of what I call “vertical integration” and “brutal minimalism”. To understand “vertical integration,” imagine a robot using a camera to perform vision-based control. Typically, one company designs the camera lens, another designs the camera body and electronics, and another company designs the camera chip itself. Then you have a “software guy/gal” write image processing algorithms, and another person implement the control algorithms that control the robot. Each of these specialties is performed by a different group of people, with each group having their own sense of what constitutes “quality.” The camera chip people generally have little experience with image processing and vice versa. The result is a system that may work, but is cumbersome and cobbled together.

Our approach is instead to consider these different layers together and in a holistic fashion. At Centeye, the same groups of minds design both the camera hardware (the camera body as well as the camera chips themselves) and the image processing software. In some cases we even design the lens. What this means is that we can control the interface between the different components, rather than being constrained by an industrial standard. We can identify the most important features and optimize them. Most important, we can then identify the unnecessary features and eliminate them.

This latter practice, that of eliminating what is unnecessary, is “brutal minimalism”. This is, in my opinion, what has allowed us to make such tiny image sensors. And the first thing to eliminate is pixels! It is true that if you want to take a beautiful photograph for display, you will need megapixels worth of resolution (and a good lens). But to do many other tasks, you need far fewer than that. Consider insects: they live their whole lives using eyes that have a resolution between about 700 pixels (for a fruit fly) and maybe 30k pixels (for a dragonfly). This is an existence proof that you don’t always need a million pixels, or even a thousand pixels, to do something interesting.

TV:  What are some of the interesting projects you have worked on when involving sensors, vision chips, or robotics?

GB:  The US Air Force and DARPA have over the years sponsored a number of fascinating programs bringing together biologists and engineers to crack the code of how to make a small, flying robot. These projects were all interesting because they provided me, the engineer, with the chance to observe how Nature has solved these problems. I got to interact with an international group of neuroscientists and biologists making real progress “reverse engineering” the vision systems of flies and bees. Then later on I got to implement some of these ideas in actual flying robots.

This gave me insights into vision and robotics that are often contradictory to what is generally pursued in university research efforts: the way a fly perceives the world and controls itself is completely different from how most flying “drones” do the same. Flying insects don’t reconstruct Cartesian models of the world, and certainly don’t use Kalman filters!

I also participated in the DARPA “Nano Air Vehicle” effort, where I got to put some of these bio-inspired principles to practice. As part of that project, we built a set of camera chips to make insect-inspired “eyes”, and then hacked a small toy helicopter to do things like hold a position visually, avoid obstacles, and so forth, with a vision system weighing just a few grams. What very few people know is that some of the algorithms we used could be traced back to the insights obtained directly by those biologists studying flying insects.


Right now we are also participating in the NSF-funded Harvard University “RoboBee” project, whose goal is to build a 2cm scale flying robotic insect.  Centeye’s part, of course, is to provide the eyes.  My weight budget will be about 20 milligrams. So far we are down to about 50 milligrams, with off-board processing, so we have a way to go.

TV: You mentioned insects. Do you draw inspiration from biology in your designs?

GB: Another aspect of our work, especially our own work with flying drones, is to take inspiration from biology. This includes the arrangement of pixels within an eye, as well as the type of image processing to perform and even how to integrate all this within a flight control system.

There is a lot we can learn from how nature has solved some tough problems. And we can gain a lot by copying these principles. However, in my experience it is best to understand the principles behind why nature’s particular solution to a problem works and innovate with that knowledge, rather than to slavishly copy a design you see in nature. Consider a modern airliner and a bird in flight. They do look similar: wings keep them aloft using Bernoulli forces, a tail provides stability by keeping the center of drag behind the center of gravity, and they modify their flight path by changing the shape of their wings and tail. However, an airliner is made from metal alloys, not feathers!


I like to invoke the 80/20 principle here: if you make a list of all the features of a design from nature, probably 80% of the benefit will come from 20% of the features, or even fewer. So focus on finding the most important features, and implement those.

TV:  What are the technology devices, components, and connectivity underneath?

GB:  For almost all of our vision sensor prototypes, including ArduEyes, there are four essential components: a lens, which focuses light from the environment onto the image sensor chip; the image sensor chip itself; a processor board; and an algorithm running on the processor. You can substantially change the nature of a vision sensor by altering just one of these components. We usually use off-the-shelf lenses, but we have made our own in the past. We always use our own image sensor chip. For the processor we have used everything from an 8-bit microcontroller to an advanced DSP chip. And finally, we generally use our own algorithms, though we have tinkered with open source libraries like OpenCV.

It can take a bit of a mentality shift to be able to design across all these different layers. Most of the tools and platforms out there do not allow this type of flexibility. However, with a little bit of practice it can be quite powerful. Obviously, the greatest amount of flexibility comes from modifying the vision algorithms.

TV:  Does nature have a smart embedded designer? If so, what would Nature’s tagline or teaser be for its creations? What’s the methodology or shape, if you can sum it up in a few words?

GB: Perhaps one lesson from Nature’s “embedded designer” would be “Not too much, not too little.” To understand this, consider evolution: If you are a living creature, then your parents lived long enough to reproduce and pass their genes to you. This is true of your parents, grandparents, and so on. Every single one of your ancestors, going all the way back to the origins of life on Earth, lived long enough to reproduce, and your genetic makeup is a product of that perfect 100% success rate. It is mind blowing to think about it.


Now, for a creature to live long enough to reproduce, it has to have enough of the right features to survive. But it must also not have too many features, and it must also not have the wrong features. Most animals get barely enough energy (e.g. food) to survive. If a particular animal has too many “unnecessary features,” then it will need more food to survive and thus is less likely to pass its genes on.

Another lesson would be that a design’s value is measured relative to the application. Each animal species evolved for a particular role in a particular environment; this is why penguins are different from flamingos, and why fruit flies are different from eagles. Applied to human-engineered devices, this means that any “specification” or figure of merit considered in a vacuum is meaningless. You have to consider the application, or the environment, first before deciding on specifications. This is why choosing a camera based only on the number of “megapixels” it has is dangerous.

TV:  What is your rule of thumb when it comes to prototyping, testing, improving, and then rolling out a fuller design?

GB:  I’m going to be more philosophical here. Rule #1: A crappy implementation of the right thing is superior, both technically and morally, to a brilliant implementation of the wrong thing. Wrong is wrong, no matter how well done. Rule #2: A crappy implementation of the wrong thing is superior to a brilliant implementation of the wrong thing. Doing the wrong thing brilliantly generally consumes more resources than doing it crappily, plus the fact that you invested more into it makes you less likely to abandon it once you realize it is wrong.

Of course, the ideal is to do a brilliant implementation of the right thing. However when you are prototyping a new device, or trying to bring a new technology to market, it is very difficult to know what are the right and the wrong things to do. So the first thing you must do is to not worry about being crappy, and instead focus on identifying the right thing to do. Quickly, one ought to prototype a device, experiment with it, get it in the hands of customers if it is a product, get feedback, and make improvements. Repeat this cycle until you know you are doing the right thing. And only then put in the effort to do a brilliant implementation.

Those who are familiar with the “Lean Startup” business development model will recognize the above philosophy. I am a big fan of Lean Startup. I would give away everything I own if I could send a few relevant books on the topic back in time to my younger self 15 years ago, with a sticky note saying “READ ME YOU FOOL!”

Now of course we have to take the word “crappy” with a grain of salt. I don’t mean to produce and deliver rubbish. That helps no one. Instead, what I mean is that the first implementations you put out there are “brutally minimalist” and include the bare essence of what you are trying to produce. It may be minimal, but it still has to deliver something of real value. This is often called a “minimum viable product” in the Lean Startup community.

The same applies to when we are conducting research to develop a new type of technology. The prototypes are ugly, and often use code that makes spaghetti look like orderly Roman columns. But their purpose is to quickly test out and refine an idea before making it “pretty”.

TV:  What is the significance of the ATmega328 in your embedded design?

GB:  We chose the ATmega328 because this is the standard processor for basic Arduino designs. We wanted to maintain the Arduino experience as faithfully as possible to keep the product easy to hack.

TV:  How important is it for you to rapidly build, test, and develop the evolution of your product from Arduino?

GB:  Funny you should ask. We use Arduinos and ArduEyes all the time to prototype new devices or even perform basic experiments. When I get a new chip back from the foundry, the first thing I do is hook it up to an Arduino. I can verify basic functionality in just a few hours, sometimes even in ten minutes.

TV:  What is the difference between Centeye and ArduEye? Technology differentiators?

GB:  ArduEye is essentially a project that was developed by Centeye and supported by Centeye. The main differentiator is that ArduEye was developed in isolation from our other projects, in particular the ones associated with Defense. We essentially developed a separate set of hardware, including chips, and software, and did so at no small expense. This is partially why it took so long for this project to become reality.

TV:  How do you see ArduEye and vision chips in the future for many smart connected things?

GB:  I think the best uses for adding vision to an IoT application will come not from me, but from tinkerers, hackers, and other entrepreneurs that have identified a particular problem or pain that can be solved using our sensors as a platform. But in order for them to innovate, vision must be tamed to the level that these users can quickly iterate through different possibilities. I see ArduEye as a good platform to make it happen, to let such innovation occur in a frictionless manner.

TV:  What are some of the IoT implications of using brilliant sensor eye devices in products?

GB:  At one level there is a rich amount of information you can obtain with vision. Think about it: you can drive a car with only visual information. However, vision has a tendency to generate a LOT of data. This is true even for a very modest image sensor of several thousand pixels. And it is true that bandwidth is getting cheaper, but I don’t think the Siri model of pushing all the data to “the cloud” for processing is a viable one. You will have to find ways to process vision information up at the sensor, or at some nearby node, before that information can be sent up to the cloud.

TV:  How can sensors like ArduEye be compounded with richer use-cases especially when integrating the Big Data and Cloud initiatives of modern trending IT innovations?

GB:  Over the next decade we will see newly minted billionaires who have figured this out.

TV:  How can ArduEye evolve? As a visionary, where do you see ArduEye being integrated to accelerate efficiency?

GB:  Good question! Well, first of all, this will depend on how others are using ArduEye and the feedback I get from them. For ArduEye to be successful, it has to be valuable to other people. So I would really like to hear feedback from anyone who uses these products, so that we can make them better. I’ve been willing to speak with anyone who uses these products. Tell me: do you know any other image sensor companies that allow you to speak with the people who design the chips? That said, some obvious improvements would be to incorporate more advanced Arduinos, such as the Due that uses an ARM processor.

TV:  Are there security or privacy concerns for this technology to evolve? What are the caveats for designers and business makers?

GB:  Security and privacy will be a big issue for the Internet of Things, and will lead to many entrepreneurial opportunities. However, this is not our focus. But if you think about it, a benefit to using ArduEyes to monitor a room instead of a full resolution camera is that you won’t be able to recognize people’s faces! You can say, half jokingly, that privacy is built in!

TV:  How are vision chips and open source ArduEye helping people live better or smarter lives? Where do you see this going in 5-10 years?

GB:  The ArduEye is a fairly new project and is one that takes an uncommon, though technically sound approach to machine vision. So right now all of the use cases are experimental. This is very often the case for a new emerging technology. It will take time for the best applications to be found. But I expect that as our community of users grows, and as we learn to better service this community, we could see a diverse set of applications. Right now I can only speculate.

TV:  Where do you see sensors, vision and the like playing a more pivotal role in the grander Internet of Things, Internet of Everything, and Industrial Internet?

GB:  In order for the Internet of Things to reach its full potential, it will need sensors to acquire all the information that is needed. Already the number of devices connected to the Internet is in the billions. It will only be a matter of time before this reaches the trillions. And we all know that vision is a powerful sensory modality. Some of the vision sensors will be higher resolution imagers of the type you see in cameras. However, in the same way that there are many more insects than large mammals on planet Earth, it makes sense that there is room for many more cameras of ArduEye capability than for full image sensors. This is where I see Centeye playing in the future. More than that, this is why I originally founded Centeye in 2000: the company name was meant to be a triple pun, with the prefix “cent-” meaning many, tiny, and inexpensive. Many eyes, tiny eyes, cheap eyes. I was just too soon in 2000…


“Major surge” expected for RF remote control devices

Analysts at ABI Research say there will be a “major surge” in RF technology adoption for remote control devices in the consumer market over the next five years.


According to ABI Research practice director Peter Cooney, implementation of RF tech is becoming “more simplified” as lower power consumption is achieved.

“Over the last five years there has been an upswing in technology development and a rise in the need to make home consumer devices smart that has led to a resurgence in using RF,” Cooney explained.

“Initially, proprietary RF technology was used but equipment vendors have been quick to understand the benefits of using a standardized RF technology in remote control design.”

The analyst also confirmed that the remote control market represented a “massive growth” opportunity for wireless connectivity technology vendors.

“Over 3.2 billion remote controls will be shipped from 2013 to 2018 for flat panel TVs, set-top boxes, DVD/Blu-ray devices and game consoles alone,” he added.

And that is why Atmel is helping the stalwart remote control evolve beyond the confines of infrared. As Director of Atmel Wireless Solutions Magnus Pedersen notes, the humble TV remote control has provided us with a convenient, yet unengaging, means of controlling our televisions and AV equipment for decades.

“Bringing ultimate entertainment control to the holder, the infrared remote has changed little in that time despite the controlled technology making immense leaps,” he said.

“[However], as the use of RF-based controls gathers momentum, developers have a number of key design considerations to factor into their approach and how to select the components needed.”

Interested in learning more? Be sure to check out Atmel’s extensive wireless/RF portfolio here.
