Tag Archives: AVR

Taking the IoT to the next level

Over three-quarters of companies are now actively exploring or using the Internet of Things (IoT), with the vast majority of business leaders believing it will have a meaningful impact on how their companies conduct business. Clearly, the IoT is reaching a tipping point.


Although the concept of an Internet of Things has been around for at least a decade, the IoT is beginning to become an important action point for the global business community. As Clint Witchalls notes in a recent report sponsored by ARM, there is no doubt that IoT-related technology is already having a broad impact across the world. Although the precise effect is likely to vary by country and by company, it is hard to imagine any sector will be left untouched by the rapidly evolving Internet of Things.

Kevin Ashton, who coined the term "Internet of Things" (IoT) in 1999 while working at Procter & Gamble, points out that the recent "trickle" of IoT product releases is all part of a larger plan to test market appetite.

“We are trying to understand before we get in too deep, because once you are financially invested and committed you cease to become agile. Then you really have to start building on the thing you’ve already invested in,” Ashton explains. “In the early stages of technology deployment it’s a charitable act really to explore a new technology because the return on investment isn’t there, it’s too expensive and it’s too unknown. That’s where government has a role.”

Looking ahead, investment in the IoT should continue to increase as more and more senior executives move up the IoT learning curve. According to Witchalls, the costs associated with the IoT will continue to fall concurrently – just like any nascent technology. Indeed, a number of early adopters believe that the technology is already mature enough and cheap enough to make IoT products and services viable without the need for a big upfront investment, at least for initial trials.

“You don’t need a lot of R&D, it’s more about integration,” says Honbo Zhou, a director of China’s Haier. “Everyone can build it [into their products]. It’s just a matter of finding a business model that works.”

Meanwhile, Elgar Fleisch, the deputy dean of ETH Zürich, a science and technology university, says he believes IoT adoption will be quite different from what he dubs the “Internet of people revolution.”

During the first phase of the Internet, he maintains, anyone with a good idea and a computer could start an organization with global reach. However, Fleisch sees the initial advantage in the "IoT revolution" going mainly to bricks-and-mortar organizations, especially large firms with many assets to track and monitor. In other words, we are unlikely to see another Facebook, Yahoo or eBay.

“There will be winners and losers, but we are unlikely to see entirely new big players entering the market,” Fleisch opines.

Notwithstanding the significant involvement of the physical world of assets and products, the IoT is still expected to be a less visible revolution than the traditional Internet.

“PayPal, Groupon and YouTube are well-known Internet companies, yet few people are probably aware that the smart meter in their cellar means that their home is a part of the IoT,” writes Witchalls. “As organizations move towards the ‘productization’ of the IoT, there are signs that business leaders recognize that this need not be a major hindrance: undeveloped consumer awareness is not seen as one of the top obstacles to organizations using the IoT. After all, consumers will always want products and services that are better, cheaper, greener and more convenient.”

As Ashton notes, “Consumers are not going to demand the Internet of Things. Nobody is going to demand the underlying infrastructure.”

Rather, says Ashton, consumers will demand some value and benefit.

“They’re going to demand a security system that they can control from their smartphone. You don’t go to the end user and talk about the Internet of Things. You go to the end user to talk about benefits,” he adds.

Want to learn more about how the IoT revolution is gathering pace and reaching a tipping point? Part one is available here, part two here, part three here and part four here.

Automotive circuit design headaches

I wrote an article for Electronic Design magazine about Bob Pease and his solenoid driver circuit. Former National Semiconductor employee Myles H. Kitchen was nice enough to drop me an encouraging note.

“Thanks for your great article on Bob Pease and the solenoid drivers. Having worked with Bob in the late 1970s and early 1980s at National Semiconductor, I came to appreciate his wisdom and simplicity for addressing issues that seemed simple, but were really quite involved. As someone who’s worked on automotive electronics my entire career, an issue such as a solenoid driver is critical. I recall when testing early automotive product designs at one company, we would put the module under test in a car, and then turn on the 4-way flashers to see if operation was affected, or if it stopped working completely. The combination of multiple inductive and high-current resistive loads operating on and off at several hertz would play havoc with the power supply, and immediately point out design deficiencies in module power supplies, regulation, protection, and noise immunity…. some of which could be traced to poor relay or solenoid driver circuits.  Surviving the 4-way flasher test was only a quick way to see how robust the new design might be, but it was a quick indicator if we had things right up to that point. I miss Bob and his ramblings in ED, but hope to see more of your work in the future.  Loved it.”

Well, as an automotive engineer who worked at both GM and Ford before moving out to Silicon Valley, I found that Myles's note sparked a flood of memories. His four-way flasher story was prophetic. When I was in college at GMI (General Motors Institute), one of my pals worked at Delco. They were just coming out with the integrated electronic voltage regulator in the back of the alternator, circa 1973. So all the executives were standing around at a demo, and after they oohed and aahed and congratulated themselves, my buddy got in the car and, knowing what Myles knows, cycled the air conditioning switch a few times. The "Charge" light promptly came on.


I asked my fellow student if he was in trouble or if they hated him for causing the failure, and to GM’s credit, he told me “No, they were actually glad I found it before it went into production.” It must have been some serious egg on some faces, though. After that, survival after repeated AC clutch cycling became part of the spec for the voltage regulator. I bet four-way flashers are included as well.

I later worked on anti-lock brakes for GMC heavy-duty trucks. This was way before anti-lock brakes on cars, about 1975. We dutifully shielded all the wires to the sensors with expensive braided cable. When we pulled the truck out on the road, the brakes started modulating with the truck just sitting there. We realized that the entire 24V power system was a pretty nice antenna, and that noise can get into a module from the power side as easily as from the sensors. We begged the government to give us more time, and they did. Indeed, I didn't know if they ever put anti-lock brakes on heavy trucks. Let me check: yeah, wow, it's still called MVSS 121 (motor vehicle safety standard), and it finally went into effect in 1997. That was at least a 20-year delay in getting it working.

I told Bob Reay over at Linear Tech that automotive design was the toughest, because you have military temperature and vibration requirements but consumer cost constraints. He added another factor: chips for automotive have to yield well, since you need to ship millions. What a crazy challenge.

When I thanked Myles Kitchen for his kind words and told him the above stories, he responded with a great story about load dump. The phenomenon called load dump is usually caused by a mechanic who is troubleshooting the battery and charging system of a car. You get the car running, rev it up a bit, and yank off the battery cable. If the car keeps running, that means the alternator and regulator are OK; it is just a bad battery. Thing is, the alternator is often putting full output into this bad battery. And when you yank the cable off the battery, the voltage regulator controlling the alternator cannot react instantly. So there is a huge overvoltage spike as all the stored energy in the alternator's magnetic field has to dissipate into whatever loads are still connected, like your radio. A load dump can put over 100 volts on the electrical system. And it is not a fast spike; it can last for hundreds of milliseconds. Smart mechanics just leave the battery cable on and hook up a voltmeter to see if the alternator is putting 13.75 to 14.2 volts on the battery. So Myles recounts:

“Thanks for your email.  Yes, sounds like we’ve run up against many of the common automotive issues in our time.  I’ll add one brief anecdote here.  When I worked at Motorola’s automotive division, I certainly learned all about what a load dump is, but I’d never really heard of anyone experiencing one first-hand and what it could do.  One day, our admin complained that her 70’s vintage Plymouth Duster wasn’t running right, and that her headlamps and radio quit working.  She had been driving it the night before when something went wrong.  We brought it into the garage at Motorola, and found that she had a very discharged battery with very loose battery connections. You could just lift them off with your hand.  As a result, her battery was discharged, and when she hit a Chicago pothole it all went bad.  The resulting load dump had blown out every light bulb filament in the car, along with the radio.  Only the alternator/regulator had survived.  The ignition was still a points and condenser system, or that would have probably died as well.  A new battery, tight connections, and a bunch of replacement bulbs got her back on the road again.  And, I’ve never doubted the need for a load-dump-tolerant design since!”

Those are wise words from someone who has been there and seen it first-hand. And I wonder if the voltage regulator in that old Duster was a mechanical points type. In the early days, we automotive engineers would try to protect each individual component from load dump. The radio would have a Zener diode clamp, and so would the cruise control module. Then manufacturers put a big Zener clamp right in the voltage regulator to clamp the voltage on the whole car. Maybe the whole car was too low an impedance to clamp, because now I see there are a lot of smaller distributed TVS (transient voltage suppressor) clamps that you use to protect the circuitry of your module.
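As a sanity check on that "over 100 volts" figure, a one-line model gets you surprisingly close: during the dump, the alternator behaves roughly like a current source, so whatever loads remain see about I × R until the field energy decays. The numbers in the sketch below are illustrative assumptions, not measurements from any particular vehicle.

```c
/* Back-of-envelope load dump estimate. During the dump the alternator
 * acts roughly as a current source, so the surviving loads see about
 * V = I_alt * R_load until the field energy decays. Parameter values
 * passed in are assumed, illustrative figures only. */
static double dump_peak_volts(double alt_current_amps, double load_ohms)
{
    return alt_current_amps * load_ohms;
}
```

A 60 A alternator dumping into a remaining 2 Ω of loads tries to push 120 V, so the 100-volt-plus spikes described above are easy to believe.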

There are two other approaches. One, you can just disconnect your circuit with a high-voltage FET when the load dump happens:

[Figure: overvoltage cut-out circuit]

I used this circuit to keep automotive overvoltage from destroying an LT1513 chip I used as a battery charger. When the DC bus voltage exceeds the 24V Zener voltage plus the base-emitter drop of Q10, it turns Q10 on, which turns Q12 off and protects downstream circuitry from overvoltage.

Two, you can put a high-voltage regulator in front of your circuit that will maintain power to your circuit through the load dump, at the risk that the pass transistor will overheat, since it drops a lot of voltage while passing current during the load dump. Linear Tech makes such a part.
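Going back to the cut-out circuit, its trip point is simply the Zener voltage plus Q10's base-emitter drop, which is easy to express as a sketch. The 0.6 V Vbe used in the usage note is a typical silicon-transistor assumption, not a measured value from the actual circuit.

```c
/* Trip threshold of the overvoltage cut-out: the bus must exceed the
 * Zener voltage plus Q10's base-emitter drop before Q10 turns on and
 * Q12 disconnects the downstream load. */
static double trip_voltage(double v_zener, double v_be)
{
    return v_zener + v_be;
}

/* Returns 1 if the bus voltage would trip the cut-out (load disconnected). */
static int cutout_tripped(double v_bus, double v_zener, double v_be)
{
    return v_bus > trip_voltage(v_zener, v_be) ? 1 : 0;
}
```

With the 24 V Zener and an assumed 0.6 V Vbe, a 100 V load dump trips the cut-out immediately, while a normal 14 V charging bus leaves the load connected.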

There is one more tip for every engineer regarding automotive electronics. Remember that there are laws that make auto manufacturers offer service parts for 10 or 15 years. So no matter what your application, you might consider using an automotive part like Atmel's line of MCUs, memory, CAN/LIN bus, and RF remote controls. We state that we will be making many of these parts for over a decade. If you design them into your industrial, scientific or medical (ISM) application, you can have some assurance you can still get the part for years, or at least a pin-for-pin compatible part. That means no board spins. On top of that assurance, most of the parts have extended temperature range, which might help in your application as well. And since we make the parts for high-volume automotive customers, they are usually priced very reasonably.

Benchmarks for embedded processors

Crack applications engineer Bob Martin was walking by just now and we got to talking about people we both knew from our National Semiconductor days. One name that came up was Markus Levy. Bob told me about EEMBC® — the Embedded Microprocessor Benchmark Consortium.


When I read up on the organization, I was delighted to see that Markus started work on embedded benchmarks while at EDN magazine, where I also worked as an editor for five years. Back in 1996, it was clear that the old Dhrystone MIPS benchmark was not really meaningful for embedded systems. So Markus got a bunch of industry companies together and proposed new benchmarks. They signed up 12 members right off the bat and got funding to establish real-world benchmarks that would be suitable for phones, tablets, routers and other embedded systems. As their about page explains:

“EEMBC benchmarks are built upon objective, clearly defined, application-based criteria. The EEMBC benchmarks reflect real-world applications and have expanded beyond processor benchmarks, also heavily focusing on benchmarks for smartphones/tablets and browsers (including Android platforms) and networking firewall appliances.”

I was glad to see that not only is Atmel a member, but so is ARM, which designed the cores used in Atmel's 32-bit SAM line of microprocessors and microcontrollers. When you look at Atmel's benchmark results, you can see our original 8051 processors get a score of 0.1. An AVR 8-bit MCU like the ATmega644 gets a benchmark score of 0.54. In contrast, our ARM-core SAM3 and SAM4 chips get benchmark scores up to 3.3. When I looked at a competitor's Cortex-M4 offering, I was delighted to see they ranged from 2.0 to 2.8, significantly slower than Atmel's Cortex-M4-based SAM4 chips.

This is congruent with what I hear in the hallways here at Atmel. We didn't just slap some counter-timers on an ARM core and release it. We took the time to do it right, adapting and improving the really cool peripheral system from our XMEGA 8-bit micros. I assume these benchmarks are just for raw speed, but the cool thing about Atmel's peripheral event system is that you can have peripherals interact and do DMA without waking up the CPU core and sucking up a lot of power. Still, it's nice that the benchmark shows us as faster. It might mean you can get some chunk of code to execute faster and then put the micro to sleep, saving power overall. This can be non-intuitive: if the micro's compiler generates more efficient code, you can get way more done with the same power or less. I know this is true for AVR 8- and 32-bit processors. The AVR was invented and crafted by hardware engineers who understood the importance of C and computer science in general. Although the entire AVR line did not spring fully formed from the head of Thor, there were some really crafty Norwegians involved.

While the ARM-core SAM chips run ARM instruction sets, they too are optimized for compiled code. After all, AVR showed the world how to do this in 1996. And with Atmel's peripheral concepts, the SAM chips are really something. Check out the new SAM D20 Cortex-M0+ micro for a nice, inexpensive chip that can do a whole lot on minimal power.

7Electrons: An Atmel-powered eBookmark

Back in July, Jinna Kim of 7Electrons completed preliminary sketches for an eBookmark powered by an Atmel 8-bit AVR microcontroller (MCU), various Adafruit components, LEDs and a resistive touch strip.

The device – dubbed the 7Electrons eBookmark – is now operational and live (in pictures) on the 7Electrons blog.

According to Terry Burton of 7Electrons, the eBookmark is envisioned as a bridge between analog and digital worlds.

“From the beginning, the eBookmark was conceptualized as a statement on books and their current place in the world. Our generation bridges a strange divide where paper exists but is slowly being nudged out,” Burton explained.

“The eBookmark attempts to bridge a similar divide that we ourselves have lived through. Jinna used to re-read pages because she would forget which paragraph and side of the page she left off on.”

The 7Electrons eBookmark, says Burton, allows the reader to save his or her place on the page, and with a switch they can also select the left or right page. The top also extends for use with larger books.

“The process of creating the bookmark involved wood carving, software development in C on an Atmel 8-bit processor, and hand wiring of all the electronics, LEDs and battery with fine wire,” Burton added.

“[Meanwhile], 16 tiny surface mount LEDs shine through the opaque face and the MCU remembers where you left off in your reading… In an era where bookstores are dying and libraries are collecting cobwebs, our piece is a bridge between analog and digital.”

Interested in learning more about the Atmel-powered eBookmark? You can check out additional pictures on the official 7Electrons blog here.

C64 DRAM testing with an ATmega8U2

Josh (aka Axlecrusher) was working on restoring an old Commodore 64. The stalwart computer was missing a few obvious pieces, prompting Josh to make a new AV cable and obtain a new power supply, all while replacing the PLA, VIC and capacitors. However, the C64 still didn’t boot properly, so Josh decided to test the Commodore 64’s DRAM chips.

Photo Credit: Wikipedia

“My particular C64 uses 8 individual RAM chips; most are D4164C-15 and a couple are D4164C-2. I replaced all the RAM, but I wanted to know if both the original and replacement chips were functioning properly. To test each chip, I needed some hardware to test with,” Axlecrusher explained in a blog post.

“I decided to use [Atmel’s] ATmega8U2 AVR microcontroller (MCU), because I have one that can plug into a breadboard and it has enough pins to drive the D4164C chips. I wired the test setup to try to reduce the instruction count where I could, so it is a little messy. The wiring and instruction count could be greatly improved if I didn’t need the programming header, but it gets the job done.”

According to Axlecrusher, the first task was simply trying to write and read just one bit, with the datasheet for the DRAM providing timing charts for each required step. After a few hours of stepping through the charts, coding, re-coding, reviewing the charts, and sometimes just trial and error, Axlecrusher was finally able to write and read one bit from memory.
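Axlecrusher's actual AVR code is linked at the end of this post; purely as an illustration of what those datasheet charts describe, the multiplexed access sequence of a 64K x 1 part looks roughly like this host-side model, with the pin-level strobes reduced to comments and real timing and refresh omitted.

```c
#include <stdint.h>

/* Host-side model of a 64K x 1 DRAM such as the D4164C: 8 row and 8
 * column address bits are multiplexed on the same pins, latched by the
 * /RAS and /CAS strobes. Real tRAS/tCAS timing and refresh are omitted. */
static uint8_t cells[256][256];   /* one bit per cell, held in a byte */

static void dram_write_bit(uint8_t row, uint8_t col, uint8_t bit)
{
    /* On hardware: drive the row on A0-A7, pulse /RAS low, drive the
     * column, set DIN and assert /WE, pulse /CAS low, release strobes. */
    cells[row][col] = bit & 1;
}

static uint8_t dram_read_bit(uint8_t row, uint8_t col)
{
    /* On hardware: row + /RAS, then column + /CAS with /WE high,
     * and sample DOUT while /CAS is still low. */
    return cells[row][col];
}
```

Even in this toy form, the split into a row phase and a column phase mirrors the two timing windows the datasheet charts walk through.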

“After a couple more days of work I was reading and writing to the entire memory module. I constructed a couple of routines to test both the wiring and the memory. To test the wires I wrote 0 to the first bit of the memory followed by a 1 to a power of 2 memory location (high on just 1 wire). I then read memory location zero and if the value is no longer 0, it indicates a failure on a specific address wire,” he continued.

“I used a walking-one algorithm with a bit inversion to test all the memory cells. The goal is to toggle as many bits as possible. In either case, if there’s an error, the red LED turns off forever. While the test is running, the LED blinks at the end of each complete cycle. I was able to test all the memory modules I had replaced. They were all functioning properly.”
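The two tests described above can be sketched as follows. This is not Axlecrusher's code (that is on GitHub); it is a simplified, host-runnable stand-in that simulates the 64K x 1 part as an array, with an all-zeros/all-ones inversion pass standing in for the full walking-one pattern.

```c
#include <stdint.h>

#define MEM_BITS 65536UL          /* 64K x 1 DRAM */
#define ADDR_LINES 16             /* 8 row + 8 column address bits */

static uint8_t mem[MEM_BITS];     /* simulated one-bit cells */

static void write_bit(uint32_t addr, uint8_t v) { mem[addr] = v & 1; }
static uint8_t read_bit(uint32_t addr)          { return mem[addr]; }

/* Address-wire test from the post: write 0 to address 0, then write 1
 * to a power-of-two address (exactly one address wire high). If address
 * 0 no longer reads back 0, that address wire is stuck or shorted. */
static int address_lines_ok(void)
{
    for (int line = 0; line < ADDR_LINES; line++) {
        write_bit(0, 0);
        write_bit(1UL << line, 1);
        if (read_bit(0) != 0)
            return 0;             /* failure on this address wire */
    }
    return 1;
}

/* Cell test with inversion: write every cell one way, verify, then
 * invert and verify again, so every bit gets toggled both directions. */
static int cells_ok(void)
{
    for (int pass = 0; pass < 2; pass++) {
        uint8_t v = (uint8_t)(pass & 1);
        for (uint32_t a = 0; a < MEM_BITS; a++)
            write_bit(a, v);
        for (uint32_t a = 0; a < MEM_BITS; a++)
            if (read_bit(a) != v)
                return 0;
    }
    return 1;
}
```

On a fault-free simulated array both routines return 1; a 0 return corresponds to the red LED turning off on the real test rig.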

Interested in learning more about C64 DRAM testing with Atmel’s ATmega8U2? You can check out Axlecrusher’s original blog post here and the source code on GitHub.

Infographic: Atmel’s secret maker sauce – AVR

Maker Faire Rome and World Maker Faire New York may be behind us, but Atmel is by no means finished making a big deal of the Maker Movement this year!

In fact, milling around with the most passionate (oddly dressed) people on the planet only ever serves to galvanize us to put that extra dash of passion into everything we do, into every chip and kit we produce. That’s because somewhere out there is a Maker who will take our kits or chips and build a 3D printer with them, or a nippy little vision-sensor robot, or even a smart toilet (yes, seriously)!

It’s always easy to appreciate a finished product, of course. But we Atmelians know that it’s the components that allow people’s designs to really shine… much as a first-class meal is only as good as the ingredients used.

Our secret ingredient, of course, is AVR; the little chip that can do big things and create infinite possibilities.

[Maker AVR infographic – click image to enlarge]

AVR was one of the first microcontrollers to use on-chip flash memory for program storage, pushing the envelope early on. Starting out as a PhD project in Trondheim, Norway, the technology has come a long way, both literally and figuratively!

Available in the tiniest of packages (the ATtiny20 is so small it can almost fit inside the ball of a ballpoint pen) and so low-power it makes Sleeping Beauty look like a fitness instructor, AVR has wowed makers from the get-go.

That’s why AVR was the first-choice chip for Maker favorite Arduino. It’s now estimated that around one million Arduinos have been sold to date, and that within the next 5 to 10 years the Arduino will be used in every school to teach electronics and physical computing.

Not to mention how many quadcopters and crazy-looking drones AVR powers. Former Wired editor Chris Anderson estimates that the DIY Drone community currently boasts over 15,000 drones, compared to just 7,000 “professional” drones in use worldwide by military forces. Power to the people, so to speak!

Is it any surprise Atmel almost bursts with pride whenever we find a new AVR project to tout? We’ve even created our own award for Makers with the most creative AVR vision (you’re free to submit your own projects, check out others or just vote for your favorites until the end of December!)

We hope you enjoy some of the fun facts we’ve dug up for our Maker AVR infographic even half as much as we enjoyed making it! Keep creating, folks!

1:1 interview with Geoffrey Barrows of ArduEye

ArduEye is a project by Centeye, Inc. to develop open source hardware for a smart machine vision sensor. All software and hardware (chips and PCBs) for this project were developed either from pre-existing open source designs or from Centeye’s own 100% IR&D (independent research and development) efforts. In the interview below, Atmel discusses the above-mentioned technology with Maker Geoffrey Barrows, founder of ArduEye and CEO of Centeye.

Tom Vu: What can you do with ArduEye?

Geoffrey Barrows:  Here are some things people have actually done with an ArduEye, powered by just the ATmega328 type processor used in basic Arduinos:

  • Eye tracking – A group of students at BCIT made an eye-tracking device for people paralyzed with ALS (“Lou Gehrig’s disease”) that allows them to operate a computer using their eyes.
  • Internet-connected traffic counter – I aimed an ArduEye out at the street in front of my house and programmed it to count the number of cars driving northbound. Every 5 minutes, it would upload the count to Xively, allowing the whole world to see the change in traffic levels throughout the day.
  • Camera trigger – One company used an ArduEye to make a camera trigger at the base of a water slide at a water park. When someone riding the slide reached the bottom, the camera took a picture of the rider and then sent it to him or her via SMS!
  • Control a robotic bee – My colleagues at Harvard University working on the “RoboBee” project mounted one of our camera chips on their 2 cm robotic bee platform. The chip was connected to an Arduino Mega (obviously not on the bee), which ran a program to compute visually, using optical flow, how high the bee climbed. A controller could then cause the bee to climb to a desired height and hold a position. This was a very cool demonstration.
  • Control a drone – My colleagues at the U. Penn GRASP Lab (who produced the famous swarming quadcopter video) used two ArduEyes to control one of their nano quadcopters to hover in place using vision.
  • The New Jersey-based “Landroids” FIRST robotics team uses ArduEyes on their robots to do things like detect objects and other robots.

These are just some examples. You can also do things like count people walking through a doorway, make a line-following robot, detect bright lights in a room, and so forth. I could spend hours dreaming up uses for an ArduEye. Of course, an ArduEye doesn’t do any of those things “out of the box”; you have to program it.

TV:  Explain the methodology and the approach? What is your general rule of thumb when it comes to resolution and design margins?

GB:  My design philosophy is a combination of what I call “vertical integration” and “brutal minimalism.” To understand “vertical integration,” imagine a robot using a camera to perform vision-based control. Typically, one company designs the camera lens, another designs the camera body and electronics, and another company designs the camera chip itself. Then you have a “software guy/gal” write image processing algorithms, and another person implement the control algorithms to control the robot. Each of these specialties is performed by a different group of people, with each group having their own sense of what constitutes “quality.” The camera chip people generally have little experience with image processing and vice versa. The result is a system that may work, but is cumbersome and cobbled together.

Our approach is instead to consider these different layers together and in a holistic fashion. At Centeye, the same groups of minds design both the camera hardware (the camera body as well as the camera chips themselves) and the image processing software. In some cases we even design the lens. What this means is that we can control the interface between the different components, rather than being constrained by an industrial standard. We can identify the most important features and optimize them. Most important, we can then identify the unnecessary features and eliminate them.

This latter practice, that of eliminating what is unnecessary, is “brutal minimalism.” This is, in my opinion, what has allowed us to make such tiny image sensors. And the first thing to eliminate is pixels! It is true that if you want to take a beautiful photograph for display, you will need megapixels worth of resolution (and a good lens). But for many other tasks, you need far fewer than that. Consider insects: they live their whole lives using eyes that have a resolution between about 700 pixels (for a fruit fly) and maybe 30k pixels (for a dragonfly). This is an existence proof that you don’t always need a million pixels, or even a thousand pixels, to do something interesting.

TV:  What are some of the interesting projects you have worked on when involving sensors, vision chips, or robotics?

GB:  The US Air Force and DARPA have, over the years, sponsored a number of fascinating programs bringing together biologists and engineers to crack the code of how to make a small, flying robot. These projects were all interesting because they provided me, the engineer, with the chance to observe how Nature has solved these problems. I got to interact with an international group of neuroscientists and biologists making real progress “reverse engineering” the vision systems of flies and bees. Later on, I got to implement some of these ideas in actual flying robots.

This gave me insights into vision and robotics that are often contradictory to what is generally pursued in university research efforts: the way a fly perceives the world and controls itself is completely different from how most flying “drones” do the same. Flying insects don’t reconstruct Cartesian models of the world, and they certainly don’t use Kalman filters!

I also participated in the DARPA “Nano Air Vehicle” effort, where I got to put some of these bio-inspired principles into practice. As part of that project, we built a set of camera chips to make insect-inspired “eyes,” and then hacked a small toy helicopter to do things like hold a position visually, avoid obstacles, and so forth, with a vision system weighing just a few grams. What very few people know is that some of the algorithms we used could be traced back to insights obtained directly by those biologists studying flying insects.


Right now we are also participating in the NSF-funded Harvard University “RoboBee” project, whose goal is to build a 2 cm-scale flying robotic insect. Centeye’s part, of course, is to provide the eyes. My weight budget will be about 20 milligrams. So far we are down to about 50 milligrams, with off-board processing, so we have a way to go.

TV: You mentioned insects. Do you draw inspiration from biology in your designs?

GB: Another aspect of our work, especially our own work with flying drones, is to take inspiration from biology. This includes the arrangement of pixels within an eye, as well as the type of image processing to perform and even how to integrate all this within a flight control system.

There is a lot we can learn from how nature has solved some tough problems, and we can gain a lot by copying these principles. However, in my experience it is best to understand the principles behind why nature’s particular solution to a problem works and innovate with that knowledge, rather than to slavishly copy a design you see in nature. Consider a modern airliner and a bird in flight. They do look similar: wings keep them aloft using Bernoulli forces, a tail provides stability by keeping the center of drag behind the center of gravity, and they modify their flight path by changing the shape of their wings and tail. However, an airliner is made from metal alloys, not feathers!


I like to invoke the 80/20 principle here: if you make a list of all the features of a design from nature, probably 80% of the benefit will come from 20% of the features, or even less. So focus on finding the most important features, and implement those.

TV:  What are the technology devices, components, and connectivity underneath?

GB:  For almost all of our vision sensor prototypes, including ArduEyes, there are four essential components: a lens, which focuses light from the environment onto the image sensor chip; the image sensor chip itself; a processor board; and an algorithm running on the processor. You can substantially change the nature of a vision sensor by altering just one of these components. We usually use off-the-shelf lenses, but we have made our own in the past. We always use our own image sensor chip. For the processor, we have used everything from an 8-bit microcontroller to an advanced DSP chip. And finally, we generally use our own algorithms, though we have tinkered with open source libraries like OpenCV.

It can take a bit of a mentality shift to be able to design across all these different layers. Most of the tools and platforms out there do not allow this type of flexibility. However, with a little bit of practice it can be quite powerful. Obviously, the greatest amount of flexibility comes from modifying the vision algorithms.

TV:  Does nature have a smart embedded designer? If so, what would Nature’s tagline or teaser be for its creations? What’s the methodology or shape, if you can sum it up in a few words?

GB: Perhaps one lesson from Nature’s “embedded designer” would be “Not too much, not too little.” To understand this, consider evolution: If you are a living creature, then your parents lived long enough to reproduce and pass their genes to you. This is true of your parents, grandparents, and so on. Every single one of your ancestors, going all the way back to the origins of life on Earth, lived long enough to reproduce, and your genetic makeup is a product of that perfect 100% success rate. It is mind blowing to think about it.

ArduEyeMini_front-avr-atmega-atmel

Now, for a creature to live long enough to reproduce, it has to have enough of the right features to survive. But it must also not have too many features, and it must also not have the wrong features. Most animals get barely enough energy (e.g. food) to survive. If a particular animal has too many “unnecessary features,” then it will need more food to survive and thus is less likely to pass its genes on.

Another lesson would be that a design’s value is measured relative to the application. Each animal species evolved for a particular role in a particular environment – this is why penguins are different from flamingos, and why fruit flies are different from eagles. Applied to human engineered devices, this means that any “specification” or figure of merit considered in a vacuum is meaningless. You have to consider the application, or the environment, first before deciding on specifications. This is why choosing a camera based only on the number of “megapixels” it has is dangerous.

TV:  What is your rule of thumb when it comes to prototypes, testing, improving, and then rolling out the fuller design?

GB:  I’m going to be more philosophical here. Rule #1: A crappy implementation of the right thing is superior, both technically and morally, to a brilliant implementation of the wrong thing. Wrong is wrong, no matter how well done. Rule #2: A crappy implementation of the wrong thing is superior to a brilliant implementation of the wrong thing. Doing the wrong thing brilliantly generally consumes more resources than doing it crappily, plus the fact that you invested more into it makes you less likely to abandon it once you realize it is wrong.

Of course, the ideal is to do a brilliant implementation of the right thing. However, when you are prototyping a new device, or trying to bring a new technology to market, it is very difficult to know what the right and the wrong things to do are. So the first thing you must do is not worry about being crappy, and instead focus on identifying the right thing to do. Quickly prototype a device, experiment with it, get it into the hands of customers if it is a product, get feedback, and make improvements. Repeat this cycle until you know you are doing the right thing. And only then put in the effort to do a brilliant implementation.

Those who are familiar with the “Lean Startup” business development model will recognize the above philosophy. I am a big fan of Lean Startup. I would give away everything I own if I could send a few relevant books on the topic back in time to my younger self 15 years ago, with a sticky note saying “READ ME YOU FOOL!”

Now of course we have to take the word “crappy” with a grain of salt. I don’t mean to produce and deliver rubbish. That helps no one. Instead, what I mean is that the first implementations you put out there are “brutally minimalist” and include the bare essence of what you are trying to produce. It may be minimal, but it still has to deliver something of real value. This is often called a “minimum viable product” in the Lean Startup community.

The same applies to when we are conducting research to develop a new type of technology. The prototypes are ugly, and often use code that makes spaghetti look like orderly Roman columns. But their purpose is to quickly test out and refine an idea before making it “pretty”.

TV:  What is the significance of the ATmega328 in your embedded design?

GB:  We chose the ATmega328 because this is the standard processor for basic Arduino designs. We wanted to maintain the Arduino experience as faithfully as possible to keep the product easy to hack.

TV:  How important is it for you to rapidly build, test, and develop the evolution of your product from Arduino?

GB:  Funny you should ask. We use Arduinos and ArduEyes all the time to prototype new devices or even perform basic experiments. When I get a new chip back from the foundry, the first thing I do is hook it up to an Arduino. I can verify basic functionality in just a few hours, sometimes even in ten minutes.

TV:  What is the difference between Centeye and ArduEye? Technology differentiators?

GB:  ArduEye is essentially a project developed and supported by Centeye. The main differentiator is that ArduEye was developed in isolation from our other projects, in particular those associated with defense. We essentially developed a separate set of hardware, including chips, and software, and did so at no small expense. This is partially why it took so long for this project to become reality.

TV:  How do you see ArduEye and vision chips in the future for many smart connected things?

GB:  I think the best uses for adding vision to an IoT application will come not from me, but from tinkerers, hackers, and other entrepreneurs who have identified a particular problem or pain that can be solved using our sensors as a platform. But in order for them to innovate, vision must be tamed to the level that these users can quickly iterate through different possibilities. I see ArduEye as a good platform to make that happen, to let such innovation occur in a frictionless manner.

TV:  What are some of the IoT implications of using brilliant sensor eye devices in products?

GB:  At one level there is a rich amount of information you can obtain with vision. Think about it – you can drive a car if you only have visual information. However, vision has a tendency to generate a LOT of data. This is true even for a very modest image sensor of several thousand pixels. And it is true that bandwidth is getting cheaper, but I don’t think the Siri model of pushing all the data to “the cloud” for processing is a viable one. You will have to find ways to process vision information up at the sensor, or at some nearby node, before that information can be sent up to the cloud.

TV:  How can sensors like ArduEye be compounded with richer use-cases especially when integrating the Big Data and Cloud initiatives of modern trending IT innovations?

GB:  Over the next decade we will see newly minted billionaires who have figured this out.

TV:  How can ArduEye evolve? As a visionary, how do you see ArduEye being integrated further to accelerate efficiency?

GB:  Good question! Well, first of all, this will depend on how others are using ArduEye and the feedback I get from them. For ArduEye to be successful, it has to be valuable to other people. So I would really like to hear feedback from anyone who uses these products, so that we can make them better. I am happy to speak with anyone who uses them. Tell me – do you know any other image sensor companies that let you speak with the people who design the chips? That said, some obvious improvements would be to incorporate more advanced Arduinos, such as the Due, which uses an ARM processor.

TV:  Are there security or privacy concerns for this technology to evolve? What are the caveats for designers and business makers?

GB:  Security and privacy will be a big issue for the Internet of Things, and will lead to many entrepreneurial opportunities. However, this is not our focus. But if you think about it – a benefit to using ArduEyes to monitor a room instead of a full-resolution camera is that they won’t be able to recognize people’s faces! You can say, half jokingly, that privacy is built in!

TV:  How are vision chips and open source ArduEye helping people live better or smarter lives? Where do you see this going in 5-10 years?

GB:  The ArduEye is a fairly new project and is one that takes an uncommon, though technically sound approach to machine vision. So right now all of the use cases are experimental. This is very often the case for a new emerging technology. It will take time for the best applications to be found. But I expect that as our community of users grows, and as we learn to better service this community, we could see a diverse set of applications. Right now I can only speculate.

TV:  Where do you see Sensors, Vision, etc play a more pivotal and crowding role in the grandeur Internet of Things, Internet of Everything, and Industrial Internet?

GB:  In order for the Internet of Things to reach its full potential, it will need sensors to acquire all the information that is needed. Already the number of devices connected to the Internet is in the billions. It will only be a matter of time before this reaches the trillions. And we all know that vision is a powerful sensory modality. Some of the vision sensors will be higher resolution imagers of the type you see in cameras. However, in the same way that there are many more insects than large mammals on planet Earth, it makes sense that there is room for many more cameras of ArduEye capability than for full image sensors. This is where I see Centeye playing in the future. More than that, this is why I originally founded Centeye in 2000 – the company name was meant to be a triple pun, with the prefix “cent-” suggesting many, tiny, and inexpensive. Many eyes, tiny eyes, cheap eyes. I was just too soon in 2000…

centeye_product_ardueye_avr

ATmega328 drives this rockin’ Lunchbeat sequencer

A Maker by the name of Jan Cumpelik has designed a rockin’ breadboard sequencer aptly dubbed “Lunchbeat.”

At the heart of the design is Atmel’s stalwart ATmega328, a microcontroller (MCU) which accepts inputs from the neat row of 10k trimpots as well as a series of tactile switches. Feedback is provided by a row of eight LEDs, driven from a 595 shift register to save pins on the microcontroller.

“The remaining chip is an op-amp which works in conjunction with a 3-bit R-2R ladder DAC to output audio. Turn your speakers down just a bit before taking in the demonstration,” writes Hackaday’s Mike Szczys.

“[Below] you will also find an image version of his schematic that we made for your convenience. It is only available as a PDF in the code repository he posted.”

As previously discussed on Bits & Pieces, Atmel’s ATmega328 is a high-performance 8-bit AVR RISC-based microcontroller that boasts 32KB of ISP flash memory with read-while-write capabilities, 1KB of EEPROM, 2KB of SRAM, 23 general purpose I/O lines, 32 general purpose working registers and three flexible timer/counters with compare modes.

Additional key specs include internal and external interrupts, a serial programmable USART, a byte-oriented 2-wire serial interface, an SPI serial port, a 6-channel 10-bit A/D converter (8 channels in TQFP and QFN/MLF packages), a programmable watchdog timer with internal oscillator and five software selectable power saving modes. Operating between 1.8 and 5.5 volts, the ATmega328 executes powerful instructions in a single clock cycle – achieving throughputs approaching 1 MIPS per MHz – neatly balancing power consumption with processing speed.

Interested in learning more? You can read more about the ATmega328 on Atmel’s website.

“Major surge” expected for RF remote control devices

Analysts at ABI Research say there will be a “major surge” in RF technology adoption for remote control devices in the consumer market over the next five years.

wirelessrf

According to ABI Research practice director Peter Cooney, implementation of RF tech is becoming “more simplified” as lower power consumption is achieved.

“Over the last five years there has been an upswing in technology development and a rise in the need to make home consumer devices smart that has led to resurgence in using RF,” Cooney explained.

“Initially, proprietary RF technology was used but equipment vendors have been quick to understand the benefits of using a standardized RF technology in remote control design.”

The analyst also confirmed that the remote control market represented a “massive growth” opportunity for wireless connectivity technology vendors.

“Over 3.2 billion remote controls will be shipped from 2013 to 2018 with flat panel TVs, set-top boxes, DVD/Blu-ray devices and games consoles alone,” he added.

And that is why Atmel is helping the stalwart remote control evolve beyond the confines of infrared. As Magnus Pedersen, Director of Atmel Wireless Solutions, notes, the humble TV remote control has provided us with a convenient, yet unengaging, means of controlling our televisions and AV equipment for decades.

“Bringing ultimate entertainment control to the holder, the infrared remote has changed little in that time despite the controlled technology making immense leaps,” he said.

“[However], as the use of RF-based controls gathers momentum, developers have a number of key design considerations to factor into their approach and how to select the components needed.”

Interested in learning more? Be sure to check out Atmel’s extensive wireless/RF portfolio here.

AVR video synthesizer and an analog video game prototype

Like most of the folks that come to the annual Analog Aficionados party, my buddy Todd Bailey has a bunch of interests. Todd helped Atmel out at the NY Maker Faire working at our booth, showing off his Atmel AVR-powered video synthesizer.

Todd-Bailey_Video-synthesizer

Todd Bailey’s video synthesizer getting a workout by Dan Friel as he performs Thumper

Todd does a lot of work with AVRs, some of which I can’t tell you about because he is under NDA (non-disclosure agreement). The video synth was a personal fun project perfectly aligned with the open-source and Maker movement. The synth generates all sync, blanking, and colorburst signals on an ATmega168 running at 14.31818 MHz (four times the NTSC color carrier frequency). The one at the Faire was a prototype, and Todd might move up to an XMEGA just so he can run at eight times the color carrier rate for tighter timing.

It’s currently written in mixed C and assembly.

Todd-Bailey_Video-synthesizer-PCB

Todd Bailey demonstrated this AVR-powered video synthesizer at the Atmel booth at NY Maker Faire 2013.

In addition to synthesized video, Bailey also loves old vector arcade games. These are games where the CRT (cathode ray tube) is not a raster unit like in your old analog TV. A vector tube is more like an oscilloscope, where you draw lines at any angle. Todd wrote:

“As some of you may have known or been involved in, a couple buddies and I have been working on a new arcade game using old vector monitors to take advantage of how beautiful and alien they look. We built an FPGA-based vector generator and a high-bandwidth, high-resolution XYZ DAC/amp, and have gotten really intimate with the guts of the Electrohome G05 monitor.”

Todd-Bailey_video game

“Anyway, most of the hardware and engine stuff is done and we decided it was time to show it off to our friends.  The storyboard as it stands is about cryogenically frozen Soviet pilots descending from space and blowing up Chicago, although the prototype game right now is just about blasting polygons.  It’s in full 3D wireframe, and it also features a separately-driven monochrome ATM CRT as the ship’s HUD. We’d like it to become a proper stand up arcade game pretty soon but have basically no idea what to do with it when we’re done.”

I got into vector CRTs when I saw the schematics for the HV (high voltage) section of the Tempest vector monitor. They would have been better off running open-loop. What the flyback circuit does is try to maintain voltage on a system with a static load, so all you really get is excessive current as the flyback windings start to short, and the well-known smoke effect from these systems. A universal-input current-mode flyback would be just the ticket – protecting the transformer from fire – and I bet even that could run open-loop once you set it at the factory.