Author Archives: l2myowndevices

About l2myowndevices

Don Dingee (@L2myowndevices) is an explorer on the #IoT, writing and speaking at the confluence of embedded, mobile, sensing, wireless, and social technology.

tinyAVR in 8- and 14-pin SOIC now self-programming


At this week’s Embedded World 2016, Atmel is going old school with its 8-bit news, heading straight to the low-pin-count end of its MCU portfolio with a significant upgrade to the tinyAVR family.

According to Atmel’s briefing package, development of the ATtiny102 and ATtiny104 has been in progress for some time. We got a peek at the company’s roadmap for AVR where these are labeled “next generation tinyAVRs,” and all we can say is this is the beginning of a significant refresh — alas, we can’t share those details, but we can now look at these two new parts.

What jumps out immediately is how the AVR refresh fills a significant gap in Atmel’s capability. The existing tinyAVR family is anchored by the ATtiny10, a capable 8-bit AVR core running at up to 12 MIPS with 0.5 or 1KB Flash and 32B of SRAM. The pluses of extended availability are obvious at the beginning of the lifecycle, but by the midpoint of a long run, the technology can start to seem dated.

ATtiny102/ATtiny104

That is certainly the case for the ATtiny10, introduced in April 2009. At that time, the ATtiny10 was a shot straight at the Microchip PIC10F, with much higher CPU performance and a competitive 6-pin SOT and 8-pin DFN package offering. Outside of the CPU itself, the ATtiny10 and PIC10F line up pretty closely except for two areas: self-programming, and the accuracy of on-chip oscillators and voltage references. ATtiny10 parts require pre-programming from Atmel or a distributor, and their rather wide accuracy specs need help from product calibration and external componentry – however, cost and code compatibility still carry a lot of sway, and the popularity of the ATtiny10 was unshaken.

The ATtiny102/104 retain the AVR performance advantage — still a 12 MIPS core with 1KB Flash and 32B SRAM — and upgrade many of the features around it. First and most noticeable is a packaging improvement. The ATtiny102 comes in an 8-pin SOIC (with the 8-pin DFN option still available). For a generation of applications needing more I/O in a low-cost part, the ATtiny104 comes in a pin-compatible 14-pin SOIC with 6 extra I/O pins.

Features for ATtiny102/ATtiny104

Self-programming of Flash has been added to both versions, and with the same core footprint a single production image can serve both parts. Fast start-up time is available as an option as well. The internal voltage references are now highly accurate, with calibrated 1.1V, 2.2V, and 4.3V taps at +/- 3%. Internal oscillator accuracy is now +/- 2% over a 0 to 50 degrees C temperature range at fixed voltage. Those improvements justified expanding the successive approximation ADC to 10-bit resolution, with the channel count doubled to eight. Two of the I/O pins can now be configured as a USART, adding serial communications capability. A new 10-byte unique ID provides a serial number.
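For the curious, here is a rough sketch of what driving that new USART could look like in avr-gcc style code: a minimal polled transmit loop. It assumes the ATtiny102/104 follow the familiar AVR USART register naming (UBRR0, UCSR0B, UCSR0C, UDR0) and run from the 8 MHz internal oscillator; treat those names and the clock figure as placeholders until checked against the datasheet.

```cpp
#include <avr/io.h>

#define F_CPU      8000000UL   /* assumed internal oscillator frequency */
#define BAUD       9600UL
#define UBRR_VALUE ((F_CPU / (16UL * BAUD)) - 1)

/* Register and bit names assume the standard AVR USART convention;
   verify against the ATtiny102/104 datasheet before use. */
static void usart_init(void)
{
    UBRR0H = (uint8_t)(UBRR_VALUE >> 8);       /* baud rate divisor */
    UBRR0L = (uint8_t)UBRR_VALUE;
    UCSR0B = (1 << TXEN0);                     /* enable transmitter only */
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00);    /* 8N1 framing */
}

static void usart_putc(char c)
{
    while (!(UCSR0A & (1 << UDRE0)))           /* wait for empty data register */
        ;
    UDR0 = c;
}

int main(void)
{
    usart_init();
    for (;;) {
        const char *msg = "hello from tinyAVR\r\n";
        while (*msg)
            usart_putc(*msg++);
        /* a real application would sleep or poll sensors here */
    }
}
```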

Those features translate into tangible benefits for intelligent devices using the ATtiny102 and ATtiny104. The more accurate internal oscillator improves the precision of motor control in personal care devices such as toothbrushes and electric shavers. The calibrated voltage references enable applications where rechargeable battery management is a primary function, for example in the d.light family of portable solar-powered lighting.

For more information on the ATtiny102 and ATtiny104 MCUs, you can check out Atmel’s recent post here.

This announcement, and what I think will follow from Atmel later this year, reaffirms just how important 8-bit is to Atmel’s future. The AVR architecture is beloved for its simplicity and ubiquity, with over 7 billion cores now shipped. The advances in the ATtiny102 and ATtiny104 are aimed at reducing BOM and manufacturing costs and enabling further innovation in intelligent consumer devices.

MQTT not IoT “god protocol,” but getting closer

One protocol, and its descendants, drove the success of the World Wide Web. IP, or Internet Protocol, is the basis of every browser connection and the backbone of IT data centers. Some assumed that the Internet of Things would follow suit, with the thought that having an IP address would be a sufficient condition to connect.

The problem on the IoT isn’t IP – the problem is all the stuff layered on top of it. Running protocols such as HTTP, SSL, and XML requires significant compute power and memory space. The average PC, smartphone, or tablet has enough horsepower today to do that, but the average sensor running on a smaller microcontroller does not. (ARM Cortex-M7 notwithstanding.)

To combat that, the industry has spawned a huge list of alternative, mostly non-interoperable IoT protocols. A partial list: 6LoWPAN, AllJoyn, AMQP, ANT+, Bluetooth, CoAP, DASH7, DDS, INSTEON, KNX, MQTT, NFC, RFID, STOMP, Thread, Weightless, XMPP, ZigBee, and Z-Wave. New consortia are popping up weekly with more ideas.

Searching for an IoT “god protocol”, one unifying end-to-end protocol serving all things, is silly. At one end, sensors have different requirements such as range, RF spectrum, security, topology, and power consumption. At the other, any successful IoT strategy will ultimately need to integrate with an IP-based cloud in some form. Greenfields of any scale rarely exist. IoT applications need to connect and exchange data.

The answer is building a multi-protocol bridge between sensors and actuators, mobile devices, and the cloud. Ideally, code would be open source, and would provide scalability to span a wide range of devices in large numbers. Additionally, transport would be reliable, surviving brief intermittency in wireless connections.

IBM Internet of Things Foundation

More and more organizations are embracing MQTT as part of the bridge. MQTT offers a full-up version running over TCP/IP, and a slimmed-down version, MQTT-SN, for non-IP devices. Its publish/subscribe model allows topologies to scale while retaining real-time characteristics and configurable quality of service.
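To show how lightweight the device side of publish/subscribe can be, here is a minimal Arduino-style sketch using the open source PubSubClient MQTT library over Ethernet. The broker address, topic names, and client ID are placeholders, and a production node would add reconnect back-off and transport security where the hardware allows.

```cpp
#include <SPI.h>
#include <Ethernet.h>
#include <PubSubClient.h>

byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0x01 };
const char broker[] = "broker.example.com";   // placeholder broker address

EthernetClient net;
PubSubClient mqtt(net);

// Called whenever a message arrives on a subscribed topic
void onMessage(char* topic, byte* payload, unsigned int length) {
  Serial.print("rx ");
  Serial.print(topic);
  Serial.print(": ");
  Serial.write(payload, length);
  Serial.println();
}

void setup() {
  Serial.begin(9600);
  Ethernet.begin(mac);                 // bring up the network via DHCP
  mqtt.setServer(broker, 1883);
  mqtt.setCallback(onMessage);
}

void loop() {
  if (!mqtt.connected()) {
    if (mqtt.connect("sensor-node-01")) {      // placeholder client ID
      mqtt.subscribe("home/actuators/#");      // listen for commands
    }
  }
  mqtt.publish("home/sensors/temp", "23.5");   // publish a reading
  mqtt.loop();                                 // service the MQTT session
  delay(5000);
}
```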

IBM started the whole MQTT movement as a message broker for mainframes and servers, with integration into WebSphere for web services. They then opened it up for embedded use in a release to OASIS and the Eclipse Foundation.

A big piece of IBM Bluemix is the IoT Foundation, a cloud-based instance of MQTT with predefined topic structures and message formats. Mobile apps are already using MQTT, with applications such as Facebook Messenger and Salesforce.com. IBM also has an e-book on MQTT in mobile.

Other recent developments to consider:

  • ARM’s mbed device server seeks to replace a generic web server with one tailored for the IoT. Built from technology in the Sensinode acquisition, ARM has brought HTTP, CoAP, and MQTT together in one platform.
  • 2lemetry has taken that a step further with ThingFabric, integrating protocol actors including MQTT, CoAP, and STOMP, along with extensibility.
  • PubNub has taken a WebSocket connection approach running over MQTT, focusing on low-latency, reliable delivery from a cloud implementation. There is a good PubNub guest post on Atmel Bits & Pieces describing the approach.
  • Speaking of Atmel and Arduino, IBM has several recipes for leveraging Arduino with the IoT Foundation, such as an Arduino Uno connection example, and a series on implementing a cloud-ready temperature sensor.
  • Open source motivates many folks, and one of the more interesting individual projects out there is a bridge for AllJoyn to MQTT. If successful, the implications could be significant, such as controlling home entertainment devices directly from Facebook on a mobile device.

Again, I don’t think there is a “god protocol” that will take over the IoT once and for all, satisfying each and every use case. The winners are going to integrate multi-protocol bridges to serve as wide a range of use cases as possible. The ability of MQTT to connect sensors and mobile devices to big data systems in real time is drawing more participants in.

This post has been republished with permission from SemiWiki.com, where Don Dingee is a featured blogger. It first appeared there on November 5, 2014.

Secure at any IoT deed

In his classic book, “Unsafe at Any Speed,” Ralph Nader assailed the auto industry and its 1960s approach of styling and cost efficiency at the expense of safety. He squared up on perceived defects in the Chevrolet Corvair, but extended his view to wider issues such as tire inflation ratings favoring passenger comfort over handling characteristics.

History has not treated Nader’s work kindly, possibly because of his politics, including a crusade on environmental issues that spurred creation of the US Environmental Protection Agency. Sharp criticism of Nader’s automotive fault-finding came from Thomas Sowell in his book “The Vision of the Anointed.” Sowell targeted “Teflon prophets,” Nader foremost among them, who foretell impending calamity using questionable data unless government intervenes as regulatory savior.

Sowell’s most scathing indictment of Nader was for failing to understand the trade-off between safety and affordability. Others targeted Nader’s logic by suggesting some non-zero level of risk and injury is acceptable if society progresses, supported by data showing the Corvair was actually no worse on safety than its contemporaries on the automotive market at the time.

Yet, almost five decades later, we have Toyota sudden acceleration damage awards, GM ignition switches and massive recalls in progress, and the prospect that someday soon an autonomous car may go haywire. The problem seems to be not errors of commission, but errors of omission; complex engineering requirements, design, and test are becoming increasingly difficult. Getting all that done at volumes and prices needed to drive model year expectations and consumer market share is a big ask.

In the industrial context of the IoT, “safety critical” design is a science, with standards, certification, and independent testing. In application segments such as aerospace and defense, medical, industrial automation, and others – even the automotive industry, which has made huge strides in electronics and software development – safety and risk are proactively managed.

Security for consumers on the IoT is another matter. Devices are inexpensive, often created by teams with little to no security experience. Worse yet, many security features carry a stigma as unnecessary overkill that would slow down performance, get in the way of usability, or push costs beyond competitiveness. This is an accident waiting to happen.

Or perhaps, one already in progress, if we believe the recent study on firmware in a sampling of consumer devices. A lot of folks think benevolent hackers are also polytetrafluoroethylene-coated, but it is hard to dispute there is cause for concern among embedded devices when it comes to security — especially when those devices connect to networks.

One of the areas cited in the study is encryption, and some rather sloppy handling of keys when it is used. Across the industry, embedded software is wildly inconsistent in approaches to encryption. As the study points out, developers are prone to stamp out copies of aged, flawed solutions because they are comfortable with and invested in a particular approach.

Regulation is the last thing we need here. Engineers need a lot more education, starting from the basics of including and using hardware encryption units on MCUs and SoCs, through state-of-the-art knowledge in cryptography and certificate management, and up to IT-style approaches such as over-the-air software updates and two-factor authentication.

We also need some deeper thought on encryption implementations, beyond just NIST recommendations. In a web context, we have Transport Layer Security (TLS), but that protocol requires a full IP stack and a lot more horsepower than many small embedded devices can afford. On top of that, hardware encryption is currently very vendor-dependent. Vendors like Atmel are working with ARM on TrustZone technology to create newer implementations based on Trusted Execution Environment APIs, tuned for IoT devices instead of data center use.

Historically, encryption has been applied to securing closed systems – the IoT presents a paradox. If it devolves into a myriad of smaller, effectively closed systems that only intermittently share data, we may gain some benefit, but will never reach the vision.

The best-case scenario is that an effective set of industry practices for encryption in consumer IoT devices emerges before problems become widespread and defeat the very purpose of sharing data with the cloud. We need developers not to avoid encryption, but for that to happen it has to be cost-effective and easy enough to implement.

This post has been republished with permission from SemiWiki.com, where Don Dingee is a featured blogger. It first appeared there on August 25, 2014.

Sewn open: Arduino and soft electronics

As several other recent threads on SemiWiki have pointed out, the term “wearables” is a bit amorphous right now. The most recognizable wearable endeavors so far are Google Glass, the smartwatch, and the fitness band, but these are far from the only categories of interest.

There is another area of wearable wonder beginning to get attention: clothing, which has drawn the interest of researchers, makers, and moms alike. The endgame as many see it is smart clothing: the weaving of electronics, sensors, and conventional fabrics into something called e-textiles. However, while athletes, soldiers, and other niches may get sensor-impregnated jerseys sooner, affordable clothing based on exotic advanced fabrics for most consumers may still be 20 or 30 years away by some estimates.

Right now, we have these anything-but-soft computing structures – chips, circuit boards, displays, switches – adaptable for some clothing applications. Still missing are some key elements, most notably power in the form of energy harvesting or smaller and denser batteries. The influence of water-based washing machines and their adverse effect on most electronics also looms large.

How do we cross this gap? It’s not all about advanced R&D; these types of challenges are well suited for experimentation and the imagination of makers. Several Arduino-compatible maker modules – all based on Atmel microcontrollers – have jumped into the fray, showing how “soft electronics” can help create solutions.

LilyPad embroidery
Maybe I’ve built one or two too many harness assemblies using expensive, mil-spec circular connectors, but the fascinating thing to me is what makes all these boards wearable. Small size is nice, but anybody knows a project needs wiring, right? You’ll notice the large plated holes on the first several offerings: these are eyelets for conductive thread, literally intended to sew these boards to other components like fabric pushbuttons. Many projects also use snaps, similar to 9V battery connections, to disconnect boards for conventional washing of the garment.

Arduino IDE

The other side of this is the software. One of the attractive features of Arduino is the IDE: real, live C-style programming simplified for the masses, with functions designed to perform I/O on the Atmel MCU. Code is edited on a PC or Mac, compiled into a sketch, and uploaded to the board. There are so many open source code examples for Arduino maker modules out there that it is easy to find and integrate functions quickly.
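For anyone who has never opened the IDE, a sketch really is that approachable. The example below is a minimal wearable-style program that blinks an LED at a rate set by a light sensor; the pin numbers are placeholders, since each board routes its sewable pads differently.

```cpp
// Minimal wearable sketch: blink an LED and read a light sensor.
// Pin assignments are placeholders; check your board's pinout.
const int ledPin    = 1;   // sewable pad wired to an LED
const int sensorPin = A0;  // analog pad wired to a light sensor

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int light = analogRead(sensorPin);   // 0..1023 on an 8-bit AVR board
  digitalWrite(ledPin, HIGH);
  delay(light);                        // brighter room, longer on-time
  digitalWrite(ledPin, LOW);
  delay(100);
}
```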

If that all sounds crazy, consider that the pioneer for this is Leah Buechley of the MIT Media Lab, one of the thought leaders of the maker movement and an expert on e-textiles. She is the brains behind the LilyPad, the original 2” diameter Arduino wearable circa 2007, commercialized through SparkFun, with the most recent version featuring the ATmega32u4 and native USB.

Adafruit took the next steps with two wearable boards. FLORA is slightly smaller than the LilyPad and retains the same familiar circular profile and ATmega32u4 MCU. GEMMA goes even smaller, 1.1” in diameter, packing an ATtiny85 on board with a USB connection for easy development.

Adafruit GEMMA

Not to be outdone by circles, squares and rectangles are still in the mix. SquareWear 2.0 comes in two versions, including a 1.7” square variant with a coin cell socket onboard, both built around the ATmega328 MCU with simulated USB, high-current MOSFET ports, a light sensor, and a temperature sensor. Seeed grabbed the ATmega32u4 and designed it into the Xadow, a tiny 1” x 0.8” expandable unit with integrated flat cable connectors for daisy chaining.

SquareWear 2.0

These aren’t just toys for creating flashing LEDs; there is no shortage of sensors and connectivity compatible with these wearable maker modules, including displays, GPS, Bluetooth, and more. Their popularity is growing: Becky Stern of Adafruit claims over 10,000 units of FLORA have shipped so far, and they are the darlings of Maker Faire fashion shows and hackathons.

Besides the upside for makers, maybe this sewing angle will finally allow us to explain electronics to our moms, after all. Until we get to the fulfilled flexible future of e-textiles and more advanced technology, the conductive thread of soft electronics will stitch together creative ideas using somewhat familiar tiny modules with today’s microcontrollers.

This post has been republished with permission from SemiWiki.com, where Don Dingee is a featured blogger. It first appeared there on May 21, 2014.

On the road from Makers to consumers

It’s time to break with conventional thinking. For decades, the measure of success for semiconductors has been OEM design wins. Most consumers haven’t known, or cared, about what is inside their electronic gadgets, as long as they work. That may be about to change, because a new intermediary is finding its voice – and being heard in high places.

Intel and Apple, in different ways, began challenging the norm by pursuing consumer branding and developing pull-through demand for their parts as drivers of the overall experience. Coupling what people “feel” about their devices with the technology powering them creates an almost unbreakable bond, akin to a religious response. Reaching billions of people has required billions of dollars and high profile advertising campaigns – out of the question for most embedded semiconductor companies.

A new road is being carved across the landscape, paved not with gigantic chips packing billions of transistors delivering a cascade of social chatter and streaming entertainment content. This road is built with ideas carried on small boards and open source software, and a sense of wonder about how the world works, and what we can do to shape it.

Somewhere on that road right now is a big truck, captured in pixels at a stop in June 2014 that may go down as a turning point in the annals of semiconductor evolution.

Overstated? The truck tour is a tried-and-true mechanism for reaching industrial OEMs, taking hands-on demonstrations to cities far from the sources of silicon and software innovation. If we were only talking about embedded design and the industrial IoT, it’d be business as usual, and this would be just another truck with a fancy paint job and a couple of FAEs inside.

But, it’s not. The industrial IoT is wonderful and welcome; however, by and of itself it won’t generate the billions of units needed to drive a recovery and restart growth in semiconductors and the economy at large. That will only come from reaching and capturing consumers with IoT technology, in a big way.

And that, so far, has proven difficult. After all, even industry experts are feverishly debating the name IoT, questioning what applications really fall under the moniker, or what exactly it means. Much like “smart grid” and “mHealth” before it, the term IoT means something in the developer community, but not so much to consumers who don’t yet see a connection between the Internet and how they use everyday things.

A recent SOASTA survey suggests 73% of the US has never heard of the IoT, at least until an interviewer explains it to them. (I’m curious why that number always seems to be 73% no matter the topic, but let’s just say 3 out of 4 – I believe it.) When hearing oral arguments in the Aereo case earlier this year, several US Supreme Court justices issued queries indicating a limited grasp of technology. (Cut to Keyrock: “I’m just a caveman … your modern ways frighten and confuse me.”)

This isn’t a lack of intelligence on their part; it’s a lack of generating the needed visibility on our part. These are the people we all must reach if we have a hope to succeed. Who is going to reach them? Makers, armed with our tools and their ideas. Atmel and other tech firms reaching Washington and the first-ever White House Maker Faire, side by side with people like the star of Sylvia’s Super-Awesome Maker Show, was a milestone in delivering the message to the masses. This goes way beyond the T and E in STEM; remember, the social transformation was driven by youth, and young makers are going to drive the uptake of the consumer IoT.

Why? Well, frankly speaking, they don’t think like engineers – they think like actual, real-life users. I made the comment recently that we need to be careful: the people we are trying to reach can drive smartphones, not (name of other popular maker module redacted … sorry, Arduino didn’t rhyme.) Don’t be distracted by a 17-foot tall mechatronic giraffe with lava lamps for ears and a penchant for partying, or by the Obama crack that we don’t spell “fair” with an ‘e’ in this country. These are people designing things they, and people like them, want to use. More importantly, they will provide the translation of what the new technology can do, re-narrating the story from the language of semiconductor companies into the wants of the average consumer.

Makers are the people we need to win with. That idea isn’t lost on Chrysler, which has co-opted the maker movement as its own idea in 2014 commercials. Makers care about what is inside, and they are choosing Atmel in droves – in part because Atmel has redirected technological and social media energy into nurturing them, away from just talking to the button-down, risk-averse, safety-is-job-one industrial community. Intel and other chip suppliers are feverishly trying to catch the wave with makers, moving away from the “e2e” stance that only takes us so far in this next phase.

It’s not for the faint of heart, or the impatient. The industrial IoT is safe, somewhat predictable ground for experienced firms, whereas the consumer IoT still borders on bubble in many minds. The maker movement is to semiconductor firms now what university programs were back in the day, taken to the next level and reaching an even wider audience. Design wins with makers likely won’t show up in the volume shipments column right away – but they will show up as consumers get the IoT over time.

This post has been republished with permission from SemiWiki.com, where Don Dingee is a featured blogger. It first appeared there on June 19, 2014.

For I have seen the shadow of the curved touchscreen

Last year’s CES was the modern technology equivalent of the voyage of Ferdinand Magellan, proving beyond any shadow of doubt that displays can no longer be thought of as only flat. While the massive curved 105-inch TVs shown by LG and Samsung drew many gawkers, the implications of curved touch displays are even wider.

At DAC 50 there were more than a few chuckles and some mystified looks when Samsung’s Dr. Stephen Woo spent a lot of his keynote address highlighting flexible displays as one of the challenges for smarter mobile devices (spin to the 27:41 mark of the video for his forward-looking comments). I think if we had polled that room at that second, there would have been two reactions: 1) yeah, right, a flexible phone, or 2) hmmmm, there must be something else going on. His comments should have provided the clue that the flat display theory was about to dissolve:

Is there any major revolution coming to us? My answer to that is yes. I’m afraid that we as EDA, as well as the semiconductor industry, are not fully appreciating the magnitude of the revolution.

Woo showed the brief clip from CES 2013 introducing the first Samsung flexible display prototype, hinting that while exciting, it is still a ways from practicality. Why? He went on to explore the rigid structure of the current high volume smartphone – flat display, flat and hard board with flat and hard chips, and a hard case. I have some unpleasant recollections of trying chips on flex harnesses in the defense industry, and the problems become non-trivial with bigger parts and shock forces coming into play, not to mention manufacturing costs.

We might be getting thrown off by the limiting context of a phone as we know it. A gently curved but still fixed display poses fewer problems in fabrication using current technology. Corning has announced 3D-shaped Gorilla Glass, and Apple, LG, and Samsung are all chasing curved display fabrication and gently curved phone concepts today.

The real possibilities for smaller curved displays jump out in the context of wearables and the Internet of Things. The missing piece from this discussion: the touch interface. Flexible displays present a challenge well beyond the simplistic knobs-and-sliders, or even the science of multi-touch that allows swiping and other gestures. Abandoning the relative ease of planar coordinates implies not only smarter touch sensors, but algorithms behind them that can handle the challenges of projecting capacitance into curved space.
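To make the geometry concrete, here is a rough illustration of the kind of coordinate transform such an algorithm has to perform (not Atmel’s firmware, just a sketch). It takes a touch point reported in the sensor film’s flat x/y coordinates and maps it onto a display bent into a section of a cylinder of radius R, so application code can reason in the curved surface’s own frame.

```cpp
#include <math.h>
#include <cstdio>

// Touch point in the flat sensor film's coordinates (millimetres).
struct FlatTouch   { float x; float y; };
// Same point on the curved display: angle along the bend, height along
// the unbent axis, and a 3D position for hit-testing against UI elements.
struct CurvedTouch { float theta; float h; float px; float py; float pz; };

// Map a flat sensor coordinate onto a cylindrical section of radius r_mm.
// The film's x axis is assumed to wrap around the curve; y stays straight.
CurvedTouch mapToCurve(FlatTouch t, float r_mm) {
  CurvedTouch c;
  c.theta = t.x / r_mm;            // arc length -> angle (radians)
  c.h     = t.y;                   // unbent axis is unchanged
  c.px    = r_mm * sinf(c.theta);  // 3D position on the cylinder
  c.py    = t.y;
  c.pz    = r_mm * (1.0f - cosf(c.theta));
  return c;
}

int main() {
  CurvedTouch c = mapToCurve({30.0f, 12.0f}, 60.0f);  // 60 mm bend radius
  std::printf("theta=%.3f rad, pos=(%.1f, %.1f, %.1f) mm\n",
              c.theta, c.px, c.py, c.pz);
}
```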

Illustrating the potential for curved displays with touch interfaces in automotive design, AvantCar debuted at CES 2014. Courtesy Atmel.

 

Atmel fully appreciates the magnitude of this revolution, and through a combination of serendipity and good planning is in the right place at the right time to make curved touchscreens for wearables and the IoT happen. With CES becoming an almost-auto show, it was the logical place to showcase the AvantCar proof of concept, illustrating just what curves can do for touch-enabled displays in consumer design. (Old web design axiom, holds true for industrial design too: men tend to like straight lines and precise grids, women tend to like curves and swooshes – combine both in a design for the win.)

The metal mesh technology in XSense – “fine line metal” or FLM – means the touch sensor is fabricated on a flexible PET film, able to conform to flat or reasonably curved displays up to 12 inches. XSense uses mutual capacitance, with electrodes in an orthogonal matrix, really an array of small touchscreens within a larger display. This removes ambiguity in the reported multiple touch coordinates by reporting points independently, and coincidentally enables better handling of polar coordinates following the curve of a display using Atmel’s maxTouch microcontrollers.

Utilizing fine line metal - copper etch on PET film - Atmel's XSense touch sensor is able to conform to gently curved displays.

 

Now visualize this idea outside of the car environment, extended to a myriad of IoT and wearable devices. Gone are the clunky elastomeric buttons of the typical appliance, replaced by a shaped display with configurable interfaces depending on context. Free of the need for flat surfaces and mechanical switches in designs, touch displays can be integrated into many more wearable and everyday consumer devices.

Dr. Woo’s vision of flexible displays may be a bit early, but the idea of curved displays looks to be ready for prime time. The same revolution created by projected capacitance for touch in smartphones and tablets can now impact all kinds of smaller devices, a boon for user experience designers looking for more attractive and creative ways to present interfaces.

For more on the curved automotive console proof of concept, check out Atmel’s blog on AvantCar.

What do you think of the emergence of curved displays and the coming revolution in device design? How do you see curved touchscreens changing the way industrial designers think of the user interface on devices? Looking out further, what other technological improvements are needed?

This post has been republished with permission from SemiWiki.com, where Don Dingee is a featured blogger. It first appeared there on January 10, 2014.

A look at the chip side of the OIC

Maybe it’s my competitive analysis gene, or too many years spent hanging out with consortium types, but I’m always both curious and skeptical when a new consortium arises – especially in a crowded field of interest. The dynamics of who aligns with a new initiative, and how they plan to go to market compared to other entities, prompts deeper exploration.

We’ve had a spate of IoT consortia showing up in 2014, and it seems there is a new one popping up every day. These days, the operative word is open – open source is a requirement to get anywhere with developers and makers. Outside of the Industrial Internet Consortium which is dealing with a much gorier systems-of-systems problem in a more traditional embedded scenario, most of these new initiatives are targeting consumer connectivity.

Ecosystem polarization is also a theme. After all, technology has been driven by specification “wars” forever, where participants align with a side and play a game that often takes years or even decades before a decisive score. To get into the game, one has to be chosen for particular skills, and commit to a group with an idea.

Not surprisingly, the “captains” of these IoT consortia are coming from the smartphone experience. Qualcomm moved first by opening up AllJoyn into the AllSeen Alliance, with a roster recently topping 50 companies including the addition of Microsoft. Apple then served up HomeKit, Google responded with a Nest API, and BlackBerry is lurking with Project Ion.

What’s missing here? AllSeen has a couple of semiconductor companies, notably Imagination Tech and Silicon Image to go along with Qualcomm. For the most part, these initiatives are primarily software-oriented, with middleware, carriers, and device OEMs leading the way. I’ve been quoted as saying “software is the solution”, but the overall absence of microcontroller and SoC vendors in all this is a bit alarming.

People like to say the semi vendors want to be able to sell chips into any application – they’re agnostic. Hogwash, when it comes to this phase of the IoT. That’s like saying we’ll just wait around until we’re the last kid on the playground to be picked. If you’ve been reading me and others lately, it is clear we don’t have the right chips yet for many IoT applications, particularly wearables and always-on scenarios. The software won’t be right until the IoT silicon is right.

Chipmakers can’t afford to wait on the sidelines, hoping their standard fare gets picked up and fits in with one of these teams. They also can’t take the proprietary route, such as TI trying to draw cloud providers onto only their solutions – especially without IPv6 support for their Wi-Fi solutions yet. What we need is a much more collaborative discussion with a variety of viewpoints, including multiple semiconductor vendors working on a common cause.

The debut of the Open Interconnect Consortium may signal a change is coming. At a first glance, the co-captains are Intel and Samsung – a statement that affirms Tizen is headed straight for the IoT and a home and a wrist near you. (Yeah, yeah, the right words on Android and iOS and Windows and even RTOS platforms are there, but read the charter that says “must provide an open source implementation” again.) Exactly how the OIC proceeds is a bit nebulous; the specifications are yet to be defined, with only sweeping statements on creating an open, certified, and branded environment so far.

One thing is for sure: semiconductors are well represented. Right at the top of the OIC roster are two names that should get attention: Atmel, and Broadcom. Not coincidentally, these are two of the bigger names behind the maker movement, representing the Arduino and Raspberry Pi communities.

If one were to pick a chip vendor for an IoT team, Atmel seems the logical choice right now. They have a wealth of 6LoWPAN, Bluetooth, and ZigBee experience – and are adding depth with the July 2014 acquisition of Newport Media. They also have exposure with two of the more prominent open IoT operating systems, Contiki and RIOT. Atmel has a bird’s-eye view of what makers and startups are actually doing on the IoT. Their voice could prove very important in aligning chip design, software design, and use cases as the OIC develops specifications.

Broadcom is compelling for other reasons beyond their SoC presence. As a major Wi-Fi chipset vendor, they line up well as a counter to Qualcomm on the other side. Historically, Broadcom has been notoriously obstinate in refusing to provide open source drivers for solutions, even labeled “open source hostile” in some forums. Broadcom’s stance may finally be changing, perhaps in response to the AllJoyn momentum and a realization that makers are our new best friends on the IoT.

And, this isn’t the Haswell-fueled side of Intel talking, but their maker efforts in Edison and Galileo powered by Atom and Quark, and an acquisition of Basis. Wind River is also listed as a lead member, and as the embedded software operation of Intel, they have accumulated vast experience with Android, Linux, and Tizen that could serve this effort nicely.

Where AllSeen has a significant head start, the OIC is just entering the game. I’d watch what they do next, as they add members and release specifications. A lot will likely hinge on how well Tizen is accepted for IoT applications, who else lines up with support, and how interoperability with the other environments – open or proprietary – is established.

This post has been republished with permission from SemiWiki.com, where Don Dingee is a featured blogger. It first appeared there on July 9, 2014.

What’s not quite MCU, and not quite SoC?

There has been a lot of railing lately about how we don’t have quite the right chips for the upcoming wave of wearables. Chips one would drop in a smartphone are often overkill and overpowered, burning through electrons too quickly. Chips one would use for a simple control task generally lack peripherals and performance, offsetting their low power advantage.

One area we can feel this pressure building is in the Arduino community. Authentic Arduino boards have been Atmel AVR based, hosting 8-bit microcontrollers in a low-power, simple-to-use, and easy-to-program environment. The popularity of Arduino shields for I/O expansion has driven a number of compatible boards tossing 16- and 32-bit MCUs, and even some notable low-end SoCs, into the mix. Of course, straying into SoC territory introduces other popular form factors as well.

As with most technology with a sizable following, Arduino is now being forced to grow, as much due to its own success as due to external competition. Having mastered the basics of 8-bit, many makers are now stretching their capability and taking on more ambitious goals. At the same time, the pinout popularized by the Arduino Uno (ATmega328 MCU) remains in vogue, and changing too many things at once is bad form.

Arduino Zero

The recent debut of the Arduino Zero, and its first real-world appearance at Maker Faire Bay Area 2014 in May, marked the launch of the industrial-strength Atmel SAM D21 microcontroller with its ARM Cortex-M0+ core into the hearts and minds of Makers. This isn’t the first time a 32-bit ARM MCU has been seen in the Arduino neighborhood, but the endorsement with official Atmel-based hardware is a big step for the community.

A closer look reveals our budding trend of interest. The Atmel SAM D family is built for traditional control tasks, sans on-chip wireless and opting instead for ultra-low power and a mostly MCU-like peripheral mix. Revealed in the block diagram – note, on the Arduino Zero not all these features make it off the board – is a very SoC-like chip architecture, scaled down for power.

Atmel SAM D21 microcontroller block diagram

The Cortex-M0+ processor core in the Atmel SMART SAM D21 connects to three AHB bus segments plus memory ports via a high speed bus matrix, effectively separating traffic for memory, DMA and USB, power and clock management, and mixed signal I/O. One might wonder out loud what “high speed” means given a 48 MHz core frequency and the mix of peripherals shown, but let’s keep the context in mind.

Commenting on the current mania of smartphone SoCs being crammed into wearables with less than stellar results so far, I made a statement a few days ago I’ll stand by:

“A typical smartphone chip is 1W – for many wearables, that will be at least one and maybe two orders of magnitude off.”

The SAM D21 weighs in at less than 70uA/MHz, meaning fully clocked it pulls somewhere around 3.3mW – in the range of interest. Within that envelope, it can run full speed USB 2.0, unimpeded by other interrupts and without external components in “device” mode. An integrated I2S block brings full-rate audio streaming, and the obligatory Atmel integrated peripheral touch controller enables user interfacing. Also on chip are a flexible serial block, a multi-channel A/D, comparators, a D/A, plus a spate of timers.
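On the Arduino Zero, much of that mixed-signal hardware is already exposed through the standard core. A minimal sketch, assuming the stock Arduino SAMD core with its analogReadResolution/analogWriteResolution calls and the DAC on pin A0, could exercise the 12-bit ADC and the on-chip DAC like this:

```cpp
// Echo an analog input to the SAM D21's on-chip DAC.
// Assumes the Arduino SAMD core: 12-bit ADC reads, 10-bit DAC on pin A0.
void setup() {
  analogReadResolution(12);    // use the full 12-bit ADC range (0..4095)
  analogWriteResolution(10);   // drive the 10-bit DAC (0..1023)
}

void loop() {
  int sample = analogRead(A1);     // read a sensor wired to A1
  analogWrite(A0, sample >> 2);    // scale the 12-bit reading to the 10-bit DAC
}
```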

Strictly speaking, the SAM D21 is still an MCU, demarcated primarily in terms of on-chip flash and RAM and no external memory. By retaining ultra-low power behavior while segmenting buses and rethinking integration for mixed signal, USB, audio, and touch, Atmel has taken a first step into the space between control MCUs and smartphone SoCs – with an eye on what wearables actually need to succeed. It’s a smart move to get the Arduino community on board and find out where this can go.

This post has been republished with permission from SemiWiki.com, where Don Dingee is a featured blogger. It first appeared there on May 21, 2014.