Category Archives: Resources

Arduino and Seeed Studio announce partnership at Maker Faire Shenzhen


Seeed Studio will manufacture and distribute Arduino LLC products using the new Genuino brand in Asia.


Back in May, Massimo Banzi took the Maker Faire Bay Area stage for the highly-anticipated “State of Arduino” address. During what was surely one of the most popular sessions of the show, the Arduino co-founder announced a New York City manufacturing partnership with Adafruit, the availability of the Arduino Zero and Wi-Fi Shield 101, as well as the launch of a sister brand dubbed Genuino (“genuine” in Italian) for boards outside of the United States.

One month later at Maker Faire Shenzhen, Banzi has returned with some other big news: He and Eric Pan, founder and CEO of Seeed Studio, have unveiled a strategic partnership between Arduino LLC and Seeed Studio. Similar to their collaboration with Adafruit here in America, Seeed Studio will manufacture and distribute Arduino LLC products using the new Genuino brand in China and other Asian markets.

“The new Genuino name certifies the authenticity of boards, in line with the open hardware and open source philosophy that has always characterized Arduino,” Banzi explains. “We are very excited to partner with Seeed Studio to manufacture our products in China. We’ve known and appreciated Seeed for years, we share the same values and I think they are one of the most forward looking companies in China.”

As popular as Arduino has become throughout China, Banzi notes that the brand has been heavily used without permission. Fortunately, Genuino will allow the market to clearly identify which products are indeed authentic and contributing to the open source hardware process. The new brand will still offer the 8- and 32-bit boards that Makers have grown accustomed to over the years, such as the Uno (ATmega328) and Mega (ATmega2560), in the familiar teal and white color scheme.

“Arduino is becoming a global language of making, we are proud to help provide Genuino branded localized products to carry on the conversation in China. Here we already have a huge Arduino user base and growing, it’s time to get us involved deeper with global ecosystem,” Pan added.

Genuino-branded products will be sold on Seeed’s store on Taobao and on Genuino’s official site in the near future.

The Model 01 is an heirloom-grade, open source ergonomic keyboard


The Model 01 doesn’t look or feel like any keyboard you’ve ever had before. 


The arrangement of characters on a QWERTY keyboard was first introduced back in 1868 by Christopher Sholes, who happened to also be the inventor of the typewriter. As legend has it, Sholes organized the keys in their odd fashion to prevent jamming on mechanical typewriters by separating commonly used letter combinations. Other than adding a few function and arrow keys, the text entry device has remained relatively unchanged for nearly 150 years.

With just about everyone nowadays spending eight-plus hours typing away on their computers, too many of us are putting unnecessary strain on our wrists. Have you ever thought about how you might improve on the standard QWERTY layout? Well, Jesse Vincent and Kaia Dekker — who together make up Bay Area startup Keyboardio — have, with their butterfly-shaped keyboard that places a greater emphasis on the thumbs, lessens the stress on the pinkies and offers a more natural position for the hand and wrist, something that may prove a lifesaver for those suffering from carpal tunnel or arthritis. And sure, there are plenty of ergonomic keyboards on the market, but the Model 01 was designed specifically for serious typists.

“The traditional keyboard was designed for typewriters, not hands. Staggered columns made room for mechanical components, without concern for wrist angles or finger lengths. Shift keys were placed under the weakest fingers,” Vincent explains.

Instead, the Keyboardio team has put keys such as control, alt, delete, shift and a new ‘function’ button under the typist’s palms, all within easy reach of the thumbs. The duo says they have been experimenting with ways to eliminate the mouse altogether, using the W, A, S and D keys for general cursor movements and other keys to tell the pointer where to go on the screen.

“You can think of it as a function key or a special sort of shift. Dropping the base of your thumb onto it turns the H, J, K, and L keys into your arrow keys, turns the number keys into F-keys and even turns the WASD keys into a high-precision mouse.”
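Conceptually, that palm-key behavior is just a layered keymap lookup. The sketch below is a hypothetical illustration in C; the key names and table layout are invented for this example, not taken from the actual Keyboardio firmware:

```c
#include <stdint.h>

/* Hypothetical "fn layer" illustration: holding the palm key remaps
 * H/J/K/L to arrow keys. Key names and table layout are invented here,
 * not copied from the Keyboardio firmware. */

enum keycode { KEY_H, KEY_J, KEY_K, KEY_L,
               KEY_LEFT, KEY_DOWN, KEY_UP, KEY_RIGHT };

/* Base layer: each physical key produces its printed character. */
static const enum keycode base_layer[4] = { KEY_H, KEY_J, KEY_K, KEY_L };
/* Fn layer: the same keys become Vi-style arrows. */
static const enum keycode fn_layer[4]   = { KEY_LEFT, KEY_DOWN, KEY_UP, KEY_RIGHT };

/* Resolve a physical key (0..3) to a keycode, depending on the fn key. */
enum keycode resolve_key(int physical_key, int fn_held)
{
    return fn_held ? fn_layer[physical_key] : base_layer[physical_key];
}
```

The same table-swap idea extends naturally to the number row becoming F-keys, or WASD becoming mouse movement.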

Not only does the Model 01 ship with its source code and a screwdriver, users can even define custom key layouts or macros based on the application currently running on the PC. That means typists can assign complex sequences of keystrokes and mouse movements to a single key press through a simple program — on any computer running OS X, iOS, Linux, Windows or Android.

The modular keyboard is built around a versatile ATmega32U4 along with battery charging circuitry, Worldsemi WS2812B LEDs and a Bluetooth module — all housed inside two blocks of CNC-milled solid maple. The keyswitches, which boast a lifetime of 50 million presses, are Matias Quiet Click ALPS-mount switches with an ultra-bright, colorful LED located under each one. Its creators have custom sculpted each of the Model 01’s 64 individual keycaps to gently guide a typist’s fingers to the right keys. Beyond that, the Model 01 features a USB interface.

“For a variety of reasons, many USB keyboards limit you to pressing six keys (plus modifiers) at once. Most of us would never notice this limitation, but an intrepid few really, really need to be able to hit more than six keys at once,” Vincent writes. “If you need True N-key rollover (NKRO), we’ve got you covered. The NKRO-over-USB technique we’re using works great on Windows, MacOS X and Linux without any special drivers.”
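The usual way NKRO is carried over USB is to replace the six-slot boot-protocol report with a bitmap that reserves one bit per HID keycode, so any number of keys can be down at once. Here is a minimal sketch of that idea; the report size is illustrative, not Keyboardio’s actual descriptor:

```c
#include <stdint.h>
#include <string.h>

/* Sketch of an NKRO-style USB report: one bit per keycode instead of
 * six one-byte slots. A 32-byte bitmap covers keycodes 0..255. */

#define NKRO_REPORT_BYTES 32

typedef struct { uint8_t bits[NKRO_REPORT_BYTES]; } nkro_report_t;

void nkro_clear(nkro_report_t *r)
{
    memset(r->bits, 0, sizeof r->bits);
}

void nkro_press(nkro_report_t *r, uint8_t keycode)
{
    r->bits[keycode / 8] |= (uint8_t)(1u << (keycode % 8));
}

void nkro_release(nkro_report_t *r, uint8_t keycode)
{
    r->bits[keycode / 8] &= (uint8_t)~(1u << (keycode % 8));
}

int nkro_is_pressed(const nkro_report_t *r, uint8_t keycode)
{
    return (r->bits[keycode / 8] >> (keycode % 8)) & 1;
}
```

Because every keycode has its own bit, pressing a seventh (or seventieth) key never overflows the report.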

With its aesthetically-pleasing maple wood exterior, Vincent believes the Model 01 can be the first computer accessory made to “heirloom grade.” While Keyboardio may initially appeal to the enthusiast crowd, the open source nature of the gadget will certainly entice hardware and software fans to offer their own set of modifications as well.

Though it ships with the default QWERTY arrangement, the unit also “speaks” Dvorak, Colemak, Workman and a variant of the Malt layout. What’s more, the Model 01 has an “any” key — whose function is left to the imagination of the beholder. Does it look like an ergonomic keyboard that you’d love to have at home or in the office? Click over to its Kickstarter campaign, where Keyboardio is currently seeking $120,000. Shipment is set to commence in May 2016.

Thingsquare is putting the IoT at your fingertips


This IoT platform enables users to build their connected product in a matter of days.


Thingsquare, an IoT startup that has emerged as one of the pioneers of connected product development, has launched an open prototyping tier enabling engineers, designers and Makers to envision and prototype their smart devices in a matter of minutes.

For those unfamiliar with Thingsquare, the all-in-one software platform provides Makers with all of the tools necessary to quickly add Internet connectivity to a product via smartphone. Ultimately, this easy-to-use solution reduces the time typically required to bring an idea to mass market from months to just days.

The platform works by connecting smart devices, such as lights and thermostats, which have a programmable wireless chip running the Thingsquare firmware. The wireless MCU and the firmware securely sync the gadget to the cloud backend server that handles the API for the app. From there, Thingsquare builds a resilient wireless mesh network where one router offers seamless Internet access for all mesh nodes, also allowing users to upgrade their firmware over the air.

“Devices form a wireless mesh network and connect to the Internet. Devices use their Internet connection to authenticate with the Thingsquare cloud and begin announcing their presence. The smartphone app discovers devices and authenticates with the Thingsquare cloud. Users can login and control devices either locally or remotely. The app can notify the user if something important happens,” the team explains.

Thingsquare has even made it possible to try a minimalist version of the app without any hardware, by providing a built-in virtual hardware mechanism that lets a user run the platform from their phone.

“A virtual device acts as a real wireless hardware device, but runs as software on your smartphone. To the Thingsquare platform, the virtual device looks just like a normal hardware device. Virtual devices send and receive data in the same way as wireless hardware devices do.”

As for the hardware, the solution will support a wide range of SoCs — most notably the Atmel | SMART SAM R21. This calls for at least a pair of SAM R21 Xplained PRO evaluation boards, two microUSB cables (one for each device), an Atmel Ethernet1 Xplained PRO extension board, an Ethernet cable, a Wi-Fi router with an Ethernet port, as well as a PC for uploading the firmware to the chips.

What’s nice is that the Cortex-M0+ processor supports external devices on GPIO pins that can be controlled from the smartphone. The SAM R21 creates a self-healing wireless mesh, with one MCU acting as an Ethernet gateway via the Xplained PRO Ethernet extension board. This process, including all of the necessary code, has been made available on GitHub.

What’s more, the newly-revealed open prototyping tier will help resolve a number of problems often encountered throughout development. This is accomplished by providing wireless connectivity by way of a self-healing and self-forming mesh network, a simple app that users can build themselves, and if necessary, secure remote access.

“The cool thing with connected product is how many different markets it touches. Anything that benefits from being connected is rapidly becoming connected,” the startup adds. “Further, the Thingsquare platform lets you put your next product’s app in the hands of your potential customers right from the start, and provide remote support.”

As evidenced by the sheer number of malicious hacks in recent months, smart gadgets require protection, something the company has embedded into its platform from the start through secure authentication. Beyond that, other features of the app include discovering, interacting with, positioning and sensing nearby devices, as well as collecting data from the wireless mesh. At the moment, the app runs on iOS (version 8.0) and Android (version 4.3) smartphones.

“Our customers are demanding complete, easy-to-use IoT solutions that can quickly bring a full system to market,” explains Magnus Pedersen, Atmel Product Marketing Director. “Our cooperation with Thingsquare is an example of that, with a web-based toolchain and open source firmware to offer our customers a fully integrated hardware and software solution for various IoT applications.”

Ready to get started designing your first IoT gizmo? If so, check out Thingsquare’s open prototyping tier. Meanwhile, those wishing to learn more about how the platform works can do so here.

Qualtré debuts 11-DOF MEMS sensor platform


New platform spurs innovation by simplifying evaluation and the development of sensor fusion algorithms.


Qualtré, Inc, a leader in the development and commercialization of Bulk Acoustic Wave MEMS inertial sensors, has debuted a MEMS sensor evaluation platform with 11 degrees of freedom (DOF). This evaluation platform combines three axes of gyroscopic data, three axes of accelerometer data, three axes of magnetic data, as well as barometric pressure/altitude and temperature. The company’s sensor fusion application software library leverages the Atmel | SMART SAM4E Cortex-M4 MCU.

“With an integrated sensor fusion framework, designers can focus on their unique motion based application,” explains Dr. Sreeni Rao, Qualtré’s VP of Vertical Markets. “It’s all about bringing the relevant data together from multiple sensors to provide a more comprehensive and accurate picture of what’s going on in a system. The Qualtré 11-DOF evaluation platform makes it easy to interface multiple sensors and get started immediately writing, compiling and running sensor based applications which can easily be ported to the end-user platform.”

The current version of the sensor fusion platform provides software support for a number of functions, including Wi-Fi-based 11-DOF real-time telemetry, sensor fusion quaternion outputs, corrected heading direction and second order temperature compensation.

Generally speaking, a key challenge in sensor fusion is effectively separating signal, motion and noise. Qualtré’s algorithms take data from different sensors observing the same event to distinguish between noise and signal, then compute more accurate information. Sensor fusion encompasses a variety of techniques that leverage the environmental monitoring of the individual sensors and combine their outputs intelligently to achieve broader and more precise results.

Calling all Makers, visionaries and innovators up for a (IPSO) Challenge!


How do you IPSO? There are many problems in everyday life that can be solved by collecting data through sensors, or by controlling smart objects based upon inputs from a variety of sources.


Once again, the IPSO Alliance has initiated its annual challenge, whose deadline for proposals is quickly approaching!

The IPSO CHALLENGE was launched as a way to show what is possible using the Internet Protocol (IP) and open standards to build the Internet of Things. Enter this global competition by submitting a proposal before July 15, 2015 for a working prototype that is innovative, marketable and easy to use.

Just a few weeks ago, I had the opportunity to speak to a group of potential IPSO CHALLENGE participants in Colorado Springs, Colorado. The meetup was created to let attendees learn about the challenge, mingle with like-minded individuals, and find team members with the skills needed to implement ideas already under consideration, or simply to meet others with similar interests and come up with an innovative project proposal.

As a proud sponsor of the IPSO CHALLENGE 2015, my goal on behalf of Atmel was to describe how our wireless and MCU solutions can be used to form the basis of the hardware and software platforms that should be considered for a number of innovative IP-based challenge entries.

The incentive? Over $17,500 in prizes is up for grabs, with first place taking home $10,000, $5,000 for the runner-up and $2,500 for third. There are many problems in everyday life that can be solved by collecting data through sensors, or by controlling smart objects based upon inputs from a variety of sources. The Internet of Things and the Internet Protocol are a smart choice as the means to publish and subscribe to sensor information, making it available for processing in the cloud or delivering it to mobile devices for viewing or notification anywhere in the world.

One of the development kits that is being promoted for use in the IPSO CHALLENGE is the ATSAMR21-XPRO evaluation board. This kit supports the ATSAMR21 (IEEE 802.15.4-compliant single-chip wireless solution) wireless “system in package” device.

The device contains an ARM Cortex-M0+ microcontroller plus the AT86RF233 2.4GHz 802.15.4 radio. This combination makes it a perfect solution wherever a low-power wireless sensor or actuator is required as an element of the hardware platform needed to implement your CHALLENGE entry.

The SAM R21 is an ideal platform to support a 6LoWPAN wireless mesh network, with sensors that can measure and collect data or control outputs, while also having the ability to transfer this information to the cloud, or to any Internet-connected PC or mobile device anywhere in the world.

SAM R21 device IO assignments:

SAMR21

Atmel recently released its SmartConnect 6LoWPAN, a wireless stack firmware package that provides an IPv6 6LoWPAN implementation running on the SAM R21 evaluation kit, among a number of other Atmel platforms. Additionally, a number of example applications for SmartConnect 6LoWPAN are provided in the free Atmel Studio 6.2.

The example that I demonstrated during the IPSO meetup was the MQTT (MQ Telemetry Transport) example. MQTT is a publish/subscribe protocol that allows the SAM R21 SmartConnect 6LoWPAN solution to implement topics like /Atmel/IoT/temperature or /Atmel/IoT/LED, then subscribe or publish to these topics while also allowing other devices to do the same. This enables all of these devices to work together in collecting and processing the content of many distributed sensors.
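The heart of MQTT’s publish/subscribe model is matching a published topic against subscription filters. A rough sketch of that rule is below; it handles the single-level ‘+’ and multi-level ‘#’ wildcards, simplified compared with the full MQTT specification:

```c
/* Simplified MQTT topic matching: '+' matches exactly one topic level,
 * '#' matches the remainder of the topic. Returns 1 on match, 0 otherwise. */
int mqtt_topic_matches(const char *sub, const char *topic)
{
    while (*sub) {
        if (*sub == '#')                  /* '#' matches everything after */
            return 1;
        if (*sub == '+') {                /* '+' matches one level */
            while (*topic && *topic != '/')
                topic++;
            sub++;
        } else {
            if (*sub != *topic)           /* literal characters must agree */
                return 0;
            sub++;
            topic++;
        }
    }
    return *topic == '\0';                /* both must end together */
}
```

A subscriber to /Atmel/IoT/+ would therefore receive both the temperature and LED topics mentioned above, while /Atmel/# would receive everything under the Atmel tree.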

This is a very simple protocol that needs only a small amount of memory, and it allows one to create a very effective distributed processing solution, where IP enables communication and data transfer between all of the elements within the network.

SmartConnect 6LoWPAN, as with most 6LoWPAN solutions, makes use of the RPL mesh networking routing protocol. This lets these low-power SAM R21 802.15.4 radios transfer data over longer distances through the wireless mesh, because each node only has to transfer the data to its nearest neighbor or parent in the network that was formed.
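The core of that hop-by-hop behavior is parent selection: every node advertises a “rank” (roughly, its distance from the border router) and forwards upward traffic to the neighbor with the lowest rank. The sketch below is a highly simplified model of that idea; real RPL (RFC 6550) also weighs link quality through an objective function:

```c
#include <stdint.h>

/* Simplified model of RPL parent selection: pick the neighbor with the
 * lowest advertised rank, provided it is closer to the root than we are.
 * Real RPL also factors in link metrics; this is illustrative only. */

#define NO_PARENT -1

typedef struct {
    int      id;     /* neighbor node identifier */
    uint16_t rank;   /* advertised distance from the border router */
} neighbor_t;

int select_preferred_parent(const neighbor_t *nbrs, int count, uint16_t my_rank)
{
    int best = NO_PARENT;
    uint16_t best_rank = my_rank;   /* a parent must be closer to the root */
    for (int i = 0; i < count; i++) {
        if (nbrs[i].rank < best_rank) {
            best_rank = nbrs[i].rank;
            best = nbrs[i].id;
        }
    }
    return best;
}
```

If no neighbor is closer to the root than the node itself, there is no usable parent and the node must wait for the mesh to reform.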

Let’s take a look at a simple example of a problem with a 6LoWPAN wireless mesh network solution: Your children take a school bus to school every morning. If you could know when the bus was in the neighborhood, or approaching the nearest stop, life would be a lot easier in inclement weather.

So you gather together a few SAM R21 kits and battery packs, and start to think about a solution.

Since you would need to know where the bus is at some distance from your home, this eliminates “wired” solutions, and since you probably would not have access to “mains power” at many of the sensing locations, the solution requires low-power, battery-operated wireless sensors. As it just so happens, the SAM R21 makes a perfect low-power, battery-operated wireless sensor. The SmartConnect 6LoWPAN wireless mesh network firmware would allow you to cover an extended range by placing additional routing sensors where needed to keep track of the bus, and to relay data from other sensors that are too far away by radio to reach your home base unit directly.

Given that you will need access to a fence post, a mailbox or a telephone pole on your neighbors’ property in order to mount your small wireless sensors, you can tell them that they, too, can access this data to keep track of the school bus, or just about anything in the neighborhood that has a mobile tag placed on it, whether it’s a young child’s backpack or jacket, a pet’s collar, etc.

There needs to be one root location where all of the sensor data is transferred, and this location acts as the border router (or DAG root) of the 6LoWPAN network. It is also implemented using the SAM R21 evaluation kit, along with an Ethernet1 Xplained PRO interface board. This border router hardware would be located in your house and plugged into a spare Ethernet port of the home access point that provides Internet service to your home. Future options could also allow using Wi-Fi instead of Ethernet to make the connection to your home access point.

A mobile sensor/tag will need to be placed on the bus. Hopefully you can get permission to attach a small sensor inside the bus with double-sided tape, ask the bus driver to carry it, or have one of the kids who boards early in the route clip the mobile sensor to their backpack or belt. How and where to place these mobile tag sensors may actually be one of the most difficult parts of the solution.

Once you have the mobile sensors in place on the bus, kids, dogs and cats, you need to set up the sensor mesh around the neighborhood.

Atmel provides a tool called Atmel Wireless Composer.

This free tool has a very nice feature that allows range testing to be done by one person. Place one SAM R21 device in a fixed location, then take a battery-operated remote node for a walk around your neighborhood. You can use this method to determine the typical range you can achieve and to check potential mounting spots. This ensures that you can establish reliable wireless communications and find the best location for the nearest neighboring node.

Remember to ask permission before you mount a sensor node on someone else’s property.

As you turn on the remote nodes, they will make their presence known to the network, and a route will be discovered back to the root node at your home.

Once you have established your network, a number of SmartConnect 6LoWPAN example applications can be used to move data around it. Using the MQTT example previously mentioned, units can publish which “mobile” tags are within wireless range of each sensor, providing a coarse location system that notifies anyone subscribing to a particular topic of the current location of the bus, child, dog or cat.
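A fixed node’s “sighting” report could be as simple as a topic naming the sensor and a small payload naming the tag it can hear. The sketch below is hypothetical: the topic scheme, field names and IDs are invented for illustration and are not part of the SmartConnect examples:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical sighting report: the topic identifies the fixed sensor,
 * the JSON payload identifies the tag heard and its signal strength.
 * Topic scheme and names are invented for this illustration. */
int format_sighting(char *topic, size_t topic_len,
                    char *payload, size_t payload_len,
                    const char *sensor_id, const char *tag_id, int rssi_dbm)
{
    int n = snprintf(topic, topic_len, "/Atmel/IoT/sightings/%s", sensor_id);
    if (n < 0 || (size_t)n >= topic_len)
        return -1;                                 /* topic truncated */
    n = snprintf(payload, payload_len,
                 "{\"tag\":\"%s\",\"rssi\":%d}", tag_id, rssi_dbm);
    return (n < 0 || (size_t)n >= payload_len) ? -1 : 0;
}
```

A subscriber watching /Atmel/IoT/sightings/# would then see the bus “move” from sensor to sensor as successive nodes publish it.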

You can find the Example projects within Atmel Studio 6 as shown below:

ExampleProj

ExampleProj1

The power of the Internet Protocol and the cloud in this system is that each individual sensor has its own IPv6 address. The data collected by the end sensor nodes is packaged into an IP frame and transferred through the wireless network, then through the border router to the wired Internet, and finally to the cloud, without having to convert or change protocols. Today there are so many devices that can make use of this data, including smartphones, tablets, laptops, and home automation hubs and gateways. What you can do with this data has endless possibilities.

Applications for these Internet-connected devices can be created to show the location of the bus or pet on a map, or maybe just send a simple notification of “School bus currently at the Smith family residence.” Again, the possibilities are endless.

Maybe you would also like to turn on your house lights or open your garage door as you approach home, using a sensor mounted in your car. The info in the cloud can be integrated with your home automation system to control the lights and garage door.

Now that you have completed the proof of concept using the Atmel | SMART SAM R21 evaluation boards (or, hopefully, now that you have won the IPSO CHALLENGE!), you will want to turn your prototype into a deployable product.

Atmel has the solution for you. SAM R21 modules are being developed in a small form factor that will allow the creation of a small, battery-operated mobile tag or sensor unit. These modules come with an FCC certification ID and a proven RF design, eliminating the challenge, cost and time required to develop a wireless product from scratch.

Feeling inspired? Submit your idea today before time runs out!

Casa Jasmina opens its (smart) doors


Located in Torino, Casa Jasmina is a first of its kind smart apartment.


Several months after its announcement at Maker Faire Rome, the (presumably smart) doors of Casa Jasmina have officially been opened. A collaborative effort between Massimo Banzi and futurist Bruce Sterling, with support from Arduino, the smart apartment is the first of its kind, combining contemporary Italian interior and furniture design with an array of open source electronics, many of which are built around Atmel microcontrollers.

Unlike other so-called “homes of the future,” this Arduino-powered space — which takes its name from Sterling’s wife Jasmina Tešanović — will be more than a livable showcase. In fact, it will serve as a hybridized IoT research lab as it monitors its inhabitants’ responses to the ambient elements inside and will soon become a publicly available, short-term rental property on Airbnb.

The June 6th opening of Casa Jasmina coincided with the second annual Torino Mini Maker Faire, featuring a number of public areas, discussions, appearances and, impressively, an exhibit area with over 50 Maker projects. Those on display included works from Jesse Howard, Aker, Opendesk and Open Structure. Aside from the initial batch of IoT creations, the abode boasted several electrical products from the Energy@Home consortium, Internet of Things artwork from the Torino Share Festival, and the first wave of prototypes from Casa Jasmina’s ‘Call for Projects.’

Among the ambient objects found throughout the living quarters are a wireless lamp designed out of Tetra Pak packaging, an LED lamp made from a milk carton and an Arduino Leonardo (ATmega32U4)-driven piece of artwork that emits different patterns of colored lights in response to fluctuations in background radioactivity.

With several industry heavyweights and dedicated communities surrounding the project, Casa Jasmina will certainly continue to attract interesting innovations, guests and intelligent things to populate the apartment. Looking ahead, it will even play host to various residencies, talks and workshops.

Italy couldn’t have been a better home for the world’s first connected, open source apartment. According to a new report, the country’s Internet of Things market is expected to reach €1.55 billion ($1.75 billion) this year, with smart home products leading the way. Not only did one in four respondents report already having an intelligent object in their house, nearly half (46%) said they are willing to purchase an Internet-enabled gadget or service in the near future.

“Most Maker objects today have been for the laboratory, or they have been for the university, or they have been for design school. They haven’t really been made for a domestic purpose. They aren’t for family, they aren’t for young children, they’re not for the elderly, for the cat, for the dog, for the houseplant. They are mostly there for the geek who is buying the hardware and is in command of the user base. I think its time for the Maker scene to expand out of its limits and try to talk to a wider demographic,” Sterling revealed in a recent interview.

Casa Jasmina welcomed its first guests on June 6, 2015 and will run for two years. Want to follow along with the initiative’s progress? Head over to its official page here.

Percepio Trace: Improving response time


Discover how a developer used Tracealyzer to compare runtime behaviors and improve response time.


With time-to-market pressures constantly on the rise, advanced visualization support is a necessity nowadays. For those who may be unfamiliar with Percepio, the company has set out to accelerate embedded software development through world-leading RTOS tracing tools. Tracealyzer gives Makers, engineers and developers alike a new level of insight into the run-time world, allowing for improved designs, faster troubleshooting and higher performance. What has made it such a popular choice among the community is that it works with a wide range of operating systems and is available for Linux and FreeRTOS, among several others.

When developing advanced multi-threaded software systems, a traditional debugger is often insufficient for understanding the behavior of the integrated system, especially regarding timing issues. Tracealyzer is able to visualize the run-time behavior through more than 20 innovative views that complement the debugger perspective. These views are interconnected in intuitive ways which makes the visualization system powerful and easy to navigate. Beyond that, it seamlessly integrates with Atmel Studio 6.2, providing optimized insight into the run-time of embedded software with advanced trace visualization.

Over the next couple of months, we will be sharing step-by-step tutorials from the Percepio team, collected directly from actual user experiences with Tracealyzer. In the latest segment, we looked at how a developer used Tracealyzer to solve an issue with a randomly occurring reset; today, we’re exploring how the tool can help improve response time.


In this scenario, a user had developed a networked system containing a TCP/IP stack, a Flash file system and an RTOS running on an ARM Cortex-M4 microcontroller. The system comprised several RTOS tasks, including a server-style task that responds to network requests and a log file spooler task. The response time on network requests had often been an issue, and when testing their latest build, the system responded even slower than before. So, as one can imagine, they really wanted to figure this out!

But when comparing the code of the previous and new version, they could not find any obvious reason for the lower response time of the server task. There were some minor changes due to refactoring, but no significant functions had been added. However, since other tasks had higher scheduling priority than the server task, there could be many other causes for the increased response time. Therefore, they decided to use Tracealyzer to compare the runtime behaviors of the earlier version and the new version, in order to see the differences.

They recorded traces of both versions under similar conditions and began the comparison at the highest level of abstraction, i.e., the statistics report (below). This report can display CPU usage, number of executions and scheduling priorities, but also metrics like execution time and response time calculated for each execution of each task and interrupt.

1

As expected, the statistics report revealed that response times were, in fact, higher in the new version — about 50% higher on average. The execution times of the server task were quite similar, only about 7% higher in the latter. The reason for the greater response time therefore had to be interference from other tasks.

To determine what was causing this disparity, one can simply click on the extreme values in the statistics report. This focuses the main trace view on the corresponding locations, enabling a user to see the details. By opening two parallel instances of Tracealyzer, one for each trace, you can compare and see the differences — as illustrated below.

2

Since the application server task performs several services, two user events have been added to mark the points where the specific request is received and answered, labeled “ServerLog.” The zoom levels are identical, so you can clearly see the higher response time in the new version. What’s more, this also shows that the logger task preempts the server task 11 times, compared to only 6 times in the earlier version — a pretty significant difference. Moreover, it appears that the logger task is running at a higher priority than the server task, meaning every logging call preempts the server task.

So, it seems new logging calls added in the new version are causing the logger task to interfere more with the server task. In order to observe what is logged, a user event can be added in the logger task to show the messages in the trace view. Perhaps some can be removed to improve performance?

3

Now it’s evident that other tasks also generate logging messages that affect the server task’s response time, for instance the ADC_0 task. To see all tasks sending messages to the logger task, one can use the communication flow view — as illustrated below.

[Screenshot: the communication flow view]

The communication flow view is a dependency graph summarizing all operations on message queues, semaphores and other kernel objects. Here the view covers the entire trace, but it can also be generated for a selected interval (and likewise for the statistics report). For example, a user can see how the server task interacts with the TCP/IP stack. Note the interrupt handler named "RX_ISR" that triggers the server task using a semaphore, e.g. when there is new data on the server socket, and the TX task used for transmitting over the network.

But back to the logger task: the communication flow reveals five tasks that send logging messages. Double-clicking the "LoggerQueue" node in the graph opens the Kernel Object History view, which shows all operations on this message queue.

[Screenshot: Kernel Object History view for LoggerQueue]

As expected, you can see that the logger task receives messages frequently, one at a time, and blocks after each message, as indicated by the "red light."

Is this really a good design? It is probably not necessary to write the logging messages to file one by one. If the scheduling priority of the server task were raised above that of the logger task, the server task would not be preempted as frequently and would thus respond faster. The logging messages would be buffered in LoggerQueue until the server task (and any other high-priority tasks) had completed; only then would the logger task resume and process all buffered messages in a batch.

Trying exactly that, the screenshot below shows the server task instance with the highest response time after its scheduling priority was raised above the logger task's.

[Screenshot: the trace after the priority change]

The highest response time is now just 5.4 ms instead of 7.5 ms, which is even faster than in the earlier version (5.7 ms) despite the additional logging. This is because the logger task no longer preempts the server task; instead it processes all pending messages in a batch once the server task has finished. Here one can also see "event labels" for the message queue operations. As expected, there are several "xQueueSend" calls in sequence, without blocking (which would appear as red labels) or task preemptions. There are still preemptions by the ADC tasks, but these no longer cause extra activations of the logger task. Problem solved!

The screenshot below displays LoggerQueue after the priority change. In the right column, one can see how the messages are buffered in the queue, enabling the server task to respond as fast as possible; the logging messages are then processed in a batch.

[Screenshot: LoggerQueue after the priority change]

ASUS Z300 tablet is the world’s first on-cell touchscreen with active stylus pen support


The ASUS Z300 on-cell tablet provides a perfect ‘pen-to-paper’ writing experience thanks to Atmel maXTouch and maXStylus controllers.


ASUS has made quite a few announcements over the last couple of days at Computex 2015, including an all-in-one PC, a full-featured smartphone for selfies, a second-generation ZenWatch, and a range of tablets in various sizes. Among those devices was the 10.1″ Z300, which features the world's first on-cell touchscreen with capacitive active stylus pen support, enabling a precise 'pen-to-paper' writing experience for content creation in today's digital world.

[Image: ASUS ZenPad 10 (Z300C/Z300CG/Z300CL)]

To accomplish this, the company has selected Atmel’s maXTouch controllers to power the touchscreen and active stylus pen of its newly-launched tablet. The ASUS Z300 tablet’s touch display is driven by a maXTouch T-series touchscreen controller, which features a revolutionary sensing architecture that combines both mutual and self-capacitance to enhance performance.

“As a leading provider of innovative mobile devices for the worldwide market, ASUS continues to bring superior products to market,” explained Shar Narasimhan, Atmel Senior Product Manager of Touch Marketing. “The selection of Atmel’s maXTouch controllers for the industry’s first 10.1″ on-cell tablet with capacitive active stylus by ASUS is further testament that we are enabling OEMs to deliver leading-edge digital lifestyle products.”


What’s more, the device uses one of the industry’s most advanced capacitive styli, Atmel’s maXStylus mXTS220 — the only active pen with noise immunity capable of operating in the high display noise environment emitted by ultra-thin on-cell stack-ups. Together, the maXStylus and maXTouch integrate seamlessly to create a flawless user experience in even the most demanding conditions.

“As a leading manufacturer of mobile devices, our products are only built with world-class components,” added Samson Hu, ASUS Corporate Vice President & GM of the Mobile Product Business Unit. “Atmel’s industry-leading stylus capabilities enabled us to deliver a much thinner on-cell display stack for more elegant designs with a best-in-class active pen experience. We look forward to launching more advanced devices with intuitive human interfaces powered by Atmel.”

IAR Systems adds powerful code analysis possibilities for 8-bit AVR developers


New version of IAR Embedded Workbench for AVR introduces static code analysis and stack usage analysis.


IAR Systems has unveiled version 6.60 of its IAR Embedded Workbench for AVR microcontrollers. The update extends code analysis possibilities with the integration of static code analysis tools and stack usage analysis.


The latest version of IAR Embedded Workbench for AVR adds support for IAR Systems’ static analysis add-on product C-STAT. Completely integrated within the IAR Embedded Workbench IDE, C-STAT can perform numerous checks for compliance with rules as defined by the coding standards MISRA C:2004, MISRA C++:2008 and MISRA C:2012, as well as rules based on CWE (the Common Weakness Enumeration) and CERT C/C++. By using static analysis, developers can identify errors such as memory leaks, access violations, arithmetic errors, and array and string overruns at an early stage to ensure code quality and minimize the impact of errors on the finished product and on the project timeline.

Additionally, version 6.60 introduces stack usage analysis. Because the stack is a fundamental property of an embedded application, setting it up properly is essential for ensuring the application's stability and reliability. However, calculating the required stack space is notoriously difficult for all but the smallest of systems. This challenging task is greatly simplified by access to information about the application's worst-case maximum stack depth. Enabling stack usage analysis in IAR Embedded Workbench provides just that, adding listings of the maximum stack depth for each call graph root to the linker map file. The analysis process can be customized to take into account constructs such as calls via function pointers and recursion.


”The new functionality in IAR Embedded Workbench provides great advantages for our customers,” explains Steve Pancoast, Atmel VP of Software Applications, Tools and Development. “Developers can leverage the new analysis possibilities to improve the quality of their code, as well as streamline their development process. Atmel’s strong partnership with IAR Systems gives our customers access to world-leading tools across our entire range of AVR and Atmel | SMART ARM-based microcontrollers and microprocessors.”

IAR Embedded Workbench for AVR is a complete set of high-performance C/C++ tools featuring world-leading code optimizations that create compact, fast-performing code. Version 6.60 also introduces parallel build: the user can optionally set the compiler to run several processes simultaneously, which can significantly reduce build times.

Atmel tightens automotive focus with new Cortex-M7 MCUs


Large SoCs without an Ethernet interface typically have slow start-up times and high power requirements. Atmel's new Cortex-M7 MCUs aim to change that.


Atmel, a lead partner for the ARM Cortex-M7 processor launch in October 2014, has unveiled three new M7-based microcontrollers with a unique memory architecture and advanced connectivity features for the connected car market.

According to a company spokesman, the E70, V71 and V70 chips are the industry's highest-performing Cortex-M microcontrollers, with a six-stage dual-issue pipeline delivering 1,500 CoreMarks at 300 MHz. Moreover, the V70 and V71 are the only automotive-qualified ARM Cortex-M7 MCUs with Audio Video Bridging (AVB) over Ethernet and MediaLB peripheral support.

[Image: ARM Cortex-M7 chip diagram]

Atmel is among the first suppliers to introduce ARM Cortex-M7-based MCUs, whose core combines performance and simplicity and further pushes the performance envelope for embedded devices. The new devices aim to take connected car designs to the next performance level with high-speed connectivity, high-density on-chip memory, and a solid ecosystem of design engineering tools.

Atmel’s Memory Play

Atmel has memory technology in its DNA, and that is apparent in the design of the E70, V70 and V71 MCUs. The San Jose-based chipmaker offers a flexible memory system optimized for performance, determinism and low latency.

Jacko Wilbrink, Senior Marketing Director at Atmel, said that the company's Cortex-M7-based MCUs leverage Atmel's advanced peripherals and flexible SRAM architecture for higher-performance applications while retaining the Cortex-M class ease of use. He added that the large on-chip SRAM on the SAM E70/V70/V71 chips is critical for connected car and IoT product designers, since it allows them to run multiple communication stacks and applications on the same MCU without adding external memory.

[Image: on-chip DMA and low-latency access SRAM architecture]

Avoiding external memories reduces the PCB footprint, lowers the BOM cost and eliminates the complexity of high-speed PCB design when pushing performance to the maximum. Next, Tim Grai, another senior manager at Atmel, pointed out another critical takeaway from the Cortex-M7 design: the tightly coupled memory (TCM) interface. It provides low-latency memory that the processor can use without the unpredictability inherent in cache memories.

Grai says that the most vital memory feature is not the memory itself but how the TCM interface to the M7 is utilized. “The available RAM is configurable to be used as system RAM or tightly-coupled instruction and data memory to the core, where it provides deterministic zero-wait state access,” Grai added. “The arrangement of SRAM allows for multiple concurrent accesses.”

Cortex-M7 a DSP Winner

According to Will Strauss, President & Principal Analyst at Forward Concepts, ARM has had considerable success with its Cortex-M4 power-efficient 32-bit processor chip family. “However, realizing that it lacked the math ability to do more sophisticated DSP functions, ARM has introduced the Cortex-M7, its newest and most powerful member of the Cortex-M family.”

Strauss adds that the M7 provides 32-bit floating point DSP capability as well as faster execution times. With the greater clock speed, floating point and twice the DSP power of the M4, the M7 is even more attractive for applications requiring high-performance audio and even video accompanying traditional automotive and control applications.

Atmel’s Grai added an interesting dimension to the DSP story in Cortex-M7 processor fabric. He pointed out that true DSPs don’t do control and logical functions well and generally lack the breadth of peripherals available on MCUs. “The attraction of the M7 is that it does both—DSP functions and control functions—hence it can be classified as a digital signal controller (DSC).”

Grai cited the example of Atmel V70 and V71 microcontrollers used to connect end-nodes like infotainment audio amplifiers to the emerging Ethernet AVB network. In an audio amplifier, you receive a specific audio format that has to be converted, filtered and modulated to match the requirements of each specific speaker in the car. So you need Ethernet and DSP capabilities at the same time.

Grai says that the audio amplifier in infotainment applications is a good example of DSC: a mix of MCU capabilities and peripherals plus DSP capability for audio processing. Atmel is targeting the V70 and V71 chips as a bridge between large application processors and Ethernet.

Most of the time, the main processor does not integrate Ethernet AVB, even though infotainment connectivity is increasingly based on the Ethernet standard; here, the V71 microcontroller brings this feature to the main processor. “Large SoCs, which usually don’t have an Ethernet interface, have slow start-up times and high power requirements,” Grai said. “Atmel’s V7x MCUs allow fast network start-up and facilitate power moding.”

The SAM E70, V70 and V71

Atmel’s three new MCU devices are aimed at multiple aspects of in-vehicle infotainment connectivity and telematics control.

SAM E70: This microcontroller series features dual CAN-FD, a 10/100 Ethernet MAC with IEEE 1588 real-time stamping, and AVB support. It is aimed at the automotive industry's move toward controller area network (CAN) message-based protocols throughout the cabin, eliminating isolation and wire redundancy by bridging everything centrally over the CAN interface.

SAM V70: Designed for MediaLB connectivity, it leverages advanced audio processing, a multi-port memory architecture and the Cortex-M7's DSP capabilities. In media-oriented systems transport (MOST) architectures, existing modules are typically not redesigned, so Atmel offers a MOST solution over Media Local Bus (MediaLB), supported by the V70 series.

SAM V71: This MCU series supports a complete automotive Ethernet AVB stack for in-vehicle infotainment connectivity, audio amplifiers, telematics and head units. It mirrors the SAM V70 series features and combines the Ethernet AVB and MediaLB connectivity stacks.


Majeed Ahmad is the author of Smartphone: Mobile Revolution at the Crossroads of Communications, Computing and Consumer Electronics and The Next Web of 50 Billion Devices: Mobile Internet’s Past, Present and Future.