Category Archives: Connected Car

Are you designing for the latest automotive embedded system?


Eventually, self-driving cars will arrive. But until then, here’s a look at what will drive that progression.


The next arrow of development is set for automotive

We have all seen it, and we have all read about it in the technology press: the next frontier for technology is the vehicle. The growing market is no longer satisfied with customizing the traditional, unconnected vehicles of today. Instead, the ubiquity of smartphones has shaped a design trend that has now matured and is making way for the connected car platform. The time has come for deeper integration of the automotive software stack.

The opportunities in the connected car market are huge, but multiple challenges still exist. Development life-cycles are a serious barrier: vehicles simply take much longer to develop than smartphones and other portable gadgetry. More vendors and suppliers must be integrated, each contributing the expertise to fit seamlessly into the intended design. New features such as a full operating system are becoming more prevalent, and the demand for sophisticated, centrally operated embedded systems is driving the evolution. That means greater dependence on integrating data from many channels, actuators and sensors; new use cases such as automatic emergency response systems require SoC-class embedded systems.

A step toward the connected car: eCall and how it works

What is happening now?

People. Process. Governance. Adoption. All of them are touched by change. We are going to see new safety laws and revised regulations move through the industry, and these laws will dictate the demand for connectivity. In 2015 the European Parliament voted in favor of the eCall regulation, which requires that by 2018 new cars in Europe be equipped with eCall, a system that automatically contacts emergency services and directs them to the vehicle's location in the event of an accident. The automotive and mobile industries have different regional and market objectives, so participants in both segments will need to find ways to collaborate to satisfy consumers' connectivity needs. Case in point: Chrysler has partnered with Nextel to connect cars like the Dodge Viper, while General Motors uses AT&T as its mobile development partner.

General Motors selected AT&T as its mobile partner

What is resonating from the sales floor and customer perspective?

Demand is increasing for more sophisticated, better integrated software in the car cabin. From the manufacturer through the supplier network to the integration partners, everyone is becoming more engaged in a single outcome: the move toward the connected car. The shift reaches all the way to the retail outlets, where auto dealers are becoming more tech savvy, too. The advent of the smart vehicle has already changed the dealership model dramatically, and more transformation awaits the consumer.

On the sales floor and during on-boarding, sales reps must now plan to spend an hour or more teaching customers how to use their car's advanced technology. These are only a few of the ways things have changed in how cars are sold, and even in how they are distributed, owned and serviced. One thing is certain, though: design trends and user expectations are intersecting to shape the demand and the experience a driver wants from the connected car. The fast-paced evolution of smartphones reinforces this, because the rapid adoption and prolific expansion of the mobile industry has taught users to expect the seamless, highly evolved experiences of their preferred apps.

Today, customer experiences are becoming more tailored, and users, whether at a screen or on their mobile devices, have grown accustomed to picking up where they left off regardless of channel, medium, device or platform. Seamless experiences are breaking through in the market. Consider Uber: a user taps a button on a smartphone, telemetry flows from the Uber driver back to the user's phone, and vice versa; the driver's console shows customer locations and order of priority. Real-life interactions are being enhanced by real-time data, with one device handing off to another platform to continue the journey. Transportation is one of the areas where real-time solutions are visibly changing our day-to-day experience. Some of these are being brought forth by Atmel's IoT cloud partners such as PubNub, whose stack runs in devices to offer dispatch, vehicle state and geofencing for many vehicle platforms. Companies like Lixar, LoadSmart, GetTaxi, Sidecar, Uber and Lyft use real-time technologies as integral parts of their vehicle platforms.
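
To make the geofencing piece concrete, here is a minimal sketch in C++ that checks whether a reported vehicle position falls inside a circular geofence before an application decides to raise a dispatch event. It uses only the haversine formula; it is an illustration, not PubNub's API, and the coordinates and radius are made up.

    #include <cmath>
    #include <cstdio>

    // Great-circle (haversine) distance between two lat/lon points, in meters.
    static double haversine_m(double lat1, double lon1, double lat2, double lon2) {
        const double R = 6371000.0;                  // mean Earth radius, meters
        const double to_rad = 3.14159265358979323846 / 180.0;
        double dlat = (lat2 - lat1) * to_rad;
        double dlon = (lon2 - lon1) * to_rad;
        double a = std::sin(dlat / 2) * std::sin(dlat / 2) +
                   std::cos(lat1 * to_rad) * std::cos(lat2 * to_rad) *
                   std::sin(dlon / 2) * std::sin(dlon / 2);
        return 2.0 * R * std::asin(std::sqrt(a));
    }

    // True if the vehicle is inside a circular geofence of radius_m meters.
    static bool inside_geofence(double veh_lat, double veh_lon,
                                double fence_lat, double fence_lon, double radius_m) {
        return haversine_m(veh_lat, veh_lon, fence_lat, fence_lon) <= radius_m;
    }

    int main() {
        // Hypothetical pickup zone: 250 m around a city-center coordinate.
        if (inside_geofence(37.7749, -122.4194, 37.7760, -122.4180, 250.0))
            std::printf("Vehicle is inside the pickup geofence: notify dispatch\n");
        return 0;
    }

A real deployment would publish that event over the vendor's real-time channel; the point here is only how little logic the geofence test itself requires.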

The design trajectory for connected cars continues to follow this arrow forward

Cars are becoming a software platform: value-chain add-ons tied to an ecosystem are enabled in software, tethered to the cloud, where data will continue to enhance the experience. The design trajectory for connected cars follows this software-integration arrow. Today the demand emphasizes mobility along with connectivity to customer services and advanced functions such as power management for electric vehicles, where firmware and software updates keep refining the driver experience (vehicle range, battery management and other driver-assistance functions).

Carmakers and mobile operators are debating the best way to connect the car to the web. Built-in options can provide stronger connections, but some consumers prefer tethering their existing smartphone to the car via Bluetooth or a USB cable so they keep full access to their personal contacts and playlists. Connected car services will eventually make their way into the broader car market, where embedded connections, and the embedded systems supporting them, will integrate traditionally disparate signals into a more centrally managed console.

Proliferation of the stack

The design arrow for connected cars will demand more development, reinforcing the idea that software and embedded systems, combined with newly introduced actuators and sensors, will become ever more prevalent. We're talking about “software on wheels,” “SoC on wheels” and “secured mobility.”

Design-wise, the trend will remain toward cost-effective, high-performance embedded systems. Many new cars will carry an extremely broad range of sensor- and actuator-based IoT designs, which can be implemented on a single compact, certified wireless module.

Similarly, achieving fast startup times by handling a task with a high-performance MCU rather than an MPU is economical for a designer: it can significantly reduce bill-of-materials cost, development resources and board footprint while still allowing a sculpted form factor and custom wireless design. ARM also offers various IoT device development options, with partner ecosystems providing modules built on open standards. Because such modules are type-approval certified and keep the communications stacks behind a controlled interface, they make it easier to add IoT or connected car connectivity.

Drivers will be presented with new end-user applications that demand more deterministic code and processing, and chips with the secure memory capacity to build and house the software stack behind these connected car applications.

Feature upon feature, layer upon layer of software is being combined with data drawn from drivers, tires, wheels, steering, location, telemetry and more. Adaptive speed and braking technologies are now appearing in various connected car models, taking the traditional ABS concept to a higher level with added intelligence, controlled steering and better GPS, which will soon enable intermittent or cruise-style hands-free driving and parking.

Connected Car Evolution

Longer term, the technological advances behind the connected car will eventually lead to self-driving vehicles, but that very disruptive concept is still far out.

Where there is innovation and change, there is disruption

Like every eventual market disruption, the connected car evolution will have an in-between phase of development. Innovative apps are everywhere, and consumers have adapted to the seamless, transitional experiences offered by apps and smartphones. Our need for ubiquitous connectivity and mobility, no matter where we are physically, is turning our vehicles into mobile platforms that keep us seamlessly connected to the world. This demand for connectivity will only grow as costs fall and the devices involved become more widely available. Cars, like other mobility platforms, are increasingly connected packages built around intelligent embedded systems. They offer more than entertainment: beyond richer multimedia features and in-car Internet access, deeper integration of secure and trusted data and connectivity points (hardware security and processing, crypto memory and crypto authentication) can enable innovative navigation, safety and predictive maintenance capabilities.

Carmakers are worried about recent hacks, especially around security and reliability, making it unlikely that they will be open to every kind of app. They will want to maintain a controlled framework with developers to thwart intrusions, while limiting the number of apps available in the car and managing anything that conflicts with the experience or with safety measures. Importantly, we are taking notice even now. Disruption comes fast: Apple and others are reported to be entering the connected car market, the new frontier for scaling technological equity and technology brand appeal. Much as we saw in the shift from the early BlackBerry models to modern smartphones, those late in evolving their platforms may be cast adrift or implode under market pressure.

No one is arguing about whether it will happen: eventually, self-driving cars will arrive. But for now, they remain a futuristic concept.

What can we do now in the invention, design and development process?

The broader output of manufactured cars will need to keep leveraging new designs that pull traditionally siloed vendors into tighter integration, so that more unified, centrally managed embedded controls can emerge. The importance now lies in the DNA of a holistically designed platform, fitted with a portfolio of processors and security, ready to take on new service models and applications.

This year, we have compiled an interesting mixture of technical articles to support the development and engineering of car access systems, CAN and LIN networks, Ethernet in the car, capacitive interfaces and capacitive proximity measurement.

In parallel with supporting the progress and evolution of the connected car, a new era of design has arrived, one in which the platform demands embedded controls that match its design characteristics and application use cases. We also want to highlight the highest-performing ARM Cortex-M7 based MCU on the market, combining exceptional memory and connectivity options for leading design flexibility. The Atmel | SMART ARM Cortex-M7 family is ideal for the automotive, IoT and industrial connectivity markets. The SAM V/E/S families are the industry's highest-performing Cortex-M microcontrollers, boosting performance while keeping cost and power consumption in check.

So are you designing for the latest automotive, IoT or industrial product? Here are a few things to keep in mind:

  • Optimized for real-time deterministic code execution and low latency peripheral data access
  • Six-stage dual-issue pipeline delivering 1500 CoreMarks at 300MHz
  • Automotive-qualified ARM Cortex-M7 MCUs with Audio Video Bridging (AVB) over Ethernet and MediaLB peripheral support (the only such devices on the market today)
  • The M7 provides 32-bit floating-point DSP capability as well as faster execution times, thanks to a higher clock speed, hardware floating point and twice the DSP power of the M4

We are taking connected car design to the next performance level, with high-speed connectivity, high-density on-chip memory and a solid ecosystem of design engineering tools. Recently, Atmel's Timothy Grai added a revealing point to the DSP story in the Cortex-M7 processor fabric: true DSPs don't do control and logical functions well, and they generally lack the breadth of peripherals available on MCUs. “The attraction of the M7 is that it does both — DSP functions and control functions — hence it can be classified as a digital signal controller (DSC).” Grai cited the example of Atmel's SAM V70 and SAM V71 microcontrollers, which are used to connect end-nodes like infotainment audio amplifiers to the emerging Ethernet AVB network. In an audio amplifier, you receive a specific audio format that has to be converted, filtered and modulated to match the requirement of each specific speaker in the car. Ethernet and DSP capabilities are required at the same time.

“The audio amplifier in infotainment applications is a good example of DSC; a mix of MCU capabilities and peripherals plus DSP capability for audio processing. Most of the time, the main processor does not integrate Ethernet AVB, as the infotainment connectivity is based on Ethernet standard,” Grai said. “Large SoCs, which usually don’t have Ethernet interface, have slow start-up time and high power requirements. Atmel’s SAM V7x MCUs allow fast network start-up and facilitate power moding.”

Atmel has innovative memory technology in its DNA — critical to help fuel connected car and IoT product designers. It allows them to run the multiple communication stacks for applications using the same MCU without adding external memory. Avoiding external memories reduces the PCB footprint, lowers the BOM cost and eliminates the complexity of high-speed PCB design when pushing the performance to a maximum.

Importantly, the Atmel | SMART ARM Cortex-M7 family achieves a 1500 CoreMark score and delivers superior connectivity options and a unique memory architecture that can accommodate the evolution toward the eventual “SoC on wheels” design path for the connected car.

How to get started

  1. Download this white paper detailing how to run more complex algorithms at higher speeds.
  2. Check out the Atmel Automotive Compilation.
  3. Attend hands-on training onboard the Atmel Tech on Tour trailer. Following these sessions, you will walk away with the Atmel | SMART SAM V71 Xplained Ultra Evaluation Kit.
  4. Design the newest wave of embedded systems using SAM E70, SAM S70, or SAM V70 (ideal for automotive, IoT, smart gateways, industrial automation and drone applications, while the auto-grade SAM V70 and SAM V71 are ideal for telematics, audio amplifiers and advanced media connectivity).

[Images: European Commission, GSMA]

Stewart wants to be the middleman between you and your autonomous car


This tactile interface is designed for fully autonomous cars and hopes to help mediate the trust issues between man and machine.


Self-driving cars are no longer a futuristic idea, with an estimated 10 million expected to hit the roads by 2020. In fact, companies like Mercedes, BMW, Tesla and Nissan are among countless others that have already begun to implement these autonomous features into their automobiles. Although such vehicles offer obvious benefits such as faster travel times, enhanced safety and more convenience, some folks believe it eliminates a sense of freedom, expression and control while behind the wheel. In order to promote a positive relationship between man and his machine, Felix Ros has developed Stewart — a servo-controlled joystick that will help overcome society’s reluctance in embracing fully autonomous vehicles.

Stewart will provide you with constant updates about the car's behavior and its intentions. However, if you don't agree with the car's next course of action, you can manipulate the tactile interface to change it. The device will learn from you in the same way that you can learn from it, hopefully resulting in a mutually trusting relationship. It should be noted that Stewart is merely a middleman between the autonomous vehicle and its driver, and is in no way intended to actually control the car.

Through nuanced force feedback, Stewart will tell you what the car plans to do next, such as which direction it will choose and whether it will accelerate or brake. Yet, if you disagree with the vehicle’s planned course of action, you can intervene with the joystick to get the car to take your preferred route, or to simply drive in a different style. According to Ros, this puts emotion back into driving within the margins of what is considered safe.

“So why would you want to control a car that drives itself? Learning to trust a (new) technology takes time. A feeling of control can help to build a mutually trustful relationship,” Ros explains. “Humans are very unpredictable creatures that tend to change their minds frequently. For example: while driving you want to make a detour or you may need a coffee break. These changes of plan can easily be communicated to the car through Stewart.”

Stewart is equipped with six servos, which are controlled by an Arduino Uno (ATmega328). A Processing sketch calculates the transitions across all six degrees of freedom and feeds that information to the Arduino. Intrigued? Check out the Maker's official page here, as well as his step-by-step breakdown on Instructables.
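
Ros's firmware isn't published in this post, but the Arduino side of such a setup can be sketched in a few lines. The pin numbers and the comma-separated serial protocol below are assumptions for illustration, with the Processing sketch sending six servo angles per frame.

    #include <Servo.h>

    // Hypothetical servo pins on the Uno; the actual wiring is not documented here.
    const uint8_t SERVO_PINS[6] = {3, 5, 6, 9, 10, 11};
    Servo servos[6];

    void setup() {
      Serial.begin(115200);              // Processing sends "a0,a1,a2,a3,a4,a5\n"
      for (uint8_t i = 0; i < 6; i++) {
        servos[i].attach(SERVO_PINS[i]);
        servos[i].write(90);             // start each axis centered
      }
    }

    void loop() {
      if (Serial.available()) {
        for (uint8_t i = 0; i < 6; i++) {
          int angle = Serial.parseInt(); // next integer angle, 0-180
          servos[i].write(constrain(angle, 0, 180));
        }
        Serial.read();                   // consume the trailing newline, if any
      }
    }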

ARM Keil ecosystem integrates the Atmel SAM ESV7


Keil is part of the wider ARM ecosystem, enabling developers to speed up system release to market.


Even the best system-on-chip (SoC) is useless without software, just as the best-designed software needs hardware to flourish. The “old” embedded world has exploded into many emergent markets like the IoT, wearables and even automotive, which is no longer restricted to motor control or airbags, as innovative products from entertainment to ADAS are being developed. What is the common denominator of these emergent products? Each requires more software functionality and fast, deterministic code execution from memory, and consequently innovative hardware to support those requirements, such as the ARM Cortex-M7 based Atmel | SMART SAM ESV7.

AtmelChipLib Overview

ARM has released a complete software development environment for a range of ARM Cortex-M based MCU devices: Keil MDK. Keil is part of the wider ARM ecosystem, enabling developers to speed up system release to market. MDK includes the µVision IDE/debugger and the ARM C/C++ compiler, along with essential middleware components and software packs. If you're familiar with the Run-Time Environment's stacked description, you'll recognize the various stacks. Let's focus on CMSIS-Driver: CMSIS is the standard software framework for Cortex-M MCUs, and CMSIS-Driver extends the SAM ESV7 chip library with standardized drivers for middleware and generic component interfaces.
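
To give a feel for what those standardized drivers look like, here is a minimal sketch that brings up a UART through the CMSIS-Driver USART interface. Driver_USART0 stands in for whichever instance the SAM ESV7 device pack exposes, and pin/clock setup is assumed to be handled by the startup code, so treat it as an outline rather than a drop-in example.

    #include "Driver_USART.h"                      // CMSIS-Driver USART interface

    extern ARM_DRIVER_USART Driver_USART0;         // instance provided by the device pack
    static ARM_DRIVER_USART *uart = &Driver_USART0;

    static void uart_event(uint32_t event) {
      // Called by the driver on send/receive complete, errors, etc.
      (void)event;
    }

    void uart_demo(void) {
      uart->Initialize(uart_event);                // register the event callback
      uart->PowerControl(ARM_POWER_FULL);          // clock and power the peripheral
      uart->Control(ARM_USART_MODE_ASYNCHRONOUS |
                    ARM_USART_DATA_BITS_8 |
                    ARM_USART_PARITY_NONE |
                    ARM_USART_STOP_BITS_1 |
                    ARM_USART_FLOW_CONTROL_NONE,
                    115200);                       // 115200-8-N-1
      uart->Control(ARM_USART_CONTROL_TX, 1);      // enable the transmitter

      static const char msg[] = "SAM ESV7 up\r\n";
      uart->Send(msg, sizeof msg - 1);             // non-blocking send
    }

Because the same interface is implemented for every supported device, the application code above would look much the same on a different Cortex-M part; only the driver instance changes.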

By definition, an MCU is designed to address multiple applications, and the SAM ESV7 is dedicated to supporting performance-demanding and DSP-intensive systems. Thanks to its 300MHz clock, the SAM ESV7 delivers up to 640 DMIPS, and its DSP performance is double that available in the Cortex-M4. A double-precision floating-point unit and a dual-issue instruction pipeline further position the Cortex-M7 for speed.

Atmel Cortex M7 based Dev board

Let’s review some of these applications where SAM ESV7 is the best choice…

Fingerprint Module

The goal is to provide a human biometric authentication module for office or home access control. The key design requirements are:

  • +300 MHz CPU performance to process recognition algorithms
  • Image sensor interface to read raw finger image data from finger sensor array
  • Low cost and smaller module size
  • Flash/memory to reduce BOM cost and module size
  • A memory interface to allow external memory expansion, just in case

Superior performance and an image sensor interface can be seen as essential needs, but what will make the difference is offering both a lower BOM cost and a smaller module size than the competition. The SAM S70 integrates up to 2MB of embedded Flash, twice as much as its direct competitor, which may allow reducing both BOM and module size.

SAM S70 Finger Print

Automotive Radio System

Every cent counts in automotive design, and OEMs prefer using an MCU rather than an MPU, primarily for cost reasons. Building an attractive radio for tomorrow's car requires developing high-performance DSP algorithms. Such algorithms used to be developed on expensive standard DSP parts, leading to a large module size, with external Flash and an MCU alongside, and obviously a heavy BOM. In a 65nm embedded Flash process device, the Cortex-M7 can achieve a 1500 CoreMark score while running at 300 MHz, and its DSP performance is double that available in the Cortex-M4. This DSP power can be used to manage eight channels of speaker processing, including six stages of biquads, delay, scaler, limiter and mute functions. That workload occupies only 63% of the CPU on the SAM V71, leaving enough headroom to support an Ethernet AVB stack, which is very popular in automotive.
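
As a rough illustration of that per-speaker processing, the sketch below runs one channel through a six-stage biquad cascade using the CMSIS-DSP library and then applies a gain (the scaler stage). The coefficient values and block size are placeholders; each speaker channel would own an instance of this chain.

    #include "arm_math.h"                 // CMSIS-DSP

    #define NUM_STAGES   6                // six biquad stages per channel, as above
    #define BLOCK_SIZE   64               // samples processed per call (placeholder)

    // 5 coefficients per stage: b0, b1, b2, a1, a2 (placeholder values).
    static float32_t coeffs[5 * NUM_STAGES];
    static float32_t state[4 * NUM_STAGES];          // 4 state variables per stage
    static arm_biquad_casd_df1_inst_f32 chain;

    void channel_init(void) {
      arm_biquad_cascade_df1_init_f32(&chain, NUM_STAGES, coeffs, state);
    }

    // Filter one block of audio for this speaker channel, then apply its gain.
    void channel_process(const float32_t *in, float32_t *out, float32_t gain) {
      arm_biquad_cascade_df1_f32(&chain, (float32_t *)in, out, BLOCK_SIZE);
      arm_scale_f32(out, gain, out, BLOCK_SIZE);
    }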

One of the secret sauces of the Cortex-M7 architecture is that it provides a way to bypass the standard execution mechanism using “tightly coupled memories,” or TCM. There is an excellent white paper describing the TCM implementation in the SAM S70/E70 series, entitled “Run Blazingly Fast Algorithms with Cortex-M7 Tightly Coupled Memories,” from Lionel Perdigon and Jacko Wilbrink, which you can find here.


This post has been republished with permission from SemiWiki.com, where Eric Esteve is a principal blogger as well as one of the four founding members of the site. This blog first appeared on SemiWiki on October 23, 2015.

How Ethernet AVB is playing a central role in automotive streaming applications


Ethernet is emerging as the network of choice for infotainment and advanced driver assistance systems, Atmel’s Tim Grai explains.


Imagine you’re driving down the highway with the music blaring, enjoying the open road. Now imagine that the sound from your rear speaker system is delayed by a split second from the front; your enjoyment of the fancy in-car infotainment system comes to a screeching halt.

Ethernet is emerging as the network of choice for infotainment and advanced driver assistance systems that include cameras, telematics, rear-seat entertainment systems and mobile phones. But standard Ethernet protocols can’t assure timely and continuous audio/video (A/V) content delivery for bandwidth intensive and latency sensitive applications without buffering, jitter, lags or other performance hits.

Audio-Video Bridging (AVB) over Ethernet is a collection of extensions to the IEEE802.1 specifications that enables local Ethernet networks to stream time synchronised, loss sensitive A/V data. Within an Ethernet network, the AVB extensions help differentiate AVB traffic from the non-AVB traffic that can also flow through the network. This is done using an industry standard approach that allows for plug-and-play communication between systems from multiple vendors.

The extensions that define the AVB standard achieve this by:

  • reserving bandwidth for AVB data transfers to avoid packet loss due to network congestion from ‘talker’ to ‘listener(s)’
  • establishing queuing and forwarding rules for AVB packets that keep packets from bunching and guarantee delivery of packets with a bounded latency from talker to listener(s) via intermediate switches, if needed
  • synchronizing time to a global clock so the time bases of all network nodes are aligned precisely to a common network master clock, and
  • creating time aware packets which include a ‘presentation time’ that specifies when A/V data inside a packet has to be played.

Designers of automotive A/V systems need to understand the AVB extensions and requirements, as well as how their chosen microcontroller will support that functionality.

AVB: A basket of standards

AVB requires that three extensions be met in order to comply with IEEE802.1:

  • IEEE802.1AS – timing and synchronisation for time-sensitive applications (gPTP)
  • IEEE802.1Qat – stream reservation protocol (SRP)
  • IEEE802.1Qav – forwarding and queuing for time-sensitive streams (FQTSS).

In order to play music or video from one source, such as a car’s head unit, to multiple destinations, like backseat monitors, amplifiers and speakers, the system needs a common understanding of time in order to avoid lags or mismatch in sound or video. IEEE802.1AS-2011 specifies how to establish and maintain a single time reference – a synchronised ‘wall clock’ – for all nodes in a local network. The generalized precision time protocol (gPTP), based on IEEE1588, is used to synchronize and syntonize all network nodes to sub-microsecond accuracy. Nodes are synchronized if their clocks show the same time and are syntonised if their clocks increase at the same rate.

This protocol selects a Grand Master Clock from which the current time is propagated to all network end-stations. In addition, the protocol specifies how to correct for clock offset and clock drifts by measuring path delays and frequency offsets. New MCUs, such as the Atmel | SMART SAMV7x (shown above), detect and capture time stamps automatically when gPTP event messages cross MII layers. They can also transport gPTP messages over raw Ethernet, IPv4 or IPv6. This hardware recognition feature helps to calculate clock offset and link delay with greater accuracy and minimal software load.
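
As a back-of-the-envelope illustration of what that hardware time-stamping feeds, here is a hedged sketch of the two basic gPTP calculations: the mean path delay from a peer-delay exchange and the resulting offset correction. It uses the four standard timestamps t1..t4; the real 802.1AS state machines add rate-ratio and other corrections omitted here.

    #include <cstdint>

    // Timestamps in nanoseconds, as captured by the MAC at the MII layer.
    // t1: Pdelay_Req sent by initiator    t2: Pdelay_Req received by responder
    // t3: Pdelay_Resp sent by responder   t4: Pdelay_Resp received by initiator
    struct PdelayTimestamps {
      int64_t t1, t2, t3, t4;
    };

    // Mean one-way path delay, assuming a symmetric link (IEEE 802.1AS peer delay).
    static int64_t mean_path_delay_ns(const PdelayTimestamps &ts) {
      return ((ts.t4 - ts.t1) - (ts.t3 - ts.t2)) / 2;
    }

    // Offset of the local clock from the grandmaster, given one Sync message:
    // sync_tx: master timestamp carried in the Sync/Follow_Up
    // sync_rx: local timestamp captured when the Sync arrived
    static int64_t clock_offset_ns(int64_t sync_tx, int64_t sync_rx,
                                   int64_t path_delay_ns) {
      return sync_rx - sync_tx - path_delay_ns;   // positive: local clock is ahead
    }

Hardware capture of t1..t4 at the MII layer is precisely what keeps these two subtractions accurate to well under a microsecond; doing the same in software would add unpredictable stack latency to every term.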

Meanwhile, SRP guarantees end-to-end bandwidth reservation for all streams to ensure packets aren’t delayed or dropped at any switch due to network congestion, which can occur with standard Ethernet. For the in-vehicle environment, SRP is typically configured in advance by the car maker, who defines data streams and bandwidth allocations.

Talkers (the source of A/V data) ‘advertise’ data streams and their characteristics. Switches process these announcements from talkers and listeners to:

  • register and prune streams’ path through the network
  • reserve bandwidth and prevent over subscription of available bandwidth
  • establish forwarding rules for incoming packets
  • establish the SRP domain, and
  • merge multiple listener declarations for the same stream

The standard stipulates that AVB data can reserve only 75% of total available bandwidth, so for a 100Mbit/s link, the maximum AVB data is 75Mbit/s. The remaining bandwidth can be used for all other Ethernet protocols.
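
The arithmetic behind that 75% rule, and behind the credit-based shaper that enforces each reservation, is simple enough to sketch. The example below assumes a 100 Mbit/s port and a single Class A audio stream described as bytes per 125 µs observation interval, which is roughly how SRP expresses reservations; the frame sizes are placeholders.

    #include <cstdio>

    int main() {
      const double port_bps    = 100e6;               // 100 Mbit/s link
      const double max_avb_bps = 0.75 * port_bps;     // AVB may reserve at most 75%

      // Example Class A stream: 6 audio frames of 6 bytes per 125 us interval
      // (48 kHz stereo, 24-bit) plus ~50 bytes of Ethernet/VLAN/AVTP overhead.
      const double class_a_interval_s = 125e-6;
      const double bytes_per_interval = 6.0 * 6.0 + 50.0;
      const double stream_bps = (bytes_per_interval * 8.0) / class_a_interval_s;

      // Credit-based shaper parameters for the queue carrying this reservation
      // (IEEE 802.1Qav): idleSlope is the reserved rate, sendSlope the remainder.
      const double idle_slope_bps = stream_bps;
      const double send_slope_bps = idle_slope_bps - port_bps;   // negative

      std::printf("Max reservable AVB bandwidth: %.0f bit/s\n", max_avb_bps);
      std::printf("Stream reservation:           %.0f bit/s\n", stream_bps);
      std::printf("idleSlope = %.0f bit/s, sendSlope = %.0f bit/s\n",
                  idle_slope_bps, send_slope_bps);
      return 0;
    }

With these numbers the stream reserves about 5.5 Mbit/s, comfortably below the 75 Mbit/s ceiling, and the shaper only releases Class A frames while the queue's credit (accumulated at idleSlope) is non-negative.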

In automotive systems, the streams may be preconfigured and bandwidth can be reserved statically at system startup to reduce the time needed to bring the network into a fully operational state. This supports safety functions, such as driver alerts and the reversing camera, that must be displayed within seconds.

SRP uses other signalling protocols, such as Multiple MAC Registration Protocol, Multiple VLAN Registration Protocol and Multiple Stream Registration Protocol to establish bandwidth reservations for A/V streams dynamically.

The third extension is FQTSS, which guarantees that time sensitive A/V streams arrive at their listeners within a bounded latency. It also defines procedures for priority regenerations and credit based traffic shaper algorithms to meet stream reservations for all available devices.

The AVB standard can support up to eight traffic classes, which are used to determine quality of service. Typically, nodes support at least two traffic classes – Class A, the highest priority, and Class B. Microcontroller features help manage receive and transmit data with multiple priority queues to support AVB and ‘best effort class’ non AVB data.

Automotive tailored requirements

Automotive use cases typically fix many parameters at the system definition phase, which means that AVB implementation can be optimised and simplified to some extent.

  • Best Master Clock Algorithm (BMCA): the best clock master is fixed at the network definition phase, so dynamic selection using BMCA isn't needed.
  • SRP: all streams, their contents and their characteristics are known at system definition and no new streams are dynamically created or destroyed; the proper reservation of data is known at the system definition phase; switches, talkers and listeners can have their configurations loaded at system startup from pre-configured tables, rather than from dynamic negotiations
  • Latency: while this is not critical, delivery is. Automotive networks are very small, with only a few nodes between a talker and listener, so it is more important not to drop packets due to congestion.

Conclusion

The requirement to transfer high volumes of time-sensitive audio and video content inside vehicles means developers must understand and apply the Ethernet AVB extensions. AVB standardization results in interoperable end-devices from multiple vendors that can deliver audio and video streams to distributed equipment on the network with microsecond accuracy or better. While the standard brings complexities, new MCUs with advanced features are simplifying automotive A/V design.


This article was originally published on New Electronics on October 13, 2015 and authored by Tim Grai, Atmel’s Director of Automotive MCU Application Engineering. 

Secured SAMA5D4 for industrial, fitness or IoT display


To target applications like home automation, surveillance cameras, security control panels, or industrial and residential gateways, high DMIPS computing is not enough.


The new SAMA5D4 expands the Atmel | SMART Cortex-A5-based family, adding a 720p-resolution hardware video decoder to target Human Machine Interface (HMI), control panel and IoT applications where high-performance display capability is required. The Cortex-A5 offers raw performance of 945 DMIPS (at 600 MHz), complemented by the ARM NEON 128-bit SIMD (single instruction, multiple data) DSP architecture extension. To target applications like home automation, surveillance cameras, security control panels, or industrial and residential gateways, high DMIPS computing is not enough. To really make a difference, on top of the dedicated hardware video decoder (H.264, VP8, MPEG-4), you need the most complete set of security features.

Whether for home automation or industrial HMI, you want your system safeguarded from hackers and your investment protected against counterfeiting. You have the option of a 16-bit DDR2 interface, or 32-bit if you need better performance, but security is no longer just an option. Designing with the Atmel | SMART SAMA5D4 gives you secure boot, ARM TrustZone, an encrypted DDR bus, tamper detection pins and secure data storage. This MPU also integrates hardware encryption engines supporting AES (Advanced Encryption Standard) and 3DES (Triple Data Encryption Standard), RSA (Rivest-Shamir-Adleman), ECC (Elliptic Curve Cryptography), as well as SHA (Secure Hash Algorithm) and a TRNG (True Random Number Generator).
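
For readers who haven't used these primitives, the snippet below shows the kind of AES-CBC operation such an engine offloads. It uses the generic mbedTLS software API purely as an illustration; it is not the SAMA5D4's hardware crypto driver, and the key, IV and payload are placeholders.

    #include <mbedtls/aes.h>

    // Encrypt a 32-byte buffer with AES-256 in CBC mode. On the SAMA5D4 the same
    // operation would be handed to the hardware engine instead of software.
    void encrypt_demo(void) {
      unsigned char key[32] = {0};            // 256-bit key (placeholder)
      unsigned char iv[16]  = {0};            // initialization vector (placeholder)
      unsigned char plain[32]  = "connected car telemetry blk";
      unsigned char cipher[32] = {0};

      mbedtls_aes_context ctx;
      mbedtls_aes_init(&ctx);
      mbedtls_aes_setkey_enc(&ctx, key, 256);
      mbedtls_aes_crypt_cbc(&ctx, MBEDTLS_AES_ENCRYPT,
                            sizeof plain, iv, plain, cipher);
      mbedtls_aes_free(&ctx);
    }

Offloading this work to dedicated silicon frees the Cortex-A5 for the application and keeps key material closer to the tamper-protected domain.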

If you design fitness equipment, such as treadmills and exercise machines, you may be more sensitive to connectivity and user interface functions than to security elements, even if it's important to feel protected against counterfeiting. Connectivity includes Gigabit and 10/100 Ethernet, up to two high-speed USB ports (configurable as two hosts, or one host and one device port), one High-Speed Inter-Chip (HSIC) port, several SDIO/SD/MMC interfaces, dual CAN and more. Because the SAMA5D4 is intended to support industrial, consumer or IoT applications requiring efficient display capabilities, it integrates an LCD controller with a graphics accelerator, a resistive touchscreen controller, a camera interface and the aforementioned 720p, 30fps video decoder.

The MCU market is highly competitive, especially when you consider that most products are developed around the same ARM-based family of cores (from the Cortex-M to the Cortex-A5 series). Performance is an important differentiator, and the SAMA5D4 is the highest-performing MPU in the Atmel ARM Cortex-A5 based family, offering up to 945 DMIPS (at 600 MHz) complemented by the ARM NEON 128-bit SIMD (single instruction, multiple data) DSP extension. Using safety and security on top of performance to sharpen that differentiation is certainly an efficient architectural choice. As you can see in the block diagram below, the part features the ARM TrustZone system-wide approach to security, completed by advanced security features that protect the application software from counterfeiting, such as an encrypted DDR bus, tamper detection pins and secure data storage. On top of that, the microprocessor integrates hardware encryption engines supporting AES/3DES, RSA and ECC, as well as SHA and a TRNG.

The SAMA5 series targets industrial and fitness applications where safety is a key differentiating factor. While security helps protect the software assets and makes the system robust against hacking, safety directly protects the user, whether that is the person on the treadmill or the various machines connected to the display that the SAMA5 pilots. The series is equipped with functions that ease the implementation of safety standards like IEC 61508, including a main crystal oscillator clock failure detector, POR (power-on reset), independent watchdog timers, write-protection registers and more.

The SAMA5D4 is a heavier-duty processor well suited for IoT, control panels, HMI and the like, differentiated from other Atmel devices by performance and security (not to mention safety). The ARM Cortex-A5 based device delivers up to 945 DMIPS when running at 600 MHz, complemented by the ARM NEON 128-bit SIMD DSP architecture extension. Probably the most important factor that sets the SAMA5D4 apart is its security capabilities: they protect OEM software investments from counterfeiting and user privacy against hacking, while its safety features make the SAMA5D4 ideal for industrial, fitness or IoT applications.


This post has been republished with permission from SemiWiki.com, where Eric Esteve is a principal blogger as well as one of the four founding members of the site. This blog first appeared on SemiWiki on October 6, 2015.

Your touchscreen can now seamlessly transition between hover, finger and glove touch


The new maXTouch mXT641T family is the industry’s first auto-qualified self- and mutual-capacitance controller meeting the AEC-Q100 standards for high reliability in harsh environments.


Atmel has expanded its robust portfolio of automotive-qualified maXTouch controllers with the all-new mXT641T family, optimized for capacitive touchpads and touchscreens from five to 10 inches. These devices are the industry's first auto-qualified self- and mutual-capacitance controllers meeting the AEC-Q100 standards for high reliability in harsh environments.

The maXTouch mXT641T family incorporates Atmel’s Adaptive Sensing technology to enable dynamic touch classification, a feature that automatically and intelligently switches between self- and mutual-capacitance sensing to provide users a seamless transition between a finger touch, hover or glove touch. As a result, this eliminates the need for users to manually enable ‘glove mode’ in the operating system to differentiate between hover and glove modes. Adaptive Sensing is also resistant to water and moisture and ensures superior touch performance even in these harsh conditions.
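
In spirit, the classification step works something like the simplified routine below. This is an illustrative sketch, not Atmel's maXTouch firmware, and the thresholds are invented: a strong mutual-capacitance response is reported as a bare finger, while a strong self-capacitance response without a matching mutual signal is treated as a glove or hover.

    #include <cstdint>

    enum class Touch { None, Finger, GloveOrHover };

    // Illustrative thresholds only; real controllers tune these per sensor stack.
    constexpr int16_t MUTUAL_FINGER_THRESH = 40;
    constexpr int16_t SELF_PRESENCE_THRESH = 25;

    // Classify one sensing cycle from the peak signal deltas of the two scan types.
    Touch classify(int16_t mutual_peak_delta, int16_t self_peak_delta) {
      if (mutual_peak_delta >= MUTUAL_FINGER_THRESH)
        return Touch::Finger;           // skin contact couples strongly node-to-node
      if (self_peak_delta >= SELF_PRESENCE_THRESH)
        return Touch::GloveOrHover;     // weak mutual but strong self: glove or hover
      return Touch::None;               // moisture and noise are rejected elsewhere
    }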

The latest family of devices supports stringent automotive requirements, including hover and glove support in moist and cold environments, thick cover lenses for better impact resistance, and single-layer shieldless sensor designs in automotive center consoles, navigation systems, radio interfaces and rear-seat entertainment systems. The single-layer shieldless sensor design eliminates additional screen layers, delivering better light transparency, which results in lower power consumption and an overall lower system cost for the manufacturer.

“More consumers are demanding high-performance touchscreens in their vehicles with capacitive touch technology,” said Rob Valiton, Senior Vice President and General Manager, Automotive, Memory and Secure Products Business Units. “Atmel is continuing to drive more innovative, next-generation touch technologies to the automotive market and our new family of automotive-qualified maXTouch T controllers is further testament to our leadership in this space. Atmel is the only automotive-qualified touch supplier with over two decades of experience in designing, developing, and manufacturing semiconductor solutions that meet the stringent quality and reliability standards for our automotive customers.”

Interested? Production quantities of the mXT641T are now available. Meanwhile, you can learn all about the entire maXTouch lineup here.

$60 hack can trick LIDAR systems used by most self-driving cars


A security researcher has created a $60 system with Arduino and a laser pointer that can spoof the LIDAR sensors used by most autonomous vehicles. 


Many self-driving cars use LIDAR sensors to detect obstacles and build 3D images to help them navigate. However, one security researcher has developed a $60 device with “off-the-shelf parts” that can trick these systems into seeing objects that don't actually exist, forcing the autonomous vehicle to take unnecessary actions, like slowing down or stopping to avoid a collision with the phantom object. Ultimately, this further highlights the need for stringent security measures for automobiles that would otherwise be vulnerable to cyber criminals armed with nothing more than a low-power laser and pulse generator.

“It’s kind of a laser pointer, really. And you don’t need the pulse generator when you do the attack. You can easily do it with a Raspberry Pi or an Arduino,” explains researcher Jonathan Petit, principal scientist at Security Innovation.

According to IEEE Spectrum, Petit began by simply recording pulses from a commercial IBEO Lux LIDAR unit. The pulses were not encoded or encrypted, which allowed him to replay them at a later point. He was then able to create the illusion of a fake car, wall, cyclist or pedestrian anywhere from 65 to 1,100 feet from the LIDAR system, and make multiple copies of the simulated obstacles. In tests, the attack worked at all angles — from behind, the side and in front without alerting the passengers — and didn’t always require a precise hit of the device for it to achieve its goal.

“I can spoof thousands of objects and basically carry out a denial of service attack on the tracking system so it’s not able to track real objects,” Petit adds.

As IEEE Spectrum notes, sensor attacks are not limited to self-driving cars, either. The same homebrew laser pointer can be employed to carry out an equally devastating denial of service attack on a human motorist by simply dazzling them, and without the need for sophisticated laser pulse recording, generation or synchronization equipment.

While the DIY system won't necessarily affect everyone, it does make the case that security should be at the forefront of auto design. “There are ways to solve it. A strong system that does misbehavior detection could cross-check with other data and filter out those that aren’t plausible. But I don’t think carmakers have done it yet. This might be a good wake-up call for them,” Petit concludes.

The researcher described his proof-of-concept hack in a paper entitled “Potential Cyberattacks on Automated Vehicles,” which will be presented at Black Hat Europe in November.

[Images: Jeff Kowalsky/IEEE Spectrum, TechHive]

How to prevent execution surprises for Cortex-M7 MCU


We know how heavily software development weighs on a project: typically 60% to 70% of the overall cost.


The ARM Cortex-A series processor cores (A57, A53) are well known in high-performance market segments, like application processing for smartphones, set-top boxes and networking. But if you look at the electronics market, you realize that many applications are cost sensitive and don't need such a high-performance processor core. We may call this the embedded market, even if that definition is vague. The ARM Cortex-M family was developed to address these numerous market segments, starting with the Cortex-M0 for lowest cost, the Cortex-M3 for the best power/performance balance, and the Cortex-M4 for applications requiring digital signal processing (DSP) capabilities.

For the audio, voice control, object recognition and complex sensor fusion of automotive and higher-end Internet of Things sensing, where complex algorithms are needed for rich audio and visual capabilities, the Cortex-M7 is required. ARM offers the processor core as well as the Tightly Coupled Memory (TCM) architecture, but ARM licensees like Atmel have to implement the memories in such a way that the user gets the full benefit of the M7 core and can meet system performance and latency goals.

Figure 1. The TCM interface provides a single 64-bit instruction port and two 32-bit data ports.

In a 65nm embedded Flash process device, the Cortex-M7 can achieve a 1500 CoreMark score while running at 300 MHz, offering top-class DSP performance: a double-precision floating-point unit and a dual-issue instruction pipeline. But algorithms like FIR, FFT or biquad filters need to run as deterministically as possible for real-time response or seamless audio and video performance. How do you best select and implement the memories needed to support such performance? If you choose Flash, it will require caching (Flash is too slow on its own), which introduces the risk of cache misses. SRAM is a better choice, since it can easily be embedded on-chip and permits random access at the speed of the processor.
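
In practice that usually means linking the hot loop and its working buffers into the TCMs. A hedged sketch of how that can look with a GNU toolchain is shown below; the .itcm and .dtcm section names are assumptions that depend on the linker script shipped with the device support package, and the FIR kernel is just a stand-in for whatever algorithm needs deterministic timing.

    #include <stdint.h>

    // Place the hot filter kernel in instruction TCM and its buffers in data TCM
    // so both execute and load with zero-wait-state, deterministic access.
    #define ITCM_CODE __attribute__((section(".itcm")))
    #define DTCM_DATA __attribute__((section(".dtcm")))

    #define TAPS 64
    DTCM_DATA static float coeffs[TAPS];      // filter coefficients
    DTCM_DATA static float history[TAPS];     // delay line

    ITCM_CODE float fir_step(float sample) {
      // Shift the delay line and accumulate the convolution for one output sample.
      float acc = coeffs[0] * sample;
      for (int i = TAPS - 1; i > 0; i--) {
        history[i] = history[i - 1];
        acc += coeffs[i] * history[i];
      }
      history[0] = sample;
      return acc;
    }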

Peripheral data buffers implemented in general-purpose system SRAM are typically loaded by DMA transfers from system peripherals. The ability to load from a number of possible sources, however, raises the possibility of unnecessary delays and conflicts when multiple DMAs try to access the memory at the same time. In a typical example, we might have three different entities vying for access to the SRAM: the processor (64-bit access, requesting 128 bits in this example) and two separate peripheral DMA requests (DMA0 and DMA1, 32-bit access each). Atmel gets around this issue by organizing the SRAM into several banks, as described in the figure below:

Figure 2. By organizing the SRAM into banks, multiple DMA bursts can occur simultaneously with minimal latency.

For a chip maker designing microcontrollers, licensing an ARM Cortex-M processor core provides numerous advantages. The first is the ubiquity of the ARM core architecture, adopted across multiple market segments to support a variety of applications. If the chip maker wants to win a new customer, the probability that the OEM has already used an ARM-based MCU is very high, and it's very important for that OEM to be able to reuse existing code (we know how heavily software development weighs on a project: typically 60% to 70% of the overall cost). But this ubiquity creates a challenge: how do you differentiate from the competition when competitors can license exactly the same processor core?

Selecting a more aggressive technology node and providing better performance at lower cost is one option, but that advantage can disappear as soon as the competition also moves to the same node. Integrating a larger amount of Flash is another option, which is very efficient if the product is designed on a technology that keeps the pricing low enough.

If the chip maker has designed on an aggressive technology node for higher performance and offers a larger amount of Flash than the competition, that may be enough differentiation. Completing the picture with a smarter memory architecture, unencumbered by cache misses, interrupts, context swaps and other execution surprises that work against deterministic timing, brings strong differentiation.

If you want a more complete understanding of how Atmel has designed this smart memory architecture for the Cortex-M7, I encourage you to read the white paper from Jacko Wilbrink and Lionel Perdigon entitled “Run Blazingly Fast Algorithms with Cortex-M7 Tightly Coupled Memories.” (You will have to register.) The paper describes MCUs integrating SRAM organized into four banks that can be used as general SRAM and as TCM, showing one example of a Cortex-M7 MCU as implemented in the Atmel | SMART SAM S70, SAM E70 and SAM V70/V71 families.


This post has been republished with permission from SemiWiki.com, where Eric Esteve is a principal blogger, as well as one of the four founding members of the site. This blog was originally shared on August 6, 2015.

4 designs tips for AVB in-car infotainment


AVB is clearly the choice of several automotive OEMs, says Gordon Bechtel, CTO, Media Systems, Harman Connected Services.


Audio Video Bridging (AVB) is a well-established standard for in-car infotainment, and there is a significant amount of activity for specifying and developing AVB solutions in automobiles. The primary use case for AVB is interconnecting all devices in a vehicle’s infotainment system. That includes the head unit, rear-seat entertainment systems, telematics unit, amplifier, central audio processor, as well as rear-, side- and front-view cameras.

The fact that these units are all interconnected with a common, standards-based technology that is certified by an independent market group — AVnu — is a brand new step for the automotive OEMs. The AVnu Alliance facilitates a certified networking ecosystem for AVB products built into the Ethernet networking standard.

Figure 1 - AVB is an established technology for in-car infotainment

According to Gordon Bechtel, CTO, Media Systems, Harman Connected Services, AVB is clearly the choice of several automotive OEMs. His group at Harman develops core AVB stacks that can be ported into car infotainment products. Bechtel says that AVB is a big area of focus for Harman.

AVB Design Considerations

Harman Connected Services uses Atmel's SAM V71 microcontrollers as communications co-processors, working on the same circuit board as larger Linux-based application processors. The software firm writes code for the customized reference platforms that automotive OEMs need to go beyond the common reference platforms.

Based on his experience of automotive infotainment systems, Bechtel has outlined the following AVB design dos and don’ts for the automotive products:

1. Sub-microsecond accuracy: Every AVB element on the network is locked to the same accurate clock, and the Ethernet hardware should provide timestamping to ensure packets arrive in the right order. Here, Bechtel mentioned the Atmel | SMART SAM V71 MCU, which boasts screening registers for advanced hardware filtering of inbound packets, routing them to the correct receive-end queues.

2. Low latency: There is a lot of data involved in AVB, both in terms of bit rate and packet rate. AVB allows low latency through reservations for traffic, which in turn, facilitate faster packet transfer for higher priority data. Design engineers should carefully shape the data to avoid packet bottlenecks as well as data overflow.

Figure 2 - Bechtel

Bechtel once more pointed to Atmel’s SAM V71 microcontrollers that provide two priority queues with credit-based shaper (CBS) support that allows the hardware-based traffic shaping compliant with 802.1Qav (FQTSS) specifications for AVB.

3. 1588 timestamp unit: Hardware IEEE 1588 timestamping is needed for correct and accurate IEEE 802.1AS (gPTP) support, which AVB requires for precision clock synchronization. IEEE 802.1AS carries out time synchronization and is synonymous with the generalized Precision Time Protocol, or gPTP.

A timestamp compare unit and a large number of precision timer/counters are key to the synchronization AVB needs for listener presentation times and talker transmission rates, as well as for media clock recovery.

4. Tightly coupled memory (TCM): A configurable, high-performance memory system that allows zero-wait-state CPU access to data and instruction memory blocks. Careful use of TCM enables much more efficient data transfer, which is especially important for AVB Class A streams.

It’s worth noting that MCUs based on ARM Cortex-M7 architecture have added the TCM capability for fast and deterministic code execution. TCM is a key enabler in running audio and video streams in a controlled and timely manner.

AVB and Cortex-M7 MCUs

The Cortex-M7 is a high-performance core with almost double the power efficiency of the older Cortex-M4. It features a six-stage superscalar pipeline with branch prediction, while the M4 has a three-stage pipeline. Bechtel of Harman acknowledged that the M7's features equate to more highly optimized code execution, which is important for Class A audio implementations with lower power consumption.

Again, Bechtel referred to the SAM V71 MCUs — which are based on the Cortex-M7 architecture — as particularly well suited for the smaller ECUs. “Rear-view cameras and power amplifiers are good examples where the V71 microcontroller would be a good fit,” he said. “Moreover, the V71 MCUs can meet the quick startup requirements needed by automotive OEMs.”

Figure 3 - Atmel's V71 is an M7 chip for Ethernet AVB networking and audio processing

The infotainment connectivity is based on Ethernet, and most of the time the main processor does not integrate Ethernet AVB, so M7 microcontrollers like the V71 bring this feature alongside the main processor. In the head unit, the MCU drives the faceplate; in the telematics control unit, which contains the modem for making calls, echo cancellation is a must, and that requires DSP capability.

Take the audio amplifier, for instance, which receives a specific audio format that has to be converted, filtered and modulated to match the requirement for each specific speaker in the car. This means infotainment system designers will need both Ethernet and DSP capability at the same time, which Cortex-M7 based chips like V71 provide at low power and low cost.

This 1971 video shows one of the earliest self-driving cars


“Look, no hands!” While it may be hard to believe, this driverless car is from 1971. 


Though autonomous vehicles may be all the rage as of late, the idea isn't all that new. Just take a look at this video from 1971, which is among a series of newly released archive footage from the Associated Press and British Movietone, showing a mysterious driverless car being studied at Britain's Road Research Laboratory.

The commentator introducing the futuristic technology claims the automobile is “the shape of things to come in highway travel,” and speculates that it will be part of everyday use by the year 2000.

According to the video, the system consisted of “computerized electronic impulses that are relayed to the car through a special receiving unit fixed to the front. Signals picked up from the inlaid track were interpreted by the unit to change the car’s course or its speed.” The narrator goes on to compare it to the autopilot system used in planes.

Impressively, the researchers at the lab developed the self-driving car without most of the technology readily accessible to automakers today. And while they may have been 15 or so years off in terms of their timeline, the prediction was pretty darn accurate. Today, autonomous vehicles are being trialed at a 32-acre test facility at the University of Michigan, while Google has already been experimenting with cars of its own in California.