
Develop secure IoT apps with the Atmel Certified-ID platform


The Atmel Certified-ID security platform prevents unauthorized reconfiguration of an edge node to access protected resources on the network.


Atmel has announced a comprehensive security platform that enables businesses of all sizes to assign certified and trusted identities to devices joining the secure Internet of Things. The Atmel Certified-ID security platform prevents unauthorized reconfiguration of an edge node to access protected resources on the network. This new platform is available on the Atmel SmartConnect Wi-Fi, Bluetooth, Bluetooth Smart and ZigBee solutions that connect directly to Atmel Cloud Partners, providing a secure turnkey solution for IoT edge node-to-cloud connection.


The Atmel Certified-ID platform delivers a distributed key provisioning solution, leveraging the internal key generation capabilities of the ATECC508A CryptoAuthentication device without incurring large-scale infrastructure and logistics costs. The platform also allows developers to assign certified and trusted identities to any device before it joins an IoT network.

With billions of devices anticipated by 2020 in the rapidly growing IoT market, security is a critical element in ensuring devices can safely and conveniently access protected assets through the Internet. Today, secure identities are commonly created through a centralized approach in which IoT device keys and certificates are generated offline and managed in secure databases inside Hardware Security Modules (HSMs) that protect the keys. The keys are then programmed into the IoT devices by connecting the HSM to automation equipment during device manufacturing. This approach is practical for large deployments consisting of millions of devices, but it entails significant upfront infrastructure and logistics costs that must be amortized over a large number of devices to be cost effective.

By utilizing the unique internal key generation capabilities of the ATECC508A device, the recently unveiled platform enables decentralized secure key generation, making way for distributed IoT device provisioning at any scale. This method eliminates the upfront provisioning-infrastructure costs that can pose a significant barrier to smaller-scale deployments. On top of that, developers will be able to create secure IoT devices that are compatible with partner cloud services and can securely join their ecosystems.
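As a rough illustration of this distributed model, the sketch below mimics the flow in plain Python: the device generates its key internally, and a signer binds the resulting public key to a device identity. All names here are hypothetical, and HMAC-SHA256 stands in for the ECDSA P-256 key pairs and certificates the real ATECC508A workflow uses.

```python
import hashlib
import hmac
import secrets

SIGNER_SECRET = secrets.token_bytes(32)   # stand-in for the signer CA's private key

def device_generate_identity():
    """The device creates its key internally; only the public part ever leaves."""
    private_key = secrets.token_bytes(32)              # never exported on the real chip
    public_key = hashlib.sha256(private_key).digest()  # stand-in for ECC public-key derivation
    return private_key, public_key

def signer_issue_certificate(public_key, device_id):
    """The signer binds the device's public key to its identity."""
    tbs = device_id + public_key                       # "to-be-signed" certificate body
    return hmac.new(SIGNER_SECRET, tbs, hashlib.sha256).digest()

def verify_certificate(public_key, device_id, cert):
    expected = hmac.new(SIGNER_SECRET, device_id + public_key, hashlib.sha256).digest()
    return hmac.compare_digest(cert, expected)

_, pub = device_generate_identity()
cert = signer_issue_certificate(pub, b"node-0001")
assert verify_certificate(pub, b"node-0001", cert)       # identity accepted
assert not verify_certificate(pub, b"node-0002", cert)   # wrong identity rejected
```

The point of the structure, as in the real platform, is that no central database of private keys ever exists: each device's secret is born inside the device, and only public material crosses the provisioning boundary.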

Atmel is currently working with several cloud service companies, including Proximetry and Exosite, on the Certified-ID platform. These collaborations will give developers a wide range of ecosystem partners to choose from for a secure connection between the edge nodes and the IoT. Other partners will be announced as they are integrated in the Certified-ID platform.

“As a leader in the security space with a track record of over two decades, enabling secure networks of all sizes is our mission,” said Nuri Dagdeviren, Atmel Vice President and General Manager of Secure Products Group. “Streamlining secure processes and simplifying deployment of real world secure networks will be key to unlocking the potential and enabling rapid growth of IoT. We will continue delivering industry-leading solutions in security, a critical element in enabling billions of ‘things’ to be connected to the cloud.”


Atmel now offers security provisioning tool kits that enable independent provisioning for pilot programs or production runs when used in conjunction with ATECC508A ICs. These devices come pre-provisioned with internally generated unique keys and associated certificates, and are certification-ready for authentication as soon as they are connected to an IoT ecosystem.

Developers will need two kits to securely provision their gadgets: the AT88CKECCROOT tool kit, a ‘master template’ that creates and manages certificate root of trust in any ecosystem, and the AT88CKECCSIGNER tool kit, a production kit that enables partners to provision IoT devices.

The AT88CKECCSIGNER kit lets designers and manufacturers generate tamper-resistant keys and security certificates for IoT applications that require hardware security. These keys provide the level of trust demanded by network operators and allow system design houses to provision prototypes in-house, saving designers overall investment costs.

The tool kits also include an easy-to-use graphical user interface that allows anyone to seamlessly provision their IoT devices with secure keys and certificates, no special expertise required. With distributed provisioning, developers are not required to use an expensive HSM for key management or pay certificate acquisition fees.

In addition to secure IoT provisioning, the new Certified-ID platform provides high-quality random number generation to guarantee a diverse set of public and private keys. It delivers solutions to a variety of IoT security needs including node anti-cloning protection, data confidentiality, secure boot, and secure firmware upgrades over-the-air. The tamper resistance built into the ATECC508A device continues to provide the desired protection even when the device is under physical attack.
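To see why the random-number quality mentioned above matters in software terms: key material must come from a cryptographically secure source. Python's `secrets` module (the software analogue of the hardware RNG in a device like the ATECC508A) draws from the OS CSPRNG, whereas the deterministic `random` module is unsuitable because its state can be recovered from its outputs.

```python
import secrets

# A 256-bit key drawn from the OS CSPRNG. Never use the `random` module
# for key material: it is a deterministic PRNG, not a secure source.
key = secrets.token_bytes(32)
assert len(key) == 32
assert secrets.token_bytes(32) != secrets.token_bytes(32)   # independent keys do not collide
```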

Ready for the Internet of Trusted Things? Both the Atmel AT88CKECCROOT and AT88CKECCSIGNER are available today.

Atmel implements Intel EPID technology on all SmartConnect wireless solutions


Atmel is collaborating with Intel on EPID technology to enable more secure IoT applications.


Atmel is working with Intel to bring more secure Internet of Things applications to market. In this collaboration, Atmel will support Intel Enhanced Privacy ID (Intel EPID) technology on all Atmel SmartConnect wireless solutions to improve secure cloud provisioning — the mutual authentication of the IoT node with the cloud — in the rapidly growing IoT market, where devices are becoming increasingly connected.


With tens of billions of devices anticipated by 2020, security is surely one of the most critical components in enabling a seamless connection between the edge node and the cloud. To accomplish this, Atmel offers a complete portfolio of IoT solutions that combines Atmel | SMART MCUs with SmartConnect wireless technologies, including Wi-Fi, 802.15.4 and Bluetooth, and other secure products. This newly-announced effort will give developers implementing these wireless solutions the option to use the trusted Intel EPID identification standard in their next gizmo or gadget.

“Implementing Intel EPID offers IoT designers a truly seamless edge-to-cloud Internet of Things platform with proven security options available with our broad Internet of Things portfolio,” said Kaivan Karimi, Atmel’s Vice President and General Manager of Wireless Solutions. “With this new technology, Atmel’s SmartConnect wireless and IoT solutions now support Intel EPID, a security technology that has been proven over the last 5 years.”


For those who may not know, Intel EPID is an ISO standard for identity and privacy that has been shipping in Intel platforms since 2011. The technology delivers a hardware root of trust and is PKI compatible. With Intel EPID, devices can be identified and a secure communication channel can be established between them. Additionally, group membership can be determined without revealing the identity of the specific platform, adding another level of security. Intel EPID can dynamically assign and revoke group memberships for individual devices. Even more, this technology meets the latest protected key delivery requirements for content and data protection protocols.

“With the rapidly growing IoT ecosystem, security is key, and Intel EPID is a proven secure technology that can provide the billions of devices in this new market with a common security foundation. By implementing Intel EPID technology, Atmel is enabling a more secure, seamless IoT platform,” explained Lori Wigle, Intel’s General Manager of IoT Security.

IT cloud vs. IoT cloud


Kaivan Karimi, Atmel VP and GM of Wireless Solutions, shares the top 10 factors to consider when transitioning from IT cloud to IoT cloud.


In mid-2013, the buzz phrase "Internet of Things," also known as the "IoT," set the technology world on fire. As a result of this craze, a lot of products that were developed for completely different end applications changed all their marketing collateral overnight to become IoT products. We saw companies that added the acronym "IoT" to the title of every executive and gadgets that became part of an IoT enablement ecosystem. New tradeshows claimed their authoritative position on IoT, and angel investors and venture capitalists started IoT funds feeding incredible ideas, some of which reminded me of the late-1990s bubble when Lemonade.com was funded. New standards bodies were formed around provisioning IoT devices, and all of a sudden, overnight, most of us in the technology community became IoT experts.


Cloud companies are no exception. While the physical infrastructure of the cloud didn't change, the platform and software services developed for enterprise IT management and mobile app support were rebranded as IoT PaaS and SaaS platforms with claims of "IoT compliance." By late 2013, at an IoT event in Barcelona, every keynote talked about the "metaphorical pyramid" of Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS), and almost every one went further to "Everything as a Service (EaaS)," thanks to IoT.

With so much hype and noise, it is hard to separate fact from fiction unless you dig deep, really deep. This fuzziness is caused by the breadth of IoT and the many vertical markets it encompasses, covering all aspects of life as we know it. Each vertical has its own unique "things," so one size doesn't fit all from a device perspective; different types of standards and transport layers, with silicon and software infrastructure to support them, are required across this vast frontier. What has further muddied the water is that many large industry players see IoT as an inflection point through which they can transform themselves into something else and enter other businesses. Because of this, they look at their current assets and define the infrastructure required for IoT differently from what logically and technically makes sense. Companies with no play in data-center hardware or software publicly promote that the majority of the data processing should be done in other parts of the network ("closer to the source"), while others promote just the opposite, and a third group advocates that much of the processing should be done directly by a hierarchy of smart gateway boxes on the customer premises, along with everything in between. The same goes for the choice of RF communications protocols, gateways, definitions of things, provisioning schemes, and so on.

A great example of what gets heavily promoted by one of the biggest industry players is calling IoT an "always ON revolution" in which sensor data collected at the edge/sensing nodes (the thing side) is ALWAYS sent to the cloud. This method requires a lot of bandwidth and storage capacity to collect data in the cloud, and conveniently promotes their passive big-data analytics capabilities for processing that volume of data. Clearly they sell hammers here, and see everything in the world as a nail. In reality, IoT is a "mostly OFF revolution," with significantly less data created than portrayed, and little of that data will ever make it to the cloud. For instance:

  1. A door or a key lock is mostly sleeping, until a sensor triggers a wake-up command during an opening or proximity event, in which case it communicates a few bytes of data to a gateway and then goes back to sleep.
  2. The temperature sensors on a bridge wake up every so often to report temperature fluctuations to the gateway on the side of the road and, if the bridge is frozen, tell the department of transportation to send the sand trucks to avoid accidents.
  3. The seismic sensors on the A/C unit of an office building in Texas monitor the sound of the motor every two hours. If the motor sounds as if it will break down in a couple of weeks, the sensors inform the building manager to call a technician to fix what is going bad, so that the occupants will not be stuck without air conditioning in the middle of July.
  4. The ethylene gas sensors (ethylene being the ripening phytohormone of fruits and plants) on fruit containers in the back of an eighteen-wheeler wake up every 30 minutes and send data to the gateway in the cabin of the truck. These signals predict the decay rate of the fruit and allow the driver to divert to a nearby city if needed, giving the fruit some additional shelf life, or to send the fruit straight to the jam factory, avoiding the fuel wasted carrying a bad cargo.

In each of the aforementioned cases, and in other examples like them, the things (fruit container, A/C unit, bridge, home door, etc.) spend the majority of their time sleeping and only wake up based on an event trigger or a predetermined wake-up time set by programmed policy. This is the only way these devices can operate on batteries for years. How many bytes (not Mbps or even kbps) of data are really required to report those events? Would all of these events even be worth sending to the cloud? In fact, the local event processing and analytics engine running on the local gateway determines what goes to the cloud: only the exception events (door is open, fruit is going bad, motor is about to break down, bridge is frozen, etc.) go to the cloud right away. As long as everything is normal (events within the policy range), readings get registered at predetermined intervals (e.g., once every 24 hours) and the metadata gets uploaded to the cloud. Even if video capture were involved, no more than 2 Mbps of bandwidth is needed.
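The duty-cycled, exception-driven behavior described above can be sketched as a tiny gateway-side policy filter. The threshold and interval below are illustrative assumptions, not values from the article.

```python
TEMP_FREEZE_C = 0.0                      # illustrative policy threshold
HEARTBEAT_INTERVAL_S = 24 * 60 * 60      # routine metadata upload once a day

class BridgeSensorGateway:
    """Gateway-side filter: exception events go to the cloud immediately;
    in-policy readings are batched into the periodic metadata upload."""

    def __init__(self):
        self.last_heartbeat = 0.0
        self.pending = []                 # in-policy readings awaiting upload

    def on_reading(self, temp_c, now):
        if temp_c <= TEMP_FREEZE_C:
            return "cloud:exception"      # bridge frozen: alert right away
        self.pending.append(temp_c)
        if now - self.last_heartbeat >= HEARTBEAT_INTERVAL_S:
            self.last_heartbeat = now
            self.pending.clear()
            return "cloud:metadata"       # once-a-day summary of normal data
        return "local:buffered"           # stays on the gateway for now

gw = BridgeSensorGateway()
assert gw.on_reading(12.5, now=0.0) == "local:buffered"
assert gw.on_reading(-2.0, now=600.0) == "cloud:exception"
```

Only the exception path touches the cloud immediately; everything else is a few bytes buffered locally, which is what keeps the bandwidth figures in the following paragraph so small.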

Based on my experience analyzing multiple large enterprise campuses with many buildings, an aggregate of at most 15 Mbps of bandwidth is required to fully support this type of IoT communication to the cloud for provisioning services, excluding video. So one should question the folks who promote the fallacy that all types of applications and things will always be ON and that lots of bandwidth will be needed. What's in it for them to portray IoT in this manner? Of course, if you are considering an enterprise campus full of smart devices with people moving massive amounts of data through "chatty and persistent communication agents," then you will need a lot more than 15 Mbps of connectivity to the cloud, for sure. Could it be these folks are confusing an IT infrastructure with an IoT infrastructure?

For a comprehensive IoT implementation, a system-level approach is required, covering everything from the tiniest edge/sensing nodes (things), through various types of gateways, all the way to the cloud, data centers, applications and service providers. This includes data analytics engines embedded both on premises and in the cloud, with a variety of SDKs and communication agents, plus data caching and bandwidth management at different layers and levels of the hierarchy. There aren't many companies in the world that cover all of these items (single digits), and even those still require partnerships with the gadget/thing-side companies. Therefore, when someone claims to be a one-stop shop, either they support an existing thing-to-cloud infrastructure with a new twist added (a subset of most IoT verticals), or their system is not as comprehensive as they claim, or some combination of both.

Not to mention, at this moment we are dealing exclusively with siloed clouds and siloed IoT systems. While an ecosystem of clouds (a cloud of clouds) is in a nascent stage at some companies, it is far from the true IoT cloud ecosystem it will become in the near future.

The IT cloud ecosystem (as opposed to the IoT cloud ecosystem) has been on a journey of its own in the past few years. It has shown signs of success as originally predicted, with the technology distributed to provide a virtually seamless and infinite environment for communications, storage, computing, web and mobile services, analytics, and other business uses. The cloud benefit model has come to fruition: upfront CAPEX largely minimized or eliminated, increased flexibility and control to scale users, and the ability for organizations to add functionality on demand with the added pay-as-you-go benefit. Cloud providers have taken over the IT responsibilities of many organizations and have become vital business and channel partners.


That said, the fundamental question still remains: Is the traditional IT cloud and its ecosystem the same as an IoT cloud and its ecosystem?

The answer: While 60-70 percent is the same, a 30-40 percent difference can kill your IoT roll-out and make a seemingly IoT-ready cloud almost useless for your applications.

The differences are present throughout the full end-to-end system, from the "thing" side all the way to the data centers on the cloud side. The traditional IT, web or mobility-application cloud was built around much bigger devices with far more resources. Over the last couple of years, a "thing" for the traditional cloud system was a computer, a vending machine, a car, a gateway on the customer premises, or a smart device (laptop, tablet, smartphone, etc.). These devices are typically connected to the cloud via direct cellular links, cellular (WAN) + Wi-Fi (LAN), or fiber (WAN) + Wi-Fi (LAN). With the new generation of IoT "things," you find far more resource-constrained devices: small battery-operated sensors on doorways that keep track of people entering through the back gate of the house, battery-operated seismic sensors on roadway infrastructure (bridges, etc.), or any of the earlier examples. Instead of 20 smart devices in an office, plugged into the wall or recharging large batteries on a regular basis, you will be dealing with 500 different types of sensors and things covering that office. With multiple offices, that means thousands of things at once, most of them powered by batteries for years (4-5 years of battery life in consumer IoT, 8-12 years in industrial IoT). Some of these things have a small 8-bit MCU as their brain, with very little memory and few other resources, and may be hiding behind layers of gateways, relays, switches, or even other things, in sleepy networks. The communication link, when available (remember that these devices are mostly off), may have very little bandwidth, and communication may go through multiple hops in mesh networks. A "chatty" communication system that pings the things on a regular basis defeats the purpose here.

The important thing to remember is that a system needs to be fully extendable and scalable not just on the cloud side, but also on the link from the cloud to the things, and finally on the thing side. You also need scalable data capture and aggregation to go along with a secure communication system. If you are targeting a consumer application, a solid mobile application development platform that works with the popular smartphone operating systems is a basic requirement, meaning you need to rewrite your middleware to be more agile and scalable and to manage many more things simultaneously. You also need to rethink the communication topologies of the past. Lastly, pay more attention to your analytics engines and application development environment; depending on your IoT application, it may require completely different visualization tools and business models.

Here are some factors that an IT cloud provider transitioning to an IoT cloud provider needs to consider:

  • Understand the verticals you target; become a one-stop shop for a given vertical. In IoT, one size does not fit all. Understanding a vertical includes understanding its evolution and the future business models that need to be considered. For example, if you are targeting the tracking of people in a hospital and their location at any given time, that group will in the future require wearables with biometric sensors, and their vital statistics will also need to be monitored. The expectation will be that your service can also cover the tracking of biometric sensors, which are usually battery-operated, constrained devices with minimal bandwidth. Working with one PaaS or SaaS supplier to manage one set of assets on a premises and another cloud provider for a separate set of assets is not an option. The issues to consider include the protocols, networks, bandwidth management and transport technologies your IoT cloud framework will need to support.
  • A scalable data analytics and event-processing engine is a must-have, as the majority of IoT value creation comes from data analytics, and "data capital" is where the differentiation will come from. Do you have the right analytics engine on the cloud side as well as on the premises/gateways? New in-memory streaming technologies, which change the rate at which we can act on data, will be required for some IoT applications. Hence, traditional extraction, transformation and loading (ETL) will give way to just-in-time (JIT) methodologies (real-time versus batch-oriented). Can you manage fast/streaming data analytics for applications where extremely fast processing of (near) real-time data is required? Think of tele-health and elderly monitoring, where passive data analytics in the cloud is not adequate and fast local analytics running on the smart gateway is required to report a heart attack, or a fire in home automation. It is also imperative that you find a service provider for a given vertical (if you are not a service provider, partner with one) so that your event-processing and data analytics engines are tuned for specific use cases and business logic. If your analytics engine only provides insight into the visibility or availability of a limited set of parameters in the network, work with a partner that brings the rest.
  • Know the specific type of data you are required to monitor and gather, and the insight required by your customers. That means developing a diverse set of device data models for specific functionalities. Don't try to be the Swiss Army knife of IoT cloud providers; a Swiss Army knife can perform many functions, but it is not the best tool for any one of them. Understanding the verticals you need to support (item number 1) will also help here. For certain applications, before the data sets are processed by analytics and visualization tools, they are combined with external algorithmic classification and enrichment tools. This increases productivity and ease of use dramatically (e.g., the user knows where the water tables are before drilling a well, or what the maps of other distribution centers look like before redirecting a cargo).
  • Develop a fully modularized end-to-end system, as most large OEMs may already have their own branded cloud and only want to use part of the functionality you offer. Arm yourself with well-defined APIs and a firewall-friendly, adaptive connectivity architecture, and become comfortable working with your customers' infrastructure, analytics engines, applications, visualization tools, things, etc. They may only be interested in your communication system, or they may ask for a mix of capabilities. The more flexible your approach, the better you can customize your offerings to their needs. On the cloud side, the formation of the cloud ecosystem (cloud of clouds, with server-to-server communication) is right around the corner, and a robust ecosystem is at the heart of IoT cloud management.
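The shift from batch ETL to just-in-time streaming analytics raised in the list above can be illustrated with a minimal sliding-window detector of the kind a smart gateway might run locally. The window size, warm-up length and threshold below are arbitrary illustrative choices.

```python
from collections import deque

class StreamingAnomalyDetector:
    """Flag a reading as anomalous when it deviates from the recent window
    mean by more than `threshold` times the window's mean absolute deviation.
    Constants are illustrative, not from the article."""

    def __init__(self, window=20, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def push(self, x):
        anomalous = False
        if len(self.values) >= 5:                    # need a little history first
            mean = sum(self.values) / len(self.values)
            mad = sum(abs(v - mean) for v in self.values) / len(self.values)
            anomalous = mad > 0 and abs(x - mean) > self.threshold * mad
        self.values.append(x)
        return anomalous

det = StreamingAnomalyDetector()
readings = [70, 71, 69, 70, 72, 70, 71, 69, 70, 250]  # e.g. motor vibration levels
flags = [det.push(r) for r in readings]
assert flags[-1] and not any(flags[:-1])              # only the spike is flagged
```

Each reading is acted on as it arrives, with no round trip to a cloud batch job, which is exactly the real-time versus batch-oriented distinction the list item draws.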

A modularized system as described above may mean a different tiered pricing approach to your business model. Flexibility needs to extend beyond your technology offerings, so be open to new business models.

  • Follow the new service delivery frameworks with large ecosystems, such as the Open Interconnect Consortium (OIC). Standardization will eventually dominate both the consumer and industrial IoT space. While the alphabet soup of protocols may be expanding (e.g., MQTT, XMPP, DDS, AMQP, CoAP, RESTful HTTP), standardization is also happening and providing more clarity. Standards are being developed so there are "horses for different courses." Get used to the idea that your proprietary system of today will require an upgrade to a standard system tomorrow, or your ecosystem will leave you behind. How would you change your system today with that knowledge in hand?
  • Develop RF communication specialization (cellular, Wi-Fi, BLE, 802.15.4/ZigBee, 6LoWPAN, sub-GHz, SIGFOX, etc.), or partner with someone who has that expertise. A lot of IT cloud companies today have a big gap here and need a partner to optimize their cloud for such complex RF communication protocols. They also need to optimize their systems for the type of RF links and bandwidth limitations they will be using. This affects the application development side as well, since such customization is essential for IoT: what works for cellular might not work for Wi-Fi, BLE or ZigBee. It is especially important for target vertical markets, as different verticals might need different RF communication protocols, or even multiple ones simultaneously, with all the coexistence issues one may encounter. A semiconductor partner who understands your IoT cloud requirements can help you optimize your system from an RF communications and bandwidth management perspective.
  • Whether you use an SDK or an agent-based mechanism, implement a lightweight communication system. Typical SDKs make the development and management of mobile apps easy, but remember that your smartphone has far more resources than a tiny resource-constrained sensor feeding data into an IoT system. A lightweight SDK or agent-based system is much more predictable and simpler to integrate into low-memory or battery-operated devices. Lightweight agents reduce device complexity and cost, and capabilities can be added incrementally depending on where an agent resides in the system. Obviously, the more bells and whistles you add on the thing side (the number of statistics to track or alarm states), the larger the footprint of your SDK or agent. As you move up to gateway levels of the hierarchy, with more types of mechanisms, functionalities, sensors, communications and alarms to monitor, the size of your agent or SDK will grow. One size will not fit all, but be frugal with your application and data management. Working with various IoT cloud ecosystem partners, I have seen SDK and agent sizes varying from 3 KB to 150 KB of memory footprint. The IoT cloud journey has already started, and I have no doubt the higher end of that spectrum (and some of the intermediate steps) will shrink in the near future, while caching mechanisms become more robust.

Also, deploy a context-centric bandwidth management system that won't hog the entire link with your management-plane activities. As a rule of thumb, intermediate proxy and caching functionality should not occupy more than 15% of the communication link.
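To make the frugal-payload point concrete, here is a hedged sketch of a compact binary sensor report using Python's `struct`. The field layout is an invented example, not any real agent's wire format; the point is that a complete reading fits in single-digit bytes rather than a verbose text document.

```python
import struct

# Hypothetical 9-byte sensor report:
#   uint16 node id | uint32 timestamp | int16 temperature (0.01 C units) | uint8 flags
REPORT_FMT = ">HIhB"   # big-endian, no padding: 2 + 4 + 2 + 1 = 9 bytes

def pack_report(node_id, timestamp, temp_c, flags):
    return struct.pack(REPORT_FMT, node_id, timestamp, round(temp_c * 100), flags)

def unpack_report(payload):
    node_id, timestamp, temp_raw, flags = struct.unpack(REPORT_FMT, payload)
    return node_id, timestamp, temp_raw / 100.0, flags

payload = pack_report(42, 1_700_000_000, -3.25, 0b0000_0001)
assert len(payload) == 9                                   # bytes on the wire, not kilobytes
assert unpack_report(payload) == (42, 1_700_000_000, -3.25, 1)
```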

  • Pay attention to "things," with a focus on ease of use. That means a way of provisioning a device so easy that even a novice thing developer can follow the steps and do it on their own, regardless of the transport technology or resources available. If it takes too long, is error-prone, or requires an army of your developers to port and customize/optimize your agent for a particular architecture, you will be reducing your target market to only the very large OEMs. If you assume you will do this work for service fees, it won't scale, and again you will only be targeting the large OEMs. If you partner with software services houses, you will scale better and gain additional bandwidth, at a cost, but this still reduces your market footprint to companies that can afford to pay for provisioning services. Why not make it easy right up front for maximum customer coverage? From the syntax of your APIs for things/sensors, to local gateways, cloud gateways, programming your agent logic, and communications and service APIs, focus on simplicity, ease of use, and the out-of-the-box experience for your customers and developers.
  • Pay attention to visualization tools and the user experience in all parts of the system. "Thing virtualization and visualization" (including elegant, robust applications that turn device data models into comprehensible information in the cloud) is a great value proposition. If you are focusing on consumer IoT verticals where smartphones will play a prominent role, include a robust mobile app development environment. The IT cloud and the IoT cloud have different consumers of data, and elegant visualization features can set you apart from your competitors.
  • Last but not least, do you have a robust, hardened security and authentication mechanism that works with advanced encryption algorithms? Do you support both ECC and AES-128/256? How about PUF-based key generation? In IoT the stakes are very high, and you need to pay more attention to the security of the system, from the tiniest resource-constrained thing all the way to the cloud. Note that the security knowledge base among thing developers is low at the moment, and the cloud partner needs to bring some of the needed competence as well as enforce best practices. Basic elements on the thing side that need protection include secure boot, thing authentication, message encryption and integrity, and a trusted key management and storage scheme. A semiconductor partner who understands your IoT cloud requirements can help you optimize your system from a "thing" security perspective.
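As one concrete example of the secure-boot element listed above, the sketch below accepts a firmware image only if it matches an authenticated manifest. HMAC-SHA256 stands in for the ECDSA signature verification a hardware security device would perform, and all names here are illustrative assumptions.

```python
import hashlib
import hmac
import secrets

DEVICE_KEY = secrets.token_bytes(32)   # provisioned into protected key storage

def sign_manifest(firmware):
    """Build a manifest: the firmware digest plus an authentication tag over it."""
    digest = hashlib.sha256(firmware).digest()
    return digest + hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_and_boot(firmware, manifest):
    """Boot only if the manifest is authentic AND the image matches its digest."""
    digest, tag = manifest[:32], manifest[32:]
    good_tag = hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, good_tag):
        return False                                  # manifest not authentic
    return hmac.compare_digest(hashlib.sha256(firmware).digest(), digest)

fw = b"firmware image v1"
manifest = sign_manifest(fw)
assert verify_and_boot(fw, manifest)                  # genuine image boots
assert not verify_and_boot(fw + b"\x00", manifest)    # tampered image rejected
```

The same pattern (authenticate the metadata first, then check the payload against it) underlies secure over-the-air firmware upgrades as well.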

The transition from the IT cloud to the IoT cloud has already started, and as the IT cloud was a journey, the transformation to support IoT applications will also be a journey. What’s the best way to go about this change? Make this a comprehensive approach that will make your IoT cloud sustainable as the market transitions forward.

Libelium sensors connect with Microsoft Azure cloud platform


An integration of Libelium and Microsoft Azure demonstrates a complete industrial IoT solution.


Back at Mobile World Congress 2015, Internet of Things provider Libelium revealed a new Microsoft Azure Cloud integration with its Waspmote wireless sensors to speed time-to-market for smart cities and IoT projects with scalable cloud infrastructure.


This announcement couldn't come at a better time. According to a recent report from Gartner, 1.1 billion Internet-enabled items will be used by smart cities in 2015, with that number expected to rise to 9.7 billion over the next five years. Beyond that, McKinsey Global Institute forecasts the economic potential of the IoT to reach $2.7 trillion to $6.2 trillion annually by 2025.

Powered by Atmel's ATmega1281 MCU, Waspmote nodes are designed to be deployed by the thousands, connecting any sensor using any communication protocol to any cloud system. Sensor networks based on these nodes, along with Meshlium Internet gateways, power projects across the Industrial Internet, smart agriculture, energy monitoring and environmental control. Meanwhile, businesses use Microsoft Azure to build and manage applications and services through a global network of data centers.

For Libelium customers, Microsoft Azure will now provide a scalable infrastructure for data, virtual machines, servers and frontend applications. With sensor technology to measure energy use and monitor environmental conditions and water quality, businesses can reduce costs and increase productivity. A customer integration of Libelium and Microsoft Azure demonstrates a complete industrial IoT solution in a smart factory, from sensor integration on the factory floor to business processes and data visualization in real time.

“Interoperability is vital to our development partners and customers as sensor-based IoT projects deploy at scale,” explained Javier Martinez, Libelium VP of Business Development. “Our IoT ecosystem includes the best cloud platforms in the market, and we make it easy for our partners to derive business value from wireless sensor networks with Internet of Things and contextual data.”

Interested in learning more? Head over to the company’s official page here. While on the topic of cloud integration, discover how Atmel is partnering with best-in-class providers to accelerate IoT development.

Atmel to showcase smart and securely connected solutions at Embedded World 2015


Demonstrations to showcase Atmel | SMART and Atmel AVR MCUs and MPUs highlighted in a variety of technology zones.


In a matter of days, Atmel will be showcasing a number of smart and securely connected solutions that will power next-generation Internet of Things (IoT) applications at Embedded World 2015 held in Nuremberg, Germany, February 24-27. These demos will be available in the company’s booth located in Hall 4A / Booth 4-230.

To better illustrate Atmel’s broad portfolio of IoT solutions, the demonstrations will be highlighted in several technology zones.

AUTOMOTIVE: As a leader in local interconnect networking (LIN) and automotive touch, Atmel is enabling smart, connected vehicles.

Atmel’s automotive technology pod will showcase the company’s broad automotive product portfolio for car access systems, networking, drivers, Ethernet Audio/Video Bridging (AVB), and the future of human machine interface (HMI) in next-generation center consoles. By popular demand, Atmel will also be showcasing its next-generation AvantCar concept demo and a host of passive entry car access solutions using Atmel’s latest and highly secure products, including AES-encrypted 125kHz LF and RF technologies, along with its popular maXTouch and QTouch capacitive touch solutions. The Atmel | SMART SAM V71 ARM Cortex-M7-based MCU, the world’s highest-performance Cortex-M-based Flash MCU, will also be highlighted in an automotive application, along with an automotive touch application powered by Atmel’s recently launched touch controller solution. A demonstration running Audio Weaver from DSP Concepts on the SAM V71 will also be exhibited in this zone.

INDUSTRIAL: Atmel provides leading-edge MCU- and MPU-based solutions for the smart, industrial market.

In the industrial technology pod, Atmel will showcase a variety of smart, secure and connected solutions for the industrial market powered by Atmel | SMART solutions, including an Ultra home automation and smart fridge application running on the SAMA5D4 Xplained, and Atmel | SMART ARM Cortex-A5 processor-based boards displaying HDMI video. Other industrial applications on display include power supply temperature monitoring and cooling using an Atmel temperature sensor, and a treadmill application featuring an Atmel | SMART SAMA5D4.

SMART LIVING: As a leading provider of smart and securely connected solutions, this technology zone showcases next-generation applications of modern living.

Highlighting the latest innovations for your living room, the Smart Living technology zone will feature a number of applications ranging from a low-power Bluetooth beacon and a digital temperature sensor to a ZigBee-based smart lighting application with cryptographic security (ATSHA204), and a secure IoT camera system featuring Atmel’s newly announced elliptic curve network security chip, the ATECC508A. See Atmel’s recently launched SIGFOX IoT solution, powered by Atmel’s ATA8520, communicating to the cloud while transmitting metering values, alarm signals and more. The company will also be showcasing the Atmel SmartConnect family, leveraging ultra-low-power secure, wireless connectivity. A number of applications will be demoed, including a weight scale, a doorbell with camera, a Wi-Fi connected speaker, window motion sensors, a smart plug, a light bulb and a gateway connected via ZigBee technologies—all controllable through a smart mobile device. A QTouch-based water level sensing application showcasing advanced HMI and sensing capability will also be exhibited, along with a display demonstrating the world’s lowest power capacitive touch surface. Other demonstrations powered by Atmel’s maXTouch technologies and Atmel AVR MCU solutions, showcasing ultra-low power smart, connected devices, will be available in this zone.

CLOUD PARTNERS: Highlighting cloud platform partner solutions.

IoT requires a system-level solution encompassing everything from the smallest edge/sensing node devices to the cloud. The company has partnered with best-in-class cloud providers that can support a variety of applications for both Tier-1 OEMs and smaller companies. Atmel has integrated these partners’ technology into the company’s cloud solutions framework, adding cloud platform functionality seamlessly to all of Atmel’s wireless MCU offerings, regardless of standards or transport technology. Come see cloud platform partner solutions from companies like PubNub, Proximetry and Arrayent that are available on Atmel wireless MCUs today.

POWERED BY ATMEL: Showcasing the latest gadgets and devices powered by Atmel technologies.

Highlighting the latest smartphones, tablets and wearables available today, everything from a wireless drive and narrative life logging camera to record your every step, to fitness bands, to Atmel’s latest MCU and touch technologies, will be on display. See ‘wear’ the market is headed next!

MAKERS: From Maker space to market place, this technology pod highlights Atmel enabling unlimited possibilities.

The Maker space showcases the well-received Arduino Wi-Fi Shield, which enables rapid prototyping of Internet of Things (IoT) applications on the Arduino platform and will be featured to highlight its simplicity for the professional and Maker communities. The company will also display a number of Maker demonstrations, including a remote-controlled Maker robot powered by the Atmel | SMART SAM D21. “Mr. Abot” is controlled through an Android app, with communications driven through Atmel’s recently announced WINC1500 Wi-Fi solution.

Additionally, Atmel’s resident security expert Kerry Maletsky will be presenting “Making IoT a Reality – Leveraging Hardware Security Devices” on February 25 from 12-12:30 pm CET (Session 09/I).

And for those of you waiting to see the one-and-only AVR Man, you’re in luck. The embedded community’s favorite superhero will be in attendance!

What’s ahead this year in digital insecurity?


Here’s a closer look at the top 10 cyber security predictions for 2015.


In 2014 worries about security went from a simple “meh” to “WTF!” Not only did high-profile attacks get sensational media coverage, but those incidents led to a pivotal judicial ruling that corporations can be sued for data breaches. And as hard as it is to believe, 2015 will only get worse because attack surfaces are expanding as mobile BYOD policies overtake enterprises, cloud services spread, and a growing number of IoT networks get rolled out. Add m-commerce, e-banking, and mobile payments to the questionable tradition of lax credit card security infrastructure in the U.S. and you get a perfect storm for cybercrime.

In fact, 92% of attacks across all segments come from nine basic sources, according to Verizon. More numerous and sophisticated cyber crimes are anticipated for this year and beyond.

1. More companies to get “Sony’d”

2014 saw the release of highly evolved threats from criminals that in the past came only from governments, electronic armies and defense firms. A wide range of targets included organizations in the retail, entertainment, finance, healthcare, industrial and military sectors, among countless others. As a repeat offender, Sony is now the cyber-victim poster child, and “Sony’d” has become a verb meaning digital security incompetence. Perhaps Sony’s motto should be changed from “make.believe.” to “make.believe.security.” Just saying!

Prior to 2014, companies tended to simply deny cyber vulnerabilities wholesale. However, a string of high-profile data breaches — such as Sony, Heartbleed, Poodle, Shellshock, Russian Cyber-vor, Home Depot, Target, PF Chang’s, eBay, etc. — has changed all of that. Denial is dead, but confusion about what to do is rampant.

2. Embedded insecurity rising

Computing naturally segregates into embedded systems and humans sitting in front of screens. Embedded systems are processor-based subsystems that are “embedded” into other machines or bigger systems. Examples are routers, industrial controls, avionics, automotive engine and in-cabin systems, medical diagnostics, white goods, consumer electronics, smart weapons, and countless others. Embedded security was not a big deal until the IoT emerged, which will lead to billions of smart, communicating nodes. Forecasts call for 15 to more than 20 billion IoT nodes by 2020, which will create a gigantic attack platform and make security paramount.

A recent study by HP revealed that 70% of interconnected (IoT) devices have serious vulnerabilities to attacks. The devices they investigated consisted of “things” like cloud-connected TVs, smart thermostats and electronic door locks.

“The current state of Internet of Things security seems to take all the vulnerabilities from existing spaces — network security, application security, mobile security and Internet-connected devices — and combine them into a new, even more insecure space, which is troubling,” HP’s Daniel Miessler stated.

Issues HP identified ranged from weak passwords, to lack of encryption, to poor interfaces, to troubling firmware, to unencrypted updating protocols. Other notable findings included:

  • 60% of devices were subject to weak credentials
  • 90% collected personal data
  • 80% did not use passwords or used very weak passwords
  • 70% of cloud connected mobile devices allowed access to user accounts
  • 70% of devices were unencrypted

Investigators at the Black Hat Conference demonstrated serious security flaws in home automation systems. At DEFCON, investigators hacked NFC-based payment systems, showing that passwords and account data were vulnerable. They also revealed that the doors of a Tesla car could be hacked to open while in motion. Nice! Other exploits were demonstrated against smart TVs, Boxee TV devices, smartphone biometric systems, routers, IP cameras, smart meters, healthcare devices, SCADA (supervisory control and data acquisition) devices, engine control units, and some wearables. Even simple USB firmware was proven to be highly vulnerable… “BadUSB.”

These are just the tip of the embedded insecurity iceberg. Under the surface is the entire Dark Net, which adds even more treacherousness. Security companies like Symantec have identified home automation as a likely early IoT attack point. That is not surprising, because home automation will be an early adopter of IoT technologies, after all. In-home appliances also represent an attractive attack surface as more firmware is contained in smart TVs, set top boxes, white goods, and routers that also communicate. Node-to-node connectivity security extends to industrial settings as well.

Tools like Shodan, which is the Google of embedded systems, make it very easy for hackers to get into the things in the IoT.  CNN recently called Shodan the scariest search engine on the Internet. You can see why since everything that is connected is now accessible. Clearly strong security, including hardware-based crypto elements, is paramount.

3. More storms from the cloud

It became clear in 2014 that cloud services such as iCloud, GoogleDrive, DropBox and others were rather large targets because they are replete with sensitive data (just ask Jennifer Lawrence). The cloud is starting to look like the technological Typhoid Mary that can spread viruses, malware, ransomware, rootkits, and other bad things around the world. As we know by now, the key to security is how well cryptographic keys are stored; Heartbleed taught us that. Expect new technologies and more secure approaches to maintaining and controlling cryptographic keys to accelerate in 2015 to address endemic cloud exposure. Look for more use of hardware-based key storage.

4. Cyber warfare breaks out

eBay, PF Chang’s, Home Depot, Sony, JP Morgan, and Target are well-known names on the cybercrime blotter, and things will only get worse as cyber armies go on the attack. North Korea’s special cyber units, the Syrian Electronic Army, the Iranian Cyber Army (ICA), and Unit 61398 of the People’s Liberation Army of China are high-profile examples of cyber-armies hostile to Western interests. Every country now seems to have cyber-army units to conduct asymmetric warfare. (These groups are even adopting logos, with eagles appearing to be a very popular motif.)

Cyber warfare is attractive because government-built malware is cheap, accessible, and covert, and thus highly efficient. Researchers estimate that 87% of cyber-attacks on companies are state-affiliated, 11% come from organized crime, 1% from competitors, and another 1% from former employees. Long story short, cyber war is real and it has already been waged against non-state commercial actors such as Sony. It won’t stop there.

5. Cybercrime mobilizes

According to security researchers, mobile will become an increasingly attractive target for hackers. Fifteen million mobile devices are infected with malware, according to a report by Alcatel-Lucent’s Kindsight Security Labs. Malvertising is rampant on untrusted app stores, and ransomware is being attached to virtual currencies. Easily acquired malware generation kits and source code make it extremely easy to target mobile devices. Malicious apps take advantage of the WebKit plugin and gain control over application data, handing credentials, bank account and email details over to hackers. What’s more, online banking malware is also spreading: 2014 presented ZeuS, which stole data, and VAWTRAK, which hit online banking customers in Japan.

Even the two-factor authentication measures that banks employ have recently been breached using schemes such as Operation Emmental. Emmental is the real name of Swiss cheese, which of course is full of holes, just like the banking systems’ security mechanisms. Emmental uses fake mobile apps and Domain Name System (DNS) changers to launch mobile phishing attacks that get at online banking accounts and steal identities. Some researchers believe that cybercriminals will increasingly use such sophisticated attacks to run illegal equity front-running and short-selling scams.

6. Growing electronic payments tantalize attackers

Apple Pay could be a land mine just waiting to explode due to NFC’s susceptibility to hacking. Google Wallet is an example of what can happen when a malicious app is granted NFC privileges making it capable of stealing account information and money. M-commerce schemes like WeChat could be another big potential target.

E-payments are growing, and with them so will attacks on mobile devices using schemes ranging from FakeID to Master Key. Master Key is an exploit kit, similar to the Blackhole exploit kit, that specifically targets mobile, while FakeID allows malicious apps to impersonate legitimate apps, gaining access to sensitive data without triggering suspicion.

7. Health records represent a cyber-crime gold mine

Electronic Health Records (EHR) are now mandatory in the U.S., and a vast amount of personal data is being collected and stored as never before. Because information is money, thieves will go where the information is (to paraphrase Willie Sutton). Health records are considered higher value in the hacking underground than stolen credit card data, and criminals in both the U.S. and UK now specialize in health record hacking. In fact, the U.S. Identity Theft Resource Center reported 720 major data breaches during 2014, with 42% of those involving health records.

8. Targeted attacks increase

Targeted attacks, also known as Advanced Persistent Threats (APTs), are very frightening due to their stealthy nature. The main differences between APTs and traditional cyber-attacks are target selection, silence, and duration of attack. According to APTnotes, the number of reported attacks went from 3 in 2010 to 14 in 2012 to 53 in 2014. APT targets are carefully selected, in contrast to traditional attacks that use any available corporate target. The goal is to get in quietly and stay unnoticed for long periods of time, as seen in the famous APT attack that victimized the networking company Nortel: Chinese spyware was present on Nortel’s systems for almost ten years without being detected, draining the company of valuable intellectual property and other information. Now that’s persistent!

9. Laws and regulations try to play catch up

A number of cybersecurity laws are being considered in the U.S., including the National Cybersecurity Protection Act of 2014, which advocates sharing cybersecurity information with the private sector and providing technical assistance and incident response to companies and federal agencies. Another to note is the Federal Information Security Modernization Act of 2014, designed to better protect federal agencies from cyber-attacks. A third is the Border Patrol Agent Pay Reform Act of 2013, intended to recruit and retain cyber professionals who are in high demand. Additionally, there is the Cybersecurity Workforce Assessment Act, which aims to enhance the readiness, capacity, training, recruitment, and retention of the cybersecurity workforce. President Obama has stated that he wants a 30-day deadline for breach notices and a revised “Consumer Privacy Bill of Rights.”

One of the more interesting and intelligent recommendations came from the FDA, which issued guidelines for wireless medical device security to ensure hackers cannot interfere with devices such as implanted pacemakers and defibrillators. This notion was in part stimulated by worry about Dick Cheney’s pacemaker being hacked; in fact, countermeasures were installed on the device by Cheney’s surgeon. More regulation of health data and equipment is expected in 2015.

“Security — or the lack of it — will largely determine the success or failure of widespread adoption of internet-connected devices,” the FTC Commissioner recently shared in an article. The FTC also released a report entitled, “Privacy & Security in a Connected World.”

10. Hardware-based security may change the game

According to respected market researcher Gartner, all roads to the digital future lead through security. At this point, who can really argue with that statement? Manufacturers and service providers are seeing the seriousness of cyber-danger and are starting to integrate security at every connectivity level. Crypto element integrated circuits with hardware-based key storage are starting to be employed for that. Furthermore, these crypto elements are a kind of silver bullet given that they easily and instantly add the strongest type of security possible (i.e. protected hardware-based key storage) to IoT endpoints and embedded systems. This is a powerful concept whose fundamental value is only starting to be recognized.

Crypto elements contain cryptographic engines that efficiently handle functions such as hashing (e.g., SHA), signing and verification (ECDSA), key agreement (e.g., ECDH), authentication (symmetric or asymmetric), encryption/decryption (e.g., AES and elliptic curve cryptography), and message authentication coding (MAC), among many others.

The hardware key storage plus crypto engine combination in a single device makes it simple, ultra-secure, tiny, and inexpensive to add robust security. Recent crypto element products offer ECDH for key agreement and ECDSA for authentication. Adding a device with both of these powerful capabilities to any system with a microprocessor that can run encryption algorithms (such as AES) brings all three pillars of security (confidentiality, data integrity and authentication) into play.
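To make the key-agreement idea concrete, here is a toy sketch of Diffie-Hellman in Python. Classic finite-field DH with a Mersenne prime stands in here for the ECDH that a device like the ATECC508A performs in hardware; the parameters are for demonstration only and are not a secure configuration:

```python
import hashlib
import secrets

# Toy public parameters (illustrative only -- real deployments use
# standardized groups or, as in ECDH, a named elliptic curve).
p = 2**127 - 1   # a Mersenne prime
g = 3

# Each side keeps a private scalar and publishes only g^x mod p.
device_priv = secrets.randbelow(p - 2) + 2
server_priv = secrets.randbelow(p - 2) + 2
device_pub = pow(g, device_priv, p)   # sent over the wire
server_pub = pow(g, server_priv, p)   # sent over the wire

# Both sides compute the same shared secret without ever transmitting it.
device_shared = pow(server_pub, device_priv, p)
server_shared = pow(device_pub, server_priv, p)
assert device_shared == server_shared

# Hash the shared secret down to a 256-bit session key for AES or MAC use.
session_key = hashlib.sha256(device_shared.to_bytes(16, "big")).digest()
```

Signing the exchanged public values with ECDSA (so each side knows whom it agreed with) and then encrypting traffic under `session_key` with AES delivers exactly the confidentiality, integrity and authentication trio described above.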

With security rising in significance as attack platforms increase in size and threats become more sophisticated, it is good to know that solutions are already available to ensure that digital systems are not only smart and connected, but robustly secured by hardware key storage. This could be one of the biggest stories in security going forward.

This installation makes it rain data from the cloud


Where is the cloud? What does it look like? And, what exactly is the big data that we store there?


Let’s face it, not a day goes by that you don’t hear words like “big data” and “the cloud.” These ambient terms have become immersed in our modern-day vocabulary, but in many ways these buzzphrases still remain distant and abstract. Jingwen Zhu, a Master’s student at NYU ITP, started asking herself those questions upon hearing various lecturers discuss how big data is affecting our lives. In an exploration of what big data in the cloud would actually look like if it were tangible, the Maker crafted her vision as an interactive data exhibition. The aptly dubbed Big Data Cloud obtains data from users, and gives that data back to them.

“In this installation, people are not only encouraged to interact with the cloud, but also to interact with the data,” Zhu explains. “In our daily life, we interact with big data every day. We provide our data to the cloud, and get data back from it. Yet this repeated occurrence falls into the background because we use big data so often that it goes unnoticed. By creating the Big Data Cloud, I provide people with a visible and tangible experience of interacting with big data, and let them rethink how big data affects our lives.”

How it works is simple: When a user comes under the cloud, a mobile device drops down with a question displayed on its screen. Once the user types an answer, the phone “uploads” the information back into the cloud. After some thunder and lightning, the cloud begins to “rain” just as it would in a summer night’s storm. However, the big data rain takes the form of a printed roll of paper bearing the users’ answers to the question. What’s more, the most frequently repeated words are also projected as “puddles” on the ground. Users can play with the projected raindrops, or read all the answers on the receipt.

To make this concept a reality, the Maker designed a 3D polygonal cloud of folded paper to enclose her device and suspended it from the ceiling. Embedded within the paper cloud are a stepper motor connected to an ATmega328 board, a projector and a thermal printer.

The stepper motor coils the phone up and down from the cloud; the mobile device itself is attached to a piece of fishing wire, which enables the phone to be drawn back up into the cloud. A Processing sketch using a Temboo Google Spreadsheets Choreo acquires the data the user entered, allowing the newly acquired answers to be both projected onto the ground at the user’s feet and printed from the thermal printer. Using the program, Zhu was able to create the visual effect of the cloud and count word frequency, before arranging and displaying the terms in different sizes. Meanwhile, an ultrasonic sensor within the printer detects when someone puts a hand above it, triggering the machine to print, with the amount printed determined by the distance.
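Zhu’s actual sketch was written in Processing, but the word-frequency step it performs can be approximated in a few lines of Python (the function names and size scaling here are illustrative, not taken from her code):

```python
import re
from collections import Counter

def word_frequencies(answers):
    """Tally how often each word appears across all collected answers."""
    counts = Counter()
    for answer in answers:
        counts.update(re.findall(r"[a-z']+", answer.lower()))
    return counts

def font_size(count, max_count, smallest=12, largest=72):
    """Scale a word's projected size by its relative frequency."""
    return smallest + (largest - smallest) * count / max_count

# Illustrative answers collected from visitors under the cloud.
answers = ["I love the rain", "rain makes me sleepy", "I love clouds"]
freqs = word_frequencies(answers)
biggest = freqs.most_common(1)[0][1]
for word, count in freqs.most_common(5):
    print(word, round(font_size(count, biggest)))
```

The same frequency table drives both the projected “puddles” (more frequent words drawn larger) and the ordering of answers on the printed receipt.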

When all is said and done, this impressive project is a great physical representation of how we send and retrieve data from “the cloud.” Interested in learning more? You can learn all about the project as well as access a step-by-step breakdown of the build here.