Tag Archives: crypto

SmartHalo will make any bike smart


SmartHalo is a smart biking system that lets you focus on what matters the most — the road. 


Like a number of other companies who have created hardware solutions to bring older vehicles into the IoT era, one Montreal-based startup is hoping to do the same for bicycles. Those who’d rather not dig deep into their pockets to purchase a new electronic bike can now make their existing one “smart” for roughly $100.


The brainchild of CycleLabs, SmartHalo is a compact device that attaches to the handlebar of your old bike and provides you with easy-to-read GPS, distance, speed and other performance data. Aside from that, the intuitive navigation system will highlight the quickest route to work in the morning and the safest way home at night, while a front headlight provides enhanced visibility when it gets dark.

“We believe that technology should not be something you have to worry about – it should just work. We’re urban cyclists who wanted to deal with the main problems of biking in cities: navigation and security. We found that the existing solutions right now were just not suited for harsh urban environments,” its creators share.


Among its most notable features is its turn-by-turn navigation system. Simply input your desired location into its accompanying mobile app, tuck your smartphone away and SmartHalo will take care of the rest by guiding you with simple-to-follow directions via an intuitive light between your hands.

As if that wasn’t enough, SmartHalo will automatically monitor your activities. Unlike other trackers on the market today, you’ll never have to hit ’start’ or ’stop.’ Instead, the device will track your progress as soon as you begin pedaling, then display the detailed metrics in the app for later review. It even lets you set goals using any of its parameters, like calories burned, and then view your progress in real-time on your handlebars as you pedal.


Have you ever missed an important call or email because street noise and road vibration made them hard to notice? Well, you’ll be happy to learn that SmartHalo can also serve as your personal assistant while on-the-go. The mounted unit will notify you of any incoming calls and messages, and alert you should there be an impending storm, allowing you to take cover.

As any urban cyclist will tell you, theft is an all too common occurrence. Fortunately, when you’re not around, SmartHalo will keep your two-wheeler safe from any burglars by triggering an alarm if its internal motion sensor detects movement. Moreover, the gadget can only be unlocked with a special “key” provided with the product and will seamlessly deactivate as you approach the bike.


“We designed SmartHalo to be beautiful inside and out. There is no on/off button – its sophisticated sensors detect your presence. When you finish your ride, it automatically shuts down. This leads to amazing battery life, to keep going as long as you do,” the CycleLabs team explains.

With standard use, its battery can last for approximately three weeks before having to be recharged via USB. In terms of hardware, SmartHalo is equipped with a Bluetooth Low Energy module for communication, low energy LEDs, an accelerometer, magnetometer and gyroscope, as well as a crypto authentication chip for enhanced security.

Ready to make your bike smarter? Then race over to SmartHalo’s Kickstarter campaign, where the CycleLabs crew is currently seeking $50,000. The first batch of shipments is expected to get underway in May 2016.

Keeping consumables real


The most cost-effective and secure way to keep things real is through symmetric authentication without secret storage on the host using a fixed challenge.


With the ever-present threat of counterfeiting, having a cost-effective and highly secure way to ensure that a consumable product is real is a great idea. In fact, there is a proven industry standard approach to apply sophisticated cryptographic engineering and mathematics to fight counterfeiting; namely, crypto elements like the Atmel ATSHA204A device.

Crypto elements can attach to a consumable good, such as the classic example of an ink cartridge, even without being soldered in. The device can be glued directly to the outside of the product. When the ink or other consumable is inserted into the host system (where the MCU is), the crypto element makes contact and the host is able to communicate with the item to validate whether or not it is real. This is called authentication.


The most cost-effective yet secure way to authenticate is through symmetric authentication without secret storage on the host using a fixed challenge.

With symmetric authentication, a client and the host run the exact same calculation on each side, and if the client (the consumable) is real, then the results of those calculations (called the “responses”) will match. There is a way to use a very inexpensive MCU on the host without running the crypto calculations in the host MCU at all. That is where the concept of fixed challenge comes into play. The idea of a fixed challenge is that the calculation done for the host is conducted ahead of time, and the challenge/response pair from that calculation is loaded into the host.

The fixed challenge method is ideal when certain considerations are in play, such as the following:

  1. Very limited processing power (e.g. low-cost MCU)
  2. Abundance of available memory to easily store challenge-response pairs (e.g. in a smartphone)
  3. Need to get something out quickly or temporarily (e.g. time to market)
  4. Need a very low cost on the host (e.g. can’t afford adding a key storage device)
  5. Desire to not store a secret key in the host

So, how does a fixed challenge work? Like with other challenge-response operations, the process starts with the host controller sending the client a numerical challenge to be used in a calculation to create a response, which then gets compared to a “response” number in the host. What makes this “fixed” is that, because there is no crypto device in the host to generate random numbers (or make digests using hashing algorithms), the challenge cannot be random. That means that the challenges and their corresponding responses must be pre-calculated using the client’s secret key and the challenge and response pair loaded into the memory of the host. This can be looked at as effectively time-shifting the calculations used for authentication.


Let’s look at an example using the ATSHA204A installed in the client.

Step 1: In the factory, when the host is manufactured, challenges are loaded into the host MCU’s memory together with a response that is calculated by hashing the client’s secret key with that challenge.

Step 2: When the consumable is inserted into the host machine out in the field, the host MCU will ask the client (consumable) to prove it is real by sending it the preloaded challenge.

Step 3: The client will then run the hash algorithm on that challenge number using its stored secret key to generate a response, which it sends back to the host.

Step 4: The host will compare the response from the client with the preloaded response value stored in its memory.

Step 5: If the client is real, the response from the client (which is the hash value based on the secret key and the challenge) will be the same as the response value that was preloaded in the host.
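
To make the flow concrete, here is a minimal sketch of the fixed-challenge idea in Python, using SHA-256 from the standard library in place of the ATSHA204A's hardware hash engine. The key sizes, variable names and helper functions are illustrative assumptions, not the device's actual command set.

    import hashlib
    import secrets

    def sha256_response(secret_key: bytes, challenge: bytes) -> bytes:
        # Response = hash of the client's secret key combined with the challenge
        return hashlib.sha256(secret_key + challenge).digest()

    # --- In the factory ---
    client_secret = secrets.token_bytes(32)      # provisioned into the client crypto element
    fixed_challenge = secrets.token_bytes(32)    # one challenge chosen for this host
    expected_response = sha256_response(client_secret, fixed_challenge)
    # Only the (challenge, response) pair is stored in the host MCU;
    # the host never holds the client's secret key.

    # --- In the field ---
    def client_answer(challenge: bytes) -> bytes:
        # The consumable computes the response with its internally stored secret
        return sha256_response(client_secret, challenge)

    def host_authenticates() -> bool:
        response = client_answer(fixed_challenge)   # host sends its preloaded challenge
        return secrets.compare_digest(response, expected_response)

    print("consumable accepted:", host_authenticates())

A counterfeit consumable without the secret key cannot produce a matching response, even though the host itself stores no secret.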

Since each host is loaded with a different challenge/response pair, each product that the host is incorporated into is then unique by definition. Cloning beyond a single copy is impossible; thus, this is a highly secure and very cost-effective technique, as it can be easily implemented with very inexpensive MCUs.

This approach can be used for firmware protection and designs with no secrets in the host (as noted), as well as be implemented with very low-cost MCUs that do not have the processing power to run the hashing algorithms.

The many benefits of fixed challenge authentication:

  • Symmetric authentication is fast
  • No secrets in the host
  • Can use low-cost MCU of host because less computation is needed for a fixed challenge
  • Prevents cloning
  • Protects investments in firmware
  • Enhances safety
  • Protects revenue stream
  • Protects brand image
  • Better control of the supply channel

Atmel crypto element devices — including ATSHA204A, ATECC108A, ATECC508A and ATAES132A — implement hardware-based key storage, which is much stronger than software-based storage due to the defense mechanisms that only hardware can provide against attacks. Secure storage in hardware beats storage in software every time. Adding secure key storage is an inexpensive, easy, and ultra-secure way to protect firmware, software, and hardware products from cloning, counterfeiting, hacking, and other malicious threats.

Building a custom door chime with an ATtiny85 and AES-CMAC


AES-CMAC on an ATtiny85? You bet! 


Our friends at Hackaday recently brought to our attention a nifty little custom door chime, powered by an ATtiny85 and equipped with AES-CMAC for message signing. While Daniel Neville could’ve used a commercial product, it’s evident that the Maker wanted to pack a little extra security into the pint-sized device.


Controlled by the tinyAVR MCU, the gate buzzer features an LM380N audio amplifier as well as a low-cost 315 MHz receiver. Using AVR assembly, the Maker managed to cram everything into the 8 Kbytes of Flash on the ATtiny85, including an AES cipher-based message authentication code (CMAC). The transmitting gadget signs the request with a key shared between both devices, and the receiver verifies that the message is indeed from a trusted sender.


“The chime learns up to eight transmitters with the same shared key but with different serial numbers and different secret AES-CMAC keys. Each transmitter can have either one or two sensors to monitor. Each sensor on each transmitter is associated with one of sixteen possible sets of sounds. Some sound sets include activation, deactivation and prolonged activation reminder sounds while others only include the activation sound,” Neville writes.
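
For a rough feel of what the chime is doing, here is a sketch of AES-CMAC message signing and verification in Python with the pyca "cryptography" package. It only illustrates the concept; Neville's implementation is hand-written AVR assembly, and the key, packet format and function names below are assumptions.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.cmac import CMAC
    from cryptography.hazmat.primitives.ciphers import algorithms

    shared_key = bytes(range(16))   # hypothetical 128-bit key shared by transmitter and chime

    def sign(message: bytes) -> bytes:
        # Transmitter side: compute the AES-CMAC tag sent along with the message
        c = CMAC(algorithms.AES(shared_key))
        c.update(message)
        return c.finalize()

    def verify(message: bytes, tag: bytes) -> bool:
        # Receiver side: recompute the tag and compare in constant time
        c = CMAC(algorithms.AES(shared_key))
        c.update(message)
        try:
            c.verify(tag)
            return True
        except InvalidSignature:
            return False

    packet = b"\x01\x02gate-sensor-activated"   # hypothetical serial number + event
    tag = sign(packet)
    print(verify(packet, tag))                  # True: message is from a trusted sender
    print(verify(packet + b"!", tag))           # False: any tampering breaks the tag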

Intrigued? You can read all about the build and access its source code here.


Forward secrecy made real easy


Taking a closer look at how ATECC508A CryptoAuthentication devices can help in providing robust authentication.  


Forward secrecy, which is often referred to as Perfect Forward Secrecy (PFS), is essentially the protection of ciphertext with respect to time and changes in security of your cryptographic session keys and/or primary keying material over time.

A cryptographic session key is used to authenticate messages and encrypt text into ciphertext before it is transmitted. This thwarts a “man in the middle” from understanding the message and/or altering that message. These keys are derived from primary keying material. In the case of Public Key Cryptography, this would be the private key.

Unless you are implementing your own security in the application layer, you probably rely on the TLS/SSL in the transport layer.

The Problem

One can envision a scenario in which ciphertext was recorded by an eavesdropper over time. For a variety of reasons out of your control, your session keys and/or primary keying material are eventually discovered and this eavesdropper could decipher all of those recorded transmissions.

Release of your secret keys could be the result of a deliberate act, as with a bribe, a disgruntled employee, or even someone thinking they are “doing the right thing” by exposing your secrets. Or, it could be the result of an unwitting transgression from protocol. Equipment could be decommissioned and disposed of improperly. The hard drives could be recovered using the infamous dumpster dive attack methodology, thus exposing your secrets.

If you rely solely on transport layer security, your security could be challenged knowingly or unknowingly by third parties controlling the servers you communicate with. Recently leaked NSA documents show that powerful government agencies can (and do) record ciphertext. Depending on how clever or influential your snoopers are, they could manipulate the server system against you.

There are many ways your forward secrecy could be compromised at the server level: server managers unwittingly compromising it through bad practices, inadequate cipher suites, session keys left on the server too long, the use of resumption mechanisms, among countless others.

Let’s just say there are many, many ways the security of your session keys and/or primary keying material could eventually be compromised. It only takes one of them. Once that happens, the damage is irreversible and the result is the same: those recorded ciphertext transmissions are now open to unintended parties.

The Solution

You can wipe out much of your liability by simply changing where encryption takes place. If encryption and forward secrecy are addressed in the application layer, session keys will have no relationship with the server, thereby sidestepping server-based liabilities. This, of course, does not imply transport layer security should be discarded.

A public/private key system demonstrates the property of forward secrecy if it creates new key pairs for communication sessions. These key pairs are generated on an as-needed basis and are destroyed after a single use. Their generation must be truly random. In fact, they cannot be the result of a deterministic algorithm. Once a session key is derived from the public/private key pair, that key pair must not be reused.

Atmel’s newly-revealed ATECC508A CryptoAuthentication device meets this set of criteria. It has the ability to generate new key pairs using a high quality truly random number generator. Furthermore, the ATECC508A supports ECDH, a method to spawn a cryptographic session key by knowing the public key of the recipient. When these spawned session keys are purposely short-lived, or ephemeral, the process is known as ECDHE.

Using this method, each communication session has its own unique keying material. Any compromise of this material only compromises that one transmission. The secrecy of all other transmissions remains secure.
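
A minimal software sketch of that ephemeral exchange, written in Python with the pyca "cryptography" package, is shown below. The ATECC508A performs the equivalent steps with its hardware random number generator and protected key storage; the curve choice and derivation labels here are illustrative assumptions.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def session_key(own_private, peer_public) -> bytes:
        # ECDH produces a shared secret; a KDF turns it into a usable session key
        shared = own_private.exchange(ec.ECDH(), peer_public)
        return HKDF(algorithm=hashes.SHA256(), length=32,
                    salt=None, info=b"ephemeral session").derive(shared)

    # Fresh, single-use key pairs generated for this session only (the "E" in ECDHE)
    sender = ec.generate_private_key(ec.SECP256R1())
    receiver = ec.generate_private_key(ec.SECP256R1())

    key_at_sender = session_key(sender, receiver.public_key())
    key_at_receiver = session_key(receiver, sender.public_key())
    assert key_at_sender == key_at_receiver

    # Both ephemeral private keys are now discarded; a later compromise of
    # long-term keys cannot decrypt a recording of this session.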

The Need for Robust Authentication

Before any of the aforementioned instances can occur, the identity of the correspondents needs to be robustly authenticated. Their identities need to be assured without doubt (non-repudiation), because accepting an unknown public key without robust authentication of origin could authorize an attacker as a valid user. Atmel’s ATECC508A provides this required level of authentication and non-repudiation.

Not only is the ATECC508A a cost-effective asymmetric authentication engine available in a tiny package, it is super easy to design in and ultra-secure. Moreover, it offers protective hardware key storage on-board as well as a built-in ECC cryptographic block for ECDSA and ECDH(E), a high quality random number generator, a monotonic counter, and a unique serial number.

With security at its core, the Atmel CryptoAuthentication lineup is equipped with active defenses, such as an active shield protecting the entire device, tamper monitors and an active power supply circuit which disallows the ability to “listen” for bits changing. The ECC-based solutions offer an external tamper pin, so unauthorized opening of your product can be detected.

Greetings from Digitopia!


When it comes to the privacy and security of data, what does the future hold for consumers, companies and governments?


A tremendously interesting document, called “Alternate Worlds,” was published by the U.S. National Intelligence Council. It’s a serious document that not only examines four different alternatives of what 2030 might look like, but possesses some major geo-political thinking about the future.


In the entire report there was only one comment regarding privacy, which is amazing.  This brings up many questions.  Has privacy already become a quaint notion and a relic of times past? Is the loss of privacy a done deal? Will there be any attempt at reclaiming personal privacy? Will renewed privacy only be available to the upper classes? Will companies be required to take responsibility for embedding more security and privacy in their products and systems? Will governments fight for citizens’ rights to privacy or insist on the right to intrude? These all are important 21st century questions, and they are simply impossible to answer now given that there are far too many variables. Only time will tell.

At the moment, however, it is pretty clear that the trend is away from privacy, at least in the way that privacy was defined in prior generations. If you observe first-world high school and college kids, you can easily see that many, if not most, live their lives way out in the open on apps like Facebook, Twitter, Tumblr and others, and don’t really seem to care all that much who is watching. Lately, more limited audience apps like WhatsApp, Snapchat, and WeChat that focus on smaller groups rather than general broadcasts have been growing, which suggests some return to privacy concerns (i.e. don’t let mom see this), but the generational theme is clearly “live out loud.” Younger people live in a type of virtual society. Let’s call it “Digitopia.” Digitopia is far from a utopian place because it is insecure — really insecure. Cyber criminals, nosey companies, sneaky governmental operators, and other techno-mischief makers run rampant there.

One of the more intriguing predictions in the Alternate Worlds report points to future brain-machine interfaces that could provide super-human abilities, as well as improve strength, speed and other enhancements (i.e. bestow super powers). This notion could have come right out of author William Gibson’s classic cyber-punk novel Neuromancer where people’s brains directly “jack-into” the matrix.  The report states:

“Future retinal eye implants could enable night vision, and neuro-enhancements could provide superior memory recall or speed of thought. Neuro-pharmaceuticals will allow people to maintain concentration for longer periods of time or enhance their learning abilities.  Augmented reality systems can provide enhanced experiences of real-world situations. Combined with advances in robotics, avatars could provide feedback in the form of sensors providing touch and smell as well as aural and visual information to the operator.”


Hanging Out in Digitopia

Even the peaceful denizens of Digitopia are by default reckless, especially when it comes to their own privacy.

“A significant uncertainty … involves the complex tradeoffs that users must make between privacy and utility. Thus far, users seem to have voted overwhelmingly in favor of utility over privacy,” the Alternate Worlds report states.

As introduced in a prior article called “Digital Anonymity: The Ultimate Luxury Item,” the desire for personalized services is very seductive, and consumers are now complicit in, and habituated to, revealing a great deal about themselves. Volunteering information is one thing, but much of the content about our digital selves is being collected automatically and used for things we don’t have any idea about. People are increasingly buying products that automatically track their lives, including cars that store data about driving habits and download it to other parties without the need for consent. As we visit web pages, companies get access to our digital histories and bid against each other in milliseconds for the ability to display their advertising to us. This is kind of creepy. There is now an unholy trinity of governments snooping on us, corporations targeting our buying behaviors, and cyber-criminals trying to rip us off. The antidote is better security, but cyber-security is not something that individuals will be able to make happen on their own.

Data collection systems are not accessible, and they are not modifiable by people without PhDs in computer science. Because of that, security and privacy could easily become commodities which consumers will demand and thus economically force companies to provide. The only weapon consumers have is what they consume. If consumers only purchase secure products, then only secure products will succeed. In Digitopia, a company’s success may become dependent simply upon how well it protects the interests of its customers and partners — that is not a hard concept to understand.

You can almost see how there could easily be the equivalent of a “UL” label for privacy. Products and services could be vetted for the strength of their security mechanisms and rated on whether they have encryption, data integrity checks, authentication, hardware key storage, and other cryptographic bases.


Beyond the testing of the products themselves, there could easily be businesses set up to provide secure protections to individuals and companies like a digital Pinkerton’s for digital assets. It is likely that those who can afford digital anonymity will be the first to take measures to regain it. To paraphrase a concept from a famous American financial radio show host, privacy could replace the BMW as the modern status symbol. The top income earners who want to protect themselves and their companies will be looking for a type of “digital Switzerland.” Regaining privacy will likely democratize over time as the general population will demand the same protections as the 1%-ers. Edward Snowden showed us that everyone is under some sort of surveillance, so we have to face the facts that data gathering on a grand scale is part of the world now and will only grow in scope. However, we don’t have to just accept insecurity because things can be done, including adding secure devices to digital systems.

The Future Belongs to the Middle Classes

Maybe the most important factor noted in the Alternate Worlds report has to do with the forthcoming growth of middle classes. As populations increase and more people worldwide move into the middle class, a growing number of people and things will be connected. That is why the Internet of Things is expected to grow so quickly. More connected things means more points of attack, and more data gathering for legitimate and illegitimate purposes. Therefore, the need for digital security is tied directly to the number of communicating nodes, which is tied directly to the growth of the middle class. More people with financial means translates into more things to secure. This is becoming obvious. The middle class buys the lion’s share of products and services, and more of those products and services, along with how they are ordered and delivered, will be electronic. More people, more electronic things, more need for security.

When it comes to population, South and East Asia are the elephants (and dragons) in the room, as the chart below demonstrates.


The most powerful trend going forward is arguably the emergence of new “super-sized” middle classes in China and India. The worldwide middle class will grow exponentially, and it has already started to super-charge demand for food, energy, and manufactured products — particularly smart communicating electronic devices, many with sensing capabilities. That, of course, is how the IoT is getting started. Major companies are holding out the IoT as a way to drive efficiencies in production and distribution while keeping costs low. You can see that in the literature of major companies such as GE, which is targeting the Industrial Internet of Things as a major strategic vector.

Population and purchasing power go hand-in-hand, and the evolution of smart, secure, and communicating systems will follow.  As Stalin said, quantity has a quality all its own.   That is why Asia matters so much.


From the demographic analyses, you can see that most Digitopians will be physically living in South and East Asia, and their share will continue to rise with time. So, what does that mean for security and privacy?


There is a very different view of privacy rights in Asia due to a varied tapestry of intricate and ancient cultures — cultures that differ from Western traditions in many ways. However, it must be pointed out that Western governments are far from being white-knight protectors of privacy rights by any means. Even with uncertainty in how privacy will be embraced (or not) long-term worldwide, in the short- to medium-term, enhanced security will have to filter into networks, systems, and end products, including the IoT nodes. You can look at that as securing the basic wiring and digital plumbing of Digitopia, even if governmental institutions retain the right to snoop.

Practical Security

To close on a practical note, in the short- to medium-term there will be a strong drive to embed more robust security into embedded systems, PCs, networks, and the Internet of Things. Devices to enhance security are already available, namely crypto element integrated circuits with hardware-based key storage. Crypto elements are powerful solutions, whose fundamental value is only starting to be recognized. They contain cryptographic engines to efficiently handle crypto functions such as hashing, sign-verify (ECDSA), key agreement (ECDH), authentication (symmetric or asymmetric), encryption/decryption, message authentication coding (MAC), and the underlying crypto algorithms (elliptic curve cryptography, AES, SHA), among many others. Together with microprocessors that run encryption algorithms, crypto elements easily bring all three pillars of security (confidentiality, data integrity, and authentication) into play for any digital system.
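
As one example of the functions listed above, here is what sign-verify (ECDSA) looks like in software, sketched in Python with the pyca "cryptography" package. A crypto element performs the same operation with the private key generated and locked inside the hardware; the message and variable names are illustrative assumptions.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    device_key = ec.generate_private_key(ec.SECP256R1())   # would live inside the crypto element
    message = b"node 42: temperature 21.5 C"               # hypothetical sensor report

    # Sign on the device: proves origin and protects integrity
    signature = device_key.sign(message, ec.ECDSA(hashes.SHA256()))

    # Verify anywhere the matching public key is known
    try:
        device_key.public_key().verify(signature, message, ec.ECDSA(hashes.SHA256()))
        print("message authentic and unaltered")
    except InvalidSignature:
        print("message rejected")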

As certain forces move the world towards less privacy and more insecurity, it is good to know that there are real technologies that have the potential to move things back in the other direction. To make a fearless forecast, it seems that going forward companies will increasingly be held liable for security breaches, and that will force them to provide robust security in the products and services that they offer. Consumers will demand security and enforce their preferences with class action legal remedies when they are damaged by a lack of security. The invisible hand of the market will point towards more security. On the other hand, governments will argue that they have a duty to provide physical and economic security, which gives them license to snoop. Countervailing forces are in play in Digitopia.

What is Ambient Security?

New technology and business buzzwords pop up constantly. Hardly a day goes by that you don’t see or hear words such as “cloud”, “IoT,” or “big data.” Let’s add one more to the list: “Ambient security.”


You’ll notice that big data, the cloud, and the IoT are all connected, literally and figuratively, and that is the point. Billions of things will communicate with each other without human intervention, mainly through the cloud, and will be used to collect phenomenal and unprecedented amounts of data that will ultimately change the universe.

As everything gets connected, each and every thing will also need to be secure. Without security, there is no way to trust that the things are who they say they are (i.e. authentic), and that the data has not been altered (i.e. data integrity). Due to the drive for bigger data, the cloud and smart communicating things are becoming ambient; and, because those things all require security, security itself is becoming ambient as well.  Fortunately, there is a method to easily spread strong security to all the nodes. (Hint: Atmel CryptoAuthentication.)

Big Data

At the moment, big data can be described as the use of inductive statistics and nonlinear system analysis on large amounts of low density (or quickly changing) data to determine correlations, regressions, and causal effects that were not previously possible. Increases in network size, bandwidth, and computing power are among the things enabling this data to get bigger — and this is happening at an exponential rate.

Big data became possible when the PC browser-based Internet first appeared, which paved the way for data being transferred around the globe. The sharp rise in data traffic was driven to a large extent by social media and companies’ desire to track purchasing and browsing habits to find ways to micro-target purchasers. This is the digitally-profiled world that Google, Amazon, Facebook, and other super-disruptors foisted upon us.  Like it or not, we are all being profiled, all the time, and are each complicit in that process. The march to bigger data continues despite the loss of privacy and is, in fact, driving a downfall in privacy. (Yet that’s a topic for another article.)

Biggering

The smart mobile revolution created the next stage of “biggering” (in the parlance of Dr. Seuss). Cell phones metamorphosed from a hybrid of old-fashioned wired telephones and walkie-talkies into full blown hand-held computers, thus releasing herds of new data into the wild. Big data hunters can thank Apple and the Android army for fueling that, with help from the artists formerly known as Nokia, Blackberry, and Motorola. Mobile data has been exploding due to its incredible convenience, utility, and of course, enjoyment factors. Now, the drive for bigger data is continuing beyond humans and into the autonomous realm with the advent of the Internet of Things (IoT).


Bigger Data, Little Things

IoT is clearly looking like the next big thing, which means the next big thing will be literally little things. Those things will be billions of communicating sensors spread across the world like smart dust — dust that talks to the “cloud.”


More Data

The availability of endless data and the capability to effectively process it is creating a snowball effect where big data companies want to collect more data about more things, ad infinitum. You can almost hear chanting in the background: “More data… more data… more data…”

More data means many more potential correlations, and thus more insight to help make profits and propel the missions of non-profit organizations, governments, and other institutions. Big data creates its own appetite, and the data to satisfy that growing appetite will derive from literally everywhere via sensors tied to the Internet. This has already started.

Sensors manufacture data. That is their sole purpose. But, they need a life support system including smarts (i.e. controllers) and communications (such as Wi-Fi, Bluetooth and others). There is one more critical part of that: Security.

No Trust? No IoT! 

There’s no way to create a useful communicating sensor network without node security. To put it a different way, the value of the IoT depends directly on whether those nodes can be trusted. No trust. No IoT.  Without security, the Internet of Things is just a toy.

What exactly is security? It can best be defined by using the three-pillar model, which (ironically) can be referred to as “C.I.A:” Confidentiality, Integrity and Authenticity.


CIA

Confidentiality is ensuring that no one can read the message except its intended receiver. This is typically accomplished through encryption and decryption, which hides the message from all parties but the sender and receiver.

Integrity, which is also known as data integrity, is assuring that the received message was not altered. This is done using cryptographic functions. For symmetric cryptography, this is typically done by hashing the data with a secret key and sending the resulting MAC along with the data to the other side, which performs the same functions to recreate the MAC and compare. Sign-verify is the way that asymmetric mechanisms ensure integrity.

Authenticity refers to verification that the sender of a message is who they say they are — in other words, ensuring that the sender is real. Symmetric authentication mechanisms usually work with a challenge (often a random number) that is sent to the other side, where it is hashed with a secret key to create a MAC response that gets sent back. The same calculation is run locally, and the response MACs from the two sides are compared.

(Sometimes people add non-repudiation to the list of pillars, which is preventing the sender from later denying that they sent the message in the first place.)
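
A minimal sketch of the symmetric challenge-response flow just described, using HMAC-SHA256 from the Python standard library as the MAC function; the key handling and names are illustrative assumptions rather than any particular device's protocol.

    import hashlib
    import hmac
    import secrets

    shared_secret = secrets.token_bytes(32)   # provisioned into both host and client

    def mac(key: bytes, challenge: bytes) -> bytes:
        return hmac.new(key, challenge, hashlib.sha256).digest()

    # Host: issue a fresh random challenge
    challenge = secrets.token_bytes(32)

    # Client: prove knowledge of the secret without ever revealing it
    client_response = mac(shared_secret, challenge)

    # Host: run the same calculation and compare the two MACs
    expected = mac(shared_secret, challenge)
    print("client is authentic:", hmac.compare_digest(client_response, expected))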

The pillars of security can be implemented with devices such as Atmel CryptoAuthentication crypto engines with secure key storage. These tiny devices are designed to make it easy to add robust security to lots of little things – and big things, too.

So, don’t ever lose sight of the fact that big data, little things and cloud-based IoT are not even possible without ambient security. Creating ambient security is what CryptoAuthentication is all about.

ECDH key exchange is practical magic

What if you and I want to exchange encrypted messages? It seems like something that will increasingly be desired given all the NSA/Snowden revelations and all the other snooping shenanigans. The joke going around is that the motto of the NSA is really “Yes We Scan,” which sort of sums it up.


Encryption is essentially scrambling a message so only the intended reader can see it after they unscramble it. By definition, scrambling and unscrambling are inverse (i.e. reversible) processes. Doing and undoing mathematical operations in a secret way that outside parties cannot understand or see is the basis of encryption/decryption.

Julius Caesar used encryption to communicate privately. The act of shifting the alphabet by a specific number of places is still called the Caesar cipher. Note that the number of places is kept secret and acts as the key. Before Caesar, the Spartans used a rod of a certain thickness that was wrapped with leather and written upon; the spaces not part of the message were filled with decoy letters, so only someone with a rod of the right diameter could read the message. This was called a skytale. The rod thickness acts as the key.
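
As a toy illustration (and nothing more), the Caesar cipher fits in a few lines of Python, with the shift amount acting as the secret key:

    def caesar(text: str, key: int) -> str:
        # Shift each letter by the key, wrapping around the 26-letter alphabet
        out = []
        for ch in text.upper():
            if ch.isalpha():
                out.append(chr((ord(ch) - ord("A") + key) % 26 + ord("A")))
            else:
                out.append(ch)
        return "".join(out)

    ciphertext = caesar("ATTACK AT DAWN", 3)   # encrypt by shifting forward
    print(ciphertext)                          # DWWDFN DW GDZQ
    print(caesar(ciphertext, -3))              # decrypt by reversing the shift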


A modern-day encryption key is a number that is used by an encryption algorithm, such as AES (Advanced Encryption Standard) and others, to encode a message so no one other than the intended reader can see it. Only the intended parties are supposed to have the secret key. The interaction between a key and the algorithm is of fundamental importance in cryptography of all types. That interaction is where the magic happens. An algorithm is simply the formula that tells the processor the exact, step-by-step mathematical functions to perform and the order of those functions. The algorithm is where the magical mathematical spells are kept, but those are not kept secret in modern practice. The key is used with the algorithm to create secrecy.


For example, the magic formula of the AES algorithm is a substitution-permutation network process, meaning that AES uses a series of mathematical operations done upon the message to be encrypted and the cryptographic key (crypto people call the unencrypted message “plaintext”). How that works is that the output of one round of calculations done on the plaintext is substituted by another block of bits, then the output of that is changed (i.e. permuted) by another block of bits, and then it happens over and over, again and again. This round-after-round of operations changes the coded text in a very confused manner, which is the whole idea. Decryption is exactly as it sounds, simply reversing the entire process.

That description, although in actual fact very cursory, is probably TMI here, but the point is that highly sophisticated mathematical cryptographic algorithms that have been tested and proven to be difficult to attack are available to everyone. If a secret key is kept secret, the message processed with that algorithm will be secret from unintended parties. This is called Kerckhoffs’ principle and is worth remembering since it is the heart of modern cryptography. What it says is that you need both the mathematical magic and secret keys for strong cryptography.

Another way to look at it is that the enemy can know the formula, but it does him or her no good unless they know the secret key. That is, by the way, why it is so darn important to keep the secret key secret. Getting the key is what many attackers try to do by using a wide variety of innovative attacks that typically take advantage of software bugs. So, the best way to keep the secret is to store the key in secure hardware that can protect it from attacks. Software storage of keys is just not as strong as hardware storage. Bugs are endemic, no matter how hard the coders try to eliminate them. Hardware key storage trumping software is another fundamental point worth remembering.

Alright, so now that we have a good algorithm (e.g. AES) and a secret key we can start encrypting and feel confident that we will obtain confidentiality.
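
Here is a brief sketch of that step in Python with the pyca "cryptography" package, using AES in GCM mode (an authenticated mode that also protects integrity). The key handling and messages are illustrative assumptions.

    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)   # the shared secret key
    nonce = os.urandom(12)                      # must be unique for every message

    plaintext = b"meet at the usual place"
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)

    # Only a holder of the same secret key can reverse the process
    print(AESGCM(key).decrypt(nonce, ciphertext, None))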

Key Agreement

In order for encryption on the sender’s side and decryption on the receiver’s side to work, both sides must agree on the same key. That agreement can happen in advance, but that is not practical in many situations. As a result, there needs to be a way to exchange the key during the session where the encrypted message is to be sent. Another powerful cryptographic algorithm will be used to do just that.

ECDH

There is a process called ECDH key agreement, which is a way to establish the secret key without either of the sides actually having to meet each other. ECDH uses a different type of algorithm from AES, called “EC,” to get the secret key from one side to the other. EC stands for elliptic curve, which literally refers to a curve described by an elliptic equation. A certain set of elliptic curves (defined by the constants in the equation) have the property that given two points on the curve (P and Q) there is a third point, P+Q, on the curve that displays the properties of commutativity, associativity, identity, and inverses when applying elliptic curve point multiplication. Point-multiplication is the operation of successively adding a point along an elliptic curve to itself repeatedly. Just for fun, the shape of such an elliptic curve is shown in the diagram.


The thing that makes this all work is that EC point-multiplication is doable, but the inverse operation is not doable. Cryptographers call this a one-way or trap door function. (Trap doors go only one way, see?)  In regular math, with simple algebra if you know the values of A and A times B you can find the value of B very easily.  With Elliptic curve point-multiply if you know A and A point-multiplied by B you cannot figure out what B is. That is the magic. That irreversibility and the fact that A point-multiplied by B is equal to B point-multiplied by A (i.e. commutative) are what makes this a superb encryption algorithm, especially for use in key exchange.

To best explain key agreement with ECDH, let’s say that everyone agrees in advance on a public point on the curve called G. Now we will do some point-multiply math. Let’s call the sender’s private key PrivKeySend. (Note that each party can be a sender or receiver, but for this purpose we will name one the sender and the other the receiver just to be different from the typical Alice and Bob nomenclature used by most crypto books.) Each private key has a mathematically related and unique public key that is calculated using the elliptic curve equation. Uniqueness is another reason why elliptic curves are used. If we point-multiply G by PrivKeySend we get PubKeySend. Let’s do the same thing for the receiver, who has a different private key called PrivKeyReceive, and point-multiply that private key by the same G to get the receiver’s public key called PubKeyReceive. The sender and receiver can then exchange their public keys with each other on any network since the public keys do not need to be kept secret. Even an unsecured email is fine.

Now, the sender and receiver can make computations using their respective private keys (which they are securely hiding and will never share) and the public key from the other side. Here is where the commutative law of point-multiply will work its magic. The sender point-multiplies the public key from the other side by his or her stored private key. This equates to:

PubKeyReceive point-multiplied by PrivKeySend = G point-multiplied by PrivKeyReceive point-multiplied by PrivKeySend

The receiver does the same thing using his or her private key and the public key just received. This equates to:

PubKeySend point-multiplied by PrivKeyReceive = G point-multiplied by PrivKeySend point-multiplied by PrivKeyReceive.

Because point-multiply is commutative these equations have the same value!


And, the rabbit comes out of the hat: The sender and receiver now have the exact same value, which can now be used as the new encryption key for AES, in their possession. No one besides them can get it because they would need to have one of the private keys and they cannot get them. This calculated value can now be used by the AES algorithm to encrypt and decrypt messages. Pretty cool, isn’t it?
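
For readers who want to see the commutativity in action, here is a toy Diffie-Hellman exchange in Python using small modular arithmetic instead of elliptic-curve point-multiplication (the same idea the video below explains). The numbers are deliberately tiny and insecure; the sketch only demonstrates why both sides end up with the same value.

    # Public parameters everyone agrees on in advance (the role G plays above)
    p, g = 23, 5

    # Each side's private key, never shared
    priv_send, priv_receive = 6, 15

    # Public keys, exchanged openly over any network
    pub_send = pow(g, priv_send, p)
    pub_receive = pow(g, priv_receive, p)

    # Each side combines its own private key with the other side's public key
    secret_at_sender = pow(pub_receive, priv_send, p)
    secret_at_receiver = pow(pub_send, priv_receive, p)

    print(secret_at_sender == secret_at_receiver)   # True: the same shared value
    # That shared value can now seed the AES session key.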

Below is a wonderful video explaining the modular mathematics and discrete logarithm problem that creates the one-way, trapdoor function used in Diffie-Hellman key exchange. (Oh yeah, the “DH” in ECDH stands for Diffie-Hellman, named for Whitfield Diffie and Martin Hellman, two of the inventors of this process.)

Are you building secure devices? Want to protect your design investments and prevent compromise of your products? Receive a FREE Atmel CryptoAuthentication™ development tool.

Shouldn’t security be a standard?

Security matters now more than ever, so why isn’t security a standard feature in all digital systems? Luckily, there is a standard for security and it is literally standards-based. It is called TPM. TPM, which stands for Trusted Platform Module, can be thought of as a microcontroller that can take a punch, and come back for more.

“You guys give up, or are you thirsty for more?”

The TPM is a small integrated circuit with an on-board microcontroller, secure hardware-based private key generation and storage, and other cryptographic functions (e.g. digital signatures, key exchange, etc.), and is a superb way to secure email, secure web access, and protect local data. It is becoming very clear just how damaging loss of personal data can be. Just ask Target stores, Home Depot, Brazilian banks, Healthcare.gov, JP Morgan, and the estimated billions of victims of the Russian “CyberVor” gang of hackers. (What the hack! You can also follow along with the latest breaches here.) The world has become a serious hackathon with real consequences; and, unfortunately, it will just get worse with the increase of mobile communications, cloud computing, and the growth of autonomous computing devices and the Internet of Things.

What can be done about growing threats against secure data?

The TPM is a perfect fit for overall security. So, just how does the TPM increase security? There are four main capabilities:

  1. Furnish platform integrity
  2. Perform authentication (asymmetric)
  3. Implement secure communication
  4. Ensure IP protection

These capabilities have been designed into TPM devices according to the guidance of an industry consortium called the Trusted Computing Group (TCG), whose members include many of the 800-pound gorillas of the computing, networking, software, semiconductor, security, automotive, and consumer industries. These companies include Intel, Dell, Microsoft, among many others. The heft of these entities is one of the vectors that is driving the strength of TPM’s protections, creation of TPM devices, and ultimately accelerating TPM’s adoption. The TPM provides security in hardware, which beats software-based security every time. And that matters, a lot.

TPM Functions

Atmel TPM devices come complete with cryptographic algorithms for RSA (with 512, 1024, and 2048 bit keys), SHA-1, HMAC, AES, and a Random Number Generator (RNG). We won’t go into the mathematical details here, but note that Atmel’s TPM has been Federal Information Processing Standards (FIPS) 140-2 certified, which attests to its high level of robustness. And, that is a big deal. These algorithms, built right into Atmel TPMs together with supporting software, serve to accomplish multiple security functions in a single device.

Each TPM comes with a unique key called an endorsement key that can also be used as part of a certificate chain to prevent counterfeiting. With over 100 commands, the Atmel TPM can execute a variety of actions such as key generation and authorization checks. It also provides data encryption, storage, signing, and binding just to name a few.

An important way that TPMs protect against physical attacks is by a shielded area that securely stores private keys and data, and is not vulnerable to the types of attacks to which software key storage is subjected.


But the question really is, “What can the TPM do for you?”  The TPM is instrumental in systems that implement “Root of Trust” (i.e. data integrity and authentication) schemes.

Root of trust schemes use hashing functions as the BIOS boots to ensure that there have been no unwanted changes to the BIOS code since the previous boot. The hashing can continue up the chain into the OS. If the hash (i.e. digest) does not match the expected result, then the system can limit access, or even shut down to prevent malicious code from executing. This is the method used in Microsoft’s BitLocker approach on PCs, for example. The TPM can help to easily encrypt an entire hard drive, which can then only be unlocked for decryption by the key that is present on the TPM or a backup key held in a secure location.
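
The hash-chaining behind that root-of-trust idea can be sketched in a few lines of Python. This conceptually mimics the "extend" operation of a TPM platform configuration register (PCR); it is not the actual TPM command interface, and the boot stages named below are hypothetical.

    import hashlib

    def extend(pcr: bytes, measurement: bytes) -> bytes:
        # New register value = hash of (old value || hash of the measured code)
        return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

    boot_stages = [b"BIOS image", b"bootloader", b"OS kernel"]   # hypothetical stages

    pcr = bytes(32)                  # register starts at a known value
    for stage in boot_stages:
        pcr = extend(pcr, stage)     # order and content both affect the result

    golden_value = pcr               # recorded from a known-good boot

    # On a later boot the chain is recomputed; any change to any stage produces a
    # different final value, and the system can refuse to release keys or continue.
    print("platform integrity OK:", pcr == golden_value)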

Additionally, the TPM is a great resource in the embedded world, where home automation, access points, and consumer, medical, and automotive systems require protection. As technology continues to grow to a wide spectrum of powerful and varying platforms, the TPM’s role will also increase to provide the necessary security to protect these applications.


Interested in learning more about Atmel TPM? Head here. To read about this topic a bit further, feel free to browse through the Bits & Pieces archive.

This blog was contributed by Ronnie Thomas, Atmel Software Engineer. 


Digital anonymity: The ultimate luxury item

Data is quickly becoming the currency of the digital society, of which we are all now citizens. Let’s call that “Digitopia.”


In Digitopia, companies and governments just can’t get enough data. There is real data obsession, which is directly leading to an unprecedented loss of privacy. And, that has been going on for a long time — certainly since 9/11. Now a backlash is underway with increasing signs of a groundswell of people wanting their privacy back. This privacy movement is about digital anonymity. It is real, and particularly acute in Europe. However, the extremely powerful forces of governments and corporations will fight this personal privacy revanchism at every turn. What seems likely is that those with financial means (i.e. 1%-ers) will be at the forefront of demanding and retrieving privacy and anonymity; subsequently, anonymity could easily become the new luxury item. Ironically, digital invisibility could be the highest form of status.


Let’s explore what is creating the growing demand for a return to some anonymity. The main driver is the collective realization of just how vulnerable we all are to data breaches and snooping — thanks to Edward Snowden’s NSA revelations, Russian Cyber-Vor hacker gangs stealing passwords, Unit 61398 of the People’s Liberation Army creating all kinds of infrastructure, commercial and military mischief, the Syrian Electronic Army conducting cyber attacks, Anonymous, Heartbleed, Shellshock, and the Target and Home Depot credit card number breaches, among countless other instances of real digital danger.

What all this means is that everyone is a potential victim, and that is the big collective “ah-ha” moment for digital security. (Maybe it’s more of an “oh-no!” moment?) As illustrated by the chart below, the magnitude, types and sheer number of recent attacks should make anyone feel a sense of unease about their own digital exposure. Why is this dangerous to everyone? Well, because data now literally translates into money. And I literally mean literally. Here’s why…


Bitcoin Exposes the Dirty Little Secret About Money 

Bitcoin is a great starting point because it’s the poster child of the data = money equation. Bitcoin currency is nothing more than authenticated data, and completely disposes of any pretense of money being physical. It is this ephemeral-by-design nature of Bitcoin that, in fact, exposes the dirty little secret about all money, which is that without gold, silver or other tangible backing, dollars, the Euro, Renminbi, Yen, Won, Franc, Pound, Kroner, Ruble and everything else are nothing but data. Money is a manmade concept — really just an idea.

How this works can best be described by putting it into cryptographic engineering terms. Governments are the “issuing certification authority” of money. Each country or monetary union (e.g. EU) with a currency of their own is literally an “issuer.” All roads lead back to the issuer’s central bank via a type of authentication process to prove that the transaction is based upon the faith and credit of the issuer.

Banks are the links on that authentication/certification chain that leads back to the issuer. Each link on the chain (or each bank) is subject to strict rules (i.e. laws) and audits established by the issuer about exactly how to deal with the issuer, with other banks in the system, with the currencies created by other issuers (i.e. other countries), with customers, and how to account for transactions. Audits, laws, and rules are therefore an authentication process. Consumers’ bank accounts and credit cards are the end-client systems. Those end-client systems are linked back through the chain of banks via the authentication process (rules, etc.) to the issuer of the money. That linkage is what creates the monetary system.

Bitcoin was built precisely and purposefully upon cryptographic authentication and certification. It is cryptography and nothing more. There is no central issuing authority and it remains peer-to-peer on purpose. Bitcoin bypasses banks precisely so that no overseer can control the value (i.e. create inflation and deflation at their political whim). This also preserves anonymity.

The bottom line is that the modern banking system has been based upon “fiat money” since the Nixon Administration abandoned the gold standard. The Latin word “fiat” means “let it be done”; fiat money exists by decree and arbitrary agreement, an agreement that numbers in a ledger have some type of value and can act as a medium of exchange. Note that physical money (paper and coins) is only an extremely small fraction of the world’s money supply. The bulk of the world’s money is comprised of nothing more than accounting entries in the ledgers of the world’s banking system.

See?  Money = Data. Everything else is window dressing to make it appear more than that (e.g. marble columned bank buildings, Fort Knox, Treasury agents with sunglasses and guns, engraved bonds, armored cars, multi-colored paper currency, coins, etc.).

So, if money equals data, then thieves will not rob banks as often; however, those who can will raid databases instead, despite what Willie Sutton said. Databases are where the money is now.


By now, the problem should be obvious to anyone who is paying attention — data of any kind is vulnerable to attack by a wide variety of antagonists from hacker groups and cyber-criminals to electronic armies, techno-vandals and other unscrupulous organizations and people. The reason is simple. Yes, you guessed it: It is because data = money. To make it worse, because of the web of interconnections between people, companies, things, institutions and everything else, everyone and everything digital is exposed.

Big Data. Little Freedom.

The 800-pound gorillas of Digitopia are without a doubt governments. Governments mandate that all kinds of data be presented to them at their whim. Tax returns, national health insurance applications, VA and student loan applications, and other things loaded with very sensitive personal data are routinely demanded and handed over. Individuals and corporations cannot refuse to provide data to the government if they want the monopolized “services” governments provide (or to stay out of jail). And, that is just the open side of the governmental data collection machine.

The surreptitious, snooping side is even larger and involves clandestine scanning of personal conversations, emails, and many other things. However, there is another, non-governmental component to data gathering (I will not use the term “private sector” because it is way too ironic). Companies are now becoming very sophisticated at mining data and tracking people, and getting more so every day. This is the notion of “big data,” and it is getting bigger and bigger all the time.

The Economist recently articulated how advertisers are tracking people to a degree once reserved for fiction. (Think George Orwell’s 1984.) Thousands of firms are now invisibly gathering intelligence. Consumers are being profiled with skills far exceeding those of FBI profilers. When consumers view a website, advertisers compete via a hidden bidding process to show them targeted ads based on the individual’s profile. These ads are extremely well focused due to intensive analytics and extensive data collection. These auctions take milliseconds and the ads are displayed when the website loads. We have all seen these ads targeted at us by now. This brave new advertising world is a sort of a cross between Mad Men and Minority Report with an Orwellian script.

The Personalization Conundrum

There is a certain seductiveness associated with consumer targeting. It is the notion of personalization. People tend to like having a certain level of personalized targeting. It makes sense to have things that you like presented to you without any effort on your part. It is sort of an electronic personal shopping experience. Most people don’t seem to mind the risk of having their preferences and habits collected and used by those they don’t even know. Consumers are complicit and habituated to revealing a great deal about themselves.  Millennials have grown up in a world where the notion of privacy is more of a quaint anachronism from days gone by. But, that is all likely to change as more people get hurt.

Volunteering information is one thing, but much of the content around our digital selves is being collected automatically and used for things we don’t have any idea about. People are increasingly buying products that track their activities, location, physical condition, purchases and other things. Cars are already storing data about our driving habits and downloading that to other parties without the need for consent. So, the question is becoming: at what point does the risk of sharing too much information outweigh the convenience? It is likely that point has already been reached, if you ask me at least.

The Need for a Digital Switzerland

With the unholy trinity of governmental data gathering, corporate targeting, and cyber-criminality, the need for personal data security should be more than obvious. Yet, the ability to become secure is not something that individuals will be able to make happen on their own. Data collection systems are not accessible, and they are not modifiable by people without PhDs in computer science.

With privacy being compromised every time one views a webpage, uses a credit card, pays taxes, applies for a loan, goes to the doctor, drives on a toll way, buys insurance, gets into a car, or does a collection of other things, it becomes nearly impossible to preserve privacy. The central point here is that privacy is becoming scarce, and scarcity creates value. So, we could be on the verge of privacy and anonymity becoming a valuable commodity that people will pay for. A privacy industry will arise. Think of a digital Pinkerton’s.

It is likely that those who can afford digital anonymity will be the first to take measures to regain it. To paraphrase a concept from a famous American financial radio show host, privacy could replace the BMW as the modern status symbol. The top income earners who want to protect themselves and their companies will be looking for a type of digital Switzerland.

swiss army

Until now, a modicum of privacy had been attainable through careful titling and sequestering of assets (e.g., numbered bank accounts, trusts, shell corporations). That is not enough anymore. The U.S. Patriot Act, the European Cybercrime Convention, and EU rules on data retention are prompting the first stirrings of a return to the right to anonymity, and that pressure will land on the very governmental agencies that are driving privacy away. Dripping irony…

Legal, investigational, and engineering assets will need to be brought to bear to provide privacy services. It will take a team of experts to find where the bits are buried and secure them. Privacy needs do not stop at people either. Engineers will have to get busy to secure things as well.

The Internet of Things

Everything said until this point about the loss of personal privacy also applies to the mini-machines that are proliferating in the environment and communicating with each other about all kinds of things. The notion of the Internet of Things (IoT) is fundamentally about autonomous data collection and communication, and tens of billions of dispersed objects are expected to be involved only a few years from now. These numerous and ubiquitous so-called things will typically sense data about their surroundings, and that includes sensing people and what those people are doing. Therefore, these things have to add security to keep personal information out of the hands of interlopers and to keep the data from being tampered with. Protection against tampering is called data integrity in cryptographic parlance.
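
To make “data integrity” concrete, here is a minimal sketch of the idea using a message authentication code (MAC): a sensor reading travels with a tag computed from a shared secret key, and any change to the reading invalidates the tag. The key, field names and device names below are purely illustrative, and a real device would keep the key inside protected hardware rather than in source code.

```python
import hashlib
import hmac
import json

# Illustrative shared secret. On a real device this would be provisioned
# into protected hardware, never embedded in source code.
SECRET_KEY = bytes.fromhex("000102030405060708090a0b0c0d0e0f")

def protect_reading(reading: dict) -> dict:
    """Attach a MAC so a receiver can detect any tampering."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": reading, "mac": tag}

def verify_reading(message: dict) -> bool:
    """Recompute the MAC and compare it in constant time."""
    payload = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["mac"])

msg = protect_reading({"sensor": "thermostat-42", "temp_c": 21.5})
print(verify_reading(msg))          # True: the data is intact
msg["payload"]["temp_c"] = 35.0     # an attacker alters the reading...
print(verify_reading(msg))          # False: the MAC no longer matches
```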

What Can be Done?

To ensure that things are what they say they are, it is necessary to use authentication. Authentication, in a cryptographic sense, requires that a secret or private key be securely stored somewhere for use by the system. If that secret key is not secret, then there is no security at all. That is a simple point, but one of paramount importance.
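
As a rough sketch of what such authentication can look like, consider the classic symmetric challenge-response pattern: the host sends a fresh random challenge, the accessory answers with a MAC over it computed from the shared secret, and the host checks the answer. All names here are illustrative; in a hardware implementation the accessory’s computation happens inside the secure chip, so the key never leaves it.

```python
import hashlib
import hmac
import os

# Provisioned into both host and accessory ahead of time (illustrative only).
SHARED_SECRET = os.urandom(32)

def accessory_respond(challenge: bytes) -> bytes:
    # In a real design this runs inside the secure chip, so the secret
    # never appears in host or application memory.
    return hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()

def host_authenticate() -> bool:
    challenge = os.urandom(32)  # a fresh nonce defeats replayed responses
    response = accessory_respond(challenge)
    expected = hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)

print("accessory authenticated:", host_authenticate())
```

The whole scheme stands or falls on the secrecy of SHARED_SECRET, which is exactly why where that key is stored matters so much.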

2014-Crypto-Security-at-our-Core-Atmel-Has-You-Covered

The most secure way to store a cryptographic key is in secure hardware that is designed to be tamper-resistant and impervious to a wide range of attacks. Atmel has created a line of products called CryptoAuthentication precisely for this purpose. Atmel CryptoAuthentication products — such as the ATSHA204A, ATECC108A and ATAES132 — implement hardware-based key storage, which is much stronger than software-based storage because of the defense mechanisms that only hardware can provide against attacks. Secure key storage in hardware beats storage in software every time.

It is most likely that as we citizens of Digitopia continue to realize how dependent we are on data and how dependent those pieces of data are on real security, there will be a powerful move towards the strongest type of security that can be achieved. (Yes, I mean hardware.)

In the future, the most important question may even become, “Does your system have hardware key storage?” We should all be asking that already and avoiding those systems that do not. Cryptography is, as Edward Snowden has said, the “defense against the dark arts for the digital realm.”  We should all start to take cover.

Hacker plays Doom on a Canon printer

In 1993, Doom was a revolutionary, incredibly popular game. Today, it’s being used by hackers like Context Information Security’s Michael Jordon to demonstrate security flaws in connected devices.

canon-640-doom-printer-copy

Recently, a team of researchers completed a four-month-long hack that enabled them to access the web interface of a Canon PIXMA printer and modify its firmware to run the classic ’90s computer game. During his presentation at the 44Con Conference in London, Jordon showed the audience just how easily he could compromise the Canon machine – a popular fixture in many homes and businesses.

Jordon undertook the endeavor of getting the game to run on the printer’s hardware in order to demonstrate the inherent security flaws in today’s Internet of Things (IoT) devices. From an exploitation standpoint, compromising the machine was trivial: the researcher discovered that the device had a web interface with no username or password protecting it, allowing anyone to check the printer’s status.

At first glance, this interface was of little interest, showing only ink levels and printing status. However, it soon became apparent that a hacker like Jordon could use it to trigger an update to the machine’s firmware. The firmware was encrypted to deter tampering, yet not well enough to stop knowledgeable hackers from reverse engineering the encryption scheme and packaging their own firmware in a form the printer would accept.

Subsequently, an outsider could have modified the printer’s settings so that it requested updates from a malicious server instead of Canon’s official channel. In practice, that means malicious hackers could read personal documents as the printer processed them, or issue commands that tie up its resources. In a business setting, hackers could also gain a foothold on the network from which to carry out further exploitation.
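
The underlying weakness is that the firmware was merely encrypted, not cryptographically authenticated. A hedged sketch of the standard countermeasure follows: the vendor signs each firmware image with a private key kept offline, and the device verifies the signature with the matching public key before installing anything. This example uses the third-party Python cryptography package purely for illustration; an embedded implementation would look quite different.

```python
# Illustration only; requires the third-party "cryptography" package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# --- Vendor side (done once; the private key stays offline) ---------------
vendor_private_key = ec.generate_private_key(ec.SECP256R1())
vendor_public_key = vendor_private_key.public_key()   # burned into the device

firmware_image = b"\x7fFIRMWARE-IMAGE-BYTES"           # placeholder contents
signature = vendor_private_key.sign(firmware_image, ec.ECDSA(hashes.SHA256()))

# --- Device side (runs before any update is accepted) ---------------------
def accept_update(image: bytes, sig: bytes) -> bool:
    try:
        vendor_public_key.verify(sig, image, ec.ECDSA(hashes.SHA256()))
        return True   # genuine vendor firmware
    except InvalidSignature:
        return False  # reject anything repackaged by an attacker

print(accept_update(firmware_image, signature))             # True
print(accept_update(firmware_image + b"-evil", signature))  # False
```

Even if an attacker fully reverse engineers the device, the public key only lets them verify firmware, not produce a signature the printer would accept.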

tech-canon-pixma-pro-printer

“If you can run Doom on a printer, you can do a lot more nasty things. In a corporate environment, it would be a good place to be. Who suspects printers?” Jordon explained to the Guardian. Canon responded with a statement: “All PIXMA products launching from now onwards will have a username/password added to the PIXMA web interface, and models launched from the second half of 2013 onwards will also receive this update, models launched prior to this time are unaffected. This action will resolve the issue uncovered by Context.”

Over the course of recent months, Context has been exposing various flaws found in unexpected places, such as a connected toy bunny, a smart light bulb and an IP camera. Believe it or not, a Canon printer isn’t the only system Doom has run on: earlier this summer, a team of Australians got it running on an ATM, and last year, a crew of modders managed to turn a piano into a Doom machine.

According to the Guardian, Jordon doesn’t believe manufacturers of such smart technologies are giving enough attention to security. “The maturity isn’t there,” he said.

“What this shows is that IoT means virtually anything with a processor and internet connection can be hacked and taken over to do just about anything,” says William Boldt, Senior Marketing Manager for Crypto Products at Atmel. “With cameras and mics on PCs, home alarms, phones, video game controllers like Kinect, and other things, just imagine how intrusive the IoT really can be.”

Atmel_September2014_pg2

Trust is what security is really all about, especially in today’s constantly connected, intelligent world. Atmel security products make it easier to design that trust in. By providing highly advanced cryptographic technologies, including industry-leading, protected hardware-based key storage that is far more secure than software-based alternatives, Atmel gives designers the strongest protection mechanisms available so their designs can be trusted to be real, reliable, and safe. After all, a smart world calls for smarter security.

The Atmel® CryptoAuthentication™ family offers product designers an extremely cost-effective hardware authentication capability in a wide variety of space-conscious packages. CryptoAuthentication ICs securely validate a wide variety of physical or logical elements in virtually any microprocessor-based system. Atmel offers both symmetric- and asymmetric-key algorithm-based devices. By implementing a CryptoAuthentication IC into your design, you can take advantage of world-class protection that is built with hardware security fortifications like full active metal shields, multiple tamper detection schemes, internal encryption, and many other features designed to thwart the most determined attacks.

Jordon’s wider point is that the world is filling up with smart objects and devices. Though they may not appear to be computers, they often have only minimal security guarding them against attack. This is where Atmel can help.