Tag Archives: Internet of Things

Zedcon is a smart, multi-functional LED controller


This controller illuminates dynamic digital LEDs based on time, music, events and a person’s feelings.


Recently launched on Indiegogo by Berlin-based startup Zedfy, Zedcon is an Internet-enabled, multi-purpose LED controller that allows users to control various LED strips right from their smartphone.


Zedcon comes with a companion mobile app that enables users to dim, switch and play with over 16 million different color combinations offered by RGB LEDs. And if you're the type of person who'd rather use a standard light switch, you can do that as well.

“Common LED strips can light up in every color of the rainbow — but only one color at a time. This is the principal difference to our digital LED strips, where every single LED can be lit up with its own color, while leaving the others unaffected. Zedcon takes full advantage of digital LED strips, by letting you address every single LED on an individual basis. This functionality is important for creating nuanced atmospheres and dynamic light conditions,” the team writes.
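The distinction the team draws can be sketched in a few lines. This is an illustrative model, not Zedcon's actual API: an analog-style strip shares one color across every LED, while a digital strip keeps a per-LED frame buffer, so a single pixel can change without touching the rest.

```python
# Illustrative sketch (not Zedcon's firmware): a digital LED strip is a
# frame buffer with one slot per LED, addressable individually.
def solid_strip(n, color):
    """Analog-style RGB strip: every LED shares the same color."""
    return [color] * n

def set_pixel(frame, index, color):
    """Digital strip: recolor a single LED, leaving the others unaffected."""
    frame = list(frame)
    frame[index] = color
    return frame

strip = solid_strip(8, (0, 0, 0))          # 8 LEDs, all off
strip = set_pixel(strip, 3, (255, 0, 0))   # light only the fourth LED red
```

Per-LED addressing like this is what makes nuanced atmospheres and dynamic patterns possible on digital strips.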


The controller lets users set an individual timer for every occasion, such as gradually waking you up in the morning or slowly dimming the dining room during a romantic dinner. It can also illuminate LED patterns to the beat of music, flash light notifications for an upcoming event or a completed washing-machine cycle, and adjust the lighting to fit a particular mood, whether that's getting work done or relaxing after a long week.

Each Zedcon is equipped with a Wi-Fi 802.11 b/g/n module, a built-in microphone, a music-processing chip and an LED indicator, among several other components. Not only is its data stored in the cloud, but multiple devices can also be connected to a network and controlled either simultaneously or separately.


So, whether it's automating lighting conditions at various times throughout the day, setting a mood in the office to boost productivity or putting on an impressive light show at your next party, Zedcon wants to change the way you interact with LEDs. Interested in a controller of your own? Head over to its official Indiegogo page, where the Zedfy team is currently seeking $50,000. Delivery is slated to begin in August 2015.

Neobase is a cloud-free private social network device


Neobase turns the concept of social media upside down, shifting the balance of ownership, control and security back to users.


It's nearly impossible to envision a time when social media didn't exist. From how we receive our news to how we engage with friends and family, sites like Facebook and Twitter have truly revolutionized the way in which we interact with the world around us. Given our modern-day state of interconnectivity, it seems like just about everything we see, do and feel is shared online. However, as recent breaches have made apparent, do we truly know who has access to all of that content? Fortunately, the Neone crew has designed a solution that aims to solve this problem.


Billed as the world's first private network device, Neobase is an encrypted, cylindrical gadget that allows owners to create an online community that only they control. Sharing with friends and family is seamless, as users decide exactly what to share and whom to share it with. And unlike many services before it, the unit doesn't rely on the cloud. Instead, all posts, comments, links, photos and files shared are stored on a user's Neobase. This keeps information protected, as it never has to pass through a website, a third-party vendor or the cloud — and theoretically, cuts out the middlemen. What's more, an Atmel ATSHA204 crypto engine plays an integral role in establishing its secure architecture.

“This means that no one — not even us here at Neone  — can know anything about you, your activities or what you share. Neone doesn’t host or operate your social network. You do,” the team writes.

Neobase’s plug-and-play functionality makes it easy to install and even easier to use. To get started, owners simply connect the device to their in-home network via Wi-Fi or Ethernet and begin assigning up to five family and friends as additional users. You can even connect with other Neobase users in the Neone Network if you choose.


As posts are created, users can pick and choose the specific friends in their network who will be able to see the content and any links, photos and files associated with it. Neobase then syncs directly to the other Neobase units that information is being shared with, and relays only the specific content that has been selected.
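The selective relay described above can be sketched simply. This is a hypothetical model, not Neone's code: each post carries an explicit audience, and a Neobase only forwards a post to the peer units named on it.

```python
# Hypothetical sketch of per-recipient sync (field names are assumptions,
# not Neone's API): a unit relays a post only to the peers on its audience.
def outbound_sync(posts, peer):
    """Return only the posts the given peer is allowed to receive."""
    return [post for post in posts if peer in post["audience"]]

posts = [
    {"id": 1, "body": "family photo", "audience": {"mom", "dad"}},
    {"id": 2, "body": "project link", "audience": {"alice"}},
]

for_mom = outbound_sync(posts, "mom")      # only the family photo
for_alice = outbound_sync(posts, "alice")  # only the project link
```

A peer not named on any post simply receives nothing, which is the point: no central service ever sees the full feed.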

Beyond that, the folks at Neone have developed the device so that, no matter where a user is located or how they are connected on the go, the Neobase mobile app uses a fully encrypted connection that links directly to their respective Neobase. Once again, no cloud required.

“The decentralized, peer-to-peer architecture of the Neone Network is a fundamental change in how your activities and information are stored and shared on the Internet, making it the heart of the Neobase’s security and privacy,” the team adds. “We’ve added additional security technology and encryption throughout the Neobase. Your computer or mobile device uses a secure SSH tunnel to connect to your Neobase and the Neone Network, which is much more secure than a browser with SSL.”


Given its sleek, polished white design and compact size (6" tall, 3.5" in diameter and weighing only 15 ounces), Neobase will be a welcome, aesthetically-pleasing addition to any living room, office or dorm room. The device offers 1TB of storage and a USB port for expanding it, and runs a customized version of Linux to support its social networking functions.

Sound like something you and your family would like to have? Neobase is currently live on Kickstarter, where its team is seeking $100,000. If all goes to plan, shipment is expected to begin by August 2015.

Countertop is a connected system that’ll make your kitchen smarter


What if a smart blender could suggest the perfect smoothie after a great workout? 


Given the rise in smart home popularity, it was only a matter of time before your kitchen would actually be able to make meal recommendations and then walk you through the preparation process. Particularly for those lacking the Emeril Lagasse or Gordon Ramsay culinary gene, this comes as great news.


Developed by Orange Chef, which some may recall from its 2013 breakthrough Prep Pad scale, Countertop is a connected-kitchen gadget capable not only of offering up nutritious food suggestions, but of assisting in cooking those items as well. With it, users can know exactly what they ate and drank during the day.

Aesthetically, the accessory essentially combines a traditional cutting board and kitchen scale with next-gen technologies. The device consists of a Bluetooth LE-embedded base that can weigh and track ingredients, along with an accompanying iOS app that dishes out step-by-step instructions and monitors nutritional intake.


What really sets the Countertop apart is its ability to recognize existing kitchenware, such as Vitamix blenders and Crock-Pot slow cookers. In other words, users won't need to replace entire appliances, but can simply retrofit them with dedicated Countertop adapters. The location-aware gadget is also capable of recognizing how much of something is being added and adjusts the recipe accordingly — all in real time. Meaning, even the worst cook can't mess up a meal.

Beyond that, Countertop syncs with fitness trackers like Jawbone's UP and Apple Health, and uses the data from workouts, activity and sleep patterns to serve up personalized meal and snack recommendations. Once a meal is suggested, a user can either swipe left to see additional options or swipe right to select a meal. Countertop learns meal likes and dislikes based on user selections, and as the app learns, it gets smarter and the meal recommendations become more precise. Since it can pair with wrist-worn wearable devices, this also frees up a home chef's hands.


And for those wondering, yes, it is dishwasher safe. Countertop is currently available for pre-order in the U.S. with shipment expected to begin later this year.

IT cloud vs. IoT cloud


Kaivan Karimi, Atmel VP and GM of Wireless Solutions, shares the top 10 factors to consider when transitioning from IT cloud to IoT cloud.


In mid-2013, the buzz phrase "Internet of Things," also known as "IoT," set the technology world on fire. As a result of this craze, a lot of products that were developed for completely different end applications changed all their marketing collateral overnight to become IoT products. We saw companies add the acronym "IoT" to the title of every executive, and gadgets become part of an IoT enablement ecosystem. New tradeshows claimed authoritative positions on IoT, and angel investors and venture capitalists started IoT funds feeding incredible ideas, some of which reminded me of the late-1990s bubble when Lemonade.com was funded. New standards bodies were formed around provisioning IoT devices, and suddenly, overnight, most of us in the technology community became IoT experts.


Cloud companies are no exception. While the physical infrastructure of the cloud didn't change, the platform and software services developed for enterprise IT management and mobility-app support became IoT PaaS and SaaS platforms with claims of "IoT compliance." By late 2013, at an IoT event in Barcelona, every keynote talked not only about the metaphorical pyramid of Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS), but also about "Everything as a Service (EaaS)," thanks to IoT.

With so much hype and noise, it is hard to separate fact from fiction unless you dig deep, really deep. This fuzziness is caused by the breadth of IoT and the many vertical markets it encompasses, covering all aspects of life as we know it. Each vertical has its own unique "things," so one size doesn't fit all from a device perspective; different types of standards and transport layers, with silicon and software infrastructure to support them, are required across this vast frontier. What has further muddied the water is that many large industry players see IoT as an inflection point at which they can transform themselves into something else and get into other businesses. Because of this, they look at their current assets and define the infrastructure required for IoT differently than what logically and technically makes sense. Companies with no play in data-center hardware or software publicly promote the idea that the majority of the data processing should be done in other parts of the network ("closer to the source"), while others promote just the opposite, and a third group advocates that much of the processing should be done by a hierarchy of smart gateway boxes on the customer premises, along with everything in between. The same goes for the choice of RF communication protocols, gateways, definitions of things, provisioning schemes, etc.

A great example is one of the biggest industry players heavily promoting IoT as an "always ON revolution," in which sensor data collected at the edge/sensing nodes (the thing side) is ALWAYS sent to the cloud. This approach requires a lot of bandwidth and storage capacity to collect data in the cloud, and conveniently promotes their passive big-data analytics capabilities for processing that volume of data. Clearly, they sell hammers and see everything in the world as a nail. In reality, IoT is a "mostly OFF revolution," with significantly less data created than portrayed, and little of that data will ever make it to the cloud. For instance:

  1. A door or key lock is mostly sleeping until a sensor triggers a wake-up command during an opening or proximity event, in which case it communicates a few bytes of data to a gateway and then goes back to sleep.
  2. The temperature sensors on a bridge wake up every so often to report temperature fluctuations to a gateway on the side of the road. If the bridge is frozen, the gateway tells the department of transportation to send out the sand trucks to avoid accidents.
  3. The seismic sensors on the A/C unit of an office building in Texas monitor the sound of the motor every two hours. If the motor sounds as if it will break down in a couple of weeks, the sensors tell the building manager to call a technician to fix what is going bad, so the office won't be stuck without air conditioning in the middle of July.
  4. The ethylene gas sensors (ethylene being the ripening phytohormone of fruits and plants) on fruit containers in the back of an eighteen-wheeler wake up every 30 minutes and send data to the gateway in the cabin of the truck. These readings predict the decay rate of the fruit, allowing the driver to divert to a nearby city if needed and give the fruit some additional shelf life, or to send it straight to the jam factory, avoiding the wasted fuel of carrying spoiled cargo.

In each of the aforementioned cases, and in other examples like them, the things (fruit container, A/C unit, bridge, home door, etc.) spend the majority of their time sleeping and only wake up based on an event trigger or a predetermined wake-up time set by programmed policy. This is the only way these devices can run on batteries for years. How many bytes (not Mbps or even Kbps) of data are really required to report those events? And are all of these events worth sending to the cloud? In practice, a local event-processing and analytics engine running on the gateway determines what goes to the cloud: only the exception events (door is open, fruit is going bad, motor is about to break down, bridge is frozen, etc.) are sent right away. As long as everything is normal (events within the policy range), readings are registered at predetermined intervals (e.g. once every 24 hours) and the metadata is uploaded to the cloud. Even if video capture were involved, no more than 2 Mbps of bandwidth would be needed.
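That gateway policy can be sketched in a few lines. This is an illustrative model with assumed field names, not any vendor's implementation: readings inside the policy range are held for the periodic summary, while exceptions go to the cloud immediately.

```python
# Illustrative sketch of a gateway's event-routing policy: in-range
# readings are batched for the periodic upload; out-of-range readings
# (frozen bridge, failing motor, etc.) are forwarded to the cloud at once.
def route_reading(reading, policy):
    low, high = policy[reading["type"]]
    if low <= reading["value"] <= high:
        return "batch"        # normal; roll into the e.g. once-daily summary
    return "cloud_now"        # exception event: report immediately

policy = {"bridge_temp_c": (-5.0, 45.0)}   # below -5 C counts as frozen

normal = route_reading({"type": "bridge_temp_c", "value": 12.0}, policy)
frozen = route_reading({"type": "bridge_temp_c", "value": -11.0}, policy)
```

The vast majority of readings take the `"batch"` path, which is why so little traffic ever needs to reach the cloud.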

Based on my experience analyzing multiple large enterprise campuses with many buildings, an aggregate of at most 15 Mbps of bandwidth is required to fully support this type of IoT communication to the cloud (without video) for provisioning services. So one should question the folks who promote the fallacy that all types of applications and things will always be ON and will need lots of bandwidth. What's in it for them to portray IoT in this manner? Of course, if you are considering an enterprise campus full of smart devices with people moving massive amounts of data through "chatty and persistent communication agents," then you will certainly need far more than 15 Mbps of connectivity to the cloud. Could it be that these folks are confusing an IT infrastructure with an IoT infrastructure?
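The arithmetic behind such estimates is simple. With assumed (illustrative) numbers, even thousands of mostly-off sensors reporting a few bytes at a time add up to very little aggregate bandwidth:

```python
# Back-of-the-envelope aggregate bandwidth for event-driven sensors.
# All figures below are assumptions for illustration, not measured data.
def aggregate_kbps(num_sensors, bytes_per_report, reports_per_hour):
    bits_per_second = num_sensors * bytes_per_report * 8 * reports_per_hour / 3600
    return bits_per_second / 1000

# 5,000 sensors, 64-byte reports, twice an hour: about 1.4 kbps on average
campus = aggregate_kbps(5000, 64, 2)
```

Averages hide bursts, of course, but the gap between this and an always-on streaming model is orders of magnitude.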

For a comprehensive IoT implementation, a system-level approach is required, covering everything from the tiniest edge/sensing nodes (things), through various types of gateways, all the way to the cloud and data centers, applications and service providers. This includes data analytics engines embedded both on premises and in the cloud, a variety of SDKs and communication agents, and data caching and bandwidth management at different layers and levels of the hierarchy. There aren't many companies in the world (single digits) that cover all of these items, and even those still require partnerships with the gadget/thing-side companies. Therefore, when someone claims to be a one-stop shop, either they support an existing infrastructure of things-to-cloud and add a new twist to it (a subset of most IoT verticals), or their system is not as comprehensive as they claim, or some combination of both.

Not to mention that, at this moment, we are dealing exclusively with siloed clouds and siloed IoT systems. While an ecosystem of clouds (a cloud of clouds) is in a nascent stage at some companies, it is far from the true IoT cloud ecosystem it will become in the near future.

The IT cloud ecosystem (as opposed to the IoT cloud ecosystem) has had a journey of its own over the past few years. It has shown the signs of success originally predicted, with technology distributed to provide a virtually seamless and infinite environment for communications, storage, computing, web and mobile services, analytics, and other business uses. The cloud benefit model has come to fruition, with many examples of upfront CAPEX largely minimized or eliminated, increased flexibility and control for scaling users, the ability to add functionality on demand, and the added benefit of pay-as-you-go. Cloud providers have taken over the IT requirements of many organizations and have become vital business and channel partners.


That said, the fundamental question still remains: Is the traditional IT cloud and its ecosystem the same as an IoT cloud and its ecosystem?

The answer: While 60-70 percent is the same, a 30-40 percent difference can kill your IoT roll-out and make a seemingly IoT-ready cloud almost useless for your applications.

The differences are present throughout the full end-to-end system, from the "thing" side all the way to the data centers on the cloud side. The traditional IT, web or mobility-applications cloud mirrors much bigger devices with more resources. Over the last couple of years, a "thing" in the traditional cloud system was a computer, a vending machine, a car, a gateway on the customer premises, or a smart device (laptop, tablet, smartphone, etc.). These devices are typically connected to the cloud via direct cellular links, cellular (WAN) + Wi-Fi (LAN), or fiber (WAN) + Wi-Fi (LAN). Among the new generation of IoT "things," you find much more resource-constrained devices: small battery-operated sensors on doorways to keep track of people entering through the back gate of the house, battery-operated seismic sensors on roadway infrastructure (bridges, etc.), or any of the examples above. Instead of 20 smart devices in an office, plugged into wall outlets or recharging large batteries on a regular basis, you will be dealing with 500 different types of sensors and things covering that office. With multiple offices, that means thousands of things at the same time, most of them powered by batteries for years (4-5 years of battery life in consumer IoT, and 8-12 years in industrial IoT). Some of these things have a small 8-bit MCU as their brain, with very little memory and few other resources, and may be hiding behind layers of gateways, relays, switches, even other things, in sleepy networks. The communication link, when available (remember that they are mostly off), may have very little bandwidth, and communication may go through multiple hops in mesh networks. A "chatty" communication system that pings the things on a regular basis defeats the purpose here.

The important thing to remember is that a system needs to be fully extendable and scalable not just on the cloud side, but also on the link side from the cloud to the things, and finally on the thing side. You also need scalable data capture and aggregation to go along with a secure communication system. If you are targeting a consumer application, then a solid mobile application development platform working with the popular smartphone operating systems is a basic requirement, meaning you need to rewrite your middleware to be more agile and scalable and to manage many more things simultaneously. You also need to rethink the communication topologies of the past. Lastly, pay more attention to your analytics engines and application development environment; depending on your IoT application, it may require completely different visualization tools and business models.

Here are some factors that an IT cloud provider transitioning to an IoT cloud provider needs to consider:

  • Understand the verticals you target; become a one-stop shop for a given vertical. In IoT, one size does not fit all. Understanding a vertical includes understanding its evolution and the future business models that need to be considered. For example, if you are targeting the tracking of people in a hospital and their locations at any given time, in the future that group will require wearables with biometric sensors, whose vital statistics will also need to be monitored. The expectation will be that your service can also cover those biometric sensors, which are usually battery-operated, resource-constrained devices with minimal bandwidth. Working with one PaaS or SaaS supplier to manage one set of assets on a premises and another cloud provider for a separate set of assets is not an option. The issues to consider include the protocols, networks, bandwidth management and transport technologies your IoT cloud framework will need to support.
  • A scalable data analytics and event-processing engine is a must-have, as the majority of IoT value creation comes from data analytics, and "data capital" is where the differentiation will come from. Do you have the right analytics engine on the cloud side as well as on the premises/gateways? New in-memory streaming technologies, which change the rate at which we can act on data, will be required for some IoT applications; traditional extraction, transformation and loading (ETL) will give way to just-in-time (JIT) methodologies (real-time vs. batch-oriented). Can you manage fast/streaming data analytics for applications where extremely fast processing of (near) real-time data is required? Think of tele-health and elderly monitoring, where passive data analytics in the cloud is not adequate and fast local analytics running on the smart gateway is required to report a heart attack, or of fire detection in home automation. It is also imperative that you find a service provider for a given vertical (if you are not a service provider, partner with one) so that your event-processing and data analytics engines are tuned for specific use cases and business logic. If your analytics engine only provides insight into the visibility or availability of a limited set of network parameters, work with a partner that brings the rest.
  • Know the specific type of data you need to monitor and gather, and the insight your customers require. That means developing a diverse set of device data models for specific functionalities. Don't try to be the Swiss Army knife of IoT cloud providers: a Swiss Army knife can perform many functions, but it doesn't do any one of them especially well. Understanding the verticals you need to support (item number 1) will help here. For certain applications, before the data sets are processed by analytics and visualization tools, they are combined with external algorithmic classification and enrichment tools. This increases productivity and ease of use dramatically (e.g. the user will know where the water tables are before drilling a well, or what the maps of other distribution centers look like before redirecting a cargo).
  • Develop a fully modularized end-to-end system, as most large OEMs may already have their own branded cloud and may only want to use part of the functionality you offer. Arm yourself with well-defined APIs and a firewall-friendly, adaptive connectivity architecture, and become comfortable working with your customers' infrastructure, analytics engines, applications, visualization tools, things, etc. They may only be interested in your communication system, or they may ask for a mix of capabilities. The more flexible your approach, the better you can tailor your offerings to their needs. On the cloud side, the formation of the cloud ecosystem (a cloud of clouds, with server-to-server communication) is right around the corner, and a robust ecosystem is at the heart of IoT cloud management.

A modularized system as described above may mean a different tiered pricing approach to your business model. Flexibility needs to extend beyond your technology offerings, so be open to new business models.

  • Follow the new service delivery frameworks with large ecosystems, such as the Open Interconnect Consortium (OIC). Standardization will eventually dominate both the consumer and industrial IoT space. While the alphabet soup of protocols may be expanding (e.g. MQTT, XMPP, DDS, AMQP, CoAP, RESTful HTTP), standardization is also happening and providing more clarity. Standards are being developed so there are "horses for different courses." Get used to the idea that your proprietary system of today will require an upgrade to a standard system tomorrow, or your ecosystem will leave you behind. How would you change your system today with that knowledge in hand?
  • Develop RF communication specialization (cellular, Wi-Fi, BLE, 802.15.4/ZigBee, 6LoWPAN, sub-GHz, Sigfox, etc.), or partner with someone who has that expertise. A lot of IT cloud companies today have a big gap here and need a partner to optimize their cloud for such complex RF communication protocols. They also need to optimize their systems based on the types of RF links and the bandwidth limitations they will be working with. This affects the application development side as well, since such customization is essential for IoT: what works for cellular might not work for Wi-Fi, BLE or ZigBee. It is especially important when it comes to target vertical markets, as different verticals might need different RF communication protocols, or even multiple ones simultaneously, with all the coexistence issues that entails. A semiconductor partner who understands your IoT cloud requirements can help you optimize your system from an RF communications and bandwidth-management perspective.
  • Whether you have an SDK- or agent-based mechanism, implement a lightweight communication system. Typical SDKs make the development and management of mobile apps easy, but remember that a smartphone has far more resources than a tiny, resource-constrained sensor feeding data into an IoT system. A lightweight SDK or agent-based system is much more predictable and simpler to integrate into low-memory or battery-operated devices. Lightweight agents reduce device complexity and cost, and capabilities can be added incrementally depending on where the agents reside in the system. Obviously, the more bells and whistles you add on the thing side (the number of statistics to track or alarm states), the larger the footprint of your SDK or agent. As you move up to the gateway levels of the hierarchy, with more mechanisms, functionalities, sensors, communications and alarms to monitor, the size of your agent or SDK will grow. One size will not fit all, but be frugal with your application and data management. Working with various IoT cloud ecosystem partners so far, I have seen SDK and agent memory footprints ranging from 3KB to 150KB. The IoT cloud journey has already started, and I have no doubt the higher end of that spectrum (and some of the intermediate steps) will shrink in the near future, while caching mechanisms become more robust.
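The economy a lightweight agent needs can be shown with payload framing. This comparison uses assumed field layouts for illustration: a verbose JSON report versus a fixed-format binary frame of the kind a constrained, battery-powered node would send.

```python
import json
import struct

# Illustrative footprint comparison (field layout is an assumption):
# the same sensor report as self-describing JSON vs. a packed binary frame.
reading = {"device_id": 1042, "temp_c": 21.5, "battery_pct": 87}

json_frame = json.dumps(reading).encode("utf-8")

# "<HfB" = little-endian: uint16 device id, float32 temperature, uint8 battery
binary_frame = struct.pack("<HfB", reading["device_id"],
                           reading["temp_c"], reading["battery_pct"])

# The binary frame is 7 bytes; the JSON frame is several times larger.
```

On a link that wakes for milliseconds at a time, that difference compounds across every report a device ever sends.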

Also, deploy a context-centric bandwidth management system that won't hog the entire link for your management-plane activities. A good rule of thumb is to occupy no more than 15% of the communication link with intermediate proxy and caching functionality.
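As a minimal sketch of that rule of thumb (the cap and link figures are illustrative assumptions), budgeting a link looks like this:

```python
# Sketch of the 15% rule of thumb: cap the management plane's share of a
# link and leave the remainder for payload. Numbers are illustrative.
def split_link_kbps(link_kbps, mgmt_cap=0.15):
    """Return (management budget, payload budget) in kbps."""
    mgmt = link_kbps * mgmt_cap
    return mgmt, link_kbps - mgmt

mgmt, payload = split_link_kbps(2000)   # a ~2 Mbps uplink
```

A real system would enforce the cap with rate limiting rather than static division, but the budget itself is this simple.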

  • Pay attention to "things," with a focus on ease of use. That means a way of provisioning a device so easy that even a novice thing developer can follow the steps on their own, regardless of the transport technology or resources available. If it takes too long, is error-prone, or requires an army of your developers to port and customize your agent for a particular architecture, you will be reducing your target market to only the very largest OEMs. If you assume you will do this work for service fees, it won't scale, and again you will only be targeting the large OEMs. If you partner with software services houses, you will scale better and gain additional bandwidth, but at a cost, and you will still be limiting your market to companies that can afford to pay for provisioning services. Why not make it easy right up front for maximum customer coverage? From the syntax of your APIs for things/sensors, local gateways and cloud gateways, to programming your agent logic, communications and service APIs, focus on simplicity, ease of use, and the out-of-the-box experience for your customers and developers.
  • Pay attention to visualization tools and the user experience in all parts of the system. "Thing virtualization and visualization," including elegant, robust applications that turn device data models into comprehensible information in the cloud, are great value propositions. If you are focusing on consumer IoT verticals, where smartphones will have a prominent role, include a robust mobile-app development environment. The IT cloud and the IoT cloud have different consumers of data, and elegant visualization features can set you apart from your competitors.
  • Last but not least, do you have a robust, hardened security and authentication mechanism that works with advanced encryption algorithms? Do you support both ECC and AES-128/256? How about PUF-based key generation? In IoT the stakes are very high, and you need to pay more attention to the security of the system, from the tiniest resource-constrained thing all the way to the cloud. Note that the security knowledge base among thing developers is low at the moment, so the cloud partner needs to bring some of the needed competence as well as enforce best practices. Basic elements on the thing side that need to be protected include secure boot, thing authentication, message encryption and integrity, and a trusted key management and storage scheme. A semiconductor partner who understands your IoT cloud requirements can help you optimize your system from a "thing" security perspective.
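One of the elements named above, message integrity, can be sketched with a standard-library HMAC. This is a minimal illustration, not a full scheme: a real deployment would pair it with encryption (e.g. AES) and hardware-backed key storage, and the key below is a placeholder.

```python
import hashlib
import hmac

# Minimal message-integrity sketch: sign a sensor message with HMAC-SHA256
# and verify it on receipt. The key is a placeholder, not a real provisioning
# scheme; production keys would live in secure hardware storage.
KEY = b"provisioned-per-device-key"

def sign(payload: bytes) -> bytes:
    return hmac.new(KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes) -> bool:
    # compare_digest avoids leaking information through timing
    return hmac.compare_digest(sign(payload), tag)

msg = b"door=open"
tag = sign(msg)
ok = verify(msg, tag)                  # authentic message
tampered = verify(b"door=shut", tag)   # altered message fails
```

Even this small step stops a spoofed "door=shut" from being accepted as genuine, which is exactly the class of failure that matters when the thing is a lock or an alarm.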

The transition from the IT cloud to the IoT cloud has already started, and as the IT cloud was a journey, the transformation to support IoT applications will also be a journey. What’s the best way to go about this change? Make this a comprehensive approach that will make your IoT cloud sustainable as the market transitions forward.

Report: Half of consumers believe smart home devices will be mainstream by 2020


New research from Bluetooth SIG shows that many folks are ready to live like the Jetsons.


A survey conducted by the Bluetooth Special Interest Group (SIG) has revealed that nearly half (46%) of consumers believe smart home devices will be mainstream by 2020. The study explored the attitudes of American, German and British consumers toward connected living and, on the whole, discovered tremendous excitement around not only potential applications but future installations, too.


Bluetooth SIG also found that 6% of those surveyed believe the era of the smart home has already arrived, with two-thirds (66%) thinking that smart home devices will be mainstream within the next decade. This strong consumer interest was tempered, however, by high expectations for simplicity and cost-effectiveness.

When asked what it would take for such devices to become commonplace purchases, 54% of respondents cited simplicity and straightforwardness in use, with 41% believing that they should be easy to configure. Moreover, 28% suggested that these gadgets should connect seamlessly with a smartphone, tablet or PC. Nearly three-quarters (73%) admitted they would be frustrated if it took too long to set up a smart home unit.

“This study confirms consumers are looking for smart home products that ‘just work’,” said Mark Powell, Executive Director of the Bluetooth SIG. “It’s evident demand for smart home devices is ramping up and consumers are keen to live in the scenarios conjured up by the Jetsons over 60 years ago. Smart home manufacturers need to deliver products that are simple, cost-effective and secure for this segment to become mainstream.”

As evidenced by the sheer number of hacks and flaws discovered in recent months, it’s no surprise that 42% of consumers felt that keeping their data secure was paramount in the decision-making process. 67% of those surveyed were also concerned that some smart home devices could make their data vulnerable.

smarthome-620x400

Despite all of the buzz surrounding intelligent appliances, like washing machines and kitchen gadgetry, the research unearthed that the hype has yet to materialize into actual demand from consumers. Keyword being ‘yet.’ In fact, the devices consumers find most appealing are highly convenient solutions that enable them to control their environment, such as smart heating/thermostats (45%), smart lighting (34%) and smart security/monitoring devices (33%).

As Bluetooth SIG explains, the results certainly conveyed a preference towards the smart home solutions that offer tangible benefits, ranging from controlling their heating or lighting remotely to cut down on bills (66%) to receiving smartphone notifications from their home security system if it detects a threat (73%).

“It’s clear there is an appetite for these kinds of solutions but widespread adoption will require the use of mainstream connectivity technologies,” Powell concluded. “As we’ve seen in other segments, niche technologies simply cannot provide the simplicity, interoperability and security that consumers demand. Bluetooth Smart technology offers all those things with an enormous install base in smartphones, tablets and PCs, a simple pairing process and AES-128 bit cryptography for maximum security. While consumers feel smart home devices aren’t quite mainstream yet, Bluetooth is already paving the way for manufacturers to deliver the products consumers want. These manufacturers can also be confident in the knowledge that Bluetooth Smart has a development environment that makes it easy to bring these products to market.”

More than ever, consumers have high expectations for home appliances. With billions of connected devices expected in the coming years, users will demand sophisticated, feature-rich products that are reliable, easy-to-use, and most of all, secure. Whether it’s refrigeration, cooking or washing, Atmel has you covered. Want to continue reading? You can find all of Bluetooth SIG’s findings here.

Apio is an IoT platform that lets you build smart devices


Apio lets you create smart objects in five minutes, while its SDK guides you along the way. 


Apio is an open-source platform for the Internet of Things, which lets Makers and designers create their own smart systems and connected objects in a matter of minutes. The platform is comprised of two USB devices, the General and Dongle, both of which are based on an ATmega256RFR2 and ATmega16U2, along with a custom operating system and SDK.

The General is a low-cost, low-power board that communicates wirelessly with the Dongle, which is tasked with connecting up to 65,000 General units and, through the Apio OS, controlling them via a mobile device or PC.

general1-2

The General is entirely Arduino-compatible, which means users can write their own code in the Arduino IDE, and features an integrated IEEE 802.15.4 radio running Atmel’s Lightweight Mesh (LWM) protocol. This allows every board to “talk” with the others in a wireless mesh network. Apio makes it super easy for Makers to get started right out of the box, thanks to a comprehensive set of libraries. Being open-source, more advanced users can also modify existing code or write their own, thanks to a powerful framework that supports a number of applications including IFTTT, Unity3D and Temboo.

comunicazionegeneral

So, what sort of IoT applications can the General be used for? For starters, Makers can develop an automatic watering system that lets them know when their plants are thirsty, or smarten existing household units like a smoke alarm or thermostat. Additionally, users can design an intelligent set of blinds or even connect a General to an electronic door lock to control it remotely. The possibilities are endless.

dongleTop

Meanwhile, the Dongle connects wirelessly to each General through the Apio OS, permitting anyone to control the boards from a smartphone, tablet or PC. The Apio Dongle implements Atmel’s Lightweight Mesh protocol using the ATmega256RFR2, which turns every single device into a signal repeater. Coverage therefore strengthens as devices are added to the network, overcoming Wi-Fi’s typical range problems. According to the team, Atmel’s LWM combined with XBee can provide a more affordable, lower-power radio solution than Wi-Fi. Beyond that, pairing a Dongle with a BeagleBone Black or Raspberry Pi gives users the ability to create their own smart home gateway.

“With Apio, you can interact with your creations as in an orchestra and you’re the leader. You don’t need wires or expensive installations to create your own symphony,” the team explains.

Interested? You can delve deeper into the IoT platform on its official page here, or its detailed Wiki page here.

Libelium sensors connect with Microsoft Azure cloud platform


An integration of Libelium and Microsoft Azure demonstrates a complete industrial IoT solution.


Back at Mobile World Congress 2015, Internet of Things provider Libelium revealed a new Microsoft Azure Cloud integration with its Waspmote wireless sensors to speed time-to-market for smart cities and IoT projects with scalable cloud infrastructure.

meshlium_connection_options_cloud_microsoft

This announcement couldn’t come at a better time. According to a recent report from Gartner, 1.1 billion Internet-enabled items will be used by smart cities in 2015, with that number expected to rise to 9.7 billion over the next five years. Beyond that, McKinsey Global Institute forecasts the economic potential of the IoT at $2.7 trillion to $6.2 trillion annually by 2025.

Powered by Atmel’s ATmega1281 MCU, Waspmote nodes are designed to be deployed by the thousands, connecting any sensor using any communication protocol to any cloud system. Sensor networks based on these nodes, along with Meshlium Internet gateways, power projects throughout the Industrial Internet, smart agriculture, energy monitoring and environmental control space. Meanwhile, businesses use Microsoft Azure to build and manage applications and services through a global network of data centers.

hardware_top_big-1

For Libelium customers, Microsoft Azure will now provide a scalable infrastructure for data, virtual machines, servers and frontend applications. With sensor technology to measure energy use and monitor environmental conditions and water quality, businesses can reduce costs and increase productivity. A customer integration of Libelium and Microsoft Azure demonstrates a complete industrial IoT solution in a smart factory, from sensor integration on the factory floor to business processes and data visualization in real time.

“Interoperability is vital to our development partners and customers as sensor-based IoT projects deploy at scale,” explained Javier Martinez, Libelium VP of Business Development. “Our IoT ecosystem includes the best cloud platforms in the market, and we make it easy for our partners to derive business value from wireless sensor networks with Internet of Things and contextual data.”

Interested in learning more? Head over to the company’s official page here. While on the topic of cloud integration, discover how Atmel is partnering with best-in-class providers to accelerate IoT development.

Friday Smart Lock turns your smartphone into your house key


Friday Smart Lock combines cutting-edge design, functionality and security.


Given the rise of connected devices in and around our homes, it’s no surprise that a number of smart locks have begun to populate the market. However, none may be as aesthetically-pleasing as the recently-launched Friday Smart Lock, which comes in a variety of materials like steel, porcelain, bronze and wood to match any decor.

20150328205806-lock

Crafted with a focus on both design and connectivity, Friday is said to be the first lock offering both Wi-Fi and Bluetooth support right out of the box. This allows users to open their door remotely via its accompanying mobile app, which is available for Android and iOS. That same dashboard also provides users the ability to grant temporary, one-time access or permanent privileges to visitors, directly from their phone. In the same breath, users can retrieve log files to see activity as well as cancel rights at any time.
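
The temporary and one-time access logic described above can be sketched as a small state machine. This is a hypothetical model for illustration, not Friday's actual implementation; all class and field names are invented:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AccessGrant:
    guest: str
    expires_at: Optional[float] = None  # None means a permanent grant
    one_time: bool = False
    used: bool = False

    def try_unlock(self, now: float) -> bool:
        """Return True if the grant currently authorizes an unlock."""
        if self.one_time and self.used:
            return False                # one-time grant already spent
        if self.expires_at is not None and now > self.expires_at:
            return False                # temporary grant has lapsed
        self.used = True                # recorded for the activity log
        return True

# A one-time grant opens the door once, then a second attempt is refused.
cleaner = AccessGrant("cleaner", expires_at=1000.0, one_time=True)
print(cleaner.try_unlock(now=500.0), cleaner.try_unlock(now=600.0))
```

Recording each `try_unlock` call is also what makes the retrievable activity log possible: every grant check is an auditable event.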

“The Friday Smart Lock is perfect if you are renting out through home rental services such as Airbnb or Homeaway. Instead of handing out physical keys, you can grant time-definite access via the app / web, giving you peace of mind of who has access to your home,” its creators write.

friday-smart-lock

Friday is installed directly inside a door, and will immediately send a notification to an owner’s phone in the event someone tries to get inside. As the brainchild of startup Friday Labs and architectural firm BIG, the lock is packed with AES-CCM cryptography, among several other security elements. What’s more, it can be seamlessly integrated with existing Apple HomeKit and Thread technologies.

Another notable feature is Friday’s compact size, which is believed to be the smallest of any retrofit smart lock on the market today. Measuring roughly 3 inches x 2 inches, the unit won’t draw any unnecessary attention to a front door.

20150327113921-Lock_Assembly

Those wishing to learn more or adorn their homes with a Friday Smart Lock should hurry over to its official Indiegogo campaign, where the team is seeking $75,000. If all goes to plan, shipment is expected to begin in September 2015. 

Atmel’s SAM L21 MCU for IoT tops low power benchmark


SAM L21 MCUs consume less than 940nA with full 40kB SRAM retention, real-time clock and calendar, and 200nA in the deepest sleep mode.


The Internet of Things (IoT) juggernaut has unleashed a flurry of low-power microcontrollers, and in that array of energy-efficient MCUs, one product has earned the crown as the lowest-power Cortex-M-based solution, with power consumption down to 35µA/MHz in active mode and 200nA in sleep mode.

How do we know that Atmel’s SAM L21 microcontroller can actually claim leadership in the ultra-low-power processing movement? The answer lies in the EEMBC ULPBench power benchmark introduced last year. It ensures a level playing field by having the MCU perform 20,000 clock cycles of active work once a second and sleep for the remainder of the second.
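
To see why that duty cycle rewards low sleep current, consider the back-of-the-envelope math. The active (35µA/MHz) and sleep (940nA) figures below are those quoted for the SAM L21; the 12MHz clock frequency is an assumption for illustration only:

```python
def avg_current_ua(f_mhz, active_ua_per_mhz, sleep_na, active_cycles=20_000):
    """Average current over a 1-second ULPBench-style duty cycle:
    `active_cycles` of work, then sleep for the rest of the second."""
    t_active = active_cycles / (f_mhz * 1e6)   # seconds spent awake
    i_active_ua = active_ua_per_mhz * f_mhz    # current while awake
    i_sleep_ua = sleep_na / 1000.0             # nA -> uA
    return i_active_ua * t_active + i_sleep_ua * (1.0 - t_active)

# 20,000 cycles at 12 MHz is under 2 ms of work per second, so the
# average is dominated by the 940 nA retention-sleep floor.
print(round(avg_current_ua(12, 35, 940), 2), "uA average")
```

Because the MCU is awake for only a fraction of a percent of each second, shaving nanoamps off the sleep floor moves the score far more than shaving microamps off the active current.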


ULPBench shows SAM L21 is lower power than any of its competitor’s M0+ class chips.

Atmel has released the ultra-low-power SAM L21 MCU it demonstrated at Electronica in Munich, Germany back in November 2014. Architectural innovations in the SAM L21 MCU family enable low-power peripherals — including timers, serial communications and capacitive touch sensing — to remain powered and running while the rest of the system is in a reduced power mode. That further reduces power consumption for always-on applications such as fire alarms, healthcare, medical and connected wearables.

Next, the 32-bit ARM-based MCU portfolio combines ultra-low-power with Flash and SRAM that are large enough to run both the application and wireless stacks. Collectively, these three features make up the basic recipe for battery-powered mobile and IoT devices for extending their battery life from years to decades. Moreover, they reduce the number of times batteries need to be changed in a plethora of IoT applications.

Low Power Leap of Faith

Atmel’s SAM L21 microcontrollers achieved a staggering ULPBench score of 185.8, way ahead of runner-up TI’s SimpleLink C26xx microcontroller family, which scored 143.6. The SAM L21 microcontrollers consume less than 940nA with full 40kB SRAM retention, real-time clock and calendar, and 200nA in the deepest sleep mode. According to an Atmel spokesperson, that comes down to one-third the power of competing solutions.

Markus Levy, President and Founder of EEMBC, credits Atmel’s low-power feat to its proprietary picoPower technology and the company’s low-power expertise in utilizing DC-DC conversion for voltage monitoring. Atmel’s picoPower technology employs flexible clocking options and short wake-up time with multiple wake-up sources from even the deepest sleep modes.

ULPBench aims to provide developers with a reliable methodology to test MCUs.

In other words, Atmel has taken the low-power game beyond architectural improvements to the CPU while optimizing nearly every peripheral to operate in standalone mode and then use a minimum number of transistors to complete the given task. Most lower-power ARM chips simply disable the clock to various parts of the device. The SAM L21 microcontroller, on the other hand, turns off power to those chip parts; hence, there is no leakage current in thousands of transistors in that part.

Here is a brief highlight of Atmel’s low-power development efforts that now encompass almost every peripheral in an MCU device:

Sleep Modes

Sleep modes not only gate the clock signal to stop switching consumption, but also remove power from sub-domains to fully eliminate leakage. Atmel also employs SRAM back-biasing to reduce leakage in sleep modes.

Consider a simple application where the temperature in a room is monitored using a temperature sensor with the analog-to-digital converter (ADC). In order to reduce the power consumption, the CPU would be put to sleep and wake up periodically on interrupts from a real-time counter (RTC). The measured sensor data is checked against a predefined threshold to decide on further action. If the data does not exceed the threshold, the CPU will be put back to sleep waiting for the next RTC interrupt.
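
That duty-cycled firmware logic can be sketched as follows. This is a pure-Python simulation of the control flow, with the sensor read and alarm callbacks standing in for real ADC and interrupt hardware; the threshold value is arbitrary:

```python
THRESHOLD_C = 30.0

def monitor_step(read_adc_temp, on_alarm):
    """One RTC wake-up: sample, compare against the threshold, act."""
    temp = read_adc_temp()    # ADC conversion triggered on wake-up
    if temp > THRESHOLD_C:
        on_alarm(temp)        # only now does the CPU do real work
    # returning == the CPU re-enters sleep until the next RTC interrupt

# Simulated run over three wake-ups; only the last sample crosses the line.
samples = iter([22.5, 24.0, 31.2])
alarms = []
for _ in range(3):
    monitor_step(lambda: next(samples), alarms.append)
print(alarms)
```

The point of the pattern is that the expensive path (the alarm) runs rarely, while the cheap path (sample, compare, sleep) runs every period.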

SleepWalking

SleepWalking is a technology that enables peripherals to request a clock when needed, allowing them to wake up from sleep modes and perform tasks without having to power up the CPU, Flash and other support systems. For instance, Atmel’s ultra-low-power capacitive touch-sensing peripheral can run in all operating modes and supports wake-up on a touch.

For the temperature monitoring application mentioned above, this means that the ADC’s peripheral clock will only be running while the ADC is converting. When the ADC receives the overflow event from the RTC, it will request its generic clock from the generic clock controller, and the peripheral clock will stop as soon as the ADC conversion is completed.

Event System

The Event System allows peripherals to communicate directly without involving the CPU and thus enables peripherals to work together to solve complex tasks using minimal gates. It allows system developers to chain events in software and use an event to trigger a peripheral without CPU involvement.

Again, taking the temperature monitor as a use case, the RTC must be set to generate an overflow event, which is routed to the ADC by configuring the Event System. The ADC must be configured to start a conversion when it receives an event. By using the Event System, an RTC overflow can trigger an ADC conversion without waking up the CPU. Moreover, the ADC can be configured to generate an interrupt if the threshold is exceeded, and the interrupt will wake up the CPU.
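
The event routing just described can be modeled in a few lines. This is a toy software model of the hardware behavior, not real peripheral code; the event names and the 30-degree threshold are invented for the example:

```python
class EventSystem:
    """Toy model: events route directly between peripherals, no CPU involved."""
    def __init__(self):
        self.routes = {}
    def connect(self, event, handler):
        self.routes.setdefault(event, []).append(handler)
    def fire(self, event, payload=None):
        for handler in self.routes.get(event, []):
            handler(payload)

cpu_wakeups = []
def adc_convert(_):
    reading = 31.2                     # pretend conversion result
    if reading > 30.0:                 # threshold compare, still no CPU
        cpu_wakeups.append(reading)    # interrupt: only now wake the CPU

es = EventSystem()
es.connect("rtc_overflow", adc_convert)  # route the RTC event to the ADC
es.fire("rtc_overflow")                  # conversion runs; CPU untouched
print(cpu_wakeups)                       # ...until the threshold interrupt
```

In hardware the `connect` step corresponds to configuring the Event System's channel registers once at startup; after that, the chain runs entirely without software.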

Low Power MCU Use Case

In his recent post on Atmel’s Bits & Pieces blog, titled “The SAM L21 pushes the boundaries of low power MCUs,” Paul Rako describes a sensor monitor that is asleep 99.99 percent of the time, waking up once a day to take a measurement and send it wirelessly to a host. Such tasks can be conveniently handled by an 8-bit device.

However, IoT applications involve protocol stacks and number crunching, which call for a faster ARM-class 32-bit chip. So, for battery-powered IoT applications, Rako makes the case for a 32-bit ARM-based chip that can wake up, do its thing, and go back to sleep. If a higher-current chip wakes up 10 times faster but uses twice the power, it will still use less energy and less charge than the slower chip.
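
Rako's energy argument is simple arithmetic: energy is power times time, so finishing the work faster can beat drawing less current. The numbers below are hypothetical, chosen only to illustrate the trade-off:

```python
def task_energy_uj(power_mw, time_ms):
    """Energy consumed per wake-up in microjoules: E = P * t (mW * ms == uJ)."""
    return power_mw * time_ms

# Hypothetical chips: the 32-bit part draws twice the power
# but completes the same task ten times faster.
slow_8bit = task_energy_uj(power_mw=5.0, time_ms=10.0)
fast_32bit = task_energy_uj(power_mw=10.0, time_ms=1.0)
print(fast_32bit, "uJ vs", slow_8bit, "uJ per wake-up")
```

With these numbers the faster chip spends one-fifth the energy per wake-up, and over millions of duty cycles that ratio translates directly into battery life.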

Next, Rako presents a sensor fusion hub as a case study. Rather than using the radio to send raw data from each sensor, the device saves power by letting the ARM-based microcontroller do the math and pre-processing to combine the raw data from all sensors, then assembling the result into a single compact chunk of data.

Atmel has scored an important design victory in the ongoing low-power game that is now prevalent in the rapidly expanding IoT market. Atmel already boasts credentials in the connectivity and security domains — the other two key IoT building blocks. Its connectivity solutions cover multiple wireless arenas — Bluetooth, Wi-Fi, Zigbee and 6LoWPan — to enable IoT communications.

Likewise, Atmel’s CryptoAuthentication devices come with protected hardware key storage and are available with SHA256, AES128 or ECC256/283 cryptography. This IoT triumvirate of low power consumption, a broad connectivity portfolio and crypto engineering puts Atmel in a strong position in the promising new IoT market, which increasingly demands a low-power MCU portfolio matched with high performance.


Majeed Ahmad is author of books Smartphone: Mobile Revolution at the Crossroads of Communications, Computing and Consumer Electronics and The Next Web of 50 Billion Devices: Mobile Internet’s Past, Present and Future.

This DIY monitor measures water usage throughout your house


While we may not be able to fix the drought, one Maker has set out to change how we use water at home.


As many of you are aware, California is currently facing one of the most severe droughts on record. While it may be a bit difficult to enact immediate change at the municipal level, we can drastically alter how we use our water at home. With this in mind, Maker Will Buchanan recently decided that it would be a good idea to focus his energies toward reshaping our consumption habits.

FYAPP4RI15KZLKT.MEDIUM

“It’s possible to dramatically change our behavior simply by making us aware, but we simply don’t know where our water goes. A bill at the end of the month doesn’t give you much useful information, and it gives you the information a month too late,” Buchanan writes.

Inspired by an earlier low-cost water flow sensor project, the Maker devised a plumbing-free, home automation system that can track water usage in real-time across in-home fixtures. This was done by employing a piezo buzzer and a Pinoccio mesh networking device.
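
However the raw flow is sensed, turning it into usable numbers is a simple conversion. The sketch below assumes a pulse-output flow sensor with a hypothetical calibration constant; Buchanan's piezo approach would need its own calibration, and a real constant comes from the sensor's datasheet:

```python
# Hypothetical calibration: pulses per liter for a typical low-cost
# pulse-output flow sensor (check the actual sensor datasheet).
PULSES_PER_LITER = 450.0

def usage_liters(pulse_count: int) -> float:
    """Cumulative volume implied by a raw pulse count."""
    return pulse_count / PULSES_PER_LITER

def flow_lpm(pulses_in_window: int, window_s: float) -> float:
    """Instantaneous flow rate in liters per minute over a counting window."""
    return usage_liters(pulses_in_window) * (60.0 / window_s)

# 225 pulses counted in one second -> 0.5 L/s, i.e. 30 L/min.
print(round(flow_lpm(225, 1.0), 1), "L/min")
```

Per-fixture dashboards like the one described below are just these two conversions applied to each fixture's counter, with timestamps attached before upload.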

F1TM4FXI15KZLUP.MEDIUM

For those unfamiliar with the IoT startup, a Pinoccio Scout is a pocket-sized board packed with wireless networking, a rechargeable LiPo battery, some sensors, and the ability to expand its capabilities through shields, much like an Arduino. It is equipped with an ATmega256RFR2, a single-chip 8-bit AVR processor with a low-power 2.4GHz transceiver for IEEE 802.15.4 communication.

In order to get a comprehensive idea of where the water goes, Buchanan thought it would be a good idea to monitor it at the outlet as well as the inlet. Through visual cues (such as light color, duration and intensity) at each fixture, the system can inform a user of how much water they are using at any given moment.

FMZRFMNI12K9R97.MEDIUM

Beyond that, he wanted the mechanism to relay the information to the cloud, where the data could be parsed and visualized in a “household usage” dashboard using Plotly’s streaming API. To accomplish this, the Maker created a source stream via Pinoccio and a destination stream with data.sparkfun.com, while Python was used to bridge the selected data. Buchanan then uploaded an Arduino sketch onto each of his wireless Field Scouts.

While this DIY system may not solve the impending crisis, it is surely a start. Not to mention, the monitor may make for a great Hackaday Prize submission. So if you’re ready to save the world one drop at a time, head over to the project’s detailed page here.