Tag Archives: DIY

Arduino Uno vs BeagleBone vs Raspberry Pi

Left to right: Arduino Uno, BeagleBone, Raspberry Pi

We like to build stuff here at Digital Diner. There is always some sort of project going on. These days, most of our projects include some sort of digital component – a microprocessor. If you haven't been bitten by the Maker bug yet, we strongly encourage it. It can be incredibly rewarding. If you have even a minimal understanding of programming, there are websites, platforms and tools to help you develop your skills to the point where you can actually create a hardware device with buttons, knobs and servos – a real physical-world gadget. Software is fun, but when you can make your project physical it is even better.

There are so many great platforms for creating digitally enabled devices that it's gotten hard to figure out which one to use. For example, we are currently building a hydroponic garden project and had to choose a controller to run the pumps, read the sensors, etc. We were surprised at the number of choices available to us. It can be a little confusing for the beginner. To help, we've taken three of the popular models and compared them so that you can choose the right tool for your next project. Spoiler: we recommend all three.

The three models (all of which we use here at Digital Diner) are the Arduino, Raspberry Pi and BeagleBone. We chose these three because they are all readily available, affordable, about the same size (just larger than 2″ x 3″) and can all be used for creating wonderful digital gadgets. Before we get to the comparison, here is a brief introduction to each one.

Arduino with Atmel

The Arduino Uno is a staple of the maker community. Arduinos come in various sizes and flavors, but we chose the Arduino Uno as an example of the prototypical Arduino. It has an easy-to-use development environment, an avid user base, and is designed to make it easy to interface with all sorts of hardware.

Raspberry Pi

The Raspberry Pi is the newcomer to the game. It isn't really an embedded computer. It is actually a very inexpensive full-on desktop computer. It is barebones, but at $35 for a real computer, it's worthy of note, and it is a great platform for lots of Maker projects.

BeagleBone

The BeagleBone is perhaps the least known of these platforms, but it is an incredibly capable board worthy of consideration for many projects. It is a powerful Linux computer that fits inside an Altoids mint tin.

The underside of the Raspberry Pi.

All three boards have features that make them valuable to the hobbyist. Below is a chart I put together outlining the features of the three for comparison. If you aren't familiar with what all of these mean, that is fine. However, there are a few differences that make each of these gadgets shine in its own types of applications.

Comparing the three platforms.

First, the Arduino and Raspberry Pi are very inexpensive at under $40. The BeagleBone comes in at nearly the cost of three Arduino Unos. Also worthy of note is that the clock speed on the Arduino is about 40 times slower than the other two, and it has 128,000 (!) times less RAM. Already, you can see the differences starting to come out. The Arduino and Raspberry Pi are inexpensive, and the Raspberry Pi and BeagleBone are much more powerful. It seems like the Raspberry Pi is looking really good at this point; however, it's never that simple. First, its price isn't quite as good as it seems, because to run the Raspberry Pi you need to supply your own SD card, which will add another $5-10 in cost.
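That RAM figure is easy to sanity-check with a little arithmetic. A quick sketch, assuming the Uno's ATmega328 has 2 KB of SRAM and the original Raspberry Pi Model B shipped with 256 MB:

```python
# Rough sanity check of the RAM gap between an Arduino Uno and a
# Raspberry Pi.  The figures are assumptions about these specific boards:
# 2 KB of SRAM on the Uno, 256 MB on the original Raspberry Pi Model B.
uno_ram = 2 * 1024               # 2 KB, in bytes
pi_ram = 256 * 1024 * 1024       # 256 MB, in bytes

ratio = pi_ram // uno_ram
print(ratio)  # 131072 -- roughly the "128,000 times" quoted above
```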

Also, despite the clock speed similarities, in our tests the BeagleBone ran about twice as fast as the Raspberry Pi. And perhaps most counterintuitively, the Arduino was right in the mix as far as performance goes as well, at least for a beginner. The reason for this is that the Raspberry Pi and BeagleBone both run the Linux operating system. This fancy software makes these systems into tiny computers which are capable of running multiple programs at the same time and can be programmed in many different languages. The Arduino is very simple in design. It can run one program at a time, and it is programmed in low-level C++.

An interesting feature of the BeagleBone and the Raspberry Pi is that they run off of a flash memory card (SD Card in the case of Raspberry Pi and MicroSD Card in the case of BeagleBone). What this means is that you can give these boards a brain transplant just by swapping the memory card. You can have multiple configurations and setups on different cards and when you swap cards, you’ll be right where you left off with that particular project. Since both of these boards are fairly sophisticated, it even means that you can easily change operating systems just by creating different cards to swap in.

Choosing a Platform

So why would you choose one platform over the other?

For the beginner, we recommend the Arduino. It has the largest community of users, the most tutorials and sample projects and is simplest to interface to external hardware. There are more ways to learn about Arduino for beginners than you can shake a soldering iron at.

The boards are designed to easily interface with a wide variety of sensors and effectors without any external circuitry, so you don't need to know much about electronics at all to get started. If you haven't played with these before, get one (they're inexpensive) and try it. It can be a really great experience.

Raspberry Pi

A credit-card sized computer that plugs right into your TV. It has many of the capabilities of a traditional PC and can be used for word processing, spreadsheets, and games.

BeagleBone

It's the low-cost, high-expansion, hardware-hacker-focused BeagleBoard for people who love embedded Linux systems. Basically a bare-bones BeagleBoard, it can run all by itself or act as a USB- or Ethernet-connected expansion for your current BeagleBoard or BeagleBoard-xM.

Arduino Uno

An amazing tool for physical computing — it's an open source microcontroller board plus a free software development environment.

For applications where minimizing size matters, we recommend the Arduino. All three devices are similar in size, although the Raspberry Pi's SD memory card sticks out a bit, making it slightly larger overall. There are so many different flavors of Arduino it is ridiculous. Basically, what makes an Arduino an Arduino is a particular microprocessor and a little bit of software. It uses a very small, inexpensive, embedded system-on-a-chip microprocessor from a company named Atmel. For advanced projects that need to be really small, you can buy these chips for a dollar or two, put the Arduino bootloader (a program that gives the Arduino its basic functions) on the chip, and voilà, you have an Arduino. We have done this for a few projects, and it can make for a very tiny little gadget when you don't even have a circuit board.

A variety of different Arduino sizes and form factors

The BeagleBone beside its big brother, the BeagleBoard

The BeagleBone has a larger and more powerful big brother, the BeagleBoard, so if you may need to scale up, the BeagleBone is a good choice.

The Arduino Uno, BeagleBone and Raspberry Pi. Note the Ethernet ports on the BeagleBone and Raspberry Pi.

For applications that connect to the Internet, we recommend the BeagleBone or Raspberry Pi. Both of these devices are real Linux computers. They both include Ethernet interfaces and USB, so you can connect them to the network relatively painlessly. Via USB, you can connect them to wireless modules that let them connect to the Internet without wires. Also, the Linux operating system has many built-in components that provide rather advanced networking capabilities.

A very small USB WiFi adapter plugs right in to the BeagleBone or Raspberry Pi, and the Linux operating system can support these types of devices.

The Arduino supports plug-in peripherals called "shields" that include the ability to connect to Ethernet, but access to the networking functions is fairly limited. Plus, by the time you buy the Ethernet shield, you might as well just get one of the more advanced boards.

For applications that interface to external sensors, we recommend the Arduino and the BeagleBone. The Arduino is the easiest of the boards to interface to external sensors. There are different versions of the board that operate at different voltages (3.3 V vs 5 V) to make it easier to connect to external devices. The BeagleBone only operates with 3.3 V devices and will require a resistor or other external circuitry to interface to some devices. Both the Arduino and BeagleBone have analog-to-digital interfaces that let you easily connect components that output varying voltages. The BeagleBone has slightly higher resolution analog-to-digital converters, which can be useful for more demanding applications.
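To see what that resolution difference means in practice, here is a small sketch of the standard raw-count-to-voltage conversion. The figures are assumptions about typical configurations: a 10-bit ADC with a 5 V reference (Arduino Uno style) versus a 12-bit ADC with a 1.8 V reference (BeagleBone style):

```python
# Convert a raw ADC reading into a voltage.  An n-bit converter splits
# the reference voltage into (2^n - 1) steps, so more bits and a lower
# reference voltage both mean finer resolution.
def adc_to_volts(raw, bits, vref):
    return raw * vref / ((1 << bits) - 1)

# Smallest voltage step each converter can distinguish:
print(adc_to_volts(1, 10, 5.0))   # ~0.0049 V per step (10-bit at 5 V)
print(adc_to_volts(1, 12, 1.8))   # ~0.00044 V per step (12-bit at 1.8 V)
```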

With that said, it is important to note that many things that you would want to connect to, including little sensors, have digital interfaces called I2C or SPI. All three boards support these types of devices and can talk to them fairly easily.
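"Talking to" such a device mostly means reading registers and decoding the bytes. As a sketch of the kind of byte-level work involved (the register layout here is hypothetical, though the two's-complement, 1/256-degree-per-count convention is common in I2C temperature sensors):

```python
# Decode a hypothetical 16-bit big-endian I2C temperature register,
# where each count is 1/256 of a degree Celsius.  On any of the three
# boards, the I2C transaction itself would hand you these two bytes.
def decode_temp(msb, lsb):
    raw = (msb << 8) | lsb        # combine the two register bytes
    if raw & 0x8000:              # sign-extend a negative reading
        raw -= 1 << 16
    return raw / 256.0            # counts -> degrees C

print(decode_temp(0x19, 0x80))  # 25.5
```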

For battery-powered applications, we recommend the Arduino. The Arduino uses the least power of the bunch, although, in terms of computing power per watt, the BeagleBone is the clear winner. However, the Arduino has an edge here since it can work with a wide range of input voltages. This allows it to run from a variety of different types of batteries and keep working as the battery loses juice.

For applications that use a graphical user interface, we recommend the Raspberry Pi. The Raspberry Pi is really in a category by itself because it has an HDMI output. That means you can plug in a mouse and keyboard and connect it directly to your TV. At that point you have a fully functional computer with a graphical user interface. This makes the Raspberry Pi ideal for use as a low-cost web browsing device or for creating kiosk-type projects where you may have a display that people interact with. In fact, just for fun, we installed the Arduino development tools on the Raspberry Pi and were able to write a small program and download it to an Arduino from the Raspberry Pi. It's not a very fast computer, but it really is a computer.

Summary

The Arduino is a flexible platform with great ability to interface to most anything. It is a great platform to learn first and perfect for many smaller projects. The Raspberry Pi is good for projects that require a display or network connectivity. It has incredible price/performance capabilities.

The BeagleBone is a great combination of some of the interfacing flexibility of the Arduino with the fast processor and full Linux environment of the Raspberry Pi (more so in fact). So, for example, to monitor our hydroponic garden, we will likely use the BeagleBone since it has good input/output features and can easily connect to the network, so we can have it run a web server to make its readings available to us.

All three of these are staples of our projects here at Digital Diner. Of course, there are other platforms out there, for example, we monitor our tomato garden using Sun SPOTs, but these three will cover most people’s needs until you get fairly advanced.

Thanks to Make and Roger Meike for allowing us to repost his comparison article here on the Atmel site. Regarding original source, the Monday Jolt is a new column about microcontrollers and electronics that appears in MAKE every Monday morning. This post was written by Roger Meike and appeared on the Digital Diner on October 24, 2012. It is reposted here on the MAKE site with permission.  

1:1 interview with Michael Koster

Series 3 – Why IoT Matters?


By Tom Vu, Digital Manifesto and Michael Koster, Internet of Things Council Member


Three-part Interview Series (Part 3)


Tom Vu (TV):  Why does the Internet of Things matter? Why should anyone care? Do futurists, technologists, data hounds, product extraordinaires, executives, and common consumers need to understand what's to come?

Michael Koster (MK):

There are two main effects we see in the Internet of Things. First, things are connected to a service that manages them. We can now monitor things, predict when they break, know when they are being used or not, and in general begin to exploit things as managed resources.

The second, bigger effect comes from the Metcalfe effect, or simply the network effect, of connecting things together. Bob Metcalfe once stated that the value of a communications network is proportional to the square of the number of connected compatible communicating devices. Since then it has been used to refer to users, but maybe Bob was thinking way ahead. Notice the word compatible. In this context, it means being able to meaningfully exchange data.

When we connect physical objects to the network, and connect them together in such a way as to manage them as a larger system, we can exploit the Metcalfe effect applied to the resources. We are converting capital assets into managed resources and then applying network management.

Because the Internet of Things will be built as a physical graph, it is the socialization of everything, from simple everyday devices to industrial devices. Metcalfe's law says that 10X the connections means 100 times the value. Cisco is projecting that the Internet of Everything has the potential to grow global corporate profits by 21 percent in aggregate by 2022. I believe these represent a case for pure information on one end, and an average efficiency gain over all of industry on the other.
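The 10X-to-100X claim falls straight out of the square law. A toy calculation:

```python
# Metcalfe's law as a toy calculation: network value is modeled as
# proportional to the square of the number of compatible connected devices.
def metcalfe_value(n):
    return n * n

# Scaling connections by 10x scales the modeled value by 100x.
print(metcalfe_value(1000) / metcalfe_value(100))  # 100.0
```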

This has the potential to change things from a scarcity model, where the value is in restricting access to resources, thus driving up price, to a distribution-centered model, where the value is in the greater use of the resource. Connecting things to the network is going to reverse the model, from a model of "excluding access" to one of "including access" — a model where you push toward a better experience for the consumer, customer, and co-business.

Crowdsourcing of things is an example where models are inverted. The power arrow is going in the opposite direction, a direction equalizing toward the benefit of the massive body of consumers and people. This, in turn, helps shift the business model from a customer relationship managed by vendors, also called advertising, to a vendor relationship managed by customers. This is called Vendor Relationship Management, or VRM, pioneered by Doc Searls. This reverses the power arrow to point from customer needs toward the business capability to meet those needs, and needs are met now that the vendor is listening. A lot of this is not just IoT but also the open source nature of the big changes happening in people, where sharing is held to be more valuable than the exclusion of access.

Inverting the value model, breaking down artificially bloated value chains, creating a more efficient economy: I believe it is important to create a layer of connectivity that will act as the necessary catalyst for the next Internet of Everything, Internet of Things, and Industrial Internet. Break down the scarcity-based models and the exclusion of access, and turn them around. Instead of excluding access and driving prices up for limited resources, we will get higher, more efficient utilization of resources.

Michael Koster describing the Internet of Things, the Maker Movement, and the importance of open source to this development with booth attendees at Maker Faire 2013 in San Mateo

It matters on a global scale, by giving us better resource utilization. SMART Grid alone has resulted in up to a 19.5% efficiency improvement, with an average improvement of 3.8% over all deployments already. We do not have enough energy storage or transmission capacity to deal with the major shift to solar energy sources now in progress worldwide. We are going to have to adapt, learn, monitor, manage, and control our usage in ways only possible with large-scale sensing and control.

As for the spirit of IoT, it's not only about making people's and consumers' lives more convenient, solving their first-world problems; it's more about the ability to manage resources together as a larger system, from the individual out to a global scale. Especially, this holds true with the effects of globalization, balancing, localization, connectivity, and ubiquity. It's for the people. Social media had its transformation across many things; the Internet of Things will also have an efficiency and business transformation.

Companies like Atmel play an important role in creating the building blocks for embedded control and connectivity by progressing the ARM / AVR / Wireless / Touch portfolio of products, all of which are the necessary thinking and connecting glue of the Internet of Things. The Internet of Things has a large appetite for ultra-low-power connectivity using wireless standards. Wireless sensor networks are a key technology for the IoT, so much so that the WSN was probably the number one issue in early deployment. There are many competing standards: ZigBee, ISA100.11a, Bluetooth, Body Area Network, Wi-Fi Direct, NFC, Z-Wave, EnOcean, KNX, XRF, WiFi, RFID, RFM12B, IrDA, Wireless USB, and IEEE 802.15.4 (the radio layer underlying WPAN protocols such as ZigBee, ISA100.11a, WirelessHART, and MiWi).

Michael Koster exhibiting at the Atmel booth at Maker Faire 2013 in San Mateo

Tom Vu (TV):  What is the most important design decision bearing on the eventual success of an open source Internet of Things?

Michael Koster (MK):

The first and most important decision is to do open source design based on needs and use cases. I don't think we can build an IoT if it's not open source, or if it's not connected to real-world use cases.

Just like the Internet, built on open source and open standards, the starting data models are important for building on and building out. HTML, HTTP, and URLs allowed many platforms to be built for the web and to supersede each other over time, for example Server Pages, SOAP, JavaScript, and AJAX. A browser can understand all of the current platforms because they are all based on common abstractions. We believe that the Semantic Web provides a solid basis of standard web technology on which to base the data models.

Tom Vu (TV):  Describe the importance of Internet of Things silos and other M2M standards currently at large in the development community? What are the differences?

Michael Koster (MK):

The IoT has started off fueled by crowdfunding, VC money and other sources that have to some extent built on a business model based on vertical integration. Vertical integration has a big advantage; you need to have a self-contained development to get things done quickly for proof of concept and demonstration.

Vertical integration is also a big driver of the current machine-to-machine, or M2M, communication market. This is the paradigm supporting the initial deployment of connecting things to services for management on an individual thing basis.

The downside of vertical integration is that it leads to silos, where the code developed for a system, the data collected, and even the user interfaces are all unique to the system and not reusable in other systems. Moreover, the vertical integration is often seen as a proprietary advantage and protected through patents and copyrights that are relatively weak because they apply to commonly known patterns and methods.

It’s not always this way, though. As an example, the Eclipse foundation is open source, allowing their M2M system to be used for vertical application development as well as integrated with IoT Toolkit data models and APIs to enable interoperability with other platforms.

The European Telecommunications Standardization Institute, or ETSI, also has an M2M gateway that is a combination of open source and paid license code. New features are enabled through Global Enablers or GEs that implement a particular function using an OSGi bundle consisting of Java code. The Smart Object API can be built into ETSI through a GE bundle, which will enable an ETSI M2M instance to inter-operate with other IoT Toolkit instances. This is the power of the approach we’re taking for interoperability, which is obtained by adding a Smart Object API layer to the system.

Tom Vu (TV):  Explain horizontal and service interoperability for Internet of Things, why is it so important?

Michael Koster (MK):

Connected things connect through WSN gateways and routers to Internet services that fulfill the application logic for the user. Today, for the most part, each vendor provides a cloud service for the devices they sell, e.g. Twine, SmartThings, or the Nest thermostat. There are also some cloud services that allow any connection, providing an API for anyone to connect, for the purpose of integrating multiple devices. But the dedicated devices mentioned earlier don't work with the generic cloud services.

Many IoT services today are based on providing easy access to the devices and gateway, with open source client code and reference hardware designs, selling hardware on thin margins, and Kickstarter campaigns. There is typically a proprietary cloud service with a proprietary or ad-hoc API from the device or gateway to the service, and a structured API to the service offering “cooked” data.

These systems contain a highly visible open source component, but much of the functionality comes from the cloud service. If a user wishes to use the open source part of the system with another service, the APIs will need to be adapted on either the device/gateway end or service end, or both. It’s not exactly a lock-in, but there is a fairly steep barrier to user choice.

Internet of Things (IoT) in silos

There is the beginning of an ecosystem here, where some devices are being built to use existing services, e.g. Good Night Lamp uses Cosm as their cloud service. Other services that allow open API connectivity include Thingworx and Digi Device Cloud. These services all use very similar RESTful APIs to JSON and XML objects, but have different underlying data models. As a result, sensors and gateways must be programmed for each service they need to interact with.
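That "same API style, different data models" problem can be made concrete with a toy example. The field names below are hypothetical, not the actual schemas of any of the services named above; the point is that a gateway must carry a translation step like `a_to_b` for every pairing of services it talks to:

```python
import json

# Hypothetical service A: flat datastream-style JSON.
service_a_msg = json.loads('{"datastream": "temp1", "current_value": "21.5"}')

# Hypothetical service B: nested sensor-object JSON.
def service_b_model(name, value):
    return {"sensor": {"id": name, "reading": {"value": value, "units": "C"}}}

# The per-pairing glue code: same reading, restructured field by field.
def a_to_b(msg):
    return service_b_model(msg["datastream"], float(msg["current_value"]))

print(json.dumps(a_to_b(service_a_msg)))
```

A shared underlying data model would make this translation layer unnecessary, which is exactly the interoperability argument being made here.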

The current system also leaves users vulnerable to outages of a single provider. Even if there were a programmable cloud service that all could connect to and that ran user applications, there would be a vulnerability to provider outages. Much better and more robust would be the ability to configure more than one service provider in parallel in an application graph, for a measure of robustness in the face of service outages. Even better, it should be possible to run user application code in IoT gateways, local user-owned servers, or user-managed personal cloud services. Today's infrastructure and business models are at odds with this level of robustness for users.

In terms of business and business models, a lot of the connection and network infrastructure today was built on a “value chain” model. These are businesses that are built on a model of vertical integration. In these models, value is added by integrating services together to serve one function, hence vertical.  With the Internet of Things, traditional value chains are collapsing down and flattening. There is a bit of a disruption in the business model (services, etc), but also new opportunities emerge to create new Internet of Things services, which is good for business and consumers.

Companies will continue to build out vertical models to specialize in their services. IoT can potentially augment service models with the customer even further, offering creative possibilities for cost savings and experience and deploying more customer-centric business fabrics, which will result in better service for consumers.

If companies build their vertically based infrastructure of applications integrating into the IoT Toolkit platform, the basic enablement for horizontal connections will already exist, making it easy to create horizontal, integrative applications based on automatic resource discovery and linkage.

Access to this knowledge can enhance the customer experience and ROI for businesses. We are at the brink of a new era, where companies and products can arise from the information economy; only now, motivation via implicit or explicit engagement is tied to things, assets, information, sensors, education, and augmentation, and everything is more intertwined and involved.

Tom Vu (TV):  Please assume the role of a futurist or even contemporary pragmatist. How does the landscape of Internet of Things fit into that picture for an individual?

Michael Koster (MK):

It goes back to the idea that your life is going to change in ways such that we are no longer driven by the scarcity pressures we experienced as hunter-gatherers. IoT will trigger the overall shift from the resource-accumulating model to an interaction-driven, resource-sharing model, thanks to ubiquitous connectivity and the right kind of applications we can use to bring this experience to maturity.

We expect the Internet of Things to be where the interaction moves away from screens and becomes more like everyday life, only more convenient, comfortable, and easy to manage. We’re still looking for the valet, the system that simply helps us manage things to enable us to become more as people.

Tom Vu (TV):  Do you have any insights into how industries like semiconductors can help share the responsibility of making an Internet of Things for the people and by the people?

Michael Koster (MK):

Yes, of course, everyone has a part in the build-up and build-out of the Internet of Things. From business to academia, in the home and across the planet, the march to the Internet of Things is inevitable. Again and again, the familiar signs of disruption are being seen. We see that happening today with the very first releases of connected products. There is a movement in Makers, with substantial global activity, which is quite harmonious with open source and open hardware. This will be even more widespread once critical mass takes effect, with more and more products becoming connected and smart via the Internet. The power of sensor proliferation is akin to Twitter having 10 registered people using its social fabric versus hundreds of millions. The more everyday devices and things are connected, the more the power of IoT will overwhelmingly surface.

It's only a matter of how well we integrate and collaborate across industry to propel this next phase of the Internet to the next level. Every potentially disruptive technology has a turning point. We are at that point, and we are all part of this movement. In turn, the Internet of Things will make better products, a better user experience, and optimized efficiency across all resources. How we decide to apply this technology will make all the difference.

This very notion forces industries to be more aware, efficient, and productive. Sensors and connected devices will help supply chains, manufacturing, research, product roadmaps, and experience, and ultimately drive an economy of growth. The enterprise begins to have visibility and transparency to customers and people. Ultimately, it's a true nervous system, connected from the enterprise level down to the personal consumer level.

SMART, AWARE, and SENSORY are new enhancements to business that include customer habits and patterns of use, threaded right into the production routine and product design. The Internet of Things will help sculpt a more consumer-oriented and customer-centric world of products. Customers will have direct influence on the manufacturing of individual products and instances of products. Companies can help by being part of the community, whether in electrical engineering, design, and data, or in software development in the cloud. The Internet of Things will have touch points between customers and business as much as the electrical power grids have influence across all business today.

The new ecosystem will have micro scale and agile manufacturing at a level of customization unimaginable today. It’s the next driver for brilliant machines, maybe artisan-machines that work for individuals but still live on the factory floor.

You can work with the developers and work toward expanding businesses that can embrace the development world.  Help build the $50 cell phone or connected devices that bridge fiscal and energy compliance for a better world.

Ride the long tail wave… and the inverted business models… Make all products more accessible, and be responsible in that accessibility… Through crowdfunding or crowdsourcing, like Kickstarter or Makers, someone is going to figure out how a sensor can do more, in a very impactful and human experience paradigm. The new innovations will come from everywhere: from the 14-year-old in Uganda who takes apart her cellphone to repurpose it into a medical monitoring device, and from the basements and garages of millions of makers and DIY'ers worldwide who surely have genius among them.

It is super important to get the very latest hardware out to the open community so that innovation can be leveraged, taken to new levels of creativity, and crowdsourced for ideation, collaboration, and massive cross-contribution. Accessibility, documentation, development, and an ecosystem of software support for the MCUs are all important. Atmel holds the building blocks for many of these pieces, combined with its development tools and evaluation ecosystem (Atmel Studio 6, Atmel Spaces, Atmel Gallery) and involvement with Makers and Arduino.

Open hardware and open source will come to be de facto standards. Bundle open source along with the open hardware to make it even more accessible, and include quick-start guides for newcomers. Right now a key piece is the wireless sensor network. If there were a good open source WSN available and supported by manufacturers, it could enable a groundswell of connected devices.

Build open source and open hardware educational IoT developer's kits for ages 8 and up, and for high school and college, to hit all levels of involvement and expertise. Support community hackspaces and places (e.g., Noisebridge) where everyone can learn about the digital world and programming.

We are seeing a leveling out of development happening in all parts of the world. Radical innovation is happening everywhere. Open source is helping shape this curve. This is the broader tide that we are seeing. Pinocchio is one great innovation emerging from Makers and open source; then we have IoT hubs such as SmartThings, Thingworx, or Xively (formerly Cosm). There is a lot of crowdfunding, ideation, and blooming of disruptive products looking to change the scene of things to come….

Support open source and open collaboration in everything, to create a culture of sharing and innovation, a culture of synergy in building the Internet of Things together. Involve customers as participants and makers of their own experiences. Make sure everyone has access to the information and support they need to build, maintain, hack, and repurpose their devices over time to promote a healthy ecosystem.

This time innovation is going global. The ideation is happening everywhere. There are many Silicon Valley-type hubs in other metros around the world, as well as global access to the same information. We see the startup mentality blossoming across all geographies. Again, the semiconductor industry is contributing, helping pave the backplane for innovation and connectivity for the development layers on top. The global village of innovation is coming of age… now.

 

Also read Part 1 and Part 2 of the Interview Series.

1:1 Interview with Michael Koster


Three-part Interview Series (Part 2)


Series 2 – IoT Toolkit and Roadmap

Tom Vu (TV):  What is in the roadmap for IoT Toolkit?

Michael Koster (MK):

The IoT Toolkit is an Open Source project to develop a set of tools for building multi-protocol Internet of Things Gateways and Service gateways that enable horizontal co-operation between multiple different protocols and cloud services. The project consists of the Smart Object API, gateway service, and related tools.

IoT Smart Object Structure


The foundation of the platform is purely bottom up, based on applying best practices and standards in modern web architecture to the problem of interoperability of IoT data models. I believe that the practice of rough consensus and running code results in better solutions than a top-down standard, once you know the basic architecture of the system you’re building.

To that end, I created a public github and started building the framework of the data model encapsulations and service layer, and mapped out some resourceful access methods via a REST interface. The idea was to make a small server that could run in a gateway or cloud instance so I could start playing with the code and build some demos.
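The "resourceful access methods via a REST interface" mentioned above can be sketched in a few lines of Python. This is purely an illustrative sketch: the resource paths, field names, and `handle_get` function are invented for the example and are not the actual Smart Object API code.

```python
import json

# Hypothetical resource registry: the paths and fields are invented
# for this example, not taken from the real Smart Object API.
registry = {
    "/sensors/temperature": {
        "objectType": "temperature",
        "units": "Celsius",
        "value": 21.5,
    },
}

def handle_get(path):
    """Resolve a resource path to a (status, JSON body) pair, REST-style."""
    resource = registry.get(path)
    if resource is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(resource)

print(handle_get("/sensors/temperature"))
```

A real gateway or cloud instance would wrap `handle_get` in an HTTP server, but the path-to-representation resolution is the core idea.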

The next step is to start building community consensus around, and participation in, the data models and the platform. The IoT Toolkit is a platform to connect applications and a mixture of devices using various connected protocols. Its real power lies in its broader use, where it can span across all of our connected resources in industry, ranging from commerce to education, transportation, the environment, and us. It’s a horizontal platform intended to drive the Internet of Things more widely as an eventual de facto standard, built for the people who are interested in building out Internet of Things products and services based on broad interoperability.

IoT Sensor Nets Toolkit

IoT Applications Run on Cloud or On Gateway

We intend to create a Request For Comments (RFC) and initiate a formal process for the wider Internet of Things platform and standards: a community-agreed process similar to the one behind the world wide web we use today, based on rough consensus and running code, with RFCs serving as working documents and de facto standards. People can obtain reference code, run it in their systems to test against their needs, and improve or modify it if necessary, feeding changes back into the RFC for community review and possible incorporation.

The Internet of Things interoperability platform stands as an ideal candidate for leveraging the power of the open source community’s development process. In turn, community involvement is taken to a new level, across many fields of discipline and in many directions. Here is where we can get the most benefit from an agile community: crowdsource the development process based on principles of open communication, free of the need for participants to protect interests in proprietary intellectual property.

We need to build the platform together, meshed around the community of Makers, DIYers, Designers, Entrepreneurs, Futurists, Hackers, and Architects, to enable prototyping in an open ecosystem. Proliferation then occurs: developers, designers, architects, and entrepreneurs from diverse backgrounds have many avenues of participation. They can create a new landscape of IoT systems and products.

This broad participation extends to industry, academia, and the public sector. We are aiming for broad participation from these groups to build a global platform based on common needs. As a member of the steering committee, when I participated in the IoT World Forum, I heard from the technical leaders of enterprise companies (Cisco and others), research departments, and IoT service providers. They believe an open horizontal platform is needed to enable applications that span their existing vertical markets and M2M platforms.

Instead of a top-down approach, where people from corporations and institutions get together in a big meeting and put all their wish lists together to make a standard, we’re taking a bottom-up approach, bringing together a diverse community ranging from makers to open source developers and entrepreneurs. Together with corporations, academia, and the public sector, we all will participate in a very broad open source project to develop a ubiquitous platform that everyone can use.

In many ways, this is modeled after the Internet and World Wide Web itself.  As we need to create a more formal standard, it will likely engage with the IETF and W3C. A good example is the semantic sensor network incubator project, which is an SSN ontology that describes everything about sensors and sensing. This enables broad interoperability between different sensor systems and platforms, based on common data models and descriptions. What we want to do is something similar to that, only on a more comprehensive scale and intended for the Internet of Things.

Tom Vu (TV):  Can you take us through a tour of the Data Object model importance and how it yields significance for simple and sophisticated connected devices?

Michael Koster (MK):

The Internet of Things today consists of many different sensor networks and protocols, connected to dedicated cloud services, providing access through smartphone and browser apps. It is rare for these separate “silos” to cooperate or interact with each other.

We abstract the complexity of sensor nets connecting devices and hardware by adding a layer of semantic discovery and linkage. This enables the sensors and actuators on disparate sensor nets to be easily combined to build integrated applications.

This works using a few techniques. First, the different sensor nets are integrated through a common abstraction layer. This works a lot like device drivers in an operating system, adapting different devices and protocols to a common system interface; only in this case, they are adapted to a common data model.
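The device-driver analogy can be sketched as follows; the adapter classes and native payload formats below are invented for illustration, not actual Toolkit code.

```python
# Two pretend sensor-net adapters, each translating its native reading
# into one common data model (names and payloads are hypothetical).
class ZigBeeAdapter:
    """Pretend ZigBee sensor reporting temperature in tenths of a degree."""
    def read_native(self):
        return {"temp_x10": 215}

    def to_common(self):
        return {"type": "temperature",
                "value": self.read_native()["temp_x10"] / 10.0}

class BluetoothAdapter:
    """Pretend BLE sensor reporting a (value, units) tuple."""
    def read_native(self):
        return (21.5, "C")

    def to_common(self):
        value, _units = self.read_native()
        return {"type": "temperature", "value": value}

# Applications see one uniform model regardless of the underlying protocol.
for adapter in (ZigBeeAdapter(), BluetoothAdapter()):
    print(adapter.to_common())
```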

The common data model for sensor nets is based on the new IETF CoRE application protocol and sensor descriptions. This provides standard ways for common types of sensors to be discovered by their attributes, and standard ways for the data to be linked into applications, by providing descriptions of the JSON or BSON data structure the sensor provides as its output.
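Discovery by attribute, in the spirit of CoRE-style resource-type filtering, might look like this sketch; the link entries and type values are made up for the example.

```python
# Made-up link entries; "rt" stands in for a resource-type attribute
# in the spirit of CoRE-style discovery.
links = [
    {"href": "/s/temp", "rt": "temperature"},
    {"href": "/s/hum", "rt": "humidity"},
]

def discover(resource_type):
    """Return the paths of all resources matching the given type attribute."""
    return [link["href"] for link in links if link["rt"] == resource_type]

print(discover("temperature"))
```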

We use the W3C Linked Data standard to provide web representations of data models for sensor data and other IoT data streams. Linked data representations of IETF CoRE sensor descriptions are web-facing equivalents of CoRE sensor net resources. Linked data provides capabilities beyond what CoRE provides, so we can add functions like graph-based access control, database-like queries, and big data analysis.
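A rough idea of what a web-facing linked-data description of a sensor might look like, loosely following JSON-LD conventions; the vocabulary prefix and URL below are placeholders, not a real ontology.

```python
import json

# Placeholder vocabulary; a real description would reference a
# published ontology rather than example.org.
sensor_description = {
    "@context": {"ssn": "http://example.org/ssn#"},
    "@id": "/sensors/temperature",
    "ssn:observes": "temperature",
    "ssn:hasUnit": "Celsius",
}
print(json.dumps(sensor_description, indent=2))
```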

Internet Smart Objects


Internet of Things applications are essentially graph-structured applications. By using linked data descriptions of JSON structures, and the meaning of the data behind the representation, we can create applications that link together data from disparate sources into single application graphs.

Then we enable the platform with an event-action programming model and distributed software components. The common semantic language enables the data sources and software components to easily be assembled and make data flow connections. The result is an event-driven architecture of self-describing granular scale software objects. The objects represent sensors, actuators, software components, and user interaction endpoints.
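The event-action model described above can be sketched as a tiny publish/subscribe loop; the event names and the fan rule are hypothetical, not the Toolkit's actual object model.

```python
# Minimal event-action sketch: software objects subscribe to named
# events and react when a sensor publishes an observation.
handlers = {}
log = []

def subscribe(event, handler):
    handlers.setdefault(event, []).append(handler)

def publish(event, payload):
    for handler in handlers.get(event, []):
        handler(payload)

# An "actuator" object reacts to a temperature observation
# (the 25-degree threshold is arbitrary, for illustration).
subscribe("temperature", lambda v: log.append("fan on" if v > 25 else "fan off"))
publish("temperature", 27.0)
print(log)
```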

FOAT Control Graph

Internet of Things with FOAT Control Graph


Tom Vu (TV):  Who and what companies should be involved?

Michael Koster (MK):

Whoever wants to participate in building out the Internet of Things. The people who use the infrastructure should build it out: the people who want to provide products and services based on interoperability, along with those who provide the backplane of things, low-power microcontrollers and microprocessors, connected sensors, and, importantly, the network infrastructure.

We want to enable all avenues of participation to allow corporations, academia, policy and standards makers, entrepreneurs and platform developers, makers, and DIY hackers all to be involved in building the platform as a community.

For corporations, we will play an important role: building a vendor-neutral platform for data sharing and exchange, an open horizontal platform that will allow the integration of what were traditionally vertical markets into new horizontal markets.

Anyone participating or expecting to participate in the emerging Internet of Things, Internet of Everything, Industrial Internet, Connected World, or similar IoT ecosystems initiatives, could benefit by participating in creating this platform. Companies that provide network infrastructure and want to build in value add can adopt this standard platform and provide it as infrastructure. Companies that want to provide new services and new connected devices that can use the IoT Toolkit to easily deploy and connect with existing resources could benefit.

All companies, organizations, and people that can benefit from an open Internet of Things are welcome to participate in the creation of a platform that everyone can use.

Tom Vu (TV):  How important is Open Source to Internet of Things evolution?

Michael Koster (MK):

I don’t see how the Internet of Things can evolve into what everyone expects it to without a large open source component. We need to go back to Conway’s law and look at it from both the system we’re trying to create and the organization that creates it. Interoperability and sharing are key in the system we want to create. It’s only natural that we create an open development organization where we all participate in both the decisions and the work.

Removing the attachment to intellectual property changes the dynamics of the development team, keeping things engaged and moving forward solving problems. It’s important for software infrastructure projects like this to remove the barrier to cooperation that arises from the self-protection instinct around proprietary intellectual property, or even the egoism associated with soft intellectual property, “my” code.

Instead, we turn the whole project into a merit-based system as opposed to an ego-driven one. Rather than worry about guarding our property, we are motivated to solve the problems and contribute more to the deliverable. The limits to participation are removed and there is a more rapid exposure of intentions and goals. Engagement and innovation can rule in this environment of deep collaboration.

Tim Berners-Lee said that he was able to achieve the creation of the World Wide Web system because he didn’t have to ask permission or worry about violating someone’s copyright. We are creating the same environment for people who want to build our platform, and even for those who want to build their services and applications on top of the platform.

We are going to create the service enablement layer as open source as well, so that any of the companies can help proliferate the idea and everyone has influence on and access to the development of the underlying IoT platform. If it’s open source infrastructure and platform software, you can build a service on top of it that contains proprietary code. With our license, you can even customize and extend the platform for your own needs as a separate project.

Tom Vu (TV):  Describe your work with the EU IoT organization and how you are involved as a voice for the Internet of Things?

Michael Koster (MK):

I work with the IoT Architecture group within the overall EU Internet of Things project. The IoT-A group is closely related to the Future Internet project. They have an Architecture Reference Model describing different features one might build in an IoT platform, a sort of Architecture for Architectures. Since their process mirrors my own design process to a large extent, I found their reference model to be compatible with my own architecture modeling process.

They are conducting a top-down activity, stewarding participation in the architecture and standardization model. One of the ways I work with IoT-A is to use the Smart Object API as a validation case for the Architecture Reference Model. They are building the reference model top-down, and we’re building the architecture bottom-up, based on a common expression of architecture relationships and descriptions.

I am also involved in advocating open source for IoT and the building of local IoT demonstrator projects, education around IoT, open data, and user-controlled resource access and privacy. I am providing a voice for open source and open standards in the standards movement going forward.

Here in the USA, there is nothing like what they have in Europe. Here the process will be to engage corporations and institutions and create a participatory structure that enables fair and open opportunity for influence on and access to both the development process and the final products.

Tom Vu (TV):  How important is an open standard – an RFC that all industries can agree upon – in ultimately driving wider adoption and proliferation?

Michael Koster (MK):

To put it simply, a formal RFC is a document that describes part of a system. A Request for Comments (RFC) is a memorandum published by the Internet Engineering Task Force (IETF) describing methods, behaviors, research, or innovations applicable to the working of the Internet and Internet-connected systems. It is a step in the evolution toward a more widely adopted standard. The founders of the Internet created this process, and HTTP and other core protocols were all built using the original RFC process from many years ago.

Through the Internet Engineering Task Force, engineers and computer scientists may publish discourse in the form of an RFC, either for peer review or simply to convey new concepts, information, or (occasionally) engineering humor. The IETF adopts some of the proposals published as RFCs as Internet standards.

If the IoT Toolkit platform becomes adopted, it may eventually span as many as 10-12 different RFCs, but it’s important to get people to agree on a common first set. This is the initial phase toward a more pervasively used universal standard. In fact, it’s sort of like a strawman platform: its intent is to describe and collaborate, but also to invoke and seek out broader participation. We are at the stage of putting proposals together over the next few weeks and setting up meetings to talk to many people about collaboration and participation in building an Internet of Things platform.

We believe that an open standard platform for horizontal interoperability is key to achieving the promise of the Internet of Things. Everyone needs to be able to present and process machine information in machine understandable formats on the IoT, just as we humans enjoy commonly understandable web data formats and standardized browsers on today’s WWW. It’s important that developers be able to focus on solving problems for their clients and not waste resources on communication and translation.

Read Part Three to learn more about why the Internet of Things matters.

Here are Part 1 and Part 2 of the Interview Series.

Imagining the Future — DIY Style

By Eric Weddington

It’s the beginning of February already. The New Year has started with a bang, with barely enough time to reflect on the past year. However, there have been some exciting things in 2012 that I can’t wait to see continue on in 2013…

Engineers can be a funny group. On one hand they’re the makers of a wide range of technology. But because engineers are, in general, interested in getting the details right, sometimes they can get caught up in the details, with a focus on what should be the “right” way of doing something. One of the privileges of being involved in the open source community has been attending the Maker Faires, put on by Make: magazine, in the Bay Area in May, and in New York in September. The Arduino microcontroller board is a big part of  these Maker Faires, powering all sorts of projects. It’s become popular because it enables people who are not engineers to get involved in making stuff with electronics, allowing them to add smarts to all sorts of things.

What I’ve discovered is that it doesn’t magically turn these people into engineers. They see the Arduino as a tool that they can use to turn their ideas into reality. They don’t get caught up in the details of what is the “right” way, or the “wrong” way, to implement a solution according to their engineering training. They keep their eyes firmly on their goal. They’re too busy creating! During the last year, I have been amazed at all the cool, weird, wonderful ideas that have been thought up and implemented by many in this Maker community. I wouldn’t have thought up half the stuff that I have seen done with an Arduino and our AVR processors. A DIY X-ray CT scanner controlled by an Arduino. FireHero, which has an Arduino controlled propane “puffer” interfaced to a GuitarHero controller. A winner of the California Science Fair used an Arduino to measure foot pressure for diabetics. All manner of quadcopters and UAVs. Desktop 3D printers. Clothing design. And the list goes on. It’s exhilarating to see what’s been done and to think about what people will imagine next! Yes, it’s going to be a fun 2013!

Open Sauce

By Steve Castellotti

CTO, Puzzlebox

North Beach, San Francisco’s Italian neighborhood, is famous for the quality and wide variety of its many restaurants. From colorful marquees scattered up and down Columbus to the hushed, more dimly lit grottos hidden down side streets and back alleys, there is no lack of choice for the curious patron.

Imagine then, having chosen from all these options, you sit down and order your favorite dish. When the plate arrives the waiter places next to it a finely embossed card printed on thick stock. A closer examination reveals the complete recipe for your meal, including hand-written notations made by the chef. Tips for preparation and the rationale for selecting certain ingredients over others are cheerfully included.

Flipping the card over reveals a simple message:

“Thank you for dining with us this evening. Please accept this recipe with our regards. You may use it when cooking for friends and family, or just to practice your own culinary skills. You may even open your own restaurant and offer this very same dish. We only ask that you include this card with each meal served, and include any changes or improvements you make.”

Sharing the “Secret” Sauce

Having been raised in an Italian family myself, I can assure you that there is no more closely guarded secret than the recipe for our pasta gravy (the sauce). But I can’t help but wonder how such an open sharing might affect the landscape of a place such as North Beach. If every chef was obliged to share their techniques and methods, surely each would learn from the other? Customers would benefit from this atmosphere of collaboration in terms of the taste and quality of their dinners.

These many restaurants, packed so tightly together as they are, would still be forced to compete on terms of the dining experience. The service of their wait-staff, the ambience, and cost would count for everything.

For the majority of customers, knowledge of the recipe would simply be a novelty. In most cases they would still seek a professional chef to prepare it for them. But to the aspiring amateur, this information would contribute to their education. A new dish could be added to their repertoire.

An experienced restaurateur could no doubt correct me on any number of points as to why such a scenario would be a poor business model and never could or should be attempted. But just across town, throughout Silicon Valley and indeed across the globe, in the realm of technology, this exact model has been thriving for decades.

Open Source in the Software World

In the software world, developers have been sharing their source code (the recipe for the programs they write) under licenses similar to the one outlined above on a grand scale and to great success. The Internet itself was largely constructed using open platforms and tools. Mobile phones running Google’s Android operating system are now the most popular in the world, with complete source material available online. And in 2012 Red Hat became the first open source company to achieve a billion dollars in revenue, with customers from IBM to Disney and Pixar among their roster.

The benefits are many. Developers can leverage each others’ work for knowledge and time saving. If you want to build a new web site, there’s no need to write the web server or common routines such as user management from scratch. You can take open versions and start from there. Even better, if you have questions or run into trouble, more likely than not someone else has, too, and the answer is only a search away. Most importantly, if the problem you found indicates a flaw in the software (a bug), then a capable coder is empowered to examine the source and fix it himself or herself. And the result can be shared with the entire community.

There are parallels here to several fields. Similar principles form the basis of the scientific method. Without the sharing of procedures and data, independent verification of results would be impossible. And many discoveries result from iterating on proven techniques. A burgeoning do-it-yourself community, a veritable Maker Movement, has grown around magazines like Make and websites such as Instructables.com. New inventions and modifications to popular products are often documented in meticulous detail, permitting even casual hardware hackers to follow along. Electronics kits and prototyping boards from companies like Arduino are based on Atmel microcontrollers plus open circuit designs, and are often used to power such projects.

Puzzlebox Brain Controlled Helicopter in Flight

Brain-Controlled Helicopter

Recently, our company, Puzzlebox, released the Orbit, a brain-controlled helicopter. The user begins by setting a display panel to the desired level of concentration and/or mental relaxation they wish to achieve.  A mobile device or our custom Pyramid peripheral processes data collected by a NeuroSky EEG headset. When that target is detected in the user’s brainwaves, flight commands are issued to the Orbit using infrared light. One can practice maintaining focus or a clarity of thought using visual and physical feedback.
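The control idea described above reduces to a threshold test on the headset's concentration measure; the scale, target value, and command names in this sketch are illustrative, not Puzzlebox's actual code.

```python
# User-selected concentration target on a hypothetical 0-100 scale.
TARGET_ATTENTION = 70

def flight_command(attention):
    """Map an attention sample to a flight command (names are invented)."""
    return "throttle_up" if attention >= TARGET_ATTENTION else "hover"

# Pretend EEG attention samples arriving from the headset.
readings = [40, 65, 72, 80]
print([flight_command(r) for r in readings])
```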

Puzzlebox Brain-Controlled Helicopter with Atmel AVR


Beyond novelty, however, lies the true purpose of the Puzzlebox Orbit. All source code, hardware designs, schematics, and 3D models are published freely online. Step-by-step guides for hacking the software and electronics are included. Methods for decoding infrared signals and extending mechanisms to operate additional toys and devices are shared. Creative modification is encouraged. The goal is to promote the product as a teaching aid for middle and high school science classes and in university-level programming and electrical engineering courses.

Puzzlebox forging Classroom and Early Adoption of Technology for Education

This business model is itself a bit of an experiment, much like the restaurant described above. There is little preventing a competitor from producing a knock-off and leveraging our own recipes to do it. They might even open their doors just across the street from ours. We’ll need to work hard to keep our customers coming back for seconds. But so long as everyone abides by the rules, openly publishing any modifications or improvements made on our recipe, we’re not afraid to share the secrets of our sauce. We only ask that they include the original material with each dish they serve, and include any changes or improvements made along the way. We’re willing to compete on cost and dining experience. In this way we hope to improve the quality and flavor for everyone.

Puzzlebox with Arduino and Atmel AVR


Puzzlebox Software IDE Interface

Openness and The Internet of Things

Today, communities such as Kickstarter and others tapping into the power of openness and crowd-sourcing are fueling a lot of technological innovation.  The next era for enterprise is revolving around The Internet of Things (#IoT), machine-to-machine (#M2M) communications and even the Industrial Internet (#IndustrialInternet).

One strong proponent of innovation and thought, Chris Anderson, is renowned for having his fingerprints and vision on trends as they bloom into movements. Anderson is committed and energized in this Make-infused world. His latest book, “Makers: The New Industrial Revolution,” eloquently outlines the “right now” moment with makers: “hardware is the new software,” opening the next age of the Internet, where devices and machines become connected. Cloud, agile apps, and embedded hardware design (systems on chips, microcontrollers, and smart devices) are converging, paving the way for the next generation of integrated products across the fabric of devices.

“The real revolution here is not in the creation of the technology, but the democratization of the technology. It’s when you basically give it to a huge expanded group of people who come up with new applications, and you harness the ideas and the creativity and the energy of everybody. That’s what really makes a revolution.

…What we’re seeing here with the third industrial revolution is the combination of the two [technology and manufacturing]. It’s the computer meets manufacturing, and it’s at everybody’s desktop.”

Excerpt from Chris Anderson’s “Makers: The New Industrial Revolution”

With that said, we enter the next age, where hardware is the new software.

Arduino-Based Personal Satellites Could Launch This Fall

The Arduino platform has become a common component in robotics and an array of do-it-yourself (DIY) tech gadgets. Now, Arduino boards, based on Atmel megaAVR 8-bit and ARM processor-based microcontrollers, are poised to power personal satellites that could be launched into space as early as this fall.

One of the driving forces behind these cracker-sized satellites, dubbed “Sprites,” is Zac Manchester, who recently talked to the San Francisco Chronicle about his Kickstarter-funded project. Working from NASA’s Ames Research Center, Manchester and his team are aiming to get 250 of the personal satellites into space via a container placed inside SpaceX’s Falcon 9 rocket, which resupplies the International Space Station.

More on the Sprite project here. What would you do with your own personal satellite?

Monitor Hurricanes with Arduino Platform

When Mother Nature roars in the form of hurricanes, like the recent Hurricane Sandy on the U.S. East Coast, social media sites like Twitter demonstrate how critical it is to be able to easily share information. With the Arduino platform, you can create a DIY auto-Tweeting weather station. See photos and learn how here.