
IoT - 1:1 Interview Rob van Kranenburg

1:1 interview with Rob van Kranenburg (Part 1)

1:1 Interview conducted by Atmel’s Tom Vu with Rob van Kranenburg, IoT-A Stakeholder Coordinator, Founder of Council, and Adviser to Open Source Internet of Things, osiot.org.

TV: Why IoT-A? There are a multitude of IoT consortiums important to forging the progress of this next era of connective technology. Why is it important to the general business and mainstream? Why so many consortiums? Will it eventually roll up to one?

RvK: In systemic shifts the next normal is at stake. Of course you have to believe that IoT is a systemic shift first. Paradoxically, it is precisely the fact that we see so many contenders and consortia – no one wants to miss out or be left behind – that IoT is moving from being a vision to a business proposition. The success of the device as a standard – the Steve Jobs approach to controlling hardware, software, connectivity, app store; what goes in and what goes out and who it is friends with – has been an eye opener.

Patrick Moorhead writes in his Forbes piece that “the stunning success of smartphones, followed by similar success for tablets, has pushed the standardization opportunities for next generation infrastructure into play for the top tier of visionary companies”1, listing among others IBM Smarter Planet, Cisco’s Internet Business Solutions Group, Google, IPSO Alliance, ARM, International M2M Council, IoT-A (Internet-of-Things Architecture), and Intel’s Intelligent Systems Framework (ISF). Software as a service could only come into existence with the Cloud: “In the 90s, storage disks of less than 30GB capacity were incredibly expensive. Today, thanks to innovations in silicon technology, we are able to get high capacity storage disks at a nominal cost.”2 In the early 2000s we see the first experiments with real-time feedback.

In an earlier post you mention Formula 1. In 2002 Wired published a piece on sailing and the America’s Cup: “We’re trying to find patterns, to see that one set of conditions tends to result in something else. We don’t know why, and we don’t need to, because the answer is in the data.” This is a programmer talking, a programmer and a sailor: Katori is writing a program that crunches the measurements and creates a “wind profile number,” an implied wind – a wind an implied boat can sail on – as sailing, so long an intuitive art, has become a contest of technology: “Sensors and strain gauges are tracking 200 different parameters every second and sending the information across Craig McCaw’s OneWorld LAN to its chase boats and offices. Then the info gets dumped into a Microsoft SQL database, where it’s sifted to pinpoint the effects of sail and hardware experiments. Unraveling all the input is, in the words of OneWorld engineer Richard Karn, ‘nearly impossible.’ And that’s not all: every day for the past two years, five OneWorld weather boats have headed out into the Gulf to harvest data.”3

I remember how struck I was by that notion of an “implied wind.” Before that notion there was the “real” and the “digital,” two concrete and separate worlds. You could argue that prior to that there was the “real” and the “surreal” or spiritual world. Large groups of people historically have been animists. To them objects do have stories, hold memories, are “actors.” Things are alive in that vision. Introducing this notion of implied, it became clear that it was no longer about the relation between the object and the database, materialized in a “tag,” but that the relation itself was becoming an actor, a player in a world where you did not know why, and you could not care less why or why not – you wanted to gather data. There is “something” in it.

Grasping this key paradigm shift, it becomes clear that the stakes are very high. In 2001, Steve Halliday, then vice president of technology at AIM, a trade association for manufacturers of tagging (RFID) technology, interviewed by Charlie Schmidt, claimed: “If I talk to companies and ask them if they want to replace the bar code with these tags, the answer can’t be anything but yes. It’s like giving them the opportunity to rule the world.”4 Since then the most publicized attempt to create one single architecture, an Object Name Service, is the story of the RFID standard called “EPC Global” – the two standards bodies EAN and UCC merging to become GS1 in 2005. In a bold move that no regulator foresaw, they scaled their unit of data from the batch of 10,000 – and thus uninteresting for individual consumers – to the uniquely identifiable item.

TV: Gartner suggests IoT is the #4 business creation factor for the next 5 years. What are your thoughts? Is this true?

Gartner-Hype-Cycle-IoT

Credit: Image obtained from Gartner’s 2012 Hype Cycle for Emerging Technologies Identifies “Tipping Point” Technologies, Unlocking Long-Awaited Technology Scenarios

*****

RvK: Depending on how you define IoT, I would say definitely. Internet of Things influences changes in production (smart manufacturing, mass customization), consumption (economy of sharing, leasing vs ownership), energy (monitoring grids, households and devices), mobility (connected cars), decision making processes (shift to grassroots and local as data, information and project management tools come in the hands of ‘masses’), finance (IoT can sustain more currencies: Bitcoin, bartering, and again ‘leasing’) and creates the potential for convergence of the above shifts into a new kind of state and democratic model based on the notion of “platform.”

It is more an operation on the scale of: before and after the wheel, before and after printing/the book. In a kind of philosophical way you could say that it is the coming alive of the environment as an actor; it touches every human operation. The browser is only 20 years old – Mosaic being the first, in 1993. The web has dramatically changed every segmented action in every sequence of operations that makes up project management in any form of production and consumption. Because of this, some people in the EU and elsewhere are trying to rename IoT to something like Digital Transition. The Singularity is another way of looking at it. As a concept it is Borgian in the sense that the next big trends – nanoelectronics and (DIY) biology – are not off in some emergent future realm: they could reach market exponentially faster as they are drawn into the connectivity that IoT is bringing.

Interested in reading more? Tune in to Part 2 of Atmel’s 1:1 interview with Rob van Kranenburg. View Part 2 and Part 3.

*****

1 http://www.forbes.com/sites/patrickmoorhead/2013/06/27/how-to-intelligently-build-an-internet-of-things-iot/?goback=%2Egde_73311_member_253757229

2 http://www.ramco.com/blog/5-cost-effective-ways-to-store-data-on-the-cloud

3 Carl Hoffman, “Billionaire Boys Cup.” High tech hits the high seas in a windblown battle between Craig McCaw and Larry Ellison. Carl Hoffman sets sail with Team OneWorld in the race to take back the America’s Cup. http://www.wired.com/wired/archive/10.10/sailing_pr.html

4 “Beyond the Bar Code.” High-tech tags will let manufacturers track products from warehouse to home to recycling bin. But what’s great for logistics could become a privacy nightmare. By Charlie Schmidt, March 2001. http://www.technologyreview.com/featuredstory/400913/beyond-the-bar-code/

An introduction to Kevin Ashton’s recent IoT keynote

Recently, a number of industry heavyweights have taken a keen interest in the Internet of Things (IoT). Essentially, the IoT involves various nodes collectively generating a tremendous amount of data, with a strong emphasis on the “things” being connected. On a small scale, a Formula 1 constructor such as McLaren uses a cluster of sensor nodes to transmit vital telemetry from the car to the pit crew and garage, then to race engineers and ultimately back to R&D centers. During a race this all happens in real time: machine logs and digital data converge to inform decisive, minor setting adjustments, balancing the physics of engine and car to win back fractions of a second. Those fractions equate to race wins and competitiveness on the circuit. The customers in this scenario are the driver and the engineering team. In miniature, this is the world of the Industrial Internet and the Internet of Things.

Now let’s imagine this same scenario, albeit on a global scale. Data gathered at crucial “pressure points” can be used to optimize various processes for a wide variety of applications, scaling all the way from consumer devices to manufacturing lines. To be sure, an engine or a critical component – say, a glow plug in a high-efficiency diesel – is capable of transmitting information in real-time to dealerships and manufacturers, generating added value and increasing consumer confidence in a brand.

Sounds like such a scenario is years away? Not really – this is already happening at GE and other large Fortune 500s. Then again, there are still many frontiers left to innovate. As in aviation, it’s more about building smarter planes than aspiring to a revolution in airframe design: planes capable of transmitting data and implementing actions in real time thanks to evolved processes, automation and micro-computing.

Likewise, applications combined with embedded designs also yield improved output. Given the multitude of mixed and digital signals, efficiency and computing quality also play vital roles in the larger system. The GE jet engine featured in one particular plane can process 5,000 data samples per second. From larger systems down to the micro embedded board level, it all plays like a symphony, akin to the precision of an opera. To carry the analogy further, the main cast are the architects and product extraordinaires who combine intelligent machine data, application logic, cloud services and smartly embedded designs to achieve the effect of an autonomic nervous system.

Remember, there are dependencies across the stack and layers of technology, even down to the byte level. This helps planes arrive at their destination with less fuel – and keeps them soaring through the sky, taking you wherever you want to go. Ultimately, a system like this can save millions, especially when you take into account an entire fleet of aircraft. It is truly about leveraging intelligent business – connectivity woven into a fabric of communication across embedded systems. Clearly, the marriage of machine data and operational use-cases is drawing closer to realization.

“When you’ve got that much data, it had better be good. And reducing the CPU cycles cuts energy use, especially important in applications that use energy harvesting or are battery powered. And that is why Atmel offers a wide range of products mapping to more than the usual embedded design ‘digital palette’ of IoT building blocks. The market needs illustrations and further collaboration; diagrams that show what plays where in the IoT and who covers what layers,” says Brian Hammill, Sr. Atmel Staff Field Applications Engineer.

“Something like the OSI model showing that we the chip vendors live and cover the low level physical layer and some cover additional layers of the end nodes with software stacks. Then, at some point, there is the cloud layer above the application layer in the embedded devices where data gets picked up and made available for backend processing. And above that, you have pieces that analyze, correlate, store, and visualize data and groups of data. Showing exactly where various players (Atmel, ARM mbed (Sensinode), Open Platform for IoT, Ayla Networks, Thingsquare, Zigbee, and other entities and technology) exist and what parts of the overall IoT they cover and make up.”

Atmel offers a product line with high-end analog-to-digital converter features. For example, in Atmel’s SAM D20, an ARM Cortex-M0+ based MCU, a hardware averaging feature facilitates oversampling, trading sample rate for higher resolution – a trade-off congruent with many real-world sensor requirements. Doing the averaging in hardware keeps cost lean by avoiding extra software overhead, a benefit that matters to engineers, designers, architects, and manufacturing managers weighing the bill of materials behind millions of components in global embedded systems, and ultimately to business lines chasing the fullest ROI. Expanding the design/experience envelope, Atmel microcontrollers are optimized for power consumption. Brian Hammill concurs, “Atmel offers several MCU families with performance under 150 microamperes/MHz (the SAM4L is under 90 uA/MHz, with very low sleep current and flexible power modes that allow operation with good optimization between power consumption, wakeup sources, wakeup time, and maintaining processor resources and memory).”
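The arithmetic behind that hardware averaging is easy to sketch. Assuming the common oversample-and-decimate scheme, where each extra bit of resolution costs 4x the samples, a minimal Python model of what an averaging block computes might look like this (illustrative only, not Atmel code):

```python
def oversample(samples, extra_bits):
    """Combine 4**extra_bits raw ADC samples into one higher-resolution result.

    Each extra bit of resolution requires 4x the samples: sum them all,
    then right-shift by extra_bits to decimate. (A plain average would
    throw away the resolution just gained.)
    """
    needed = 4 ** extra_bits
    if len(samples) != needed:
        raise ValueError(f"need {needed} samples for {extra_bits} extra bits")
    return sum(samples) >> extra_bits

# 16 twelve-bit samples (midscale ~2048) -> one 14-bit result (midscale 8192)
raw = [2048, 2049, 2047, 2048] * 4
print(oversample(raw, 2))   # -> 8192
```

The point of doing this in silicon, as the SAM D20 does, is that the sum-and-shift happens without any CPU cycles or software overhead.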

Geographically, there seems to be a very strong healthcare pull for IoT in Norway, the Netherlands, Germany and Sweden, and this follows into Finland, and into parts of Asia as well, as described in Rob van Kranenburg’s travels of IoT in Shanghai and Wuxi. Therein lie regional differences mixed with governance and political support. It is also very apparent that Europe and Asia place an important emphasis on IoT initiatives.

Elsewhere, this is going to happen from the bottom up (groups akin to Apache and Eclipse for the early web, open source, and IDEs – and now IoT-A and the IoT Forum) in conjunction with the top down (Fortune 500s) across the span of industry. But first, collaboration must occur to work out the details of architecture, data science and scalability. This is contingent on both legacy systems and modern applications synchronizing and standardizing in the frameworks conceived by open, organizing bodies (meant to unify and standardize) such as IoT-A and IoT-I. Indeed, events like IoT Week in Helsinki bring together thought leaders, technologists and organizations – all working to unify and promote IoT architecture, IP and cognitive technologies, as well as semantic interoperability.

In the spirit of what is being achieved by various bodies collaborating in Helsinki, Brian Hammill asserts: “The goal of a semiconductor company used to be to provide silicon. Today it is more than that: we need development tools as well as software stacks. The future means we also need to provide the middleware, or some form of protocol interoperability, to bridge what sits between the embedded devices and the customers’ applications. I think an IoT Toolkit achieves that in its design. Atmel also offers 802.15.4 radios, especially the differentiated Sub-GHz AT86RF212B versus other solutions that have shorter range and consume more power.

We also must provide end application tools for demonstration and testing, which can then serve as starter applications for customers to build upon.”

There will be large enterprise software managing data in the IoT. Vendors such as SAS are providing applications at the top end to manage and present data in useful ways, especially when it comes to national healthcare. Then there are companies that already know how to deal with big data, like Google, and major metering corporations such as Elster, Itron, Landis+Gyr and Trilliant. Back in the day, meter data management (MDM) was the closest thing to big data because nobody had thought about or cared to network so many devices.

We tend to think of IoT as a stereotype of sorts – forcing an internet-based interaction onto objects. However, it is really about configuring the web to add functionality for “things,” all while fundamentally protecting privacy and security for a wide range of objects and devices, helping us shift to the new Internet era. Currently, there are a number of organizations and standards bodies working to build out official standards (IETF) that can be ratified and put into engineering compliance motion. Really, it’s all starting to come together, as illustrated by the recent IoT Week in Helsinki. Here is IoT’s very own original champion, a leader who has been working toward promoting the Internet of Things (IoT) for 15 years: Kevin Ashton’s opening talk for the Internet of Things Week in Helsinki (video).

iot-week-partners

Remarks at the opening of Third Internet of Things Week, Helsinki, June 17, 2013:

Thank you, and thank you for asking me to speak at the Third Internet of Things Week. I am sorry I can’t be with you in Helsinki. This is a vibrant and growing community of stakeholders. I am proud to have been a part of it for about 15 years now.

One of the most important things that is going to happen this week is the work on IoT-A. It is really important to have a reference model architecture for the Internet of Things. And one of the reasons is that for most of those 15 years, we’ve been talking about the Internet of Things as something in the future, and, thanks to amazing work by this community — I would particularly like to recognize Rob van Kranenburg and Gérald Santucci and the work of the European Union, which has been amazing for many, many years now — the Internet of Things is not the future anymore. The Internet of Things is the present. It is here, now.

I was with an RFID company a month ago who told me that they had sold 2 billion RFID tags last year and were expecting to sell 3 billion RFID tags this year.
rfid-tags

So, just in 2 years, this one company has sold almost as many RFID tags as there are people on the planet. And, of course, RFID is just one tiny part of the Internet of Things, which includes many sensors, many actuators, 3-D printing, and some amazing work in mobile computing and mobile sensing platforms: from modern automobiles, which are really now sensors on wheels, and will become more so as we move into an age of driverless cars, to the amazing mobile devices we all have in our pockets, that I know some of you are looking at right now. Then there are sensor platforms in the air. There is some really amazing work being done in the civilian sector with drones, or “unmanned aerial vehicles,” that are not weapons of war or tools of government surveillance but are sensor platforms for other things.

And all this amazing technology, which is being brought to life right now, is connected together by the Internet, and we can only imagine what is coming next. But one thing I know for sure is, now that the Internet of Things is the present and not the future, we have a whole new set of problems to solve. And they’re big problems. And they’re to do with architecture, and scalability, and data science. How do we make sure that all the information flowing from these sensors to these control systems is synchronized and harmonized, and can be synthesized in a way that brings meaning to data? It is great that the Internet of Things is here. But we have to recognize we have a lot more work to do.

It is not just important to do the work. It is important to understand why the work is important. The Internet of Things is a world changing technology like no other. We need it now more than ever. There are immeasurable economic benefits and the world needs economic benefits right now. But there is another piece that we mustn’t lose sight of. We depend on things. We can’t eat data. We can’t put data in our cars to make them go. Data will not keep us warm.

And there are more people needing more things than ever before. So unless we bring the power of our information technology — which, today, is mainly based around entertainment, and personal communication, and photographs, and emails — unless we bring the power of our information technology to the world of things, we won’t have enough things to go around.

The human race is going to continue to grow. The quality of our lives is going to continue to grow. The length of our lives is going to continue to grow. And so the task for this new generation of technology and this new generation of technologists is to bring tools to bear on the problems of scaling the human race. It is really that simple. Every generation has a challenge, and this is ours. If we do not succeed, people are going to be hungry, people are going to be sick, people are going to be cold, people are going to be thirsty, and the problems that we suffer from will be more than economic.

I have no doubt that we have to build this network and no doubt [it] is going to help us solve the problems of future generations by doing a much more effective job of how we manage the stuff that we depend on for survival. So, I hope everyone has a great week. It is really important work. I am delighted to be a small part of it. I am delighted that you all are in Helsinki right now. May you meet new people, make new friends, build great new technology. Have a great week.

 

1:1 Interview with Michael Koster


Three-part Interview Series (Part 2)


Series 2 – IoT Toolkit and Roadmap

Tom Vu (TV):  What is in the roadmap for IoT Toolkit?

Michael Koster (MK):

The IoT Toolkit is an Open Source project to develop a set of tools for building multi-protocol Internet of Things Gateways and Service gateways that enable horizontal co-operation between multiple different protocols and cloud services. The project consists of the Smart Object API, gateway service, and related tools.

IoT Smart Object Structure

IoT Smart Object Structure

The foundation of the platform is purely bottom up, based on applying best practices and standards in modern web architecture to the problem of interoperability of IoT data models. I believe that the practice of rough consensus and running code results in better solutions than a top-down standard, once you know the basic architecture of the system you’re building.

To that end, I created a public GitHub repository and started building the framework of the data model encapsulations and service layer, and mapped out some resourceful access methods via a REST interface. The idea was to make a small server that could run in a gateway or cloud instance so I could start playing with the code and build some demos.

The next step is to start building a community consensus around, and participation in, the data models and the platform. The IoT Toolkit is a platform to connect applications with a mixture of devices using various protocols. Its real power lies in its broader use, where it can span all of our connected resources in industry – commerce, education, transportation, the environment, and ourselves. It’s a horizontal platform intended to drive the Internet of Things more widely as an eventual de facto standard, built for the people who are interested in building Internet of Things products and services based on broad interoperability.
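As a sketch of the gateway idea – the class names and paths below are hypothetical illustrations, not the actual Smart Object API – the resource tree behind a REST interface can be modeled in a few lines of Python:

```python
class SmartObject:
    """A node in the gateway's resource tree: a value, some metadata,
    and any number of named child resources."""
    def __init__(self, value=None, **meta):
        self.value = value
        self.meta = meta          # e.g. rt="temperature", units="C"
        self.children = {}

    def add(self, name, obj):
        self.children[name] = obj
        return obj

class ObjectServer:
    """Resolves REST-style paths like /sensors/temp against the tree,
    the way a small gateway or cloud server would route requests."""
    def __init__(self):
        self.root = SmartObject()

    def get(self, path):
        node = self.root
        for part in path.strip("/").split("/"):
            node = node.children[part]
        return node

server = ObjectServer()
sensors = server.root.add("sensors", SmartObject())
sensors.add("temp", SmartObject(value=21.5, rt="temperature", units="C"))

print(server.get("/sensors/temp").value)   # -> 21.5
```

A real gateway would expose `get` behind HTTP or CoAP handlers, but the core idea – one addressable tree of self-describing objects – is this small.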

IoT Sensor Nets Toolkit

IoT Applications Run on Cloud or On Gateway

We intend to create a Request For Comments (RFC) and initiate a formal process for the wider Internet of Things platform and standards – a community-agreed process similar to the one behind the World Wide Web we use today, based on rough consensus and running code, with RFCs serving as working documents and de facto standards. People can obtain reference code, run it in their systems to test against their needs, and improve or modify it if necessary, feeding changes back into the RFC for community review and possible incorporation.

The Internet of Things interoperability platform is an ideal candidate for the open source community’s development process. In turn, community involvement is taken to a new level, across many disciplines and in many directions. Here is where we get the most benefit of an agile community: crowdsourcing the development process on principles of open communication, free of the need for participants to protect proprietary intellectual property.

We need to build the platform together, meshed around the community of Makers, DIYers, Designers, Entrepreneurs, Futurists, Hackers, and Architects, to enable prototyping in an open ecosystem. Proliferation then occurs; a diverse pool of developers, designers, architects, and entrepreneurs have many avenues of participation. They can create a new landscape of IoT systems and products.

This broad participation extends to industry, academia and the public sector. We are aiming for broad participation from these groups to build a global platform based on common needs. As a member of the steering committee, when I participated in the IoT World Forum, I heard from the technical leaders of enterprise companies (Cisco and others), research departments, and IoT service providers. They believe an open horizontal platform will be needed to enable applications that span their existing vertical markets and M2M platforms.

Instead of a top-down approach, where people from corporations and institutions get together in a big meeting and put all their wish lists together to make a standard, we’re taking an overall bottom-up approach, bringing together a diverse community ranging from makers to open source developers and entrepreneurs. Together with corporations, academia, and the public sector, we all will participate in a very broad open source project to develop a ubiquitous platform that everyone can use.

In many ways, this is modeled after the Internet and World Wide Web itself.  As we need to create a more formal standard, it will likely engage with the IETF and W3C. A good example is the semantic sensor network incubator project, which is an SSN ontology that describes everything about sensors and sensing. This enables broad interoperability between different sensor systems and platforms, based on common data models and descriptions. What we want to do is something similar to that, only on a more comprehensive scale and intended for the Internet of Things.

Tom Vu (TV):  Can you take us through a tour of the Data Object model importance and how it yields significance for simple and sophisticated connected devices?

Michael Koster (MK):

The Internet of Things today consists of many different sensor networks and protocols, connected to dedicated cloud services, providing access through smartphone and browser apps. It is rare for these separate “silos” to cooperate or interact with each other.

We abstract the complexity of sensor nets connecting devices and hardware by adding a layer of semantic discovery and linkage. This enables the sensors and actuators on disparate sensor nets to be easily combined to build integrated applications.

This works using a few techniques. First, the different sensor nets are integrated through a common abstraction layer. This works a lot like device drivers in an operating system, adapting different devices and protocols to a common system interface – only in this case, they are adapted to a common data model.
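A minimal Python sketch of that driver-like abstraction, with hypothetical adapters standing in for real protocol stacks:

```python
from abc import ABC, abstractmethod

class SensorNetAdapter(ABC):
    """Adapts one sensor-net protocol to the common data model,
    the way a device driver adapts hardware to an OS interface."""
    @abstractmethod
    def read(self, device_id):
        """Return a reading as the common form
        {'rt': ..., 'value': ..., 'units': ...}."""

class ZigBeeAdapter(SensorNetAdapter):
    def read(self, device_id):
        # a real adapter would speak the radio protocol here;
        # pretend the node reports tenths of a degree
        raw = 215
        return {"rt": "temperature", "value": raw / 10.0, "units": "C"}

class BluetoothAdapter(SensorNetAdapter):
    def read(self, device_id):
        # pretend a characteristic holds relative humidity
        raw = 0x56
        return {"rt": "humidity", "value": float(raw), "units": "%RH"}

# Applications see one data model regardless of the underlying protocol
for adapter in (ZigBeeAdapter(), BluetoothAdapter()):
    print(adapter.read("node-1"))
```

Everything above the adapter boundary then works against one schema, which is what makes cross-sensor-net applications possible.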

The common data model for sensor nets is based on the new IETF CoRE application protocol and sensor descriptions. This provides standard ways for common types of sensors to be discovered by their attributes, and standard ways for the data to be linked into applications, by providing descriptions of the JSON or BSON data structure the sensor provides as its output.
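For instance, a CoAP node lists its resources at /.well-known/core in the RFC 6690 link format, and a client can discover sensors by their rt (resource type) attribute. A simplified Python parser – it ignores quoted commas and other edge cases the RFC allows – might look like this:

```python
def parse_core_links(payload):
    """Parse a /.well-known/core response (RFC 6690 link format)
    into (path, attributes) pairs. Minimal sketch: assumes no
    commas inside quoted attribute values."""
    links = []
    for link in payload.split(","):
        target, *attrs = link.split(";")
        path = target.strip().lstrip("<").rstrip(">")
        attributes = {}
        for attr in attrs:
            key, _, val = attr.partition("=")
            attributes[key] = val.strip('"')
        links.append((path, attributes))
    return links

payload = '</sensors/temp>;rt="temperature-c";if="sensor",</sensors/light>;rt="light-lux";if="sensor"'

# discover every resource that describes itself as a temperature sensor
temps = [p for p, a in parse_core_links(payload) if a.get("rt") == "temperature-c"]
print(temps)   # -> ['/sensors/temp']
```

Discovery by attribute, rather than by hard-coded address, is what lets an application find compatible sensors on a sensor net it has never seen before.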

We use the W3C Linked Data standard to provide web representations of data models for sensor data and other IoT data streams. Linked data representations of IETF CoRE sensor descriptions are web-facing equivalents of CoRE sensor net resources. Linked data provides capabilities beyond what CoRE provides, so we can add functions like graph-based access control, database-like queries, and big data analysis.
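As an illustration of the web-facing side, a sensor reading can be published as a JSON-LD document whose @context maps short keys onto linked-data vocabulary terms. The identifiers and term names below are illustrative, not a published vocabulary:

```python
import json

# Hypothetical linked-data description of one temperature observation.
reading = {
    "@context": {
        "ssn": "http://www.w3.org/ns/ssn/",
        "value": "ssn:hasValue",
        "observedBy": "ssn:madeBySensor",
    },
    "@id": "http://gateway.example/sensors/temp/readings/1",
    "@type": "ssn:Observation",
    "observedBy": "http://gateway.example/sensors/temp",
    "value": 21.5,
}

doc = json.dumps(reading, indent=2)
print(doc)
```

Because every key resolves to a globally unique term, readings from disparate sensor nets can be merged into one graph and queried or access-controlled as such.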

Internet Smart Objects

Internet Smart Object

Internet of Things applications are essentially graph-structured applications. By using Linked Data descriptions of JSON structures and the meaning of the data behind the representation, we can create applications that link data from disparate sources into single application graphs.

Then we enable the platform with an event-action programming model and distributed software components. The common semantic language enables the data sources and software components to easily be assembled and make data flow connections. The result is an event-driven architecture of self-describing granular scale software objects. The objects represent sensors, actuators, software components, and user interaction endpoints.
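A toy Python sketch of that event-action wiring – the names are illustrative, not project code:

```python
class EventBus:
    """Minimal event-action plumbing: objects publish named events,
    and handlers (actuators, software components) subscribe to them."""
    def __init__(self):
        self.handlers = {}

    def subscribe(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def publish(self, event, data):
        for handler in self.handlers.get(event, []):
            handler(data)

bus = EventBus()
log = []

# an "actuator" object reacting to sensor events
bus.subscribe("temperature", lambda c: log.append("fan on" if c > 25 else "fan off"))

# a "sensor" object emitting observations
for reading in (22.0, 27.5):
    bus.publish("temperature", reading)

print(log)   # -> ['fan off', 'fan on']
```

Each object only knows the event names and data shapes, not the other objects, which is what lets granular components be assembled into data flows.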

FOAT Control Graph

Internet of Things with FOAT Control Graph


Tom Vu (TV):  Who and what companies should be involved?

Michael Koster (MK):

Whoever wants to participate in the building out of the Internet of Things. The people that use the infrastructure should build it out: the people who want to provide products and services based on interoperability, along with those who provide the backplane – low-power microcontrollers and microprocessors, connected sensors, and, importantly, the network infrastructure.

We want to enable all avenues of participation to allow corporations, academia, policy and standards makers, entrepreneurs and platform developers, makers, and DIY hackers all to be involved in building the platform as a community.

For corporations, we will play an important role, building a vendor-neutral platform for data sharing and exchange – an open horizontal platform that will allow the integration of what were traditionally vertical markets into new horizontal markets.

Anyone participating or expecting to participate in the emerging Internet of Things, Internet of Everything, Industrial Internet, Connected World, or similar IoT ecosystems initiatives, could benefit by participating in creating this platform. Companies that provide network infrastructure and want to build in value add can adopt this standard platform and provide it as infrastructure. Companies that want to provide new services and new connected devices that can use the IoT Toolkit to easily deploy and connect with existing resources could benefit.

All companies, organizations, and people that can benefit from an open Internet of Things are welcome to participate in the creation of a platform that everyone can use.

Tom Vu (TV):  How important is Open Source to Internet of Things evolution?

Michael Koster (MK):

I don’t see how the Internet of Things can evolve into what everyone expects it to without a large open source component. We need to go back to Conway’s law and look at it from both the system we’re trying to create and the organization that creates it. Interoperability and sharing are key in the system we want to create. It’s only natural that we create an open development organization where we all participate in both the decisions and the work.

Removing the attachment to intellectual property changes the dynamics of the development team and keeps it engaged and moving forward, solving problems. It’s important for software infrastructure projects like this to remove the barrier to cooperation that arises from the self-protective instinct around proprietary Intellectual Property, or even the egoism associated with soft intellectual property, “my” code.

Instead, we turn the whole project into a merit-based system as opposed to being ego driven.  Rather than worry about guarding our property, we are motivated to solve the problems and contribute more to the deliverable. The limits to participation are removed and there is a more rapid exposure of intentions and goals. Engagement and innovation can rule in this environment of deep collaboration.

Tim Berners-Lee said that he was able to achieve the creation of the World Wide Web system because he didn’t have to ask permission or worry about violating someone’s copyright. We are creating the same environment for people who want to build our platform, and even for those who want to build their services and applications on top of the platform.

We are going to create the service-enablement layer as open source as well, so that any company can help proliferate the idea and everyone has influence on, and access to, the development of the underlying IoT platform. If the infrastructure and platform software are open source, you can build a service on top of that software that contains proprietary code. With our license, you can even customize and extend the platform for your own needs as a separate project.

Tom Vu (TV):  Describe your work with the EU IoT organization and how you are involved as a voice for the Internet of Things?

Michael Koster (MK):

I work with the IoT Architecture group within the overall EU Internet of Things project. The IoT-A group is closely related to the Future Internet project. They have an Architecture Reference Model describing the different features one might build into an IoT platform, a sort of architecture for architectures. Since their process mirrors my own design process to a large extent, I found their reference model compatible with my architecture modeling work.

They are conducting a top-down activity, stewarding participation in the architecture and standardization model. One of the ways I work with IoT-A is by using the Smart Object API as a validation case for the Architecture Reference Model. They are building the reference model top-down, and we're building the architecture bottom-up, based on a common expression of architecture relationships and descriptions.

I am also involved in advocating open source for IoT and building local IoT demonstrator projects, educating around IoT, open data, and the like, as well as user-controlled resource access and privacy. I am providing a voice for open source and open standards in the standards movement going forward.

Here in the USA, there is nothing like what they have in Europe. Here the process will be to engage corporations and institutions and create a participatory structure that enables fair and open opportunity for influence on, and access to, both the development process and the final products.

Tom Vu (TV):  How important is an open standard – building an RFC that all industries can agree upon – in ultimately driving wider adoption and proliferation?

Michael Koster (MK):

To put it simply, a formal RFC is a document that describes part of a system.  A Request for Comments (RFC) is a memorandum published by the Internet Engineering Task Force (IETF) describing methods, behaviors, research, or innovations applicable to the working of the Internet and Internet-connected systems.  It is a process, or evolution, for achieving a more widely adopted standard.  The founders of the Internet created this process, and HTTP and the other core protocols were all built using the original RFC process from many years ago.

Through the Internet Engineering Task Force, engineers and computer scientists may publish discourse in the form of an RFC, either for peer review or simply to convey new concepts, information, or (occasionally) engineering humor. The IETF adopts some of the proposals published as RFCs as Internet standards.

If the IoT Toolkit platform becomes adopted, it may eventually span as many as 10-12 different RFCs, but it's important to get people to agree on a common first set.  This is the initial phase toward a more pervasively used universal standard.  In fact, it's sort of a strawman platform.  Its intent is to describe and collaborate, but also to invoke and seek out broader participation.  We are at the stage of putting proposals together over the next few weeks and setting up meetings to talk with many people about collaboration and participation in building an Internet of Things platform.

We believe that an open standard platform for horizontal interoperability is key to achieving the promise of the Internet of Things. Everyone needs to be able to present and process machine information in machine understandable formats on the IoT, just as we humans enjoy commonly understandable web data formats and standardized browsers on today’s WWW. It’s important that developers be able to focus on solving problems for their clients and not waste resources on communication and translation.
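As a purely illustrative sketch of what "machine-understandable formats" can mean in practice: if every device publishes a self-describing resource in a common format such as JSON, any client can interpret it without device-specific translation code. The field names below (`name`, `unit`, `value`) are hypothetical and are not taken from the actual Smart Object API.

```python
import json

# Hypothetical self-describing sensor resource, in the spirit of the
# horizontal interoperability goal described above. The schema shown
# here is an assumption for illustration, not a published standard.
sensor_description = """
{
  "name": "outdoor-temperature",
  "type": "sensor",
  "unit": "celsius",
  "value": 21.5
}
"""

def describe(resource_json):
    """Turn a machine-readable resource description into a human-readable summary.

    Because the format is self-describing, this function needs no
    device-specific knowledge: it works for any resource that follows
    the same (hypothetical) schema.
    """
    resource = json.loads(resource_json)
    return f'{resource["name"]}: {resource["value"]} {resource["unit"]}'

print(describe(sensor_description))
```

The point of the sketch is the design choice, not the schema itself: once producers and consumers agree on a common description format, new devices and new services interoperate without per-pair translation layers.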

Read Part Three to learn more about why IoT (Internet of Things) matters.

Here are Part 1 and Part 2 of the Interview Series.