IoT Interoperability for Big Data: What Exactly?

The importance of IoT interoperability is widely acknowledged and promoted. The McKinsey report estimates that achieving interoperability would unlock an additional 40% of IoT market value. However, there is usually little specificity as to what type of interoperability is being discussed and what it is useful for once achieved. There are two important flavors of IoT data interoperability, illustrated in the adjoining figure. We argue that aggregation- or service-level interoperability across domains and specifications is the one necessary to enable useful big-data IoT aggregations. In contrast, most active IoT standards efforts focus on M2M, intra-domain interoperability.

IoT Data Interoperability: What Exactly?

In IoT system hierarchies, such as the one depicted in the figure, there are at least two very important levels of data interoperability:

  • Machine-to-machine (M2M), device-level interoperability

  • Aggregation level interoperability, as provided to cloud services by data queries and APIs

Current IoT Standards Direction: M2M Interoperability, Intra-domain

Device-level M2M interoperability is the one most commonly discussed, and it is the target of most IoT standards under development, including OCF (formerly OIC, now merged with AllJoyn and UPnP), W3C WoT, IPSO, and oneM2M. For devices to interoperate, such standards often define or assume, in addition to data models, communication protocols, security, and discovery, and sometimes add configuration and device management. Interoperability is often claimed, but at this level it means that devices, such as home appliances, from different vendors will interoperate only if all of them correctly implement the same specification or run some common middleware. Similar functionality may be achieved by installing proprietary, vendor-specific middleware on all target devices, albeit without the benefit of a public specification and with the ensuing vendor lock-in. Interoperability at the M2M level is typically intra-specification and intra-domain, i.e. it can be achieved among a group of devices with compliant implementations of a common specification. The problem in practice is that there are many different IoT standards proposals with incompatible data models and definitions, usually aimed at rather narrowly focused domains, such as (many flavors of) industrial and consumer. While IoT is poised to thrive by using Internet connectivity and much of its design and technology, so far it has unfortunately failed to create a moral equivalent of HTML for sensor data.

Big-data IoT Aggregations: Inter-domain, Inter-specification Interoperability

To increase the value of analytics and AI, big-data aggregations need to include large and diverse sets of IoT data that typically span a multitude of devices across different domains. In practice, that typically implies the use of multiple different IoT standards mixed with proprietary and legacy data and meta-data formats, making such aggregations inter-domain and inter-specification. As an illustration, consider a smart city that plans to use IoT technology to coordinate and optimize data across its many disparate IoT and legacy systems, such as transportation, lighting, energy, building management, and water supply. Or consider a holistic factory-optimization project that needs to stitch together IoT sensor data in a variety of different and likely incompatible vendor-specific formats into a common sensor database for predictive analytics.

These requirements can be addressed by providing data interoperability at the aggregation or cloud-service level, shown in the figure at the layer above the domain-specific or federated sensor database(s). Service-level data interoperability implies providing a common data “meta” model or annotation for data queries used by services. It provides the ability to query aggregate data and meta-data sets and obtain results in a common form, regardless of differences in the formats originally used to encode data when captured at the edge. The IIC refers to this as conceptual interoperability [IIC IoT ref arch, p. 68], i.e. representation of information in a form whose meaning is independent of the application generating or using it.
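As a minimal sketch of this idea, the snippet below normalizes sensor records captured under different edge encodings into one common, queryable form. The field names and the source formats' key layouts are illustrative assumptions (hence the "-like" suffixes), not taken from the actual IPSO or OCF specifications:

```python
# Hypothetical sketch: map records from different edge encodings into a
# common meta-model so services can query them uniformly. Field names
# and record layouts are invented for illustration.

def normalize(record: dict) -> dict:
    """Translate a format-specific record into the common form."""
    fmt = record.get("format")
    if fmt == "ipso-like":
        # e.g. {"format": "ipso-like", "obj": 3303, "res": 5700, "val": 21.5}
        return {"kind": "temperature", "value": record["val"], "unit": "Cel"}
    if fmt == "ocf-like":
        # e.g. {"format": "ocf-like", "rt": "r.temperature", "temperature": 22.0}
        return {"kind": "temperature", "value": record["temperature"], "unit": "Cel"}
    raise ValueError(f"unknown format: {fmt}")

# Records aggregated from two differently encoded sources:
records = [
    {"format": "ipso-like", "obj": 3303, "res": 5700, "val": 21.5},
    {"format": "ocf-like", "rt": "r.temperature", "temperature": 22.0},
]

# One query over the normalized aggregate, independent of edge encodings:
common = [normalize(r) for r in records]
avg = sum(r["value"] for r in common) / len(common)
```

The point of the sketch is that the analytics query at the bottom never needs to know which specification originally encoded each reading; only the normalization layer does.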

Why?

A significant benefit of IoT data interoperability is the ability to create large, useful aggregations of sensor data for post-processing such as data mining, analytics, machine learning, and AI. It is well known that the effectiveness of all those techniques increases with the size and diversity of data sets. At present, IoT data are fragmented and locked in silos due to the incompatibility of formats in proprietary platforms and in numerous evolving standards that focus on limited domains. Another significant benefit of an architected approach to interoperability is the ability to incorporate the volumes of existing sensor data being produced by legacy systems, including proprietary Supervisory Control and Data Acquisition (SCADA) systems, Building Management Systems (BMS), energy utilities, and various automation and manufacturing systems using legacy standards such as BACnet and Modbus.

In addition to increasing IoT data-set size and diversity, the availability of a commonly understood format enables the creation and use of portable IoT applications and services, such as analytics and AI, and significantly accelerates the variety and usefulness of offerings by following the proven Internet playbook. In the absence of service-level data interoperability, which is the case today, IoT services and applications tend to be custom tailored to specific systems and data formats, which makes it difficult and prohibitively time consuming to port them to another system or cloud hosting service. This situation makes today's complex IoT installations look more like the proprietary and brittle SCADA systems of yore. To proliferate at Internet velocity and scale, IoT data need to be made easy to aggregate, with portable apps and services to operate on them; both are made possible by service-level interoperability.

How to Get There?

Aggregation- or service-level interoperability requires a common methodology for trans-specification data and meta-data annotation or modeling. There is no current specification for that, but some standards bodies, like IPSO, with encouraging interest from others, are beginning to work on semantic interoperability across domains and specifications. Its requirements and properties will be discussed in another post. However, its feasibility and an implementation option were demonstrated by our interoperability proof of concept (POC), which provides real-time translation of sensor data encoded in IPSO, OCF, and Haystack into a common semantically interoperable form.
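One way such a real-time translation step could be structured is a registry of per-format translators applied to each message as it arrives at the service layer. This is only a structural sketch, not the POC's actual implementation; the translator names, tag vocabulary, and message layouts are invented for illustration and do not follow the real IPSO, OCF, or Haystack encodings:

```python
# Hypothetical sketch: a translator registry that converts incoming
# messages, one per source format, into a common semantically
# annotated record. All names and layouts here are illustrative.
from typing import Callable, Dict

TRANSLATORS: Dict[str, Callable[[dict], dict]] = {
    # e.g. a Haystack-style message might carry a "curVal" field
    "haystack-like": lambda m: {"tags": {"temp": True, "sensor": True},
                                "value": m["curVal"]},
    # e.g. an IPSO-style message might key the reading by resource id
    "ipso-like": lambda m: {"tags": {"temp": True, "sensor": True},
                            "value": m["5700"]},
}

def translate(source_format: str, message: dict) -> dict:
    """Translate one incoming message into the common annotated form."""
    try:
        return TRANSLATORS[source_format](message)
    except KeyError:
        raise ValueError(f"no translator registered for {source_format}")
```

A registry like this keeps the common model stable while new formats are supported simply by registering another translator, which is the extensibility property a trans-specification methodology would need.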