In the Product Design & Development Brainstorm we talk with industry leaders to get their
perspective on issues critical to the design engineering marketplace. In this issue, we ask:

As M2M systems become more pervasive, will vendors need to agree upon standards for
device-to-device communication? What kind of standards, if any, are needed?

Steve Jennis, SVP Corporate Development,

Let's first address the definition of M2M systems. A traditional M2M system involves
point-to-point (north-south) connections over the Internet between a device and a
Cloud service. Typically, these first-generation systems address only specific
tactical applications and have their own embedded ways of enabling
communications, standards-based or not. As such, they have generated only a
relatively small market (compared to enterprise computing markets and the
potential of the IoT) and are thus predominantly serviced by relatively small vendors.

Now let's look at the approach required to fully exploit the potential of the IoT.
The infrastructure required is quite different from that of point-to-point
first-generation M2M systems. It is being defined by industry giants such as Intel,
Cisco, IBM, and Microsoft in collaboration forums such as the Industrial Internet
Consortium (IIC) and the OpenFog Consortium.

This infrastructure must enable a digital enterprise by supporting ubiquitous,
yet secure, data accessibility, system-wide interoperability, and composability.
That is, both north-south and east-west data connectivity to support business
value-add through applications at the edge (in devices), in gateways, in fog nodes
(in appliances), or in the Cloud (as remote services). Many analysts (e.g. IDC)
comment that soon 40% of IoT data will be "stored, processed, analyzed at or near
the edge" and that 50% of IoT systems will be "network constrained."

Therefore, first-generation M2M systems (with their dependence on Cloud services
and always-on Internet connectivity) will not be acceptable for many reasons
(latency, bandwidth, security, reliability, robustness, recovery, etc.). This has
resulted in industry efforts to support concepts like edge intelligence, fog
computing, and distributed analytics, in addition to, and complementing, Cloud
services. If first-generation M2M systems pioneered the use of the Internet and
Cloud services to add value to device data, then these second-generation systems
are those that will enable the Industrial IoT.

Obviously, many standards already exist that are highly relevant to these
second-generation systems. The IIC has published a Reference Architecture that
identifies relevant standards, and the OpenFog Consortium similarly has working
groups on open architectures (using open standards). Both of these organizations
were founded and are led by some of the biggest names in IT, those with the
influence to define the next-generation industrial digital infrastructure.

1. Look beyond traditional (first-gen) M2M systems. Integrating
multiple north-south data silos later will be a nightmare.
2. Look to industry consortia such as the IIC and the OpenFog
Consortium to help you understand concepts such as edge
intelligence, distributed analytics, and fog computing.
3. New standards are rarely required. Applying existing, proven, and
recommended standards (such as DDS and MQTT) within an open
reference architecture will give you a timely solution with minimal risk.
4. Consider the reference architectures being collaboratively
developed by the biggest names in IT, and beware of M2M vendors'
proprietary approaches.

Dr. Edward Griffor, Associate Director, Smart Grid
& Cyber Physical Systems Program Office, NIST

Yes and no. Yes, because the vision for M2M, or more generally the IoT,
anticipates a high degree of interoperability, and standards are a traditional
way of achieving that goal. Standardized interfaces would also, as in other
domains, create opportunities for commercial products based on a broad potential
usage for solutions. These standards would enable uniformity at different levels:
from a common data model to common communications protocols and network
management algorithms, by domain, all the way to common 'parts' (i.e. common
communications hardware and software).

No, because there remain in the commercial space multiple perspectives,
participants, and goals supporting divergence in device-to-device communications.
Vendors producing customized variants have independent revenue streams associated
with each variant. This is the case even if much of the abstract data being
communicated is largely common. Their customers still regard this communication
layer as proprietary, as it reflects and closely supports 'their design'.
Standards may cut into some of those revenue streams. Agreeing on standards would
potentially render many of the vendors' customized variants redundant.
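Jennis's takeaways recommend proven messaging standards such as MQTT for device data. As a minimal, illustrative sketch of the kind of north-south telemetry a first-generation device might publish, the snippet below builds an MQTT topic string and JSON payload in Python; the broker address, topic hierarchy, and device ID are assumptions for illustration, not from the article, and the actual publish call (via the Eclipse paho-mqtt client) is shown only as a comment.

```python
import json
import time

# Hypothetical broker and topic hierarchy -- assumptions, not from the article.
BROKER = "broker.example.com"
TOPIC = "plant1/line3/sensor42/temperature"

def make_payload(device_id: str, value: float) -> str:
    """Build a JSON telemetry payload suitable for an MQTT publish."""
    return json.dumps(
        {
            "device": device_id,
            "value": value,
            "ts": int(time.time()),  # Unix timestamp in seconds
        },
        sort_keys=True,
    )

payload = make_payload("sensor42", 21.5)
print(TOPIC, payload)

# With the Eclipse paho-mqtt client installed, publishing would look like:
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.connect(BROKER, 1883)
#   client.publish(TOPIC, payload, qos=1)
```

A hierarchical topic like this is what makes east-west reuse of the same data possible later: other subscribers can filter with wildcards (e.g. `plant1/+/+/temperature`) without the publishing device changing.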