Industrial Internet of Things (IIoT), Streaming Data and Analytics



The value of data is becoming a larger part of the business value chain, and the lines between different industries are blurring, or as GE’s Chairman and CEO Jeff Immelt once stated: “If you went to bed last night as an industrial company, you’re going to wake up today as a software and analytics company.” This is true not only for industrial companies, but for many companies that produce “things”: cars, jet engines, boats, trains, lawn mowers, toothbrushes, nutrunners, computers, network equipment, etc. GE, Bosch, Technicolor and Cisco are just a few of the industrial companies that offer an Internet of Things (IoT) platform.

Manufacturing has undergone a quiet revolution in the last few years, but even most industry insiders don’t realize it yet – because it’s an invisible revolution, running through its cables and wires and circuits.

Today’s manufacturers produce more data in a single day than they did in a full month just ten years ago: sensor data, camera images, PLC data, digital gauge data, bar code scans, and much more. Making sense of this data – gaining real-time, actionable intelligence from this flood of manufacturing data – is the challenge. The issue is so prevalent that the industry has coined a term for it: DRIP (Data Rich, Information Poor).

This explosion of data is not unique to manufacturing, but it is most acutely felt in this industry. According to multiple sources, manufacturing creates a higher percentage of data than any other industry. Data has become the lifeblood of advanced manufacturing, demonstrated through mantras such as “You can’t fix what you don’t measure.” The challenge now is that EVERYTHING is measured.

In other industries, this challenge has become known as “Big Data”. Simply put, Big Data means data sets so large and complex that it becomes difficult to analyze and process using traditional applications. An entire generation of innovative new technology has sprung up to address these issues.

[Figure: Streaming Analytics for IoT]

Stream Processing and Streaming Analytics

Big data architecture contains several parts. Often, masses of structured and semi-structured historical data are stored in Hadoop (Volume + Variety). On the other hand, stream processing is used for fast data requirements (Velocity + Variety).

“Stream processing” is the ideal platform for processing data streams or sensor data (usually a high ratio of event throughput to number of queries), whereas “complex event processing” (CEP) utilizes event-by-event processing and aggregation (e.g. on potentially out-of-order events from a variety of sources, often with large numbers of rules or business logic). CEP engines are optimized to process discrete “business events”, for example to compare out-of-order or out-of-stream events, apply decisions and reactions to event patterns, and so on. For this reason, multiple types of event processing have evolved, described as query, rule and procedural approaches (to event pattern detection). The focus of this article is on stream processing.
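
To make the distinction concrete, the sketch below shows the rule-based, event-by-event style that CEP engines apply, written here as a plain Python generator that detects a single event pattern. The event names, the 10-second window and the sort-based handling of out-of-order arrival are illustrative assumptions, not any particular engine’s API.

    from typing import Iterable, Iterator, Tuple

    def detect_pattern(events: Iterable[Tuple[float, str]],
                       first: str = "PRESSURE_HIGH",
                       second: str = "TEMP_HIGH",
                       within_seconds: float = 10.0) -> Iterator[Tuple[float, float]]:
        """Yield (t1, t2) whenever a `first` event is followed by a
        `second` event within `within_seconds` of it."""
        pending = []  # timestamps of `first` events awaiting a match
        # Sorting tolerates out-of-order arrival; a real CEP engine would
        # buffer and reorder incrementally instead of sorting up front.
        for ts, name in sorted(events):
            if name == first:
                pending.append(ts)
            elif name == second:
                for t1 in pending:
                    if ts - t1 <= within_seconds:
                        yield (t1, ts)
                # Drop `first` events whose window has already expired.
                pending = [t for t in pending if ts - t <= within_seconds]

    # The pressure event arrives out of order; the pattern still matches.
    stream = [(5.0, "TEMP_HIGH"), (1.0, "PRESSURE_HIGH"), (20.0, "TEMP_HIGH")]
    print(list(detect_pattern(stream)))  # [(1.0, 5.0)]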

Stream processing is designed to analyze and act on real-time streaming data, using “continuous queries” (i.e. SQL-type queries that operate over time and buffer windows). Essential to stream processing is streaming analytics, the ability to continuously calculate mathematical or statistical analytics on the fly within the stream. Stream processing solutions are designed to handle high volume in real time with a scalable, highly available and fault-tolerant architecture. This enables analysis of data in motion.
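
As a rough illustration of a “continuous query” over a time window, the following self-contained Python sketch maintains a rolling mean over the last 60 seconds of a hypothetical temperature stream. Real stream processors express the same idea declaratively in a SQL-like language; the class and values here are made up for illustration.

    from collections import deque

    class SlidingWindowAverage:
        """Maintains the mean of readings seen within the last
        `window_seconds` -- a toy time-windowed continuous query."""

        def __init__(self, window_seconds: float):
            self.window_seconds = window_seconds
            self.events = deque()  # (timestamp, value) pairs
            self.total = 0.0

        def on_event(self, timestamp: float, value: float) -> float:
            # Ingest the event in flight; no store-then-query step.
            self.events.append((timestamp, value))
            self.total += value
            # Evict readings that have slid out of the window.
            while self.events and self.events[0][0] < timestamp - self.window_seconds:
                _, old = self.events.popleft()
                self.total -= old
            return self.total / len(self.events)

    # Feed a hypothetical temperature stream event by event.
    win = SlidingWindowAverage(window_seconds=60.0)
    for ts, temp in [(0.0, 21.0), (30.0, 22.5), (90.0, 23.0)]:
        print(f"t={ts:>5}s rolling mean = {win.on_event(ts, temp):.2f}")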

In contrast to the traditional database model where data is first stored and indexed and then subsequently processed by queries, stream processing takes the inbound data while it is in flight, as it streams through the server. Stream processing also connects to external data sources, enabling applications to incorporate selected data into the application flow, or to update an external database with processed information.
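
The external-lookup pattern can be sketched in a few lines: each in-flight event is merged with reference data fetched from an external source, here a plain dict standing in for a database table. The field names are assumptions made up for illustration.

    # A plain dict stands in for an external reference database.
    machine_db = {"M1": {"line": "A", "model": "X200"},
                  "M2": {"line": "B", "model": "X300"}}

    def enrich(events, reference):
        """Merge reference attributes into each event as it streams by."""
        for event in events:
            meta = reference.get(event["machine"], {})
            yield {**event, **meta}

    for out in enrich([{"machine": "M1", "temp": 21.0}], machine_db):
        print(out)  # {'machine': 'M1', 'temp': 21.0, 'line': 'A', 'model': 'X200'}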

A recent development in the stream processing industry is the invention of the “live data mart”, which provides end users with ad-hoc continuous query access to streaming data that is aggregated in memory. Business-user-oriented analytics tools access the data mart for a continuously live view of streaming data. A live analytics front end slices, dices, and aggregates data dynamically in response to business users’ actions, all in real time.
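
A toy version of the live data mart idea follows: aggregates are kept in memory, updated as events stream in, and an ad-hoc query can group them by any dimension on demand. The schema and query shape are assumptions for illustration, not a real product’s API.

    from collections import defaultdict

    class LiveDataMart:
        """Keeps streaming rows in memory and answers ad-hoc
        aggregation queries over them on demand."""

        def __init__(self):
            self.rows = []  # each row: a dict of dimensions and measures

        def on_event(self, row: dict) -> None:
            self.rows.append(row)  # updated continuously as events arrive

        def query(self, group_by: str, measure: str) -> dict:
            """Ad-hoc slice: mean of `measure` per `group_by` value."""
            sums, counts = defaultdict(float), defaultdict(int)
            for row in self.rows:
                sums[row[group_by]] += row[measure]
                counts[row[group_by]] += 1
            return {key: sums[key] / counts[key] for key in sums}

    mart = LiveDataMart()
    mart.on_event({"line": "A", "machine": "M1", "temp": 21.0})
    mart.on_event({"line": "A", "machine": "M2", "temp": 23.0})
    mart.on_event({"line": "B", "machine": "M3", "temp": 25.0})
    print(mart.query(group_by="line", measure="temp"))  # {'A': 22.0, 'B': 25.0}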

[Figure: Live Datamart]

Industrial Analytics of Things (IAoT)
The “Industrial Analytics of Things” (IAoT) is the analytical component of the “Industrial Internet of Things” (IIoT). In the industrial setting, device and sensor technologies are rapidly becoming more intelligent and directly “on the net”, bringing us more data from everywhere, data that needs to go anywhere.

The “Industrial Analytics of Things” uses this data about operations, including materials, process conditions, product and production performance, customer feedback from the web, and product and brand sentiment: everything you need, no matter where it is located. It captures, validates, cleans and filters, analyzes, predicts, adapts and optimizes enterprise performance in real time. Real-time optimization is proven to reliably improve performance by 5-20%, and sometimes much more. Delivering the “Industrial Analytics of Things” requires creating a new high-performance distributed computing architecture that captures all this streaming industrial “big data” at its source and delivers awareness, understanding and predictive performance everywhere, anywhere, all the time.

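The capture, validate, clean/filter and analyze chain described above can be pictured as a pipeline of stages; the sketch below wires three such stages together as plain Python generators. The field names and plausibility thresholds are illustrative assumptions, not taken from any particular product.

    def validate(events):
        """Drop malformed readings (missing values)."""
        for event in events:
            if event.get("value") is not None:
                yield event

    def clean(events, low=-50.0, high=200.0):
        """Filter out physically implausible values."""
        for event in events:
            if low <= event["value"] <= high:
                yield event

    def analyze(events):
        """Attach a running mean to each surviving reading."""
        total, count = 0.0, 0
        for event in events:
            total, count = total + event["value"], count + 1
            yield {**event, "running_mean": total / count}

    raw = [{"sensor": "T1", "value": 21.0},
           {"sensor": "T1", "value": None},    # malformed -> dropped by validate
           {"sensor": "T1", "value": 999.0},   # implausible -> dropped by clean
           {"sensor": "T1", "value": 23.0}]

    for out in analyze(clean(validate(raw))):
        print(out)
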
Real-time optimization has been proven to add 5-20% to the bottom line (especially in the oil and natural gas industry, with its heavy dependence on SCADA devices).

Regardless of the scale of the company, this gives a sizable boost to your bottom line. Other benefits include:

  • Clean and validated data to support decision making
  • Situational awareness and visibility across the company
  • Understanding of the insights the information provides, which adds value
  • Timely identification of causality to eliminate issues
  • Increased customer satisfaction and lower operating costs
  • Enhanced collaboration and sharing across the organization

Industrial operations today produce operational data in larger volumes and in real time, and with the “Industrial Internet of Things” it is coming from everywhere: controllers, sensors, test equipment, and all sorts of devices found in unit operations, plants, regions, labs, maintenance shops, etc. In the future, fewer sensors will be hard-wired to control systems; instead, they will become a shared resource of streaming data. This data needs to be captured, stored, validated, cleaned, synchronized, related, visualized, analyzed and shared to ensure compliance with planned processes and yield optimal business performance.

The best way to handle all this data is to put analytics close to the source and aggregate it in real time. Some call this “fog” computing, a low-level cloud, or “warehouse-scale computing”, but here the “warehouse” is the entire enterprise. Such systems do the following (a sketch of the edge-aggregation pattern appears after the list):

  • Access, validate and store data at or near its source, while keeping it accessible from anywhere
  • Collect only data that is useful – right-size sensor data collection
  • Perform analytics in real time, close to the operation, for high-speed reaction times
  • Optimize “Big Data” and bandwidth by not sending all data to a central cloud
  • Aggregate and analyze related data where they intersect
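
A minimal sketch of this edge-aggregation pattern: summarize raw readings at the edge and forward only compact aggregates upstream, rather than shipping every sample to a central cloud. Here send_to_cloud is a hypothetical stand-in for a real uplink (MQTT, HTTP, etc.), and the batch size is arbitrary.

    import statistics

    def send_to_cloud(payload: dict) -> None:
        """Hypothetical uplink; a real system would publish via MQTT/HTTP."""
        print("uplink:", payload)

    def edge_aggregate(samples, batch_size=100):
        """Reduce each batch of raw readings to one compact summary,
        so only 1/batch_size of the records leave the edge."""
        batch = []
        for sample in samples:
            batch.append(sample)
            if len(batch) == batch_size:
                send_to_cloud({"count": len(batch),
                               "mean": round(statistics.fmean(batch), 3),
                               "min": min(batch),
                               "max": max(batch)})
                batch.clear()

    # 300 simulated readings become just 3 uplink messages.
    edge_aggregate((20.0 + (i % 7) * 0.5 for i in range(300)), batch_size=100)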

To architect such a system, we draw upon the experiences, architectures, tools and successes of such computing giants as Google, Amazon, YouTube, Facebook, Twitter and others. They have created robust high-performance computing architectures that span global data centers, and they have provided development tools and platforms, such as Google’s IoT Cloud (https://cloud.google.com/solutions/iot/), that are well suited for high-speed concurrent distributed processing and robust networking and web services. Having a similar need, but more finely distributed, we can adopt similar high-performance computing architectures to deliver and share results where they are needed in real time.

Industrial Internet Consortium (IIC)

The Industrial Internet Consortium (IIC) was founded in March 2014 to bring together the organizations and technologies necessary to accelerate growth of the Industrial Internet by identifying, assembling and promoting best practices. Membership includes small and large technology innovators, vertical market leaders, researchers, universities and governments.

The goals of the IIC are to:

  • Drive innovation through the creation of new industry use cases and testbeds for real-world applications;
  • Define and develop the reference architecture and frameworks necessary for interoperability;
  • Influence the global development standards process for internet and industrial systems;
  • Facilitate open forums to share and exchange real-world ideas, practices, lessons, and insights;
  • Build confidence around new and innovative approaches to security.

http://www.industrialinternetconsortium.org/