Jim Gavigan

Are Industrial Data Historians an IIoT Platform?


**** This post was originally published on Martin Davis's Industry 4.0 blog, and I suggest you give him a follow. The original post, where you can also see how to follow him, is here. ****

A few weeks ago, Martin Davis and I were invited to present a webinar hosted by Fluke/Accelix. We talked about the importance of the IIoT and Industry 4.0 as they pertain to maintenance. Martin covered strategy: how all of the pieces fit together and what you should be thinking about. I then showed some examples of how we have been using IIoT and Industry 4.0 principles to improve our customers’ maintenance practices.

In the Q&A portion of the webinar, we got onto the subject of IIoT platforms: is the traditional data historian an IIoT platform, and if not, how do the two differ? I am not sure you would pay attention long enough for me to define every seeming nuance of the IIoT, but I want to at least try to answer these questions somewhat succinctly.

I must admit that even as someone who works in this space exclusively, I have trouble telling anyone exactly what an IIoT platform is, because it seems to mean different things to different people. Some platforms are focused on hardware connectivity, while others are more about software connectivity, which is where we are focused. So, that is where I will focus in the explanation below.

In a nutshell, the IIoT is all about connectivity and how easily (or not) data is shared between layers. Those layers are:

  • Sensors

  • Controllers or data concentrators at the plant floor level

  • Historical data repositories (data historians)

  • MES/MOM systems (including lab systems)

  • ERP

  • CMMS

  • Analytics platforms, either on premises or in the cloud

In the past, there have been lots of data silos. The data historian had data, the lab system had data, the MES system had data, the ERP system had data, and so on. However, none of these systems shared their data easily with the others, much less with a cloud-based analytics tool, data lake (data swamp, in most cases), or similar system. It seemed we always had to write some kind of custom code to get these systems to talk and to get the data mapping right. That makes for a very inflexible environment that is expensive to maintain, and it can be difficult to make changes when one of the systems needs to be swapped out or even just upgraded.

The promise of an IIoT platform is that there will be a unified namespace where all data (control system data, data historians, MES/MOM systems, ERP systems, and cloud systems) is managed. My friend Walker Reynolds at Intellic Integration in Dallas has helped me better understand this concept and why it is so important; you can catch one of his primers here. I will get back to what this means in a moment, but it builds on an idea my friend Lance Fountaine, now an Operation Intelligence Leader at Cargill, talked with me about when he was working for Alcoa and I was with OSIsoft. He talked about OSIsoft’s PI System being the data infrastructure for any of the point systems that people kept trying to sell them. He would say something like, “You have an energy management solution? Great, here is my data,” and he would point them to their PI System. So, the idea was not to integrate a bunch of point systems, but to have a unified data repository for all pertinent plant floor data. If you ask Ben Still, who works for me, he will tell you that he views connectivity in an IIoT world this way: all of the data is published out to a “data bus,” and you simply plug in the application you are integrating and subscribe to the data you need.

So, think about it: if ALL systems producing or consuming data publish their information to a central “broker,” and there is a common, easy way to configure what information your new system will consume from or produce into the namespace, integration becomes much easier. I actually wrote a post on January 12, 2016, that dealt with this topic before the unified namespace idea had any real traction, where I posed the question, “Can data flow be a resource just like water or electricity?” We take having electricity and running water for granted; there is a tremendous infrastructure out there for connecting water and electricity to our houses, our businesses, and anywhere else we go. But if we want that same ease of access to plant floor data when we go to work, forget about it. So, the IIoT (on the software side of things) is really about sharing ALL data easily, with minimal custom code needed to make disparate systems “talk” to each other. Each system simply publishes information out to a broker, and any other system can consume it. That way, we can consume data just like we use water and electricity. Is this type of technology here? Yes, but not everyone is playing this way yet.
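To make the “publish to a broker, subscribe to what you need” idea a little more concrete, here is a minimal sketch using MQTT, one common transport people use for a unified namespace, with the Eclipse paho-mqtt client (1.x-style API). The broker address, topic hierarchy, and payload shape are hypothetical examples for illustration, not a prescribed standard.

```python
# Minimal sketch: one system publishes to a central broker, another subscribes.
# Broker address and topic names are hypothetical.
import json
import time

import paho.mqtt.client as mqtt

BROKER = "broker.plant.example"            # hypothetical central broker
TOPIC = "acme/site1/line3/pump7/flow"      # hypothetical namespace: enterprise/site/line/asset/metric

# --- Producer side: a device, gateway, or historian publishes its data once, to one place ---
producer = mqtt.Client()
producer.connect(BROKER, 1883)
producer.publish(TOPIC, json.dumps({"value": 42.7, "units": "gpm", "ts": time.time()}))
producer.disconnect()

# --- Consumer side: any application plugs in and subscribes to just what it needs ---
def on_message(client, userdata, msg):
    payload = json.loads(msg.payload)
    print(f"{msg.topic}: {payload['value']} {payload['units']}")

consumer = mqtt.Client()
consumer.on_message = on_message
consumer.connect(BROKER, 1883)
consumer.subscribe("acme/site1/line3/+/flow")   # wildcard: every flow metric on line 3
consumer.loop_forever()
```

The point is that neither side knows anything about the other; each only knows the broker and the agreed-upon namespace, which is exactly what makes adding or swapping systems so much less painful.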

So, what does the traditional industrial data historian have to do with this type of environment, and how does it play? Well, I believe it will still play an incredibly valuable role. I wrote about what I think the IIoT architecture of the future will look like back in March of 2017, and you can read it here. I think that in some cases the data historian will be the “edge” storage platform (i.e., the closest physical place to the actual equipment where you are storing data, running analytics, or running machine learning algorithms). In other cases, it will remain the on-premises storage and analytics engine, and in still other cases, you will see traditional historians in the cloud because of the economics and redundancy capabilities. In almost every case, the historian will remain the best repository for time-series data, and pretty much any historian worth its salt today has analytics and event capture capability, along with some type of web-based visualization tool for dashboarding. I think that for many years to come, the data historian will remain quite relevant. After all, if you want to do machine learning or artificial intelligence, you had better have good, quality historical data, and data historians are built and optimized for the ingress and egress of time-series data. Out-of-the-box functions for event capture, basic analytics, and aggregation are a tremendous value in the data historian, and I believe this will remain true for quite some time to come.

Now, in a true IIoT architecture, I see the data historian as a “node” publishing and consuming information to and from the unified namespace. I don’t see MES, ERP, or even many cloud-based analytics platforms always storing all of the time-series data. In many cases, the data historian will aggregate the data and push it to other places. For instance, you might want to publish an hourly average flow rate from the historian to the unified namespace to be consumed by a cloud-based analytics platform, rather than the 10-second raw flow rate. As a company, you may want to aggregate run hours and other conditions once or twice a day into the unified namespace to be consumed by your CMMS, rather than publishing all of the raw data produced by the equipment. Again, let the historian aggregate the pertinent data and publish it as needed. Your ERP system may only need shift totals or daily totals for production accounting, and the data historian is often a great place to perform these aggregations and rollups, which can then be pushed to the namespace for the ERP system to consume. MES, CMMS, and ERP systems are often terrible at dealing with these kinds of aggregations of raw time-series data; that is not what they were built for. So, let the data historian do the aggregation and basic analytics work, and then publish the important, correct, contextualized data to the namespace so that other systems can consume it and do their jobs. A sketch of this aggregate-then-publish pattern follows.
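Here is a minimal sketch of that pattern, assuming the raw 10-second samples have already been pulled from the historian through whatever query interface it exposes, and reusing the same hypothetical broker, topic hierarchy, and 1.x-style paho-mqtt client as the earlier sketch.

```python
# Minimal sketch: roll raw 10-second flow readings up to hourly averages and
# publish only the rollups to the namespace, so downstream systems never see
# the raw firehose. Data, broker, and topic names are hypothetical.
import json
from collections import defaultdict
from datetime import datetime, timezone

import paho.mqtt.client as mqtt

def hourly_averages(samples):
    """samples: iterable of (timestamp, value) pairs at raw (e.g., 10 s) resolution.
    Returns {hour_start_iso: average_value}."""
    buckets = defaultdict(list)
    for ts, value in samples:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[hour].append(value)
    return {hour.isoformat(): sum(vals) / len(vals) for hour, vals in buckets.items()}

# Pretend these came from a historian query for the last couple of hours.
raw = [
    (datetime(2023, 5, 1, 8, 0, 10, tzinfo=timezone.utc), 41.8),
    (datetime(2023, 5, 1, 8, 0, 20, tzinfo=timezone.utc), 42.3),
    (datetime(2023, 5, 1, 9, 0, 10, tzinfo=timezone.utc), 44.1),
]

client = mqtt.Client()
client.connect("broker.plant.example", 1883)    # hypothetical broker
for hour, avg in hourly_averages(raw).items():
    # One small, contextualized message per hour instead of 360 raw samples.
    client.publish("acme/site1/line3/pump7/flow/hourly_avg",
                   json.dumps({"hour": hour, "avg_gpm": round(avg, 2)}))
client.disconnect()
```

The same idea applies to run hours for the CMMS or shift totals for the ERP system: the historian does the rollup it is good at, and the other systems consume a handful of clean, contextualized values instead of raw time-series data.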

I must admit, I am not a big fan of all of these buzzwords: IIoT, Industry 4.0, Digital Transformation, et al. They do give us a common platform to communicate from, but it seems as if everyone jumps on the bandwagon and puts their own spin on things. I have seen people claim to have an IIoT platform when all they have is a few sensors pushing some data to their own AWS cloud instance. That isn’t a “platform,” folks. It might be a small corner of a platform, but it isn’t a platform, so stop calling it that and quit confusing the market. If you take one thing away from this post, take this: the IIoT is all about connectivity and ease of integration between disparate systems through a unified namespace.

If we are going to reach the promise of a fourth industrial revolution, we have to leverage our data in a much smarter, more proactive, and more predictive way. The IIoT is simply a way to make this a reality and not some dream that is hopelessly complex and impossibly expensive. The historian will have a role to play, but I don’t see any data historian being a true IIoT platform. My belief is that the data historian will simply be a very important node in the ecosystem.
