IoT data architecture: Can you get data out of silos?

By James Nolan

In many cases, legacy enterprise IT infrastructure will form the basis for initial IoT deployments.

The Internet of Things (IoT) will generate unprecedented amounts of data: a staggering 400 zettabytes (ZB) a year by 2018, according to a report from Cisco. A major part of the value in this data will lie in using it effectively across the enterprise, or even across different enterprises and industries. If your legacy systems and/or new IoT data exist in silos, your IoT deployments will be limited in their ability to add value to the enterprise.

Before we can discuss the specifics of why silos are not ideal, it is important to consider the nature of IoT data. Different types of IoT deployments generate very different types of data. Some deployments generate very high volumes of data with a lot of noise and very little signal; take the transport sector, for example. Highway traffic monitoring systems produce huge volumes of video, but the value of that video, monitoring traffic patterns, may be modest: hours of footage may contain no traffic jams or back-ups at all. Other deployments generate a much lower volume of data that can be of very high value; an emergency service, for example, generates data only when it detects a critical alarm that requires immediate action. In either case, messages may also flow in the opposite direction, such as a settings adjustment, a control message, or a configuration protocol message.
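To make that distinction concrete, here is a minimal sketch in Python, using hypothetical names rather than any real product API, of how a deployment might label its traffic: bulk, low-value telemetry; rare, high-value alarms; and downlink commands flowing back to devices.

```python
from dataclasses import dataclass, field
from enum import Enum
import time

class Direction(Enum):
    UPLINK = "device-to-platform"     # telemetry and alarms
    DOWNLINK = "platform-to-device"   # commands and configuration

class Kind(Enum):
    TELEMETRY = "telemetry"   # high volume, often low value per message
    ALARM = "alarm"           # low volume, high value, needs immediate action
    COMMAND = "command"       # settings adjustment, control, configuration

@dataclass
class Message:
    device_id: str
    kind: Kind
    direction: Direction
    payload: dict
    timestamp: float = field(default_factory=time.time)

# A traffic camera streams bulk frame metadata (high volume, low value per message)...
frame = Message("camera-17", Kind.TELEMETRY, Direction.UPLINK,
                {"segment": "M4-J12", "frame_ref": "frame-000123"})

# ...while an emergency sensor sends a rare, critical alarm (low volume, high value).
alarm = Message("smoke-03", Kind.ALARM, Direction.UPLINK,
                {"severity": "critical", "location": "platform 2"})

# Messages also flow back down to devices, for example a configuration change.
command = Message("camera-17", Kind.COMMAND, Direction.DOWNLINK,
                  {"frame_rate": 5})
```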

You are likely to be doing more than just collecting your data: you will need to develop sophisticated data collection and management tools to extract value from it. In enterprise deployments, you may also find that you need data analytics or business intelligence tools to extract valuable information from the data.

In many cases, legacy enterprise IT infrastructure will form the basis for initial IoT deployments. For example, a recent survey from the Telecommunications Industry Association, commissioned by my company InterDigital, found that 76% of companies are either exclusively or primarily focused on integrating legacy business systems with IoT solutions to get them fit for purpose. Anyone with a passing familiarity with enterprise IT infrastructure will recognise how easily data becomes siloed in legacy applications, within an architecture that is not shared with or available to other tools or applications. Some silos form along the boundaries of different business organisations within an enterprise, but others are the result of a non-scalable architecture. A typical design calls for one type of device to be connected via a secure gateway to a device management platform, and from there to a series of specific data services. That is a perfectly acceptable model for a small-scale deployment, but what if you wish to evolve and add a second type of device to your infrastructure? Or a third? Will the new devices be able to connect to the same type of gateway? Maybe. But can they also use the same data management platform and data services? If they cannot, silos form and your data will be of limited use.
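As a rough illustration of that pattern (a sketch with hypothetical class names, not any vendor's actual API), each device type in the one-to-one model tends to bring its own vertical stack, and nothing in one stack can see the data held in another:

```python
# Sketch of the siloed, one-to-one pattern: one device type, one gateway,
# one management platform, one set of data services bound to that platform.

class TrafficCameraGateway:
    def forward(self, reading: dict) -> dict:
        # In a real system this would authenticate and decode the device message.
        return reading

class TrafficCameraPlatform:
    def __init__(self):
        self.store: list[dict] = []   # data lives only inside this platform
    def ingest(self, reading: dict):
        self.store.append(reading)

class TrafficAnalyticsService:
    def __init__(self, platform: TrafficCameraPlatform):
        self.platform = platform      # bound to exactly one platform
    def congestion_report(self) -> int:
        return len(self.platform.store)

gateway = TrafficCameraGateway()
platform = TrafficCameraPlatform()
analytics = TrafficAnalyticsService(platform)

platform.ingest(gateway.forward({"segment": "M4-J12", "vehicles_per_min": 42}))
print(analytics.congestion_report())

# Adding a second device type (say, smoke detectors) means building a parallel
# Gateway -> Platform -> Services chain. The analytics service above can never
# see the smoke-detector data, so a silo forms by construction.
```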

There are a number of ways to get data out of silos. One approach is to look at how cloud-based data services can be integrated with your enterprise IoT deployments in a more open way than the one-to-one model described above. Think of placing all of your data in a marketplace or “clearing house”, a public or private cloud solution where it can be used by many different data services at once. This improves efficiency in obvious ways, mostly by avoiding the duplication of data streams, but it also frees data from one part of your deployment to be used by applications from another. That simple shift in thinking turns something that was not possible in the earlier example into a tangible solution.
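Below is a minimal, in-memory sketch of that shift, again with hypothetical names; a real deployment would use a managed broker or cloud data platform rather than a Python dictionary, but the structural point is the same: devices of any type publish once into shared topics, and any number of services consume the same stream.

```python
from collections import defaultdict
from typing import Callable

class ClearingHouse:
    """Toy stand-in for a shared pub/sub data layer: no service owns the data."""
    def __init__(self):
        self.subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]):
        self.subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict):
        for handler in self.subscribers[topic]:
            handler(event)   # every interested service sees the same stream

hub = ClearingHouse()

# Two unrelated services consume the same traffic data without duplicating the stream.
hub.subscribe("traffic/flow", lambda e: print("congestion model:", e))
hub.subscribe("traffic/flow", lambda e: print("city dashboard:", e))

# A camera gateway publishes once; both services receive the event.
hub.publish("traffic/flow", {"segment": "M4-J12", "vehicles_per_min": 42})
```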
