In July, I wrote about the role IoT analytics can play in optimizing today’s complex storage networks. In that post, I noted, “Storage system providers have turned to IoT analytics both to help ensure optimal performance and to gain a deeper understanding of how customers are using their solutions.”
Here, I’d like to discuss a related topic: how enterprises need to handle the massive amount of data IoT creates. This data has very quickly evolved from “interesting” to “mission critical.” Storage of sensor data often starts out as an afterthought, without regard to the unique type of data sensors create. Soon, however, it becomes clear that this data requires a different type of thinking when it comes to storage.
Machines and devices are creating enormous amounts of sensor data. The obvious impact on storage is that there is more data to store, and for analytics to be effective, this data must be available immediately. The more subtle impact is that IoT sensor data consists of very small files, such as log file data. Sensors can create millions of files, all of which must be accessed when conducting an analysis. This sensor data, typically stored in large-capacity network attached storage (NAS) systems, is moving to all-flash arrays to allow faster analytics.
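To make the small-file pattern concrete, here is a minimal sketch in Python. It is purely illustrative (the file names, the JSON record fields, and the average-temperature analysis are all hypothetical, not drawn from any particular IoT platform): each sensor reading lands in its own tiny file, and an analysis pass must then open every one of those files. At a scale of millions of files, that per-file open/read overhead is exactly where storage latency dominates.

```python
import json
import os
import tempfile

def write_sensor_logs(root, sensor_count, readings_per_sensor):
    """Simulate sensors emitting one tiny log file per reading --
    the many-small-files pattern typical of IoT telemetry."""
    for s in range(sensor_count):
        for r in range(readings_per_sensor):
            path = os.path.join(root, f"sensor{s:03d}_reading{r:05d}.json")
            with open(path, "w") as f:
                # Hypothetical record layout for illustration only.
                json.dump({"sensor": s, "reading": r, "temp_c": 20.0 + r}, f)

def analyze(root):
    """An analysis pass opens every small file individually; this
    per-file overhead is what grows painful at millions-of-files scale."""
    total, count = 0.0, 0
    for name in os.listdir(root):
        with open(os.path.join(root, name)) as f:
            record = json.load(f)
        total += record["temp_c"]
        count += 1
    return count, total / count

root = tempfile.mkdtemp()
write_sensor_logs(root, sensor_count=5, readings_per_sensor=4)
files, avg = analyze(root)
```

Even this toy run touches 20 separate files for one aggregate; real deployments multiply that by orders of magnitude, which is why flash-backed storage with fast metadata handling matters for IoT analytics.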
Another impact is in data protection. Most sensor data can never be recreated; for example, data on how a machine was performing just prior to a failure is a unique data set. As a result, data protection is even more critical for IoT-related data than it is for other types of enterprise data. And many of today’s backup applications cannot effectively process millions of files in a short timeframe.
Storage has been a key vertical for Glassbeam since our inception, resulting in many early customers and successful case studies. Please visit our STORAGE SOLUTIONS page to learn more.