A recent routine trip to the family doctor led to conversations with our primary physician, a visit to a specialist and a short trip to a medical facility for an MRI. Nothing about the process was extraordinary, and yet it held one last surprise. What we hadn’t expected was the work required to interpret the results – three separate portals had to be accessed, with the family intimately involved in correlating the data. As much as we talk about a highly integrated world where external agencies correlate data on our behalf, we clearly have a long way to go.
Having just leased a new hybrid vehicle that is completely drive-by-wire, where computers determine the synchronization between two gearboxes, apply torque to the front wheels when needed and, yes, provide just enough steering feedback to convince us we are sensing changing road conditions, the days of autonomous driving no longer seem to me to be that far off. As far-fetched as it seemed just a couple of years ago, when we all saw the first pictures of Google’s odd egg-shaped self-driving car, the emphasis today is on when and where the first cities or their surrounding suburbs will be zoned for autonomous driving only.
Data, and the sensors that generate data, are already having an impact on our everyday lives. Whether it’s the computer-enhanced imaging the medical profession can produce for patients or simply the computer control the auto industry can give drivers, a wealth of data is being generated. If we are still on the sidelines about the potential impact of the Internet of Things (IoT) on our business, then we may be missing the point – it has already arrived, and the only question now is how much more data is headed our way. Like everything encountered inside the data center, it comes down to a question of scale – just what are we going to do with the data streaming by, and where will we store the data we deem relevant?
Talking to a number of vendors this month really opened my eyes to just how many data center components ship today capable of providing sensor data every minute of every day. As IT professionals we look outward to what our business needs to support its customers, yet we have become a class of users ourselves, and for the moment we remain well behind in the race to make better use of what is now coming to us from our networks and supporting infrastructure. Like everyone else, I was shocked to hear of yet another major meltdown of an airline computer system.
While I am not confident that I know all the details, it looks as though a glitch in the power supply brought down servers, many of which didn’t have backups, and the layered implementation of multiple applications meant that returning service to normal took far longer than anyone expected. Yes, there was a lot of data streaming into that data center, but apparently not enough information to support the level of data center automation we would have expected. So often we marvel at manufacturers who live by just-in-time deliveries of individual components and subassemblies, but when are we going to put the data center on a truly response-driven model, one built around the just-in-time arrival of data?
The data center seems the logical choice for the first application of IoT data correlation and automated response. After all, turning data into information has been the mantra of IT professionals for decades, and from the time the first database was deployed, primitive application performance monitoring solutions began to appear. With what we see today, however, there is simply too much data to expect the vendors providing such solutions to bolt analytics on as an afterthought. There has to be more intelligence front-ending the data that arrives at the data center. The image of trying to sip water from a fire hydrant comes to mind, and it remains the best way to illustrate the difficulty of turning data into information.
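To make that “front-ending” idea a little more concrete, here is a minimal sketch in Python of aggregating a raw sensor stream and persisting only what crosses a threshold. The device names, fields and threshold are assumptions for illustration only, not taken from any vendor’s product.

```python
# Front-end a raw sensor stream: aggregate readings per device per minute
# and keep only the summaries we deem relevant, instead of storing everything.
from collections import defaultdict
from statistics import mean

RELEVANT_THRESHOLD_C = 35.0  # hypothetical cut-off: only keep hot intervals

def summarize(readings):
    """readings: iterable of (device_id, minute, temperature_c) tuples."""
    buckets = defaultdict(list)
    for device_id, minute, temp_c in readings:
        buckets[(device_id, minute)].append(temp_c)

    for (device_id, minute), temps in sorted(buckets.items()):
        avg = mean(temps)
        if avg >= RELEVANT_THRESHOLD_C:   # store only what matters downstream
            yield {"device": device_id, "minute": minute,
                   "avg_c": round(avg, 1), "samples": len(temps)}

raw = [
    ("rack-07", "12:01", 34.0), ("rack-07", "12:01", 36.5),
    ("rack-07", "12:02", 36.9), ("rack-09", "12:01", 22.4),
]
for summary in summarize(raw):
    print(summary)   # only the rack-07 summaries survive the front end
```

The point of the sketch is simply that the reduction happens before storage, which is what separates front-end intelligence from an analytics add-on bolted onto a database after the fact.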
“With the onset of the Internet of Things, our universe is now flooded with events that require event processing. These events are introduced to the digital world with all the wonders of IoT and wearable computing that report on anything and everything that happens. But our ability to take advantage of the power of these events is currently quite limited,” writes Opher Etzion in a July 18, 2016, post to the Striim blog, Programming: A Shortfall of Legacy Event Processing. Etzion serves as Professor and Department Head, Information Systems, at the Academic College of Emek Yezreel. He previously worked for IBM where, most recently, he was a senior technical staff member and chief scientist of event processing, having been the lead architect of event processing technology in IBM WebSphere. He has also chaired the Event Processing Technical Society (EPTS).
“The way that most programmers have approached event processing is by inserting all events into a database, and then executing ad-hoc or periodic queries to detect patterns. This is consistent with the ‘request-response’ paradigm of thinking in most programming,” wrote Etzion. The answer, when so much data is being generated, is obvious to Etzion when he adds, “Like the Internet that succeeded when browsing the web became possible for everybody, the Internet of Things will bloom when everybody can control an application’s logic by creating and modifying patterns based on multiple events.” This is exactly where Striim inserts itself into the data streams coming from every device that is continuously generating data.
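The contrast Etzion draws is easy to see in code. The sketch below, using made-up power-supply readings and thresholds, shows the request-response habit (insert everything, then query) next to an event-driven check that fires the moment a pattern completes. It is illustrative only and in no way represents Striim’s API.

```python
# Two styles of handling the same events.
# Style 1 (request-response): park events in a store, then query it on demand.
# Style 2 (event-driven): evaluate the pattern as each event arrives.
from collections import deque

events = [  # (timestamp_seconds, sensor, value) -- hypothetical readings
    (0, "psu-3", 11.9), (20, "psu-3", 12.1), (45, "psu-3", 13.2),
    (70, "psu-3", 13.4), (80, "psu-3", 13.6),
]

# Style 1: insert everything, then ask a question after the fact.
store = list(events)
spikes = [e for e in store if e[0] >= 20 and e[2] > 13.0]
print("polling found", len(spikes), "spikes -- but only when we asked")

# Style 2: detect "three readings above 13.0 within 60 seconds" as it forms.
window = deque()
for ts, sensor, value in events:
    if value > 13.0:
        window.append(ts)
        while window and ts - window[0] > 60:   # slide the 60-second window
            window.popleft()
        if len(window) >= 3:
            print(f"pattern detected at t={ts}s on {sensor}: act now")
            window.clear()
```

The second style answers the question at the moment the answer becomes true, which is the shift Etzion argues legacy event processing never made.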
With so much talk about the role of data stream analytics, particularly analytics performed in real time, and with IT professionals stepping up to leverage them in healthcare, automobiles and manufacturing – isn’t it time we began applying analytics to the operations of the data center itself? Whether it’s oversight of the network, monitoring of the applications or even the health of the physical data center itself, there is simply too much data for traditional approaches to processing it to keep up. It truly comes down to how rapidly we respond to evolving patterns as they form, and this has always been the design center for the data stream analytics solution from Striim.
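As a rough illustration of what that could look like inside the data center, the sketch below correlates three separate streams – network, application and facility – within a short time window so a single combined alert forms as the pattern does. The stream names, events and window size are assumptions made up for the example, not any vendor’s implementation.

```python
# Correlate network, application and facility events inside a time window
# and raise one combined alert when all three report trouble together.
from dataclasses import dataclass

@dataclass
class Event:
    ts: float       # seconds
    stream: str     # "network", "application" or "facility"
    detail: str

WINDOW_S = 30.0

def correlate(events):
    """Yield an alert when all three streams report trouble within WINDOW_S."""
    recent = []
    for ev in sorted(events, key=lambda e: e.ts):
        recent.append(ev)
        recent = [e for e in recent if ev.ts - e.ts <= WINDOW_S]
        if {e.stream for e in recent} >= {"network", "application", "facility"}:
            yield (ev.ts, [e.detail for e in recent])
            recent.clear()

feed = [
    Event(10.0, "facility", "UPS switched to battery"),
    Event(18.0, "network", "packet loss on core switch"),
    Event(25.0, "application", "checkout latency > 2s"),
]
for ts, details in correlate(feed):
    print(f"correlated incident at t={ts}s:", "; ".join(details))
```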
Just-in-time processing of critical data, detecting patterns and then shaping useful information for everyone involved in the health of applications, is perhaps the most useful early adoption of data analytics. The payback, in terms of improved reliability and staying well away from the media’s bright spotlight, is clearly the goal of most CIOs and data center managers. Manufacturing quickly learnt the merits of just-in-time supply chains, but the question remains – can IT professionals everywhere respond as quickly as their manufacturing peers and embrace data just in time?