The Tipping Point – Data Stream Analytics Meets NonStop Transaction Processing in Real Time

4 Minute Read

Are we comfortable yet with data streams? Are we tapping them for greater insight into changes in behavior, and do we acknowledge their potential contribution to all aspects of our mission-critical transaction processing? Or are we prepared to let the streams burble on by for a while longer? In discussing big data analytics with the NonStop community, and in particular the move to real-time data stream analytics, there is clearly a diversity of opinions, which is not unexpected. However, there are early signs of a growing appreciation, among NonStop users and vendors alike, of the intersection between data stream analytics and transaction processing.

Across the business world there is often talk of tipping points and lynchpin occurrences. Often the fodder for novels and movies, lynchpin occurrences refer to the first domino that falls which, in turn, triggers an unstoppable cascade that eventually leads to completely changed circumstances. A tipping point, on the other hand, “is that subtle juncture where a certain idea, product or behavior suddenly catches on, or ‘tips,’ and establishes a whole new reality on the ground,” as one source I turned to noted. What lynchpins and tipping points have in common is the irreversible impact they have on everything they touch – a kind of course correction from which there is no coming back.

Race car drivers are always fine-tuning their cars, but even as they make adjustments they ensure they can return to a baseline set-up should the changes prove to be a step backwards. That’s the nature of the sport – simply put, if you are not advancing, then by standing still you are falling behind. When it comes to IT, improvements came slowly at first, with systems even carrying nomenclature tied to a decade – the System/360 followed by the System/370 and so on – as IBM came out with new mainframes. Fortunately, vendors have dropped such classifications as change continues to accelerate. Recent changes are opening our eyes to just how big a contribution technology is making to business. As I pointed out in my recent presentation on IoT and IoT Analytics (IoTA) to the NonStop community at the 2015 NonStop Technical Boot Camp (Boot Camp), quoting Meg Whitman, CEO of Hewlett Packard Enterprise, in my opening slides: “IT strategy and business strategy are no longer separate, they have become inseparable … every business is a technology business today.”

The prospect of reaching a tipping point came up in a discussion with one vendor attending my presentation at Boot Camp. Central to the conversation was the business acceptance of hybrids – in particular, hybrids not just as a reference to clouds, whether public, private and/or managed, but to the technology collective that now makes up a data center. Of course, the shared infrastructure chassis from HPE that packages NonStop together with Linux and/or Windows sharing a common InfiniBand infrastructure was part of the conversation, but the picture is a lot bigger than any one vendor’s platform. Hybrids are indeed catching on and, from my perspective, will prove to be a tipping point when it comes to understanding all that is transpiring within a data center.

Often referenced as disruptive technology, big changes such as the deployment of hybrids highlight that we need new tools and new processes – we cannot simply expect data center operations to comprehend all that is happening around them. There are just too many data streams of events and alerts, presented in different forms, for any understanding or consensus to materialize without additional assistance. What I like about the models behind Striim is how it can be turned on to look at disparate data streams and make sense of data that lacks any uniform structure. It’s as if the processing Striim performs ensures a steady supply of consumable – or, to put it another way, drinkable – water.
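To make that idea concrete, here is a minimal sketch of what folding disparate event streams into one common structure can look like. This is not Striim’s actual API or product behavior – the source names, field layouts and normalization rules below are assumptions chosen purely for illustration.

```python
import json
from datetime import datetime, timezone

# Hypothetical raw events from different parts of a hybrid data center.
# Each source reports status in its own shape; none of these layouts come
# from a real product - they are stand-ins for illustration only.
RAW_EVENTS = [
    ('nonstop_app', '{"tx_id": "T100", "elapsed_ms": 42, "status": "OK"}'),
    ('linux_syslog', 'Dec 01 12:00:01 host1 app[77]: WARN queue depth high'),
    ('firewall_csv', '2015-12-01T12:00:02Z,deny,10.0.0.7,443'),
]

def normalize(source, payload):
    """Fold a raw event into one common shape: source, time, severity, detail."""
    now = datetime.now(timezone.utc).isoformat()
    if source == 'nonstop_app':
        record = json.loads(payload)
        severity = 'info' if record['status'] == 'OK' else 'error'
        return {'source': source, 'time': now, 'severity': severity, 'detail': record}
    if source == 'linux_syslog':
        severity = 'warning' if ' WARN ' in payload else 'info'
        return {'source': source, 'time': now, 'severity': severity, 'detail': payload}
    if source == 'firewall_csv':
        time_stamp, action, addr, port = payload.split(',')
        severity = 'warning' if action == 'deny' else 'info'
        return {'source': source, 'time': time_stamp, 'severity': severity,
                'detail': {'action': action, 'addr': addr, 'port': port}}
    return {'source': source, 'time': now, 'severity': 'unknown', 'detail': payload}

for source, payload in RAW_EVENTS:
    print(normalize(source, payload))
```

Once events from every corner of the data center share one shape, the downstream analysis no longer cares whether the original was JSON, syslog text or a CSV line – which is the whole point of taming the babble before trying to act on it.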

Striim (pronounced “stream”) reduces the babble that arises from data streams as they pass by – the noise that would otherwise be incomprehensible – and turns it into actionable data. In my presentation at Boot Camp I referenced the blog post of March 9, 2015, In a Realtime World, NonStop Customer Experience Can Change in an Instant! In the real-time IT world, systems, platforms, operating systems, middleware and applications are all providing updates about their operational status and yet, I noted, as we mix in other systems it all becomes noise! Picking just one example, I added in that post how this constant barrage of data makes tracking the performance of an application difficult: who can tell whether basic SLA metrics are being met?
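As a small continuation of the earlier sketch, and assuming the same normalized event shape, the snippet below shows how a sliding-window check might flag whether a basic latency SLA is being met as transactions stream past. The 200 ms threshold, 95% target and 60-second window are illustrative assumptions, not figures from the post or from any product.

```python
from collections import deque

# Hypothetical SLA: 95% of transactions in the last 60 seconds finish under 200 ms.
SLA_THRESHOLD_MS = 200
SLA_TARGET = 0.95
WINDOW_SECONDS = 60

window = deque()  # (arrival_second, elapsed_ms) pairs for recent transactions

def record_transaction(arrival_second, elapsed_ms):
    """Add one transaction to the sliding window and report SLA conformance."""
    window.append((arrival_second, elapsed_ms))
    # Drop anything that has aged out of the window.
    while window and window[0][0] < arrival_second - WINDOW_SECONDS:
        window.popleft()
    within_sla = sum(1 for _, ms in window if ms <= SLA_THRESHOLD_MS)
    ratio = within_sla / len(window)
    status = 'meeting SLA' if ratio >= SLA_TARGET else 'SLA at risk'
    print(f"t={arrival_second}s: {ratio:.0%} under {SLA_THRESHOLD_MS} ms -> {status}")

# Feed in a few illustrative transactions.
for second, elapsed in [(1, 40), (2, 180), (3, 250), (4, 90), (5, 400), (6, 60)]:
    record_transaction(second, elapsed)
```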

The intersection between data stream analytics and transaction processing has indeed arrived, and it represents a course change that everyone in the data center is coming to appreciate. Even the most hardened of NonStop system managers is aware of the need to integrate the data being generated by adjacent applications. Are databases truly in sync? Are networks and firewalls really functioning for all users? Are the processes running actually conforming to the SLAs in place? It’s all too hard to do without an additional fabric and, yes, the tipping point has been reached. The more I talked with the NonStop community following my presentation (given jointly with Justin Simonds, Master Technologist at Hewlett Packard Enterprise), the more I came to appreciate the full potential on offer with Striim and, from my perspective, it may very well prove to be the tipping point for every data center where NonStop systems reside.