Barriers Are Down: For NonStop, Striim Joins the Worlds of Stream Processing and Transaction Processing!
The spotlight is shining even more brightly on Big Data, with no evidence that the illumination is lessening in any way. Business is beginning to understand the value proposition of Big Data, and increasingly it’s not just the process of saving everything an enterprise encounters as it pursues its business interests that matters most, but rather the impact of processing that data in real time. Stream processing is adding the “special sauce” enterprises have been looking for, helping them identify trends and patterns as they develop and integrate knowledge of those trends and patterns into customer-facing solutions, so that they can respond rapidly to changes in customers’ behavior.
While this is well understood by IT in general, among the NonStop community progress towards embracing stream processing has been slow, mirroring the overall rate of change occurring across the NonStop user community as a whole. The good news is that NonStop users rarely leave the platform, but it’s also very apparent that, given the nature of the applications NonStop supports – the highly important, customer-facing solutions – change happens at what seems to be a glacial pace. Occupying highly visible positions within an enterprise results in a level of risk aversion that sometimes gets in the way of greater integration of NonStop with the rest of the enterprise, but that’s all about to change.
But first, what is stream processing and why is it being separated from the bigger story surrounding Big Data? According to a report published by InfoQ, an electronic publication aimed at facilitating “the spread of knowledge and innovation in professional software development,” stream processing and today’s real time transaction processing are joined at the hip. “Stream processing is designed to analyze and act on real-time streaming data, using ‘continuous queries’ (i.e. SQL-type queries that operate over time and buffer windows). Essential to stream processing is Streaming Analytics, or the ability to continuously calculate mathematical or statistical analytics on the fly within the stream. Stream processing solutions are designed to handle high volume in real time with a scalable, highly available and fault tolerant architecture. This enables analysis of data in motion.”
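The “continuous query” idea quoted above – an SQL-style aggregation that operates over a time or buffer window rather than a stored table – can be sketched in a few lines. The following is an illustrative Python sketch of a time-based sliding-window average, not Striim’s actual query engine; the class and method names are my own:

```python
from collections import deque
import time


class SlidingWindowAverage:
    """A 'continuous query' in miniature: maintain a running average over
    a time-based sliding window, recomputed on the fly as events arrive."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # (timestamp, value) pairs, oldest first
        self.total = 0.0

    def add(self, value, timestamp=None):
        """Ingest one event and evict anything that has aged out."""
        ts = timestamp if timestamp is not None else time.time()
        self.events.append((ts, value))
        self.total += value
        while self.events and ts - self.events[0][0] > self.window:
            _, old = self.events.popleft()
            self.total -= old

    def average(self):
        """The current analytic result, always up to date, no batch query."""
        return self.total / len(self.events) if self.events else 0.0
```

The point of the sketch is the shape of the computation: the data is never stored and re-queried; the analytic is maintained incrementally as events stream past, which is what makes high-volume, real-time analysis tractable.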
Just as importantly for the NonStop community, the explanation of stream processing goes even further, according to InfoQ in its post of Sep 10, 2014, “Real-Time Stream Processing as Game Changer in a Big Data World with Hadoop and Data Warehouse.” “In contrast to the traditional database model where data is first stored and indexed and then subsequently processed by queries, stream processing takes the inbound data while it is in flight, as it streams through the server. Stream processing also connects to external data sources, enabling applications to incorporate selected data into the application flow, or to update an external database with processed information.” It is this concept of “enabling applications to incorporate selected data into the application flow” that makes embracing stream processing attractive to all enterprises, even those exploiting the real time transaction processing capabilities of NonStop systems.
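Incorporating external data into the application flow amounts to enriching events while they are in flight, before any application sees them. A minimal Python sketch of the idea, where a hypothetical customers dictionary stands in for an external reference data source:

```python
# Hypothetical reference data, standing in for an external data source
# (in practice this might be a cached lookup against an external database).
CUSTOMERS = {
    "C001": {"tier": "gold"},
    "C002": {"tier": "silver"},
}


def enrich(event, reference=CUSTOMERS):
    """Merge reference data into an in-flight event so downstream
    applications receive it already enriched -- no store-then-query
    round trip against a database."""
    extra = reference.get(event.get("customer_id"), {})
    return {**event, **extra}
```

An inbound payment event tagged only with a customer id would leave this step carrying the customer’s tier as well, ready for a downstream fraud or SLA rule to act on.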
There are often discussions within the NonStop community about the relevance of Big Data and, by implication, stream processing. Oftentimes, Big Data is confused with really big SQL databases – a label that might have its place at some enterprises, but in general trivializes much of the value proposition of stream processing and indeed of Big Data. I see little evidence across the community of support for running Big Data directly on NonStop. What I do see is the first trembling footsteps being taken to pull data out of Big Data platforms, even as the arrival of stream processing attracts more vendors from among the real time transaction processing solutions community.
At a time when Martin Fink, EVP & HPE CTO, talks openly of the mainstreaming of NonStop, he also talks about strategy when it comes to what’s next for NonStop. “The mainstream world today tends to revolve around 4 things: Linux, Virtualization, Open Source (and) Clouds.” Furthermore, according to Fink, “I also believe that we can start to make NonStop available as a service. The mixed-mode NonStop SQL/MX engine could be offered as a service that delivers a database engine that’s currently unmatched in performance capabilities, scalability and resilience.” What’s missing? There’s no reference by Fink whatsoever to including Big Data in discussions about the mainstreaming of NonStop within HPE.
There are no barriers for applications running on NonStop to consider integration with Big Data – Big Data platforms, such as any popular Hadoop distribution, run external to NonStop. All it takes is a pipe and a process – think back to the days when the all-important consideration for the NonStop community was how to network with IBM mainframes via SNA. It required implementing SNA on NonStop and adding networking connectivity hardware so that connections could be established and data could be exchanged. Think of much the same need for stream processing, only this time with the energy HPE NonStop development is putting into the support of hybrid infrastructure, including the Yuma project with its APIs in support of InfiniBand, and the inclusion of stream processing becomes a far less risky proposition than deploying SNA all those years ago.
Striim is a stream processing platform coming from a development team very familiar with NonStop. Many of the members of this team are the same folks who brought GoldenGate to market, so pulling information from log files via change data capture (CDC) isn’t a difficult concept to grasp – think of stream processing much as we used to think of intermediary capture with trail-file processing. There are new process names, certainly, but the capture side is something the NonStop community should be very familiar with – detecting changes in behavior, after all, is simply a matter of correlating changes coming from many sources to determine commonality. Patterns and trends, essentially!
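Trail-file style capture can itself be sketched simply: replay change records from a saved position onward, collect the operations of interest, and return the new position so the next pass resumes where this one left off. The following Python sketch is purely illustrative – the record layout and field names are assumptions, not Striim’s or GoldenGate’s actual trail format:

```python
import json


def read_changes(trail_lines, last_position=0):
    """Minimal change-data-capture sketch: scan a trail (log) file from a
    saved position, collect insert/update/delete records, and return the
    new position so a subsequent pass picks up only fresh changes."""
    changes = []
    for line in trail_lines[last_position:]:
        record = json.loads(line)
        # Forward only the data-changing operations; skip markers such
        # as commit records in this simplified model.
        if record.get("op") in ("insert", "update", "delete"):
            changes.append(record)
    return changes, len(trail_lines)
```

Persisting the returned position between passes is what gives trail-file processing its restartability – the same property that made this style of capture a natural fit for NonStop replication products in the first place.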
NonStop will not be immune to stream processing, just as it’s highly unlikely that any Big Data platform will find its way onto NonStop – nor is that necessary, given today’s IT transformation to hybrid architectures. Furthermore, IT will not be immune to feeding the results of stream processing to applications running on NonStop – advanced system and network monitoring, fraud detection, SLA compliance and database replication synchronization all come to mind. And with the team at Striim, we have access to folks who know as much about NonStop as anyone in the NonStop community. Whatever barriers we thought were preventing us from further integrating the worlds of stream processing and transaction processing, all in real time, are definitely down, and for that, the NonStop community wins yet again.