In my previous post I focused on perseverance, and in the NonStop community this is a common trait of all who support the presence of NonStop systems in their data centers. It wasn’t the only story line in that post – the change of the company’s and product’s name to Striim, pronounced “stream,” picks up on the growing awareness that data analytics isn’t just for post-processing but rather lends itself to analysis of data as it passes by, in real time. Increasingly, defining your value proposition is connected to the images that form simply from seeing a company or product name in print, and poor choices can lead to misunderstandings. But a stream? There is little chance of ambiguity with a company and product name as easy to grasp as Striim!
When Oracle was founded in 1977 it was called Software Development Laboratories – today, the company shares its name with its premier product offering, Oracle. When GoldenGate initially launched its data replication and integration product it was simply called Extractor/Replicator, but it wasn’t too long before the product shared the name of the company, GoldenGate. Much of this reflects the desire not to confuse customers and prospects, even as it adds weight to who you are and what you do – whether you heard about the product first or saw the company name in a publication, you will end up at the same place, no matter how you start your search.
More importantly, however, technology keeps changing and, along with it, the imagery we use to support such changes, even as we use graphical representations to simplify very complex matters. For decades, and long before the expression was reused in a more modern context, whenever a cloud was depicted it represented a network – whether X.25, OSI, SNA, etc. Inside the network cloud could be any mix of proprietary and open technologies, along with links spanning the globe. It was just a cloud, and it could simplify the depiction of any computer complex. Likewise, stylized lightning bolts connected to the cloud represented WAN connections to systems and/or terminals outside the primary network depicted inside the cloud.
In other words, such graphics convey very complex messages, and so it is today with Big Data and Data Analytics. Data warehouses and data stores have seen a new term appear of late: data lakes. Rather than mapping incoming data to fit the storage and retrieval model of any particular data store, with a lake you can save data in its original format. In so doing, according to data lake proponents, there is no potential data loss as a result of transformation, making the data all the more valuable to the data scientists, and indeed the data processes, performing analysis on it. Data lakes have their critics already, naturally, but the image of a lake is hard to ignore, even as we can visualize lakes of different sizes supporting many different populations of fish.
Others have described the body of water as a reservoir – in particular, a man-made reservoir – while still others have gone so far as to describe it as the catchment area for all data entering the organization, implying that not all the water makes it to the data lake. Furthermore, as a model it works on the premise that you save the water in a lake and then analyze all that the body of water contains – and for many data centers looking to capture insight from their data more quickly, this is almost akin to batch processing. Increasingly, the need to determine value from data as it arrives dictates that analysis be performed in real time. This is what data streaming addresses: “Rather than diverting the flow to store and then analyze, with streams, analysis occurs as the information is flowing in real- or near-real time. The analogy here is that working in data streams is much like panning for gold.”
This comes from the online publication SiliconAngle, in a post of April 17, 2014, Diving into Big Data: Data lakes vs. data streams. “The primary value in this approach is that information can be accessed quickly and insights can be gleaned in a rapid fashion,” the post explained. “Given the dynamic nature of the current environment for enterprises, it is often imperative that anomaly information or real time trends can be understood quickly so that appropriate action can be taken before they significantly impact service or revenue.” Yes, a flowing stream is every bit as easy to visualize as a lake, even as it conveys the message of constant change with different rates of movement. And yes, too, anything out of the ordinary appearing in the data stream needs to be reported in real time for its true value to be leveraged.
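The lake-versus-stream distinction the post draws can be sketched in a few lines of code. This is a purely hypothetical illustration (none of these functions come from Striim or any real product): the batch or “lake” approach stores everything before computing, while the streaming approach keeps only a running summary and flags anomalies the moment they pass by.

```python
# Hypothetical sketch only - illustrative names, not any vendor's API.

# Batch / "data lake" style: divert the flow into a store, analyze later.
def batch_average(events):
    stored = list(events)              # accumulate the whole lake first
    return sum(stored) / len(stored)   # then run the analysis over it

# Streaming style: examine each event as it arrives, keeping only a
# running mean/variance (Welford's online algorithm) and reporting
# anything out of the ordinary in real time.
def stream_monitor(events, threshold=3.0):
    count, mean, m2 = 0, 0.0, 0.0
    for x in events:
        # Check the new value against what the stream has shown so far.
        if count > 1:
            std = (m2 / (count - 1)) ** 0.5
            if std > 0 and abs(x - mean) > threshold * std:
                yield ("anomaly", x)   # reported the moment it appears
        # Update the running summary; no raw history is retained.
        count += 1
        delta = x - mean
        mean += delta / count
        m2 += delta * (x - mean)
```

The point of the contrast is memory and latency: the batch function cannot answer anything until the whole lake is filled, while the stream monitor holds three numbers and can raise an alert mid-flow.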
NonStop applications and solutions are typically real-time transaction processing implementations. For this reason the move to data streams is significant, and the image of panning for gold is an easy one to grasp. Renaming the company and the product to Striim has come at a time when many within the NonStop community are beginning to appreciate the value that comes from the intersection of analysis with real time. Drinking from the stream is just as easy to understand, and its secondary message – that it is life-sustaining – is equally hard to miss. As NonStop users know all too well, changes in customer behavior come with consequences, and being the last to know of any change carries a price NonStop users are increasingly not prepared to pay.