HPE NonStop Community Prioritizes Bi-Directional Data Movement to the Cloud

As 2019 comes to a close, many of the posts and news releases coming from Striim have focused on bi-directional data movement support between traditional databases and the cloud. For the HPE NonStop community this is becoming a topic of conversation, as in many cases it is a compelling alternative to the other options available.

While many NonStop users rely on multiple, geographically separated NonStop deployments to meet their business continuity requirements, having the option to back up mission-critical data to the cloud has its merits. The most compelling is the attraction of a lower-cost option than more traditional approaches to business continuity through global distribution, as is the case with some of the larger financial institutions.

There are instances too where implementing hybrid configurations involving NonStop and private, on-premises clouds necessitates the movement of data between the cloud and NonStop – particularly with the increased interest in NonStop SQL supporting database as a service (DBaaS) for applications running in the cloud.

Having the opportunity to move the data in both directions allows for a risk-reduced operation, whereby data can be moved in phases knowing that, should something fail in the process, the application can continue to run uninterrupted. This lends itself to satisfying the needs of those NonStop users who are running mission-critical applications.

In her post to the Striim blog, On-Premises-to-Cloud Migration: How to Minimize the Risks, Irem Radzik writes about the inherent value that comes with Striim having embraced a change data capture (CDC) model to better ensure consistency between source and target databases. This is of great importance to the NonStop community and has been at the heart of business continuity implementations for more than a decade. According to Radzik:

“Here comes the good news that I love sharing: Today, newer, more sophisticated streaming data integration with change data capture technology minimizes disruptions and risks mentioned earlier. This solution combines initial batch load with real-time change data capture (CDC) and delivery capabilities.

“As the system performs the bulk load, the CDC component collects the changes in real time as they occur. As soon as the initial load is complete, the system applies the changes to the target environment to maintain the legacy and cloud database consistent.”
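To make the mechanics concrete, here is a minimal sketch of the pattern Radzik describes – start capturing changes before the bulk load begins, run the initial load, then replay the buffered changes. The source and target objects and their methods are hypothetical stand-ins for illustration, not Striim APIs:

```python
import queue
import threading

def migrate(source, target):
    """Initial batch load plus CDC, per the pattern described above.
    `source` and `target` are hypothetical adapters, not Striim APIs."""
    changes = queue.Queue()

    # 1. Begin capturing changes from the source's transaction log
    #    BEFORE the bulk load starts, so no update is ever missed.
    def capture():
        for change in source.stream_changes():
            changes.put(change)
    threading.Thread(target=capture, daemon=True).start()

    # 2. Run the initial batch load while changes accumulate.
    for row in source.read_all():
        target.insert(row)

    # 3. Replay buffered (and ongoing) changes to bring the target
    #    into sync with the source, and keep it there.
    while True:
        target.apply(changes.get())  # insert / update / delete replay
```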

Bi-directional data replication featuring CDC was the theme of the article News from Gartner and news from Striim; one objective – (bi-directional) movement to the cloud!, published in the December 2019 issue of NonStop Insider. In this article, Alok Pareek, Co-Founder and EVP of Product at Striim, is quoted:

“Responding to requests from marquee customers who use Striim to enable their hybrid cloud infrastructure, our engineering team has delivered a robust bi-directional data replication offering. These enterprise customers finally have a next-generation, zero-downtime, zero-data-loss solution for online phased database migrations, allowing them to seamlessly run their new cloud environments in parallel with the legacy systems for a gradual transition for their end users.”

For now, however, the extent of the opportunity for the NonStop community to leverage this capability (available in Striim release 3.9.7) requires additional input from NonStop users. As was noted in the article in NonStop Insider, when it comes to the NonStop community:

“Striim is still canvassing the NonStop community members for further feedback about their own use-case potential as Striim is to prioritize support of bidirectional NonStop SQL to / from Cloud based solely on these NonStop users’ requirements.”

Should you have any trouble at all with the above hyperlink to the article published in the digital publication NonStop Insider, you can always cut and paste this link into your browser:

https://www.nonstopinsider.com/uncategorised/news-from-gartner-and-news-from-striim-one-objective-bi-directional-movement-to-the-cloud/

For now, should the functionality of this latest release of Striim be of interest to you as a NonStop user, and you would like to know more about the capabilities on offer with Striim Release 3.9.7, please email us or give us a call. And make sure you check out our web site for all the news as it breaks at https://www.striim.com/.

What’s New in Striim 3.9.5

What’s New in Striim 3.9.5: More Cloud Integrations; Greater On-Prem Extensibility; Enhanced Manageability

Striim’s development team has been busy, and launched a new release of the platform, Striim 3.9.5, last week. The goal of the release was to enhance the platform’s manageability while boosting its extensibility, both on-premises and in the cloud.

I’d like to give you a quick overview of the new features, starting with the expanded cloud integration capabilities.

  • Striim 3.9.5 now offers direct writers for both Azure Data Lake Storage Gen 1 and Gen 2. This capability allows businesses to stream real-time, pre-processed data to their Azure data lake solutions from enterprise databases, log files, messaging systems such as Kafka, Hadoop, NoSQL, and sensors, deployed on-prem or in the cloud.
  • Striim’s support for Google Pub/Sub is now improved with a direct writer. Google Pub/Sub serves as a messaging service for GCP services and applications. Rapidly building real-time data pipelines into Google Pub/Sub from existing on-prem or cloud sources allows businesses to seamlessly adopt GCP for their critical business operations and achieve the maximum benefit from their cloud solutions.
  • Striim has been providing streaming data integration to Google BigQuery since 2016. With this release, Striim supports additional BigQuery functionality such as SQL MERGE (illustrated in the sketch following this list).
  • Similarly, the new release brings enhancements to Striim’s existing Azure Event Hubs Writer and Amazon Redshift Writer to simplify development and management.
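As an illustration of what SQL MERGE support makes possible on the BigQuery side, here is a minimal sketch using Google’s own Python client. The project, dataset and table names are placeholders, and this shows the BigQuery operation itself rather than Striim’s writer, which issues comparable statements so that CDC inserts, updates and deletes land correctly in the target:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # picks up default GCP credentials

# Upsert staged change records into the target table.
# Table names are placeholders for illustration only.
merge_sql = """
MERGE `my_project.my_dataset.accounts` AS t
USING `my_project.my_dataset.accounts_staging` AS s
ON t.account_id = s.account_id
WHEN MATCHED THEN
  UPDATE SET balance = s.balance, updated_at = s.updated_at
WHEN NOT MATCHED THEN
  INSERT (account_id, balance, updated_at)
  VALUES (s.account_id, s.balance, s.updated_at)
"""

client.query(merge_sql).result()  # block until the job completes
```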

In addition to cloud targets, Striim has also expanded its heterogeneous sources and destinations for on-premises environments. The 3.9.5 release includes:

  • Writing to and reading from Apache Kafka version 2.1 (a minimal sketch of Kafka reads and writes follows this list)
  • Real-time data delivery to HPE NonStop SQL/MX
  • Support for compressed data when reading from GoldenGate Trail Files
  • Support for NCLOB columns in log-based change data capture from Oracle databases
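For readers who want to picture what reading from and writing to Kafka involves, here is a minimal sketch using the kafka-python client. The broker address and topic name are placeholders; Striim’s own Kafka reader and writer are configured within the platform rather than coded this way:

```python
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

BROKERS = ["localhost:9092"]  # placeholder broker list
TOPIC = "txn-events"          # placeholder topic name

# Write: publish a change record to the topic.
producer = KafkaProducer(bootstrap_servers=BROKERS)
producer.send(TOPIC, b'{"op": "INSERT", "table": "accounts", "id": 42}')
producer.flush()

# Read: consume records from the beginning of the topic.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKERS,
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating once the topic is drained
)
for record in consumer:
    print(record.offset, record.value)
```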

Following on from the 3.9 release, Striim 3.9.5 also adds a few new features to improve Striim’s ease of use and manageability:

  • Striim’s users can now organize their applications with user-defined groups and see deployment status with color-coded indicators on the UI. This feature increases productivity, especially when there are hundreds of Striim applications running or in the process of being deployed, as is the case for many of our customers.
  • New recovery status indicators in Striim 3.9.5 allow users to track when an application is in replay mode for recovery versus in forward-processing mode once recovery is complete.
  • Striim’s application management API now allows resuming a crashed application.
  • Last but not least, Striim 3.9.5 offers easier and more detailed monitoring of open transactions in Oracle database sources.

For a deeper dive into the new features in Striim 3.9.5, please request a customized demo. If you would like to check out any of these features for yourself, we invite you to download a free trial.

Striim CTO Steve Wilkes in Denver; Why streaming analytics is a must!


This past week I made the trip down the major Colorado arterial highway that runs north–south from Wyoming to New Mexico. Striim was giving a presentation in the Denver Technical Center and Striim CTO, Steve Wilkes, was the keynote presenter. Organized by Radiant Advisors, an independent research and advisory firm that “delivers practical, innovative research and thought-leadership to transform today’s organizations into tomorrow’s data-centric industry leaders,” this was too close to my office to miss. I have known Steve for more than a decade, and hearing him present always proves stimulating; as much as he has presented of late, he confided to me that his enthusiasm and interaction with the audience made the fast trip down to Denver worthwhile.

The presentation centered on “Modernizing Data Platforms with Streaming Pipelines,” which Steve then expanded on, highlighting why “Streaming analytics and machine learning are essential for (such things as) Cybersecurity, AI, IoT, and much more!” However, what struck me most was the material Steve used to highlight just how many open source projects are out there and how many layers of middleware are involved simply to ingest, process and then highlight important information gleaned from the multitude of transactions in flight at any point in time.

For instance, noted Steve, even with all of the open source projects out there, building a streaming analytics platform from scratch is just too daunting a project. For an effective streaming engine to provide value you have to think about the choices you have for distributed high-speed message infrastructure, where the options include ActiveMQ, Kafka, etc. Complementing your choice of messaging infrastructure, you then need to select an appropriate in-memory data grid, a role where products already familiar to the NonStop community, such as Redis, provide an option.

From here it gets really messy for the average systems programmer – what about data collection, processing and analytics and, yes, final data delivery? Products like Kafka might be reused, but then again you need to look at Cassandra and Hive, as well as Flink, Storm, NiFi and Logstash. Finally, you may want to store the results of the capture / analytics / delivery process, and your choices here will include offerings like Cassandra and HBase. Many of these are Apache projects, of course, but even with all the work Apache has fostered under its umbrella, it’s only the basics. Still more has to be done once you have walked through all your options for building your own streaming analytics platform.

A topic even more familiar to the NonStop community, and one addressed by the NonStop platform out-of-the-box, involves developing what Steve referred to as the “glue-code” – all that is needed to cluster, scale, be reliable and secure and, yes, have all the associated monitoring and management any crucial subsystem is expected to have. Even with this architecture fully embraced, you still need to add the human element – the UI. Both graphical and command line interfaces are important, as is a comprehensive set of dashboards to display the all-important in-memory analytics produced by your streaming analytics platform. Getting the picture? You can’t simply say you are going to build out the equivalent platform solely with Kafka, as so many pieces are missing once you get past the rudimentary components on offer today with Kafka. And this is where Striim provides businesses with value – it comes with all of the above addressed via a mix of open source and all the integration, or glue, components included.
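To make the “glue-code” point concrete, consider how much scaffolding even one trivial stage of a hand-built pipeline needs. The sketch below, using nothing but the Python standard library, implements a single 60-second sliding-window average over simulated events; the clustering, recovery, security, monitoring and delivery glue would all still remain to be written:

```python
import time
from collections import deque

WINDOW_SECONDS = 60

def windowed_average(events):
    """Maintain a sliding window over (timestamp, value) events and
    emit the rolling average - one tiny piece of a streaming pipeline."""
    window = deque()
    total = 0.0
    for ts, value in events:
        window.append((ts, value))
        total += value
        # Evict events that have aged out of the window.
        while window and window[0][0] < ts - WINDOW_SECONDS:
            _, old_value = window.popleft()
            total -= old_value
        yield ts, total / len(window)

# Simulated feed; in a real deployment this arrives via the message
# infrastructure (Kafka, ActiveMQ, ...) chosen earlier.
feed = ((time.time() + i, float(i % 10)) for i in range(100))
for ts, avg in windowed_average(feed):
    pass  # route to dashboards, alerts or storage - yet more glue
```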

In the article just published in the October 2017 issue of NonStop Insider, Out of time but not out of options, we make the observation that what Striim brings to the NonStop community is truly remarkable. Unfortunately, even as the core message of HPE revolves around “Simplify Hybrid IT,” it would seem that HPE really isn’t getting the message about the importance of analytics, particularly when it comes to transaction processing and the actionable data generated in real time. Streaming analytics platforms do not require a rip-and-replace attitude when it comes to implementation, as they truly can be bolted onto what exists today, which should appeal to HPE and to the NonStop community. On the other hand, HPE’s own IT department is already using Striim, so perhaps the message about the value proposition of Striim is beginning to percolate within HPE.

At the upcoming NonStop Technical Boot Camp there will be a very strong presence from the Striim team. The customer base has expanded considerably since the last event, and the scope of the business problems being addressed by Striim is remarkable. While Steve gave an informative presentation on Striim in Denver this week, should you want to know more about just how well Striim can integrate with your NonStop solutions today, and how its Change Data Capture elements allow it to be phased into your production environment without causing any disruption to your traditional approaches to transaction processing, then stop by the Striim team booth at the Boot Camp exhibition. We look forward to seeing you there!

Striim Looks to Leverage New NonStop APIs in Support of Hybrid Infrastructures: Part 1

Itineraries are being developed and current workloads reevaluated. Yes, it’s the beginning of the events season for the NonStop community and it’s a time of dramatic change for even the most seasoned NonStop supporter. HPE has two primary focus areas that it aggressively promotes at every opportunity: Hybrid IT and Edge / IoT. These two themes dominate all of the discussions, and those who have been to an HPE event of late would be hard pressed to leave with any doubts about HPE’s commitment to both.

If you have missed it, I recently posted to the LinkedIn blog, Pulse, an article on Striim. You can find it by following this link: Striim looks to leverage new NonStop APIs in support of hybrid infrastructures. This post picks up on a topic I first addressed in an earlier post to the Striim blog: The Long Term Future of NonStop That Strikes a Chord With Striim!

In these posts I highlighted how the future for NonStop is certainly taking a number of unexpected turns as its value appeals to new verticals that must run continuously, without any downtime. I also pointed out that maybe for HPE, having NonStop X and vNonStop (now officially called Virtualized NonStop) will usher in a new era of expansion and growth for NonStop. The more important consideration for the NonStop community, as well as those considering the deployment of Striim, is the understanding that there are real time actions that need to be taken while data is fresh and, oftentimes, in flight! And there is little else to match NonStop when it comes to supporting the real time world – without ever crashing!

In the paid post by NASDAQ of February 10, 2017, to the CNBC.com web site, The future of fintech: Top technology trends for market infrastructure providers, there was a lot to read about data. “Regulation, market fragmentation and new types of risk have catalyzed change in the financial services industry over the last few years. Automation and standardization has changed the way customers interact with market infrastructure providers, leading to an explosion in data volumes.” All the while avoiding making headlines as a result of a crash – something that cannot be said for some financial exchanges in recent times.

Not surprisingly, NASDAQ made the observation that, “Being able to mine data, normalize it, update analytics in real time and present it in a consolidated view is a source of competitive advantage. Until now systems were rules-based, but we are now seeing a seismic change in the approach which will progressively shift toward using machine learning and artificial intelligence to eliminate bias in the analysis and discover new patterns in the data.”

The highlight of the event season, and the one many NonStop supporters will be attending, will be the early June, 2017, HPE Discover conference to be held in Las Vegas. NonStop doesn’t always feature as prominently as the NonStop community would like but, nevertheless, when it comes to running mission-critical applications 24 x 7, NonStop has few peers within the HPE product portfolio. What these attendees will be watching closely is information about NonStop participation in both of the key focus areas of Hybrid IT and Edge/IoT – the former being an area where, even today, NonStop is being shipped as a hybrid embodying both NonStop and Linux – NonStop for compute, Linux for access to networks and storage – integrated and tested in the factory before ever making it onto a data center floor.

Taking this one step further, and acknowledging where NonStop users are headed – building their own configurations – combining NonStop systems with Linux systems has become a whole lot easier with the introduction by HPE NonStop development of NSADI. NSADI is an API that simplifies memory-to-memory data transfers, supporting each system directly accessing the memory of the other without any involvement of the operating system. Depending on customer take-up, the developers of Striim will take a look at this, as it does offer benefits for users who wish to deploy Striim on an adjacent Linux system even as the data accumulates on the NonStop system.

As for the latter, Edge / IoT – a topic for another time – where NonStop may make an appearance, Striim can add considerable value. If you missed it, at this year’s Hannover Fair, Striim announced that it was “showcasing their solution for enabling the real-time, continuous exchange and analysis of IoT data across all levels of an IoT infrastructure, powered by Microsoft Azure.” For more about this, check out the Striim news release of April 23, 2017, Striim addresses core challenges in leveraging IoT for Industrie 4.0.

As NonStop continues to make unexpected turns – well, at least for some members of the NonStop community they may not be all that big a surprise – there is always opportunity, and for vendors like Striim all that they see are new doors opening. Yes, “being able to mine data, normalize it, update analytics in real time and present it in a consolidated view is a source of competitive advantage” isn’t a message that’s falling on deaf ears among the team at Striim! Nor is the message that the future of NonStop will likely see it playing an important role in both the Hybrid IT and Edge/IoT initiatives – a situation that will foster even greater interest in NonStop as it maintains its leadership role in supporting mission critical applications.

The Long Term Future of NonStop That Strikes a Chord With Striim!

It was only a few days ago that I sat through a presentation about the new NonStop systems by a senior manager from the NonStop development team. Covered in the presentation were NonStop running on physical systems as well as virtual systems – an update that is being given to everyone in the NonStop community and one that HPE product management will be providing to the Connect community shortly. At one point in the presentation, there was a PowerPoint slide with the heading “Long term future of NonStop – building on NonStop X, vNonStop and DBaaS to reach new markets.” Of course, whenever I see any heading that associates NonStop with new markets, the presentation has my complete, undivided attention. What was included in the slide, though, was material I had not previously seen in presentations featuring NonStop systems.

Top of the list? NonStop in support of Blockchain, and it became evident that the architecture of NonStop would be extremely beneficial to those considering blockchain in support of currency exchanges, not to mention access to Bitcoin markets and even support for Bitcoin POS / ATM devices. When a country as conservative as Switzerland announces that its national railway company, Swiss Federal Railways (SBB), will “allow Swiss customers (to be) able to buy bitcoins from over 1000 SBB ticket machines,” as it did effective November 11 last year, then there’s going to be a lot more attention given to the underlying blockchain technology. Seeing NonStop join the discussion, as HPE indicated was the case, certainly surprised those who had joined me for the presentation.

Other items included in the list were IoT and support for the utility market. While I have written numerous posts about NonStop and IoT, it’s now a fixture in HPE presentations featuring NonStop. As far as HPE is concerned, NonStop could drive “emergency response, home monitoring (and) smart devices” from applications running on NonStop systems. As for the utility market, NonStop could drive “power stations, aqueducts, canals, dams and even satellites.” These should all resonate well with Striim, as it is with applications in support of opportunities like these – where massive amounts of data are involved and where catching any deviation from normal may easily be missed if left solely to human oversight – that Striim truly shines.

As I looked at what I saw coming from HPE as far as the NonStop marketplace goes, it’s hard to avoid resorting to clichés and truisms. After all, HPE has invested a lot in NonStop over the past couple of years and we expect to see its latest addition to the NonStop family of systems – the new NonStop X system – shipping in volume. Probably just as importantly, Striim has now been selected by NonStop users for deployment in their enterprises and, while the types of use-case scenarios have varied widely, this too only adds to the value proposition of Striim. Big data, stream analytics, IoT – maybe not all items immediately associated with NonStop and yet, where HPE is taking NonStop in 2017, all apply, and it’s a positive sign for the NonStop community that early adoption of Striim among NonStop users has commenced.

The above commentary comes from the latest article featured in the January 2017 issue of NonStop Insider, Striim rocks the awards season. If as yet you have not become a subscriber to this digital publication, you should take a look at the subscription page, as each month there are updates featuring Striim. There will be many use-case scenarios where Striim provides value, and already the early adoption among NonStop users deploying Striim has been in support of traditional transaction processing applications. But when you look at Bitcoin, ticket machines and the Swiss Railways, you quickly come to appreciate that it’s just an extension to what has been a very traditional, indeed quite staid, application.

“The new machines allow contactless payments using Postfinance-Card, which is a card for postal accounts, American Express, Mastercard’s PayPass, and VISA’s payWave credit cards. Change for up to 100 francs is available and large notes are accepted up to 200 francs/euros,” according to SBB. But behind the scenes will be blockchain technology in support of “the ledger” and, as just a different type of database, it will likely need backup and be subject to more traditional SLAs, all of which will need some form of real time monitoring – one of the more popular aspects of current Striim deployments. That this can all run on NonStop, whether physical or virtual, only adds to the appeal, and I can only imagine how big a chord this strikes with Striim.

The future for NonStop is certainly taking a number of unexpected turns as its value appeals to new verticals that must run continuously, without any downtime. Some of these verticals will be new for NonStop, while others will just see some of their more traditional capabilities extended to include new technologies and architectures. And as this future for NonStop unfolds, there’s nothing standing in the way of even greater opportunities for Striim, as all involve massive amounts of data needing to be collected and analyzed in real time. Maybe, with NonStop X and vNonStop, the new NonStop systems will usher in the era of expansion all within the NonStop community have anticipated for some time and, with the arrival of this era, Striim will encounter a growing market of its own, supporting this future for NonStop!

Analytics, On the Edge, and NonStop!

It may have gone unnoticed by many viewers of the popular television program, Elementary, when its main character, Sherlock Holmes, heard the buzzing of a mosquito when it wasn’t mosquito season. The buzzing indeed was artificial and we were led to believe that a military subcontractor had reduced an intelligent “thing” performing visual reconnaissance to the size of a simple mosquito. It may have also gone unnoticed by some when more recently, in the television series, Madam Secretary, a SEAL team involved in an extraction mission sent in a slightly bigger insect that was also artificial and yet, carried with it a camera and comms package so that the SEAL team could locate the hostage.

Cozmo

Far-fetched? Preposterous? All quite fictional but somehow unnerving, as we continue to become aware of just how many things today are equipped with sensors and even stand-alone intelligence. Perhaps it is because this is the festive season, when gift exchanges are on the minds of many of us, that I was reminded of both these fictional portrayals of new-age intelligence-gathering things. However, reading the December 12, 2016, issue of Time Magazine I came across the article, Artificial Intelligence invades the home … in toys.

As the picture above depicts, these playful toys called Cozmo, brought to us by San Francisco startup Anki, have been reduced to the size of a small ball, but they are packed with remarkable technology. “It doesn’t like to stay put very long. Roused from slumber, the little robot’s face illuminates, and it begins zooming around the table in front of me. A moment later, it notices I’m watching and turns to greet me, saying my name with a computerized chirp.”

These toys also differ “In the way they interact with the people and objects around them, changing their behavior over time as their software ‘learns.’ Right out of the box, cameras and sensors allow Cozmo to recognize individuals, avoid falls or bumping into obstacles and play simple games like keep-away.” Furthermore, “‘Every input trigger, no matter what happens to him, will influence his future behavior,’ says Hanns Tappeiner, Anki’s president.”  Suddenly, the Internet of Things (IoT) takes on a whole new dimension – what may start out as a toy has a lot of industrial applications that I know have already started showing up in everything from warehouses to hospitals.

Smart things that can fit into the palm of our hands, capable of learning and interacting with us, tend to suggest that even smarter devices have already made it onto the world stage – so perhaps what may have been considered preposterous just a year or two ago actually exists. Coming at a time when many of us have taped over the web cam of our laptop, I have to wonder what else may be watching me that I innocently brought into the office. From tradeshows, or as gifts awarded for one reason or another – as we accept presents this festive season, I wonder how many of us will be truly scrutinizing everything mechanical as we rip it out of its wrapping paper.

In a paper focused on his Top 20 Predictions for 2016 that will be published shortly, CTO and Cofounder at Striim, Steve Wilkes, notes how “IoT Platforms will grow in strength and capability incorporating device registration, management and communication features as well as integration, analytics and machine learning.” And that, “Simple IoT use cases such as the real-time tracking of the movement of people or packages via geolocation and time windowing will become prevalent across healthcare, travel, manufacturing and logistics.”

Real-time tracking of the movement of people – well, it looks like we have already begun pushing the boundaries of this frontier. IoT will usher us into another era when it comes to how we build out our infrastructure – there will be things, there will be centers, and in the middle there will be edges. And for the NonStop community, it’s a given that there will be a NonStop presence in the center, but what about the edge? In his paper on predictions, Steve also notes how, to ensure timely responses, “Reliability and security concerns will push real-time analytics to edge locations for IoT. This will become evident through connected cars, home IoT hubs, retail store-based gateways and other localized technology. Anonymized data will be pushed to the cloud for deeper analytics.”

In a post to the Striim blog of December 1, 2016, Marketing VP, Katherine Rincon, wrote that, “The edge processing power of the Striim platform running on a gateway based on the Intel® X86 architecture, offers efficiency and expediency to the development and deployment of IoT applications. With the power of the Intel® X86 technology, the Striim agent can seamlessly ingest, process and analyze data in real time, generating insights in milliseconds.” Furthermore, “IoT applications do not exist in a vacuum. As such, IoT solutions not only need to incorporate a variety of IoT devices, but must also be integrated with the existing enterprise infrastructure.”

When it comes to edge products and the need for them to support analytics capable of operating in real time, such as Striim, the appearance of gateways based on x86 throws open the potential for running NonStop applications on the edge. While much of the arriving sensor information may be discarded following initial analysis, there will be a category of sensor information that simply cannot be overlooked, no matter what. A critical message from a pacemaker, an alarm from an overturned vehicle, a mechanical malfunction in an oil refinery or nuclear power station – many events just have to make it into the center, and the fault tolerance we have always appreciated from NonStop brings the NonStop system back into play: today, in its new virtual guise as vNonStop, it can run on any x86 server users elect to deploy.

Uncovering Gems – Striim, Interacting With NonStop Solutions, Leads to New Business Opportunities!

Living along Colorado’s Front Range, when it comes to the weather you come to expect anything to happen. And at any time! When forecasters first talked of temperatures climbing into the low 80s, many of us simply shrugged it off, and yet businesses dependent on much cooler temperatures are beginning to suffer. So much for the early arrival of new skis and bindings! So much for the early arrival of heavy clothing! There has been snow high up in the mountains along the continental divide, but that was a while ago.

Models developed to help weather forecasters take into consideration past events and oftentimes overemphasize what happened decades ago. What has transpired in recent times is often discounted and, again, business suffers revenue shortfalls. On the other hand, the local farmers are bringing in yet another harvest as crops continue to flourish. When I moved to Boulder very few homes were air-conditioned – my first home up on the Front Range certainly wasn’t, until I called our friendly Sears store and had air conditioning installed for the comfort of my visiting parents – but today, it’s difficult to sell a home that isn’t air-conditioned. In colorful Colorado! While the pundits will put it down as just further evidence of global warming, and they may be right after all, it would be nice to know if we can plan on experiencing more warm weather or if it’s just another short-lived anomaly.

In the posts and commentaries I wrote last month featuring Striim and the NonStop community, I wrote about how the catalysts for change are as varied as they are unpredictable. With this in mind, I wrote too of how there is a difference between Business Intelligence (BI) and Business Analytics (BA): BI is about what is going on behind you – in other words, mining historical data – whereas BA is all about what is going on in front of you. What is about to happen is proving to be of more value to those IT systems processing transactions in real time, which is where you will find the majority of NonStop systems deployed.

Quoting Dipak Bhudia, Chief Product Architect, Clear Analytics, “Business Intelligence (BI) is essentially a noun, in that it is an umbrella term of the overall scope of acquiring, persisting, warehousing, analyzing and reporting insights along with everything else in its periphery. Business Analytics on the other hand is more of a verb, the act of discovering insights using any tooling or services at your disposal.” No better example of any of this, what’s in front versus what’s behind, can be provided than weather predictions. Get it all wrong and you may waste so many resources unnecessarily moving populations away from storms and subsequently wearing the consequences.

The time has come when the most available, and simply the best, transaction processing system needs to be better integrated with modern business analytic engines. Yes, it’s about spotting “information gems” as they appear in the constant ebb and flow of data streams passing by – the spoils from victory today are going to those businesses who respond in a more timely fashion to opportunities. This has been the major premise influencing the take-up of BA and something frequently referenced in the posts to this blog. However, what’s changing for the NonStop community is the reinstatement of their importance to business now that HPE is demonstrating a greater commitment to NonStop. It was difficult to justify further investments in future solutions on NonStop if there simply was no future for NonStop!

“What we are witnessing first hand is an elevated interest in better integrating the world of BA with NonStop solutions,” said Striim Cofounder and EVP, Sami Akbay. “It’s become evident of late that plans to scale back on NonStop investments may have been overstated, even as the expectation was that CIOs would be more likely to invest in open platforms than NonStop, resulting in many more queries coming to us from businesses heavily reliant on 24 X 7 transaction processing.” This isn’t entirely unexpected as it appears HPE is listening more closely to its big customers, and the porting of the NonStop operating system and supporting stack to the Intel x86 architecture may prove to be just the beginning as HPE continues to explore additional ways to ensure the price points for NonStop systems continue to come down.

“It’s happening at a timely point for us,” added Akbay. “Shortly, we will be exhibiting at the annual NonStop Technical Boot Camp that is to be held in San Jose and the expectations are that several hundred of the biggest users in the world of transaction processing solutions will be in attendance and this is a marketplace where Striim has become well known. We are expecting to be engaged in numerous conversations as to how best Striim can help their business detect and then respond to those gems uncovered in data streams open to solutions running on NonStop.” Indeed, Boot Camp is always the premier place to find out more about new features and products and to converse with those individuals closest to the technology.

Boot Camp is an event that I attend with a measure of regularity, and one that proffers me opportunities to talk to NonStop community leaders, so I will be more than attentive to all that transpires with respect to NonStop and Big Data. HPE sees Big Data as a key influence on its vision for hybrid infrastructure as it weaves together the marriage between traditional IT and cloud computing. It’s a compelling story and reflects the growing trend within business to take an incremental, baby-steps approach to embracing clouds.

The very fact that NonStop is a part of this vision and that HPE continues to demonstrate its greater commitment to NonStop hasn’t escaped the attention of many CIOs. Likewise, the fact that Striim is targeting NonStop hasn’t escaped the attention of the NonStop community so it will be highly enlightening to see what transpires at Boot Camp, and no matter the outcome, there will certainly be a lot more written to this blog in the coming weeks. I look forward to hearing more from everyone in the NonStop community!

Business Intelligence Versus Business Analytics – Striim Helps You Get Out in Front of What’s Happening Today!

The catalysts for change are as varied as they are unpredictable. What may be driving one vendor to pursue an opportunity may generate little interest among other vendors and what might be an adequate business model for participants in one marketplace may not be a model for others in the same marketplace. Take for instance the news that came out of Ford motor company just recently. According to a September 15, 2016, article in the Wall Street Journal (WSJ), “It wants investors to view the auto maker more like a Silicon Valley company.”

Whenever there is disruption within an industry there is always a mad scramble as industry stalwarts play catch up. And this is definitely the situation at Ford. The company made this plea, noted the WSJ, “promising lofty returns on future ventures while warning near term profit will be pinched by deep investment.” The root of Ford’s problem, the disruption to the industry it faces, and the catalyst for its change and subsequent plea? Rather than maintain its position as a car manufacturer, it faces revolutionary forces that may lead to it upending its business model. As the WSJ reported, Ford “is scrambling to catch up with Uber Technologies, Inc., Alphabet Inc.’s Google, Tesla Motors Inc. and other nontraditional car companies ahead of Ford in electric-vehicle development, autonomous-vehicle testing and services allowing customers to share rides or cars.”

I was reminded of this when a colleague asked me about Business Intelligence (BI) and why there seemed to be fewer and fewer references being made to BI of late. As vendors universally hype their latest ventures in support of Data Warehouses, Big Data, Data Lakes and Streams, it would seem that talk of BI has been pushed to one side. It’s as though there is now an assumption that all businesses have become more attuned to the intelligence that resides in the data they capture and that they have all cycled to more openly discussing Business Analytics (BA). While it is important to look back in time, what’s happening right now is much more important to businesses everywhere.

In a blog post of July 1, 2016, that I happened upon, Business Intelligence vs. Business Analytics: What’s The Difference? I couldn’t help but home in on the central premise – maintaining versus revolutionizing. Justin Heinze, the Managing Editor of BI Software Insight, gathered a number of commentaries on just such a difference. “Business Intelligence is needed to run the business while Business Analytics are needed to change the business,” said Pat Roche, Vice President of Engineering, Noetix Products, Magnitude Software. “BI is focused on creating operational efficiency through access to real time data enabling individuals to most effectively perform their job functions. BI also includes analysis of historical data from multiple sources enabling informed decision making as well as problem identification and resolution.”

On the other hand, Roche considers “Business Analytics relates to the exploration of historical data from many source systems through statistical analysis, quantitative analysis, data mining, predictive modeling and other technologies and techniques to identify trends and understand the information that can drive business change and support sustained successful business practices.”  Just as informative is the response from Dipak Bhudia, Chief Product Architect, Clear Analytics. “Business Intelligence (BI) is essentially a noun, in that it is an umbrella term of the overall scope of acquiring, persisting, warehousing, analyzing and reporting insights along with everything else in its periphery. Business Analytics on the other hand is more of a verb, the act of discovering insights using any tooling or services at your disposal.”

Finally, the one comment that struck a chord with me, even if it only obliquely referenced automobiles, came from Mark van Rijmenam, CEO / Founder, BigData-Startups. “To me the difference is that Business Intelligence is looking in the rearview mirror and using historical data from one minute ago to many years ago. Business Analytics is looking in front of you to see what is going to happen. This will help you anticipate what’s coming, while BI will tell you what happened.”

Working with the team at Striim these past couple of years, I have come to realize the significance of observations such as “BI is needed to run the business while BA is needed to change the business,” even as I am acutely aware that what’s behind me may not be quite as important as what is right in front of me, happening the instant I make the connection. The promise of Striim is to combine both streaming data integration and streaming operational intelligence (yes, the two i’s in Striim) in a single platform, thereby giving businesses the ability to perform analytics in real time and become more responsive to their ever-changing business environment.

Ford acknowledges that it can no longer just maintain its business, stamping out new car after new car; rather, it has to address the more revolutionary factors impacting its ability to stay in business. Did this realization that dramatic change was in order come from deep dives into historical data, or did it come about as Ford reacted to trends emerging in real time? While it’s hard to know for sure the processes involved, the results speak for themselves – Ford’s insight most likely came from using every tool and service at its disposal and certainly, this included Business Analytics!

Data, Data Everywhere but Still Not Enough Information – Striim Does the Analysis, Just in Time!

A recent routine trip to the family doctor resulted in conversations with our primary physician, a specialist, and a short trip to a medical facility for an MRI. Nothing extraordinary about the process, and somewhat expected, and yet it held one last surprise. What hadn’t been expected was the work we had to do to interpret the results – three separate portals had to be accessed, with the family intimately involved in correlating the data. As much as we always talk about a highly integrated world where external agencies are working on our behalf to correlate data, we have a long way to go.

Having just leased a new hybrid vehicle that is completely drive-by-wire – where it’s left to the computers to determine synchronization between two gearboxes, applying torque to the front wheels when needed and, yes, providing just enough steering feedback to convince us we are sensing changing road conditions – the days of autonomous driving no longer seem to me to be that far off in the future. As far-fetched as it was just a couple of years ago, when we all saw the first pictures of Google’s weird egg-shaped self-driving car, the emphasis today is on where and when the first cities or their surrounding suburbs will be zoned for autonomous driving only.

Data and the sensors that generate data are already having an impact on our everyday lives. Whether it’s the computer enhanced imaging the medical profession can produce for patients or simply the computer control that the auto industry can give drivers, a wealth of data is being generated. If we are on the sidelines about the potential impact on our business of the Internet of Things (IoT), then we may be missing the point – it’s already arrived and the only question now is how much more data is headed our way. Like everything encountered inside the data center, it’s going to come down to a question of scale – just what are we going to do with the data streaming by and where will we store data we deem relevant?

Talking to a number of vendors this month has really opened my eyes to just how many data center components are being shipped today that are capable of providing sensor data every minute of every day. For IT professionals looking outwards to what our business needs to support its customers, we have become a class of users and for the moment, we are still well behind in the race to make better use of what is now coming to us from our networks and supporting infrastructure. Like everyone else, I was shocked to hear of yet another major meltdown of an airline computer system.

While I am not confident that I know all the details, it looks like a glitch in the power brought down servers, many of which didn’t have backups, and the subsequent layered implementation of multiple applications meant that returning service to normal took a lot longer than anyone expected. Yes, there was a lot of data streaming into the data center, but apparently not enough information to support the level of data center automation we would have expected. So often we marvel at manufacturers who today live by just-in-time deliveries of individual components as well as subassemblies, but when are we going to put the data center on a truly response-driven model, one that acts on data the moment it arrives?

The data center seems to be the logical choice for the first application of IoT data correlation and automated response. After all, turning data into information has been the mantra of IT professionals for decades, and from the time the first database was deployed, primitive application performance monitoring solutions appeared. However, with what we see today, there’s just too much data to expect vendors providing such solutions to simply provide an add-on for analytics. There just has to be more intelligence provided to front-end the data that arrives at the data center. The image of trying to sip water from a fire hydrant comes to mind and continues to be the best way to illustrate the difficulties associated with turning data into information.

“With the onset of the Internet of Things, our universe is now flooded with events that require event processing. These events are introduced to the digital world with all the wonders of IoT and wearable computing that report on anything and everything that happens. But our ability to take advantage of the power of these events is currently quite limited,” writes Opher Etzion in a July 18, 2016, post to the Striim blog, Programming: A Shortfall of Legacy Event Processing. Etzion serves as Professor and Department Head, Information Systems, at the Academic College of Emek Yezreel. Previously he worked for IBM where, most recently, he was the senior technical staff member and chief scientist of event processing, having been the lead architect of event processing technology in IBM WebSphere. He has also been the chair of EPTS (Event Processing Technical Society).

“The way that most programmers have approached event processing is by inserting all events into a database, and then executing ad-hoc or periodic queries to detect patterns. This is consistent with the ‘request-response’ paradigm of thinking in most programming,” wrote Etzion. The answer, when so much data is being generated, is obvious to Etzion when he adds, “Like the Internet that succeeded when browsing the web became possible for everybody, the Internet of Things will bloom when everybody can control an application’s logic by creating and modifying patterns based on multiple events.” This is exactly where Striim inserts itself into the data streams coming from every device that is continuously generating data.
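A small sketch makes Etzion’s contrast concrete. Instead of inserting events into a database and querying after the fact, the pattern is evaluated as each event arrives – here, flagging three failed logins from the same user within sixty seconds. The event format and thresholds are invented for illustration:

```python
from collections import defaultdict, deque

WINDOW = 60.0   # seconds
THRESHOLD = 3   # failures that constitute the pattern

recent_failures = defaultdict(deque)  # user -> timestamps of failures

def on_event(event):
    """Evaluate the pattern the moment an event arrives, rather than
    via a request-response query against stored events."""
    if event["type"] != "login_failed":
        return
    q = recent_failures[event["user"]]
    q.append(event["ts"])
    # Drop failures that have aged out of the window.
    while q and q[0] < event["ts"] - WINDOW:
        q.popleft()
    if len(q) >= THRESHOLD:
        print(f"ALERT: {event['user']} failed {len(q)} logins in {WINDOW}s")

# Example stream: three failures inside the window trigger the alert.
for ts in (1.0, 20.0, 45.0):
    on_event({"ts": ts, "user": "alice", "type": "login_failed"})
```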

When there is so much talk about the role of data stream analytics, particularly when it takes place in real time, with IT professionals stepping up to leverage these analytics in healthcare, automobiles, and manufacturing – isn’t it time to begin applying analytics to the operations of the data center itself? Whether it’s the oversight of the network, the monitoring of the applications or even the health of the physical data center itself, there is just so much data that traditional approaches to processing it all simply fail. It truly is coming down to how rapidly we respond to evolving patterns as they form and this has always been the design center for the data stream analytics solution from Striim.

Just-in-time processing of critical data – detecting patterns and then shaping useful information for all involved in the health of applications – is perhaps the most useful early adoption of data analytics. The payback, in terms of improved reliability as well as keeping well away from the media’s bright spotlight, is clearly the goal of most CIOs and data center managers. Manufacturing quickly learnt the merits of just-in-time supply chains, but the question remains – can IT professionals everywhere respond as quickly as their manufacturing peers and embrace data just in time?

Changes at HPE Foster NonStop Partners

Bodes Well for Growing Ecosystem of NonStop Partners Including Striim

On July 19, I wrote about the many changes taking place at HPE. So what is the new HPE going to be doing as it continues to divest itself of services and software groups? As one executive suggested to me, “we’re essentially back to an infrastructure company: servers, storage, networking, IoT, and the services that go into operating that stuff.” Also, with CTO Martin Fink leaving, the services unit being spun off, and software likely to be spun off as well, HPE is turning to partners – a shift that will increase support for the growing ecosystem of NonStop partners.

HPE is working with Microsoft and Amazon on public clouds even as it focuses on the transformation to a hybrid infrastructure tying together traditional IT with new-age private clouds – all within the enterprise, of course, and under the oversight of existing human resources. But even here, partnerships will flourish, according to HPE, and this includes the NonStop development team.

Over the last decade we have seen the NonStop price book become inundated with third party products covering security, monitoring, data replication, and even complete communications stacks. This sends a very positive message to NonStop partners, as fostering a growing ecosystem centered on NonStop has become a major focus for NonStop development.

When it comes to the wider picture of clouds, security, big data and mobility (or, as HPE is now labelling them: transforming to a hybrid infrastructure, protecting your digital enterprise, empowering the data-driven organization, and enabling workplace productivity), this ecosystem of NonStop partners and even large users is going to become more important than it has ever been in the past. Furthermore, when it comes to big data and in particular, the more relevant real time data stream analytics, then this bodes well for Striim. Within the NonStop community it has no peers, and no vendor known to the NonStop community has more experience in data stream analytics than Striim.

It is my expectation that we will see a much closer working relationship develop between HPE NonStop and Striim in the near term. While I am not in the loop on this subject nor have I been a party to any in-depth discussions, I just cannot see NonStop customers not wanting NonStop to become more proactive in supporting a solution ideal for NonStop applications wanting to tap big data. In real time!

Change is continuing across HPE and I don’t think we have seen the last of “big announcements.” I suspect the focus on enabling transformation to a digital enterprise will only tighten further in the coming months. But for the NonStop community, the transactions that it processes are increasingly becoming “richer.” Even the hardiest of NonStop supporters see the separate transformation of NonStop to being data-driven as much as transaction-driven. This plays into the sweet spot for Striim.

Surprises Aplenty as HPE Refines Its Focus

Meg Whitman, CEO of HPE

For me, an industry analyst, commentator and blogger focused on HPE in general, and HPE Mission Critical Systems specifically, these past couple of days have provided material to fill an entire blog. HP signaled changes coming last year when it proposed splitting into two entities, HPE and HPQ, with the former focused solely on the enterprise.

Almost at the same time, it announced that it was no longer going to pursue public clouds, leaving the provision of public clouds to some of its major partners, including Microsoft.

However, since the split was formalized late last year and the two companies, HPE and HPQ, began operating independently, the changes at HPE have continued to come thick and fast. And it’s not just services or software, but people as well.

Who could have predicted that within its first year of operations HPE would spin out its Enterprise Services (ES) group which, combined with CSC, will operate as a new company based on a 50:50 split between HPE and CSC? While the deal won’t be finalized until March 2017, we are already reading that there is more to follow, with the potential for software units to be sold off well before that March 2017 deadline – Vertica being the center of most speculation.

When it comes to people, there was no announcement from Meg Whitman, CEO HPE, more surprising than that her CTO, Martin Fink, would be retiring at the end of the year at a youthful 51. This has raised many questions, but perhaps the biggest surprise is that not only is the role of CTO going to change, there will also be changes at the venerable HP Labs.

Hewlett Packard Labs has remained the central research organization for Hewlett Packard Enterprise, and there has been no name change. Now, however, HP Labs is going to be broken apart, with pieces being scattered among the product groups. As Whitman noted in the announcement blog post, “To further accelerate the time it takes to drive technology from research and development to commercialization, we will move Hewlett Packard Labs into the Enterprise Group.”

Nothing is as saturated with innovation history or tradition as HP Labs – after Xerox PARC (Palo Alto Research Center) and perhaps Bell Labs in New Jersey, it was the Labs that often led the way for the former HP, and to many HP insiders, even after the split, HP Labs remained the cultural heart of HP!

But the industry isn’t the same as it was a decade or so ago. It’s the user experience that is driving everything! It’s microservices, with fast time to market! It’s clouds, big data and real time data stream analytics. And it’s all because, as Whitman herself intimated, “Tomorrow belongs to the fast. Digital Transformation is fueling the idea economy … driving better business outcomes.”

It’s All about Focus and When It Comes to HPE NonStop, Striim Is in the Spotlight

In my most recent travels I was able to work in almost a week in Las Vegas where I participated in the HPE big-tent marketing event, HPE Discover 2016. What made this particularly interesting was that it was the first event of its type that HPE has held in the Americas following the split. As we know, the former HP split in two, forming HPQ and HPE, where HPQ now focuses on consumer products including PCs and printers and HPE focuses on enterprise products and software, including servers, storage and networks. Central to all that HPE is doing is the digital transformation evolution currently under way as it fuels the “idea economy” to which HPE devotes so much of its marketing dollars.

The word focus came up time and time again and reinforced just how serious HPE is in its belief concerning the digital transformation that is under way. As HPE CEO Meg Whitman sees it, “digital transformation is no longer about choice; it’s mandatory. It’s real. (And it’s how) you reimagine your business.” Whitman also talked openly about the digital transformation as not being a one-off event or even a journey but rather, the pursuit of “multiple if not hundreds of small journeys!” As she came to the end of her opening remarks in her first keynote, Whitman made it clear that it was this focus by HPE on digital transformation that would “accelerate what’s next for you.”

According to Whitman, with the new HPE there will continue to be a focus on secure, software-defined infrastructure while leveraging HPE hardware. Furthermore, according to Whitman, this HPE is a company that will “help run traditional IT better while bridging to clouds.” HPE will continue to pursue the data center business and the campus edge business, as well as software, with a renewed focus on “growing our big data analytic business.” But what is next for you? “Tomorrow belongs to the fast!” And in order to be fast, it is clear that analytics, machine learning and augmented reality will all play a role, according to Whitman.

In my post of June 20, 2016, to the NonStop community blog Real Time View, Hybrids, Connectivity, Analytics and Natural Language Processing … NonStop, of course! I dived into analytics and, in so doing, focused on Striim. When it comes to data and data analytics, in my opinion, the premier vendor in this space that should be on the radar screens of every NonStop user is Striim. I noted at that time that the first PoCs by NonStop users were being done and that shortly there would be news about a number of successful use-case scenarios that everyone in the NonStop community will be able to relate to.

Perhaps a little more pragmatically, I also made the observation in that post that today Striim has become synonymous with data stream analytics, and that there will not be a transaction processing solution operating anywhere without the deep ties to the environment that Striim so effectively supports. But still there are NonStop users who aren’t aware of what the Striim product can do and how it can help propel businesses into reimagining their business. Not only are there NonStop users who don’t know the value proposition of Striim, there are also those who aren’t entirely sure how Striim helps them integrate the world around NonStop with what is transpiring on NonStop.

In a recent publication, Striim Cofounder and EVP, Sami Akbay, said, “What our product does is it processes data from many sources and makes sense of it in real time, then it retains only the information that is interesting to the end-user, as defined by the user. The volume of data that most companies have access to is very large, but there is only a small amount that is useful. Our product finds that useful information and puts it back onto the NonStop.” The most important item to focus on here is that after analysis, Striim puts it back onto the NonStop – this is where what is happening in the world outside of NonStop can be integrated with the NonStop itself!

I continue to be surprised by how many times I am informed that no, we don’t want our NonStop exposed to the outside world. How short-sighted is this – a business’s most critical applications often run on NonStop, and that is why the NonStop system maintains a presence within the business. It can no longer operate in isolation; rather, it too has to be a part of the many small journeys that are happening as business accelerates to what’s next. The idea economy is all about turning ideas into new business opportunities in hours if not minutes (and definitely not quarters or years) and to have any chance of doing this, integration between NonStop and the rest of the business world has to happen in real time!

“Striim helps companies leverage a wide variety of different types of streaming and operational data the instant it’s generated, before it lands on disk. From transaction or change data from Oracle, MySQL and HPE NonStop, to log files, social streams, sensor data, and much more. Then, this data can go through a context loop (see diagram on previous page), where results can be further analyzed to uncover additional business insights or fed back into the data pipeline to give context to streaming data,” explained Akbay. “Our platform is able to filter, aggregate, transform, and enrich data in-motion so companies can see what is happening as it happens.”
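
To make the flow Akbay describes concrete, here is a minimal sketch in Python of the filter, transform, enrich and deliver-back pattern. To be clear, this is not Striim’s actual API: the event fields, the reference table and the deliver_to_nonstop function are all hypothetical stand-ins for whatever a real pipeline would use.

    # Illustrative sketch only -- not Striim's actual API. It mimics the
    # pattern Akbay describes: keep only the interesting events, add
    # context while the data is in motion, then put the result back
    # onto the NonStop.
    from dataclasses import dataclass

    @dataclass
    class Event:
        account: str
        amount: float
        channel: str  # e.g. "ATM", "POS", "WEB"

    # Hypothetical reference data used to enrich events in flight.
    ACCOUNT_REGION = {"A-100": "EMEA", "A-200": "AMER"}

    def interesting(e: Event) -> bool:
        # The end-user defines what is "interesting"; here, large amounts.
        return e.amount >= 1000.0

    def enrich(e: Event) -> dict:
        # Transform the event and add context from the reference table.
        return {"account": e.account, "amount": e.amount,
                "channel": e.channel,
                "region": ACCOUNT_REGION.get(e.account, "UNKNOWN")}

    def deliver_to_nonstop(record: dict) -> None:
        # Placeholder for feeding the result back to the NonStop application.
        print("to NonStop:", record)

    stream = [Event("A-100", 2500.0, "ATM"), Event("A-200", 12.50, "POS")]
    for event in stream:
        if interesting(event):                  # filter
            deliver_to_nonstop(enrich(event))   # enrich, then feed back

Only the first event survives the filter; the second is discarded before it ever consumes storage or NonStop cycles.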

HPE is sparing no expense to inform the business world about its focus on digital transformation as it brings to the fore the concept of the idea economy. It’s all predicated on being able to respond to change in real time and to embrace new ideas as they appear, wherever they bring value to the business. Striim plays a critical role in making this happen and does so in real time, just as HPE acknowledges it must happen. No matter how bright a spotlight shines on Striim, its focus on illuminating what’s happening as it happens echoes Whitman’s most critical message. When it comes to analytics, Striim is “no longer about choice; it’s mandatory. It’s real. (And it’s how) you reimagine your business.”

Running Real or Running Virtual, NonStop Remains in the Race; Striim Ensures NonStop Is Winning!

In a new post to the ATMmarketplace blog that I just completed, I wrote about how combinations of self-service devices and automation (including robotics) may lead to unanticipated combinations of ATMs and kiosks – surprisingly, potentially devices that deliver burgers and cash. Without the ketchup on your banknotes, of course! However, it did raise a question about behavior – would the public at large welcome such convenience? Would the service sector decry the decimation of the workforce that would arise at a time when, after all, it is the service sector where the vast majority of workers are now employed?

Understanding the behavior of the public is becoming critical to all business decisions, and tracking what’s trending can become what separates success from failure – a message trumpeted to every business via every media channel of late. No secrets here, but you really can’t assess success unless you initiate analytics, and business analytics today is what is attracting the attention of IT everywhere. Whether you are running your application in the cloud, on a mainframe, or on a cluster of commercial, off-the-shelf servers, there’s a compelling reason to pursue business analytics as the impact on business outcomes is inescapable.

You may not like getting burgers and fries to go along with your cash and statements and I am sure there will be many who do not warm to such a development, but if it turns out you are in the minority and didn’t adjust your mission critical systems accordingly, you may be in serious trouble. Nature abhors a vacuum and if you aren’t participating then of course, someone else will be. Eating your lunch takes on a completely new meaning in this instance.

Across the NonStop community, there’s been so much information forthcoming from HPE that it is hard to miss just how much choice is on offer to NonStop users. For the past six months HPE NonStop product managers and development engineers have been participating in NonStop events worldwide and the message has been unmistakable. What you once thought NonStop systems provided and what they looked like is rapidly fading into the background – it’s all new.

At the recent BITUG Big SIG event in London, U.K., the head of NonStop sales for Europe, the Middle East and Africa (EMEA) talked of how just two years ago HPE announced NonStop would be ported to the Intel x86 architecture and, a year later, HPE delivered. It was only a year ago that HPE introduced hybrid systems featuring NonStop and Linux / Windows supported by an InfiniBand cluster fabric, together with a new API that made it easy for systems to write directly into each other’s main memory, bypassing the operating system; this is now being provided to early adopters as well as a select number of vendors.

More recently, HPE has announced the virtualization of NonStop and described not just a new vNonStop operating system but a new vCLIM storage and communication capability, all running atop Linux supporting the Kernel-based Virtual Machine (KVM). I have written a lot about this over the past few months after a working environment complete with a telco application was demonstrated at the Mobile World Congress. There’s every indication that the first customer deployment will have happened by the end of the year, possibly as soon as the European HPE Discover 2016 event.

How is the NonStop community behaving following such announcements? Does it view this as a departure from the true world of NonStop? Is it anxious about being no longer able to wrap its collective arms around a NonStop system? For the moment, the responses paint a picture of mixed expectations as vendors wonder about the relevance of their middleware and tools offerings even as users wonder about whether the much-beloved attributes of NonStop are being diluted. Does the picture being painted of a new open, virtual NonStop even make sense?

Fortunately, the real response is that running NonStop on virtual machines isn’t a whole lot different to running NonStop on real machines – it all comes down to configuration and your appetite for risk. Don’t configure your primary and backup processes in the same physical machine, obviously. However, such considerations pale in light of what new tasks can be accomplished with hybrids, virtualization and really fast interconnect fabrics. Cheap server farms running Linux and Windows can be part of a NonStop solution in much the same way as disks and tapes were once viewed.

Perhaps more important is that solutions which didn’t run natively on NonStop can now be part of a NonStop solution – running on the adjacent Linux system participating in the hybrid. Big Data frameworks housing data lakes can be directly accessed by NonStop processes. Suddenly the much-neglected integration of data analytics with transactional systems in real time is a lot nearer to reality than in the past. Across the InfiniBand fabric interconnect, oblivious to the underlying participating operating systems, everything can be accessed and everything can be retrieved. In real time!

It is into such contemporary configurations that Striim will have an immediate impact on NonStop systems. From simply keeping track of resources and ensuring SLAs are maintained, to proactively engaging in fraud protection, there are no real barriers to how NonStop users can deploy Striim. Linux and Windows running natively are now an integral part of NonStop hybrids and, while such adjacent systems may not share the same much-beloved attributes of NonStop, the heart of the applications, running as they will on NonStop, will continue to be as available as they have always been in the past – running real or virtual.

Understanding the behavior of systems and of individuals is critical in determining what is finding acceptance and what is not. Winning is still the all-important consideration in business, despite the murmurings of pundits to the contrary – you are either in business or out of business. Striim brings so much to the table to ensure staying in business (and indeed, growing) and, with NonStop evolving as rapidly as it is, Striim is ensuring the winning ways of NonStop continue right along with the winning ways of the business!

 

 

Big Data Makes It into the HPE NonStop Playbook …

There’s been a flurry of user-focused events centered on HPE NonStop systems. From the Florida coast to the Arizona desert to European capitals there’s been a lot of interest in NonStop, coming from existing users of NonStop. The most recent user group event, GTUG, held in Berlin, Germany, wrapped up a week ago, but already the presentations are being posted online for everyone to view, and perhaps the most popular of all has been the presentation given by the head of NonStop R&D, Andy Bergholz.

Just the title alone is sure to attract attention – NonStop today, NonStop tomorrow – the future of Mission Critical Cloud Compute. The presentation is posted online, so check it out for yourself. Surprisingly, and in a clear break with traditions of the past, there is no need to complete any NDAs to see what is on the minds of those charged with leadership of development for NonStop. Along the foot of each slide was the usual disclaimer, “This is a rolling (three to five year) Statement of Direction and is subject to change without notice,” but I am comfortable with that caveat as no major vendor is immune to wind shifts, no matter their strength, direction or timing.

While it is a product roadmap in that individual hardware and software components are mapped out in some detail, it is also a fascinating look into which new technologies are developing traction within the NonStop development organization – the presentation title alone provides plenty of clues. Mission critical cloud compute? Who would have thought that we would see this label appended to a presentation on NonStop? And who would have expected to see statements about the near term and future of NonStop made with this degree of clarity?


As you progress through the slides, the number of times data is referenced is hard to miss. “Solutions you can rely on. Without fail. For business processing and decision support apps and data vital to the enterprise,” is a clarion call that transaction processing isn’t what it used to be. It’s a recognition of data: by “data,” NonStop development wants you to know that they intend to empower you with real time analytics (whereby you can gain a competitive advantage with real-time, actionable insight from all your data) even as they enable productivity with data (improving the quality of the data businesses need to do their job). Yes, data is anchoring much of the activity within NonStop development.

“Data explosion and IT complexity will lead to multi-cloud environment with many different hybrid computing architectures,” according to one slide Andy used. Without looking too far ahead, and referencing just what is coming to market shortly for NonStop and Linux hybrids connected and accessed via Yuma (the HPE NonStop software solution based on InfiniBand providing tighter coupling with virtual, as well as cloud, solutions), the “focus is on Big Data / OLTP Database as a differentiator.” It’s at this point that I can make the observation that, by all practical measures, HPE NonStop is catching up with the messages that have come from Striim for a number of years now – any talk of data without the capacity to support continuous, all-inclusive data integration from multiple sources cannot support the apps and data vital to any enterprise!

Flipping through the slides that Andy used, much as a child does with primitive animation, it’s clear that data is once again a priority for HPE NonStop, but there’s more. It’s been some time now since any serious effort was put into the NonStop SQL product – once a leader in SQL, it has fallen behind. Not that the central attributes of being fault tolerant and capable of running mixed workloads are being ignored or in any way compromised; rather, NS SQL is being made compatible with other third-party offerings. The goal, as HPE NonStop states, “is for customers and prospects to be able to bring Oracle applications to NS SQL/MX with little to no porting effort.” In other words, provide these users with the “scale out (of) Oracle applications without RAC (clustering) complexity and limitations.”

As HPE EMEA boss, Dave McLeod, said during his GTUG presentation, when it comes to the NonStop platform today (and here, Dave sees NonStop as a great franchise), “It’s a brave new world (even as we are) leaving nothing behind. As we go forward we MUST bring forward ALL our fundamentals. We MUST keep on walking the tightrope of Open vs Security. All the great franchises evolve (even as they) retain what made them great.” As HPE so often reminds the community, the challenge will continue to be less about determining what these technologies can do, and more about understanding how, and at what cost, these technologies can help you and your team deliver real business value.

Striim is all about the ingestion of data streaming in from all manner of sources, be they databases, tables, transaction flows and so on, and about processing, and indeed enriching, that data before it passes into a data lake. However, what separates Striim from other offerings targeting the NonStop community is that the output from Striim can be fed back into the transaction processing platforms, in real time, where it can influence the course of subsequent transaction processing. When it comes to being prepared to handle more data, Computerworld cautioned its readers, in an April 27, 2016, article The Importance of Data in Digital Disruption, that, “Today, timely, accurate, and relevant data is the fuel of a successful business. The question is, ‘Is your business ready for a digital disruptor challenging your space?’ It is becoming more and more clear that those organizations most likely to be disrupted do not have data as part of their core.”

If HPE NonStop has renewed its interest in data and is sensitive to the role data is playing across its installed base (even as it works to attract users of other database solutions to NonStop), then users of NonStop systems should know that Striim provides a solution to that often-raised question – is the data timely, accurate and relevant? Does it hold the value required to better differentiate my business from that of my competitors? The HPE NonStop organization believes data makes a difference and yes, from what I am seeing of Striim working in the NonStop customer base, the differentiation is not only possible, it may just be what every business needs!

Sensors, Heartbeats and Analytics – What Gets Our Blood Racing?

There continue to be ongoing discussions across the NonStop community about the benefits, real or otherwise, of tapping into data stream analytics. This dates back to the early days of NonStop systems, when every processor cycle was important – when it comes to mission critical applications, latency is a very big concern. Very early on there were even concerns about the possible degradation that simple application monitoring would produce, so much so that solutions aimed at better visualization of the systems and the networks they supported were late to come to NonStop. Transactions were lean and users could ill afford to enrich them for any reason at all.

However, this is no longer an issue for enterprises moving to the latest NonStop X systems. Based on the Intel x86 architecture, these systems are showing early results that indicate anywhere from 50% to as much as 200% performance gains over equivalent NonStop i systems based on Itanium processors. Concerns over latency and degraded performance have been replaced with questions about the amount of available headroom each processor now possesses. NonStop users often deploy systems with many processors, as many as ten, twelve, or sixteen per system, to ensure very rapid takeover times when imminent software and / or hardware failures are detected by the OS; even reducing the number of processors in a NonStop X still gives them ample room to process richer transactions.

Today, according to Meg Whitman, HPE CEO, “IT strategy and business strategy are no longer separate, they have become inseparable … every business is a technology business today.” In other words, for business to survive and indeed thrive it needs to exploit technology advances at every opportunity, and perhaps the most important aspect of such advances is the benefit of being able to predict the likely outcomes of customer situations. Knowing, with a degree of certainty that borders on the magical, that customers in a certain demographic will always pick the red sweaters over blue ones brings with it enormous benefits felt all the way from the cashier to the supply chain managers.

Magic is the first thing that came to mind after reading the article To Catch a Thief in the March 15, 2016, issue of Fortune Magazine. “A new generation of tools beyond fingerprints and iris scans can measure qualities like body temperature and blood circulation at a short distance and without alerting the subject,” notes reporter Jeff John Roberts. Quoting Barnabas Szilagyi, a principal at Capco, a company that helps large banks identify security threats: “Each heart rate is unique. The new tech can sense what’s in your veins, your blood pressure and body heat, and identify with great accuracy who you are.”

More importantly, “Sensors are critical to these biometric tools, but it’s the analytics software that makes them shine.” Citing just one example, Fortune then described an East Coast fraud case where “Capco’s tech first determined that a spike in suspicious bank account applications occurred around the same time every month. It then cross-matched those dates with local shipping records to conclude that the ‘customers’ were linked to the monthly arrival of a ship from a small European country. Such records are just one of hundreds of external data sets that banks can use to spot patterns. Others include illegal ATM transactions and credit card payments to massage parlors—both of which soared when the phony bank customers came to town.”
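
The cross-matching Fortune describes is, at heart, a simple correlation of one event stream against an external data set. As a toy illustration only (this is in no way Capco’s system; the dates, counts and spike threshold are all invented), the idea looks something like this in Python:

    # Toy illustration of cross-matching: find months where a spike in
    # suspicious account applications coincides with a ship's arrival.
    # All data and the spike threshold are invented for the example.
    from collections import Counter

    applications = [(2016, 1), (2016, 1), (2016, 1), (2016, 2),
                    (2016, 3), (2016, 3), (2016, 3), (2016, 3)]
    ship_arrivals = {(2016, 1), (2016, 3)}   # (year, month) the ship docked

    counts = Counter(applications)
    SPIKE = 3   # three or more applications in a month counts as a spike

    spikes = {month for month, n in counts.items() if n >= SPIKE}
    linked = spikes & ship_arrivals          # spikes that line up with arrivals
    print("months linked to ship arrivals:", sorted(linked))

Scale the same set-intersection idea up to hundreds of external data sets and you have the pattern-spotting the article refers to.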

Clearly, to echo Whitman’s observation, technology is the business; separating the good from the bad, let alone recognizing the demand for red sweaters, is only feasible when technology is integrated with current business logic. At some point, crucial analysis outcomes need to be presented to transactions in flight, and the work involved, even for the most jaded of NonStop pragmatists, will be completely acceptable. That is, almost undetectable – unlike those heavy-handed application monitoring products they were so fearful of not all that long ago.

“It’s the analytics software that makes them shine” is a message that shouldn’t be lost on anyone in the NonStop community, and products now coming to market, such as Striim, bring with them the technology businesses need in order to shine – and shine businesses have to do simply to stay in business! Key to the analytics software we have today are the algorithms, the modelling and the ease with which they can be tuned to look for patterns we value. Big Data may be all about data lakes, but it’s the data streams feeding the lakes that are so important to the mission critical applications running on NonStop systems.

HPE has invested heavily in NonStop and, with NonStop X, is placing huge bets that shared-nothing fault tolerant systems, with permanent availability, will still be driving customer-facing applications where money changes hands. That’s been the sweet spot for NonStop systems for more than four decades and financial institutions continue to depend upon them in this role. But monitoring sensors, picking out the bad guys, and watching for red sweaters to fly off the shelf require a little more than the basics, and it is with the arrival of Striim that mission critical applications on NonStop can really lift their game.

Our blood may boil when we think of the possible intrusions into our lives this is making, but seriously? Shouldn’t we be pleased to see our true needs being better addressed as a result, and shouldn’t we be pleased to know that, as unique as we most definitely are, we can still open bank accounts, take out loans and obtain cash without becoming suspected of anything unsavory? And shouldn’t we be just as pleased to see our NonStop systems never missing a beat in the process?

Data Deluge — Striim and NonStop Manage Overwhelming Volume, Variety and Velocity of Data

The time for speculation is long gone. Predictions about the likely future challenges facing IT have become a reality, at least in one aspect. Simply put, we are facing a wall of data rushing towards us and there are few places to hide. The pundits tell us that, wisely used, it is this data that will separate the winners in business from the losers. But is it possible to extol the virtue of water to a drowning man?

In Sydney’s famed Opera House, spread across a long curved wall in the vestibule at the rear of the concert hall, facing an amazing view of the harbor, is a mural called Five Bells. To the locals, however, it’s simply called the last thoughts of a drowning man, as it drew inspiration from a poem written many years before about an artist who fell into the harbor and drowned. A nearby warship evidently sounded five bells at the time the artist was drowning – hence the mural’s official name.

This past week I was driving to the ATMIA US Conference in New Orleans when the skies east of the city opened, unleashing a deluge of water the likes of which I have not experienced in many years. Visibility evaporated, traffic was reduced to a crawl, and the land quickly became water-logged – a stark reminder of just how great the volume of data heading towards us has become.

Take just a few industries where the increased volume of data is becoming very apparent. This past holiday season, ecommerce really took it to bricks-and-mortar retailers, and from the data deluge there’s rising water where great waves are forming that show no signs of breaking any time soon. To the contrary, it is continuing to climb in height and the shoreline is nowhere to be seen. In the January 20, 2016, article Why brick-and-mortar retail faces a shakeout in the publication Retail Dive came the news: “We are right now in the middle of the biggest, most profound transformation in the history of retail,” said Robin Lewis, CEO of the Robin Report and a former executive at VF Corp. and Women’s Wear Daily.

Furthermore, according to Retail Dive, “‘We’ve now gone to a business where your best customer can be standing in your best store and with three touches of their thumb to a piece of glass, they can buy from your biggest competitor,’ Fred Argir, Chief Digital Officer for Barnes & Noble, told Retail Dive in an interview. ‘That’s changed everything.’” To be fair, the publication also quoted Steve Barr, a partner and the US Retail and Consumer Sector leader at PricewaterhouseCoopers. “The great news is the retail store is not dead,” Barr said. “But the retail store that does not have a meaningful relationship with the consumer is dead.”

Meaningful relationships? Yes, a much better understanding of the behavior of consumers even as we determine the trends these consumers will likely embrace. This means tapping into data as it’s being generated and picking up the information gems we need as they pass by. And one sure data source clearly is the byproduct of the transactions customers initiate, particularly should it involve your competition. I admit, I did hit the “Purchase” button on amazon.com while looking at a book at Barnes & Noble, not only because the price was a tad lower but, more importantly, because the site provided the book’s reviews and more information!

When it comes to thumbing a piece of glass, the change under way, from simply having a device from which to make a phone call to communication that is visual and global, is sapping resources from all over the digital footprint surrounding each and every smartphone. But the usage patterns that can be derived not only help mobile phone operators to better tailor messages to individual subscribers but allow them to accumulate metadata on an unprecedented scale, and it’s becoming extremely valuable not just to retailers or even bankers, but to industry as a whole.

Whether it’s the purchase of a new car or just an old pair of shoes, ensuring the right product is presented to a consumer motivated to buy it is paramount, but even so, making sense of so much data coming from each and every one of us makes the data deluge all the more difficult to process. Clearly, the former model of capture, store and process is antiquated when the challenge lies in integration with the real time world of transaction processing.

With so much being written about the Internet of Things (IoT) and even the Internet of Everything (IoE), there is recognition that as more and more sensors come online the data deluge is going to become even greater. However, much closer to home for all of us, the human body, as we are beginning to hear, is itself one giant collection of almost infinite sensors. Imagine the plight of the medical profession, including researchers of every discipline, as steps are taken to mandate real time processing of everything our body generates from its vast array of sensors.

The new model, involving capture, process and store, offers the only real way to make sense of it all. This is not to say that there remains little of value for data scientists to exploit once data is finally stored, but rather that, industry by industry and market by market, data by necessity needs to be parsed, with only pertinent data ever making it to storage. For the HPE NonStop community this is of growing importance, as NonStop systems are home to many of the most important real time transaction processing business applications on the planet.
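
The difference between the two models is easy to show. Here is a minimal sketch, with invented sensor readings and an invented rule for what counts as pertinent, of processing data in flight so that only useful data ever reaches storage:

    # Capture -> process -> store: inspect every reading in flight, but
    # persist only what is pertinent. Readings and the rule are invented.
    readings = [{"sensor": "hr", "value": 72},
                {"sensor": "hr", "value": 180},    # anomalous heart rate
                {"sensor": "temp", "value": 36.8}]

    def pertinent(r: dict) -> bool:
        # Keep only out-of-range heart-rate readings; discard the rest.
        return r["sensor"] == "hr" and not (50 <= r["value"] <= 120)

    stored = [r for r in readings if pertinent(r)]  # process before storing
    print(stored)   # only the anomaly ever reaches storage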

Striim is beginning to gain a foothold within the NonStop community, with its first deployments now taking hold. And this is good news for everyone in the NonStop community, as even NonStop users are finding it necessary to fight back to keep consumers engaged with their products and services. The data deluge is upon us and the waters are rising fast – bigger and bigger waves are breaking over companies old and new.

We may be in the biggest and most profound transformation in history; it’s not going away, and it’s not a time to simply redirect the waters to lakes hidden from view. To survive we need to know our customers and be sensitive to their ever-changing behavior, and doing so even as they instigate a transaction is critical. Drowning is simply not a viable business option!

Barriers Are Down: For NonStop, Striim Joins the Worlds of Stream Processing and Transaction Processing!

The spotlight is shining even more brightly on Big Data, with no evidence that the illumination is lessening in any way. Business is beginning to understand the value proposition of Big Data and, increasingly, it’s not just the process of saving everything an enterprise encounters as it pursues its business interests, but rather the impact of processing the data in real time that matters most. Stream processing is adding the “special sauce” enterprises have been looking for to better assist them in identifying trends and patterns as they develop, and in integrating the knowledge about these trends and patterns into customer-facing solutions so as to be able to respond rapidly to changes in customers’ behavior.

While this is well understood by IT in general, among the NonStop community progress towards embracing stream processing has been slow and mirrors the overall rate of change occurring among the NonStop user community as a whole. The good news is that NonStop users rarely leave the platform, but it’s also very apparent that, due to the nature of the applications NonStop supports – the highly important customer-facing solutions – change happens at what seems to be a glacial speed. Occupying highly visible positions within an enterprise results in a level of risk aversion that sometimes gets in the way of greater integration of NonStop with the rest of the enterprise, but that’s all about to change.

But first, what is stream processing and why is it being separated from the bigger story surrounding Big Data? According to a report published by InfoQ, an electronic publication aimed at facilitating “the spread of knowledge and innovation in professional software development,” stream processing and today’s real time transaction processing are joined at the hip. “Stream processing is designed to analyze and act on real-time streaming data, using ‘continuous queries’ (i.e. SQL-type queries that operate over time and buffer windows). Essential to stream processing is Streaming Analytics, or the ability to continuously calculate mathematical or statistical analytics on the fly within the stream. Stream processing solutions are designed to handle high volume in real time with a scalable, highly available and fault tolerant architecture. This enables analysis of data in motion.”

Just as importantly for the NonStop community, the explanation of stream processing goes even further, according to InfoQ in this post of Sep 10, 2014, Real-Time Stream Processing as Game Changer in a Big Data World with Hadoop and Data Warehouse. “In contrast to the traditional database model where data is first stored and indexed and then subsequently processed by queries, stream processing takes the inbound data while it is in flight, as it streams through the server. Stream processing also connects to external data sources, enabling applications to incorporate selected data into the application flow, or to update an external database with processed information.” It is this concept of “enabling applications to incorporate selected data into the application flow” that makes embracing stream processing attractive to all enterprises, even those exploiting the real time transaction processing capabilities of NonStop systems.
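
The “continuous query” idea in the InfoQ description can be illustrated in a few lines of Python. This is only a sketch of the concept, with an invented buffer-window size, data and outlier rule; a real stream processor would express the same thing as a declarative, SQL-type query over a window:

    # A continuous query in miniature: recompute an aggregate over a
    # buffer window as each event arrives, and act on it in-stream.
    from collections import deque

    WINDOW = 5                       # buffer window: last five observations
    window: deque = deque(maxlen=WINDOW)

    def on_event(value: float) -> float:
        """Update the window and recompute the running average on the fly."""
        window.append(value)
        return sum(window) / len(window)

    for v in [10.0, 12.0, 11.0, 95.0, 12.0, 11.0]:
        avg = on_event(v)
        if v > 2 * avg:              # analytics calculated within the stream...
            print(f"outlier {v} (window average {avg:.1f})")  # ...acted on at once

The point is the inversion the InfoQ piece describes: the query sits in place and the data flows through it, rather than the data sitting in place while queries are run against it.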

There are often discussions within the NonStop community about the relevance of Big Data and, by implication, stream processing. Oftentimes, Big Data is confused with really big SQL databases – a label that might have its place at some enterprises but that, in general, trivializes much of the value proposition of stream processing and indeed of Big Data. I see little evidence across the community of support for running Big Data directly on NonStop. What I do see is the first trembling footsteps being taken to pull data out of Big Data platforms, even as the arrival of stream processing attracts more vendors from among the real time transaction processing solutions community.

At a time when Martin Fink, EVP & HPE CTO, talks openly of the mainstreaming of NonStop, he also talks about strategy when it comes to what’s next for NonStop. “The mainstream world today tends to revolve around 4 things: Linux, Virtualization, Open Source (and) Clouds.” Furthermore, according to Fink, “I also believe that we can start to make NonStop available as a service. The mixed-mode NonStop SQL/MX engine could be offered as a service that delivers a database engine that’s currently unmatched in performance capabilities, scalability and resilience.” What’s missing? There’s no reference by Fink whatsoever to including Big Data in discussions about the mainstreaming of NonStop within HPE.

There are no barriers for applications running on NonStop to consider integration with Big Data – Big Data platforms, such as any popular Hadoop distribution, run external to NonStop. All it takes is a pipe and a process – think back to the days when the all-important consideration for the NonStop community was how to network with IBM mainframes via SNA. It required implementing SNA on NonStop and adding networking connectivity hardware so that connections could be established and data could be exchanged. Think of much the same need for stream processing, only this time with the energy HPE NonStop development is putting into the support of hybrid infrastructure, including the Yuma project with its APIs in support of InfiniBand; the inclusion of stream processing becomes a far less risky proposition than deploying SNA was all those years ago.

Striim is a stream processing platform coming from a development team very familiar with NonStop. Many of the members of this team were the same folks that brought GoldenGate to market, so pulling information from log files via change data capture (CDC) isn’t a difficult concept to grasp – think of stream processing much as we used to think of intermediary capture with trail file processing. There are new process names, certainly, but the capture side is something the NonStop community should be very familiar with – detecting changes in behavior, after all, is simply a matter of correlating changes coming from many sources to determine commonality. Patterns and trends, essentially!
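
For readers who never worked with trail files, the capture side of CDC reduces to a very small sketch. Assume, purely for illustration, a log of committed changes written as one JSON record per line; real CDC readers parse the database’s actual log or audit-trail format, not this invented layout:

    # Change data capture in miniature: read committed changes from a
    # log and turn them into stream events, without touching the database.
    import json

    # Pretend log records; the layout here is invented for the example.
    LOG_LINES = [
        '{"op": "INSERT", "table": "ACCTS", "key": "A-100", "balance": 500}',
        '{"op": "UPDATE", "table": "ACCTS", "key": "A-100", "balance": 450}',
    ]

    def capture(lines):
        """Yield change events in commit order, as a trail-file reader would."""
        for line in lines:
            yield json.loads(line)

    for change in capture(LOG_LINES):
        # Downstream, changes from many sources can be correlated to
        # surface the common patterns and trends mentioned above.
        print(change["op"], change["table"], change["key"])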

NonStop will not be immune to stream processing, just as it’s highly unlikely that any Big Data platform will find its way onto NonStop. It’s not necessary, given today’s IT transformation to hybrid architectures. Furthermore, IT will not be immune to feeding the results of stream processing to applications running on NonStop – advanced system and network monitoring, fraud detection, SLA compliance and database replication synchronization all come to mind. And in the team at Striim we have access to folks who know as much about NonStop as anyone belonging to the NonStop community. Whatever barriers we thought were present, preventing us from further consideration of integrating the worlds of stream processing and transaction processing, all in real time, are definitely down, and for that, the NonStop community wins yet again.

The Striim™ Platform – Data, the Instant It’s Born, Becoming Useful to HPE NonStop!

The weather changes in an instant in the Colorado Rockies. I’ve lived here long enough never to take the weather for granted. It’s not a cliché but rather sound advice based on years of experience: if you like the weather you’re having, don’t expect it to last, as it will change. You want four seasons in a day? Then welcome to Colorful Colorado!

Tourist promotions coming out of Colorado feature many messages highlighting change – if you like to ski then certainly, this should be at the top of your list of destinations. On the other hand, information about what to expect on any given day can be obtained from any of the major television networks, in real time, as all of them support apps on your smartphones and tablets. No risk of something unexpected overtaking you!

Furthermore, when performing any kind of transaction where the application has a view of your location, you will likely see a sidebar display of the weather showing you conditions as they change. There’s really no excuse to be caught out in conditions you hadn’t expected, as a simple inquiry will give you the complete picture. Lifesaving insight beamed right into your hands. Messages and transactions! Location transmission! Real time updates and the complete picture (as to what to expect) – all at your fingertips, even as you are more likely than not unaware of what is happening behind the scenes.

Processes and databases are far from our minds as we hit the slopes and yet, in today’s always-connected, always-on, fully engaged world, so much tailored (indeed, targeted) information is directed at us that all too often we simply assume that this is just the normal way of things. The ether is alive with data that, packaged appropriately, allows us a freedom previous generations never experienced.

A recent posting to the LinkedIn group, Connect HP User Group Community, included a link to a blog post heralding Big Data is Dead. All Aboard the AI Hype Train! While focused more on the hype surrounding AI, its conclusion reminded me of just how far we have to go with respect to applying the results from data stream analytics to business problems, and of course, how early we are in the lifecycle of extracting meaningful data and integrating with transactional systems. As the author of the post concluded, “But for now and the foreseeable future, the best way to attack your business problems is still done the old fashioned way: creative, smart, and curious people who can ask the right questions and know how to get them answered. Big, dumb algorithms and warehouses of data are useless without them.”

And yet, it’s exactly the work being done by creative, smart, and curious people at WebAction, Inc. that continues to keep me interested in the potential of Striim™ to make an impact on transactional systems, particularly when it comes to those applications running on HPE NonStop systems. Striim draws attention to this on its web site; in becoming involved in addressing such business problems, we may be forgiven for overlooking just how successful Striim has become in simplifying the use of algorithms.

The Striim platform hides the reality that data stream analytics, as supported by Striim in its current iteration, isn’t a new data warehouse or even an operational data store utility or tool but rather a fully functioning, non-intrusive, system-independent platform. One that is fully capable of being deployed, interfaced, turned on, and adding valuable insight to applications from the get-go. From the instant data is born, Striim can be configured to make it useful. The Striim platform begins sifting through a universe of information, continuously being generated all around us, to ensure that any piece of information potentially unique (in the instant of time it’s captured) can influence the behavior of all that it comes into contact with. Dramatic?

Yes, after a fashion, but more to the point, it is the new reality. Knowing this, enterprises are turning to a new breed of platforms, among which Striim offers a degree of originality not seen in other implementations: it ensures that every piece of information relevant to your business is identified, presented and potentially acted upon, depending on the ongoing circumstances. Returning to the concept of Striim being more than a utility or tool and instead being a platform, the story becomes even more compelling as you become aware of how Striim uniquely combines both streaming data integration and streaming operational intelligence in a single platform.

Indeed, Striim is the only end-to-end solution for streaming integration and intelligence, a claim that I haven’t seen any other vendor attempt to push back on. Cool! An interesting claim in itself, but the reality is that much of the frontier Striim is pushing beyond has to do with how Striim integrates the here and now with as much of its surroundings as an enterprise can expose to it. Which brings me to the important value proposition of the Striim platform: Striim will add relevant context to the data.

For enterprises wanting not just to act on the data the instant it’s born but to have it mapped into information that they can use, the data streaming past the enterprise needs to be enriched with reference and historical data for instant, usable context – and yes, at speed and at scale. That’s the really difficult part of what enterprises truly require and yet, it’s the forte of Striim. Data, the instant it’s born, massaged by a platform that’s non-intrusive yet capable of enriching it rapidly, in real time, and in a manner that can keep on scaling out, provides behavioral insight unimaginable by our predecessors. Creative, smart, and curious people still do make a difference.
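
What in-flight enrichment means in practice is joining each arriving event against reference data held in memory, so that context is added at stream speed rather than via a per-event database round trip. A minimal sketch, in which the customer table, event fields and out-of-region rule are all invented for illustration:

    # Enrich events in flight: join each event against cached reference
    # data so context arrives with the data, not minutes later.
    customers = {                    # reference data, preloaded into memory
        "C-1": {"segment": "premium", "home_region": "US-West"},
    }

    def enrich(event: dict) -> dict:
        ref = customers.get(event["customer"], {})
        return {**event, **ref}     # raw event + reference context

    event = {"customer": "C-1", "amount": 89.99, "region": "US-East"}
    enriched = enrich(event)
    if enriched.get("home_region") not in (None, enriched["region"]):
        # Insight available the instant the data is born.
        print("out-of-region purchase:", enriched)

Keeping the reference data in memory is the design choice that makes at-speed, at-scale enrichment possible; a lookup to disk for every event would reintroduce exactly the latency the streaming model is meant to remove.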

I live in the HPE NonStop world, where my clients rely on NonStop systems for handling all the transactional loads you see coming the way of financial institutions, mobile phone operators, healthcare workers and manufacturing lines, 24 X 7 X forever. The ability of the Striim platform to work in this environment is testament that the need of enterprises to know what is coming, and to react accordingly, has become paramount to business health. Selling swimsuits during a blizzard isn’t a winning strategy for any merchant looking to prosper.

In Colorado, much has been done to keep communities informed of change, as it happens so rapidly; that’s just a byproduct of living alongside the continental divide. For NonStop and Striim, the challenges are similar: a byproduct of processing mission critical transactions in real time is the need to be kept informed, as change happens just as rapidly. Make data useful the instant it’s born? Absolutely; maybe it’s time for you to take a closer look at the Striim website for yourself!