MySQL to Google BigQuery using CDC

Tutorial: Migrating from MySQL to BigQuery for Real-Time Data Analytics

In this post, we will walk through an example of how to replicate and synchronize your data from on-premises MySQL to BigQuery using change data capture (CDC).

Data warehouses have traditionally been on-premises services that required data to be transferred using batch load methods. Ingesting, storing, and manipulating data with cloud data services like Google BigQuery makes the whole process easier and more cost effective, provided that you can get your data in efficiently.

The Striim real-time data integration platform allows you to move data in real time as changes are recorded, using a technology called change data capture. This allows you to build real-time analytics and machine learning capabilities on top of your on-premises datasets with minimal impact on the source systems.

Source MySQL Database

Before you set up the Striim platform to synchronize your data from MySQL to BigQuery, let’s take a look at the source database and prepare the corresponding database structure in BigQuery. For this example, I am using a local MySQL database with a simple purchases table to simulate a financial datastore that we want to ingest from MySQL to BigQuery for analytics and reporting.

I’ve loaded a number of initial records into this table and have a script to apply additional records once Striim has been configured to show how it picks up the changes automatically in real time.
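
For reference, here is a minimal sketch of what such a source table and seed data might look like; the columns and types shown are illustrative assumptions, not the exact schema used above:

-- Hypothetical purchases table; adjust the columns to match your own schema.
CREATE TABLE purchases (
  purchase_id  INT AUTO_INCREMENT PRIMARY KEY,
  customer_id  INT NOT NULL,
  amount       DECIMAL(10,2) NOT NULL,
  purchased_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Load a few initial records.
INSERT INTO purchases (customer_id, amount)
VALUES (101, 19.99), (102, 250.00), (103, 7.50);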

Targeting Google BigQuery

You also need to make sure your instance of BigQuery has been set up to mirror the source or the on-premises data structure. There are a few ways to do this, but because you are using a small table structure, you are going to set this up using the Google Cloud Console interface. Open the Google Cloud Console, and select a project, or create a new one. You can now select BigQuery from the available cloud services. Create a new dataset to hold the incoming data from the MySQL database.

Once the dataset has been created, you also need to create a table structure. Striim can perform transformations while the data flows through the synchronization process. However, to make things a little easier here, I have replicated the same structure as the on-premises data source.
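
If you would rather script the table than click through the console, a BigQuery DDL statement along these lines would mirror the hypothetical schema sketched earlier (the dataset name mydataset and the columns are assumptions):

CREATE TABLE mydataset.purchases (
  purchase_id  INT64,
  customer_id  INT64,
  amount       NUMERIC,
  purchased_at TIMESTAMP
);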

You will also need a service account to allow your Striim application to access BigQuery. Open the service account option through the IAM window in the Google Cloud Console and create a new service account. Give the necessary permissions for the service account by assigning BigQuery Owner and Admin roles and download the service account key to a JSON file.
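
If you prefer the gcloud CLI to the console, the equivalent steps look roughly like the following sketch; the service account and project names are placeholders, and the role shown is one reasonable choice for this tutorial:

# Create the service account (names are placeholders).
gcloud iam service-accounts create striim-bq --project=my-project

# Grant it BigQuery access.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:striim-bq@my-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.admin"

# Download the service account key to a JSON file.
gcloud iam service-accounts keys create servicekey.json \
  --iam-account=striim-bq@my-project.iam.gserviceaccount.com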

Set Up the Striim Application

Now you have your data in a table in the on-premises MySQL database and have a corresponding empty table with the same fields in BigQuery. Let’s now set up a Striim application on Google Cloud Platform for the migration service.

Open your Google Cloud Console and open or start a new project. Go to the marketplace and search for Striim. A number of options should return, but the option you are after is the first item that allows integration of real-time data to Google Cloud services.

Select this option and start the deployment process. For this tutorial, you are just using the defaults for the Striim server. In production, you would need to size appropriately depending on your load.

Click the deploy button at the bottom of this screen and start the deployment process.

Once this deployment has finished, the details of the server and the Striim application will be generated.

Before you open the admin site, you will need to add a few files to the Striim Virtual Machine. Open the SSH console to the machine and copy the JSON file with the service account key to a location Striim can access. I used /opt/striim/conf/servicekey.json.
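
One way to get the key onto the VM is with the gcloud CLI; the instance name and zone below are placeholders for your own deployment:

# Copy the key to the VM, then move it into place from an SSH session.
gcloud compute scp servicekey.json striim-vm:/tmp/servicekey.json --zone=us-central1-a
gcloud compute ssh striim-vm --zone=us-central1-a
sudo mv /tmp/servicekey.json /opt/striim/conf/servicekey.json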

Give these files the right permissions by running the following commands:

chown striim:striim <filename>

chmod 770 <filename>

You also need to restart the Striim services for this to take effect. The easiest way to do this is to restart the VM.

Once this is done, close the shell and click on the Visit The Site button to open the Striim admin portal.

Before you can use Striim, you will need to configure some basic details. Register your details, enter the cluster name (I used “DemoCluster”) and a cluster password, and set an admin password. Leave the license field blank to get a trial license if you don’t have one, then wait for the installation to finish.

When you get to the home screen for Striim, you will see three options. Let’s start by creating an app that connects your on-premises database to BigQuery to perform the initial load of data. To create this application, start from scratch in the applications area. Give your application a name, and you will be presented with a blank canvas.

The first step is to read data from MySQL, so drag a database reader from the sources tab on the left. Double-click on the database reader to set the connection string with a JDBC-style URL using the template:

jdbc:mysql://<server_ip>:<port>/<database>
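
For example, with a hypothetical server address and database name, the connection URL might look like:

jdbc:mysql://203.0.113.10:3306/testdb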

You must also specify the tables to synchronize — for this example, purchases — as this allows you to restrict what is synchronized.

Finally, create a new output. I called mine PurchasesDataStream.

You also need to connect your BigQuery instance to your source. Drag a BigQuery writer from the targets tab on the left. Double-click on the writer and select the input stream from the previous step and specify the location of the service account key. Finally, map the source and target tables together using the form:

<source-database>.<source-table>,<target-database>.<target-table>

For this use case, this is just a single table on each side.
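
With the hypothetical names used in this walkthrough (a testdb source database and a mydataset BigQuery dataset standing in for the target database), that mapping might look like:

testdb.purchases,mydataset.purchases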

Once both the source and target connectors have been configured, deploy and start the application to begin the initial load process. Once the application is deployed and running, you can use the monitor menu option on the top left of the screen to watch the progress.

Because this example contains a small data load, the initial load application finishes pretty quickly. You can now stop this initial load application and move on to the synchronization.

Updating BigQuery with Change Data Capture

Striim has pushed your current database up into BigQuery, but ideally you want to update this every time the on-premises database changes. This is where the change data capture application comes into play.

Go back to the applications screen in Striim and create a new application from a template. Find and select the MySQL CDC to BigQuery option.

Like the first application, you need to configure the details for your on-premises MySQL source. Use the same basic settings as before. However, this time the wizard adds the JDBC component to the connection URL.

When you click Next, Striim will ensure that it can connect to the local source. Striim will retrieve all the tables from the source. Select the tables you want to sync. For this example, it’s just the purchases table.

Once the local tables are mapped, you need to connect to the BigQuery target. Again, you can use the same settings as before by specifying the same service key JSON file, table mapping, and GCP Project ID.

Once the setup of the application is complete, you can deploy and turn on the synchronization application. This will monitor the on-premises database for any changes, then synchronize them into BigQuery.

Let’s see this in action by clicking on the monitor button again and loading some data into your on-premises database. As the data loads, you will see the transactions being processed by Striim.
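
The load script can be as simple as a few more inserts against the source table; the values here are illustrative:

-- Each new row is picked up by the CDC reader and written to BigQuery.
INSERT INTO purchases (customer_id, amount) VALUES (104, 42.00);
INSERT INTO purchases (customer_id, amount) VALUES (105, 12.75);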

Next Steps

As you can see, Striim makes it easy for you to synchronize your on-premises data from existing databases, such as MySQL, to BigQuery. By constantly moving your data into BigQuery, you could now start building analytics or machine learning models on top, all with minimal impact to your current systems. You could also start ingesting and normalizing more datasets with Striim to fully take advantage of your data when combined with the power of BigQuery.

To learn more about Striim for Google BigQuery, check out the related product page. Striim is not limited to MySQL to BigQuery integration, and supports many different sources and targets. To see how Striim can help with your move to cloud-based services, schedule a demo with a Striim technologist or download a free trial of the platform.

Why Choose Striim + Snowflake

With greater speed, ease, and flexibility, Snowflake’s cloud data warehouse helps you gain meaningful insights by providing the performance and simplicity that traditional data warehouse offerings could not support.

Adopting a data warehouse in the cloud with Snowflake requires a modern approach to the movement of enterprise data. This data is often generated from diverse data sources deployed in various locations – including on-prem data centers, major public clouds, and devices. Snowflake users need real-time data movement capabilities to realize the full potential of data warehousing in the cloud, and benefit from more meaningful operational intelligence.

While there are many vendors that provide data movement to Snowflake, Striim’s real-time, enterprise-grade streaming ETL solution offers advanced capabilities that other vendors can’t match, including the ability to:

  • Bring a wide range of data sets (including security log data, Kafka/messaging, IoT data, OLAP and OLTP) in a consumable format to Snowflake to achieve rich, timely, and reliable insights fast
  • Use robust, reliable, low-impact change data capture (CDC) from major enterprise databases
  • Aggregate, filter, denormalize, enrich, and mask real-time data in-flight using a SQL-based language before delivering it to Snowflake to rapidly gain time-sensitive insights
  • Combine and correlate machine data, OLAP, OLTP, and IoT data with other data sources in-flight for complete and rich operational intelligence
  • Perform online migration from existing on-prem data warehouses (such as Oracle Exadata or Teradata) to Snowflake with minimized interruption and risk
  • Offer an enterprise-grade solution designed for mission-critical, high data volume environments with built-in HA, scalability, exactly-once-processing (no data loss or duplicates) and security, all in a patented distributed platform

In fact, many current Striim + Snowflake customers previously deployed a solution from another vendor, only to find that the solution did not meet their needs for scalability, reliability, in-flight processing, or, simply, data access.

Let’s drill down on the ways Striim supports Snowflake’s advanced analytics applications with modern, enterprise-grade streaming ETL, and further allows customers to gain operational value from their Snowflake solutions.

Moving data in real time from diverse sources

Data ingestion providers that only collect data from a limited range of sources cannot support complete and rich operational intelligence in Snowflake. The Striim platform ingests real-time streaming data from a variety of sources out of the box, including databases and data warehouses (such as Oracle, SQL Server, HPE NonStop, MongoDB, Amazon RDS, and MySQL), log files from security devices and other systems, sensors, messaging systems, and Hadoop solutions.

Avoiding batch ETL-related inefficiencies

Data integration providers that use traditional batch ETL for the movement of data into Snowflake are unable to support real-time operational intelligence and time-sensitive analytics use cases in Snowflake. When users adopt an ELT architecture by pairing replication solutions with another tool for in-target transformations, this architecture creates complexities, especially during process recovery. Striim offers an end-to-end solution with a simplified architecture to bring a wide range of data in real time.

Minimizing source impact and interruption

While there are providers that offer very simplistic movement of file data and do not support CDC from databases, these solutions cannot scale and may require additional products to be integrated into the configuration. Using Striim, businesses can easily adopt a cloud data warehouse, with online migration from existing data warehouses with minimal disruption and risk.

Assuring security and reliability

Striim is an enterprise-grade solution with built-in HA, security, scalability, and exactly-once processing (no data loss or duplication) for business-critical production systems.

Applying transformations and enrichments

Striim applies filtering, transformations, aggregations, masking, and enrichment using static or streaming reference data in real time – as the data is being delivered into Snowflake in a consumable format – to accelerate the delivery of rich, timely, and reliable insights.
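
As a rough illustration only, an in-flight transformation in Striim’s SQL-based language might look something like the continuous query below; the stream names, fields, and masking shown are invented for this sketch, not taken from a real deployment:

-- Hypothetical continuous query: filter small orders and mask card numbers in flight.
CREATE CQ FilterAndMaskCQ
INSERT INTO SnowflakeReadyStream
SELECT orderId,
       customerId,
       amount,
       '****' AS cardNumber   -- masking shown schematically
FROM   OrderStream
WHERE  amount > 100;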

Striim + Snowflake Customer Use Cases

To illustrate the above points, we would like to share the stories of a couple of Snowflake customers that chose Striim to gain operational intelligence for critical operations.

A Leading International Cruise Line

  • Ingests real-time POS data from spas, casinos, and stores to an enterprise data warehouse on Snowflake to generate near real-time offers for travelers.
  • Striim moves spending data from Oracle and SQL Server databases and from GoldenGate trail files to Snowflake.
  • The solution provides real-time promotional offers and discounts to customers based on their spending behavior to improve both satisfaction and revenue.

A European HR Solutions Provider

  • Ingests data from Oracle and SQL Server databases to Snowflake using real-time CDC.
  • Chose Striim for low data latency with built-in scalability, security, and reliability.
  • A zero-maintenance solution and a pay-per-use model were also key considerations.

By streaming enterprise data to Snowflake with built-in scalability, security, and reliability, Striim simplifies the adoption of a modern, cloud data warehouse for time-sensitive, operational decision making.

We invite you to learn more about Striim’s Snowflake offering by visiting our Snowflake solutions page. Feel free to contact us if we can help you reliably move data in real time to Snowflake.

Real-Time Data Integration to Google Cloud Spanner

Striim Announces Real-Time Data Migration to Google Cloud Spanner

The Striim team has been working closely with Google to deliver an enterprise-grade solution for online data migration to Google Cloud Spanner. We’re happy to announce that it is available in the Google Cloud Marketplace. This PaaS solution facilitates the initial load of data (with exactly-once processing and delivery validation), as well as the ongoing, continuous movement of data to Cloud Spanner.

The real-time data pipelines enabled by Striim from both on-prem and cloud sources are scalable, reliable and high-performance. Cloud Spanner users can further leverage change data capture to replicate data in transactional databases to Cloud Spanner without impacting the source database, or interrupting operations.

Google Cloud Spanner is a cloud-based database system that is ACID compliant, horizontally scalable, and global. Spanner is the database that underlies much of Google’s own data collection, and it has been designed to offer the consistency of a relational database with the scale and performance of a non-relational database.

Migration to Google Cloud Spanner requires a low-latency, low-risk solution to feed mission-critical applications. Striim offers an easy-to-use solution to move data in real time from Oracle, SQL Server, PostgreSQL, MySQL, and HPE NonStop to Cloud Spanner while ensuring zero downtime and zero data loss. Striim is also used for real-time data migration from Kafka, Hadoop, log files, sensors, and NoSQL databases to Cloud Spanner.

While the data is streaming, Striim enables in-flight processing and transformation of the data to maximize usability of the data the instant it lands in Cloud Spanner.

To learn more about Striim’s Real-Time Migration to Google Cloud Spanner, read the related press release, view our Striim for Google Cloud Spanner product page, or provision Striim’s Real-Time Data Integration to Cloud Spanner in the Google Cloud Marketplace.

CDC to Cloudera

Real-Time Database CDC to Cloudera

As Cloudera increasingly invests in its Enterprise Data Cloud, the ability to move data via change data capture (CDC) to Cloudera has never been more important. Database CDC to Cloudera helps Cloudera users gain more operational value from their analytics solutions by loading critical database transactions in real time.

The timely ingestion of large volumes of data to Cloudera is imperative to realizing the true operational value of the platform. The explosion in the amount of data generated and the variety of data formats residing in traditional relational databases and data warehouses requires an ingestion process that is real-time and scalable.

Traditional methods or batch ETL uploads fall short in today’s business timeframes. Latency renders operational and transactional data obsolete and unable to provide Cloudera solutions with the real-time data required for operational intelligence and reporting. The negative performance impact of batch processing on transactional databases is also a major reason to move only the changed data in a continuous fashion.

To address the concerns mentioned above, there is a solution to ingest changed data in real time from databases: CDC to Cloudera from Striim. This enterprise-grade streaming data integration solution for Cloudera supports high-volume environments and allows users to move real-time data from a wide variety of sources without impacting source systems.

By moving only change data – continuously and with essential scalability – Cloudera users can rely on the Striim platform for the delivery of data. Data can be loaded as-is, or with a variety of processing, transformations or enrichments applied, all with sub-second latency and in the right format to support specific use cases.

A one-time initial load with continuous change updates ensures up-to-the-second data delivery to Cloudera to support operational decision making. Striim also offers real-time pipeline monitoring with alerting, which is particularly important in the context of mission-critical solutions.

Striim currently offers low-impact, log-based CDC to Cloudera from the following data sources: Oracle, Microsoft SQL Server, MySQL, PostgreSQL, HPE NonStop SQL/MX, HPE NonStop SQL/MP, HPE NonStop Enscribe, MongoDB, and MariaDB. All of these databases can be accessed via Striim’s easy-to-use Wizards and drag-and-drop UI, speeding delivery of CDC to Cloudera solutions. In addition, Striim offers pre-built starter integration applications, such as PostgreSQL CDC to Kafka, that can be leveraged to significantly reduce development efforts of any CDC-based application.

For more information on Striim’s solutions for real-time database CDC to Cloudera, please visit our Cloudera solutions page at: https://www.striim.com/partners/striim-for-cloudera/

If you’d like a brief walk-through of Striim’s CDC to Cloudera offering, please schedule a demo.

Enabling Real-Time Data Warehousing with Azure SQL Data Warehouse

In this post, we will discuss how to enable real-time data warehousing for modern analytics through streaming integration with Striim to Azure SQL Data Warehouse.

Azure SQL Data Warehouse provides a fully managed, fast, flexible, and scalable cloud analytics platform. It enables massively parallel processing and elasticity, working with Azure Data Lake Store and other Azure services to load raw and processed data. However, much of your data may currently reside elsewhere – locked up on-premises, in a variety of clouds, in Oracle Exadata, Teradata, Amazon Redshift, operational databases, and other locations.

A requirement for real-time data warehousing and modern analytics is to continuously integrate data into Azure cloud analytics so that you are always acting on current information. This new hybrid cloud integration strategy must enable the continuous movement of enterprise data – to, from, and between clouds – providing continuous ingestion, storage, preparation, and serving of enterprise data in real time, not batch. Data from on-prem and cloud sources need to be delivered into multiple Azure endpoints, including a one-time load and continuous change delivery, with in-flight processing to ensure up-to-the-second information for analytics.

Striim is a next-generation streaming integration and intelligence platform that supports your hybrid cloud initiatives, enabling integration with multiple Azure cloud technologies. Please watch the embedded video to see how Striim can provide continuous data integration into Azure SQL Data Warehouse via Azure Data Lake Store through a pipeline for the ingestion, storage, preparation, and serving of enterprise data.

  • Ingest. Striim makes it easy to continuously and non-intrusively ingest all your enterprise data from a variety of sources in real time. In the video example, Striim collects live transactions from Oracle Exadata orders table.
  • Store. Striim can continuously deliver data to a variety of Azure targets including Azure Data Lake Store. Striim can be used to pre-process your data in real time as it is being delivered into the store to speed downstream activities.
  • Prep & Train. Azure Databricks uses the data that Striim writes to Azure Data Lake Store for machine learning and transformations. Results can be loaded into Azure SQL Data Warehouse, and the machine learning model could be used by Striim for live scoring.
  • Model & Serve. Striim orchestrates the process to ensure fast, reliable, and scalable PolyBase-based delivery to Azure SQL Data Warehouse from Azure Data Lake Store, enabling analytics applications to always be up to date.

See how Striim can enable your hybrid cloud initiatives and accelerate the adoption of Azure SQL Data Warehouse for flexible and scalable cloud analytics. Read more about Striim for Azure SQL Data Warehouse. Get started with Striim now with a trial download on our website, or via Striim’s integration offerings in the Azure Marketplace.

Striim’s Latest Releases Boost Cloud Integration Capabilities, Ease of Use, and Extensibility – Part 1

The Striim team has been busy! With a focus on cloud integration and extensibility of the Striim platform, we have delivered two new releases in the last two months. We are excited to share with you what’s new.

In late June 2018, we released version 3.8.4 which brought several features that improve the manageability and the extensibility of the platform, while making it easy for you to offload critical analytics workloads to the cloud. Earlier this month, we released Striim version 3.8.5 which includes a platform as a service (PaaS) offering for real-time data integration to Azure SQL Data Warehouse. In this blog post, you can find an overview of the new features of the latest Striim releases. Let’s start with cloud integration.

Cloud Integration with a Broader Set of Targets

Available as a cloud service, Striim offers continuous real-time data movement with scalability, enabling faster time to market so you can reap the agility and cost-savings benefits of cloud-based analytics. Striim can now deliver real-time data to additional cloud services, such as Azure SQL Data Warehouse, Azure Database for PostgreSQL, Azure Database for MySQL, and Google Cloud SQL. The solutions for Azure SQL DW, Azure SQL DB, Azure HDInsight and Azure Storage are also available as subscription-based services in the Azure Cloud. If you are an Azure user, you can get started with these solutions in minutes.

As you may have read in prior blog posts, Striim is known for its low-impact change data capture (CDC) feature to ingest real-time data from enterprise databases. With version 3.8.5, we’ve also introduced an Incremental Batch Reader that can collect low-latency data in mini-batch mode from databases that do not support CDC. The source databases for incremental batch loading include Teradata, Netezza, and any other JDBC-compliant database. One prevalent use case for this new feature is enabling a near real-time data pipeline from existing data warehouses to Azure SQL Data Warehouse to ease and accelerate the transition of analytics workloads to the cloud.

With a broad and continually growing set of cloud targets, Striim allows you to create enterprise-grade, real-time data pipelines to feed different layers of your cloud-based solutions such as:

  • Analytics services and data warehousing solutions, such as Azure SQL Data Warehouse and Google BigQuery, that directly support end users with timely intelligence
  • Data management and analytics frameworks, such as Azure HDInsight, which support interactive analysis or creating machine learning models
  • Storage solutions, such as Amazon S3 or Azure Data Lake Storage (ADLS), from on-premises and other cloud-based data sources in real time
  • Staging solutions, such as HDFS, S3, and Azure Data Lake Storage, which are used by other cloud services and components

In short: to get the most out of your cloud-based analytics, you need continuous data flows to different components of your architecture. Striim supports all key layers of your cloud-based analytics architecture with enterprise-grade solutions to enable continuous data flows where needed.

In Part 2 of this blog post, I will discuss several new features that bolster the ease of use and extensibility of the Striim platform. In the meantime, I invite you to contact us to schedule a demo, or experience Striim v. 3.8.5 by downloading the Striim platform.

Striim Joins Microsoft, Statistica, Fujitsu, Dell at Hannover Messe

The Striim Team is excited to join with key partners including Microsoft, Statistica, Dell and Fujitsu at Hannover Messe 2017. Through joint demos, presentations and interactive experiences, Striim is showcasing a wide variety of real-time IoT integration and analysis solutions to address the needs of Industrie 4.0.

Taking place April 24-28, 2017 in Hannover, Germany, Hannover Messe is the world’s leading industrial trade show. This year’s lead theme is Integrated Industry, and the show features over 500 Industrie 4.0 solutions. Look for Striim at the Striim + Statistica Booth – Digital Factory, Hall 6, Booth G52.

Participation with Microsoft

We’ve joined with Microsoft to highlight a demo of our integrated solution enabling the continuous exchange and analysis of IoT data across all levels of an IoT infrastructure. This solution, which provides an edge-to-cloud smart data architecture, helps fulfill the Industrie 4.0 promise of enabling industries to be more intelligent, efficient and secure. To learn more, please click on the following links to watch a short video and read the related press release. Or stop by the Microsoft booth in the Digital Factory, Hall 7, Booth C40 to see the demo.

For more information regarding integration of the Striim platform with Microsoft IoT Azure technologies, check out the Striim solutions on the Microsoft Azure Marketplace.

Presentation in Microsoft Booth

Steve Wilkes, founder and CTO of Striim, will present the following session in Microsoft’s booth:

Ensure Manufacturing Quality, Safety and Security
Through Digital Transformation at the Edge
Wednesday, April 27
10:00am local time (GMT+2)
Digital Factory, Hall 7, Microsoft Booth C40

Participation with Microsoft, Statistica, Dell

Striim, Statistica, Microsoft and Dell have joined their IoT hardware and software to enable digital transformation through IoT, integrating machines, devices, sensors and people. A live and interactive demo at the Striim booth will feature a fully functional, model-sized factory floor that provides true-to-life sensor readings and events, feeding an end-to-end solution for real-time data processing, analytics, visualization and statistical analysis/model building. Stop by the Striim + Statistica booth in the Digital Factory, Hall 6, Booth G52 to experience first-hand the relationship between a factory’s IoT systems, and the real-time integration and analysis of the IoT data. To learn more, click here to view a short video.

Participation with Fujitsu

Furthermore, we’ve joined forces with Fujitsu to bring an advanced security appliance for discrete manufacturing companies using IoT edge analytics. The Striim platform, powered by Fujitsu servers, analyzes machine log data with sensor data from physical devices for reliable and timely assessment of any potential breach affecting the factory floor. To learn more about the Striim Fujitsu Cybersecurity Appliance, click here to watch a short video, or stop by the Striim + Statistica booth.

Please reach out if you are interested in scheduling a demo or a briefing during Hannover Messe, or would like additional materials to learn more.

You may also wish to download Gartner’s 2017 Market Guide to In-Memory Computing Technologies, and see why Striim is one of only a few vendors to address 4 out of 5 areas of In-Memory Computing, and the only vendor to do so in a single, end-to-end platform.

Striim Integrates with Google BigQuery

Enables easy ingestion, change data capture, processing and delivery to BigQuery

As companies move toward hybrid cloud infrastructures, they need to be able to easily move data from on-premises to cloud environments in real time. With this in mind, the Striim Platform seamlessly integrates with BigQuery, Google’s fully managed, petabyte-scale, low-cost analytics data warehouse.

With Striim, users can ingest and process data from virtually any data source – including transactional data from enterprise databases such as Oracle, MS SQL Server, and MySQL – and push that data in real time into their BigQuery environment.

Because BigQuery is serverless, users can quickly and easily deploy large-scale databases and stream massive volumes of data into BigQuery to enable real-time analysis of the data.

Striim is the de facto standard for streaming Change Data Capture (CDC), enabling fast ingestion of transactional data from enterprise databases into a wide variety of Cloud, Big Data and traditional data stores. Striim can also ingest data from log files, message/event queues, IoT sensors, etc., making it the perfect data integration platform for heterogeneous environments.

As the data is streaming, Striim is able to filter and pre-process data, making it more valuable in milliseconds. The Striim platform then delivers that processed data in real time into BigQuery, empowering analytics in a variety of different verticals such as financial services, retail, and IoT.

The Striim platform makes it easy to move data from virtually any source into BigQuery, in real time. Striim provides a Wizard to build data pipelines and connect to Google’s data warehouse in hours. With Striim’s drag-and-drop user interface and SQL-like programming language, it’s simple to iterate and tailor this streaming integration solution to the exact needs of the project. And Striim’s enterprise-grade features around security and reliability align with BigQuery’s replicated storage strategy, ensuring an environment you control.

In a Realtime World, NonStop Customer Experience Can Change in an Instant!

The highways are a mess at this time of year. Yes, it’s winter in the Rockies, and television and the Internet are providing a steady flow of updates about the weather, even as vehicles’ infotainment centers highlight changing traffic conditions. With so much information coming so quickly, it’s not easy to process it all while battling traffic. Do I exit this freeway or should I stick with it? Should I stay south or perhaps turn west?

Facing such a flood of information is not only confusing but also distracting, and operating a vehicle in inclement weather isn’t helped when drivers are trying to process all the information coming their way. Driving is a good example of the demands that come with living in a realtime world: timely information remains important, but all too often it arrives as streams of data where a single missed item can lead a driver astray. It takes very little to turn a minor mishap into a deadly encounter.

The IT world of hybrid computers, clouds, and SLAs demands realtime.

In the realtime IT world, systems, platforms, operating systems, middleware, and applications are all providing updates about their operational status. A constant barrage of data makes tracking the performance of an application difficult; who can tell whether basic SLA metrics are being met? Add the complexity that comes with hybrid computers, not to mention the use of clouds, and the task befalling those responsible for ensuring customers are being served in a timely manner appears more like black magic than pure science. There’s nothing that changes a customer’s experience faster than misinformation, or worse, no information at all.

There’s no escaping the reality that whenever the subject of hybrids is raised, at a minimum there will be two of everything. Correlating, and then responding when required, has just moved everyone in IT further up the complexity curve. Each processor will have system logs of one type or another, application logs, database logs and so on – but in the end, they will need to conform to SLAs and that’s when the difficulties arise. And in the realtime IT world, where hybrids exist, there’s a real need for business users and company analysts to leverage the capabilities of essentially self-service products like WebAction.

In a November 2014 report, Data Preparation Is Not an Afterthought, Gartner analysts note that companies’ leadership needs to “Use self-service interactive data preparation tools to enhance analyst productivity.” Furthermore, according to Gartner, these leaders should “Introduce data preparation techniques that assess and improve the quality of data from new and diverse data sources.” In a subsequent November 2014 report on what companies can expect to see appearing on the BI landscape, Predicts 2015: Power Shift in Business Intelligence and Analytics Will Fuel Disruption, other Gartner analysts noted that it’s important to “Evaluate new product offerings in self-service data preparation and smart data discovery against the road map of your current BI platform vendor to determine whether an integrated or best-of-breed approach will best meet future needs”.

Realtime process monitoring, alerting, and workflow enablement with NonStop

For the NonStop community, the emergence of hybrid computers based on NonStop X and Linux on x86 Xeon blades will require self-service products like WebAction to better ensure SLAs are being met. WebAction is well suited to the task of SLA monitoring of all that Gartner describes – being able to consume multiple streams is a fundamental strength of WebAction. When it comes to “realtime process monitoring, alerting, and workflow enablement across hybrid systems including NonStop X and Linux X hybrids,” said WebAction Cofounder and EVP, Sami Akbay, “This is just one area where WebAction excels. We see this service as part of the future landscape for all enterprise systems.”

In winter our highways may be a mess and traversing a city blanketed in snow is a trying experience. However, in the realtime IT world, this doesn’t have to be the case and the ever-growing streams of data being produced from within the data center prove easy for WebAction to consume – there’s little reason why anyone should be left uninformed about the experiences being enjoyed by their customers. None whatsoever – and it only takes one instance for customers to jump to the competition, and that’s not an experience any CIO wants to have!

In Novel Ways … WebAction for NonStop X Will Change the Way Business Operates!

We all continue to count down the days before the latest NonStop systems begin to ship in volume. NonStop X, as the members of the family of NonStop systems based on the Intel x86 architecture are now being called, brings NonStop back into the mainstream, with the last vestige of proprietary hardware completely removed. To all CIOs demanding x86 solutions, HP has responded, and in ways few expected just a year ago – you want clusters based on x86? Well, NonStop X is the ultimate cluster in a box, and it continues to be every bit as NonStop as all of its predecessors of the past four decades.

In his post of January 22, 2015, WebAction, Big Data Stream Analytics for HP NonStop, WebAction’s Jonathan Geraci made the observation that, “With the arrival of HP NonStop X, companies will have the opportunity to use NonStop in novel ways.” Furthermore, posted Geraci, “The continuous integration of realtime and historical information provides up-to-the-millisecond visibility into customer experience and business health. Identify issues instantaneously and in-time to effectively resolve them.” NonStop systems have always been about ease of use and ease of management, but in supporting x86 the value proposition of NonStop takes a big turn in the right direction – lower overall Total Cost of Ownership (TCO).

Elsewhere in recent posts I made reference to Australian industry watcher and commentator Len Rust. In his January 27, 2015, Rust Report newsletter, his editorial under the heading To the cloud and infinity included the key observation, “Although technology trends seem to come and go with frightening regularity, some have a lasting impact on business. These are ones that change the way businesses operate and provide dramatic improvement for those that adopt them.” While written about Cloud Computing, it can be just as easily said about Big Data. And when it comes to novel ways, nothing may be as novel as seeing NonStop actively engaged in solutions that make a lasting impact on business.

NonStop X and Realtime

WebAction support of NonStop X is a given – the initial steps taken in building WebAction included support for feeds coming from NonStop, as any move towards supporting realtime necessitated including transactional systems, and NonStop is among the premier platforms when it comes to supporting the most mission-critical of transactional systems. Said another way, leaving out feeds from NonStop would have only lessened the appeal of the product, a situation all involved with WebAction have now avoided, to their credit.

However, using NonStop in novel ways can benefit from the presence of code stubs, applications and templates as it is well known that facing a blank screen isn’t the best way to come up with a real world solution providing business value. In explaining how WebAction works to the majority of NonStop users, it takes a while (and numerous examples) before it registers just how powerful a product WebAction truly is – identifying changing consumer behavior as it happens and then modifying applications on the fly to better appeal to this changed constituency is a godsend for every business.

Top 10 Companies to Watch in 2015

In his post of January 8, 2015, The Bloor Group Lists WebAction as “One of the 10 Companies to Watch in 2015”, Jonathan Geraci posts of how the Bloor Group likes the fact that “WebAction provides a library of pre-built applications which are the latest development in the evolution of stream processing designed to (a) apply business logic for relevant, actionable and predictable information in real-time, (b) effortlessly scale-out high volumes of different types of business data in-memory, and (c) deliver intelligence in an accessible, easy, and secure manner.” In other words, extending the novelty of Big Data applied to realtime transactional processing by providing usable applications.

Not everyone relishes designing something from scratch. Nor do even the smartest folks like sitting in front of a blank screen. At a recent presentation given by a vendor specializing in monitoring, when it came to designing new dashboards, the presenter didn’t hold back in making sure you borrow from what already works. This will be key for the NonStop community as they look to take advantage of the new NonStop X systems; changing the way businesses operate will be a lot easier with WebAction and with the apps now on offer. And isn’t this what every business today is looking for as it faces a constantly changing and often very fickle consumer community?

WebAction, Big Data Stream Analytics for HP NonStop

WebAction, partnered with HP, brings realtime stream analytics that work in harmony with the NonStop ecosystem. HP NonStop is the leader in continuous availability systems, and the most valuable data in the enterprise resides in NonStop. With the arrival of HP NonStop X, companies will have the opportunity to use NonStop in novel ways. WebAction makes Big Data accessible to NonStop and also makes NonStop transaction data accessible to other applications in the Big Data ecosystem.

High-Velocity Big Data Analytics

From the data and applications veterans at WebLogic and GoldenGate, WebAction is the next step in the evolution of stream analytics. WebAction is the most comprehensive realtime stream analytics platform. Quickly build tailored enterprise-scale Big Data applications that assimilate, correlate, and analyze disparate, high-velocity data. The continuous integration of realtime and historical information provides up-to-the-millisecond visibility into customer experience and business health. Identify issues instantaneously and in time to effectively resolve them. The most valuable data in the enterprise is now immediately and securely available for use by people in departments across the enterprise such as operations, customer experience, marketing, risk, and supply chain.

Example Realtime Use Cases

So what can be done with streaming Big Data in the context of NonStop? Some general themes come to mind as we think about use cases; a sketch of one such pattern follows the list.

  • Anomaly Detection: Pulling historical norms from data stores to compare against signatures of current events, triggering real-time alerts when issues arise
  • Predictive Analytics: Analyzing real-time data streams to build predictions about future events. Predictions are refined as time progresses.
  • Stream Filtering: Filtering by either removing data fields from a stream, or by shortening fields in a stream
  • Stream Enrichment: Adding history or context data to make a piece of data more robust or otherwise actionable
  • Stream Aggregation: Monitoring of specific data streams for specific events
  • Stream Correlations: Combining 1-n data streams and data sets using specific data elements, in real time.
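
As a purely illustrative sketch, here is what an enrichment-plus-anomaly-detection pattern might look like in a generic SQL-like stream-processing syntax; every name below is invented:

-- Enrich a POS event stream with customer history,
-- then alert on anomalously large transactions.
CREATE CQ EnrichAndDetectCQ
INSERT INTO AlertStream
SELECT p.txnId, p.amount, c.avgSpend
FROM   PosStream p
JOIN   CustomerHistory c ON p.customerId = c.customerId
WHERE  p.amount > 10 * c.avgSpend;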

To learn more about the WebAction Platform, read the Bloor Group paper: The Genesis of the Real-time Enterprise: How WebAction Enables Truly Responsive IT Applications

Is There a Hybrid in Your Future Based on NonStop?

When HP Vice President & General Manager, Integrity Server, Randy Meyer presented to a gathering of the Canadian NonStop community in October 2013, the theme of his talk resonated with all present. Meyer elected to build on key focus items that HP CEO Meg Whitman had highlighted only a few months earlier during her presentation at the 2013 HP Discover event in Las Vegas. Whitman spoke about Security, Mobility, Clouds and Big Data and highlighted them as the key areas of interest that would anchor HP’s programs going forward. On the other hand, Meyer pointed to just Hybrid Computing and Big Data and told the audience, when it comes to HP NonStop programs, “We want to make the investment that helps our customers leverage these Megatrends!”

A few weeks later, at the November, 2013, NonStop Advanced Technical Boot Camp Meyer unveiled the NonStop support for Intel x86 architecture where he included a video cameo by Whitman. “Today, enterprises operate in a world where the demand for continuous application availability is growing exponentially. The need to choose the right computer for the right workload at the right economics has never been so important … we are on the path to redefine mission critical computing,” said Whitman. Then tying in with the bigger message from HP, Whitman concluded with “Our NonStop customers truly make it matter!”

NonStop X and Streaming Analytics

Digging deeper into the weeds, as I have been doing all January, I have been asking HP executives about plans for NonStop X – the family of products becoming generally available in mid-March that are all based on the Intel x86 architecture – and there’s every indication that there is work already being done to roll-out hybrid systems. With common x86 blades being shared by Unix and Linux systems as well as NonStop, there are apparently HP enterprise chassis being tested that include a mix of NonStop and Linux blades in place, according to one HP source, so as to deliver “the right computer for the right workload at the right economics”.

Recently, Justin Simonds, Master Technologist at Hewlett-Packard, posted a comment to LinkedIn’s Tandem User Group, in response to the discussion that began with the posting A twofold test … with no prizes awarded (unfortunately) but of value all the same. When it comes to the direction Simonds expects to see the new NonStop X head, second from the top of his list for NonStop is “the tight integration with Linux for composite type applications, where portions of a service run on Linux and portions run on NonStop with NonStop overseeing the service.” Could this become a growth area for HP and for NonStop?

HP NonStop Hybrid Opportunities

We hear a lot about hybrid clouds, where enterprises integrate public clouds with their own private, or more often, managed clouds, but the story of hybrids has deeper roots. Anyone familiar with IBM mainframes knows how IBM today ships a mix of zOS and zLinux in the one mainframe. For HP to be planning the release of enterprise systems that are a mix of NonStop and Linux seems completely rational – there’s just too much software coming to market for Linux that would benefit from either the NonStop SQL database or the permanent availability of the transactional system on NonStop. No matter the final decisions taken on usage, there are plenty of opportunities for such a hybrid computer, including a greater presence in completely new verticals.

As for WebAction, the deployment of its product on an adjacent Linux system (all part of the NonStop hybrid) would be a likely boon for WebAction. This is a possibility I highlight in the upcoming January issue of the eNewsletter, Tandemworld, where I quote WebAction Cofounder and EVP, Sami Akbay. “At WebAction we agree with Justin that there are verticals where NonStop X should make it easier for HP to establish a greater presence,” said Akbay. “We are especially interested in these new growth areas. When it comes to the NonStop community, where we excel is in quickly correlating and aggregating multiple streams of data – including what is being posted to logs on the NonStop application – so that developing patterns or trends can be quickly recognized and then acted upon.”

The signs have been observed for some time, and even as speculation persists, this is a real case of there being fire amidst all the smoke. NonStop as part of hybrids appears inevitable and, with that, further opportunities for WebAction; ultimately, this may prove to be the opening WebAction can leverage to penetrate the bigger NonStop market. These are weeds I am only too happy to go digging into even more deeply!

For NonStop Users, No Sleep Lost from Deploying WebAction!

When it comes to truisms in my family the ones that stand out include never making our bed when we are tired, never cooking when we are hungry and never buying a car when we need one. Clearly, performing any of these tasks when not necessary has proven the more beneficial as well as being the least taxing on our sleep patterns, our waistlines or our pockets.

For CIOs, particularly those who embrace NonStop systems in support of mission critical applications, some other truisms apply. Never upgrade your system when it’s so “old and tired” that there are few specialists left in the data center who know anything about it; never try throwing all your application “ingredients” together under a new services paradigm when new SLAs have just been negotiated; and never start looking at security when it becomes clear you have to! Sounds simple enough, but today when so much is in transition and when there’s so much external pressure to embrace more modern solutions, the temptation to do all of the above may prove hard to resist.

Awareness that change is needed is just one byproduct of using big data analytics – highly skilled data scientists within your enterprise will recognize the symptoms long before CIOs initiate any knee-jerk responses. However, when it comes to the NonStop user community, are commitments being made to ensure these data scientists are well-educated about NonStop? In the post of Jan 5, 2015, to this blog, A Solution for the Data Scientist Shortage of 2015, it was noted that, “As more companies explore the potential of big data analytics for competitive advantage, operational intelligence, risk management, customer experience, and marketing opportunities, the demand for skilled engineers is compounding at an alarming rate.”

Challenges to Maintaining a Competitive Advantage for NonStop Users

Perhaps more daunting for the NonStop user community, even when investments in data scientists familiar with NonStop have been made, is the recent note from McKinsey (according to CNBC: Big data’s big stumbling block) that asks, “What if number crunchers aren’t enough? After all, if a great insight derived from advanced analytics is too complicated to understand, business managers just won’t use it.” And herein lies an important issue for everyone in the NonStop community. As a group, NonStop users are slow to adopt new technologies – the nature of the applications running on NonStop often deters CIOs from making changes – so what if the business owners of solutions on other platforms become better informed about their clients’ changing usage patterns long before anyone responsible for NonStop has picked up on a developing trend?

This is precisely why WebAction is including NonStop systems in their solution – simple-to-build-and-deploy Data Driven Applications. It lessens the need for data scientists even as it makes it a lot easier to communicate to management exactly what is going on. Even with the visible reluctance by CIOs to mess with their NonStop systems, the non-intrusive nature of WebAction makes adoption rather painless. And this will prove to be an action they will not lose sleep over!

Coincidence? Surely Not. WebAction Intersects with NonStop to Give You a Better Steak!

By chance this week I happened upon the website of a well-known steakhouse – one I have often visited, although not my absolute top pick. On their website I was greeted with a personalized offer to pair my next steak dinner with one of three really good wines, all deeply discounted. A quick exploration of what it entailed only lasted a few minutes before I moved on to something else, but when I opened my email inbox, there was an additional enticement to the same restaurant. Simply from casually browsing the website, I had become a target for even more marketing emails.

For those working with Big Data, this would appear ho-hum. A simple example of correlating where I lived and where I was at the time I went online combined with collected history of previous meals and tailoring something just for me – a fine 22oz bone-in rib eye with a bottle of Silver Oak. Clearly a good setup for a dinner for two, and at a price that I almost gave in to – if it wasn’t so cold outside I may have just taken them up on the offer.

At this time of the year the NonStop community is becoming somewhat jaded from the many user events being held, but fortunately with just a few weeks remaining before the year winds down, there’s only a couple more – DUST in Arizona and InNUG in India. However, there’s been plenty of upsides and last week I posted about joining the WebAction team in presenting to those attending the annual NonStop Technical Boot Camp. What came from this, almost by coincidence, is the recognition from those I talked to – users and vendors alike – that the days of keeping NonStop isolated from major initiatives (whether overtly or by indifference) were coming to an end. There were simply too many good reasons for NonStop solutions to integrate with the rest of the enterprise’s IT deployments, particularly when it comes to data.

NonStop and WebAction Intersect for a Targeted Big Data Niche

In the post of December 3, 2014, Niches open and niches close and yet the versatile NonStop prevails! to the NonStop community blog, Real Time View, I touched on the impact NonStop has in certain niches, or submarkets as they are often called. Pulling from a previously published quote by WebAction Cofounder, Sami Akbay, “When it comes to the bigger picture of computers worldwide – then yes, NonStop is a niche,” I highlighted just how important NonStop had become when it came to serving select submarkets. And here’s where the coincidence becomes better understood – WebAction is targeting a particular sector of the marketplace even as NonStop dominates select niches. When they intersect, they provide a compelling case for just why Big Data is as important as it now is. The pendulum has swung far away from batch and is closing in on the world of real time processing.

If you haven’t read the Bloor Report, The Genesis of the Real-time Enterprise: How WebAction Enables Truly Responsive IT Applications, then you are missing some key points about this world of real time processing. “There is obvious merit in companies,” the report states, “ingesting real-time streams of data from multiple sources, processing them as a time series and automatically adjusting their behavior to proactively respond to their environment as it changes.” More to the point, and yet another coincidence, is the example taken from the retail marketplace. With WebAction (and yes, with data from NonStop), a retailer “can identify when top customers are on site, adding contextual and historical information to enable the retailer to proactively reward loyalty or instigate a sale.” Yes, I can hear that rib eye sizzling even now as it calls out my name; and yes, of course, it’s a Big Steak!

View the WebAction Mobile Customer Experience Management Case Study: Veloxity

See how Veloxity, a mobile Customer Experience Management (CXM) company, uses WebAction to turn the tables on the telecom industry’s CXM standards. Co-founder and CEO Bahadir Kuru explains how the company processes data for millions of mobile devices in real time to offer carriers a window into each and every consumer’s wireless experience. “When our first customers, one with 7.5 million Wi-Fi hotspots and another with 20 million subscribers, deployed our technology, using WebAction allowed us to focus on our business problems and Key Performance Indicators (KPIs) instead of the underlying data infrastructure,” commented Kuru.

Data Driven Customer Experience Management for Mobile Telecommunications

Veloxity now empowers carriers with an instant “crowdsourced” quality-of-service profiling of their entire wireless network infrastructure as experienced by the customer, with segmentation by device or other categories. “WebAction enables us to achieve our vision by acquiring and processing very large amounts of data and making it available for analytics and dashboards in real-time,” said Bora Eristurk, VP of Business Development. For a deeper understanding of how Veloxity is using WebAction to fuel their wireless performance management platform, check out the Veloxity Case Study.

Read the Case Study

Time for an Ever-watchful Guardian…

This past week I joined the HP NonStop systems community for their annual technical boot camp, held in San Jose. In what is clearly a trend upwards, this event started out quietly enough but quickly outgrew its initial event space, and with an even stronger showing this year, has clearly outgrown its second venue. Pre-conference sessions, included as informal educational opportunities, were also well attended. I had the opportunity of opening the session hosted by WebAction. My presentation set out to address the impact Big Data has on mission-critical applications typical of NonStop systems deployments, and I was very pleased to see the venue almost packed even though it was very early on a Sunday morning.

In an upcoming post to the ATMIA industry blog at ATMmarketplace.com, I write of the necessity of being prepared, even as I touch briefly on an almost taboo topic for NonStop: no, no mission-critical application running on NonStop has ever been compromised by an outside agency. On the other hand, while no information on a NonStop system has been compromised, I also wrote that this is no reason to lessen the need for including fraud-detection software as part of every payments solution implemented on a NonStop system. Even here, the landscape is getting more complex, as companies increasingly network with each other to the point where analytics performed against network traffic has to take place in real time. Each and every system, NonStop included, needs an external agent as a guardian, ever watchful for the first signs of trouble.

ATMmarketplace editor Suzanne Cluckey writes about the presentation Casing out financial cyber crime: A federal agent’s point of view. While the post covered one of the last presentations given at BAI Retail Delivery, it was “FBI Special Agent Patrick Geahan shar(ing) intelligence on the evolving cybersecurity landscape and emerging threats, and preemptive measures organizations can take to stop cyberthieves” that attracted a crowd. “What that means for you folks is that your weakest security link isn’t necessarily in your building or in your infrastructure. It might be someone that’s connected to you,” Geahan said. Cluckey added, “the best example of this is the recent Target hack, which occurred not within that company’s system, but within that of a heating contractor that had a VPN connection to the Target network.”

What came out of the HP NonStop Boot Camp pre-conference session in which I participated was that products like WebAction let us step back from individual systems and, much like the hi-tech radar representations the military relies on these days, take a God’s Eye view of all that is transpiring on your system, on adjacent systems, and on network components, even inside a shop with just a single POS device. No matter the scale, patterns can be quickly identified and potential fraudulent attacks detected even as they first appear. Someone connected to you could become the source of unwanted attention being given to your system. NonStop systems may never have been broken into, but that still doesn’t rule out the arrival of fraudulent transactions, and this is possibly the immediate opportunity for many financial institutions to find value in having WebAction standing guard!
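
As a back-of-the-envelope illustration of that “ever-watchful guardian” idea, here is a minimal Python sketch that correlates events from several connected systems inside one sliding time window and raises an alarm when a suspicious pattern first appears. The event fields and thresholds are hypothetical; a product like WebAction would do this declaratively, continuously, and at far higher volumes.

```python
# Sketch: flag one card being declined across multiple systems within a window.
# Fields, thresholds, and alerting are hypothetical illustrations.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_DECLINES = 3  # e.g. repeated declines on one card across endpoints

recent = defaultdict(deque)  # card_id -> deque of (timestamp, source)

def observe(card_id, source, declined, now=None):
    """Record one transaction event; return True if the pattern looks fraudulent."""
    now = time.time() if now is None else now
    window = recent[card_id]
    if declined:
        window.append((now, source))
    # Expire events that have fallen out of the sliding window.
    while window and now - window[0][0] > WINDOW_SECONDS:
        window.popleft()
    # Same card declined at multiple sources within the window: raise an alarm.
    sources = {s for _, s in window}
    if len(window) >= MAX_DECLINES and len(sources) > 1:
        print(f"ALERT: card {card_id} declined {len(window)}x "
              f"across {len(sources)} systems in {WINDOW_SECONDS}s")
        return True
    return False

# Example: three declines on one card across two POS networks in a minute.
t0 = 1_000.0
observe("card-42", "pos-network-A", True, t0)
observe("card-42", "pos-network-A", True, t0 + 10)
observe("card-42", "pos-network-B", True, t0 + 20)  # triggers the alert
```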

Veloxity Uses WebAction to Facilitate Exponential Growth for Their Customer Experience Management (CXM) Solution


Veloxity is an up-and-coming mobile Customer Experience Management (CXM) company committed to understanding the wireless experience from the customer perspective. Telecom industry experts at Veloxity have developed a real-time, crowd-sourced CXM solution that enhances network and device management by collecting diagnostic data such as bandwidth usage and signal strength to assess overall network quality. “We are looking at the networks from the consumer’s eyes,” explains Bahadir Kuru, CEO. Veloxity offers carriers the transparency needed to pinpoint problems within the network as they occur.

WebAction for Mobile Customer Experience Management

WebAction empowers Veloxity by acquiring and processing all of their mobile handset data in order to meet their goal of profiling every customer’s experience in real time. Analytics that were never before possible are now quick and easy to accomplish, such as profiling an entire mobile network in seconds. Read the Case Study to learn how Veloxity is propelling mobile Customer Experience Management forward and how the WebAction Real-time App Platform is fueling this paradigm shift. “WebAction enables us to achieve our vision by acquiring and processing very large amounts of data and making it available for analytics and dashboards in real-time,” explains Bora Eristurk, VP of Business Development.
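
For a feel of what crowd-sourced network profiling involves, here is a minimal Python sketch that rolls up per-device diagnostics (signal strength, bandwidth) into per-cell quality summaries in a single pass. The readings and the scoring are hypothetical illustrations, not Veloxity’s or WebAction’s actual model.

```python
# Sketch: aggregate per-device telemetry into per-cell network quality summaries.
# Readings and fields are hypothetical illustrations.
from collections import defaultdict
from statistics import mean

# Each reading: (cell_id, device_model, signal_dbm, downlink_mbps)
readings = [
    ("cell-17", "phone-x", -71, 42.0),
    ("cell-17", "phone-y", -95, 3.1),
    ("cell-23", "phone-x", -63, 55.4),
]

def profile_network(readings):
    by_cell = defaultdict(list)
    for cell_id, _model, signal, mbps in readings:
        by_cell[cell_id].append((signal, mbps))
    # Summarize each cell as seen "from the consumer's eyes".
    return {
        cell: {
            "avg_signal_dbm": mean(s for s, _ in samples),
            "avg_downlink_mbps": mean(m for _, m in samples),
            "samples": len(samples),
        }
        for cell, samples in by_cell.items()
    }

for cell, stats in profile_network(readings).items():
    print(cell, stats)
```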

Read the WebAction Veloxity Case Study

Value Networking? The NonStop Community Certainly Does!

LinkedIn is a good networking tool, especially for IT professionals, and having a place to go to check out topics of interest is every bit as good as the networking opportunities themselves. We are by now all familiar with the many groups catering to our needs and equally familiar with the tabs Discussions and Promotions. For the NonStop communities, participation in one or more LinkedIn groups is a must, as apart from an annual conference and a couple of regional events, staying current with all that’s happening can be difficult. Simply knowing someone in the same space, who shares your interests, makes exchanges lively!

This month I set up a new LinkedIn group – Big Data, integrated with NonStop. While this group is specific to the needs of the NonStop community, it is a companion to another LinkedIn group with a much broader goal – Data Driven Apps. The new group is also similar in intent to another LinkedIn group, Clouds, powered by NonStop, and although membership in that group is about ten times as large as in its Big Data sibling, given the age difference I expect the membership gap to be bridged very shortly. All the same, joining these networks to see what’s happening is every bit as important as connecting with peers and long-lost relatives!

Together, the groups related to Big Data are where conversations will evolve and where the events I support will be promoted, so check them all out and look at what’s currently being discussed. My most recent discussion asked if it’s still relevant – I’d rather be a hammer than a nail … – where the subject is whether real-time, mission-critical applications, often sourced from a solutions vendor, can benefit from our own initiatives to integrate with what’s being stored in Big Data frameworks. Can we break off just a small piece and approach such integration in small increments? Will a small claw hammer suffice, rather than a 20lb sledgehammer? This too was the subject of a post to the NonStop community blog, Real Time View, and if you missed it, check out the post – I need data I can digest, in small bites, please!

WebAction, by the very nature of the control it ultimately gives to system architects, can generate web actions as verbose or as targeted as we need; it’s really the ultimate control gate when it comes to turning on the flow of Big Data to time-sensitive, mission-critical applications. In my recent exchanges with folks inside HP as well as in the third-party vendor community, there is a perception that, Big Data being big, mission-critical applications will be overwhelmed by the volume of data hitting them. From my talks with WebAction, this is just not the case, and you may want to check out some of the targeted areas for Data-Driven Apps already identified by WebAction and viewable on their web site. And if you want to know more about which Big Data topics interest me, maybe it’s time to revisit LinkedIn and join a group!
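
To picture that “control gate,” here is a minimal Python sketch in which the architect dials the output from verbose (every event) down to targeted (only what the downstream application can digest). The predicate and event shape are hypothetical illustrations, not a WebAction API.

```python
# Sketch: a configurable gate that forwards only the events a
# time-sensitive application asked for. All names are hypothetical.
from typing import Callable, Iterable, Iterator

def control_gate(events: Iterable[dict],
                 keep: Callable[[dict], bool]) -> Iterator[dict]:
    """Forward only the events that pass the architect-supplied predicate."""
    return (e for e in events if keep(e))

events = [
    {"type": "heartbeat", "amount": 0},
    {"type": "payment", "amount": 25_000},
    {"type": "payment", "amount": 12},
]

# Targeted mode: only high-value payments reach the mission-critical app.
for e in control_gate(events, lambda e: e["type"] == "payment" and e["amount"] > 10_000):
    print("forwarded:", e)
```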