Real-Time Offloading, Analytics and Reporting for Azure Database for PostgreSQL

Striim ROAR enables you to keep your core database on-premises while modernizing it with elastic, real-time analytics in the cloud. By delivering data in real time from existing data sources to Azure DB for PostgreSQL, you can take advantage of low-latency, highly elastic operational data stores and data marts for secure, reliable, and scalable real-time analytics.

In this demo, you are going to see how you can use Striim ROAR to continuously move data from Oracle to Azure DB for PostgreSQL. We will show you how to use Striim’s wizards and intuitive UI to build data flows; run the data flows to collect data from Oracle using Change Data Capture, and deliver it in real-time to Azure DB for PostgreSQL; and see continuous monitoring of your cloud migration solution.

Prior to starting the data movement, Striim ROAR provides schema conversion utilities to create target PostgreSQL schemas from source databases. In this case, we are converting a number of table definitions in our Oracle database to PostgreSQL tables. The conversion completes without any issues, and we are ready to move on to the real-time data movement.
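To give a sense of what such a conversion involves, here is a conceptual sketch of mapping Oracle column types to PostgreSQL equivalents. This is illustrative only; the type table and helper functions are simplified assumptions, not Striim's actual conversion logic, which handles many more cases.

```python
# Illustrative Oracle-to-PostgreSQL type mapping (simplified; a real
# conversion utility covers many more types and edge cases).
ORACLE_TO_PG = {
    "NUMBER": "numeric",
    "VARCHAR2": "varchar",
    "DATE": "timestamp",
    "CLOB": "text",
    "BLOB": "bytea",
}

def convert_column(name, oracle_type, length=None):
    """Translate one Oracle column definition to a PostgreSQL column."""
    pg_type = ORACLE_TO_PG.get(oracle_type.upper(), "text")
    if length and pg_type == "varchar":
        pg_type = f"varchar({length})"
    return f"{name.lower()} {pg_type}"

def convert_table(table, columns):
    """Emit a CREATE TABLE statement for the converted columns."""
    cols = ",\n  ".join(convert_column(*c) for c in columns)
    return f"CREATE TABLE {table.lower()} (\n  {cols}\n);"

ddl = convert_table("EMPLOYEES", [("ID", "NUMBER"), ("NAME", "VARCHAR2", 50)])
```

In this sketch, unknown Oracle types fall back to `text`; a production converter would instead flag them for review.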

Performing streaming data integration with Striim starts with our wizards. We will select Oracle as the source and Azure DB for PostgreSQL as the target. After launching the wizard and entering a name for your data flow, you just need to complete a few simple steps.

First, you will configure the source. Enter the necessary information to connect to the source and click Next. Don't worry, any sensitive information, such as passwords, is encrypted. The wizard will check that the connection information is correct, and that the connection has the required privileges and supports change data capture.

Next, you select the tables that you are interested in collecting real-time data from. You can change this selection afterwards, so start with just a few tables initially. Finally, you configure the target connection information, including how the source data is mapped to the target tables.
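The source-to-target table mapping in the final step can be pictured with a small sketch. The helper below is hypothetical, not a Striim API; it just shows the typical convention of mapping a fully qualified Oracle name like SCHEMA.TABLE to a lowercased PostgreSQL table in a chosen schema.

```python
def map_table(source_table, target_schema="public"):
    """Hypothetical helper: map an Oracle SCHEMA.TABLE name to a
    lowercased PostgreSQL target of the form schema.table."""
    _, _, name = source_table.rpartition(".")
    return f"{target_schema}.{name.lower()}"

# e.g. the Oracle table HR.EMPLOYEES would land in public.employees
target = map_table("HR.EMPLOYEES")
```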

When you complete the wizard, a data flow is created from the information you entered. You can see the source and target configuration here. To start the data flow, first deploy it to get it ready to run, then start it to begin collecting data from Oracle and delivering it to Azure DB for PostgreSQL.

Initially, there is no data flowing, because we are not generating any new data in Oracle. You can see from the UI for Azure DB for PostgreSQL that there is no data present in any of the target tables. 

Now we will run a data generator for Oracle that creates a set of inserts, updates, and deletes. You can see the data in the data flow preview window, and view the rate of data collection and delivery in the UI. We can also look at the application progress here to see a summary view of the tables. After a number of operations have been generated, we can check back with the Azure DB for PostgreSQL UI and see the data in the target tables.
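Conceptually, change data capture delivery means replaying each insert, update, and delete in order against the target. The sketch below simulates this with an in-memory table keyed by primary key; it is an illustration of the general CDC pattern, not Striim's delivery mechanism, which translates each event into SQL against PostgreSQL.

```python
# Minimal sketch of applying an ordered CDC stream of inserts, updates
# and deletes; the "table" is an in-memory dict keyed by primary key.
def apply_events(events):
    table = {}
    for op, key, data in events:
        if op == "INSERT":
            table[key] = dict(data)
        elif op == "UPDATE":
            table[key].update(data)
        elif op == "DELETE":
            table.pop(key, None)
    return table

events = [
    ("INSERT", 1, {"name": "Ada"}),
    ("INSERT", 2, {"name": "Grace"}),
    ("UPDATE", 1, {"name": "Ada L."}),
    ("DELETE", 2, {}),
]
result = apply_events(events)
# result is {1: {"name": "Ada L."}} -- row 2 was deleted, row 1 updated
```

Because the events are applied strictly in order, the target converges to the same state as the source, which is why ordered delivery matters for CDC pipelines.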

Of course, Striim can perform initial loads as well through similar data flows. Here we are moving a million rows from tables in Oracle to Azure DB for PostgreSQL using our smart delivery pipeline. You can monitor the progress through the Striim UI, and, if we switch to the Azure DB for PostgreSQL UI, you can see the data in the target.
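Initial loads like this are typically delivered in batches rather than row by row. The sketch below shows that batching idea in miniature; the batch size and helper are illustrative assumptions, not parameters of Striim's smart delivery pipeline.

```python
def batches(rows, batch_size=10_000):
    """Yield rows in fixed-size batches, as a bulk initial load might
    group them for efficient delivery. Illustrative sketch only."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

# A 1,000,000-row load at a batch size of 10,000 yields 100 batches.
batch_count = sum(1 for _ in batches(list(range(1_000_000))))
```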

We can also use the Striim monitor UI to look at overall metrics, and drill down to see the application stats, and detailed information for each of the application components.

This has been a quick demo of using Striim ROAR to deliver data continuously from Oracle to Azure DB for PostgreSQL. Please go to our website to try Striim yourself, find Striim in the Azure Marketplace, or contact us to learn more.