Striim Migration Service for Google Cloud Documentation

Create and test the streaming integration application

The easiest way to create the streaming integration application is to export the initial load application to a TQL file, make a few changes, and import it.

  1. Select Configuration > Export, then click Export. There is no need to provide a passphrase.

  2. Open the exported file. It should look something like this:

    CREATE APPLICATION InitialLoadApp USE EXCEPTIONSTORE TTL : '7d' ;
    
    CREATE FLOW InitialLoadApp_SourceFlow;
    
    CREATE SOURCE InitialLoadApp_DBSource USING DatabaseReader ( 
      Tables: 'HR.COUNTRIES;HR.DEPARTMENTS;HR.EMPLOYEES;HR.JOBS;HR.JOB_HISTORY;HR.LOCATIONS;HR.REGIONS', 
      Username: 'qatest', 
      DatabaseProviderType: 'ORACLE', 
      Password: '9PUuzS4m9LlRbTUind0vsg==', 
      FetchSize: 10000, 
      Password_encrypted: 'true', 
      QuiesceOnILCompletion: 'true', 
      ConnectionURL: 'jdbc:oracle:thin:@//192.0.2.0:1521/XE' ) 
    OUTPUT TO InitialLoadApp_OutputStream  ;
    
    END FLOW InitialLoadApp_SourceFlow;
    
    CREATE TARGET InitialLoadApp_PostgresTarget1 USING DatabaseWriter ( 
      StatementCacheSize: '50', 
      ConnectionRetryPolicy: 'retryInterval=30, maxRetries=3', 
      ConnectionURL: 'jdbc:postgresql://198.51.100.0:5432/HR', 
      DatabaseProviderType: 'Default', 
      BatchPolicy: 'EventCount:1000,Interval:60', 
      CheckPointTable: 'CHKPOINT', 
      Password_encrypted: 'true', 
      PreserveSourceTransactionBoundary: 'false', 
      Password: 'jsPf42jdUYqHsNei2SIiRHmqPD1mjLsQcSyYr50gkg0=', 
      Username: 'postgres', 
      Tables: 'HR.%,HR.%', 
      CommitPolicy: 'EventCount:1000,Interval:60' ) 
    INPUT FROM InitialLoadApp_OutputStream;
    
    END APPLICATION InitialLoadApp;
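
    Note the format of the target's Tables property: DatabaseWriter takes source,target pairs, so 'HR.%,HR.%' maps every table in the source HR schema to the identically named table in the target HR schema. If you needed to map tables individually instead, the value would look something like this (a hypothetical two-table mapping; pairs are separated by semicolons):

      Tables: 'HR.EMPLOYEES,HR.EMPLOYEES;HR.JOBS,HR.JOBS',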

    Make the following changes:

    1. Change the eight occurrences of InitialLoadApp (in the CREATE APPLICATION, CREATE FLOW, CREATE SOURCE, OUTPUT TO, END FLOW, CREATE TARGET, INPUT FROM, and END APPLICATION lines) to StreamingIntegrationApp.

    2. Change USING DatabaseReader to USING OracleReader. (DatabaseReader performs a one-time bulk read of the source tables; OracleReader continuously captures ongoing changes from the Oracle redo log.)

    3. Delete the entire line QuiesceOnILCompletion: 'true', (that property quiesces the application when the initial load completes, which does not apply to a continuously running streaming application).

    4. Save the file as StreamingIntegrationApp.tql.

    The new application will look something like this:

    CREATE APPLICATION StreamingIntegrationApp USE EXCEPTIONSTORE TTL : '7d' ;
    
    CREATE FLOW StreamingIntegrationApp_SourceFlow;
    
    CREATE SOURCE StreamingIntegrationApp_DBSource USING OracleReader ( 
      Tables: 'HR.COUNTRIES;HR.DEPARTMENTS;HR.EMPLOYEES;HR.JOBS;HR.JOB_HISTORY;HR.LOCATIONS;HR.REGIONS', 
      Username: 'qatest', 
      DatabaseProviderType: 'ORACLE', 
      Password: '9PUuzS4m9LlRbTUind0vsg==', 
      FetchSize: 10000, 
      Password_encrypted: 'true', 
      ConnectionURL: 'jdbc:oracle:thin:@//192.0.2.0:1521/XE' ) 
    OUTPUT TO StreamingIntegrationApp_OutputStream  ;
    
    END FLOW StreamingIntegrationApp_SourceFlow;
    
    CREATE TARGET StreamingIntegrationApp_PostgresTarget1 USING DatabaseWriter ( 
      StatementCacheSize: '50', 
      ConnectionRetryPolicy: 'retryInterval=30, maxRetries=3', 
      ConnectionURL: 'jdbc:postgresql://198.51.100.0:5432/HR', 
      DatabaseProviderType: 'Default', 
      BatchPolicy: 'EventCount:1000,Interval:60', 
      CheckPointTable: 'CHKPOINT', 
      Password_encrypted: 'true', 
      PreserveSourceTransactionBoundary: 'false', 
      Password: 'jsPf42jdUYqHsNei2SIiRHmqPD1mjLsQcSyYr50gkg0=', 
      Username: 'postgres', 
      Tables: 'HR.%,HR.%', 
      CommitPolicy: 'EventCount:1000,Interval:60' ) 
    INPUT FROM StreamingIntegrationApp_OutputStream;
    
    END APPLICATION StreamingIntegrationApp;
  3. In SMSGC, select ≡ > Apps.

  4. Select Add App > Import Existing App, navigate to and double-click the StreamingIntegrationApp.tql file, and click Import.

  5. Deploy and start the application.
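
    Alternatively, you can deploy and start the application from the Striim console with TQL commands along these lines (a sketch, assuming the application was imported into your current namespace and uses the default deployment group):

      DEPLOY APPLICATION StreamingIntegrationApp;
      START APPLICATION StreamingIntegrationApp;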

  6. At this point, you can test the streaming integration application: use sqlplus or any other Oracle client to add some data to the source, then use Cloud Shell or any other PostgreSQL client to verify that the data is replicated to the target. Replication may take up to 60 seconds, since the target's BatchPolicy and CommitPolicy intervals are both set to 60 seconds. When you are through testing, delete the test data from the source and it will automatically be deleted from the target.
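
    For example, a minimal round trip against the sample HR schema might look like the following (a sketch; REGION_ID 99 is an arbitrary unused key, and the schema and table name casing on the PostgreSQL side should match however they were created in the target):

      -- In sqlplus, connected to the source Oracle database:
      INSERT INTO HR.REGIONS (REGION_ID, REGION_NAME) VALUES (99, 'Test Region');
      COMMIT;

      -- In psql (for example, from Cloud Shell), connected to the target:
      SELECT * FROM HR.REGIONS WHERE REGION_ID = 99;  -- may take up to 60 seconds to appear

      -- Back in sqlplus, remove the test row; the delete replicates to the target:
      DELETE FROM HR.REGIONS WHERE REGION_ID = 99;
      COMMIT;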