Striim 3.9.4 / 3.9.5 documentation

Using the Oracle to Kafka template

Before following these instructions:

  • Striim must be running.

  • The Forwarding Agent on the Oracle host must be connected to Striim.

You will need the following information to complete the wizard:

  • Striim

    • VM user name

    • VM user password

    • Striim cluster name

    • Striim admin password

    • DNS name (displayed on the Essentials tab for the Striim VM)

  • Oracle (source)

    • connection URL in the format <IP address>:<port>:<SID>, for example, 198.51.100.0:1521:orcl

    • login name and password

    • source table names

  • Kafka (target)

    • topic name

    • broker address

    • optionally, any Kafka producer properties required by your environment (see "KafkaWriter" in the "Adapters reference" section of the Striim Programmer's Guide)

  1. Log into the Striim web UI at <DNS name>:9080 using admin as the user name and the Striim admin password.

  2. Click Oracle CDC to Kafka 0.8, Oracle CDC to Kafka 0.9, or Oracle CDC to Kafka 0.10. If using 0.10 or 0.11, see the known issues in Configuring an app template target.

  3. Enter names for your application (for example, Oracle2Kafka) and a new namespace (do not create applications in the admin namespace), then click Save.

  4. Enter the name for the Oracle source component in the Striim application (for example, OracleSource), the connection URL, user name, and password.

  5. Select LogMiner as the log reader.

  6. Optionally, specify a wildcard string to select the Oracle tables to be read (see the discussion of the Tables property in OracleReader properties).

  7. Turn on Deploy source on Agent (this property does not appear if the Forwarding Agent is not connected to Striim) and click Next.

  8. If Striim's checks show that all properties are valid (this may take a few minutes), click Next.

  9. If you specified a wildcard in the Oracle properties, click Next. Otherwise, select the tables to be read and click Next.

  10. Click Next to skip mapping and filtering (or see Using Advanced Setup to map tables to streams for instructions on using this advanced feature).

  11. Enter the name for the Kafka target component in the Striim application (for example, KafkaTarget).

  12. For Input From, select the only choice. (This is OracleReader's output stream, and its name is <source name>_ChangeDataStream.)

  13. Enter the topic name and the broker address.

  14. Optionally, click Show optional properties and specify any Kafka producer properties required by your environment. Leave Mode set to Sync.
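
    Any producer properties you specify here are carried into the generated TQL along with the target's other settings. A sketch of how the target definition might then look — the KafkaConfig property and the particular producer settings (batch.size, linger.ms) are shown only as illustrations; confirm the exact property names against "KafkaWriter" in the "Adapters reference" section of the Striim Programmer's Guide:

      CREATE TARGET KafkaTarget USING KafkaWriter VERSION '0.8.0' (
        Mode: 'Sync',
        Topic: 'MyTopic',
        brokerAddress: '198.51.100.55:9092',
        KafkaConfig: 'batch.size=16384;linger.ms=5'
      )
      FORMAT USING AvroFormatter ( schemaFileName: 'MySchema.avro' )
      INPUT FROM OracleSource_ChangeDataStream;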

  15. Select AvroFormatter and specify its schema file name. This file will be created when the application is deployed (see AvroFormatter).

  16. Click Save, then click Next. (Click Create Target only if you specified maps or filters and want to create more than one target.)

  17. Striim will create your application and open it in the Flow Designer. It should look something like this:

    [screenshot: ora2kafka.png]
  18. Select Configuration > App settings, set the recovery interval to 5 seconds, and click Save.

  19. Select Configuration > Export to generate a TQL file. It should contain something like this (the password is encrypted):

    CREATE APPLICATION Oracle2Kafka RECOVERY 5 SECOND INTERVAL;
    
    CREATE SOURCE OracleSource USING OracleReader ( 
      FetchSize: 1,
      QueueSize: 2048,
      CommittedTransactions: true,
      Compression: false,
      Username: 'myname',
      Password: '7ip2lhUSP0o=',
      ConnectionURL: '198.51.100.15:1521:orcl',
      FilterTransactionState: true,
      DictionaryMode: 'OnlineCatalog',
      ReaderType: 'LogMiner',
      Tables: 'MYSCHEMA.%'
     ) 
    OUTPUT TO OracleSource_ChangeDataStream;
    
    CREATE TARGET KafkaTarget USING KafkaWriter VERSION '0.8.0' ( 
      Mode: 'Sync',
      Topic: 'MyTopic',
      brokerAddress: '198.51.100.55:9092'
    ) 
    FORMAT USING AvroFormatter ( schemaFileName: 'MySchema.avro' ) 
    INPUT FROM OracleSource_ChangeDataStream;
    
    END APPLICATION Oracle2Kafka;

Note that FetchSize: 1 is appropriate for development, but should be increased in a production environment. See OracleReader properties for more information.
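
For example, you could edit the exported TQL before deploying in production, raising only the fetch size and leaving the other properties as generated (the value 10000 is purely illustrative; choose a value based on your workload and the guidance in OracleReader properties):

    CREATE SOURCE OracleSource USING OracleReader ( 
      FetchSize: 10000,
      QueueSize: 2048,
      ...
     ) 
    OUTPUT TO OracleSource_ChangeDataStream;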