GoldenGate Questions for Onboarding

List the source and target databases and the geographic locations involved in the replication.

What is the network bandwidth?

List the important considerations for bi-directional (active-active) replication.

The customer should consider the following points in an active-active replication environment.

Primary Key: Every replicated table should have a primary or unique key, which is used to identify rows and to detect and resolve conflicts.

Sequences: Replication of sequence values is not supported in bi-directional configurations. The workaround is to use odd/even increments, non-overlapping ranges, or sequence values concatenated with a site identifier.
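As an illustration of the odd/even workaround, each site can be given a sequence that generates values the other site never produces (the sequence name is hypothetical):

```sql
-- Site A: odd values only
CREATE SEQUENCE order_seq START WITH 1 INCREMENT BY 2;

-- Site B: even values only
CREATE SEQUENCE order_seq START WITH 2 INCREMENT BY 2;
```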

Triggers: These should be disabled or suppressed on the target to avoid duplicate DML and uniqueness violations when replicated rows fire them again.

Data Looping: Changes applied by Replicat must not be captured again and sent back to their origin; this can easily be avoided using OGG itself.
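One common way to prevent looping is to exclude the local Replicat user's transactions from capture, so changes applied by Replicat are never re-extracted and sent back. A minimal sketch for the Extract parameter file on each site (the user name ggadmin is an assumption):

```
-- Ignore redo generated by the local Replicat's database user
TRANLOGOPTIONS EXCLUDEUSER ggadmin
```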

Lag: This should be minimized. If the customer asserts there will be no lag (ample network bandwidth, modest load), CDR need not be deployed; in practice, however, some lag is almost always present, and it can cause conflicts.

CDR (Conflict Detection and Resolution): OGG has built-in CDR for all kinds of DML, which can be used to detect conflicts and resolve them.
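A sketch of CDR in a Replicat MAP statement, assuming hypothetical schema/table names and a last-updated timestamp column as the resolution column:

```
MAP src.orders, TARGET tgt.orders,
  COMPARECOLS (ON UPDATE ALL, ON DELETE ALL),
  RESOLVECONFLICT (UPDATEROWEXISTS, (DEFAULT, USEMAX (last_upd_ts))),
  RESOLVECONFLICT (INSERTROWEXISTS, (DEFAULT, USEMAX (last_upd_ts))),
  RESOLVECONFLICT (DELETEROWMISSING, (DEFAULT, DISCARD));
```

Here the row with the more recent last_upd_ts wins update/insert conflicts, and deletes of already-missing rows are discarded.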

Packaged Applications: These are generally not supported, as they may contain data types that are not supported by OGG, or the vendor may not allow the application modifications needed to work with OGG.

Is a data pump process required?

The data pump (not to be confused with the Oracle Data Pump export/import utility) is an optional secondary Extract group created on the source system. When a data pump is not used, the primary Extract process writes over TCP/IP to a remote trail located on the target system. When a data pump is configured, the primary Extract writes to a local trail, and the data pump reads that trail and sends the data over the network to the remote trail on the target system.

The main advantage is protection against network failure: without a local trail, Extract must buffer captured data in memory until it is sent over the network, so a network outage can cause the Extract process to abort (abend). With a data pump, the primary Extract is insulated from such failures. Complex data transformation or filtering can also be performed by the data pump rather than the primary Extract. A data pump is also useful when consolidating data from several sources into one central target, where a data pump on each source system can write to a common trail file on the target.
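The Extract-plus-pump layout described above might look like the following pair of parameter files (group names, alias, host, port, and paths are all assumptions):

```
-- Primary Extract parameter file: capture to a local trail
EXTRACT exta
USERIDALIAS gg_src
EXTTRAIL ./dirdat/lt
TABLE src.*;
```

```
-- Data pump parameter file (pass-through, no transformation):
-- read the local trail and ship it to the remote trail on the target
EXTRACT pmpa
PASSTHRU
RMTHOST tgthost, MGRPORT 7809
RMTTRAIL ./dirdat/rt
TABLE src.*;
```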



What transaction types does GoldenGate support for replication?

What are the supplemental logging pre-requisites?

The following supplemental logging is required:

Database-level supplemental logging

Object-level (table) logging
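A sketch of the typical commands for each level, assuming a source schema named src and a credential alias gg_src:

```
-- In SQL*Plus: database-level (minimal) supplemental logging
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;

-- In GGSCI: object-level logging, per schema or per table
DBLOGIN USERIDALIAS gg_src
ADD SCHEMATRANDATA src
ADD TRANDATA src.orders
```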

Why is Supplemental logging required for Replication?

When a transaction is committed on the source database, only the changed data is written to the redo log. For GoldenGate to apply these transactions on the destination database, however, the before-image key values are required to identify the affected rows. Supplemental logging ensures this key data is written to the redo log and hence to the trail file; the key values are then used to locate the corresponding rows on the destination, against which the transactions are executed.

What types of encryption are supported in GoldenGate?


Oracle GoldenGate provides three types of encryption:

Data encryption (using Blowfish)

Password encryption

Network encryption
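A sketch of how the data and network options appear in parameter files (host, port, and key name are assumptions; the key would be defined in the ENCKEYS file):

```
-- Trail (data) encryption in the Extract parameter file;
-- with no algorithm specified, Blowfish is used
ENCRYPTTRAIL

-- Network encryption on the data pump's RMTHOST line
RMTHOST tgthost, MGRPORT 7809, ENCRYPT BLOWFISH, KEYNAME superkey
```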


What are the different password encryption options available with OGG?

You can encrypt a password in OGG using

Blowfish algorithm and

Advanced Encryption Standard (AES) algorithm
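For example, in GGSCI (the password and key name are placeholders; AES keys must be defined in the ENCKEYS file):

```
-- Blowfish with the default key:
ENCRYPT PASSWORD mypassword BLOWFISH ENCRYPTKEY DEFAULT

-- AES-128 with a named key from the ENCKEYS file:
ENCRYPT PASSWORD mypassword AES128 ENCRYPTKEY securekey1
```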


Is there a way to check the syntax of the commands in the parameter file without running the GoldenGate process?

Yes. Place the CHECKPARAMS parameter in the parameter file and start the process: GoldenGate verifies the syntax, writes the results to the report file, and stops without processing any data. If there is an error, you will see it in the report.
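For example, at the top of an Extract parameter file (group name, alias, trail path, and schema are assumptions):

```
EXTRACT exta
CHECKPARAMS
USERIDALIAS gg_src
EXTTRAIL ./dirdat/lt
TABLE src.*;
```

Remove CHECKPARAMS once the syntax check passes so the process runs normally.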


How can you increase the maximum size of the read operation into the buffer that holds the results of the reads from the transaction log?

If you are using classic Extract, you can use the TRANLOGOPTIONS ASMBUFSIZE option to control the read size for databases stored on ASM.
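A minimal sketch for the Extract parameter file (the buffer size shown is illustrative, not a recommendation):

```
-- Enlarge the buffer for reads from the ASM-resident redo logs
TRANLOGOPTIONS ASMBUFSIZE 28000
```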

What information can you expect when there is data in the discard file?

When data is discarded, the discard file can contain:

Details of the discarded row

The database error

The trail file number

What command can be used to switch writing the trail data to a new trail file?

You can use the following command to write the trail data to a new trail file.
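Assuming an Extract group named exta (the group name is a placeholder), the rollover can be issued from GGSCI:

```
-- For a running Extract:
SEND EXTRACT exta, ROLLOVER

-- For a stopped Extract (rolls the trail to the next sequence number):
ALTER EXTRACT exta, ETROLLOVER
```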