Updating an Avro Schema for Kafka Targets After Running Data Replication
If the columns in a mapped source table definition change and you want to reflect that change in Kafka target messages, you must manually update the column mappings in the Data Replication Console and then delete the Avro schema cache file.
Before performing these steps, stop all replication tasks that are running for the configuration.
If a column in a mapped source table is dropped, the Data Replication Console can automatically update the column mappings when you open the configuration. However, you still must delete the Avro schema cache file so that a new schema cache file can be re-created.
1. Open the configuration and switch to Edit mode.
2. Click the Map Columns tab.
   If source columns were dropped, verify that the Console automatically removed the mappings for those columns. Then skip to step 4.
3. If new source columns were added, perform the following steps to map them:
   a. In the Source Table list, select the table that contains the new column. The corresponding target table appears in the Target Table list.
   b. Select the new source column row.
   c. Select the target column row.
   d. Click Map.
4. Save the configuration.
5. Navigate to the directory to which Data Replication writes the cache files that contain Avro schema information.
   The default directory is DataReplication_installation/output/configuration_name/overflow. Another directory might have been specified in the apply.avro.avro_schema_cache_directory parameter on the Runtime Settings tab > Advanced Settings view. Avro schema cache file names use the following format: key<key_number>_<source_owner>_<source_table_name>.che
6. Delete the Avro schema cache file that contains the source table name for which a column was added or removed.
7. Restart all replication tasks.
   The next time the Applier runs, Data Replication creates a new file that contains updated information about the Avro schema for the source table.
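The cache-file deletion can also be scripted. The sketch below assumes the default key<key_number>_<source_owner>_<source_table_name>.che naming described above; the owner and table names are hypothetical, and a temporary directory stands in for the real overflow directory so the sketch is runnable end to end:

```shell
# Sketch: delete the Avro schema cache file for one source table.
# OVERFLOW_DIR would normally be DataReplication_installation/output/
# configuration_name/overflow (or the apply.avro.avro_schema_cache_directory
# location); a temp directory stands in here for demonstration.
OVERFLOW_DIR="$(mktemp -d)"
SOURCE_OWNER="SALES"   # hypothetical source owner
SOURCE_TABLE="ORDERS"  # hypothetical source table

# Simulate two cache files; only the SALES.ORDERS one should be removed.
touch "$OVERFLOW_DIR/key1_SALES_ORDERS.che" "$OVERFLOW_DIR/key2_HR_EMP.che"

# Cache file names follow key<key_number>_<source_owner>_<source_table_name>.che,
# so match on the owner/table suffix and delete only the matching files.
find "$OVERFLOW_DIR" -maxdepth 1 -type f \
  -name "key*_${SOURCE_OWNER}_${SOURCE_TABLE}.che" -delete

ls "$OVERFLOW_DIR"   # only key2_HR_EMP.che remains
```

Matching on both the owner and table name avoids deleting cache files for other mapped tables that happen to share a table name under a different owner.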