Your organization needs to denormalize customer details, such as the customer key, name, and address. The customer details are stored in Avro files in HDFS. Import the Avro file object as a source. Create a mapping that reads all the customer details from the Avro files in HDFS and writes them to an Oracle target.
You can use the target data for business analytics.
You can use the following objects in the HDFS mapping:
HDFS Input
The Customer_Details_Avro file is an Avro file stored in HDFS. A hypothetical schema sketch follows this list.
Oracle Output
The Customer_Oracle_Target object is an Oracle relational object.
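For illustration only, the schema of Customer_Details_Avro might look like the following sketch. The field names and types are assumptions, not taken from the actual file; the schema is written as a Python dictionary in the form that Avro libraries such as fastavro accept.

# Hypothetical Avro schema for Customer_Details_Avro.
# Field names and types are illustrative assumptions.
customer_details_schema = {
    "type": "record",
    "name": "CustomerDetails",
    "fields": [
        {"name": "customer_key", "type": "long"},
        {"name": "name", "type": "string"},
        {"name": "address", "type": "string"},
        {"name": "phone", "type": ["null", "string"], "default": None},
    ],
}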
Create a Complex File Data Object
Create a complex file data object to read data from the Avro file. Verify that you select Avro as the Resource Format. The following image shows the sample selection:
When you create the complex file data object, the read and write operations are created by default. You can view the columns present in the Avro file. The following image shows the sample data object read operation:
The Enable Column Projection option is selected by default. You can view or update the associated schema and the column mapping.
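To illustrate what column projection does, the following Python sketch reads Avro records and keeps only the projected columns, using the fastavro library. The file path and column names are assumptions; in the mapping, the Data Integration Service performs this step, and the file lives in HDFS rather than on the local disk.

from fastavro import reader

# Hypothetical columns to project from each Avro record.
PROJECTED_COLUMNS = ("customer_key", "name", "address")

# Read a local copy of the Avro file for illustration.
with open("customer_details.avro", "rb") as avro_file:
    for record in reader(avro_file):
        # Each record is a dict; keep only the projected columns.
        row = {column: record.get(column) for column in PROJECTED_COLUMNS}
        print(row)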
The following image shows the sample mapping:
When you run the mapping, the Data Integration Service reads the hierarchical data from the input Avro files and writes it directly to the Oracle target.
You can configure the mapping to run in the native or Hadoop run-time environment.
Perform the following tasks to configure the mapping (a conceptual end-to-end sketch follows the list):
Create an HDFS connection to read the Avro file from the Hadoop cluster.
Create a complex file data object to import the Avro file. You must select Avro as the Resource Format. Configure the read operation properties.
Create an Oracle database connection to write data to the Oracle target.
Create an Oracle data object and configure the write operation properties.
Drag the complex file data object read operation and the Oracle data object write operation into the mapping.
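As a rough conceptual analogue of the configured mapping, the following Python sketch reads Avro records and inserts them into an Oracle table with the fastavro and python-oracledb libraries. This is not how the Data Integration Service executes the mapping; the connection details, file path, table name, and column names are all assumptions.

from fastavro import reader
import oracledb

# Hypothetical Oracle connection details for the target database.
connection = oracledb.connect(user="demo", password="demo", dsn="dbhost:1521/orclpdb")

# Read a local copy of the Avro file; in the mapping, the source is in HDFS.
rows = []
with open("customer_details.avro", "rb") as avro_file:
    for record in reader(avro_file):
        rows.append((record["customer_key"], record["name"], record["address"]))

with connection:
    cursor = connection.cursor()
    # customer_oracle_target and its columns are illustrative names.
    cursor.executemany(
        "INSERT INTO customer_oracle_target (customer_key, name, address) "
        "VALUES (:1, :2, :3)",
        rows,
    )
    connection.commit()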