You can add all the Source transformation or other transformation ports to the target dynamically when you enable a mapping to run dynamically using the Mapping Flow option. You can then use the dynamic ports in the Write transformation.
When you select the Mapping Flow option, the Data Integration Service allows the Target transformation to override the ports of the Write transformation with all the updated incoming ports from the mapping pipeline, and it loads the target file with those ports at run time.
To enable a dynamic mapping using the Mapping Flow option, set the Columns defined by property to Mapping Flow on the Ports tab of the Write transformation.
When you use the Mapping Flow option to read data from a flat file that contains a port of Integer or Double data type, the mapping runs successfully. However, the Data Integration Service does not write the data for the Integer or Double port or for any of the subsequent ports, regardless of their data type.
When you run a dynamic mapping on the Spark or Databricks Spark engine using the Mapping Flow option to fetch metadata changes from a source that contains a FileName port, the mapping fails. To work around this issue, add a transformation before the Write transformation and configure the Input Rules on the Ports tab of that transformation to exclude the FileName port from the Write transformation. Then, map the remaining ports.