You can enable a mapping to run dynamically by selecting the At runtime, get data object columns from data source option in the properties of the Read and Write transformations when you create a mapping.
When you add or override metadata dynamically, you can include all the existing source and target objects in a single mapping and run the mapping. You do not have to manually update the data objects and the mapping each time the source schema changes.
You can use mapping template rules to tune how such a pipeline mapping runs.
When the Source or Target transformation contains updated ports, such as changes to port names, data types, precision, or scale, the Data Integration Service fetches the updated ports and runs the mapping dynamically. Ensure that at least one column name in the source or target file remains the same as before you refresh the schema, so that the dynamic mapping runs successfully.
Even if the order of the source or target ports in the file changes, the Data Integration Service displays the ports in their original order when you refresh the schemas at run time.
If the source file contains more columns than the target file, the Data Integration Service does not map the extra columns to the target, and it loads null data into all the unmapped columns in the target file.
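The name-based column resolution described above can be sketched as follows. This is a hypothetical Python illustration of the behavior, not Informatica code; the function name and data shapes are assumptions for the example only.

```python
def map_row(source_row, target_columns):
    """Map a source row to the target schema by column name.

    Columns that exist only in the source are dropped. Target
    columns with no matching source column receive None (null),
    mirroring how unmapped target columns are loaded with null data.
    """
    return {col: source_row.get(col) for col in target_columns}

# The source has an extra column "dept" that the target does not define,
# and the target defines "hire_date", which the source does not supply.
source_row = {"id": "1", "name": "Ana", "dept": "HR"}
target_columns = ["id", "name", "hire_date"]

print(map_row(source_row, target_columns))
# {'id': '1', 'name': 'Ana', 'hire_date': None}
```

The extra source column is silently dropped, while the unmatched target column is filled with a null value rather than causing a failure.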
If the Source transformation contains updated columns that do not match the Target transformation, the Data Integration Service does not link the new ports by default when you refresh the source or target schema. You must create a run-time link between the transformations to link ports at run time based on a parameter or link policy, and update the target schema manually. For information about run-time linking, see the Informatica Developer Mapping Guide.
Even if you delete the FileName port from the Source transformation, the Data Integration Service adds the FileName port back when you refresh the source schema.
When you refresh the schema of a flat file, the Data Integration Service writes all data types as String data types.
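The flat-file refresh behavior can be illustrated with a minimal sketch. Again, this is a hypothetical Python illustration of the documented behavior, not Informatica code; the function name is an assumption.

```python
def refresh_flat_file_schema(header_columns):
    """Illustrative only: after a flat-file schema refresh, every
    column is treated as a String data type, regardless of whether
    its contents look numeric or date-like."""
    return {col: "String" for col in header_columns}

print(refresh_flat_file_schema(["id", "amount", "hire_date"]))
# {'id': 'String', 'amount': 'String', 'hire_date': 'String'}
```

If downstream transformations need numeric or date semantics, the data types must be converted explicitly after the refresh.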