PowerExchange Adapters for Informatica 10.2.2 HotFix 1
When you create a dynamic mapping to read multiple files from a directory and override the directory, the mapping fails with an error.
Workaround: Verify that the override directory contains a source file with the same name as the imported object and rerun the dynamic mapping.
When you use the Create Target option to create a Microsoft Azure Blob Storage target and select Flat as the Resource Format, fields are not propagated to the target.
Workaround: Enable column projection, create the fields manually in the target file, and run the mapping.
When you enable Mapping Flow in a mapping that reads data from a flat file source and writes to a flat file target, the mapping fails with an error in the native environment.
Workaround: Remove the FileName field from the imported source object and rerun the mapping.
When you create a Microsoft Azure Blob Storage data object, the value of the folder path is displayed incorrectly in the
When you read or write a blob that has special characters, the mapping fails on the Spark engine.
When you run a write operation on the Spark engine and the folder path contains special characters, the Data Integration Service creates a new folder.
When a JSON file contains special characters, the Data Integration Service does not read the data correctly in the Spark mode.
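To illustrate the expected behavior, the following minimal Python sketch (illustration only, not Informatica code) round-trips a JSON record that contains special characters with an explicit UTF-8 encoding. When the data is handled correctly, the characters survive the round trip unchanged:

```python
import json
import os
import tempfile

# Sample record with non-ASCII special characters (values are hypothetical).
record = {"name": "Müller", "city": "São Paulo"}

path = os.path.join(tempfile.mkdtemp(), "record.json")

# Write with an explicit UTF-8 encoding; ensure_ascii=False keeps the
# special characters as-is instead of escaping them.
with open(path, "w", encoding="utf-8") as f:
    json.dump(record, f, ensure_ascii=False)

# Read back with the same encoding; the record must match exactly.
with open(path, "r", encoding="utf-8") as f:
    assert json.load(f) == record
```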
The write operation fails for a flat file in the native environment when single or double quotes are selected as the text qualifier.
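As background on what a text qualifier does, the sketch below (plain Python with the standard csv module, not Informatica code) writes flat-file rows using double quotes as the qualifier; a quote embedded in a field value is escaped by doubling it:

```python
import csv
import io

# Sample rows (hypothetical data); the second field contains an embedded quote.
rows = [["id", "note"], ["1", 'He said "hi"']]

buf = io.StringIO()
# QUOTE_ALL wraps every field in the chosen text qualifier (double quotes here);
# embedded quotes are doubled per the common CSV convention.
writer = csv.writer(buf, quotechar='"', quoting=csv.QUOTE_ALL)
writer.writerows(rows)

print(buf.getvalue())
```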
The Data Integration Service adds an extra blank line at the end of the file when you read or write a flat file in the native environment or in the Spark mode.
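Until this issue is fixed, a downstream consumer can defensively strip the extra line. A minimal Python sketch (the helper name is hypothetical, and it assumes the file ends with exactly one surplus newline):

```python
def strip_trailing_blank_line(text: str) -> str:
    """Remove one surplus newline at the end of the file content,
    keeping the newline that terminates the last record."""
    if text.endswith("\n\n"):
        return text[:-1]
    return text

# A file whose content ends in a blank line loses only the extra line.
assert strip_trailing_blank_line("a,b\n1,2\n\n") == "a,b\n1,2\n"
# Content without a trailing blank line is returned unchanged.
assert strip_trailing_blank_line("a,b\n1,2\n") == "a,b\n1,2\n"
```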
When you read data from or write data to Microsoft Azure Blob Storage and you cancel the mapping, the entire blob is still downloaded to the staging directory.