PowerExchange Adapters for Informatica
- PowerExchange Adapters for Informatica 10.4.0
| Bug | Description |
|---|---|
| BDM-19847 | For the write operation, when you run a mapping on the Spark engine and the folder path contains special characters, the Data Integration Service creates a new folder. |
| OCON-24942 | When you refresh the source or target schema at run time in a dynamic mapping, the values that you specify for the delimiter, text qualifier, and escape character for a flat file are not honored and the default values are used instead. This might lead to unexpected results in the target. |
| OCON-22511 | When you read data from a Microsoft Azure SQL Data Warehouse source and use the Create Target option to create a Microsoft Azure Blob Storage target, the mapping fails if the Microsoft Azure Blob Storage connection uses SAS authentication. |
| OCON-20605 | When you run a mapping in the native environment to read a flat file that contains Unicode characters, a space, null values, single quotes, or a value that starts with a dollar sign, the Data Integration Service adds double quotes to the values when it writes the data to the target. |
| OCON-17642 | When you enable Mapping Flow in a mapping that reads data from a flat file source and writes to a flat file target, the mapping fails with an error in the native environment. Workaround: Remove the FileName field from the imported source object and rerun the mapping. |
| OCON-17443 | When you use the Create Target option to create a Microsoft Azure Blob Storage target and select Flat as the Resource Format, fields are not propagated to the target. Workaround: Enable column projection, create the fields manually in the target file, and then run the mapping. |
| OCON-17082 | When you import an object from subdirectories whose names contain a space, data preview fails. |
| OCON-12420 | When you read or write a blob that has special characters, the mapping fails on the Spark engine. |
| OCON-12352 | When a JSON file contains special characters, the Data Integration Service does not read the data correctly in Spark mode. |
| OCON-12318 | The Data Integration Service adds an extra blank line at the end of the file when you read or write a flat file in the native environment or in Spark mode. |
| OCON-10125 | When you read data from or write data to Microsoft Azure Blob Storage, the entire blob is downloaded to the staging directory even if you cancel the mapping. |