PowerExchange Adapters for Informatica
- PowerExchange Adapters for Informatica 10.4.0
- All Products
| Bug | Description |
|---|---|
| OCON-23230 | You cannot use multiple level partitioning when you run a mapping to write data to a complex file target with the Filename port enabled. |
| OCON-23124 | When you run a mapping in the native environment to write data to a complex file using the filename port with mapping flow enabled, the Data Integration Service generates an incorrect folder structure and writes data to a single file. |
| OCON-23122 | When you run a mapping in the native environment to read or write data to a complex file object in ORC format, the mapping fails. |
| OCON-23084 | When you run a mapping to read from a complex file source and write to a complex file target, and the source object contains unsupported data types in the schema, the mapping fails. |
| OCON-21852 | When you import a complex file data object in JSON format, the import fails with the following error: Array must contain at least 1 element for projection |
| OCON-17103 | When you run a mapping on the Spark engine to read data from a complex file and the source path has a wildcard character, the log file does not display the source file names. |
| OCON-16280 | When you create a complex file data object from a JSON file, the task fails with the following error: Encountered an error saving the data object |
| OCON-15862 | When you run a mapping to read data from a complex file source in JSON format and enable compression, the mapping runs successfully but the Data Integration Service fails to read data from the source. |
| OCON-12579 | If you set the Hive warehouse directory in a Hadoop connection to an encrypted HDFS directory and the impersonation user does not have the DECRYPT_EEK permission, complex file mappings run indefinitely on the Hive engine. See the diagnostic sketch after this table. |
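For OCON-12579, the hang occurs when the Hive warehouse directory sits inside an HDFS encryption zone whose key the impersonation user cannot decrypt. The following Python sketch is a hypothetical administrator-side check, not part of the product or of Informatica tooling: it assumes the Hadoop client is on the PATH, that the current user may run `hdfs crypto -listZones`, and that the warehouse path (here defaulted to /user/hive/warehouse) is known. It only reports whether the path falls inside an encryption zone and which zone key would need the DECRYPT_EEK grant.

```python
# Hypothetical diagnostic for OCON-12579: report whether the Hive warehouse
# directory lies inside an HDFS encryption zone. If it does, the impersonation
# user needs DECRYPT_EEK on the zone key, or complex file mappings can hang on
# the Hive engine. Assumes Hadoop client tools on the PATH and permission to
# run `hdfs crypto -listZones`.
import subprocess
import sys


def list_encryption_zones():
    """Return (zone_path, key_name) tuples parsed from `hdfs crypto -listZones`."""
    out = subprocess.run(
        ["hdfs", "crypto", "-listZones"],
        capture_output=True, text=True, check=True,
    ).stdout
    zones = []
    for line in out.splitlines():
        parts = line.split()
        # Keep only lines that look like "<zone path> <key name>".
        if len(parts) >= 2 and parts[0].startswith("/"):
            zones.append((parts[0], parts[1]))
    return zones


def zone_for(path, zones):
    """Return the (zone, key) pair that contains `path`, or None."""
    for zone, key in zones:
        if path == zone or path.startswith(zone.rstrip("/") + "/"):
            return zone, key
    return None


if __name__ == "__main__":
    warehouse_dir = sys.argv[1] if len(sys.argv) > 1 else "/user/hive/warehouse"
    match = zone_for(warehouse_dir, list_encryption_zones())
    if match:
        zone, key = match
        print(f"{warehouse_dir} is inside encryption zone {zone} (key: {key}).")
        print("Grant the impersonation user DECRYPT_EEK on this key in the KMS ACLs.")
    else:
        print(f"{warehouse_dir} is not inside an HDFS encryption zone.")
```

The DECRYPT_EEK grant itself is made in the Hadoop KMS ACLs (or Ranger KMS, if it manages the keys), not in the Informatica connection, so this sketch stops at identifying the zone key.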