PowerExchange Adapters for Informatica 10.4.1
You cannot use multiple levels of partitioning when you run a mapping to write data to a complex file target with the Filename port enabled.
When you run a mapping in the native environment to write data to a complex file with the Filename port and mapping flow enabled, the Data Integration Service generates an incorrect folder structure and writes all data to a single file.
When you run a mapping in the native environment to read or write data to a complex file object in ORC format, the mapping fails.
When you run a mapping to read from a complex file source and write to a complex file target and the source object contains unsupported data types in the schema, the mapping fails.
When you import a complex file data object in JSON format, the import fails with the following error:
Array must contain at least 1 element for projection
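The message suggests that the import cannot project a type for an array that is empty in the sample file. As a hedged illustration (the field names below are invented, not taken from the product), a sample object such as {"orders": []} cannot be projected, while seeding each array with one representative element gives the importer a structure to infer:

```json
{
  "orders": [
    {"orderId": 1, "amount": 10.5}
  ]
}
```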
When you run a mapping on the Spark engine to read data from a complex file and the source path contains a wildcard character, the log file does not display the source file names.
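Because the Spark log does not list the matched files, one way to see exactly which files a wildcard pattern picks up is to expand the same pattern yourself. A minimal sketch, assuming local filesystem access and invented file names:

```python
import glob
import os
import tempfile

# Stand-in source directory with a few sample files (names are invented).
tmpdir = tempfile.mkdtemp()
for name in ("part-0001.avro", "part-0002.avro", "notes.txt"):
    open(os.path.join(tmpdir, name), "w").close()

# Expand the same wildcard pattern the mapping uses to see which
# files it would pick up, since the log does not display them.
matched = sorted(glob.glob(os.path.join(tmpdir, "part-*.avro")))
print([os.path.basename(p) for p in matched])
```

For sources on HDFS, an equivalent listing with `hdfs dfs -ls 'path/part-*'` serves the same purpose.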
When you create a complex file data object from a JSON file, the task fails with the following error:
Encountered an error saving the data object
When you run a mapping to read data from a complex file source in JSON format with compression enabled, the mapping runs successfully but the Data Integration Service fails to read data from the source.
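When the mapping succeeds but returns no data, it can help to confirm independently that the compressed source is valid gzip and valid JSON before troubleshooting the mapping itself. A minimal sketch using Python's standard library, with an in-memory stand-in for the source file:

```python
import gzip
import io
import json

# Stand-in for a gzip-compressed JSON source file, written in memory here.
records = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
    gz.write(json.dumps(records).encode("utf-8"))

# Decompress and parse the bytes exactly as a reader would, to verify
# the file itself is not the source of the failure.
with gzip.GzipFile(fileobj=io.BytesIO(buf.getvalue()), mode="rb") as gz:
    decoded = json.loads(gz.read().decode("utf-8"))
print(decoded == records)
```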
If you set the Hive warehouse directory in a Hadoop connection to an encrypted HDFS directory and the impersonation user does not have the DECRYPT_EEK permission, complex file mappings run indefinitely on the Hive engine.