PowerExchange Adapters for Informatica 10.2.2 HotFix 1
When you run a mapping on the Spark engine to read data from a local complex file source, if folders inside the source directory contain files with the same names and you use a wildcard pattern to read the entire parent directory, the Data Integration Service reads data from only one of the identically named files. Files with unique names are read as expected.
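Before pointing a wildcard pattern at a parent directory, you can check whether any subfolders contain identically named files. The following is a minimal Python sketch (not part of the product; the function name is illustrative) that scans a local source directory for duplicate base names:

```python
import os
from collections import defaultdict

def find_duplicate_names(source_dir):
    """Group files under source_dir by base name. Names that map to
    more than one path can trigger the duplicate-name limitation."""
    by_name = defaultdict(list)
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            by_name[name].append(os.path.join(root, name))
    return {name: paths for name, paths in by_name.items() if len(paths) > 1}
```

Renaming or consolidating the reported files before the mapping runs avoids the ambiguity.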
When you use the FileName port in a complex file target in Binary format and run a mapping in the native environment, the Data Integration Service writes all the files to a single folder.
When you run a mapping to write data to a complex file using the Create target option for Avro or Parquet formats with mapping flow enabled, the schema is created with primitive data types and rows that contain null values are skipped.
When you run a mapping on the Spark engine to read data from a complex file with the ORC (Optimized Row Columnar) data type using the question mark (?) wildcard character in a MapR distribution, the mapping fails.
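As a reminder of the wildcard semantics involved, the question mark matches exactly one character, while the asterisk matches any run of characters. The file names below are hypothetical; this sketch only illustrates which paths a ? pattern selects, so you can judge whether an equivalent * pattern covers the same files:

```python
import fnmatch

files = ["orders_1.orc", "orders_12.orc", "orders_a.orc"]

# '?' matches exactly one character, so orders_12.orc is excluded.
matched = [f for f in files if fnmatch.fnmatch(f, "orders_?.orc")]
# matched -> ["orders_1.orc", "orders_a.orc"]
```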
When you run a mapping on the Spark engine to read data from a complex file and the source path contains a wildcard character, the log file does not display the source file names.
When you create a complex file data object from a JSON file, the task fails with the following error:
Encountered an error saving the data object
When you run a mapping in the native environment with the decimal data type in the source, the mapping runs successfully but the Data Integration Service writes an empty file to the target.
When you validate a mapping and select a connection parameter type that is not valid, the parameter name appears incorrectly.
This error occurs when you import a flat file from the Hadoop environment, parameterize the connection name, and change the parameter type to a type that is not valid.
If you set the Hive warehouse directory in a Hadoop connection to an encrypted HDFS directory and the impersonation user does not have the DECRYPT_EEK permission, complex file mappings run indefinitely on the Hive engine.
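Granting the missing permission is done through the Hadoop KMS key ACLs rather than HDFS file permissions. The following is a hypothetical kms-acls.xml entry; the key name (warehouse_key) and user name (impersonation_user) are placeholders for your encryption zone key and impersonation user:

```xml
<property>
  <name>key.acl.warehouse_key.DECRYPT_EEK</name>
  <value>impersonation_user</value>
  <description>Allow the impersonation user to decrypt encrypted
  data encryption keys for the Hive warehouse encryption zone.</description>
</property>
```

After updating the KMS ACLs, restart or refresh the KMS so the change takes effect.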