PowerExchange Adapters for Informatica 10.4.1
When you run a mapping in the native environment to write data to a complex file target with the option to overwrite the target data, and the target filename does not contain a file format extension such as ".avro" or ".parquet", the Data Integration Service does not delete the existing target data before writing data.
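The issue hinges on whether the target filename carries a recognized format extension. A minimal sketch of such a check, assuming the two extensions named above are the ones involved (the set of extensions and the function name are illustrative, not part of the product):

```python
import os

# Assumption: the formats affected are the two named in the issue above.
COMPLEX_FILE_EXTENSIONS = {".avro", ".parquet"}

def has_format_extension(filename):
    """Return True if the target filename ends with a known
    complex-file format extension."""
    return os.path.splitext(filename)[1].lower() in COMPLEX_FILE_EXTENSIONS

print(has_format_extension("customers.avro"))  # True
print(has_format_extension("customers"))       # False
```

A filename that fails this check is the condition under which the existing target data is not deleted, so naming targets with an explicit extension may avoid the behavior.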
When you run a mapping on the Blaze engine to read data from and write data to complex file data objects in the sequence file format using a Kerberos connection, the mapping fails with the following exception:
"GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos target)"
When you run a mapping on the Spark engine to read data from an empty JSON complex file source and write data to a complex file target, the mapping should fail. Instead, the mapping runs successfully and the Data Integration Service generates an empty target file.
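Because the mapping does not fail on an empty source, one way to surface the problem is to validate the JSON source before the mapping runs. A minimal sketch, assuming a local file path is available to a pre-check step (the function name is illustrative, not an Informatica API):

```python
import json
import os

def validate_json_source(path):
    """Fail fast when the JSON source file is empty, instead of
    letting the run silently produce an empty target."""
    if os.path.getsize(path) == 0:
        raise ValueError(f"JSON source {path!r} is empty; aborting pre-check")
    with open(path) as f:
        return json.load(f)
```

Running such a check ahead of the mapping turns the silent empty-target outcome described above into an explicit error.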