Table of Contents

  1. Abstract
  2. PowerExchange for Amazon Redshift
  3. PowerExchange for Amazon S3
  4. PowerExchange for Cassandra
  5. PowerExchange for Cassandra JDBC
  6. PowerExchange for DataSift
  7. PowerExchange for Facebook
  8. PowerExchange for Google Analytics
  9. PowerExchange for Google BigQuery
  10. PowerExchange for Google Cloud Storage
  11. PowerExchange for Greenplum
  12. PowerExchange for HBase
  13. PowerExchange for HDFS
  14. PowerExchange for Hive
  15. PowerExchange for JD Edwards EnterpriseOne
  16. PowerExchange for LDAP
  17. PowerExchange for LinkedIn
  18. PowerExchange for MapR-DB
  19. PowerExchange for Microsoft Azure Blob Storage
  20. PowerExchange for Microsoft Azure Cosmos DB SQL API
  21. PowerExchange for Microsoft Azure Data Lake Store
  22. PowerExchange for Microsoft Azure SQL Data Warehouse
  23. PowerExchange for Microsoft Dynamics CRM
  24. PowerExchange for MongoDB
  25. PowerExchange for Netezza
  26. PowerExchange for Salesforce
  27. PowerExchange for SAP NetWeaver
  28. PowerExchange for Snowflake
  29. PowerExchange for Tableau
  30. PowerExchange for Tableau V3
  31. PowerExchange for Teradata Parallel Transporter API
  32. PowerExchange for Twitter
  33. PowerExchange for Web Content-Kapow Katalyst
  34. Informatica Global Customer Support

PowerExchange Adapters for Informatica Release Notes

PowerExchange for HDFS Known Limitations (10.2.2)

The following table describes known limitations:

Bug: OCON-19803
Description: When you run a mapping on the Spark engine to read data from a local complex file source, and folders inside the source directory contain files with the same names, if you use a wildcard pattern to read the entire parent directory, the Data Integration Service reads data from only one of the identically named files and does not read the others.

Bug: OCON-18292
Description: When you use the FileName port in a complex file target in the Binary format and run a mapping in the native environment, the Data Integration Service writes all the files to a single folder.

Bug: OCON-18208
Description: When you run a mapping to write data to a complex file using the Create target option for the Avro or Parquet format with mapping flow enabled, the schema is created with primitive data types and rows that contain null values are skipped.

Bug: OCON-18169
Description: When you run a mapping on the Spark engine to read data from a complex file of the ORC (Optimized Row Columnar) type using the question mark (?) wildcard character on a MapR distribution, the mapping fails.

Bug: OCON-17103
Description: When you run a mapping on the Spark engine to read data from a complex file and the source path contains a wildcard character, the log file does not display the source file names.

Bug: OCON-16280
Description: When you create a complex file data object from a JSON file, the task fails with the following error:
Encountered an error saving the data object

Bug: OCON-15845
Description: When you run a mapping in the native environment with a decimal data type in the source, the mapping runs successfully, but the Data Integration Service writes an empty file to the target.

Bug: BDM-14811
Description: When you validate a mapping and select a connection parameter type that is not valid, the parameter name appears incorrectly. This error occurs when you import a flat file from the Hadoop environment, parameterize the connection name, and then change the parameter type to a type that is not valid.

Bug: OCON-12579
Description: If you set the Hive warehouse directory in a Hadoop connection to an encrypted HDFS directory and the impersonation user does not have the DECRYPT_EEK permission, complex file mappings run indefinitely on the Hive engine. For one way to grant the DECRYPT_EEK permission through the Hadoop KMS, see the configuration sketch after this table.

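The DECRYPT_EEK permission referenced by OCON-12579 is granted through the Hadoop Key Management Server (KMS) key ACLs, not through Informatica. The following kms-acls.xml fragment is a minimal sketch, assuming an encryption zone key named hive_key and an impersonation user named infa_user; both names are placeholders, and the actual key and user depend on your cluster configuration.

    <!-- etc/hadoop/kms-acls.xml: grant DECRYPT_EEK on one key -->
    <property>
      <!-- hive_key is a hypothetical encryption zone key name -->
      <name>key.acl.hive_key.DECRYPT_EEK</name>
      <!-- infa_user is a hypothetical impersonation user -->
      <value>infa_user</value>
      <description>
        Allow the impersonation user to decrypt the encrypted data
        encryption keys (EEKs) for hive_key, so that mappings can access
        the encrypted Hive warehouse directory instead of hanging.
      </description>
    </property>

Hadoop hot-reloads kms-acls.xml, so the grant takes effect without restarting the KMS; no change is required on the Informatica side.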