Table of Contents

  1. Abstract
  2. PowerExchange for Amazon Redshift
  3. PowerExchange for Amazon S3
  4. PowerExchange for Cassandra
  5. PowerExchange for DataSift
  6. PowerExchange for Facebook
  7. PowerExchange for Greenplum
  8. PowerExchange for HBase
  9. PowerExchange for HDFS
  10. PowerExchange for Hive
  11. PowerExchange for JD Edwards EnterpriseOne
  12. PowerExchange for LDAP
  13. PowerExchange for LinkedIn
  14. PowerExchange for MapR-DB
  15. PowerExchange for Microsoft Azure Blob Storage
  16. PowerExchange for Microsoft Azure Data Lake Store
  17. PowerExchange for Microsoft Azure SQL Data Warehouse
  18. PowerExchange for Microsoft Dynamics CRM
  19. PowerExchange for MongoDB
  20. PowerExchange for Netezza
  21. PowerExchange for OData
  22. PowerExchange for Salesforce
  23. PowerExchange for SAP NetWeaver
  24. PowerExchange for Snowflake
  25. PowerExchange for Tableau
  26. PowerExchange for Teradata Parallel Transporter API
  27. PowerExchange for Twitter
  28. PowerExchange for Web Content-Kapow Katalyst
  29. Informatica Global Customer Support

PowerExchange Adapters for Informatica Release Notes

PowerExchange for Amazon S3 Fixed Limitations (10.2.1)

Review the Release Notes of previous releases for information about previously fixed limitations.

The following table describes fixed limitations:

Bug: OCON-25561
Description: When you run a mapping that reads data from or writes data to a flat file and select the text qualifier as NONE, the NONE text qualifier is not honored. Instead, the default double-quote text qualifier is written to the target.

Bug: OCON-12394
Description: When you set the Compression Format type to none and run a mapping on the Spark engine to write an Avro file to an Amazon S3 target, the mapping runs successfully. However, the Data Integration Service compresses the target Amazon S3 file using the snappy compression format.

Bug: OCON-11798
Description: When you select an encryption type and run a mapping in the native environment to read or write Avro and Parquet files, the mapping runs successfully. However, the encryption type is not honored.

Bug: OCON-10806
Description: When you run a mapping to read data from a Parquet source and the precision of the string values is greater than 4000, the mapping runs successfully. However, the Data Integration Service writes only string values with precision up to 4000 to the target and truncates the remaining data.

Bug: OCON-10804
Description: When you run a mapping to write data to a Parquet target and you do not connect all the ports in the target, the mapping fails with the following error message:
java.lang.Exception: [MPSVCCMN_10094] The Mapping Service Module failed to run the job with ID [YZ-LZro4EeeVGlU8guu1DA] because of the following error: [LDTM_0072] java.lang.RuntimeException:

Bug: OCON-10802
Description: When you run an Amazon S3 mapping to read data from a Parquet file that contains null values, the data preview fails with the following error message:
java.lang.RuntimeException: java.lang.RuntimeException: