Table of Contents

  1. Abstract
  2. PowerExchange for Amazon Redshift
  3. PowerExchange for Amazon S3
  4. PowerExchange for Cassandra
  5. PowerExchange for DataSift
  6. PowerExchange for Facebook
  7. PowerExchange for Greenplum
  8. PowerExchange for Google BigQuery
  9. PowerExchange for HBase
  10. PowerExchange for HDFS
  11. PowerExchange for Hive
  12. PowerExchange for JD Edwards EnterpriseOne
  13. PowerExchange for LDAP
  14. PowerExchange for LinkedIn
  15. PowerExchange for MapR-DB
  16. PowerExchange for Microsoft Azure Blob Storage
  17. PowerExchange for Microsoft Azure Data Lake Store
  18. PowerExchange for Microsoft Azure SQL Data Warehouse
  19. PowerExchange for Microsoft Dynamics CRM
  20. PowerExchange for MongoDB
  21. PowerExchange for Netezza
  22. PowerExchange for OData
  23. PowerExchange for Salesforce
  24. PowerExchange for SAP NetWeaver
  25. PowerExchange for Tableau
  26. PowerExchange for Tableau V3
  27. PowerExchange for Teradata Parallel Transporter API
  28. PowerExchange for Twitter
  29. PowerExchange for Web Content-Kapow Katalyst
  30. Informatica Global Customer Support

PowerExchange Adapters for Informatica Release Notes

PowerExchange for Amazon Redshift Known Limitations (10.2)

The following known limitations exist:

Bug: OCON-9834
Description: When you run an Amazon Redshift mapping on the Spark engine to read from or write to an Amazon Redshift cluster that has Version 4 authentication, the mapping fails with the following error message:
Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: 9BDDEEB8241688A2)
This issue occurs when you use the Hortonworks 2.6 distribution.

Bug: OCON-9827
Description: When you run an Amazon Redshift mapping on the Spark engine to read from or write to an Amazon Redshift cluster that has Version 4 authentication, the mapping fails with the following error message:
com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 400, AWS Service: Amazon S3, AWS Request ID: 5EEB36DFAC18DE3B, AWS Error Code: null, AWS Error Message: Bad Request
This issue occurs when you use the IBM BigInsights 4.2 distribution.

Bug: OCON-9663
Description: When you run an Amazon Redshift mapping to read or write data, the Stop on Errors property does not work.

Bug: OCON-8022
Description: If you import an Amazon Redshift table that has a single quote (') in the column name, the mapping fails with the following error message:
[LDTM_0072] [Amazon](500051) ERROR processing query/statement. Error: Parsing failed, Query: unload ('SELECT "adpqa"."sq_col"."id'" FROM "adpqa"."sq_col"') TO 's3://infa.qa.bucket/ 0b0ad503-1c2c-4514-95ac-85a5adb71b3b1489385038407/sq_col_' credentials 'aws_access_key_id=********;aws_secret_access_key=********' ESCAPE DELIMITER ','
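
As the error text suggests, the generated unload statement wraps the SELECT query in single quotes, so a quote inside a column name can break the statement. The sketch below is illustrative only and is not an Informatica feature or documented workaround: a pre-check that flags object names containing a single quote or a backslash (the backslash case relates to the table-name limitation listed under OCON-7322 below). All names in the sample are hypothetical.

    # Hypothetical pre-check: flag Redshift column or table names that contain
    # characters reported to break imports (single quote or backslash).
    PROBLEM_CHARS = ("'", "\\")

    def problematic_names(names):
        """Return the names that contain a single quote or a backslash."""
        return [n for n in names if any(ch in n for ch in PROBLEM_CHARS)]

    if __name__ == "__main__":
        sample_names = ["id", "cust'name", "region", "dim\\table"]  # made-up names
        print(problematic_names(sample_names))  # flags cust'name and dim\table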

Bug: OCON-7965
Description: An Amazon Redshift mapping fails on the Blaze engine when it reads from or writes to an Amazon Redshift cluster that uses the Hortonworks 2.3 distribution and requires Version 4 authentication.

Bug: OCON-7909
Description: When you run an Amazon Redshift mapping on the Blaze engine to read data from or write data to an Amazon Redshift cluster that requires Version 4 authentication, the mapping fails. This issue occurs if you use the MapR 5.2 distribution.

Bug: OCON-7322
Description: If you import an Amazon Redshift table that has a single quote (') or a backslash (\) in the table name, the read and write operations fail.

Bug: OCON-6929
Description: If you do not connect all the ports in an Amazon Redshift mapping and run the mapping on the Hive engine, the mapping fails with the following error:
java.lang.RuntimeException [HIVE_1070]The Integration Service failed to run Hive query [exec3_query_2] for task [exec3] due to following error: Hive error code [10,044] , Hive message [FAILED: SemanticException [Error 10044]

Bug: OCON-6921
Description: When you run an Amazon Redshift mapping that contains a timestamp field in the native environment, the Data Integration Service truncates the microsecond values to milliseconds.
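
For illustration only, a microsecond-precision value keeps just its first three fractional digits when it is written with millisecond precision. The sample value below is made up:

    # Illustrative only: truncating a hypothetical microsecond-precision
    # timestamp to millisecond precision, as described for OCON-6921.
    from datetime import datetime

    src = datetime(2017, 3, 13, 10, 30, 45, 123456)          # made-up source value
    trunc = src.replace(microsecond=src.microsecond // 1000 * 1000)
    print(src)     # 2017-03-13 10:30:45.123456
    print(trunc)   # 2017-03-13 10:30:45.123000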

Bug: OCON-6883
Description: When you run an Amazon Redshift mapping on the Blaze engine to read from or write to an Amazon Redshift cluster that has Version 4 authentication, the mapping fails. This issue occurs when you use the IBM BigInsights 4.2 distribution.

Bug: OCON-6785
Description: When the Amazon Redshift source contains both double quotes (") and the delimiter you specified in the mapping, the double quotes are truncated in the target. Also, the escape character is retained in the target.

Bug: OCON-6583
Description: If you turn off the Parallel option in the unload command and run an Amazon Redshift mapping on the Blaze engine, not all the rows from the source are written to the Amazon Redshift target even though the mapping runs successfully.
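
For context, PARALLEL OFF is the Amazon Redshift UNLOAD option that writes the unloaded data serially instead of in parallel. The sketch below only illustrates the shape of such a statement; the query, bucket path, and credentials are placeholders, and this is not the statement that the Data Integration Service generates:

    # Illustrative only: a Redshift UNLOAD statement with the Parallel option
    # turned off. Query, bucket path, and credentials are placeholders.
    unload_sql = """
    UNLOAD ('SELECT * FROM myschema.mytable')
    TO 's3://my-staging-bucket/mytable_'
    CREDENTIALS 'aws_access_key_id=<access key>;aws_secret_access_key=<secret key>'
    DELIMITER ','
    PARALLEL OFF;
    """
    print(unload_sql)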

Bug: OCON-6505
Description: If you specify user impersonation in a Hadoop connection and run an Amazon Redshift mapping on the Hive engine, no data is written to the target even though the mapping runs successfully.

Bug: OCON-6346
Description: When you run an Amazon Redshift mapping on the Blaze engine, the success and error files are not generated.

Bug: OCON-6266
Description: When you run an Amazon Redshift mapping that compresses the staging files on the Blaze engine, the mapping fails. The staging file compression is ignored.

Bug: OCON-6260
Description: When you run an Amazon Redshift mapping on the Blaze engine, the tasklet log does not display the row statistics even if the mapping runs successfully.

Bug: OCON-6252
Description: When you run a mapping on the Blaze engine, the Real and Double data type values are rounded off.
Workaround: Use the Numeric data type in place of Real and Double data types.

Bug: OCON-361
Description: An Amazon Redshift mapping fails on the Hive engine for a Hadoop cluster that uses Kerberos authentication.

Bug: OCON-1297
Description: When you configure the following attributes and run an Amazon Redshift mapping in the Hadoop environment, the mapping might fail, depending on the engine that you select for mapping execution:
Read Operation Attributes:
  • S3 Client Encryption
  • Staging Directory Location
Write Operation Attributes:
  • Enable Compression
  • CopyOptions Property File
  • Null value for CHAR and VARCHAR data types
  • S3 Server Side Encryption
  • S3 Client Side Encryption
  • Staging Directory Location
  • Success File Directory
  • Error File Directory

Bug: OCON-1275
Description: A mapping with more than one Amazon Redshift object fails in the Hadoop run-time environment for the MapR distribution.

Bug: OCON-10209
Description: When you run an Amazon Redshift mapping on the Spark engine to read from or write to an Amazon Redshift cluster that has Version 4 authentication, the mapping fails with the following error message:
com.amazonaws.services.s3.model.AmazonS3Exception:
This issue occurs when you use the MapR distribution.

Bug: OCON-10194
Description: When you run an Amazon Redshift mapping on the Spark engine and the mapping fails, the Data Integration Service does not delete the staging files from the Amazon S3 staging directory.
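
If failed runs leave staging files behind, they have to be removed outside the product. The following is a minimal cleanup sketch, not an Informatica feature; it assumes the boto3 library with AWS credentials available in the environment, and the bucket name and prefix are placeholders:

    # Hypothetical cleanup of staging files left behind by a failed run
    # (see OCON-10194). Bucket and prefix are placeholders.
    import boto3

    BUCKET = "my-staging-bucket"        # placeholder staging bucket
    PREFIX = "redshift_staging/"        # placeholder staging directory prefix

    def delete_staging_files(bucket, prefix):
        s3 = boto3.client("s3")
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
            for obj in page.get("Contents", []):
                s3.delete_object(Bucket=bucket, Key=obj["Key"])
                print("deleted", obj["Key"])

    if __name__ == "__main__":
        delete_staging_files(BUCKET, PREFIX)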