Table of Contents

  1. Abstract
  2. PowerExchange for Amazon Redshift
  3. PowerExchange for Amazon S3
  4. PowerExchange for Cassandra
  5. PowerExchange for DataSift
  6. PowerExchange for Facebook
  7. PowerExchange for Greenplum
  8. PowerExchange for Google BigQuery
  9. PowerExchange for HBase
  10. PowerExchange for HDFS
  11. PowerExchange for Hive
  12. PowerExchange for JD Edwards EnterpriseOne
  13. PowerExchange for LDAP
  14. PowerExchange for LinkedIn
  15. PowerExchange for MapR-DB
  16. PowerExchange for Microsoft Azure Blob Storage
  17. PowerExchange for Microsoft Azure Data Lake Store
  18. PowerExchange for Microsoft Azure SQL Data Warehouse
  19. PowerExchange for Microsoft Dynamics CRM
  20. PowerExchange for MongoDB
  21. PowerExchange for Netezza
  22. PowerExchange for OData
  23. PowerExchange for Salesforce
  24. PowerExchange for SAP NetWeaver
  25. PowerExchange for Tableau
  26. PowerExchange for Tableau V3
  27. PowerExchange for Teradata Parallel Transporter API
  28. PowerExchange for Twitter
  29. PowerExchange for Web Content-Kapow Katalyst
  30. Informatica Global Customer Support

PowerExchange Adapters for Informatica Release Notes

PowerExchange for Microsoft Azure SQL Data Warehouse Known Limitations (10.2)

The following list describes known limitations (bug ID, then description):

OCON-10181: When you read data from Microsoft Azure SQL Data Warehouse and the table contains special characters, the mapping fails.

OCON-10141: When you run a mapping on the Hive engine to read data from or write data to Microsoft Azure SQL Data Warehouse, the intermediate files are downloaded to the staging directory even if you cancel the mapping.

OCON-10128: When you create a data object to read data from a large table in Microsoft Azure SQL Data Warehouse, preview the data, and set the "Read up to how many rows" field to 1000, the Data Integration Service downloads the entire table to the staging directory.

OCON-8793: You cannot run a Microsoft Azure SQL Data Warehouse mapping on the Blaze engine.

OCON-972: The Hadoop job log does not display reader logs.

OCON-844: The Data Integration Service reads a blank char, varchar, nchar, or nvarchar record from Microsoft Azure SQL Data Warehouse as NULL.

OCON-811: When a mapping fails or when you cancel an operation, the Data Integration Service does not delete the external table and staging blob files. You must manually delete them; see the cleanup sketch after this list.

OCON-585: When an Azure table contains a bad record, the Data Integration Service fails the mapping instead of rejecting the bad record.

OCON-533: You cannot run a mapping in the Hadoop environment to delete data.

OCON-399: When you run a mapping on the Hive engine, the Data Integration Service submits the job as the yarn user instead of the Data Integration Service user.
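For OCON-811, the leftover objects can be removed by hand. The following Python sketch is an illustration only, not part of the product: it assumes the pyodbc and azure-storage-blob packages, and every name in it (server, database, external table, container, staging prefix, credentials) is a placeholder you must replace with the values your mapping actually used.

```python
# Hedged sketch: manual cleanup after a failed or canceled mapping (OCON-811).
# Requires: pip install pyodbc azure-storage-blob
# All connection details and object names below are placeholders.

import pyodbc
from azure.storage.blob import ContainerClient

# 1. Drop the leftover external table in Microsoft Azure SQL Data Warehouse.
SQL_CONN = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"    # placeholder server
    "DATABASE=mydw;UID=myuser;PWD=mypassword"  # placeholder credentials
)
EXTERNAL_TABLE = "dbo.infa_ext_stage_table"    # placeholder table name

with pyodbc.connect(SQL_CONN, autocommit=True) as conn:
    cursor = conn.cursor()
    # The IF OBJECT_ID guard makes the script safe to rerun if the
    # external table was already dropped.
    cursor.execute(
        f"IF OBJECT_ID('{EXTERNAL_TABLE}') IS NOT NULL "
        f"DROP EXTERNAL TABLE {EXTERNAL_TABLE}"
    )

# 2. Delete the leftover staging blobs from the Azure Blob Storage container.
BLOB_CONN = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"  # placeholder
container = ContainerClient.from_connection_string(BLOB_CONN, container_name="staging")

# Remove every blob under the staging prefix the mapping used (placeholder prefix).
for blob in container.list_blobs(name_starts_with="infa_staging/"):
    container.delete_blob(blob.name)
    print(f"deleted {blob.name}")
```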