Table of Contents

  1. Informatica Bug Tracking System Change
  2. PowerExchange for Amazon Redshift
  3. PowerExchange for Amazon S3
  4. PowerExchange for Cassandra
  5. PowerExchange for DataSift
  6. PowerExchange for Facebook
  7. PowerExchange for Greenplum
  8. PowerExchange for HBase
  9. PowerExchange for HDFS
  10. PowerExchange for Hive
  11. PowerExchange for JD Edwards EnterpriseOne
  12. PowerExchange for LDAP
  13. PowerExchange for LinkedIn
  14. PowerExchange for MapR-DB
  15. PowerExchange for Microsoft Azure Data Lake Store
  16. PowerExchange for Microsoft Azure SQL Data Warehouse
  17. PowerExchange for Microsoft Azure Blob Storage
  18. PowerExchange for Microsoft Dynamics CRM
  19. PowerExchange for MongoDB
  20. PowerExchange for Netezza
  21. PowerExchange for OData
  22. PowerExchange for Salesforce
  23. PowerExchange for SAP NetWeaver
  24. PowerExchange for Tableau
  25. PowerExchange for Teradata Parallel Transporter API
  26. PowerExchange for Twitter
  27. PowerExchange for Web Content-Kapow Katalyst
  28. Informatica Global Customer Support

PowerExchange Adapters for Informatica Release Notes

PowerExchange for Microsoft Azure SQL Data Warehouse Known Limitations (10.2)

The following known limitations apply:

OCON-10181: When you read data from Microsoft Azure SQL Data Warehouse and the table contains special characters, the mapping fails.

OCON-10141: When you run a mapping on the Hive engine to read data from or write data to Microsoft Azure SQL Data Warehouse, the intermediate files are downloaded to the staging directory even if you cancel the mapping.

OCON-10128: When you create a data object to read data from a large table in Microsoft Azure SQL Data Warehouse, preview the data, and set the Read up to how many rows field to 1000, the Data Integration Service downloads the entire table to the staging directory.

OCON-8793: You cannot run a Microsoft Azure SQL Data Warehouse mapping on the Blaze engine.

OCON-972: The Hadoop job log does not display reader logs.

OCON-844: The Data Integration Service reads blank char, varchar, nchar, or nvarchar records from Microsoft Azure SQL Data Warehouse as NULL.

OCON-811: The Data Integration Service does not delete the external table and staging blob files when the mapping fails or when you cancel an operation. You must manually delete the files (see the cleanup sketch after this list).

OCON-585: When an Azure table contains a bad record, the Data Integration Service fails the mapping instead of rejecting the bad record.

OCON-533: You cannot delete data in Hadoop mode.

OCON-399: DistCp jobs are submitted as the YARN user instead of the Data Integration Service user. A DistCp job should be submitted as the Data Integration Service user or the impersonation user only.
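
Workaround sketch for OCON-811. Because the Data Integration Service does not clean up the staging blob files after a failed or canceled mapping, you can remove them yourself. The following is a minimal Python sketch using the azure-storage-blob SDK; the connection string, container name, and staging prefix are hypothetical placeholders for values from your own environment, and the exact layout of the staging directory is an assumption to verify before deleting anything.

    # Cleanup sketch for OCON-811: delete leftover staging blobs after a
    # failed or canceled mapping. Requires: pip install azure-storage-blob
    # All three values below are hypothetical placeholders.
    from azure.storage.blob import ContainerClient

    CONN_STR = "<your-storage-account-connection-string>"  # placeholder
    CONTAINER = "<your-staging-container>"                 # placeholder
    STAGING_PREFIX = "<your-staging-directory-prefix>"     # placeholder

    container = ContainerClient.from_connection_string(CONN_STR, CONTAINER)

    # List every blob under the staging prefix and delete it.
    for blob in container.list_blobs(name_starts_with=STAGING_PREFIX):
        print(f"Deleting {blob.name}")
        container.delete_blob(blob.name)

The leftover external table can be dropped from Microsoft Azure SQL Data Warehouse with a DROP EXTERNAL TABLE statement; the table name depends on the mapping run, so check the sys.external_tables catalog view before dropping anything.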