Table of Contents

  1. Abstract
  2. PowerExchange for Amazon Redshift
  3. PowerExchange for Amazon S3
  4. PowerExchange for Cassandra
  5. PowerExchange for Google BigQuery
  6. PowerExchange for Google Cloud Storage
  7. PowerExchange for Greenplum
  8. PowerExchange for HBase
  9. PowerExchange for HDFS
  10. PowerExchange for Hive
  11. PowerExchange for JDBC V2
  12. PowerExchange for JD Edwards EnterpriseOne
  13. PowerExchange for Kudu
  14. PowerExchange for LDAP
  15. PowerExchange for Microsoft Azure Blob Storage
  16. PowerExchange for Microsoft Azure Cosmos DB SQL API
  17. PowerExchange for Microsoft Azure Data Lake Storage Gen1
  18. PowerExchange for Microsoft Azure Data Lake Storage Gen2
  19. PowerExchange for Microsoft Azure SQL Data Warehouse
  20. PowerExchange for MongoDB
  21. PowerExchange for Netezza
  22. PowerExchange for OData
  23. PowerExchange for Salesforce
  24. PowerExchange for SAP NetWeaver
  25. PowerExchange for Snowflake
  26. PowerExchange for Teradata
  27. Informatica Global Customer Support

PowerExchange Adapters for Informatica Release Notes

PowerExchange for Microsoft Azure Blob Storage User Guide (10.5)

Fixed Issues

The following table describes fixed issues:

Bug: OCON-26004
Description: When you run a mapping in the native environment to read a complex file, where the Microsoft Azure Blob Storage connection uses the shared access signature authorization type and the Source transformation contains an override for the container name, the mapping fails with the following error:
java.lang.Exception: [MPSVCCMN_10094] The Mapping Service Module failed to run the job with ID [XK6nqtfUEeqypGWC8kJ51Q] because of the following error: [EdtmExec_00007] Exception: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException

Bug: OCON-17082
Description: When you import an object from subdirectories whose names contain a space, data preview fails.

Known Issues

The following table describes known issues:

Bug: BDM-19847
Description: For the write operation, when you run a mapping on the Spark engine and the folder path contains special characters, the Data Integration Service creates a new folder.

Bug: OCON-22511
Description: When you read data from a Microsoft Azure SQL Data Warehouse source and use the Create Target option to create a Microsoft Azure Blob Storage target, the mapping fails if the Microsoft Azure Blob Storage connection uses SAS authentication.

Bug: OCON-20605
Description: When you run a mapping in the native environment to read a flat file that contains Unicode characters, a space, null values, single quotes, or a value that starts with a dollar sign, the Data Integration Service adds double quotes to the values when writing data to the target.

Bug: OCON-17642
Description: When you enable Mapping Flow in a mapping that reads data from a flat file source and writes to a flat file target, the mapping fails in the native environment with the following error:
java.lang.Exception: [MPSVCCMN_10094] The Mapping Service Module failed to run the job with ID [Ic2j9ASPEemTlSYmtVHPww] because of the following error: [EdtmExec_00007] Exception: /tmp/insertd29a7def_bb59_452d_8051_ea4b4630807b9132318161205585091.azb (No such file or directory)
Workaround: Remove the FileName field from the imported source object and rerun the mapping.

Bug: OCON-17443
Description: When you use the Create Target option to create a Microsoft Azure Blob Storage target and select Flat as the Resource Format, fields are not propagated to the target.
Workaround: Enable column projection, create the fields manually in the target file, and run the mapping.

Bug: OCON-12420
Description: When you read or write a blob that has special characters, the mapping fails on the Spark engine.

Bug: OCON-12352
Description: When a JSON file contains special characters, the Data Integration Service does not read the data correctly on the Spark engine.

Bug: OCON-12318
Description: The Data Integration Service adds an extra blank line at the end when you read or write a flat file in the native environment or on the Spark engine.

Bug: OCON-10125
Description: When you read data from or write data to Microsoft Azure Blob Storage, the entire blob is downloaded to the staging directory even if you cancel the mapping.