Table of Contents

  1. Abstract
  2. PowerExchange for Amazon Redshift
  3. PowerExchange for Amazon S3
  4. PowerExchange for Cassandra
  5. PowerExchange for Google BigQuery
  6. PowerExchange for Google Cloud Storage
  7. PowerExchange for Greenplum
  8. PowerExchange for HBase
  9. PowerExchange for HDFS
  10. PowerExchange for Hive
  11. PowerExchange for JDBC V2
  12. PowerExchange for JD Edwards EnterpriseOne
  13. PowerExchange for Kudu
  14. PowerExchange for LDAP
  15. PowerExchange for Microsoft Azure Blob Storage
  16. PowerExchange for Microsoft Azure Cosmos DB SQL API
  17. PowerExchange for Microsoft Azure Data Lake Storage Gen1
  18. PowerExchange for Microsoft Azure Data Lake Storage Gen2
  19. PowerExchange for Microsoft Azure SQL Data Warehouse
  20. PowerExchange for MongoDB
  21. PowerExchange for Netezza
  22. PowerExchange for OData
  23. PowerExchange for Salesforce
  24. PowerExchange for SAP NetWeaver
  25. PowerExchange for Snowflake
  26. PowerExchange for Teradata
  27. Informatica Global Customer Support

PowerExchange Adapters for Informatica Release Notes (10.5.0.1)

PowerExchange for Google Cloud Storage (10.5)

Known Issues

The following table describes known issues:

Issue: OCON-27691
Description: When the Spark engine runs a mapping on an Amazon EMR 5.29 or Hortonworks HDP_3.1.5 cluster to read data from a Google Cloud Storage source, the mapping fails with the following error:
"Caused by: java.lang.IllegalArgumentException: Invalid bucket name or object name"

Issue: OCON-26458
Description: When you import a Google Cloud Storage data object in JSON format and the file contains more than 400 rows, the mapping fails.

Issue: OCON-17932
Description: When you specify the Google Cloud Storage Path in the gs://<bucket name> format and run a mapping on the Spark engine to write data to a Google Cloud Storage target, the mapping fails.
Workaround: Specify the Google Cloud Storage Path in the following format: gs://<bucket name>/<folder_name>
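As a minimal sketch of the OCON-17932 workaround, the snippet below contrasts the failing bucket-only path with the working bucket-plus-folder path. The bucket and folder names are hypothetical examples, not values from this release note:

```python
# Hypothetical path values illustrating the OCON-17932 workaround.
# A bucket-only Google Cloud Storage Path causes the Spark mapping to fail;
# appending a folder name to the path is the documented workaround.
bucket_only = "gs://my-bucket"                # fails: gs://<bucket name>
with_folder = "gs://my-bucket/output-folder"  # works: gs://<bucket name>/<folder_name>

# The workaround path is the bucket path plus a "/<folder_name>" suffix.
assert with_folder.startswith(bucket_only + "/")
```

The check simply confirms the working path extends the failing one with a folder segment, which is the only change the workaround requires.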