Table of Contents

  1. Abstract
  2. 2020 - August
  3. 2020 - July
  4. 2020 - April
  5. 2020 - February
  6. 2020 - January
  7. 2019 - November
  8. 2019 - September
  9. 2019 - August
  10. 2019 - July
  11. 2019 - June
  12. 2019 - May
  13. 2019 - April
  14. 2019 - March
  15. Informatica Global Customer Support

Connector Release Notes

Microsoft Azure Blob Storage V3 Connector Known Limitations

The following known limitations are listed by CR number, each followed by its description:
IF-12413
When you edit metadata on the Fields tab in a mapping, all native data types change to Bigint, and you cannot change the scale and precision of any data type except string.
CCON-19476
When you run a mapping to write a JSON file whose column names contain Unicode characters to a Microsoft Azure Blob Storage V3 target, the mapping fails.
CCON-18341
When you run a mapping to read data from or write data to Microsoft Azure Blob Storage, by default, the scale of a double data type is set to 0 in the target file. This might result in data loss.
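To illustrate the kind of loss this limitation causes: with the scale forced to 0, the fractional part of a double value does not survive in the target file. The release note does not state whether the connector truncates or rounds; this sketch assumes truncation:

```python
# Illustration only: simulate writing doubles with scale 0 (assumed truncation).
values = [3.14159, -2.5, 100.0]
truncated = [float(int(v)) for v in values]  # scale 0 keeps no decimal places
print(truncated)  # the fractional parts are lost
```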
CCON-16295
When you run a mapping to read an Avro file that contains binary fields, data preview and the mapping fail.
CCON-16164
When the field names in the source or target object contain special characters or Unicode characters, the mapping fails.
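As a quick pre-check before running such a mapping, you can scan field names for characters likely to trigger this limitation. The regex below is an assumption about which characters count as "special" (anything outside ASCII letters, digits, and underscore); it is not an official workaround:

```python
import re

# Hypothetical pre-check: flag field names with non-ASCII characters or
# characters outside letters, digits, and underscore.
SAFE_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def risky_fields(field_names):
    """Return field names likely to hit the special/Unicode character limitation."""
    return [name for name in field_names
            if not name.isascii() or not SAFE_NAME.match(name)]

print(risky_fields(["c_custkey", "c_nam\u00e9", "order id"]))
```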
CCON-15075
When you configure a proxy server through Informatica Cloud Secure Agent user interface, the session log does not log the proxy server details.
Workaround: Configure the proxy server by setting the JVMOptions for your secure agent in the Administrator service.
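When you set the proxy through JVMOptions, the standard Java networking system properties can carry the proxy details. The host name and port below are placeholders for your environment:

```
-Dhttp.proxyHost=proxy.example.com
-Dhttp.proxyPort=8080
-Dhttps.proxyHost=proxy.example.com
-Dhttps.proxyPort=8080
```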
CCON-14123
When you read or write a compressed file, data preview does not work.
CCON-12636
When you write a JSON file of size 1 GB or more, the task fails with a Java heap space error.
Workaround: Set the JVM options for type DTM to increase the -Xms and -Xmx values in the system configuration details of the Secure Agent.
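For example, the DTM JVM options might be raised as follows. The property names and values here are illustrative; size the heap to your data volume and the memory available on the Secure Agent machine:

```
JVMOption1=-Xms1024m
JVMOption2=-Xmx4096m
```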
CCON-12616
When you write a JSON file in which column names include Unicode characters, the task fails.
CCON-12612
When you write an Avro or a Parquet file that contains null values, the mapping task writes incorrect data to the target for fields that contain null values.
CCON-12566
When you read or write a JSON file and select append blob as blob type, the mapping fails with the following error:
Cannot create write operation. All fields support read operation only.
CCON-12300
When you use Create Target to write an Avro file, the schema is created with primitive data types and does not provide an option to allow null values.
Workaround: Manually edit the schema to allow null values as required. For example:
{"type":"record","name":"S3_Avro_CT_N","fields":[
{ "name":"c_custkey" , "type":["int","null"]},
{ "name":"c_name" , "type":"string"}
{ "name":"c_nationkey" , "type":["long","null"]}
]}
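After editing, it helps to confirm that the schema is still valid JSON and that each nullable field's type is a union that includes "null". This quick sanity check uses only the standard library and the example schema above:

```python
import json

# Sanity-check an edited Avro schema: it must parse as JSON, and nullable
# fields must declare a union type that includes "null".
schema = json.loads("""
{"type":"record","name":"S3_Avro_CT_N","fields":[
  {"name":"c_custkey",  "type":["int","null"]},
  {"name":"c_name",     "type":"string"},
  {"name":"c_nationkey","type":["long","null"]}
]}
""")

nullable = [f["name"] for f in schema["fields"]
            if isinstance(f["type"], list) and "null" in f["type"]]
print(nullable)
```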
CCON-11979
Microsoft Azure Blob Storage V3 Connector displays the following common error message for all failed mappings:
The Integration Service found Informatica Hadoop distribution directory [/data/home/cloudqa/staging1/apps/Data_Integration_Server/52.0.16/ICS/main/distros/cloudera_cdh5u8/lib and /conf] for Hadoop class loader.


Updated August 02, 2020