PowerExchange Adapters for Informatica
- PowerExchange Adapters for Informatica 10.5.3
- All Products
Issue | Description
---|---
OCON-27505 | If you parameterized an Amazon Redshift table name and upgraded from Informatica version 10.2.2 to version 10.5, the parameterized value of the table name does not appear after the upgrade.
Bug | Description
---|---
OCON-23228 | When you run a mapping that creates an Amazon Redshift target at run time and the column names from the output ports of a transformation in the pipeline contain a mix of uppercase and lowercase characters, the mapping fails with a null pointer exception.<br>Workaround: Use an Expression or Case Converter transformation to convert the column names to lowercase before you select Create Target.
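The workaround above amounts to lowercasing every output port name before the target is created. A minimal Python sketch of that normalization, with hypothetical column names standing in for a transformation's output ports:

```python
# Hypothetical mixed-case column names from a transformation's output ports.
columns = ["OrderID", "customerName", "TOTAL_AMOUNT"]

# Normalize to lowercase, mirroring what an Expression or Case Converter
# transformation would do before the Redshift target is created at run time.
normalized = [name.lower() for name in columns]

print(normalized)  # ['orderid', 'customername', 'total_amount']
```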
OCON-10209 | When you use the MapR distribution, an Amazon Redshift mapping fails on the Spark engine when it reads from or writes to an Amazon Redshift cluster that has Version 4 authentication, with the following error message:
OCON-9834 | When you use the Hortonworks 2.6 distribution, an Amazon Redshift mapping fails on the Spark engine when it reads from or writes to an Amazon Redshift cluster that has Version 4 authentication, with the following error message:
OCON-9663 | When you run an Amazon Redshift mapping to read or write data, the Stop on Errors property does not work.
OCON-8022 | If you import an Amazon Redshift table that has a single quote (') in the column name, the mapping fails with the following error message:
OCON-7965 | When you run an Amazon Redshift mapping on the Blaze engine to read data from or write data to an Amazon Redshift cluster that requires Version 4 authentication, the mapping fails. This issue occurs if you use the Hortonworks 2.3 distribution.
OCON-7909 | When you run an Amazon Redshift mapping on the Blaze engine to read data from or write data to an Amazon Redshift cluster that requires Version 4 authentication, the mapping fails. This issue occurs if you use the MapR 5.2 distribution.
OCON-7322 | If you import an Amazon Redshift table that has a single quote (') or a backslash (\) in the table name, the read and write operations fail.
OCON-6921 | When you run an Amazon Redshift mapping that contains a timestamp field in the native environment, the Data Integration Service truncates the microsecond values to milliseconds.
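The truncation described above can be illustrated with Python's datetime type: the microsecond component is cut down to millisecond precision (the timestamp value here is illustrative):

```python
from datetime import datetime

# A timestamp with full microsecond precision.
ts = datetime(2024, 1, 15, 10, 30, 45, 123456)

# Truncate microseconds to milliseconds, analogous to what the
# Data Integration Service does for timestamp fields in the
# native environment.
truncated = ts.replace(microsecond=ts.microsecond // 1000 * 1000)

print(truncated.microsecond)  # 123000
```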
OCON-6785 | When the Amazon Redshift source contains both double quotes (") and the delimiter that you specified in the mapping, the double quotes are truncated in the target. Also, the escape character is retained in the target.
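For reference, correct handling of a field that contains both double quotes and the delimiter escapes the embedded quotes and round-trips the field unchanged, as Python's csv module demonstrates (the sample data is hypothetical):

```python
import csv
import io

# A field containing both a double quote and the delimiter (comma),
# plus a plain field for contrast.
row = ['he said "hi", then left', 'plain']

# Write the row with minimal quoting: the embedded quotes are doubled
# and the whole field is wrapped in quotes.
buf = io.StringIO()
csv.writer(buf, delimiter=",", quoting=csv.QUOTE_MINIMAL).writerow(row)

# Reading it back recovers the original field with quotes intact.
restored = next(csv.reader(io.StringIO(buf.getvalue()), delimiter=","))

print(restored == row)  # True
```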
OCON-6583 | If you set the Parallel option to off in the unload command and run an Amazon Redshift mapping on the Blaze engine, not all rows from the source are written to the Amazon Redshift target even though the mapping runs successfully.
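Until the defect is resolved, keeping the Parallel option on avoids the row loss. A hedged sketch of what the generated Redshift UNLOAD statement looks like with PARALLEL ON; the table name, S3 path, and IAM role below are placeholders, not values from this document:

```python
# Placeholder values; substitute your own table, bucket, and IAM role.
table = "public.orders"
s3_path = "s3://my-bucket/unload/orders_"
iam_role = "arn:aws:iam::123456789012:role/RedshiftUnloadRole"

# Redshift UNLOAD writes output slices in parallel by default
# (PARALLEL ON); the defect above is triggered only when the
# Parallel option is set to off.
unload_sql = (
    f"UNLOAD ('SELECT * FROM {table}') "
    f"TO '{s3_path}' "
    f"IAM_ROLE '{iam_role}' "
    f"PARALLEL ON;"
)

print("PARALLEL ON" in unload_sql)  # True
```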
OCON-6346 | When you run an Amazon Redshift mapping on the Blaze engine, the success and error files are not generated.
OCON-6266 | When you run an Amazon Redshift mapping that compresses the staging files on the Blaze engine, the mapping fails. Staging file compression is ignored.
OCON-6260 | When you run an Amazon Redshift mapping on the Blaze engine, the tasklet log does not display the row statistics even if the mapping runs successfully.
OCON-6252 | When you run a mapping on the Blaze engine, the Real and Double data type values are rounded off.<br>Workaround: Use the Numeric data type in place of the Real and Double data types.
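The workaround reflects a general property: binary floating-point types (the analogue of Real and Double) cannot represent most decimal fractions exactly, while fixed-point numerics can. A quick Python illustration:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so arithmetic
# drifts and values appear rounded off.
print(0.1 + 0.2 == 0.3)  # False

# Fixed-point decimals (the analogue of the Numeric data type)
# represent the values exactly.
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```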
OCON-1297 | When you configure the following attributes and run an Amazon Redshift mapping in the Hadoop environment, the mapping might fail based on the engine selected for mapping execution:<br>Read Operation Attributes:<br>Write Operation Attributes:
OCON-1275 | A mapping with more than one Amazon Redshift object fails in the Hadoop run-time environment for the MapR distribution.