Table of Contents

  1. Abstract
  2. Installation and Upgrade
  3. 10.1.1 HotFix 1 Fixed Limitations and Closed Enhancements
  4. 10.1.1 HotFix 1 Known Limitations
  5. 10.1.1 Update 2 Fixed Limitations and Closed Enhancements
  6. 10.1.1 Update 2 Known Limitations
  7. 10.1.1 Update 1 Fixed Limitations and Closed Enhancements
  8. 10.1.1 Update 1 Known Limitations
  9. 10.1.1 Fixed Limitations and Closed Enhancements
  10. 10.1.1 Known Limitations
  11. Informatica Global Customer Support

Third-Party Limitations (10.1.1)

The following table describes third-party known limitations. Each entry lists the bug number followed by a description of the limitation:
PLAT-14849
On AIX operating systems, when you enable secure communication to an SAP HANA database with the SSL protocol, mappings terminate unexpectedly.
SAP ticket reference number: 0001101086
(410495)
PLAT-14827
A mapping fails in the Hive environment if the user name or password for a target IBM DB2 table is more than eight characters. The following error appears in the Hadoop cluster logs:
Caused by: java.io.IOException: Mapping execution failed with the following error: WRT_8001 Error connecting to database... WRT_8001 [Session Write_EMP_OUT5_MAPPING_3285816766724683 Username test_it2 DB Error -1 [IBM][CLI Driver] SQL30082N Security processing failed with reason "24" ("USERNAME AND/OR PASSWORD INVALID"). SQLSTATE=08001
Workaround: Verify that the IBM DB2 database user name and password are less than eight characters. (410437)
PLAT-14796
When a MySQL table name contains special characters, the Developer tool does not import all the columns. This issue occurs when you use the DataDirect ODBC and JDBC drivers to import the metadata. (395943)
DataDirect ticket reference number: 00322369
PLAT-14658
When you preview data from the SAP HANA database for a decimal data type with a precision of 38 digits, the data preview runs continuously. When you run the mapping, it fails with an error.
SAP ticket reference number: 0000624569 2015
(414220)
PLAT-14653
When you import Timestamp with Time Zone metadata, the scale for the data type appears as 0 instead of 6.
DataDirect reference number: 00310850
(413119)
PLAT-14061
Sessions that read data from an Oracle source or write data to an Oracle target might fail when secure communication is enabled for the Oracle database. A session is more likely to fail when it performs a database lookup against a secure Oracle database.
Workaround: Contact Informatica Global Customer Support. Reference Oracle SR number: 3-8287328531. (373732)
PLAT-14060
You cannot create an Oracle resource when secure communication is enabled for the Oracle metadata source. Similarly, you cannot set up the Metadata Manager repository on an Oracle database when secure communication is enabled. (370702)
Oracle SR number: 3-8287328531
PLAT-13951
You cannot configure an Oracle 12c database for Kerberos authentication. (393899)
Oracle SR number: 3-8990776511
OCON-847
When you import data from an Oracle database through Sqoop and the database contains a column of the Clob data type, the mapping fails. (457560)
Sqoop ticket reference number: SQOOP-2945
OCON-7219
When you run a Sqoop mapping on the Blaze engine to export Teradata float data, the data is truncated after the decimal point.
Cloudera support ticket number: 113716
OCON-7218
When you run a Sqoop mapping on the Blaze engine to export byte and varbyte data to Teradata, the Sqoop program does not insert null rows into the target.
SDC JIRA issue number: SDC-2612
OCON-7214
Sqoop mappings fail on the Blaze engine if you use a custom query with the Order By clause to import data.
Sqoop JIRA issue number: SQOOP-3064
OCON-7213
The Sqoop program does not honor the --num-mappers and -m arguments when you export data and run the mapping on the Blaze engine.
Sqoop JIRA issue number: SQOOP-2837
OCON-7211
When you run a Sqoop mapping to import data from or export data to Microsoft SQL Server databases that are hosted on Azure, the mapping fails.
Sqoop JIRA issue number: SQOOP-2349
OCON-7207
Sqoop mappings randomly fail on the Blaze engine if you import Clob data from Oracle.
Sqoop JIRA issue number: SQOOP-2945
OCON-7077
When you run a Sqoop mapping on the Blaze engine to export time or timestamp data with nanoseconds, the Sqoop program writes only the first three digits to the target.
Cloudera support ticket number: 113718
OCON-6698
When you run a Sqoop mapping on the Blaze engine to export data from Teradata to Oracle, float values are corrupted. This issue occurs when all the following conditions are true:
  1. You use a Teradata JDBC driver or the Sqoop Cloudera Connector Powered by Teradata.
  2. You run the mapping on a Cloudera 5.8 cluster.
Cloudera support ticket number: 113716
OCON-618
When you use an ODBC connection to write data to a Teradata client version 15.10.0.1, the Data Integration Service rejects data of the numeric data type. (442760)
Teradata ticket reference number: RECGNXLML
OCON-6143
When you run a Sqoop mapping on the Blaze engine to import data from or export data to Teradata and override the owner name at run time, the Sqoop program does not honor the owner name. This issue occurs when all the following conditions are true:
  1. You use a Teradata JDBC driver or the Sqoop Cloudera Connector Powered by Teradata.
  2. You run the mapping on a Cloudera 5.8 cluster or Hortonworks 2.5 cluster.
Workaround: Enter the owner name in the JDBC connection string (see the sample connection string after this table).
Cloudera support ticket number: 117697
OCON-5769
If you did not specify a precision value when you created the table in Teradata and you use a JDBC connection to import the Number data type from Teradata, the Developer tool imports the Number data type metadata with an incorrect precision value.
OCON-568
When you use a JDBC connection in the Developer tool to import a Netezza source object that contains the time data type, the data preview fails. (459901)
OCON-2847
Loading a Microsoft SQL Server resource fails when TLS encryption is enabled for the source database and the Metadata Manager repository is a Microsoft SQL Server database with TLS encryption enabled. (452471)
DataDirect case number: 00343832
OCON-1308
If a Teradata target contains a column of the CHAR or VARCHAR data type at the fifth position, the Data Integration Service writes NULL values to the column. This issue occurs when you use an ODBC connection to write data. (439606)
DataDirect case reference number: 00324380
OCON-1081
When you use the Teradata ODBC driver and write Unicode data to a Teradata database, the mapping might fail when the Teradata target contains Varchar columns. The mapping fails because of a DataDirect Driver Manager issue. (458899)
DataDirect reference number: 00343606
IDE-1677
When you run a data domain discovery profile with multiple data domains on MapR 4.0.2 Yarn or MapR 4.0.2 classic Hadoop distribution files, the profile run fails. (448529)
BDM-4291
When you run a mapping with a bucketed Hive target on the Spark engine, the mapping ignores the bucketing information of the Hive table and writes data to a single bucket.
BDM-3276
Sqoop mappings fail on the Blaze engine if you use the Sqoop Cloudera Connector Powered by Teradata and set the output method to internal.fastload.
Cloudera support ticket number: 117571
BDM-1992
If you set the Operating System Profile and Impersonation to true for the Data Integration Service and the Available Operating System Profile to OSP1 in the Developer client, and run a Teradata mapping in native mode, the mapping fails.
Workaround: Set the Operating System Profile and Impersonation in the Data Integration Service to false and then run the mapping. (458500)
Teradata case number: RECGV4J3Q
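
Sample JDBC connection string for the OCON-6143 workaround. This is a sketch only: the host name and owner name are placeholder values, and it assumes that the Teradata JDBC driver's DATABASE URL parameter is where the owner name is supplied for your driver and connector version.
jdbc:teradata://<host_name>/DATABASE=<owner_name>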