Table of Contents

  1. Abstract
  2. Installation and Upgrade
  3. Support Changes
  4. 10.5 Fixed Issues and Closed Enhancements
  5. 10.5 Known Issues
  6. Cumulative Known Issues
  7. Emergency Bug Fixes Merged into 10.5

Release Notes

Data Engineering Integration Known Issues (10.5)

The following table describes known issues that were found in 10.5:
Issue
Description
OCON-27886
When the Spark engine runs a Sqoop mapping on Cloudera CDH version 6.3.4 to write data with the Date data type to a Greenplum target, the mapping fails with the following error:
org.postgresql.util.PSQLException: ERROR: date out of range: "444001-10-28 BC +05:30"
PLAT-28258
A SAML login attempt from the Administrator console with an invalid credential fails as expected, but appears as a login attempt from the Native security domain. The attempt should appear in the log as a login attempt from a blank security domain.
BDM-38142
Developer Tool users without administrator-level permissions cannot view run-time applications or application objects.
BDM-37934
When you run a mapping on a complex file source in Spark mode, the mapping does not write data to the target file if both the following conditions are true:
  • The Hadoop distribution of the cluster is Cloudera CDH version 6.3.4.
  • The complex file source has a binary or custom input file format.
BDM-37712
The filemanager utility fails with a permission error if you run a command from a domain that does not contain your user account.
BDM-37594
When you create a cluster configuration in the Administrator console, the list of Hadoop cluster versions erroneously includes CDH 6.1. Cloudera CDH 6.1 is not supported in Informatica version 10.5.
BDM-37568
After you create a SQL data service and try to generate a mapping for a SQL query using the deployed SQLDS command, you get an error message similar to the following:
Command [prepareMapping] failed with error [[SQLCMN_10034] : The SQL Service Module encountered the following error while executing the SQL against the SQL data service: [com.informatica.ds.sql.jdbcdrv.PreparedStatementImpl cannot be cast to com.informatica.ds.sql.jdbcdrv.StatementImpl]]
BDM-37537
Mappings with a Python transformation succeed on a Databricks 5.5 cluster but fail on a Databricks 7.x cluster with an error similar to the following:
Job aborted due to stage failure: Task <n> in stage <n.n> failed 4 times, most recent failure: Lost task <n.n> in stage <n.n> ...
BDM-37351
When the Spark engine runs a mapping that includes an Update Strategy transformation that performs an INSERT, UPDATE, or DELETE operation and the source includes duplicate rows, the mapping fails with a "Cardinality Violation in Merge statement" error. A sketch of the underlying cause appears after this table.
Workaround: Disable the option "Use Hive Merge."
BDM-37300
The remove command fails if you use a special character in the file name.
BDM-37158
The permission and connection check takes more than 30 minutes when the domain contains a large number of connections.
BDM-37148
When you run a mapping with an audit and you use a JDBC V2 connection for the audit results, the Data Integration Service does not write the results to the database and logs the following exception:
SEVERE: Data integration service failed to create DTM instance … Caused by: java.lang.IllegalArgumentException: Not relational: com.informatica.adapter.jdbc_v2.connection.JDBC_V2ConnectInfo
BDM-37084
The filemanager command fails if you use Unicode characters in a file name or path on Microsoft Windows.
BDM-37081
When you rename or move a file to a target directory that already exists, the filemanager creates a new subdirectory under the existing target directory. This issue occurs if you use ADLS Gen 1 storage.
BDM-37009
When the Spark engine runs a mapping that writes to an external Hive target using a Cloudera CDP Public Cloud cluster and you choose RETAIN as the target schema strategy, the mapping fails with the following error:
SEVERE: The Integration Service failed to execute the mapping … Caused by: java.lang.IllegalArgumentException: Wrong FS
Workaround: Enable Truncate target table in the Advanced properties for the Hive target.
BDM-36951
The filemanager utility log displays an incorrect log tracing level for WARNING and INFO messages. The log tracing level might appear as SEVERE instead of WARNING, or as FINE instead of INFO.
BDM-36841
When you preview data downstream of a transformation with multiple output groups, the data preview fails with the following message:
Cannot run the data preview on the Spark engine because of the following errors: Unable to preview the mapping at this transformation. Additional information may be available in the logs.
BDM-36262
On Databricks, when you run a streaming mapping with a Delta target in high precision mode and the specified precision value is not the default value (18), the mapping fails with the following error:
Failed to merge decimal types with incompatible precision
Workaround: Specify the correct precision value. See the sketch for BDM-36262 after this table.
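
The following sketch relates to BDM-36262. It is a minimal illustration, assuming a local PySpark session, of the idea behind the workaround: cast the decimal data to the precision and scale that the existing Delta target expects before the write. The column name amount and the target type decimal(18,4) are illustrative assumptions, not values taken from these release notes.

  # Minimal sketch for BDM-36262, assuming a local PySpark session.
  # The column name "amount" and the target type decimal(18,4) are illustrative assumptions.
  from decimal import Decimal
  from pyspark.sql import SparkSession
  from pyspark.sql.functions import col

  spark = SparkSession.builder.appName("delta-precision-sketch").getOrCreate()

  # Python Decimal values are inferred as a high-precision decimal type, which
  # does not match a Delta target column declared with the default precision of 18.
  df = spark.createDataFrame([(Decimal("12345.6789"),)], ["amount"])
  df.printSchema()

  # Cast the column to the precision and scale of the existing target column
  # before writing, so that the schemas agree.
  fixed = df.withColumn("amount", col("amount").cast("decimal(18,4)"))
  fixed.printSchema()
  # fixed.write.format("delta").mode("append").saveAsTable("delta_target")  # hypothetical target table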
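
The following sketch relates to BDM-37351. It is a minimal illustration, assuming a local PySpark session, of why duplicate source rows cause the error: more than one source row matches the same target row on the merge key, which the MERGE statement rejects. Deduplicating on the key is shown only to illustrate the cause; the documented workaround is to disable the Use Hive Merge option. The column names id and val are illustrative assumptions.

  # Minimal sketch for BDM-37351, assuming a local PySpark session.
  # The column names "id" and "val" are illustrative assumptions.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("merge-cardinality-sketch").getOrCreate()

  # Two source rows share the key id=1, so a keyed MERGE would try to change
  # the same target row twice, which Hive rejects as a cardinality violation.
  source = spark.createDataFrame([(1, "first"), (1, "second"), (2, "ok")], ["id", "val"])

  # Keeping at most one source row per key removes the duplicate matches.
  deduped = source.dropDuplicates(["id"])
  deduped.show()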
