Common Content for Data Engineering 10.5
| Issue | Description |
|---|---|
| OCON-27886 | When the Spark engine runs a Sqoop mapping on Cloudera CDH 6.3.4 that writes Date data to a Greenplum target, the mapping fails with an error. |
| PLAT-28258 | A SAML login attempt from the Administrator console with invalid credentials fails as expected, but the log records it as a login attempt from the Native security domain. The attempt should appear in the log as a login attempt from a blank security domain. |
| BDM-38142 | Developer Tool users without administrator-level permissions cannot view run-time applications or application objects. |
| BDM-37934 | When you run a mapping with a complex file source in Spark mode, the mapping does not write data to the target file if two specific conditions are both true. |
| BDM-37712 | The filemanager utility fails with a permission error if you run a command from a domain that does not contain your user account. |
| BDM-37594 | The Cluster Configuration creation interface in the Administrator Console erroneously lists CDH 6.1 among the Hadoop cluster versions from which you can create a cluster configuration. Cloudera CDH 6.1 is not supported in Informatica 10.5. |
| BDM-37568 | After you create a SQL data service and try to generate a mapping for a SQL query by using the deployed SQLDS command, an error message appears. |
| BDM-37537 | Mappings with a Python transformation succeed on a Databricks 5.5 cluster but fail on a version 7.x cluster with an error like: `Job aborted due to stage failure: Task <n> in stage <n.n> failed 4 times, most recent failure: Lost task <n.n> in stage <n.n> ...` |
| BDM-37351 | When the Spark engine runs a mapping that includes an Update Strategy transformation that performs an INSERT, UPDATE, or DELETE operation and the source includes duplicate rows, the mapping fails with a "Cardinality Violation in Merge statement" error. For an illustration of the cause, see the first sketch after this table. <br>Workaround: Disable the Use Hive Merge option. |
| BDM-37300 | The remove command fails if the file name contains a special character. |
| BDM-37158 | The permission and connection check takes more than 30 minutes when a domain contains a large number of connections. |
| BDM-37148 | When you run a mapping with an audit and use a JDBC V2 connection for the audit results, the Data Integration Service does not write the results to the database and logs an exception. |
| BDM-37084 | The filemanager command fails if you use Unicode characters in a file name or path on Microsoft Windows. |
| BDM-37081 | When you rename or move a file to a target directory that already exists, the filemanager creates a new subdirectory under the existing target directory. This issue occurs if you use ADLS Gen 1 storage. |
| BDM-37009 | When the Spark engine runs a mapping that writes to an external Hive target using a Cloudera CDP Public Cloud cluster and you choose RETAIN as the target schema strategy, the mapping fails with an error. <br>Workaround: Enable Truncate target table in the Advanced properties for the Hive target. |
| BDM-36951 | The filemanager utility log displays incorrect log tracing levels for WARNING and INFO messages. You might see SEVERE instead of WARNING, or FINE instead of INFO. |
| BDM-36841 | When you perform a data preview downstream of a transformation with multiple output groups, the data preview fails with an error message. |
| BDM-36262 | On Databricks, when you run a streaming mapping with a Delta target in high precision mode and the specified precision value is not the default value (18), the mapping fails with a "Failed to merge decimal types with incompatible precision" error. For an illustration, see the second sketch after this table. <br>Workaround: Specify the correct precision value. |
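
For BDM-37351, the following is a minimal sketch of why duplicate source rows break a Hive MERGE statement. It assumes a HiveServer2 endpoint reachable through PyHive and hypothetical ACID tables named orders and staged_orders; it is not the SQL that Informatica itself generates.

```python
# Hypothetical reproduction of the cardinality violation behind BDM-37351.
from pyhive import hive  # pip install pyhive

conn = hive.connect(host="localhost", port=10000, username="hive")
cur = conn.cursor()

# MERGE semantics allow each target row to match at most one source row.
# If staged_orders contains two rows with the same order_id as one target
# row, Hive cannot decide which UPDATE wins and aborts the statement with
# "Cardinality Violation in Merge statement" (enforced while the default
# hive.merge.cardinality.check=true is in effect).
cur.execute("""
    MERGE INTO orders AS t
    USING staged_orders AS s
    ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET amount = s.amount
    WHEN NOT MATCHED THEN INSERT VALUES (s.order_id, s.amount)
""")
```

Deduplicating the source before the merge, for example with ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY ...), avoids the violation without disabling the Use Hive Merge option.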
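
For BDM-36262, the following is a batch sketch of the schema-merge failure behind the error, assuming a Databricks or delta-spark runtime; the path and column name are hypothetical. Delta Lake enforces the schema stored in the table, and Spark cannot merge decimal types whose precision differs, so a streaming write with a non-matching precision fails the same way.

```python
# Hypothetical reproduction of the decimal precision mismatch in BDM-36262.
from decimal import Decimal

from pyspark.sql import SparkSession
from pyspark.sql.types import DecimalType, StructField, StructType

spark = SparkSession.builder.getOrCreate()
target = "/tmp/delta/amounts"  # hypothetical Delta table path

# Initial write stores the column as decimal(18, 2), matching the default
# precision of 18 that the issue refers to.
schema_18 = StructType([StructField("amount", DecimalType(18, 2))])
spark.createDataFrame([(Decimal("1.00"),)], schema_18) \
    .write.format("delta").save(target)

# Appending decimal(38, 2) cannot be merged into the stored decimal(18, 2)
# column, so Delta raises an AnalysisException like:
#   Failed to merge decimal types with incompatible precision 18 and 38
schema_38 = StructType([StructField("amount", DecimalType(38, 2))])
spark.createDataFrame([(Decimal("2.00"),)], schema_38) \
    .write.format("delta").mode("append").save(target)
```

Keeping the mapping's precision at the value the Delta table was created with, per the workaround, keeps the two schemas mergeable.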