Data Services
All Products
| Bug | Description |
|---|---|
| PLAT-8714 | If you run a mapping on HiveServer2 on a SUSE 11 Hortonworks cluster that is enabled with Kerberos authentication, a MySQL connection leak occurs and the mapping fails with the following error: [HiveServer2-Handler-Pool: Thread-3439]: transport.TSaslTransport (TSaslTransport.java:open(315)) - SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] |
| OCON-933 | If you configure user impersonation and run a Sqoop mapping on a Hadoop cluster that uses Kerberos authentication, the mapping fails. (460997) |
| BDM-3658 | The Big Data Management Configuration Utility (Hadoop Configuration Manager) does not create a separate log file for each run. |
| 462309 | The Analyst Service does not shut down when you use the infaservice.sh shutdown command. |
| 462299 | In a Cloudera CDH environment, mappings fail on the Blaze engine if the Resource Manager is highly available and the cluster uses Kerberos authentication. (BDM-1596) |
| 461622 | A mapping fails to run in the Blaze environment if multiple transformation strategies in the mapping identify the same probabilistic model file or classifier model file. |
| 461610 | A column profile with data domain discovery fails when the data source is a Hive source, you choose the All rows sampling option, and you run the profile on the Blaze engine. |
| 461286 | When you run mappings on the Spark engine within a very short time span, such as 20 seconds, the mappings fail with OSGI errors. |
| 461285 | If the join condition in a Joiner transformation contains string ports with different precision values, the mapping returns an incorrect number of output rows when run on the Blaze engine. (BDM-1585) |
| 461283 | Workflows are rescheduled to a different time instead of the original scheduled time when the Integration Service shuts down unexpectedly and misses the scheduled time. |
| 461044 | When you run mappings on the Spark engine, the mapping run fails with a compilation error. Cause: The cluster uses an instance of Java other than the Java that ships with Informatica Big Data Management. |
| 460640 | Big Data Management supports Hortonworks Hadoop clusters that use Java 1.8. When the cluster uses Java 1.7, mappings that you execute using the Hive engine fail. You see an error like: |
| 460412 | When you export data to an Oracle database through Sqoop, the mapping fails in certain situations. This issue occurs when all of the following conditions are true: |
| 458238 | Lookup performance on the Spark engine is very slow when the lookup data contains null values. |
| 456892 | When you generate and execute a DDL script to create or replace a Hive target table in the Blaze run-time environment, the mapping fails. |
| 456732 | When you synchronize a Hive view in the Developer tool, the links from the mapping source or the connections are not retained. (BDM-2255) |
| 454281 | When a Hadoop cluster uses Kerberos authentication, a mapping that writes to HDFS in the native run-time environment fails with the following error if the KDC service ticket has expired: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (BDM-2190) An illustrative Kerberos keytab login sketch appears after this table. |
| 453313 | If you run multiple concurrent mappings on the Spark engine, performance might be slow and the log messages indicate that resources are not available. The Data Integration Service indicates that the mapping failed even though it is still running in the cluster. |
| 449810 | The MRX_MAPPINGS view does not show any MAPPING objects even though mappings exist in the repository. |
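The Kerberos failures above (PLAT-8714 and 454281) occur when the client does not hold a valid Kerberos ticket-granting ticket or service ticket. The following is a minimal sketch, assuming the standard Hadoop client API with hadoop-common on the classpath; the class name, principal, and keytab path are placeholder assumptions and do not describe Informatica's internal implementation.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

// Minimal sketch of keytab-based Kerberos login for a Hadoop client.
// The principal and keytab path below are placeholder assumptions.
public class KerberosLoginSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Tell the Hadoop client libraries to authenticate with Kerberos.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Obtain a ticket-granting ticket from the keytab. Without a valid
        // TGT, HDFS calls fail with "GSS initiate failed ... Failed to find
        // any Kerberos tgt", as described in PLAT-8714 and 454281.
        UserGroupInformation.loginUserFromKeytab(
                "svc_bdm@EXAMPLE.COM", "/etc/security/keytabs/svc_bdm.keytab");

        System.out.println("Logged in as " + UserGroupInformation.getLoginUser());
    }
}
```

Renewing credentials before they expire, for example by logging in again from the keytab, is the usual way to avoid the expired-ticket error described in 454281.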
Updated January 17, 2019