Table of Contents


  1. Abstract
  2. Informatica Installation
  3. Informatica Upgrade
  4. 10.1 Fixed Limitations and Closed Enhancements
  5. 10.1 Known Limitations
  6. Informatica Global Customer Support

Big Data Fixed Limitations

Review the Release Notes of previous releases for information about previous fixed limitations.
The following limitations are fixed in this release:
You cannot create a Hive connection in a Kerberos-enabled domain with TLS 1.2.
When you run a dynamic mapping that uses a control file to dynamically create a Hive table, the Data Integration Service does not read the data types of columns from the control file. Instead, it reads from the run-time properties of the source data object.
When you run a consolidation mapping in the Hadoop environment, the Consolidation transformation can return more than one survivor record.
A mapping that runs on the Hive engine takes a longer time to compile if the mapping has many source and target columns.
You cannot monitor mapping tasks from the Administrator tool when a Blaze mapping runs in a workflow.
When you run a mapping that inserts into a Hive table with two or more partitions, the mapping fails on the Hive engine with the following error:
Hive error code [40,000], Hive message [FAILED: SemanticException Partition spec {…} contains non-partition columns]
A mapping that contains a Data Processor transformation has poor performance when running in a Hadoop environment.
When multiple Lookup transformations are connected to a single Expression transformation that contains parameters, the Data Integration Service fails to generate a Hive execution plan for the mapping.
If a Hadoop cluster has encrypted zones for the underlying Hive database, the mapping fails at run time when the Data Integration Service drops temporary Hive tables.
If you view the mapping execution plan after changing the order of ports in a Hive data object in the Developer tool, the Developer tool generates an incorrect execution plan and causes the mapping to fail at run time.
Tez jobs fail when the Hive connection user is different from the Data Integration Service user and the hive.execution.engine property value in hive-site.xml is "tez."
If a non-string port is passed to functions such as LTRIM, RTRIM, or TRIM in an Expression transformation, the mapping fails at run time on a Hive engine.
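For example, an expression of the following form previously failed on the Hive engine; converting the port to a string before trimming avoided the failure. A minimal sketch in the Informatica expression language, assuming a hypothetical decimal input port ACCOUNT_CODE:

```
-- Failed on the Hive engine when ACCOUNT_CODE was a non-string port:
LTRIM(RTRIM(ACCOUNT_CODE))

-- Converting the port to a string first avoided the failure:
LTRIM(RTRIM(TO_CHAR(ACCOUNT_CODE)))
```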
A Hive mapping on Tez fails when a TO_DATE function is used in expressions.
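An expression of the following form, shown here with a hypothetical string port ORDER_DT, previously caused the Tez job to fail:

```
-- TO_DATE converts a string to a date/time value
TO_DATE(ORDER_DT, 'MM/DD/YYYY')
```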
A Blaze engine mapping hangs in the Developer tool and the Monitoring tool displays no status for the mapping because a synchronization error occurs between Blaze engine components.
The DEF framework creates too many file descriptors for each Blaze grid segment and does not clear them until the mapping ends.
A mapping with a Hive target that contains more than 4000 columns takes a long time to complete.
You cannot monitor jobs that use the Blaze engine if the Application Timeline Server uses Kerberos authentication.
A mapping fails in the native environment when it contains a Hive binary data type on an IBM BigInsights or Pivotal cluster.
The output data differs between a mapping run in the native environment and the Hadoop environment when you add MAX and MIN decimal functions in an Aggregator transformation.
You cannot validate a mapping with an Update Strategy transformation after you specify a primary key or preview data for a set of primary keys on a Hive table.
In an Aggregator transformation, if the data types returned by the THEN and ELSE clauses of an expression do not match, the mapping fails with an argument type mismatch error.
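For example, an IIF expression whose THEN and ELSE branches return different data types triggered the error; returning a consistent data type from both branches avoids it. A sketch with a hypothetical decimal port SALES:

```
-- THEN returns a decimal, ELSE returns a string: argument type mismatch
IIF(SALES > 1000, SALES, 'LOW')

-- Both branches return strings, so the expression validates
IIF(SALES > 1000, TO_CHAR(SALES), 'LOW')
```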