Table of Contents

  1. Abstract
  2. Informatica Installation
  3. Informatica Upgrade
  4. Informatica Closed Enhancements
  5. Informatica Fixed Limitations
  6. Informatica Known Limitations
  7. Informatica Third-Party Limitations
  8. Informatica Global Customer Support

Big Data Known Limitations

The following entries describe known limitations. Each entry lists the CR number, followed by a description and, where available, a workaround.
443150
When a synchronization error occurs between Blaze engine components, a Blaze engine mapping hangs in the Developer tool and the Monitoring tool displays no status for the mapping.
Workaround: Run the Blaze engine mapping again.
443164
Mappings that read from one of the following sources fail to run in the native environment when the Data Integration Service is configured to run jobs in separate remote processes:
  • Flat file or complex file in the Hadoop Distributed File System (HDFS)
  • Hive table
  • HBase table
Workaround: On the Compute view for the Data Integration Service, configure the INFA_HADOOP_DIST_DIR environment variable for each node with the compute role. Set the variable to the same value as the Hadoop Distribution Directory execution option configured for the Data Integration Service.
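On each node with the compute role, the workaround amounts to exporting the variable with the same value that the Data Integration Service uses. The distribution path below is a hypothetical example, not a product default:

```shell
# Hypothetical example path: substitute the Hadoop Distribution Directory value
# configured for the Data Integration Service in your environment.
export INFA_HADOOP_DIST_DIR="/opt/informatica/services/shared/hadoop/cloudera_cdh5u4"

# Confirm the value; it must match on every node with the compute role.
echo "$INFA_HADOOP_DIST_DIR"
```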
442422
The DEF framework creates too many file descriptors for each Blaze grid segment and does not release them until the mapping ends.
441992
Mapping with a Hive target that contains more than 4000 columns takes a long time to complete.
441772
Data corruption occurs in the Hadoop environment when a mapping contains an Oracle source with a newline character in the data.
441541
You cannot monitor jobs that use the Blaze engine if the Application Timeline Server uses Kerberos authentication.
Workaround: Do not use Kerberos authentication with the Application Timeline Server.
440815
Mapping fails in the native environment when it contains a Hive binary data type for an IBM BigInsights or Pivotal cluster.
440480
When you run the stopBlazeService command, some component logs might not be written to aggregate log files on HDFS.
Workaround: View the Blaze engine logs in the directory configured for the Blaze engine logs.
440423
When you use an ODBC connection to write time data to a Netezza database, the mapping fails. This issue occurs when you run the mapping on Cloudera 5u4.
440388
If a Netezza column has the same precision and scale, and contains a 0 as a data value, the data is corrupted when the Data Integration Service writes it to the target. This issue occurs when you use a Netezza connection and run the mapping on Cloudera 5u4.
440121
The output data differs between mapping runs in the native environment and the Hadoop environment when you use the MAX and MIN functions on decimal data in an Aggregator transformation.
438578
You cannot validate a mapping with an Update Strategy transformation after you specify a primary key or preview data for a set of primary keys on a Hive table.
437592
Mapping fails to validate when it contains Timestamp with Time Zone data type columns that are not connected to any transformation or target.
437204
When a mapping containing a Hive source or target runs in the Hadoop environment, the summary statistics for the mapping do not appear in the Monitoring tool.
437196
The path of the resource file in a complex file object appears as a recursive path of directories starting with the root directory and ending with a string.
424789
Mapping with a Hive source and target fails in the Hadoop environment when it uses an ABS function combined with an IIF function.
422627
Mapping in the Hadoop environment fails when it contains a Hive source and a filter condition that uses the default table name prefixed to the column name.
Workaround: Edit the filter condition to remove the table name prefixed to the column name and run the mapping again.
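As a hypothetical sketch of the workaround, assuming a Hive source named cust with a column cust_id (both names invented for illustration), the filter condition changes as follows:

```
-- Fails in the Hadoop environment: column name prefixed with the default table name
cust.cust_id > 100

-- Works after the workaround: table-name prefix removed
cust_id > 100
```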
421834
Mapping in the Hadoop environment fails when the name of the Hadoop connection contains 128 characters.
409922
Mapping validation errors occur when you validate a mapping that has complex data types in the Hive environment.
Workaround: Run the mapping in the native environment.


Updated October 25, 2018