Data Engineering Integration
| Transformation | Rules and Guidelines |
| --- | --- |
| Address Validator | You can push mapping logic that includes an Address Validator transformation to Hadoop if you use a Data Quality product license. The following limitation applies to Address Validator transformations: |
| Aggregator | An Aggregator transformation with pass-through fields is valid if the pass-through fields are group-by fields. You can use the ANY function in an Aggregator transformation with pass-through fields to return any row. |
| Case Converter | The Data Integration Service can push a Case Converter transformation to Hadoop. |
| Comparison | You can push mapping logic that includes a Comparison transformation to Hadoop if you use a Data Quality product license. |
| Consolidation | You can push mapping logic that includes a Consolidation transformation to Hadoop if you use a Data Quality product license. The following limitation applies to Consolidation transformations: |
| Data Masking | You cannot use the following data masking techniques in mapping logic run on Hadoop clusters: |
| Data Processor | The following limitations apply when a Data Processor transformation directly connects to a complex file reader: The following limitations apply when a mapping has a Data Processor transformation: The Data Processor transformation can use the following input and output formats: |
| Decision | You can push mapping logic that includes a Decision transformation to Hadoop if you use a Data Quality product license. |
| Expression | An Expression transformation with a user-defined function returns a null value for rows that have an exception error in the function. The Data Integration Service returns an infinite or a NaN (not a number) value when you push transformation logic to Hadoop for expressions that result in numerical errors (see the illustrative sketch after this table). In the native environment, expressions that result in numerical errors return null values and the rows do not appear in the output. |
| Filter | The Data Integration Service can push a Filter transformation to Hadoop. |
| Java | You must copy external JAR files that a Java transformation requires to the Informatica installation directory on the Hadoop cluster nodes at the following location: [$HADOOP_NODE_INFA_HOME]/services/shared/jars/platform/dtm/. You can optimize the transformation for faster processing when you enable an input port as a partition key and sort key; the data is partitioned across the reducer tasks and the output is partially sorted. The following limitations apply to the Transformation Scope property: You can enable the Stateless advanced property when you run mappings in a Hadoop environment. The Java code in the transformation cannot write output to standard output when you push transformation logic to Hadoop. The Java code can write output to standard error, which appears in the log files (see the logging sketch after this table). |
| Joiner | A Joiner transformation cannot contain inequality joins or parameters in the outer join condition. |
| Key Generator | You can push mapping logic that includes a Key Generator transformation to Hadoop if you use a Data Quality product license. |
| Labeler | You can push mapping logic that includes a Labeler transformation to Hadoop when you configure the transformation to use probabilistic matching techniques. You can push mapping logic that includes all types of Labeler configuration if you use a Data Quality product license. |
| Lookup | The following limitations apply to Lookup transformations: |
| Match | You can push mapping logic that includes a Match transformation to Hadoop if you use a Data Quality product license. The following limitation applies to Match transformations: |
| Merge | The Data Integration Service can push a Merge transformation to Hadoop. |
| Parser | You can push mapping logic that includes a Parser transformation to Hadoop when you configure the transformation to use probabilistic matching techniques. You can push mapping logic that includes all types of Parser configuration if you use a Data Quality product license. |
| Rank | A comparison is valid if it is case sensitive. |
| Router | The Data Integration Service can push a Router transformation to Hadoop. |
| Sorter | The Data Integration Service ignores the Sorter transformation when you push mapping logic to Hadoop. |
| SQL | The Data Integration Service can push SQL transformation logic to Hadoop. You cannot use a Hive connection. |
| Standardizer | You can push mapping logic that includes a Standardizer transformation to Hadoop if you use a Data Quality product license. |
| Union | The custom source code in the transformation cannot write output to standard output when you push transformation logic to Hadoop. The custom source code can write output to standard error, which appears in the runtime log files. |
| Weighted Average | You can push mapping logic that includes a Weighted Average transformation to Hadoop if you use a Data Quality product license. |
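
The Expression row above notes that numerical errors return an infinite or NaN value on Hadoop instead of the null value returned in the native environment. The minimal Java sketch below is not Informatica expression code; it only illustrates the underlying IEEE 754 floating-point behavior that produces those values. The class name and the specific expressions are illustrative assumptions, not part of the product.

```java
// Illustrative only: plain Java arithmetic follows IEEE 754, which is why
// numerical errors surface as Infinity or NaN on the Hadoop run-time rather
// than as null values.
public class NumericalErrorDemo {
    public static void main(String[] args) {
        double divByZero = 1.0 / 0.0;       // Infinity, not an exception
        double logOfZero = Math.log(0.0);   // -Infinity
        double sqrtNeg   = Math.sqrt(-1.0); // NaN

        System.err.println("1.0 / 0.0       = " + divByZero);
        System.err.println("log(0.0)        = " + logOfZero);
        System.err.println("sqrt(-1.0)      = " + sqrtNeg);
        System.err.println("isNaN(sqrt)     = " + Double.isNaN(sqrtNeg));
        System.err.println("isInfinite(div) = " + Double.isInfinite(divByZero));
    }
}
```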
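The Java row above states that transformation code pushed to Hadoop cannot write to standard output, but can write to standard error, which appears in the log files. The sketch below shows that pattern in plain Java under stated assumptions: the class, the helper method, and the message prefix are hypothetical and are not part of the Informatica Java transformation API; only the use of System.err instead of System.out reflects the guideline.

```java
// Minimal sketch of the stdout/stderr guideline for Java transformation code
// that runs on a Hadoop cluster. The helper below is hypothetical.
public class JavaTransformationLogging {

    // Write diagnostic messages to standard error so they appear in the
    // Hadoop run-time log files. Output written to System.out is not
    // visible when the transformation logic runs on the cluster.
    static void logDiagnostic(String message) {
        System.err.println("[java-tx] " + message);
    }

    public static void main(String[] args) {
        logDiagnostic("processing row batch");  // appears in the log files
        // System.out.println("debug");         // not visible on Hadoop
    }
}
```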