Some processing rules for the Spark engine differ from the processing rules for the Data Integration Service.
The Java transformation is supported with the following restrictions on the Spark engine:
The Java code in the transformation cannot write output to standard output when you push transformation logic to Hadoop. The Java code can write output to standard error, which appears in the log files.
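As a minimal sketch of this restriction (the class and method names below are illustrative, not part of the Informatica API), diagnostic output in Java transformation code should go to standard error rather than standard output:

```java
public class DiagnosticLogging {
    // Build the diagnostic message separately so it can be reused and tested.
    static String formatDiagnostic(int rowId) {
        return "row processed: id=" + rowId;
    }

    public static void main(String[] args) {
        // Avoid System.out: output written there is not captured on the Spark engine.
        // Write diagnostics to System.err instead, which appears in the log files.
        System.err.println(formatDiagnostic(42));
    }
}
```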
For date/time values, the Spark engine supports precision up to microseconds. If a date/time value contains nanoseconds, the trailing digits are truncated.
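The truncation behavior can be sketched as follows; this is an illustration of dropping trailing nanosecond digits, not the engine's actual implementation:

```java
public class MicrosecondTruncation {
    // Truncate a nanosecond-of-second value to microsecond precision,
    // mirroring how trailing nanosecond digits are dropped.
    static int truncateToMicros(int nanoOfSecond) {
        return nanoOfSecond - (nanoOfSecond % 1_000);
    }

    public static void main(String[] args) {
        int nanos = 123_456_789;                      // 123456789 ns
        System.out.println(truncateToMicros(nanos));  // 123456000 ns, i.e. 123456 microseconds
    }
}
```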
The Java transformation has the following restrictions when used with partitioning:
The Partitionable property must be enabled in the Java transformation. The transformation cannot run in one partition.
The following restrictions apply to the Transformation Scope property:
The value Transaction for transformation scope is not valid.
If you enable an input port for partition key, the transformation scope must be set to All Input.
Stateless must be enabled if the transformation scope is Row.
Mapping validation fails in the following situations:
You reference an unconnected Lookup transformation from an expression within a Java transformation.
You select a port of a complex data type as the partition or sort key.
You enable nanosecond processing in date/time and the Java transformation contains a port of a complex data type with an element of a date/time type. For example, a port of a complex data type with a date/time element is not valid if you enable nanosecond processing in date/time.
The mapping fails in the following situations:
The Java transformation and the mapping use different precision modes when the Java transformation contains a decimal port or a complex port with an element of a decimal data type.
Even if high precision is enabled in the mapping, the mapping processes data in low-precision mode in some situations, such as when the mapping contains a complex port with an element of a decimal data type or when the mapping is a streaming mapping. If high precision is enabled in both the Java transformation and the mapping but the mapping processes data in low-precision mode, the mapping fails.
Binary null characters are passed to an output port. To avoid a mapping failure, you can add code to the Java transformation that replaces the binary null characters with an alternative character before writing the data to the output ports.
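A replacement of binary null characters before the data reaches the output ports might look like the following sketch; the class name, helper name, and the choice of a space as the replacement character are assumptions for illustration:

```java
public class NullCharacterFilter {
    // Replace binary null characters with a space (an assumed alternative
    // character) before the value is written to an output port.
    static String stripBinaryNulls(String value) {
        return value == null ? null : value.replace('\u0000', ' ');
    }

    public static void main(String[] args) {
        String raw = "abc" + '\u0000' + "def";
        System.out.println(stripBinaryNulls(raw)); // prints "abc def"
    }
}
```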
Using External .jar Files
To use external .jar files in a Java transformation, perform the following steps:
Copy the external .jar files to the Informatica installation directory on the Data Integration Service machine at the following location: