Effective in version 10.2.2, the following changes apply to Sqoop:
You can specify a file path in the Spark staging directory of the Hadoop connection to store temporary files for Sqoop jobs. When the Spark engine runs Sqoop jobs, the Data Integration Service creates a Sqoop staging directory within the Spark staging directory to store temporary files:
<Spark staging directory>/sqoop_staging
Previously, the Sqoop staging directory was hard-coded and the Data Integration Service used the following staging directory:
/tmp/sqoop_staging
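As a rough sketch of the difference, the staging path in 10.2.2 is derived from the configured Spark staging directory rather than hard-coded. The helper function and the example directory value below are illustrative assumptions, not Informatica code.

```python
# Illustrative sketch only (not Informatica code): how the Sqoop staging
# directory is resolved in 10.2.2 versus earlier versions.
import posixpath

def sqoop_staging_dir(spark_staging_dir):
    """Return the Sqoop staging directory used on the Spark engine (10.2.2+)."""
    # The Data Integration Service creates a sqoop_staging subdirectory
    # within the Spark staging directory of the Hadoop connection.
    return posixpath.join(spark_staging_dir, "sqoop_staging")

# Example with an assumed Spark staging directory value:
print(sqoop_staging_dir("/user/infa/spark_staging"))
# -> /user/infa/spark_staging/sqoop_staging

# Before 10.2.2, the staging directory was hard-coded:
LEGACY_SQOOP_STAGING = "/tmp/sqoop_staging"
```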
For more information, see the Informatica Big Data Management 10.2.2 User Guide.
Sqoop mappings on the Spark engine use the OpenJDK (AzulJDK) packaged with the Informatica installer. You no longer need to specify the JDK Home Directory property for the Data Integration Service.
Previously, to run Sqoop mappings on the Spark engine, you installed the Java Development Kit (JDK) on the machine that runs the Data Integration Service. You then specified the location of the JDK installation directory in the JDK Home Directory property under the Data Integration Service execution options in Informatica Administrator.
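A minimal sketch of the configuration difference follows; the bundled JDK location, version tuple, and helper function are assumptions for illustration only, not Informatica code.

```python
# Illustrative sketch only (not Informatica code): which JDK runs Sqoop
# mappings on the Spark engine before and after 10.2.2.

# Assumed placeholder for the Azul OpenJDK packaged with the installer.
BUNDLED_OPENJDK_HOME = "<Informatica installation>/java"

def jdk_for_sqoop_on_spark(version, jdk_home_directory=None):
    """Return the JDK home used to run Sqoop mappings on the Spark engine."""
    if version >= (10, 2, 2):
        # 10.2.2 and later: the packaged OpenJDK is used, so the
        # JDK Home Directory execution option is no longer required.
        return BUNDLED_OPENJDK_HOME
    # Earlier versions: a JDK installed on the Data Integration Service
    # machine and configured in the JDK Home Directory execution option
    # in Informatica Administrator was required.
    if not jdk_home_directory:
        raise ValueError("Set the JDK Home Directory execution option")
    return jdk_home_directory
```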