Table of Contents

  1. Preface
  2. Part 1: Version 10.5.5
  3. Part 2: Version 10.5.4 - 10.5.4.x
  4. Part 3: Version 10.5.3 - 10.5.3.x
  5. Part 4: Version 10.5.2 - 10.5.2.1.x
  6. Part 5: Version 10.5.1 - 10.5.1.1
  7. Part 6: Versions 10.5 - 10.5.0.1
  8. Part 7: Versions 10.4.1 - 10.4.1.3
  9. Part 8: Versions 10.4 - 10.4.0.2
  10. Part 9: Versions 10.2.2 - 10.2.2 HotFix 1
  11. Part 10: Version 10.2.1
  12. Part 11: Version 10.2 - 10.2 HotFix 2

What's New and Changed (10.5.5)

Sqoop

Effective in version 10.2.2, the following changes apply to Sqoop:
  • You can specify a file path in the Spark staging directory of the Hadoop connection to store temporary files for Sqoop jobs. When the Spark engine runs Sqoop jobs, the Data Integration Service creates a Sqoop staging directory within the Spark staging directory to store temporary files (see the sketch after this list):
    <Spark staging directory>/sqoop_staging
    Previously, the Sqoop staging directory was hard-coded and the Data Integration Service used the following staging directory:
    /tmp/sqoop_staging
    For more information, see the Informatica Big Data Management 10.2.2 User Guide.
  • Sqoop mappings on the Spark engine use the OpenJDK (AzulJDK) packaged with the Informatica installer. You no longer need to specify the JDK Home Directory property for the Data Integration Service.
    Previously, to run Sqoop mappings on the Spark engine, you installed the Java Development Kit (JDK) on the machine that runs the Data Integration Service. You then specified the location of the JDK installation directory in the JDK Home Directory property under the Data Integration Service execution options in Informatica Administrator.
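The following sketch illustrates how the Sqoop staging path is resolved under the changed behavior. It is not Informatica code: the Spark staging directory value shown is a hypothetical example, and the only behavior taken from the note above is that the Data Integration Service appends a sqoop_staging subdirectory to whatever path the Spark staging directory property specifies.

    from pathlib import PurePosixPath

    def sqoop_staging_dir(spark_staging_dir: str) -> str:
        # The Data Integration Service creates the Sqoop staging directory
        # inside the configured Spark staging directory instead of using
        # the previously hard-coded /tmp/sqoop_staging path.
        return str(PurePosixPath(spark_staging_dir) / "sqoop_staging")

    # Hypothetical value for the Spark staging directory property of the
    # Hadoop connection.
    print(sqoop_staging_dir("/user/infa/spark_staging"))
    # Output: /user/infa/spark_staging/sqoop_staging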
