Table of Contents

  1. Preface
  2. Part 1: Hadoop Integration
  3. Part 2: Databricks Integration
  4. Appendix A: Connections

Create a Cluster Staging Directory


Optionally, create a directory on HDFS that the Data Integration Service uses to stage the Informatica binary archive files. By default, the Data Integration Service writes the files to the HDFS directory /tmp.

Grant permissions on the staging directory to the Hadoop staging user and to all Blaze users. If you did not create a Hadoop staging user, the Data Integration Service uses the operating system user that starts the Informatica daemon.
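The steps above can be sketched with the Hadoop file system shell. This is a minimal example, not a definitive procedure: the staging path /informatica/staging, the user name hadoopstg, and the group blazeusers are assumptions for illustration only; substitute the values for your environment.

```shell
# Create the staging directory on HDFS (path is an example, not a required value).
hdfs dfs -mkdir -p /informatica/staging

# Assign ownership to the Hadoop staging user and a group containing the Blaze users.
# "hadoopstg" and "blazeusers" are hypothetical names.
hdfs dfs -chown hadoopstg:blazeusers /informatica/staging

# Allow the owner and group to read, write, and traverse the directory.
hdfs dfs -chmod 775 /informatica/staging
```

If you skip this step and rely on the default /tmp directory, the operating system user that starts the Informatica daemon must have write access there instead.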
