Create a Cluster Staging Directory

Optionally, create a directory on HDFS or View File System (ViewFS) that the Data Integration Service uses to stage the Informatica binary archive files. By default, the Data Integration Service writes the files to the HDFS directory /tmp.

Grant permissions on the staging directory to the Hadoop staging user and to all Blaze users. If you did not create a Hadoop staging user, the Data Integration Service uses the operating system user that starts the Informatica daemon.
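The steps above can be sketched with the Hadoop file system shell. The staging path, user name, and group name below are illustrative assumptions, not values mandated by Informatica; substitute the values for your environment.

```shell
# Sketch only: /informatica/staging, infa_hadoop, and the group name
# "blaze" are assumed example values for this environment.

# Create the staging directory on HDFS.
hdfs dfs -mkdir -p /informatica/staging

# Make the Hadoop staging user the owner, with a group for Blaze users.
hdfs dfs -chown infa_hadoop:blaze /informatica/staging

# Grant the owner and group full access to the directory.
hdfs dfs -chmod 775 /informatica/staging
```

If you skip this step, ensure that the staging user and Blaze users can write to the default /tmp directory on HDFS instead.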
