Create a Staging Directory for Run-time Processing

When the Databricks Spark engine runs a job, it stores temporary files in a staging directory. Optionally, you can create a directory on DBFS to stage temporary files at run time. By default, the Data Integration Service uses the DBFS directory /<Cluster Staging Directory>/DATABRICKS.
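If you choose to create your own staging directory rather than rely on the default, one way to do so is with the Databricks CLI, which can create and list DBFS directories from your workstation. This is a minimal sketch assuming the CLI is installed and configured with credentials for your workspace; the path /staging/databricks is a hypothetical example, not a required location.

```shell
# Create a staging directory on DBFS for run-time temporary files.
# /staging/databricks is a hypothetical example path; substitute the
# location you want to use for staging.
databricks fs mkdirs dbfs:/staging/databricks

# Verify that the directory now exists.
databricks fs ls dbfs:/staging/
```

After creating the directory, point your run-time configuration at it so temporary files are staged there instead of in the default DBFS location.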
