Create a Staging Directory for Run-time Processing

When the Databricks Spark engine runs a job, it stores temporary files in a staging directory. Optionally, you can create a directory on DBFS to stage the temporary files at run time. By default, the Data Integration Service uses the DBFS directory at the following path:

/<cluster staging directory>/DATABRICKS

For instructions on creating a directory on DBFS, see the Databricks documentation.
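For example, one way to create such a directory is from a Databricks notebook with the dbutils utilities. The following is a minimal sketch; the path /ids_staging/DATABRICKS is a hypothetical example, and you would substitute the cluster staging directory configured for your environment:

    # Run this in a Databricks notebook, where dbutils is available
    # without an import. The staging path below is a hypothetical
    # example; substitute your own cluster staging directory.
    staging_dir = "dbfs:/ids_staging/DATABRICKS"

    # mkdirs creates the directory and any missing parent directories.
    dbutils.fs.mkdirs(staging_dir)

    # Verify that the new directory appears under the staging root.
    display(dbutils.fs.ls("dbfs:/ids_staging"))

Alternatively, you can create the directory from outside a notebook with the Databricks CLI, for example with the databricks fs mkdirs command.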
