Create a Staging Directory for Run-time Processing
When the Databricks Spark engine runs a job, it stores temporary files in a staging directory.
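If you plan to stage temporary files in a custom location rather than the default (described below), you can create that directory on DBFS ahead of time, for example from a Databricks notebook. The following is a minimal sketch, assuming it runs inside a notebook where the dbutils utility is available; the path /staging/infa_runtime is a hypothetical placeholder, not a path defined by the Data Integration Service.

    # Run in a Databricks notebook, where dbutils is predefined.
    # The path below is a hypothetical example; substitute your own staging location.
    staging_dir = "dbfs:/staging/infa_runtime"
    dbutils.fs.mkdirs(staging_dir)           # creates the directory and any missing parent directories
    display(dbutils.fs.ls("dbfs:/staging"))  # list the parent directory to confirm the directory exists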
Optionally, you can create a directory on DBFS to stage temporary files during run time. By default, the Data Integration Service uses the DBFS directory at the following path: