Data Engineering Integration
- Data Engineering Integration 10.5.2
| Property | Description |
|---|---|
| Auto Termination | Enables automatic termination of the cluster. |
| Auto Termination Time | Terminates the cluster after it is inactive for the specified number of minutes. Enter a value between 10 and 10,000. If you do not configure this property, or if you set it to 0, the cluster does not terminate automatically. |
| Cluster Log Conf | The location where logs are delivered for long-term storage. If configured, the Databricks Spark engine delivers the logs every five minutes. Provide the path to DBFS. |
| Init Scripts | The location where you store init scripts. You can enter multiple destinations. The scripts run sequentially in the order that you configure them. If you need to install additional Python libraries, specify the init script file location in this property. Use the following format: |
| Cluster Tags | Labels that you can assign to resources for tracking purposes. Enter key-value pairs in the following format: `<key1>=<value1>,<key2>=<value2>`. You can also provide a path to a local file that contains the key-value pairs. Use the following format: |
| Spark Configurations | Performance configurations for the Databricks Spark engine. Enter key-value pairs in the following format: `key1='value1' key2='value2'`. You can also provide a path to a file that contains the key-value pairs. |
| Environment Variables | Environment variables that you can configure for the Databricks Spark engine. Enter key-value pairs in the following format: `key1='value1' key2='value2'`. Enter the userJson and pathToFile properties in the environment variables when you use a JSON file to configure Create Cluster task properties. See Create the JSON File. |
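To make the key-value formats above concrete, here is a minimal Python sketch. It is illustrative only, not Informatica code: the function names and sample keys are hypothetical. It parses the comma-separated Cluster Tags format, the space-separated quoted format used by Spark Configurations and Environment Variables, and checks the Auto Termination Time rule.

```python
import shlex

def parse_tags(text):
    # Cluster Tags format: <key1>=<value1>,<key2>=<value2>
    pairs = {}
    for item in text.split(","):
        key, _, value = item.partition("=")
        pairs[key.strip()] = value.strip()
    return pairs

def parse_conf(text):
    # Spark Configurations / Environment Variables format: key1='value1' key2='value2'
    pairs = {}
    for token in shlex.split(text):  # shlex.split strips the single quotes
        key, _, value = token.partition("=")
        pairs[key] = value
    return pairs

def auto_termination_valid(minutes):
    # 0 (or unset) disables auto termination; otherwise 10 to 10,000 minutes
    return minutes == 0 or 10 <= minutes <= 10_000
```

For example, `parse_tags("env=dev,owner=etl")` returns `{"env": "dev", "owner": "etl"}`, and `auto_termination_valid(5)` returns `False` because values below 10 (other than 0) are out of range.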