Table of Contents

  1. Preface
  2. Introduction to Databricks Delta Connector
  3. Connections for Databricks Delta
  4. Mappings for Databricks Delta
  5. Migrating a mapping
  6. Databricks Delta SQL ELT optimization
  7. Data type reference

Databricks Delta Connector

Configure Spark parameters for AWS staging

Before you use the Databricks SQL warehouse to run mappings, configure the Spark parameters for the SQL warehouse on the Databricks SQL Admin console.
On the Databricks SQL Admin console, navigate to SQL Warehouse Settings > Data Security, and then configure the Spark parameters for AWS under Data access configuration.
Add the following Spark configuration parameters and restart the SQL warehouse:
  • spark.hadoop.fs.s3a.access.key <S3 Access Key value>
  • spark.hadoop.fs.s3a.secret.key <S3 Secret Key value>
  • spark.hadoop.fs.s3a.endpoint <S3 Staging Bucket endpoint value>
For example, if the staging bucket is in the ap-south-1 region, the S3 staging bucket endpoint value is s3.ap-south-1.amazonaws.com.
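Entered in the Data access configuration text box, the three parameters might look like the following sketch. The access key and secret key shown here are the AWS documentation placeholders, and the endpoint assumes a staging bucket in the ap-south-1 region; substitute your own values.

  spark.hadoop.fs.s3a.access.key AKIAIOSFODNN7EXAMPLE
  spark.hadoop.fs.s3a.secret.key wJalrXUtnFEMI/K7MDENG/bPxRiCYEXAMPLEKEY
  spark.hadoop.fs.s3a.endpoint s3.ap-south-1.amazonaws.com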
Ensure that the configured access key and secret key have access to the S3 buckets where you store the data for Databricks Delta tables.
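To confirm access before running mappings, you can list the staging bucket with the same key pair outside Databricks. The following is a minimal sketch using the AWS SDK for Python (boto3); the bucket name is a hypothetical placeholder, and the credentials shown are the AWS documentation examples.

  import boto3

  # Hypothetical placeholders: substitute your staging bucket, region, and key pair.
  s3 = boto3.client(
      "s3",
      region_name="ap-south-1",
      aws_access_key_id="AKIAIOSFODNN7EXAMPLE",
      aws_secret_access_key="wJalrXUtnFEMI/K7MDENG/bPxRiCYEXAMPLEKEY",
  )

  # A successful ListObjectsV2 call shows that the key pair can read the bucket.
  response = s3.list_objects_v2(Bucket="my-delta-staging-bucket", MaxKeys=1)
  print("Access confirmed, HTTP status:", response["ResponseMetadata"]["HTTPStatusCode"])

If the call fails with an AccessDenied error, adjust the permissions attached to the key pair before restarting the SQL warehouse.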
