Table of Contents

  1. Preface
  2. Introduction to Databricks Connector
  3. Connections for Databricks
  4. Mappings for Databricks
  5. Migrating a mapping
  6. SQL ELT with Databricks Connector
  7. Data type reference
  8. Troubleshooting

Databricks Connector

Targets for Databricks

Add a Target transformation to a mapping to write data to Databricks. When you add a Target transformation, you define the target connection, the target objects, and the target properties related to the Databricks connection type.
The following table lists the target properties that are supported on a SQL warehouse, an all-purpose cluster, and a job cluster:
Property | SQL warehouse (1) | All-purpose cluster (2) | Job cluster (3)
Source Type - Single Object | Yes | Yes | Yes
Source Type - Parameter | Yes | Yes | Yes
Object - Existing | Yes | Yes | Yes
Object - Create New at Runtime | Yes | Yes | Yes
Create target - Object Name, Table Location, Database Name | Yes | Yes | Yes
Create target - Table Properties | Yes | No | No
Operation - Insert, Update, Upsert, Delete | Yes | No | Yes
Operation - Data Driven | Yes | No | No
Target Database Name | Yes | No | Yes
Target Table Name | Yes | No | Yes
Update Override Query | Yes | No | No
Write Disposition - Append, Truncate | Yes | No | Yes
Write Disposition - Truncate Always | Yes | No | No
Update Mode - Update as update, Update else insert | Yes | No | Yes
Staging Location | Yes | No | Yes
Pre SQL | Yes | No | No
Post SQL | Yes | No | No
DTM Staging File Size | Yes | No | No
Job Timeout | No | No | Yes
Job Status Poll Interval | No | No | Yes
DB REST API Timeout | No | No | Yes
DB REST API Retry Interval | No | No | Yes
Forward Rejected Rows | Yes | No | Yes

(1) The Secure Agent connects to the SQL warehouse at design time and at runtime.
(2) The Secure Agent connects to the all-purpose cluster to import the metadata at design time.
(3) The Secure Agent connects to the job cluster to run the mappings.
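For context on the SQL-related properties in the table, the Pre SQL and Post SQL fields typically hold Databricks SQL statements that run before and after the write to the target. The statements below are an illustrative sketch only: the demo_db.sales_stage and demo_db.sales_target table names are hypothetical, the statements must be valid Databricks SQL for your workspace, and any restrictions on what these fields accept are defined by the connector, not by this example.

-- Pre SQL (hypothetical example): clear old rows from a staging table before the load
DELETE FROM demo_db.sales_stage WHERE load_date < date_sub(current_date(), 30)

-- Post SQL (hypothetical example): compact the target Delta table after the load
OPTIMIZE demo_db.sales_target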
