Table of Contents


  1. Preface
  2. Introduction to Databricks Connector
  3. Connections for Databricks
  4. Mappings for Databricks
  5. Migrating a mapping
  6. SQL ELT with Databricks Connector
  7. Data type reference
  8. Troubleshooting

Databricks Connector

Mappings for Databricks
When you configure a mapping, you describe the flow of data from the source to the target. A mapping defines reusable data flow logic that you can use in mapping tasks.

When you create a mapping, you define Source, Target, and Lookup transformations to represent Databricks objects. Use the Mapping Designer in Data Integration to add Source, Target, or Lookup transformations to the mapping canvas and configure the Databricks source, target, and lookup properties.

After you create a mapping, you can run it directly or deploy it in a mapping task. A mapping task processes data based on the data flow logic defined in the mapping.
In advanced mode, the Mapping Designer updates the mapping canvas to include transformations and functions that enable advanced functionality. Use Monitor to monitor mapping jobs.
The following table lists the functionality supported by the SQL warehouse, all-purpose cluster, and job cluster:

| Property | SQL warehouse ¹ | All-purpose cluster ² | Job cluster ³ |
| --- | --- | --- | --- |
| Source transformation | Yes | Yes | Yes |
| Target transformation | Yes | Yes | Yes |
| Filter transformation | Yes | No | Yes |
| Lookup transformation | Yes | No | No |
| Sorter transformation | Yes | No | Yes |
| SQL transformation | Yes | No | No |
| Dynamic schema handling | Yes | No | No |
| Identity columns | Yes | No | No |

¹ The Secure Agent connects to the SQL warehouse at design time and at runtime.
² The Secure Agent connects to the all-purpose cluster to import metadata at design time.
³ The Secure Agent connects to the job cluster to run mappings.
