
Table of Contents

  1. Preface
  2. Introduction to Databricks Connector
  3. Connections for Databricks
  4. Mappings for Databricks
  5. Migrating a mapping
  6. SQL ELT with Databricks Connector
  7. Data type reference
  8. Troubleshooting

Databricks Connector

Read from Amazon S3 and write to Databricks

You can configure SQL ELT optimization for a mapping that uses an Amazon S3 V2 connection in the Source transformation to read from Amazon S3 and a Databricks connection in the Target transformation to write to Databricks.
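
With SQL ELT optimization, the mapping logic is converted to SQL that runs directly in Databricks, so the data does not pass through the integration engine. The following is a minimal sketch of the general shape of such pushed-down SQL, using the Databricks COPY INTO command to ingest files from S3. The table name, bucket path, and format options are hypothetical illustrations, not the exact SQL that the service generates.

    -- Illustrative only: the kind of statement a full pushdown can produce.
    -- Databricks reads the S3 files directly; no intermediate staging occurs
    -- in the integration engine.
    COPY INTO healthcare.patient_records
    FROM 's3://example-ehr-bucket/records/'
    FILEFORMAT = CSV
    FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
    COPY_OPTIONS ('mergeSchema' = 'true');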

Example

You work for a healthcare organization. Your organization offers a suite of services to manage electronic medical records, patient engagement, telephonic health services, and care coordination. The organization runs its infrastructure on Amazon Web Services and stores its data on Amazon S3. Management plans to load the data into a data warehouse to perform healthcare analytics and derive data points that improve operational efficiency. To load data from an Amazon S3-based storage object to Databricks, you use ETL and ELT with the transformations that the data warehouse model requires.
Use an Amazon S3 V2 connection to read data from a file object in the Amazon S3 source and a Databricks connection to write to the Databricks target. Configure full SQL ELT optimization in the mapping to optimize performance.
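
With full SQL ELT optimization, downstream transformation logic in the mapping is also translated into SQL that runs inside Databricks rather than in the integration engine. A sketch of what that ELT stage might look like for this scenario, with hypothetical table and column names:

    -- Illustrative only: transformation logic pushed down to Databricks.
    -- The aggregation executes in the warehouse after the S3 data is loaded.
    INSERT INTO healthcare.visit_summary
    SELECT patient_id,
           COUNT(*)        AS visit_count,
           MAX(visit_date) AS last_visit
    FROM healthcare.patient_records
    GROUP BY patient_id;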
