Table of Contents

  1. Preface
  2. Introduction to Databricks Connector
  3. Connections for Databricks
  4. Mappings for Databricks
  5. Migrating a mapping
  6. SQL ELT with Databricks Connector
  7. Data type reference
  8. Troubleshooting

Databricks Connector

External tables in Databricks

External tables store their data in locations outside of the predefined managed storage location associated with the metastore, Unity Catalog, or schema. An external table references an external storage path by using a LOCATION clause. For more information on external tables, see the Databricks documentation.
You can read data from external tables of Delta, Parquet, and CSV formats in Databricks. You can write data only to external tables of Delta format in Databricks.
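As an illustration, a Databricks SQL statement that defines an external Delta table might look like the following sketch. The catalog, schema, column, and storage-path names are placeholders, not values from this guide:

```sql
-- Hypothetical example: an external table whose data lives outside the
-- metastore's managed storage. The LOCATION clause points at an
-- external storage path (the path shown is a placeholder).
CREATE TABLE main.sales.orders_ext (
  order_id  BIGINT,
  amount    DECIMAL(10, 2),
  order_ts  TIMESTAMP
)
USING DELTA
LOCATION 'abfss://data@mystorageaccount.dfs.core.windows.net/tables/orders_ext';
```

Because the table is external, dropping it removes only the metadata; the data files at the storage path remain in place.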
External tables of Parquet and CSV formats don't apply to mappings in advanced mode.
You can read or write data of the following data types to external tables:
  • Array*
  • Binary
  • Bigint
  • Boolean
  • Date
  • Decimal
  • Double
  • Float
  • Int
  • Map*
  • Smallint
  • String
  • Struct*
  • Tinyint
  • Timestamp
*Applies only to mappings in advanced mode.
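The data types above map directly to Databricks SQL column types. A hedged sketch of an external Delta table that mixes simple and complex types follows; all names and the storage path are hypothetical, and the Array, Map, and Struct columns would be usable only in mappings in advanced mode:

```sql
-- Hypothetical example: simple columns work in all mappings; the
-- ARRAY, MAP, and STRUCT columns apply only to advanced mode.
CREATE TABLE main.sales.customers_ext (
  customer_id INT,
  name        STRING,
  is_active   BOOLEAN,
  tags        ARRAY<STRING>,
  attributes  MAP<STRING, STRING>,
  address     STRUCT<street: STRING, city: STRING, zip: STRING>
)
USING DELTA
LOCATION 's3://my-bucket/tables/customers_ext';
```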
When you configure a target operation to create a new external table target at runtime, specify the path to the external table in the table location. For more information, see Create a target table at runtime.
