Snowflake Data Cloud Connector

JDBC URL parameters

You can use the Additional JDBC URL Parameters field in the Snowflake Data Cloud connection to set additional parameters when you connect to Snowflake.
You can configure the following properties as additional JDBC URL parameters in the Snowflake Data Cloud connection:
  • To override the database and schema name used to create temporary tables in Snowflake, enter the database and schema name in the following format:
    ProcessConnDB=<database_name>&ProcessConnSchema=<schema_name>
  • To view only the specified database and schema while importing a Snowflake table, enter the database and schema name in the following format:
    db=<database_name>&schema=<schema_name>
  • To read UDF string and numeric data from Snowflake, enter the database and schema where the UDF is created in Snowflake in the following format:
    db=<database_name>&schema=<schema_name>
  • To access Snowflake through Okta SSO authentication, enter the web-based IdP implementing SAML 2.0 protocol in the following format:
    authenticator=https://<Your_Okta_Account_Name>.okta.com
    Microsoft ADFS is not supported.
    For more information about configuring Okta authentication, see Configuring Snowflake to use federated authentication.
  • To load data from Amazon S3, Google Cloud Storage, or Microsoft Azure Data Lake Storage Gen2 to Snowflake for SQL ELT optimization, enter the Cloud Storage Integration name created for the Amazon S3, Google Cloud Storage, or Microsoft Azure Data Lake Storage Gen2 account in Snowflake in the following format:
    storage_integration=<Storage Integration name>
    The storage integration name is case-sensitive. For example, if the storage integration you created in Snowflake for Amazon S3, Google Cloud Storage, or Microsoft Azure Data Lake Storage Gen2 is named STORAGE_INT, you need to specify the same integration name:
    storage_integration=STORAGE_INT
    You can also load data from Amazon S3 to Snowflake for SQL ELT optimization without using storage integration.
  • To connect to Snowflake through a proxy server, enter the following parameters:
    useProxy=true&proxyHost=<Proxy host IP address>&proxyPort=<Proxy server port number>&proxyUser=<Proxy server user name>&proxyPassword=<Proxy server password>
  • To ignore the case of quoted identifiers and treat all table names as case-insensitive, enter the following parameter:
    QUOTED_IDENTIFIERS_IGNORE_CASE=true
    When you set this property to true in the connection, Snowflake ignores the double quotes around identifiers and treats all table names as case-insensitive.
    After you set this property to true, you cannot access case-sensitive tables with the same connection. You need to create a new connection to fetch any existing case-sensitive tables.
  • To filter queries that are executed in a Snowflake job on the Snowflake web interface, enter the tag name in the following format:
    query_tag=<Tag name>
    You can override the query_tag parameter defined in the Snowflake connection when you run a mapping task.
    To override the query_tag parameter, click the Runtime Options tab of the mapping task. In the Advanced Session Properties section, select Custom Properties from the Session Property Name list, and then enter the following value:
    snowflake_query_tag=<Tag name>
    In advanced mode, you can't override the query_tag parameter.
In addition to the parameters listed, you can use this field to configure other Snowflake JDBC parameters based on your requirements.
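Because the field accepts a plain ampersand-separated string of key=value pairs, you can assemble the parameter sets described above programmatically before pasting them into the connection. The following Python sketch is illustrative only: the helper name build_jdbc_params and the sample database, schema, and tag values are assumptions, not part of the connector.

```python
def build_jdbc_params(params: dict) -> str:
    """Join key=value pairs with '&' in the format expected by the
    Additional JDBC URL Parameters field. Hypothetical helper; keys and
    values are passed through as-is, so avoid characters that are not
    valid in a JDBC URL."""
    return "&".join(f"{key}={value}" for key, value in params.items())


# Example: override the database and schema shown while importing a table,
# and tag queries for filtering in the Snowflake web interface.
# ANALYTICS, PUBLIC, and nightly_load are assumed sample values.
params = build_jdbc_params({
    "db": "ANALYTICS",
    "schema": "PUBLIC",
    "query_tag": "nightly_load",
})
print(params)  # db=ANALYTICS&schema=PUBLIC&query_tag=nightly_load
```

Building the string from a dictionary keeps each parameter in one place and avoids the stray spaces or missing ampersands that can creep in when the string is edited by hand.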
