Snowflake Data Cloud Connector

Mappings in advanced mode example

You work for a retail company that offers more than 50,000 products and has stores distributed across the globe. The company ingests a large volume of customer engagement details from its transactional CRM system into Amazon S3.
The sales team wants to improve customer engagement and satisfaction at every touch point. To create a seamless customer experience and deliver personalized service across the various outlets, the company plans to load the data, which is stored in the Parquet file format, from the Amazon S3 bucket to Snowflake.
You can create a mapping in advanced mode that reads data from the Amazon S3 bucket and writes it to the Snowflake target. You can also add transformations to the mapping to process the raw data that you read from the Amazon S3 bucket before you write the curated data to Snowflake.
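To illustrate the end-to-end flow conceptually, the following is a minimal Python sketch of the same read-transform-write pipeline, assuming a hypothetical bucket, file, field, and table name and placeholder connection parameters. It uses pandas (with the s3fs package) to read the Parquet file and the Snowflake Connector for Python to load the curated rows; it is a sketch of the data flow, not a substitute for the mapping.

  import pandas as pd
  import snowflake.connector
  from snowflake.connector.pandas_tools import write_pandas

  # Read the raw customer engagement details from the Parquet file in
  # Amazon S3 (hypothetical bucket and file names).
  df = pd.read_parquet("s3://retail-crm-bucket/customer_engagement.parquet")

  # Curate the data: convert the file name field to uppercase, mirroring
  # the Expression transformation in the mapping (FILENAME is a
  # hypothetical field name).
  df["FILENAME"] = df["FILENAME"].str.upper()

  # Write the curated rows to an existing Snowflake target table with an
  # insert operation (placeholder credentials, hypothetical table name).
  conn = snowflake.connector.connect(
      account="<account>",
      user="<user>",
      password="<password>",
      warehouse="<warehouse>",
      database="<database>",
      schema="<schema>",
  )
  try:
      write_pandas(conn, df, table_name="CUSTOMER_ENGAGEMENT")
  finally:
      conn.close()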
The following example illustrates how to create a mapping that reads from an Amazon S3 source and writes to Snowflake. The mapping includes an Amazon S3 source, an Expression transformation, and a Snowflake target.
  1. In Data Integration, click New > Mappings > Mapping.
  2. In the Mapping Designer, click Switch to Advanced.
     The following image shows the Switch to Advanced button in the Mapping Designer header:
  3. In the Switch to Advanced dialog box, click Switch to Advanced.
     The Mapping Designer updates the mapping canvas to display the transformations and functions that are available in advanced mode.
  4. Enter a name, location, and description for the mapping.
  5. Add a Source transformation, and specify a name and description in the general properties.
  6. On the Source tab, perform the following steps to read data from the Amazon S3 source:
     1. In the Connection field, select the Amazon S3 V2 connection.
     2. In the Source Type field, select single object as the source type.
     3. In the Object field, select the Parquet file object that contains the customer details.
     4. In the Advanced Properties section, specify the required parameters.
     The following image shows the configured Source transformation properties that read customer engagement details from the Amazon S3 object:
  7. Add an Expression transformation. On the Expression tab, define an expression that converts the file name port of the customer Parquet file to uppercase, based on your business requirement, before you write the data to the Snowflake target. For example, an expression such as UPPER(FileName) returns the incoming file name in uppercase, where FileName is the name of the incoming file name field in this example.
     The following image shows the configured Expression transformation properties:
  8. Add a Target transformation, and specify a name and description in the general properties.
  9. On the Target tab, specify the details to write data to Snowflake:
     1. In the Connection field, select the Snowflake Data Cloud target connection.
     2. In the Target Type field, select single object.
     3. In the Object field, select the Snowflake object to which you want to write the curated customer engagement data.
     4. In the Operation field, select the insert operation.
     5. In the Advanced Properties section, specify the required advanced target properties.
     The following image shows the configured Snowflake Target transformation properties:
  10. Click Save > Run to validate the mapping.
     In Monitor, you can monitor the status of the logs after you run the task. To spot-check the loaded rows directly in Snowflake, see the sketch after these steps.
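After the run completes, you can verify the results by querying the target table directly, in addition to checking the job status in Monitor. The following is a minimal verification sketch that uses the Snowflake Connector for Python, assuming the same hypothetical CUSTOMER_ENGAGEMENT table and placeholder connection parameters:

  import snowflake.connector

  conn = snowflake.connector.connect(
      account="<account>",
      user="<user>",
      password="<password>",
      warehouse="<warehouse>",
      database="<database>",
      schema="<schema>",
  )
  try:
      cur = conn.cursor()
      # Count the rows that the mapping inserted into the target table.
      cur.execute("SELECT COUNT(*) FROM CUSTOMER_ENGAGEMENT")
      print("Rows loaded:", cur.fetchone()[0])
  finally:
      conn.close()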
