You can use PowerExchange for Snowflake to extract data from and load data to Snowflake. You can also read data from and write data to Snowflake accounts that are enabled for staging data on Amazon Web Services, Microsoft Azure, or Google Cloud Platform.
You can use Snowflake objects as sources and targets in mappings. When you use Snowflake objects in mappings, you must configure properties specific to Snowflake. You can also use Snowflake sources and targets in dynamic mappings.
You can validate and run Snowflake mappings in the native environment or in a non-native environment such as Hadoop or Databricks. In the Hadoop environment, you can run mappings on the Spark engine. You can also run profiles against Snowflake objects in the native environment. To run Snowflake mappings on the Spark engine, PowerExchange for Snowflake uses the Snowflake Spark Connector APIs. To run mappings in the native environment, it uses the Snowflake loader APIs.
You can also use Snowflake as a target in a streaming mapping. For more information, see the Data Engineering Streaming User Guide.
Example
An enterprise application uses an Oracle database to store the product transaction details such as transactionID, customerID, productID, quantity, and order date. You need to analyze the completed transactions, pending transactions, and availability of stock. Use PowerExchange for Snowflake to create a mapping to extract all the transaction records from the Oracle source, and load the records to a Snowflake target for data analysis.
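The analysis step in this scenario can be sketched, independently of Informatica, in plain Python: bucket the extracted transaction records by status and total the ordered quantity per product to compare against stock. This is an illustrative sketch only, not PowerExchange functionality; the `status` field, its values, and the sample data are assumptions, while the transactionID, customerID, productID, quantity, and order date fields come from the example.

```python
# Illustrative sketch of the example's analysis, not Informatica code.
# The "status" field and all sample values are assumptions; the column
# names follow the transaction fields described in the scenario.
from collections import defaultdict

records = [
    {"transactionID": 1001, "customerID": "C01", "productID": "P10",
     "quantity": 2, "order_date": "2024-01-05", "status": "completed"},
    {"transactionID": 1002, "customerID": "C02", "productID": "P11",
     "quantity": 1, "order_date": "2024-01-06", "status": "pending"},
    {"transactionID": 1003, "customerID": "C01", "productID": "P10",
     "quantity": 5, "order_date": "2024-01-07", "status": "completed"},
]

def group_by_status(rows):
    """Bucket transaction rows by their status field."""
    groups = defaultdict(list)
    for row in rows:
        groups[row["status"]].append(row)
    return dict(groups)

def stock_demand(rows):
    """Total quantity ordered per product, to compare against stock."""
    demand = defaultdict(int)
    for row in rows:
        demand[row["productID"]] += row["quantity"]
    return dict(demand)

groups = group_by_status(records)
print(len(groups["completed"]), len(groups["pending"]))  # 2 1
print(stock_demand(records))  # {'P10': 7, 'P11': 1}
```

In the mapping itself, this kind of grouping and aggregation would typically be pushed into the transformation logic or into queries against the Snowflake target rather than done in application code.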