Table of Contents


  1. Preface
  2. Introduction to Data Engineering Streaming
  3. Data Engineering Streaming Administration
  4. Sources in a Streaming Mapping
  5. Targets in a Streaming Mapping
  6. Streaming Mappings
  7. Transformation in Streaming Mappings
  8. Window Transformation
  9. Appendix A: Connections
  10. Appendix B: Monitoring REST API Reference
  11. Appendix C: Sample Files

Snowflake Data Objects

A Snowflake data object is a physical data object that represents data based on a Snowflake resource. After you create a Snowflake connection, create a Snowflake data object to write data to Snowflake.
After you create a Snowflake data object, create a write operation and configure its properties to determine how data is written to Snowflake. You can use the Snowflake data object write operation as a target in streaming mappings.
To run streaming mappings with Snowflake as the target, you must specify the private key to authenticate to Snowflake and the Snowflake internal stage in the advanced properties of the Snowflake data object write operation.
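Snowflake key-pair authentication uses a PKCS#8 private key, and clients typically expect the key supplied as a single base64 string without the PEM header and footer. The helper below is an illustrative sketch of that conversion only; the exact property name and key format the write operation expects may differ, so check the advanced properties of your write operation.

```python
# Illustrative only: convert a PKCS#8 private key in PEM format into the
# single-line base64 string that many Snowflake clients accept for
# key-pair authentication. The property name that takes this value in the
# write operation is product-specific and not shown here.

def pem_to_single_line(pem_text: str) -> str:
    """Strip the PEM header/footer and join the base64 body into one line."""
    lines = [
        line.strip()
        for line in pem_text.strip().splitlines()
        if line.strip() and not line.startswith("-----")
    ]
    return "".join(lines)

if __name__ == "__main__":
    # Dummy key body for demonstration. A real key is generated with a
    # tool such as OpenSSL and registered with the Snowflake user.
    dummy_pem = (
        "-----BEGIN PRIVATE KEY-----\n"
        "MIIBVAIBADANBg\n"
        "kqhkiG9w0BAQEF\n"
        "-----END PRIVATE KEY-----\n"
    )
    print(pem_to_single_line(dummy_pem))  # prints "MIIBVAIBADANBgkqhkiG9w0BAQEF"
```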
When you run streaming mappings to write to Snowflake, the Spark engine writes data to a streaming Snowflake data frame. The data frame in turn writes to the Snowflake internal stage. After the data is available in the Snowflake internal stage, Snowpipe, which is Snowflake's continuous data ingestion service, loads data from the internal stage to Snowflake.
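The two-step load path above can be sketched as a toy simulation. All names here (`InternalStage`, `SnowflakeTable`, `snowpipe_load`) are hypothetical stand-ins for services that Snowflake manages internally; the sketch only models the flow of micro-batches through a stage into a table.

```python
# A toy sketch of the streaming load path: each micro-batch lands as a
# file in the internal stage, and a Snowpipe-like step copies staged
# files into the target table. All names are hypothetical; the real
# internal stage and Snowpipe run inside Snowflake.

class InternalStage:
    def __init__(self):
        self.files = {}          # file name -> list of rows

    def put(self, name, rows):
        self.files[name] = rows  # the streaming data frame flushes a micro-batch

class SnowflakeTable:
    def __init__(self):
        self.rows = []

def snowpipe_load(stage, table):
    """Copy every staged file into the table, then mark it as consumed."""
    for name in list(stage.files):
        table.rows.extend(stage.files.pop(name))

stage = InternalStage()
table = SnowflakeTable()

# Two micro-batches arriving from the streaming data frame.
stage.put("batch_0000.csv", [("k1", 1), ("k2", 2)])
stage.put("batch_0001.csv", [("k3", 3)])

snowpipe_load(stage, table)
print(len(table.rows), len(stage.files))  # prints "3 0": 3 rows loaded, stage drained
```

Because the loader treats every file in the stage as pending data, adding or removing files by hand while the stream runs would duplicate or lose rows, which is why the stage must be dedicated to a single streaming job.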
To avoid conflicts while loading data, do not share the internal stage that PowerExchange for Snowflake uses with any other streaming job, and do not use it to load other files. Do not manually add, modify, or remove files in the internal stage while a streaming mapping runs.

