Refer to the following questions and answers for information about Informatica Intelligent Cloud Services for Snowflake Accelerator.
How can I interact with Snowflake using Informatica Intelligent Cloud Services?
Log in to Informatica Intelligent Cloud Services and open Data Integration. On the Home page, click New in the navigation menu on the left to create new mappings and tasks. For more information, see Getting Started.
When you sign up, a new login, database, warehouse, and role are created for you in Snowflake. Initially, there are no tables in this account that you can read from.
You can create tables in the database using either of the following methods:
Use Data Integration to load data using mappings. Create a mapping and configure a Snowflake target in the mapping. For information about creating a mapping, see Mappings. When you add a Snowflake target to a mapping, Data Integration creates the target table when you run the mapping.
Create tables within the Snowflake console using the new role.
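For example, a minimal sketch of creating a table in the Snowflake console might look like the following. The database, warehouse, role, and table names here are hypothetical; substitute the ones that were created for your account.

    -- Switch to the role, warehouse, and database created for your account
    -- (names are placeholders).
    USE ROLE INFA_ROLE;
    USE WAREHOUSE INFA_WH;
    USE DATABASE INFA_DB;

    -- Create a simple table that a mapping can later read from.
    CREATE TABLE INFA_DB.PUBLIC.CUSTOMERS (
        CUSTOMER_ID   NUMBER,
        CUSTOMER_NAME VARCHAR(255),
        CREATED_AT    TIMESTAMP_NTZ
    );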
If you want to read data from tables that exist in some other database in your account, you can create additional Snowflake connections in the Informatica Intelligent Cloud Services Administrator service and use the connections in your mappings.
How do I load data to Snowflake or read data from it using Data Integration?
Try any of the following:
Upload a CSV file to Snowflake using a mass ingestion task.
Create a mapping to transform data before you load it to Snowflake.
Run a mapping in ELT mode using SQL ELT optimization, as shown in the sketch after this list.
For more information about creating mappings, see Mappings. For more information about mass ingestion tasks, see Tasks.
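As a rough, hypothetical illustration of ELT mode, SQL ELT optimization pushes transformation logic into Snowflake as SQL so the data is transformed where it already resides. The actual statements that Data Integration generates depend on your mapping logic; the tables and columns below are made up.

    -- Hypothetical pushdown: filter and aggregate inside Snowflake, then
    -- load the result into the target table without moving data out of
    -- the warehouse.
    INSERT INTO SALES_SUMMARY (REGION, TOTAL_AMOUNT)
    SELECT REGION, SUM(AMOUNT)
    FROM SALES_RAW
    WHERE SALE_DATE >= '2024-01-01'
    GROUP BY REGION;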
What are mappings and tasks?
A mapping defines reusable data flow logic and describes the flow of data from source to target. You create a mapping using the Mapping Designer, a drag-and-drop graphical interface. For more information about mappings, see Mappings.
A task is a process that you configure to analyze, extract, transform, and load data. For example, you configure a mapping task to run a mapping and a mass ingestion task to transfer large numbers of files between on-premises and cloud repositories. You create tasks using a wizard-based interface. For more information about tasks, see Tasks.
Are there any steps that I need to perform before I create and run mappings and tasks?
Check the following to ensure that your organization is configured properly to interact with Snowflake:
Log in to Informatica Intelligent Cloud Services, open Administrator, and check the following:
Open the Runtime Environments page and verify that you see at least one runtime environment with the status "Up and Running."
You will see the Informatica Cloud Hosted Agent on this page. If you want to read from or write to cloud-based applications, you can use the Hosted Agent. If you want to work with data inside your firewall, you must download and install a Secure Agent.
Open the Connections page and verify that you see a Snowflake Data Cloud connection. Click Test Connection and verify that the test is successful.
If you plan to read data from Snowflake, complete the following steps to increase the JVM heap size on the machine that hosts your runtime environment.
You don't need to complete these steps if you use the Hosted Agent or if you only plan to load data to Snowflake.
In Administrator, open the Runtime Environments page and locate the Secure Agent that you installed.
The Secure Agent is located in a group that usually has the same name as the agent.
In the row that contains the Secure Agent, open the Actions menu and select Edit Secure Agent.
In the System Configuration Details area, select the Data Integration Server service and the DTM type.
Edit the JVMOption1 property and set the JVM memory value to -Xmx256m.
Click Save.
Can I use the Informatica Cloud Hosted Agent with Snowflake?
Yes. The Snowflake connector supports the Informatica Cloud Hosted Agent. You can use it if you are working with cloud endpoints. If you need to access any on-premises data sources, you must use the Secure Agent.
Does Informatica support Snowflake on Microsoft Azure?
Yes. Informatica Intelligent Cloud Services supports Snowflake on Azure.
Does Informatica support Snowflake on Google?
Yes. Informatica Intelligent Cloud Services supports Snowflake on Google Cloud Platform.
Can I configure database SQL ELT optimization?
You can configure source and full SQL ELT optimization for the Snowflake database using a Snowflake Data Cloud connection. You can also configure full SQL ELT optimization for tasks that read from Snowflake, Amazon S3, Google Cloud Storage, and Microsoft Azure Data Lake Storage Gen2 sources and write to a Snowflake target.
For more information about the rules and guidelines for configuring SQL ELT optimization, see the Snowflake Data Cloud Connector Guide.
Can I partition data when I read from or write to Snowflake?
Yes, you can configure key range partitioning when reading data from Snowflake. You can configure pass-through partitioning along with other target properties to partition data when writing data to Snowflake. For more information, see the Snowflake Data Cloud Connector Guide.
Can I configure a Snowflake connection using a proxy?
Yes, you can use a proxy server for the Snowflake connection.
Can I use an SQL query to read data from Snowflake?
When you use the Snowflake Data Cloud Connector on Informatica Intelligent Cloud Services, you can specify a query as the source object type.
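For example, a source query might look like the following minimal sketch. The table and column names are hypothetical.

    -- Hypothetical source query: read only recent, completed orders.
    SELECT ORDER_ID, CUSTOMER_ID, ORDER_TOTAL
    FROM ORDERS
    WHERE ORDER_STATUS = 'COMPLETE'
      AND ORDER_DATE >= '2024-01-01';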
Can I use an SQL override for Snowflake lookups?
When you use the Snowflake Data Cloud Connector on Informatica Intelligent Cloud Services, you can configure an SQL override in the Source transformation advanced properties.
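For example, an SQL override might look like the following sketch, which replaces the default query with a join so that each returned row already carries the region name. The object names are hypothetical.

    -- Hypothetical override query used in place of the generated SELECT.
    SELECT C.CUSTOMER_ID, C.CUSTOMER_NAME, R.REGION_NAME
    FROM CUSTOMERS C
    JOIN REGIONS R ON C.REGION_ID = R.REGION_ID;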
Can I read data from tables and views in Snowflake?
Yes, you can read data from Snowflake tables and views.
If you use mappings, you can use the Informatica expression language to construct a value to be written to a variant column. If you are reading a variant field, you can write a Snowflake SQL query to read specific attributes from it. You can also load data into a variant column using a mass ingestion task.
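For example, a Snowflake SQL query that reads specific attributes from a variant column might look like the following sketch. The table, column, and attribute names are hypothetical.

    -- Hypothetical query: extract individual attributes from a variant
    -- column named EVENT_DATA using path notation and casts.
    SELECT
        EVENT_DATA:device:type::STRING       AS DEVICE_TYPE,
        EVENT_DATA:user_id::NUMBER           AS USER_ID,
        EVENT_DATA:event_time::TIMESTAMP_NTZ AS EVENT_TIME
    FROM EVENTS;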
For information about loading JSON files into Snowflake, see this KB article.
Do other Informatica products support Snowflake?
Yes. For information about working with Snowflake on other Informatica products, such as Data Engineering Integration (formerly called Big Data Management), PowerCenter, or Data Quality, see this KB article.