Informatica Adapters

This section describes new Informatica adapter features.
PowerExchange for DataSift
You can extract historical data from DataSift for Twitter sources.
For more information, see the Informatica PowerExchange for DataSift 9.6.1 User Guide.
PowerExchange for Greenplum
  • You can use PowerExchange for Greenplum to load large volumes of data into Greenplum tables. You can run mappings developed in the Developer tool in either the native or Hive run-time environment.
  • You can also use PowerExchange for Greenplum to load data to a HAWQ database in bulk.
For more information, see the Informatica PowerExchange for Greenplum 9.6.1 User Guide.
PowerExchange for LinkedIn
You can extract information about a group, posts in a group, comments on a group post, and comments on specific posts from LinkedIn. You can also extract a list of groups suggested for the user and a list of groups in which the user is a member.
For more information, see the Informatica PowerExchange for LinkedIn 9.6.1 User Guide.
PowerExchange for HBase
You can use PowerExchange for HBase to read data in parallel from HBase. The Data Integration Service creates multiple Map jobs to read data in parallel.
For more information, see the Informatica PowerExchange for HBase 9.6.1 User Guide.
PowerExchange for Hive
You can create a Hive connection that connects to HiveServer or HiveServer2. Previously, a Hive connection could connect only to HiveServer. HiveServer2 adds support for Kerberos authentication and concurrent client connections.
For more information, see the Informatica PowerExchange for Hive 9.6.1 User Guide.
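As a point of reference for the HiveServer2 support described above, HiveServer2 endpoints are conventionally addressed with a jdbc:hive2:// JDBC URL rather than the original jdbc:hive:// scheme. The host, port, database, and Kerberos realm below are placeholders, not values from this guide:

```
# Original HiveServer (Thrift service, no Kerberos support):
jdbc:hive://hive-host:10000/default

# HiveServer2, which supports Kerberos and concurrent clients:
jdbc:hive2://hive-host:10000/default

# HiveServer2 with Kerberos: the Hive service principal is
# appended to the URL, for example:
jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@EXAMPLE.COM
```

The connection details you actually enter depend on your cluster configuration; consult your Hadoop distribution's documentation for the correct principal and port.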
PowerExchange for MongoDB
You can use the Schema Editor to change the schema of MongoDB collections. You can also use virtual tables for MongoDB collections that have nested columns.
For more information, see the Informatica PowerExchange for MongoDB 9.6.1 User Guide.
PowerExchange for Teradata Parallel Transporter API
When you load data to a Teradata table in a Hive run-time environment, you can use the Teradata Connector for Hadoop (TDCH) to increase performance. To use TDCH to load data, add the EnableTdch custom property at the Data Integration Service level and set its value to true.
For more information, see the Informatica PowerExchange for Teradata Parallel Transporter API 9.6.1 User Guide.
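The custom property described above is a simple name/value pair defined on the Data Integration Service. The property name and value come from this guide; as a sketch, the entry would look like this (custom properties for the Data Integration Service are typically added through the Administrator tool):

```
EnableTdch=true
```

After adding the property, recycle the Data Integration Service so the setting takes effect; subsequent mappings that load Teradata targets in a Hive run-time environment can then use TDCH.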


Updated April 09, 2019