Table of Contents

  1. Preface
  2. Introduction to Data Integration Hub
  3. PowerCenter Mappings and Workflows
  4. Data Engineering Integration and Streaming Mapping and Workflows
  5. Data Quality Mappings and Workflows
  6. Informatica Cloud Mappings and Tasks
  7. Data Integration Hub Workflows
  8. Data Integration Hub Transformations
  9. Operational Data Store Dashboard and Reports
  10. Forms Designer
  11. Data Integration Hub Publications and Subscriptions APIs
  12. Data Extraction APIs

Developer Guide

Before You Begin

Before you develop Data Engineering Integration mappings, Data Engineering Streaming mappings, or Data Engineering Integration workflows to use in Data Integration Hub, verify that the following conditions are met:
  • The following Data Integration Hub components are installed:
    • Data Integration Hub Hadoop Service.
    • Data Integration Hub Data Engineering Integration.
  • The topic to which you publish data and from which you consume data is configured in Data Integration Hub. The topic must be a Hadoop-based topic.
  • The following connections are configured in Data Integration Hub:
    • A connection to the publishing application.
    • A connection to the subscribing application.
  • A Hadoop connection to the cluster where the mappings run exists in your environment. The connection must be a cluster connection and must push mapping logic to the Hadoop cluster.
Before you develop Data Engineering Streaming mappings, copy the runtime .jar files from <DIHInstallationDir>/powercenter/lib to the following location:

BDS_HOME/services/shared/hadoop/$Distribution/extras/spark-auxjars/
