Table of Contents

  1. Preface
  2. Introduction to Data Integration Hub
  3. PowerCenter Mappings and Workflows
  4. Data Engineering Integration and Streaming Mapping and Workflows
  5. Data Quality Mappings and Workflows
  6. Informatica Cloud Mappings and Tasks
  7. Data Integration Hub Workflows
  8. Data Integration Hub Transformations
  9. Operational Data Store Dashboard and Reports
  10. Forms Designer
  11. Data Integration Hub Publications and Subscriptions APIs
  12. Data Extraction APIs

Developer Guide

Data Engineering Integration and Streaming Mapping and Workflows Overview

Data Integration Hub uses Data Engineering Integration and Data Engineering Streaming to run Data Integration Hub big data publications and subscriptions. You use Data Engineering Integration mappings to run custom batch publications and subscriptions that publish and consume large, diverse, and fast-changing data sets. You use Data Engineering Integration workflows, which contain multiple mappings, to run multiple custom batch publications and subscriptions that publish and consume large, diverse, and fast-changing data sets. You use Data Engineering Streaming mappings to run custom multi-latency publications that publish streams of data in real time.

You create a Data Engineering Integration workflow by using multiple Data Engineering Integration mappings.

The Data Integration Service runs the Data Engineering Integration mappings, Data Engineering Streaming mappings, and Data Engineering Integration workflows in the Hadoop environment.

You use the Developer tool to develop the Data Engineering Integration mappings, Data Engineering Streaming mappings, and Data Engineering Integration workflows that process the publications and subscriptions. You then use the Data Integration Hub Operation Console to import the mappings into a Data Integration Hub workflow.

The Data Integration Hub operator creates a publication or a subscription in the Data Integration Hub Operation Console and selects the Data Integration Hub workflow that is based on the Data Engineering Integration mapping, Data Engineering Streaming mapping, or Data Engineering Integration workflow. For more information, see the Data Integration Hub Operator Guide.

Sample mappings

You can find sample mappings in the following locations:
  • Data Engineering Integration mappings: <DIHInstallationDir>/samples/bdm_mappings. Each sample mapping has an associated readme file that describes the sample mapping and contains guidelines for using the mapping as a basis to create your own mappings.
  • Data Engineering Streaming mappings: <DIHInstallationDir>/samples/bds_mappings. This folder contains sub-folders with sample mappings for an Oracle publication repository and for a Microsoft SQL Server publication repository. The readme file in this folder describes the sample mappings and contains guidelines for using the mappings as a basis to create your own mappings.
  • Data Engineering Integration workflows: <DIHInstallationDir>/samples/bdm_workflows. This folder contains sub-folders with sample workflows for an Oracle publication repository and for a Microsoft SQL Server publication repository. The readme file in this folder describes the sample workflows and contains guidelines for using the workflows as a basis to create your own workflows.
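If you want to review the samples before you build your own mappings or workflows, the following Python sketch lists the contents of each sample folder and prints the readme files it finds. The script is not part of the Data Integration Hub installation; it assumes a Unix-style file system, and the /opt/informatica/DIH path is only a placeholder for your own <DIHInstallationDir>.

    # browse_dih_samples.py - list the Data Integration Hub sample folders and
    # print the readme files that describe each sample.
    from pathlib import Path

    # Placeholder only: replace with your own <DIHInstallationDir>.
    DIH_INSTALL_DIR = Path("/opt/informatica/DIH")

    SAMPLE_FOLDERS = [
        "samples/bdm_mappings",   # Data Engineering Integration mappings
        "samples/bds_mappings",   # Data Engineering Streaming mappings
        "samples/bdm_workflows",  # Data Engineering Integration workflows
    ]

    for relative in SAMPLE_FOLDERS:
        folder = DIH_INSTALL_DIR / relative
        print(f"== {folder} ==")
        if not folder.is_dir():
            print("   (not found - check the installation directory)")
            continue
        # Show everything under the folder, including the Oracle and
        # Microsoft SQL Server sub-folders where they exist.
        for entry in sorted(folder.rglob("*")):
            print("  ", entry.relative_to(folder))
        # Print the readme files so you can read the usage guidelines.
        for readme in sorted(folder.glob("**/[rR]eadme*")):
            if readme.is_file():
                print(f"-- {readme.relative_to(folder)} --")
                print(readme.read_text(errors="replace"))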
