Table of Contents
  1. Preface
  2. Introduction to Data Integration Hub
  3. PowerCenter Mappings and Workflows
  4. Data Engineering Integration and Streaming Mapping and Workflows
  5. Data Quality Mappings and Workflows
  6. Informatica Cloud Mappings and Tasks
  7. Data Integration Hub Workflows
  8. Data Integration Hub Transformations
  9. Operational Data Store Dashboard and Reports
  10. Forms Designer
  11. Data Integration Hub Publications and Subscriptions APIs
  12. Data Extraction APIs

Developer Guide

Data Quality Mappings and Workflows Overview
Data Quality is a processing engine that Data Integration Hub uses to run Data Integration Hub custom publications and subscriptions for on-premises applications. The Data Integration Service runs the Data Quality mappings and workflows in the native environment.

You use the Developer tool to develop the Data Quality mappings that process the publications and subscriptions. You then use the Data Integration Hub Operation Console to import the Data Quality mapping into a Data Integration Hub workflow. You can create a Data Quality workflow from multiple Data Quality mappings.

The Data Integration Hub operator creates a publication or a subscription in the Data Integration Hub Operation Console and selects the Data Integration Hub workflow that is based on the Data Quality mapping or Data Quality workflow. For more information, see the Data Integration Hub Operator Guide.
You can find sample mappings in the following directory: <DIHInstallationDir>/samples/idq_mappings. Each sample mapping has an associated readme file that describes the sample mapping and contains instructions.