Table of Contents


  1. Preface
  2. Introduction to Data Integration Hub
  3. PowerCenter Mappings and Workflows
  4. Data Engineering Integration and Streaming Mapping and Workflows
  5. Data Quality Mappings and Workflows
  6. Informatica Cloud Mappings and Tasks
  7. Data Integration Hub Workflows
  8. Data Integration Hub Transformations
  9. Operational Data Store Dashboard and Reports
  10. Forms Designer
  11. Data Integration Hub Publications and Subscriptions APIs
  12. Data Extraction APIs

Developer Guide

Creating the Source Definition for a Subscription Workflow

In the PowerCenter Designer, create the source and define the source properties of a subscription workflow. When you develop a subscription workflow, you define the source based on the topic from which the subscribing application consumes the published data.
The source definition process includes the following steps:
  1. Create the source object.
    Set the source connection to DIH__STAGING. You create a source based on the topic structure from the publication repository. You can copy the source from the publication metadata folder in the PowerCenter repository.
  2. Add variables to filter the published data to consume. For example, you can choose to consume data from a specific table in the publication repository.
  3. Add the required fields that determine the data set to consume (see the sketch after this list). You can define multiple data sets to consume, similar to an aggregated or a compound subscription with an automatic mapping.
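For orientation, the following is a minimal sketch of how the source structure might look on an Oracle publication repository. The table name MY_TABLE and the business columns are illustrative assumptions; the DIH__ fields are the system fields described in the table that follows.

-- Hypothetical sketch of a topic table in the DIH__STAGING publication repository (Oracle).
-- CUSTOMER_ID and CUSTOMER_NAME stand in for the columns of your own topic structure.
-- The DIH__ columns hold the publication instance identifier and the publication date and time.
CREATE TABLE MY_TABLE (
  CUSTOMER_ID                     NUMBER(10),
  CUSTOMER_NAME                   VARCHAR2(255),
  DIH__PUBLICATION_INSTANCE_ID    NUMBER(19),
  DIH__PUBLICATION_INSTANCE_DATE  DATE
);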
The following table describes the fields to add to the source object of a subscription workflow:
DIH__PUBLICATION_INSTANCE_ID
Required. Identifiers of one or more published data sets in a comma-separated list. Each data set that an application publishes has a unique identifier. To filter the data to consume, use the value from the $$<topic_name>__PublicationInstanceIDs workflow parameter. The parameter datatype must be number(19) if you write to an Oracle database target or number(19,0) if you write to a Microsoft SQL Server database target.

DIH__PUBLICATION_INSTANCE_DATE
Date and time that each application started publishing the data sets, in a comma-separated list. If you use database partitions, you can filter the data to consume by using the value from the $$<topic_name>__PublicationInstanceDatesSQL workflow parameter. The value format depends on the publication repository database type.
On an Oracle database, the datatype must be date and the value must be in the following format: YYYY-MM-DD HH24:MI:SS
On a Microsoft SQL Server database, the datatype must be datetime and the value must be in the following format: yyyy-mm-dd hh:mi:ss (24h)
If you want to filter the data to consume with a different transformation, you can use the $$<topic_name>__PublicationInstanceDates parameter instead.
You can filter the data to consume in the Source Filter attribute of the Source Qualifier transformation in the subscription workflow. The following example shows the field syntax to filter by ID and date range in a single line:
MY_TABLE.DIH__PUBLICATION_INSTANCE_ID in ($$myTopic__PublicationInstanceIDs) AND MY_TABLE.DIH__PUBLICATION_INSTANCE_DATE in ($$myTopic__PublicationInstanceDatesSQL)
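At run time, the workflow parameters resolve to the publication instances that the subscription consumes. As a rough illustration, with made-up instance IDs and with the date values left as a placeholder because their exact form depends on the publication repository database type described above, the resolved filter might look like this:

MY_TABLE.DIH__PUBLICATION_INSTANCE_ID in (101,102) AND MY_TABLE.DIH__PUBLICATION_INSTANCE_DATE in (<date values in the database-specific format, for example YYYY-MM-DD HH24:MI:SS on Oracle>)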
