Custom PowerCenter Mappings and Workflows Rules and Guidelines
When you develop PowerCenter workflows to use in publications and subscriptions with custom mappings, publication pre-processes, subscription post-processes, and monitoring rules, consider the following rules and guidelines:
General rules and guidelines
Before you develop workflows for Data Integration Hub in PowerCenter, verify that the Data Integration Hub PowerCenter client and server plug-ins are installed and registered to the PowerCenter repository. For details, see the Data Integration Hub Installation and Configuration Guide.
Name and store PowerCenter entities for custom mappings in different folders and with a different naming convention from the PowerCenter entities that Data Integration Hub uses for automatic mappings.
Data Integration Hub uses a separate reporting session to update the status of publication and subscription events that use an automatic mapping. Informatica recommends that you create separate sessions for data processing and for reporting, similar to automatic mappings. You can use a workflow for an automatic mapping as a reference.
You can use user-defined session parameters in custom workflows and define their values in Data Integration Hub or in a parameter file. You can manage the values of user-defined session parameters in Data Integration Hub in the Forms Designer. You cannot manage the values of built-in session parameters in Data Integration Hub. For more information about session parameters, see the section "Working with Session Parameters" in the PowerCenter Advanced Workflow Guide.
Data Integration Hub does not support session parameters in the format $InputFile_$$CustomVariable.
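For example, a parameter file for a custom publication session might define user-defined parameter values like the following. This is an illustrative sketch only; the folder, workflow, session, and parameter names here are hypothetical, and the section header follows the standard PowerCenter parameter file convention:

```
[DIH_Custom.WF:wf_CustomPublication.ST:s_m_CustomPublication]
$$SourceSystem=CRM
$InputFile_Customers=/data/incoming/customers_list.txt
```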
Informatica recommends that custom publications and subscriptions that use PowerCenter mappings use the same event statuses and types as those that Data Integration Hub assigns to automatic mappings. For details, see the Data Integration Hub Operator Guide. You assign event statuses and types in the DX_Event_Details transformation.
To prevent naming conflicts, do not use _DIH__ in parameter names, and do not use workflow and mapping parameters with the same names as those that Data Integration Hub uses in workflows for publications and subscriptions with automatic mappings.
If you publish from a database source or write to a database target with a different database type from the publication repository database type, Data Integration Hub converts the data to a data type that the publication repository database supports. Therefore, if you consume the published data from the publication repository to a different target database, verify that the data type conversion does not create run-time errors during processing. For example, if you publish data from a Microsoft SQL Server database source to an Oracle publication repository, and then consume the published data to a Microsoft SQL Server database target, MIN or MAX values might be converted to a value that is higher or lower than values that the Microsoft SQL Server database target supports.
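The kind of range check involved can be illustrated with a minimal Python sketch. This is not part of Data Integration Hub; it only demonstrates why a value that round-trips through a repository with a wider data type can overflow the target. The bounds shown are the documented Microsoft SQL Server DATETIME limits:

```python
from datetime import datetime

# SQL Server DATETIME supports 1753-01-01 through 9999-12-31, while an
# Oracle DATE can hold much earlier dates. A value published through an
# Oracle publication repository can therefore fall outside the range
# that a SQL Server target supports.
SQLSERVER_DATETIME_MIN = datetime(1753, 1, 1)
SQLSERVER_DATETIME_MAX = datetime(9999, 12, 31, 23, 59, 59)

def fits_sqlserver_datetime(value: datetime) -> bool:
    """Return True if value is inside the SQL Server DATETIME range."""
    return SQLSERVER_DATETIME_MIN <= value <= SQLSERVER_DATETIME_MAX

print(fits_sqlserver_datetime(datetime(1700, 1, 1)))   # False
print(fits_sqlserver_datetime(datetime(2024, 6, 1)))   # True
```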
To use the workflow in
Data Integration Hub
as a publication workflow or a subscription workflow, create a
Data Integration Hub
workflow in the
Data Integration Hub
Operation Console by selecting the workflow in the PowerCenter repository or by selecting the exported workflow definition file. For more information, see
Creating a Data Integration Hub Workflow.
If you publish from a database source, you cannot use the following special characters in table names and in column names of a publication target: space ( ), dash (-), and period (.). The publication process replaces the characters with underscores (_).
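The substitution behaves like the following minimal Python sketch. This is illustrative only; sanitize_name is a hypothetical helper, not a Data Integration Hub function:

```python
import re

def sanitize_name(name: str) -> str:
    """Replace space, dash, and period with underscores, mirroring the
    substitution the publication process applies to table and column
    names of a publication target."""
    return re.sub(r"[ \-.]", "_", name)

print(sanitize_name("ORDER-LINE ITEMS.V2"))  # ORDER_LINE_ITEMS_V2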
When you develop a publication pre-process workflow, call the DX_Start_Publication transformation at the end of the pre-processing workflow, for example in a separate mapping. The transformation instructs the Data Integration Hub server to trigger the publication process. When you configure the DX_Start_Publication transformation, consider the following guidelines:
When a publication pre-process starts a single publication, use the DXEventId port. The event ID ensures that Data Integration Hub uses the same event for both the publication pre-process workflow and the publication workflow and changes the event status accordingly.
If you do not define a DXEventId port, you must define a DXPublicationName port.
When a publication pre-process starts multiple publications, do not use the event ID in the DX_Start_Publication transformation. In this case, you can use the Event Details PowerCenter transformation to change the event status.
Do not call the DX_Start_Publication transformation more than once in a workflow. If you do, Data Integration Hub starts the publication multiple times.
When you develop a workflow for a publication with a file source, if the path of the source file is parameterized, Data Integration Hub picks up the file and moves it to the Data Integration Hub document store. If the path of the source file is hard coded, a PowerCenter source picks up and processes the file. For source files with a parameterized file path, the following rules apply:
For flat file sources, the source file type must be indirect.
For pass-through file sources, the source file type must be direct.
When you select a Data Integration Hub workflow that is based on a PowerCenter workflow to use in a publication with a custom mapping, Data Integration Hub creates the structure of the published data set in the publication repository based on the target definitions of the workflow.
Subscription rules and guidelines
When you develop a subscription post-processing workflow, call the DX_Notification transformation at the end of the workflow. You can find a sample post-processing workflow in the following directory: <DIHInstallationDir>/samples/post_processing_workflow.
When you develop a workflow for a compound subscription, define the behavior if the compound subscription starts manually before all published data sets are ready to consume. For example, you can instruct the mapping to fail the workflow or to ignore empty tables. Published data sets that are not ready to consume have the publication instance ID 0.
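The decision logic can be sketched in a few lines of Python. This is an illustration of the two behaviors, not Data Integration Hub code; select_ready_data_sets is a hypothetical helper that receives one publication instance ID per published data set:

```python
def select_ready_data_sets(instance_ids, fail_on_not_ready=True):
    """instance_ids: one publication instance ID per published data set.
    An ID of 0 means the data set is not ready to consume."""
    not_ready = [i for i, pid in enumerate(instance_ids) if pid == 0]
    if not_ready and fail_on_not_ready:
        # behavior 1: fail the workflow
        raise RuntimeError(f"{len(not_ready)} data set(s) not ready to consume")
    # behavior 2: ignore the empty tables and consume only the ready ones
    return [i for i, pid in enumerate(instance_ids) if pid != 0]

print(select_ready_data_sets([5, 0, 7], fail_on_not_ready=False))  # [0, 2]
```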
When you develop a workflow for a subscription with a file target, you can parameterize the target file path. The following rules and guidelines apply when you parameterize the file path:
For flat file targets, the target file parameter must start with $OutputFile.
For pass-through file targets, the target file parameter must start with $OutputFile_DIHRepoFile_.
When the Data Integration Hub operator creates the subscription in the Data Integration Hub Operation Console, they must specify the target output file name as the value for the output file parameter.
The value of the output file parameter can contain a pattern that ensures that the name is unique for each file, for example ($sequence).
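As a sketch of the two naming conventions, the parameter names in a custom subscription session might look like the following. The workflow, session, and parameter names are invented for illustration, and the ($sequence) pattern keeps each generated file name unique:

```
[DIH_Custom.WF:wf_CustomSubscription.ST:s_m_CustomSubscription]
$OutputFile_Orders=/data/outgoing/orders_($sequence).csv
$OutputFile_DIHRepoFile_Orders=/data/outgoing/orders_raw_($sequence).dat
```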
When you develop a workflow for a subscription that consumes data from topic tables where delta detection is applied, add Update Strategy transformations to the mapping, and define the update strategy for data that exists in the target application. Add one of the following flags for each row in topic tables where delta detection is applied: