The operational data store loader is a PowerCenter workflow that collects event information from the run-time Data Integration Hub repository and then loads the aggregated events to the operational data store. The Dashboard retrieves the aggregated event information and displays it in panels based on the selected KPI.
You can change workflow parameters that affect workflow behavior. For example, you can set how long to wait between each event load process and how many retry attempts to perform before the workflow fails. Do not change any internal workflow parameters.
The workflow determines which events to load based on the difference between the time that the event finished processing and the time that the scheduled load process starts. Use the dx.ods.latency.seconds system property to set how long the workflow waits after an event finishes processing before it loads the event. Increase the latency if you experience clock synchronization issues or if you expect events with longer processing times.
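The latency check described above can be sketched as follows. This is a hedged illustration only, not the workflow's actual implementation; the function name and parameters are hypothetical, and the dx.ods.latency.seconds value is treated here as a plain integer number of seconds.

```python
from datetime import datetime, timedelta

def is_event_eligible(finished_at: datetime, load_start: datetime,
                      latency_seconds: int) -> bool:
    """Hypothetical sketch: an event is loaded only if at least
    dx.ods.latency.seconds have elapsed between the time the event
    finished processing and the time the scheduled load starts."""
    return load_start - finished_at >= timedelta(seconds=latency_seconds)

# With a 600-second latency, an event that finished 10 minutes before the
# load starts is eligible; one that finished 5 minutes before is not.
load_start = datetime(2024, 1, 1, 12, 0, 0)
print(is_event_eligible(datetime(2024, 1, 1, 11, 50, 0), load_start, 600))  # True
print(is_event_eligible(datetime(2024, 1, 1, 11, 55, 0), load_start, 600))  # False
```

Raising the latency widens the window between event completion and load eligibility, which compensates for clock drift between the repository and the PowerCenter host.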
If you process a large volume of events, you can change Data Integration Hub system properties to minimize bottlenecks and to increase performance during the event load process. The workflow loads events in batches. Use the dx.ods.row.limit.thousands system property to determine the number of events to include in each batch.
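The batching behavior can be illustrated with a short sketch. Note the assumptions: the helper name is hypothetical, and the sketch assumes, based on the property name, that the dx.ods.row.limit.thousands value is expressed in thousands of events per batch.

```python
def batch_events(event_ids, row_limit_thousands):
    """Hypothetical sketch: split pending events into batches whose size
    is derived from the dx.ods.row.limit.thousands property (assumed here
    to be a count in thousands of events)."""
    batch_size = row_limit_thousands * 1000
    return [event_ids[i:i + batch_size]
            for i in range(0, len(event_ids), batch_size)]

# 2,500 pending events with a limit of 1 (thousand) yields three batches.
batches = batch_events(list(range(2500)), 1)
print([len(b) for b in batches])  # [1000, 1000, 500]
```

A smaller batch size reduces memory pressure per load cycle at the cost of more cycles; a larger batch size does the opposite, which is why the property is worth tuning under high event volume.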
You import the operational data store event loader to PowerCenter after you install the Data Integration Hub Dashboard and Reports component with the main Data Integration Hub installation. For more information, see the Data Integration Hub Installation and Configuration Guide.
If a PowerCenter session fails, the operational data store event workflow might not display a failed status. Monitor the PowerCenter session to verify the success of the run.