When Data Integration runs a mapping that contains Lookup transformations, it builds a cache in memory when it processes the first row of data in a cached Lookup transformation. If there are multiple Lookup transformations in a mapping, Data Integration creates the caches sequentially as the first row of data reaches each Lookup transformation. This slows Lookup transformation processing.
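Data Integration handles this caching internally, so there is nothing to code, but the following minimal Python sketch illustrates the sequential behavior. The lookup names and load times are invented for illustration; the point is that each cache is built only when the first row reaches that Lookup transformation, one cache after another.

```python
import time

# Hypothetical lookup sources and load times, for illustration only.
LOOKUP_SOURCES = {"customers": 2.0, "products": 1.5, "regions": 1.0}

caches = {}

def build_cache(name, load_seconds):
    """Simulate reading a lookup source into an in-memory cache."""
    time.sleep(load_seconds)              # stand-in for reading the lookup source
    caches[name] = f"<cache for {name}>"

def process_first_row(row):
    # Sequential behavior: each cache is built only when the first row
    # reaches that Lookup transformation, so the row waits on every build.
    for name, load_seconds in LOOKUP_SOURCES.items():
        if name not in caches:
            build_cache(name, load_seconds)   # row processing stalls here
    return row

start = time.perf_counter()
process_first_row({"id": 1})
print(f"first row delayed ~{time.perf_counter() - start:.1f}s while caches build")
```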
You can enable concurrent caches to improve performance. When you set the number of additional concurrent pipelines to one or more, Data Integration builds the caches concurrently rather than sequentially. Performance improves greatly when the task contains a number of active transformations that might take time to complete, such as Aggregator, Joiner, or Sorter transformations. When you enable multiple concurrent pipelines, Data Integration doesn't wait for active task runs to complete before it builds the cache. Other Lookup transformations in the pipeline also build caches concurrently.
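For contrast, this sketch illustrates the concurrent behavior with the same invented lookup sources. A small thread pool stands in for the additional concurrent pipelines, so all caches start building at once rather than one after another. This is only an illustration of the idea, not Data Integration's implementation.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical lookup sources and load times, for illustration only.
LOOKUP_SOURCES = {"customers": 2.0, "products": 1.5, "regions": 1.0}

def build_cache(name, load_seconds):
    """Simulate reading a lookup source into an in-memory cache."""
    time.sleep(load_seconds)              # stand-in for reading the lookup source
    return name, f"<cache for {name}>"

start = time.perf_counter()

# Concurrent behavior: with extra pipelines available, every lookup cache
# starts building immediately instead of waiting its turn.
with ThreadPoolExecutor(max_workers=3) as pool:   # 3 ~ additional concurrent pipelines
    futures = [pool.submit(build_cache, n, s) for n, s in LOOKUP_SOURCES.items()]
    caches = dict(f.result() for f in futures)

print(f"all caches ready after ~{time.perf_counter() - start:.1f}s "
      f"(vs ~{sum(LOOKUP_SOURCES.values()):.1f}s sequentially)")
```

The total wait drops from roughly the sum of the individual build times to roughly the longest single build, which is why the setting helps most when several lookups each take noticeable time to cache.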