Publish and subscribe to high volumes of data, data streams, and data that you want to store for a long period of time with Data Integration Hub. For example, store business intelligence data that you need to review over time in the Data Integration Hub Hadoop publication repository, or publish from and subscribe to Hadoop Distributed File System (HDFS) and Hive data warehouses.
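For example, after a publication writes a data set to a Hive data warehouse, a downstream consumer can query the published table directly. The following sketch is illustrative only: the host, database, and table names are placeholders rather than real Data Integration Hub objects, and it uses the third-party PyHive client, which is not part of Data Integration Hub.

    # Hypothetical example: a downstream consumer reads a data set that a
    # Data Integration Hub publication wrote to a Hive data warehouse.
    # The host, database, and table names below are placeholders.
    from pyhive import hive

    conn = hive.connect(host="hive-server.example.com", port=10000,
                        database="dih_publications")
    cursor = conn.cursor()
    # "bi_sales_history" stands in for a published business intelligence table.
    cursor.execute("SELECT region, SUM(revenue) FROM bi_sales_history GROUP BY region")
    for row in cursor.fetchall():
        print(row)
    conn.close()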
If you want to keep the published data in the Hadoop publication repository after all subscribers consume the data, you can configure Data Integration Hub not to delete published data from the repository.
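Because retained publications remain as files in the Hadoop publication repository, you can confirm that the data is still present after consumption by listing the repository over WebHDFS. This is a minimal sketch: the NameNode URL and repository path are placeholder values, and it uses the third-party hdfs Python client, which is not part of Data Integration Hub.

    # Hypothetical example: list files that remain in the Hadoop publication
    # repository after all subscribers have consumed the publication.
    # The NameNode URL, user, and repository path are placeholders.
    from hdfs import InsecureClient

    client = InsecureClient("http://namenode.example.com:9870", user="dih")
    for name in client.list("/dih/publication_repository/bi_sales_history"):
        print(name)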
You can use both automatic mappings and custom mappings to publish and consume big data with Data Integration Hub. For custom mapping publications, you can use Informatica Data Engineering Integration mappings and workflows and Informatica Data Engineering Streaming mappings. For custom mapping subscriptions, you use Informatica Data Engineering Integration mappings and workflows.