PowerExchange adapters. Connect to data sources and targets through adapters.
Enterprise Data Catalog. Perform data lineage analysis for big data sources and targets.
Enterprise Data Lake. Discover raw data and publish it to the data lake as a Hive table.
Data Quality. Perform address validation and data discovery.
Data Replication. Replicate change data to the Hadoop Distributed File System (HDFS).
Data Transformation. Process complex file sources from the Hadoop environment.
Big Data Streaming. Stream data as messages and process them as they become available.
Edge Data Streaming. Collect and ingest data in real time to a Kafka queue.
Dynamic Data Masking. Mask or prevent access to sensitive data.
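To illustrate the idea behind the last capability above, here is a minimal, generic sketch of data masking. This is not Informatica's API; the function names and masking rules are hypothetical and shown only to convey the concept of obscuring sensitive values while keeping them recognizable.

```python
import re

def mask_email(value: str) -> str:
    # Hypothetical rule: keep the first character of the local part
    # and the full domain, mask the rest of the local part.
    local, _, domain = value.partition("@")
    return local[0] + "*" * (len(local) - 1) + "@" + domain

def mask_card(value: str) -> str:
    # Hypothetical rule: strip separators, then mask all but the
    # last four digits of a card number.
    digits = re.sub(r"\D", "", value)
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_email("alice@example.com"))   # a****@example.com
print(mask_card("4111-1111-1111-1111"))  # ************1111
```

A production dynamic-masking product applies rules like these at query time, so the underlying stored data remains unchanged while unauthorized consumers see only the masked form.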