You can access log file data, sensor data, Supervisory Control and Data Acquisition (SCADA) data, message bus data, and programmable logic controller (PLC) data on the Spark engine in the Hadoop environment.
You can create physical data objects to access the different types of data. Based on the type of target you are writing to, you can create the following data objects:
AmazonKinesis
A physical data object that represents data in an Amazon Kinesis Firehose Delivery Stream. Create an AmazonKinesis data object to write to an Amazon Kinesis Firehose Delivery Stream.
Azure Event Hub
A physical data object that represents data in the Microsoft Azure Event Hubs data streaming platform and event ingestion service. Create an Azure Event Hub data object to connect to an Event Hub target.
Complex file
A representation of a file in the Hadoop file system. Create a complex file data object to write data to an HDFS sequence file or binary file.
For more information about complex file data objects, see the Informatica PowerExchange for HDFS User Guide.
HBase
A physical data object that represents data in an HBase resource. Create an HBase data object to connect to an HBase data target.
JMS
A physical data object that accesses a JMS server. Create a JMS data object to write to a JMS server.
Kafka
A physical data object that accesses a Kafka broker. Create a Kafka data object to write to a Kafka broker.
MapRStreams
A physical data object that represents data in a MapR Stream. Create a MapRStreams data object to write to a MapR Stream.
Microsoft Azure Data Lake
A physical data object that represents a Microsoft Azure Data Lake Store table. Create a Microsoft Azure Data Lake Store data object to write to a Microsoft Azure Data Lake Store table.
Relational
A physical data object that you can use to access a relational table. Create a relational data object to connect to a Hive or JDBC-compliant database.
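The relational data object itself is configured in the Developer tool, but as a rough illustration of what a mapping ultimately does when it writes streamed records to a JDBC-compliant table, the following sketch uses plain Python with the standard-library sqlite3 module as a stand-in for the relational target. The table and column names (sensor_readings, device_id, reading) are invented for illustration and do not come from the product documentation:

```python
import sqlite3

# Illustrative only: an in-memory SQLite database stands in for any
# JDBC-compliant relational target. Table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensor_readings (device_id TEXT, reading REAL)")

# Rows as they might arrive from a streaming source such as a PLC or sensor feed.
rows = [("plc-01", 21.5), ("plc-02", 19.8)]

# Batched insert into the relational table, then commit the transaction.
conn.executemany(
    "INSERT INTO sensor_readings (device_id, reading) VALUES (?, ?)",
    rows,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM sensor_readings").fetchone()[0]
print(count)  # 2
```

In a real deployment the engine handles batching, transactions, and type conversion according to the connection and data object properties; the sketch only shows the end result of a write to a relational target.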
For more information about relational data objects, see the