PowerExchange for Kafka User Guide for PowerCenter

Kafka Mapping Example

You run the IT department of a major bank that has millions of customers. You want to monitor network activity in real time. You need to collect network activity data from various sources such as firewalls or network devices to improve security and prevent attacks. The network activity data includes Denial of Service (DoS) attacks and failed login attempts made by customers. The network activity data is written to Kafka queues.
Create a mapping to read the network activity data from Kafka topics and write the data to Oracle for processing.
The following procedure shows how to move data from the Kafka topic to Oracle:
  1. Import a Kafka source.
  2. Import an Oracle target.
  3. Create a mapping with the Kafka source and an Oracle target.
    The following image shows the example mapping, in which the Kafka source definition is connected through the Source Qualifier to the Oracle target definition.
The mapping contains the following objects:
Source Definition
The mapping source definition is a Kafka topic that contains the network activity data.
In the Source Analyzer, import the Kafka topic that you want to read. The PowerCenter Integration Service reads network activity data from the Kafka topic.
The following table describes the structure of the Kafka source definition:
Field Name      Data Type
partitionId     Integer
key             Raw
TopicName       String
timestamp       Timestamp
data            Binary
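
Outside of PowerCenter, you can verify that these fields correspond to the metadata that Kafka exposes for each record by reading the topic with the standard Kafka consumer API. The following Java sketch is illustrative only; the broker address, consumer group, and the topic name network_activity are assumptions rather than values taken from this example.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class NetworkActivityPeek {
    public static void main(String[] args) {
        // Broker address, group ID, and topic name are assumptions for this sketch.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "network-activity-peek");
        props.put("key.deserializer", ByteArrayDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("network_activity"));
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<byte[], byte[]> record : records) {
                // Each record attribute maps to a field in the Kafka source definition:
                // partition -> partitionId, topic -> TopicName, timestamp -> timestamp,
                // key -> key, value -> data.
                System.out.printf("partitionId=%d topicName=%s timestamp=%d keyBytes=%d dataBytes=%d%n",
                        record.partition(),
                        record.topic(),
                        record.timestamp(),
                        record.key() == null ? 0 : record.key().length,
                        record.value() == null ? 0 : record.value().length);
            }
        }
    }
}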
Target Definition
The mapping contains an Oracle target definition. In the Target Designer, import an Oracle target definition.
The following table describes the structure of the Oracle target definition:
Field Name      Data Type
PARTITIONID     Number(p,s)
TOPICNAME       Varchar2
TIMESTAMP       Varchar2
DATA            Blob
KEY             Blob
Link ports between the Kafka Source Qualifier and the Oracle target definition to create a flow of data. In the Workflow Manager, create a session and add the mapping to the session. Create a workflow and add the session to the workflow.
When you run the workflow, the data is read from the Kafka queue and written to the Oracle target. You can then run queries on the Oracle database to analyze the network activity data.
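
As an example of such a query, the following Java sketch uses JDBC to count the events written per topic in the Oracle target. The connection details and the NETWORK_ACTIVITY table name are the same assumptions as in the earlier sketch.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class NetworkActivitySummary {
    public static void main(String[] args) throws Exception {
        // Connection details and table name are assumptions carried over from the DDL sketch.
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1";
        String sql = "SELECT TOPICNAME, COUNT(*) AS EVENT_COUNT "
                   + "FROM NETWORK_ACTIVITY GROUP BY TOPICNAME ORDER BY EVENT_COUNT DESC";
        try (Connection conn = DriverManager.getConnection(url, "app_user", "app_password");
             PreparedStatement stmt = conn.prepareStatement(sql);
             ResultSet rs = stmt.executeQuery()) {
            while (rs.next()) {
                // Print one line per topic with the number of network activity events stored.
                System.out.printf("%s: %d events%n",
                        rs.getString("TOPICNAME"), rs.getLong("EVENT_COUNT"));
            }
        }
    }
}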
