Confluent Kafka Dynamic Mapping

You can use Confluent Kafka data objects as dynamic sources and targets in a streaming mapping.
Confluent Kafka is a distributed publish-subscribe messaging system that is fast, scalable, and durable. Confluent Kafka topics are partitioned and replicated across multiple nodes, which allows distributed processing. You create a Confluent Kafka data object to read data from Kafka brokers or from Confluent Kafka brokers by using a schema registry.
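As background for what such a data object consumes, here is a minimal sketch, in Java, of a plain Confluent Kafka consumer that reads Avro records through a schema registry. The broker address, registry URL, topic name, and consumer group shown are illustrative assumptions, not values from this guide.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class ConfluentAvroConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");           // assumed broker address
            props.put("group.id", "network-activity-monitor");          // assumed consumer group
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            // The Confluent Avro deserializer looks up each record's schema in the registry.
            props.put("value.deserializer",
                    "io.confluent.kafka.serializers.KafkaAvroDeserializer");
            props.put("schema.registry.url", "http://localhost:8081");  // assumed registry URL
            try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("network-activity"));
                while (true) {
                    ConsumerRecords<String, Object> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, Object> record : records) {
                        // Values arrive as Avro GenericRecord objects built from the registered schema.
                        System.out.println(record.value());
                    }
                }
            }
        }
    }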
After you create the Confluent Kafka data objects, create a Confluent Kafka streaming mapping that dynamically accommodates changes to the source, target, and transformation logic at run time. Configure rules, parameters, and transformation properties to create the dynamic mapping. A dynamic Confluent Kafka streaming mapping can manage frequent schema or metadata changes and can reuse the mapping logic for data sources with different schemas.
If the data source for a source or target changes, you can configure a mapping to get metadata changes at run time. Configure the Read or Write transformation to accommodate the source or target changes dynamically.
You do not need to manually synchronize the data object and update each transformation before you run the mapping again. The Data Integration Service dynamically determines the transformation ports, the transformation logic in the ports, and the port links within the mapping.
To enable a streaming mapping to run dynamically, configure it to get data object columns from the data source at run time.
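As a rough illustration of that run-time metadata refresh, the following sketch uses the Confluent schema registry client to fetch the latest Avro schema for a subject and list its fields, the kind of column metadata a dynamic mapping derives at run time. The registry URL and subject name are assumptions, and this is a conceptual sketch, not the internal mechanism of the Data Integration Service.

    import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
    import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
    import org.apache.avro.Schema;

    public class RuntimeColumns {
        public static void main(String[] args) throws Exception {
            // Assumptions: a registry at localhost:8081 and a subject named with the
            // default topic-name strategy ("<topic>-value").
            CachedSchemaRegistryClient registry =
                    new CachedSchemaRegistryClient("http://localhost:8081", 100);
            SchemaMetadata latest = registry.getLatestSchemaMetadata("network-activity-value");
            Schema schema = new Schema.Parser().parse(latest.getSchema());
            // Each Avro field corresponds to a column that the mapping would expose at run time.
            for (Schema.Field field : schema.getFields()) {
                System.out.println(field.name() + " : " + field.schema().getType());
            }
        }
    }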
For information about dynamic mappings, see the Informatica Developer Mapping Guide.

Example

You run the IT department of a major bank that has millions of customers. You want to monitor network activity in real time. You need to collect network activity data from sources such as firewalls or network devices to improve security and prevent attacks. The network activity data includes Denial of Service (DoS) attacks and failed login attempts made by customers. The network activity data is written to multiple Confluent Kafka topics with different schema definitions.
You can create one streaming mapping at design time. Configure the streaming mapping to refresh the Confluent Kafka schema registry so that the mapping adjusts to different sources and targets at run time. The latest schema is fetched from the Confluent schema registry and used to synchronize the sources and targets.
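A minimal sketch of that refresh cycle, assuming a local schema registry and the default topic-name subject naming: it polls for the latest registered schema version and reports when a new version appears, which is the point at which a dynamic mapping would resynchronize its sources and targets. The subject name and polling interval are illustrative.

    import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
    import io.confluent.kafka.schemaregistry.client.SchemaMetadata;

    public class SchemaChangeWatcher {
        public static void main(String[] args) throws Exception {
            CachedSchemaRegistryClient registry =
                    new CachedSchemaRegistryClient("http://localhost:8081", 100);  // assumed URL
            String subject = "network-activity-value";  // assumed subject name
            int knownVersion = -1;
            while (true) {
                SchemaMetadata latest = registry.getLatestSchemaMetadata(subject);
                if (latest.getVersion() != knownVersion) {
                    // A new schema version was registered; this is where a dynamic
                    // mapping would refresh its source and target columns.
                    System.out.println("Subject " + subject + " is now at version "
                            + latest.getVersion());
                    knownVersion = latest.getVersion();
                }
                Thread.sleep(30_000);  // poll every 30 seconds
            }
        }
    }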
