You cannot configure a mapping to read data in Realtime mode.
When you create a mapping in advanced mode, you cannot preview data for individual transformations to test the mapping logic.
You cannot configure a Lookup transformation in a mapping in advanced mode.
When you configure one-way or two-way SSL authentication to connect to a Kafka broker, ensure that the truststore or keystore file name doesn't contain spaces or UTF-8 characters.
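As a pre-flight check, you can validate the store file name before configuring the connection. This is a hedged sketch, not part of the connector; the helper name `is_valid_store_name` is hypothetical.

```python
import os

def is_valid_store_name(path: str) -> bool:
    """Return True when the truststore/keystore file name contains no
    spaces and only ASCII characters, per the limitation above."""
    name = os.path.basename(path)
    return " " not in name and name.isascii()

print(is_valid_store_name("/opt/certs/kafka_truststore.jks"))  # True
print(is_valid_store_name("/opt/certs/kafka truststore.jks"))  # False
```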
When you write data into an existing Kafka target in Avro format and import the schema with an Avro schema file, the Secure Agent ignores the schema in the Avro schema file and uses an automatically generated Avro schema to write data into the Kafka target.
When you use the Confluent schema registry to import Avro metadata, you cannot write data into an existing Kafka target. To write data to a Kafka topic in Avro format, create a target at runtime.
You cannot use parameterized sources when you select the discover structure format.
When you use an intelligent structure model with an Avro schema file and the Avro input file doesn't match or only partially matches the Avro schema file, data loss might occur.
When you use an intelligent structure model with an Avro schema file and the input file contains more columns than the Avro schema file, Intelligent Structure Discovery doesn't assign the data to an Unassigned Data field.
When you use an intelligent structure model with an Avro schema file and there is a data type mismatch between the input file and the schema file, Intelligent Structure Discovery doesn't assign the data to an Unassigned Data field.
When you use an intelligent structure model in a source, use only transformations that support hierarchical data types. Otherwise, the mapping fails with the following error: The transformation does not support fields that contain hierarchical data.
When you select Avro as the format type, you cannot preview data.
When you select Avro as the format type for an existing target and configure a schema with columns of primitive data types, the default schema contains a field of Union data type with a primitive data type and a Null data type.
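To illustrate what such a union field looks like, the sketch below builds a hypothetical default schema in which each primitive column is wrapped in a union of "null" and the primitive type. The record name and field names are assumptions, not the connector's exact output.

```python
import json

# Hypothetical default Avro schema for a target with primitive columns:
# each field's type becomes a union of "null" and the primitive type.
default_schema = {
    "type": "record",
    "name": "kafka_target",  # assumed record name
    "fields": [
        {"name": "id", "type": ["null", "int"]},
        {"name": "name", "type": ["null", "string"]},
    ],
}
print(json.dumps(default_schema["fields"][0]["type"]))  # ["null", "int"]
```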
When you import a Kafka target, ensure that there are no hierarchical fields in the target. To write data to a hierarchical field, create a target at runtime.
When you monitor a mapping run in advanced mode, the My Jobs page does not display the number of rows that the Spark job processed.
When you read Boolean data in JSON format from an Amazon S3 source and write it to a Kafka target, the data is written as an Integer data type.
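The effect of that conversion can be mimicked in plain Python: a JSON Boolean ends up as 1 or 0 in the target. This is an illustration of the observed behavior, not the connector's code path.

```python
import json

# A JSON Boolean parsed from the source...
record = json.loads('{"active": true}')

# ...is written to the target as an integer; int(True) == 1 mirrors that.
print(int(record["active"]))  # 1
```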
When you parameterize the Custom Start Position Timestamp source advanced property with an incorrect parameterization format such as $$$<parameter_name>, the mapping fails with an irrelevant error instead of a parameterization format error: The Custom Start Position Timestamp cannot be empty when Start position offset is selected as [Custom] in the source data object [Source].
When you parameterize the Custom Start Position Timestamp source advanced property with an incorrect parameterization format such as $<parameter_name>, the mapping doesn't fail.
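Because neither malformed variant produces a clear parameterization error, it can help to lint parameter references yourself. The sketch below assumes the valid in-out parameter form is $$name with a leading letter; the regex and names are illustrative, not part of the product.

```python
import re

# Assumed valid form: "$$" followed by an identifier starting with a letter.
VALID_PARAM = re.compile(r"^\$\$[A-Za-z]\w*$")

# "$$$name" and "$name" are the malformed variants described above.
for candidate in ("$$start_ts", "$$$start_ts", "$start_ts"):
    print(candidate, bool(VALID_PARAM.match(candidate)))
```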
If the JSON data that you read from a source fails to align with the source schema defined in the schema definition file, the data written to the target appears corrupted.
You can't write the header data from the source to the header field in the target for mappings in advanced mode.