Table of Contents

  1. Preface
  2. Part 1: Versions 10.5 - 10.5.0.1
  3. Part 2: Versions 10.4.1 - 10.4.1.3
  4. Part 3: Versions 10.4 - 10.4.0.2
  5. Part 4: Versions 10.2.2 - 10.2.2 HotFix 1
  6. Part 5: Version 10.2.1
  7. Part 6: Version 10.2 - 10.2 HotFix 2

What's New and Changed (10.5.0.1)

Mapping Audits

You can create an audit to validate the consistency and accuracy of data that is processed in a mapping.
An audit is composed of rules and conditions. Use a rule to compute an aggregated value for a single column of data. Use a condition to make comparisons between multiple rules or between a rule and constant values.
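As a conceptual illustration of how rules and conditions fit together, the following Python sketch shows the idea behind an audit. This is not the Informatica audit API; the column name, sample data, and helper functions are hypothetical and only model a rule as an aggregate over one column and a condition as a comparison between rules.

  # Conceptual sketch only -- not the Informatica audit API.
  # A rule computes an aggregated value for a single column;
  # a condition compares rules (or a rule and a constant).

  def rule_count(rows):
      """Rule: count the rows processed."""
      return len(rows)

  def rule_sum(rows, column):
      """Rule: aggregate a single column (here, SUM)."""
      return sum(row[column] for row in rows)

  # Hypothetical source and target data handled by a mapping.
  source_rows = [{"amount": 100}, {"amount": 250}, {"amount": 50}]
  target_rows = [{"amount": 100}, {"amount": 250}, {"amount": 50}]

  # Condition: the audit passes only if the aggregated values
  # match between the read and write sides of the mapping.
  audit_passed = (
      rule_count(source_rows) == rule_count(target_rows)
      and rule_sum(source_rows, "amount") == rule_sum(target_rows, "amount")
  )
  print("Audit passed:", audit_passed)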
You can configure audits for the following mappings that run in the native environment or on the Spark engine:
  • Read operations in Amazon S3, JDBC V2, Microsoft Azure SQL Data Warehouse, and Snowflake mappings.
  • Read operations for complex files such as Avro, Parquet, and JSON in HDFS mappings.
  • Read and write operations in Hive and Oracle mappings.
For more information, see the Data Engineering Integration 10.5 User Guide.
