Table of Contents

  1. Preface
  2. Part 1: Version 10.5.4
  3. Part 2: Version 10.5.3 - 10.5.3.x
  4. Part 3: Version 10.5.2 - 10.5.2.1.x
  5. Part 4: Version 10.5.1 - 10.5.1.1
  6. Part 5: Versions 10.5 - 10.5.0.1
  7. Part 6: Versions 10.4.1 - 10.4.1.3
  8. Part 7: Versions 10.4 - 10.4.0.2
  9. Part 8: Versions 10.2.2 - 10.2.2 HotFix 1
  10. Part 9: Version 10.2.1
  11. Part 10: Version 10.2 - 10.2 HotFix 2

What's New and Changed (10.5.4)

Mapping Audits

You can create an audit to validate the consistency and accuracy of data that is processed in a mapping.
An audit is composed of rules and conditions. Use a rule to compute an aggregated value for a single column of data. Use a condition to make comparisons between multiple rules or between a rule and constant values.
You can configure audits for the following mappings that run in the native environment or on the Spark engine:
  • Read operations in Amazon S3, JDBC V2, Microsoft Azure SQL Data Warehouse, and Snowflake mappings.
  • Read operations for complex files such as Avro, Parquet, and JSON in HDFS mappings.
  • Read and write operations in Hive and Oracle mappings.
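The rule-and-condition relationship can be illustrated with a small sketch. This is not how audits are actually built (they are configured in the Informatica tooling, not written as code), and the helper name `audit_rule` is hypothetical; the sketch only shows a rule aggregating one column and a condition comparing two rules:

```python
# Hypothetical illustration of mapping-audit concepts; audit_rule is an
# invented helper, not part of any Informatica API.

def audit_rule(rows, column, aggregate):
    """A rule computes an aggregated value for a single column."""
    return aggregate(row[column] for row in rows)

# Sample data standing in for rows read from a source and written to a target.
source_rows = [{"amount": 100}, {"amount": 250}, {"amount": 50}]
target_rows = [{"amount": 100}, {"amount": 250}, {"amount": 50}]

# Two rules: the total of 'amount' as read, and the total as written.
source_total = audit_rule(source_rows, "amount", sum)
target_total = audit_rule(target_rows, "amount", sum)

# A condition compares rules with each other, or a rule with a constant.
audit_passed = source_total == target_total and source_total > 0
print(audit_passed)
```

Here the condition combines a rule-to-rule comparison (source total equals target total) with a rule-to-constant comparison (total greater than zero), which mirrors the two comparison forms described above.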
For more information, see the Data Engineering Integration 10.5 User Guide.
