You can create an audit to validate the consistency and accuracy of data that is processed in a mapping.
An audit is composed of rules and conditions. Use a rule to compute an aggregated value for a single column of data. Use a condition to make comparisons between multiple rules or between a rule and constant values.
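The rule-and-condition idea can be sketched in plain Python. This is purely illustrative and is not the product's configuration interface or API; the `rule` helper, the `amount` column, and the sample row sets are hypothetical. A rule aggregates one column, and a condition compares two rules (or a rule and a constant):

```python
# Conceptual sketch of an audit, assuming hypothetical source/target row sets.
# A "rule" computes an aggregated value for a single column; a "condition"
# compares rules or a rule against a constant.

def rule(rows, column, aggregate):
    """Apply an aggregate function (e.g. sum, min, max) to one column."""
    return aggregate(row[column] for row in rows)

# Hypothetical rows read and written by a mapping.
source = [{"amount": 100}, {"amount": 250}, {"amount": 50}]
target = [{"amount": 100}, {"amount": 250}, {"amount": 50}]

# Rule: total of the "amount" column in each data set.
source_total = rule(source, "amount", sum)
target_total = rule(target, "amount", sum)

# Condition: compare the two rules to check that no amounts were lost.
audit_passed = source_total == target_total
print(audit_passed)  # True when the totals match
```

In the product, the audit fails the run (or logs a warning, depending on configuration) when a condition evaluates to false; the sketch above only captures the rule/condition semantics, not that behavior.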
You can configure audits for the following mappings that run in the native environment or on the Spark engine:
- Read operations in Amazon S3, JDBC V2, Microsoft Azure SQL Data Warehouse, and Snowflake mappings.
- Read operations for complex files such as Avro, Parquet, and JSON in HDFS mappings.
- Read and write operations in Hive and Oracle mappings.