Table of Contents

  1. Preface
  2. Part 1: Versions 10.5.2 - 10.5.2.1.x
  3. Part 2: Versions 10.5.1 - 10.5.1.1
  4. Part 3: Versions 10.5 - 10.5.0.1
  5. Part 4: Versions 10.4.1 - 10.4.1.3
  6. Part 5: Versions 10.4 - 10.4.0.2
  7. Part 6: Versions 10.2.2 - 10.2.2 HotFix 1
  8. Part 7: Version 10.2.1
  9. Part 8: Versions 10.2 - 10.2 HotFix 2

What's New and Changed (10.5.2.1)

Processing Invalid Values

Effective in version 10.5, the Spark engine writes NULL values to the target when the mapping contains certain invalid values.
The Spark engine writes NULL values in the following situations:
  • The terms argument in the PV, FV, PMT, and RATE finance functions passes a value of 0. The value of terms must be an integer greater than 0.
  • The month argument in the MAKE_DATE_TIME function passes an invalid value. The value of month must be from 1 to 12.
Previously, the Spark engine wrote random values.
If you want rows with invalid values to be rejected rather than written to the target as NULL, run the mapping in the native environment, where the Data Integration Service rejects those rows.
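
The following is a minimal Python sketch, not the Spark engine's or the Data Integration Service's actual implementation; the function names, signatures, and checks are assumptions for illustration only. It shows why a terms value of 0 and an out-of-range month cannot produce a valid result, using None to stand in for the NULL value that the Spark engine writes.

    import datetime

    def pmt(rate, terms, present_value):
        # Standard annuity payment formula: rate * PV / (1 - (1 + rate) ** -terms).
        # With terms = 0 the denominator is zero, so no valid payment exists.
        if terms <= 0:
            return None  # stands in for the NULL that the Spark engine writes
        return rate * present_value / (1 - (1 + rate) ** -terms)

    def make_date_time(year, month, day):
        # A month outside the range 1 to 12 cannot form a valid date.
        if not 1 <= month <= 12:
            return None  # stands in for the NULL that the Spark engine writes
        return datetime.datetime(year, month, day)

    print(pmt(0.05, 0, 10000))          # None: terms must be greater than 0
    print(pmt(0.05, 12, 10000))         # about 1128.25: a valid payment
    print(make_date_time(2024, 13, 1))  # None: month must be from 1 to 12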
