Table of Contents

  1. Preface
  2. Part 1: Version 10.5.3 - 10.5.3.x
  3. Part 2: Version 10.5.2 - 10.5.2.1.x
  4. Part 3: Version 10.5.1 - 10.5.1.1
  5. Part 4: Versions 10.5 - 10.5.0.1
  6. Part 5: Versions 10.4.1 - 10.4.1.3
  7. Part 6: Versions 10.4 - 10.4.0.2
  8. Part 7: Versions 10.2.2 - 10.2.2 HotFix 1
  9. Part 8: Version 10.2.1
  10. Part 9: Version 10.2 - 10.2 HotFix 2

What's New and Changed (10.5.3)

Upgrading to Version 10.4.0

Installer Changes

Effective in version 10.4.0, the Informatica installer has the following changes:
  • You can run the 10.4.0 installer to install Data Engineering, Data Catalog, and traditional products. While you can install traditional products in the same domain with Data Engineering and Data Catalog products, Informatica recommends that you install the traditional products in separate domains.
  • You can run the 10.4.0 installer to upgrade Data Engineering, Data Catalog, and traditional products.
  • When you create a domain, you can choose to create the PowerCenter Repository Service and the PowerCenter Integration Service.
Effective in version 10.4.0, the Informatica upgrade has the following change:
  • The Search Service creates a new index folder and re-indexes search objects. You do not need to perform the re-index after you upgrade.

Running Mappings with High or Low Precision

Effective in version 10.2.2, mappings that use the Spark engine run in high precision mode by default.
You can disable the high precision setting in the Developer tool, in the advanced settings of the mapping run-time configuration:
  1. In the Developer tool, select Window > Preferences.
  2. Select Informatica > Run Configurations > Mapping.
  3. Select the Advanced tab.
  4. Deselect High precision.
You might want to do this to run mappings on the Spark engine that were developed in an earlier release and that you prefer to run with low precision. In some cases, such as a mapping in which scale is not specified, mappings fail at the default high precision setting and require low precision to run.
Effective in version 10.4.0, the following additional changes affect mappings that run on the Spark engine:
  • Decimal to string conversion no longer appends a decimal to integers. For example, the value of integer 1 in decimal format remains "1" in string format.
  • In decimal to string conversion, trailing zeroes after the decimal point are trimmed. For example, the value of 1.000 in decimal format is rendered as "1" in string format.
  • When you run a mapping in low precision mode, decimal to string conversion for a precision setting of more than 15 results in exponential notation.
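The conversion rules above can be sketched in Python. This is an illustrative approximation only, not Informatica code: the `decimal_to_string` helper, its parameters, and the precision-15 cutoff logic are assumptions modeled on the behaviors listed in these release notes.

```python
from decimal import Decimal

def decimal_to_string(value: Decimal, precision: int, high_precision: bool = True) -> str:
    """Hypothetical sketch of the 10.4.0 decimal-to-string behavior on Spark."""
    # In low precision mode, a precision setting above 15 results in
    # exponential notation.
    if not high_precision and precision > 15:
        return format(float(value), "e")
    # Trailing zeroes after the decimal point are trimmed, and no decimal
    # point is appended to integer values.
    normalized = value.normalize()
    # Decimal.normalize can return exponent form (e.g. 1E+1 for 10),
    # so render it back to plain fixed-point notation.
    return format(normalized, "f")

print(decimal_to_string(Decimal("1"), 10))      # "1", not "1.0"
print(decimal_to_string(Decimal("1.000"), 10))  # "1", trailing zeroes trimmed
print(decimal_to_string(Decimal("123456789012345678"), 20, high_precision=False))
```

The last call shows the low precision case: with a precision setting greater than 15, the value is rendered in exponential notation rather than as a plain integer string.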
