Upgrading from Version 10.1 (10.2.2 HotFix 1)

Update Mappings for Dropped Hive Engine Support

After you upgrade, update mappings that have the Hive engine configured in the Hadoop validation environment. Run a series of infacmd commands to remove the Hive engine from the mapping configuration. Informatica continues to support the Blaze and Spark engines in the Hadoop environment.
Run commands using the following infacmd plugins.
  • infacmd dis plugin
    Run commands with the dis plugin to update mappings that are deployed to the Data Integration Service. For example, dis enableMappingValidationEnvironment.
  • infacmd mrs plugin
    Run commands with the mrs plugin to update mappings that are not deployed to the Data Integration Service. For example, mrs enableMappingValidationEnvironment.
When you run the commands, the -sn (Service Name) parameter depends on the plugin that you use. Use the name of the Data Integration Service when you run dis commands, and use the name of the Model Repository Service when you run mrs commands.
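For example, switching between the two plugins changes only the plugin keyword and the -sn value. The following sketch prints the two command variants rather than running them; the domain and user values follow the samples in this section, while the service names My_DIS and My_MRS are placeholders for your own Data Integration Service and Model Repository Service names:

```shell
# Print the listMappingEngines command for a given plugin and service name.
# My_DIS and My_MRS below are placeholder service names; substitute your own.
list_hive_mappings_cmd() {
  printf 'infacmd.sh %s listMappingEngines -dn domain_3987 -un Administrator -pd Password -vef hive -sn %s\n' "$1" "$2"
}

# dis plugin: pass the name of the Data Integration Service.
list_hive_mappings_cmd dis My_DIS
# mrs plugin: pass the name of the Model Repository Service.
list_hive_mappings_cmd mrs My_MRS
```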
Run the following commands against both the dis and the mrs plugins.
listMappingEngines
To identify mappings that have the Hive engine configured for validation, run the listMappingEngines command. Consider the following sample syntax:
mrs|dis listMappingEngines -dn domain_3987 -un Administrator -pd Password -vef hive -sn SN_3986
enableMappingValidationEnvironment
If you want to enable other validation environments, run the enableMappingValidationEnvironment command once for each environment that you want to enable. You can enable the following environments: native, blaze, spark, or spark-databricks. The following examples show sample syntax with different command filters:
  • Modify all mappings.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619
  • Modify mappings based on mapping name.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619 -mnf m_nav327,m_nav376
  • Modify mappings based on execution environment, mapping name, and project name.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619 -eef hadoop -mnf m_nav327,m_nav376 -pn project1
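Because each run of the command enables one environment, enabling several environments means one invocation per environment. The following sketch prints one command per target engine, reusing the placeholder names from the samples above; it prints the commands instead of executing them:

```shell
# Print one enableMappingValidationEnvironment command per target engine.
# Names match the samples in this section; commands are printed, not run.
print_enable_cmds() {
  for ve in "$@"; do
    printf 'infacmd.sh mrs enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve %s -cn HADOOP_cco_hdp619\n' "$ve"
  done
}

print_enable_cmds blaze spark
```

Note that the loop passes the same -cn (connection name) value for every engine; for the native environment a Hadoop connection does not apply, so check the Informatica Command Reference before reusing the loop as-is.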
setMappingExecutionEnvironment
If you want to change the execution environment, run the setMappingExecutionEnvironment command. Consider the following sample syntax based on a mapping name filter:
mrs|dis setMappingExecutionEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ee Databricks -mnf m_nav327,m_nav376 -cn DATABRICKS_cco_db619
disableMappingValidationEnvironment
Run the disableMappingValidationEnvironment command to disable the Hive engine in the Hadoop validation environment for all mappings in the Model repository. Consider the following sample syntax:
mrs|dis disableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve hive
listMappingEngines
Run the listMappingEngines command again to verify that there are no mappings with a Hive validation environment.
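Taken together, the procedure is: list the mappings validated for Hive, enable a replacement engine, disable the Hive engine, and list again to confirm that nothing remains. The following sketch prints that four-step sequence for one plugin, using spark as the replacement engine and the placeholder names from the samples above; the -sn value is a parameter because mrs and dis runs name different services:

```shell
# Print the four-step update sequence for one plugin (mrs or dis).
# Commands are printed, not executed; names follow the samples above.
print_update_sequence() {
  plugin="$1"
  sn="$2"
  common="-dn domain_3987 -un Administrator -pd Password -sn $sn"
  printf 'infacmd.sh %s listMappingEngines %s -vef hive\n' "$plugin" "$common"
  printf 'infacmd.sh %s enableMappingValidationEnvironment %s -ve spark -cn HADOOP_cco_hdp619\n' "$plugin" "$common"
  printf 'infacmd.sh %s disableMappingValidationEnvironment %s -ve hive\n' "$plugin" "$common"
  printf 'infacmd.sh %s listMappingEngines %s -vef hive\n' "$plugin" "$common"
}

# mrs run uses the Model Repository Service name from the samples;
# the dis run would use your Data Integration Service name (placeholder).
print_update_sequence mrs SN_3986
print_update_sequence dis My_DIS
```

The final listMappingEngines call should return no mappings once the Hive engine has been disabled everywhere.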
For information about the commands, see the Informatica Command Reference.

Warnings

Consider the following points of failure if you do not update the environments:
  • Mappings fail at run time if configured with the Hive engine as the only validation environment.
  • If you edit a mapping in the Developer tool that has the Hive engine as its only validation environment, the mapping loses its Hadoop connection. You need to set the validation environments and select the Hadoop connection again. This can happen when you upgrade from a previous version or when you import a mapping from a previous version.