Table of Contents

  1. Preface
  2. Upgrade Overview
  3. Before You Upgrade the Domain on Linux
  4. Before You Upgrade the Domain on UNIX
  5. Before You Upgrade the Domain on Windows
  6. Prepare for the Upgrade
  7. Upgrade the Domain
  8. Upgrade the Domain with Changes to Node Configuration
  9. Before You Upgrade the Application Services
  10. Application Service Upgrade
  11. Informatica Client Upgrade
  12. After You Upgrade
  13. Appendix A: Upgrade Checklist
  14. Appendix B: Managing Distribution Packages

Upgrading from Version 10.2 (10.5.4)

Update Mappings for Dropped Hive Engine Support

After you upgrade, update mappings that have the Hive engine configured in the Hadoop validation environment. Run a series of infacmd commands to remove the Hive engine configuration from those mappings. Informatica continues to support the Blaze and Spark engines in the Hadoop environment.
Run commands using the following infacmd plugins:
  • infacmd dis plugin
    Run commands with the dis plugin to update mappings that are deployed to the Data Integration Service. For example, dis enableMappingValidationEnvironment.
  • infacmd mrs plugin
    Run commands with the mrs plugin to update mappings that are not deployed to the Data Integration Service. For example, mrs enableMappingValidationEnvironment.
When you run the commands, the -sn (Service Name) parameter depends on the plugin that you use. Use the name of the Data Integration Service when you run dis commands, and use the name of the Model Repository Service when you run mrs commands.
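For example, the same listMappingEngines call run from a Linux shell might look like the following for each plugin. This is a sketch: the infacmd.sh prefix assumes the command is run from the Informatica installation's isp/bin directory or that infacmd.sh is on the PATH, and the service names DIS_prod and MRS_prod are hypothetical placeholders for your own Data Integration Service and Model Repository Service names:
infacmd.sh dis listMappingEngines -dn domain_3987 -un Administrator -pd Password -vef hive -sn DIS_prod
infacmd.sh mrs listMappingEngines -dn domain_3987 -un Administrator -pd Password -vef hive -sn MRS_prod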
Run the following commands with both the dis and the mrs plugins.
listMappingEngines
To identify mappings that have the Hive engine configured for validation, run the listMappingEngines command. Consider the following sample syntax:
mrs|dis listMappingEngines -dn domain_3987 -un Administrator -pd Password -vef hive -sn SN_3986
enableMappingValidationEnvironment
If you want to enable other validation environments, run the enableMappingValidationEnvironment command for each environment that you want to enable. You can enable the following environments: native, blaze, spark, or spark-databricks. Consider the following sample syntax examples based on different command filters. A combined example that enables more than one environment follows the list.
  • Modify all mappings.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619
  • Modify mappings based on mapping name.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619 -mnf m_nav327,m_nav376
  • Modify mappings based on execution environment, mapping name, and project name.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619 -eef hadoop -mnf m_nav327,m_nav376 -pn project1
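For example, to enable both the Blaze and the Spark validation environments for the same mappings, you might run the command once for each engine. This is a sketch that reuses the sample domain, service, connection, and mapping names from the examples above and assumes that the Blaze environment takes the same Hadoop connection parameter as the Spark environment:
mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve blaze -cn HADOOP_cco_hdp619 -mnf m_nav327,m_nav376
mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619 -mnf m_nav327,m_nav376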
setMappingExecutionEnvironment
If you want to change the execution environment, run the setMappingExecutionEnvironment command. Consider the following sample syntax based on a mapping name filter:
mrs|dis setMappingExecutionEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ee Databricks -mnf m_nav327,m_nav376 -cn DATABRICKS_cco_db619
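Similarly, if the mappings should stay in the Hadoop execution environment and run on the Blaze or Spark engine, a sketch such as the following could be used. It assumes that Hadoop is an accepted value for the -ee option and reuses the sample Hadoop connection from the enableMappingValidationEnvironment examples:
mrs|dis setMappingExecutionEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ee Hadoop -mnf m_nav327,m_nav376 -cn HADOOP_cco_hdp619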
disableMappingValidationEnvironment
Update all mappings in the Model repository to remove the Hive engine from the Hadoop validation environment. Consider the following sample syntax:
mrs|dis disableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve hive
listMappingEngines
Run the listMappingEngines command again to verify that there are no mappings with a Hive validation environment.
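Putting the steps together, a minimal pass with the mrs plugin might look like the following sketch. It assumes a Linux shell, infacmd.sh on the PATH, and the hypothetical Model Repository Service name MRS_prod used earlier:
# List mappings that still have the Hive engine configured for validation.
infacmd.sh mrs listMappingEngines -dn domain_3987 -un Administrator -pd Password -sn MRS_prod -vef hive
# Enable the Spark validation environment for all mappings.
infacmd.sh mrs enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn MRS_prod -ve spark -cn HADOOP_cco_hdp619
# Disable the Hive validation environment.
infacmd.sh mrs disableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn MRS_prod -ve hive
# Verify that no mappings with a Hive validation environment remain.
infacmd.sh mrs listMappingEngines -dn domain_3987 -un Administrator -pd Password -sn MRS_prod -vef hive
# Repeat the same sequence with the dis plugin for mappings that are deployed to the Data Integration Service.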
For information about the commands, see the Informatica Command Reference.

Warnings

Consider the following points of failure if you do not update the environments:
  • Mappings fail at run time if configured with the Hive engine as the only validation environment.
  • If you edit the validation environment of a mapping in the Developer tool and the Hive engine is the only validation environment, the mapping loses its Hadoop connection. You need to set the validation environments and select the Hadoop connection again. This can happen when you upgrade from a previous version or when you import a mapping from a previous version.
