How to Migrate Mappings from the Hive Engine

Update Mappings for Dropped Hive Engine Support

After you upgrade to version 10.2.2, you need to update mappings that have the Hive engine configured in the Hadoop validation environment. Run a series of infacmd commands to change the Hive engine configuration in those mappings. Informatica continues to support the Blaze and Spark engines in the Hadoop environment.
Run commands using the following infacmd plugins.
  • infacmd dis plugin. Run commands with the dis plugin to update mappings that are deployed to the Data Integration Service. For example, dis enableMappingValidationEnvironment.
  • infacmd mrs plugin. Run commands with the mrs plugin to update mappings that are not deployed to the Data Integration Service. For example, mrs enableMappingValidationEnvironment.
When you run the commands, the -sn (Service Name) parameter depends on the plugin that you use. Use the name of the Data Integration Service when you run dis commands, and use the name of the Model Repository Service when you run mrs commands.
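For example, assuming a Data Integration Service named DIS_example and a Model Repository Service named MRS_example (hypothetical names), the same operation takes a different -sn value for each plugin:
dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -ve spark -cn HADOOP_cco_hdp619 -sn DIS_example
mrs enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -ve spark -cn HADOOP_cco_hdp619 -sn MRS_example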
Run the following commands against both the dis and the mrs plugins.
listMappingEngines
To identify mappings that have the Hive engine configured for validation, run the listMappingEngines command with the -vef parameter set to hive. Consider the following sample syntax:
mrs|dis listMappingEngines -dn domain_3987 -un Administrator -pd Password -vef hive -sn SN_3986
For more information, see dis listMappingEngines and mrs listMappingEngines.
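As a sketch, assuming a Linux installation where infacmd.sh is on the PATH and the domain, user, and password values from the sample above (the service names DIS_example and MRS_example are hypothetical), you can save the list of Hive-configured mappings from both plugins for review before you change anything. The output format is not shown in this article, so inspect the files manually:

# Hypothetical values reused from the samples in this article; adjust to your environment.
INFA_DOMAIN=domain_3987
INFA_USER=Administrator
INFA_PASS=Password

# Deployed mappings, listed through the Data Integration Service.
infacmd.sh dis listMappingEngines -dn $INFA_DOMAIN -un $INFA_USER -pd $INFA_PASS -sn DIS_example -vef hive > dis_hive_mappings.txt

# Mappings in the Model repository, listed through the Model Repository Service.
infacmd.sh mrs listMappingEngines -dn $INFA_DOMAIN -un $INFA_USER -pd $INFA_PASS -sn MRS_example -vef hive > mrs_hive_mappings.txt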
enableMappingValidationEnvironment
If you want to enable other validation environments, run the enableMappingValidationEnvironment command for each environment that you want to enable. You can enable the following environments: native, blaze, spark, or spark-databricks. Consider the following sample syntax based on different command filters, followed by a combined sketch:
  • Modify all mappings.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619
  • Modify mappings based on mapping name.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619 -mnf m_nav327,m_nav376
  • Modify mappings based on execution environment, mapping name, and project name.
    mrs|dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve spark -cn HADOOP_cco_hdp619 -eef hadoop -mnf m_nav327,m_nav376 -pn project1
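The following sketch combines the dis and mrs variants to enable both the Blaze and the Spark validation environments for the two mappings named in the samples above. The service names DIS_example and MRS_example are hypothetical, and the Hadoop connection and mapping names are reused from the samples; adjust them to your environment:

# Enable Blaze and Spark validation for the same mappings through both plugins.
for ENGINE in blaze spark; do
    infacmd.sh dis enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn DIS_example -ve $ENGINE -cn HADOOP_cco_hdp619 -mnf m_nav327,m_nav376
    infacmd.sh mrs enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn MRS_example -ve $ENGINE -cn HADOOP_cco_hdp619 -mnf m_nav327,m_nav376
done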
setMappingExecutionEnvironment
If you want to change the execution environment, run the setMappingExecutionEnvironment command. Consider the following sample syntax based on a mapping name filter:
mrs|dis setMappingExecutionEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ee Databricks -mnf m_nav327,m_nav376 -cn DATABRICKS_cco_db619
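If you move mappings to the Databricks execution environment, you would typically also enable the spark-databricks validation environment for those mappings. The following pairing is a sketch that reuses the connection and mapping names from the sample above; the service name MRS_example is hypothetical:

infacmd.sh mrs enableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn MRS_example -ve spark-databricks -cn DATABRICKS_cco_db619 -mnf m_nav327,m_nav376
infacmd.sh mrs setMappingExecutionEnvironment -dn domain_3987 -un Administrator -pd Password -sn MRS_example -ee Databricks -mnf m_nav327,m_nav376 -cn DATABRICKS_cco_db619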
disableMappingValidationEnvironment
Update all mappings in the Model repository to disable the Hive engine in the Hadoop validation environment. Consider the following sample syntax:
mrs|dis disableMappingValidationEnvironment -dn domain_3987 -un Administrator -pd Password -sn SN_3986 -ve hive
listMappingEngines
Run listMappingEngines again to verify that all Hive validation environments are disabled.
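For example, a quick re-check against both plugins; if no mappings still use the Hive engine for validation, neither command should list any mappings (the exact output format depends on your infacmd version, so review the output rather than assuming it is empty). The service names are hypothetical:

infacmd.sh dis listMappingEngines -dn domain_3987 -un Administrator -pd Password -sn DIS_example -vef hive
infacmd.sh mrs listMappingEngines -dn domain_3987 -un Administrator -pd Password -sn MRS_example -vef hive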

Warnings

Consider the following points of failure if you do not update the environments:
  • Mappings fail at run time if configured with the Hive engine as the only validation environment.
  • If you edit the validation environment in the Developer tool for a mapping that has the Hive engine as the only validation environment, the Hadoop connection in the mapping is lost. You need to set the validation environments and select the Hadoop connection again. This can happen when you upgrade from a previous version or when you import a mapping from a previous version.
