Table of Contents

  1. Preface
  2. Introduction to Hive Connector
  3. Hive connections
  4. Mappings and mapping tasks with Hive Connector
  5. Migrating a mapping
  6. Data type reference
  7. Troubleshooting

Hive Connector

Running a mapping on Azure HDInsights Kerberos cluster with WASB storage

To read and process data from sources that use a Kerberos-enabled environment, you must configure the Kerberos configuration file, create user authentication artifacts, and configure Kerberos authentication properties for the Informatica domain.
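As referenced above, the Kerberos configuration file (krb5.conf) on the Secure Agent machine defines the realm and KDC that the Informatica domain uses to authenticate. The following is a minimal sketch only; the realm name EXAMPLE.COM and the host names are hypothetical placeholders for the values from your own HDInsights environment:

    [libdefaults]
        default_realm = EXAMPLE.COM
        dns_lookup_realm = false
        dns_lookup_kdc = false

    [realms]
        EXAMPLE.COM = {
            kdc = kdc.example.com
            admin_server = kdc.example.com
        }

    [domain_realm]
        .example.com = EXAMPLE.COM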
To run a Hive Connector mapping on an Azure HDInsights Kerberos cluster that uses Windows Azure Storage Blob (WASB) storage, perform the following steps:
  1. Go to the /usr/lib/python2.7/dist-packages/hdinsight_common/ directory on the Hadoop cluster node.
  2. Run the following command to decrypt the account key:
    ./decrypt.sh ENCRYPTED_ACCOUNT_KEY
  3. Edit the core-site.xml file in the Secure Agent conf location.
  4. Replace the encrypted account key in the fs.azure.account.key.STORAGE_ACCOUNT_NAME.blob.core.windows.net property with the decrypted key that you received as the output of step 2.
  5. Comment out the following properties to disable encryption and decryption of the account key (see the example core-site.xml sketch after this list):
    • fs.azure.account.keyprovider.STORAGE_ACCOUNT_NAME.blob.core.windows.net
    • fs.azure.shellkeyprovider.script
  6. Save the core-site.xml file.
  7. Copy the hdinsight_common folder from /usr/lib/python2.7/dist-packages/hdinsight_common/ to the Secure Agent location.
  8. Open the core-site.xml file in a browser to verify that the XML tags appear correctly and that the file has no syntax errors.
  9. Restart the Secure Agent.
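The following sketch consolidates the commands from steps 1, 2, and 7 and the core-site.xml entries that result from steps 4 and 5. The storage account name mystorageaccount, the placeholder key values, and the Secure Agent location are hypothetical examples, and the key provider class and script path shown in the commented-out properties are typical HDInsights defaults; verify them against your own core-site.xml.

    # On the Hadoop cluster node (steps 1, 2, and 7); the Secure Agent location is a placeholder.
    cd /usr/lib/python2.7/dist-packages/hdinsight_common/
    ./decrypt.sh ENCRYPTED_ACCOUNT_KEY
    cp -r /usr/lib/python2.7/dist-packages/hdinsight_common/ SECURE_AGENT_LOCATION

    <!-- Step 4: the decrypted key replaces the encrypted value. -->
    <property>
      <name>fs.azure.account.key.mystorageaccount.blob.core.windows.net</name>
      <value>DECRYPTED_ACCOUNT_KEY</value>
    </property>

    <!-- Step 5: commented out to disable encryption and decryption of the account key. -->
    <!--
    <property>
      <name>fs.azure.account.keyprovider.mystorageaccount.blob.core.windows.net</name>
      <value>org.apache.hadoop.fs.azure.ShellDecryptionKeyProvider</value>
    </property>
    <property>
      <name>fs.azure.shellkeyprovider.script</name>
      <value>/usr/lib/python2.7/dist-packages/hdinsight_common/decrypt.sh</value>
    </property>
    -->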
Azure HDInsights Kerberos cluster with WASB storage is not applicable for mappings that run on the advanced cluster.
