Table of Contents

  1. Preface
  2. Part 1: Hadoop Integration
  3. Part 2: Databricks Integration
  4. Appendix A: Connections Reference

Configure the Data Integration Service

Configure the Data Integration Service to integrate with the Hadoop environment.
Perform the following pre-integration tasks:
  1. Download the Informatica Hadoop binaries to the Data Integration Service machine if the Hadoop environment and the Data Integration Service run on different operating systems.
  2. Configure the Data Integration Service properties, such as the cluster staging directory, the Hadoop Kerberos service principal name, and the path to the Kerberos keytab file (see the command-line sketch after this list).
  3. Prepare an installation of Python on the Data Integration Service machine or on the Hadoop cluster if you plan to run the Python transformation.
  4. Copy the krb5.conf file to both of the following locations on the machine that hosts the Data Integration Service (see the example copy commands after this list):
    • <Informatica installation directory>/java/jre/lib/security
    • <Informatica installation directory>/services/shared/security
  5. Copy the keytab file to the following directory:
    <Informatica installation directory>/isp/config/keys
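
You can set the Data Integration Service properties in step 2 in the Administrator tool, or from the command line with infacmd. The following is only a sketch that assumes an infacmd dis UpdateServiceOptions call: the domain, user, and service names are placeholders, and the execution option names are illustrative assumptions that you should verify against the infacmd dis reference for your release.

  # Hedged sketch: update Data Integration Service execution properties with infacmd.
  # MyDomain, MyDIS, and the ExecutionOptions.* property names are placeholders;
  # confirm the exact property names in the Administrator tool or the infacmd dis reference.
  infacmd.sh dis UpdateServiceOptions -dn MyDomain -un Administrator -pd '<password>' -sn MyDIS \
    -o "ExecutionOptions.HadoopKerberosServicePrincipal=hive/_HOST@EXAMPLE.COM ExecutionOptions.HadoopKerberosKeytab=/export/keytabs/infa_hadoop.keytab"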
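
Steps 4 and 5 reduce to copying two files into the Informatica directory tree. A minimal sketch, assuming the Informatica installation directory is /opt/informatica and that the krb5.conf and keytab files were provided by the Kerberos administrator:

  # Assumed paths; adjust INFA_HOME and the source file locations for your environment.
  INFA_HOME=/opt/informatica

  # Step 4: copy krb5.conf to both security directories.
  cp /etc/krb5.conf "$INFA_HOME/java/jre/lib/security/"
  cp /etc/krb5.conf "$INFA_HOME/services/shared/security/"

  # Step 5: copy the keytab file to the keys directory.
  cp /export/keytabs/infa_hadoop.keytab "$INFA_HOME/isp/config/keys/"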
