Table of Contents

  1. Preface
  2. Part 1: Installation Getting Started
  3. Part 2: Before You Install the Services
  4. Part 3: Run the Services Installer
  5. Part 4: After You Install the Services
  6. Part 5: Informatica Client Installation
  7. Part 6: Uninstallation
  8. Appendix A: Starting and Stopping Informatica Services
  9. Appendix B: Managing Distribution Packages
  10. Appendix C: Connecting to Databases from UNIX or Linux
  11. Appendix D: Updating the DynamicSections Parameter of a DB2 Database

Installation for Data Engineering

Integrate the Domain with the Hadoop or Databricks Environment

If you imported the cluster configuration from the Hadoop or Databricks environment during installation, you must complete the integration between the domain and the Hadoop environment. Integration tasks are required in both the Hadoop environment and the Informatica domain environment.
To integrate the domain with the Hadoop environment, you complete the following high-level tasks:
  1. Prepare directories, users, and permissions.
  2. Configure *-site.xml files on the Hadoop or Databricks environment. Update the properties in the *-site.xml files with the values required for Informatica processing in the third-party environment.
  3. Refresh the cluster configuration in the Administrator tool. Refresh the cluster configuration to get the updated properties from the *-site.xml files on the cluster.
  4. Update connections in the Administrator tool. Update connections if you want to use property values other than the defaults. You also need to configure environment variables in the connection properties.
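The *-site.xml files in step 2 use the standard Hadoop configuration format: a `<configuration>` root containing `<property>` elements, each with a `<name>` and a `<value>`. The sketch below shows that structure and a minimal way to update a property programmatically. The property name and values are hypothetical placeholders; consult the Data Engineering Integration Guide for the actual properties Informatica requires.

```python
# Minimal sketch of editing a Hadoop-style *-site.xml property.
# "example.property.name" and its values are hypothetical, not
# properties that Informatica documents.
import xml.etree.ElementTree as ET

SAMPLE = """<configuration>
  <property>
    <name>example.property.name</name>
    <value>old-value</value>
  </property>
</configuration>"""

def set_property(root, name, value):
    """Set <name>/<value> under <configuration>, adding the property if absent."""
    for prop in root.findall("property"):
        if prop.findtext("name") == name:
            prop.find("value").text = value
            return
    prop = ET.SubElement(root, "property")
    ET.SubElement(prop, "name").text = name
    ET.SubElement(prop, "value").text = value

root = ET.fromstring(SAMPLE)
set_property(root, "example.property.name", "new-value")
print(ET.tostring(root, encoding="unicode"))
```

After editing a *-site.xml file on the cluster, refresh the cluster configuration in the Administrator tool (step 3) so the domain picks up the changed values.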
For more information about how to import a Hadoop cluster configuration, see the Data Engineering Integration Guide.
