Installation and Configuration Guide

Interactive Data Preparation Service Details

Create the Interactive Data Preparation Service. If you run the installer to create the Enterprise Data Preparation Service and the Interactive Data Preparation Service, you must create both services on the same node.
  1. Specify the name of the Interactive Data Preparation Service.
    The name is not case sensitive and must be unique within the domain. It cannot exceed 128 characters or begin with @. It also cannot contain spaces or the following special characters: ` ~ % ^ * + = { } \ ; : ' " / ? . , < > | ! ( ) ] [
  2. If you plan to use rules, you must associate a Data Integration Service and a Model Repository Service with the Interactive Data Preparation Service.
    • To skip associating a Model Repository Service and a Data Integration Service with the Interactive Data Preparation Service, press 1.
    • To associate a Model Repository Service and a Data Integration Service with the Interactive Data Preparation Service, press 2, and then enter the service names.
  3. Enter the name of the node where the Interactive Data Preparation Service runs.
    • To create the service during installation, enter the name of the current node.
    • If you do not want to create the service during installation, do not enter a value. You can use the Administrator tool to create the service after installation.
  4. Enter the name of the Informatica license to associate with the service.
  5. Choose whether to enable secure communication for the service.
    • To enable secure communication for the service, press 1.
    • To disable secure communication, press 2.
  6. If you enable secure communication for the service, select the SSL certificate to use.
    • To use the default Informatica SSL certificate contained in the default keystore and the default truststore, press 1.
    • To use a custom SSL certificate contained in a custom keystore and truststore, press 2, and then enter the path and file name for the keystore and truststore files. You must also enter the keystore and truststore passwords. For one way to create a custom keystore and truststore, see the keytool example after this procedure.
  7. If you enable secure communication for the service, enter the port number for the HTTPS connection. If you enable non-secure communication for the service, enter the port number for the HTTP connection.
  8. Select the Hadoop authentication mode.
    • To select the non-secure authentication mode, press 1.
    • To select Kerberos authentication, press 2.
  9. If you select Kerberos, enter the authentication parameters.
    The following list describes the authentication parameters that you set if you select Kerberos. For one way to verify the keytab file and SPN before you run the installer, see the Kerberos example after this procedure.
    • HDFS Principal Name: Service Principal Name (SPN) for the data preparation Hadoop cluster.
    • Hadoop Impersonation User Name: User name to use in Hadoop impersonation, as shown in the Impersonation User Name property for the Hadoop connection in the Administrator tool. If the Hadoop cluster uses Kerberos authentication, the Hadoop impersonation user must have read, write, and execute permissions on the HDFS storage location folder.
    • Kerberos Keytab File: Path and file name of the SPN keytab file for the user account to impersonate when connecting to the Hadoop cluster. The keytab file must be in a directory on the machine where the service runs.
    • Fully Qualified Path to the Kerberos Configuration File: Path to the krb5.conf Kerberos configuration file.
  10. Specify the HDFS storage location, HDFS connection, local storage location, and Solr port number details.
    The following list describes the properties that you set:
    • HDFS Storage Location: HDFS location for data preparation file storage. If the Hadoop cluster uses Kerberos authentication, the Hadoop impersonation user must have read, write, and execute permissions on the HDFS storage location folder. For one way to create the folder with the required permissions, see the hdfs dfs example after this procedure.
    • HDFS Connection: HDFS connection for data preparation file storage.
    • Local Storage Location: Directory for data preparation file storage on the node on which the Interactive Data Preparation Service runs. If the connection to the local storage fails, the service recovers data preparation files from the HDFS storage location.
    • Solr Port: Port number of the Apache Solr server that provides data preparation recommendations.
  11. Choose whether to enable the Interactive Data Preparation Service.
    • To enable the service at a later time using the Administrator tool, press 1.
    • To enable the service after you complete the installation process, press 2.
The Enterprise Data Preparation Service Details section appears.
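
If you choose a custom SSL certificate in step 6, you need a keystore and a truststore that contain the certificate before you run the installer. The following minimal sketch uses the JDK keytool utility with a self-signed certificate; the alias, file paths, and key parameters are placeholder values, not values required by the installer, and a production deployment typically imports a CA-signed certificate instead. keytool prompts for the keystore and truststore passwords, which you then enter in the installer.

    # Generate a self-signed key pair in a custom keystore (placeholder alias and paths).
    keytool -genkeypair -alias idps_ssl -keyalg RSA -keysize 2048 -validity 365 \
        -keystore /opt/infa/certs/custom_keystore.jks -storetype JKS

    # Export the certificate from the keystore.
    keytool -exportcert -alias idps_ssl -keystore /opt/infa/certs/custom_keystore.jks \
        -file /opt/infa/certs/idps_ssl.cer

    # Import the certificate into a custom truststore.
    keytool -importcert -alias idps_ssl -file /opt/infa/certs/idps_ssl.cer \
        -keystore /opt/infa/certs/custom_truststore.jks -storetype JKS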
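
If the Hadoop cluster uses Kerberos authentication, you can verify the keytab file and SPN for step 9 before you run the installer. The following minimal sketch uses standard MIT Kerberos client commands; the keytab path and principal name are placeholder values for your environment.

    # List the principals stored in the keytab file (placeholder path).
    klist -kt /etc/security/keytabs/idps.user.keytab

    # Obtain a ticket with the keytab to confirm that the keytab matches the KDC (placeholder principal).
    kinit -kt /etc/security/keytabs/idps.user.keytab idps_user@EXAMPLE.COM

    # Display the ticket cache to confirm that the ticket was granted.
    klist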
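
Before you specify the HDFS storage location in step 10, you can create the folder and grant the Hadoop impersonation user read, write, and execute permissions on it. The following minimal sketch uses standard hdfs dfs commands; the folder path, user name, and group name are placeholder values.

    # Create the HDFS storage location for data preparation files (placeholder path).
    hdfs dfs -mkdir -p /data/idps

    # Make the Hadoop impersonation user the owner of the folder.
    hdfs dfs -chown idps_user:idps_group /data/idps

    # Grant the owner and group read, write, and execute permissions.
    hdfs dfs -chmod 770 /data/idps

    # Verify the ownership and permissions.
    hdfs dfs -ls /data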