Connections for INFACore

Hive

You can create a Hive connection to read from and write to Hive tables.

Feature snapshot

Operation    Support
Read         Yes
Write        Yes

Connection properties

The following table describes the Hive connection properties:
Connection Name
Name of the connection.
Each connection name must be unique within the organization. Connection names can contain alphanumeric characters, spaces, and the following special characters: _ . + -,
Maximum length is 255 characters.

Authentication Type
You can select one of the following authentication types:
  • Kerberos. Select Kerberos for a Kerberos cluster.
  • LDAP. Select LDAP for an LDAP-enabled cluster.
  • None. Select None for a Hadoop cluster that is not secure or not LDAP-enabled.

JDBC URL *
The JDBC URL to connect to Hive. Use one of the following formats based on your requirement:
  • To view and import tables from a single database, use the following format:
    jdbc:hive2://<host>:<port>/<database name>
  • To view and import tables from multiple databases, do not enter the database name. Enter a slash after the port number:
    jdbc:hive2://<host>:<port>/
  • To access Hive on a Hadoop cluster enabled for TLS, specify the details in the JDBC URL in the following format, where the truststore path is the directory path of the truststore file that contains the TLS certificate on the agent machine:
    jdbc:hive2://<host>:<port>/<database name>;ssl=true;sslTrustStore=<TrustStore_path>;trustStorePassword=<TrustStore_password>
A sketch that assembles these URL formats appears after this table.

JDBC Driver *
The JDBC driver class to connect to Hive.

Username
The user name to connect to Hive in LDAP or None mode.

Password
The password to connect to Hive in LDAP or None mode.

Principal Name
The principal name to connect to Hive through Kerberos authentication.

Impersonation Name
The user name of the user that the Secure Agent impersonates to run jobs on a Hadoop cluster. You can configure user impersonation to enable different users to run jobs or connect to Hive. The impersonation name is required for the Hadoop connection if the Hadoop cluster uses Kerberos authentication.

Keytab Location
The path and file name of the keytab file for Kerberos login. A sketch that verifies the principal name and keytab appears after this table.

Configuration Files Path *
The directory that contains the Hadoop configuration files for the client.
Before you use the connection to access Hive on a Hadoop cluster, copy the following site.xml files from the Hadoop cluster to a folder on the Linux machine, and specify that path in this field: core-site.xml, hdfs-site.xml, and hive-site.xml.

DFS URI *
The URI to access the Distributed File System (DFS), such as Amazon S3, Microsoft Azure Data Lake Storage, or HDFS. Based on the DFS that you want to access, specify the required storage and bucket name.
For example, for HDFS, refer to the value of the fs.defaultFS property in the core-site.xml file of the Hadoop cluster and enter the same value in the DFS URI field. A sketch that reads this value from core-site.xml appears after this table.

DFS Staging Directory
The staging directory in the Hadoop cluster where the Secure Agent stages the data. You must have full permissions for the DFS staging directory.
Specify a transparently encrypted folder as the staging directory.

Hive Staging Database
The Hive database where external or temporary tables are created. You must have full permissions for the Hive staging database.

Additional Properties
The additional properties required to access the DFS. Configure the properties in the following format:
<DFS property name>=<value>;<DFS property name>=<value>
For example, to access the Amazon S3 file system, specify the access key, secret key, and the Amazon S3 property name, each separated by a semicolon:
fs.s3a.<bucket_name>.access.key=<access key value>; fs.s3a.<bucket_name>.secret.key=<secret key value>; fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem;
To access the Azure Data Lake Storage Gen2 file system, specify the authentication type, authentication provider, client ID, client secret, and client endpoint, each separated by a semicolon:
fs.azure.account.auth.type=<Authentication type>; fs.azure.account.oauth.provider.type=<Authentication_provider>; fs.azure.account.oauth2.client.id=<Client_ID>; fs.azure.account.oauth2.client.secret=<Client-secret>; fs.azure.account.oauth2.client.endpoint=<ADLS Gen2 endpoint>
A sketch that builds this string appears after this table.

* These fields are mandatory parameters.
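
The JDBC URL formats above follow a fixed pattern, so they can be assembled from a few variables. The following is a minimal Python sketch of all three formats; the host, port, database, and truststore values are hypothetical placeholders, so substitute the details of your own cluster.

    # Hypothetical values for illustration; substitute your own cluster details.
    host = "hive.example.com"
    port = 10000
    database = "sales"

    # Single database: view and import tables from one database.
    single_db_url = f"jdbc:hive2://{host}:{port}/{database}"

    # Multiple databases: omit the database name but keep the trailing slash.
    multi_db_url = f"jdbc:hive2://{host}:{port}/"

    # TLS-enabled cluster: append the SSL options after the database name.
    truststore_path = "/opt/certs/truststore.jks"   # truststore file on the agent machine
    truststore_password = "<TrustStore_password>"   # placeholder, not a real password
    tls_url = (
        f"jdbc:hive2://{host}:{port}/{database}"
        f";ssl=true;sslTrustStore={truststore_path}"
        f";trustStorePassword={truststore_password}"
    )

    print(single_db_url)
    print(multi_db_url)
    print(tls_url)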
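
For Kerberos authentication, the Principal Name and Keytab Location must together yield a valid ticket. One way to check them before you configure the connection is the standard MIT Kerberos kinit command, called here from Python's subprocess module. The principal and keytab path are hypothetical examples, and the sketch assumes that the Kerberos client tools are installed on the agent machine.

    import subprocess

    # Hypothetical principal and keytab path; replace with your own values.
    principal = "hive_user@EXAMPLE.COM"
    keytab = "/etc/security/keytabs/hive_user.keytab"

    # kinit -kt <keytab> <principal> obtains a ticket non-interactively.
    result = subprocess.run(
        ["kinit", "-kt", keytab, principal],
        capture_output=True,
        text=True,
    )
    if result.returncode == 0:
        print("Ticket obtained; the principal and keytab are valid.")
    else:
        print("kinit failed:", result.stderr.strip())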
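
For HDFS, the DFS URI is the value of the fs.defaultFS property in core-site.xml. The sketch below reads that value with Python's standard library XML parser; the file path is a hypothetical example, and the parsing relies on the standard Hadoop layout of site files, in which each <property> element under <configuration> holds a <name> and a <value>.

    import xml.etree.ElementTree as ET

    # Hypothetical path to the copied Hadoop configuration file.
    core_site = "/opt/hadoop-conf/core-site.xml"

    # Scan the <property> elements for fs.defaultFS and print its value.
    root = ET.parse(core_site).getroot()
    for prop in root.iter("property"):
        if prop.findtext("name") == "fs.defaultFS":
            print("DFS URI:", prop.findtext("value"))  # for example, hdfs://namenode:8020
            break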
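
The Additional Properties value is a single string of semicolon-separated name=value pairs, which is easy to mistype by hand. The sketch below builds the Amazon S3 variant of the string from a Python dictionary; the bucket name and key values are placeholders, not working credentials.

    # Placeholder values for illustration; never hard-code real credentials.
    bucket = "my-bucket"
    props = {
        f"fs.s3a.{bucket}.access.key": "<access key value>",
        f"fs.s3a.{bucket}.secret.key": "<secret key value>",
        "fs.s3a.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem",
    }

    # Join the pairs into the <name>=<value>;<name>=<value> format
    # that the Additional Properties field expects.
    additional_properties = ";".join(f"{name}={value}" for name, value in props.items())
    print(additional_properties)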
