Data Integration Connectors
| Connection property | Description |
| --- | --- |
| Authentication Type | The authentication type to connect to Hive. You can select Kerberos, LDAP, or None. |
| JDBC URL * | The JDBC URL to connect to Hive. Specify the URL in the format that your cluster and authentication mode require. See the example after this table. |
| JDBC Driver * | The JDBC driver class to connect to Hive. |
| Username | The user name to connect to Hive in LDAP or None mode. |
| Password | The password to connect to Hive in LDAP or None mode. |
| Principal Name | The principal name to connect to Hive through Kerberos authentication. |
| Impersonation Name | The user name that the Secure Agent impersonates to run mappings on a Hadoop cluster. You can configure user impersonation to enable different users to run mappings or connect to Hive. The impersonation name is required for the Hadoop connection if the Hadoop cluster uses Kerberos authentication. |
| Keytab Location | The path and file name of the keytab file for Kerberos login. |
| Configuration Files Path * | The directory that contains the Hadoop configuration files for the client. Copy the site.xml files from the Hadoop cluster to a folder on the Linux machine, and specify that path in this field before you use the connection in a mapping to access Hive on a Hadoop cluster. |
| DFS URI * | The URI to access the Distributed File System (DFS), such as Amazon S3, Microsoft Azure Data Lake Storage, or HDFS. For mappings in advanced mode that run on an advanced cluster, Azure Data Lake Storage Gen2 is supported on the Azure HDInsight cluster. Based on the DFS that you want to access, specify the required storage and bucket name. For example, for HDFS, enter the value of the fs.defaultFS property from the core-site.xml file of the Hadoop cluster in the DFS URI field. See the example after this table. |
| DFS Staging Directory | The staging directory in the Hadoop cluster where the Secure Agent stages the data. You must have full permissions for the DFS staging directory. Specify a transparently encrypted folder as the staging directory. |
| Hive Staging Database | The Hive database where external or temporary tables are created. You must have full permissions for the Hive staging database. |
| Additional Properties | Applies to mappings in advanced mode. Configure the property in the following format: `<DFS property name>=<value>;<DFS property name>=<value>`. For example, to access the Amazon S3 file system, specify the access key, secret key, and the Amazon S3 property name, each separated by a semicolon. To access the Azure Data Lake Storage Gen2 file system, specify the authentication type, authentication provider, client ID, client secret, and the client endpoint, each separated by a semicolon. See the illustrative sketches after this table. |
* These fields are mandatory parameters.
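
The JDBC URL format depends on your cluster setup and authentication mode. As an illustrative sketch only, a commonly used HiveServer2 URL follows the pattern below; the host, port, and database name are placeholder assumptions, not values taken from this documentation.

```
jdbc:hive2://<hiveserver2 host>:<port>/<database name>
```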
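
The DFS URI value varies by file system. As a minimal sketch, assuming an HDFS cluster whose core-site.xml sets fs.defaultFS as shown below, you would enter the same value in the DFS URI field; the host and port are placeholders.

```
hdfs://<namenode host>:<port>
```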
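
The exact semicolon-separated strings for the Additional Properties field are not spelled out above. The sketches below are assumptions based on the standard Hadoop S3A and ABFS property names with placeholder values; confirm the property names that your environment and product version expect before using them.

For Amazon S3:

```
fs.s3a.access.key=<access key>;fs.s3a.secret.key=<secret key>
```

For Azure Data Lake Storage Gen2:

```
fs.azure.account.auth.type=OAuth;fs.azure.account.oauth.provider.type=org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider;fs.azure.account.oauth2.client.id=<client ID>;fs.azure.account.oauth2.client.secret=<client secret>;fs.azure.account.oauth2.client.endpoint=<token endpoint>
```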