| Connection property | Description |
|---|---|
| Connection Name | Name of the Hadoop Files V2 connection. |
| Description | Description of the connection. The description cannot exceed 765 characters. |
| Use Secret Vault | Stores sensitive credentials for this connection in the secrets manager that is configured for your organization. This property appears only if a secrets manager is set up for your organization. This property is not supported by the Data Ingestion and Replication and Data Access Management services. When you enable the secret vault in the connection, you can select which credentials the Secure Agent retrieves from the secrets manager. If you don't enable this option, the credentials are stored in the repository or on a local Secure Agent, depending on how your organization is configured. If you use this connection to apply data access policies through pushdown or proxy services, you cannot use the Secret Vault configuration option. For information about how to configure and use a secrets manager, see "Secrets manager configuration" in the Administrator help. |
| Runtime Environment | The name of the runtime environment where you want to run the tasks. |
| User Name | Required to read data from HDFS. Enter a user name that has access to the single-node HDFS location to read data from or write data to. |
| NameNode URI | The URI to access HDFS. Use the following format to specify the name node URI in Cloudera, Amazon EMR, and Hortonworks distributions: `hdfs://<namenode>:<port>/`, where `<namenode>` is the host name or IP address of the name node and `<port>` is the port on which the name node listens for remote procedure calls (RPC). To connect to the Hadoop cluster, specify the name node host and port from the fs.defaultFS property. If the Hadoop cluster is configured for high availability, you must copy the fs.defaultFS value from the core-site.xml file and append / to specify the name node URI. For an example, see the sample core-site.xml snippet that follows this table. Specify either the name node URI or the local path. Do not specify the name node URI if you want to read data from or write data to a local file system path. |
| Local Path | A local file system path to read and write data. Specify the local path if you do not specify the name node URI. The default value for Local Path is NA. |
| Configuration Files Path | The directory that contains the Hadoop configuration files. Copy the core-site.xml, hdfs-site.xml, and hive-site.xml files from the Hadoop cluster and add them to a folder on the Linux machine. |
| Keytab File | The file that contains encrypted keys and Kerberos principals to authenticate the machine. |
| Principal Name | The Kerberos principal name to use with the keytab file to authenticate the machine. |
| Impersonation Username | You can enable different users to run mappings in a Hadoop cluster that uses Kerberos authentication or connect to sources and targets that use Kerberos authentication. To enable different users to run mappings or connect to big data sources and targets, you must configure user impersonation. A sketch of the typical Hadoop proxy-user configuration follows this table. |
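
The NameNode URI row above references a sample core-site.xml snippet for a highly available cluster. The following is a minimal sketch that assumes a hypothetical logical name service ID called nameservice1; the actual value comes from the core-site.xml file on your cluster:

```xml
<!-- Excerpt from a hypothetical core-site.xml on a highly available Hadoop cluster.
     "nameservice1" is a placeholder logical name service ID. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://nameservice1</value>
  </property>
</configuration>
```

In this sketch, the fs.defaultFS value is hdfs://nameservice1, and the corresponding name node URI to enter in the connection is hdfs://nameservice1/.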
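
User impersonation for the Impersonation Username property typically relies on the Hadoop proxy-user settings in core-site.xml on the cluster. The following is a minimal sketch, assuming a hypothetical service user named agentuser; your Hadoop administrator defines the actual user, host, and group restrictions:

```xml
<!-- Excerpt from a hypothetical core-site.xml: allow the service user "agentuser"
     to impersonate other users through the Hadoop proxy-user settings. -->
<configuration>
  <property>
    <name>hadoop.proxyuser.agentuser.hosts</name>
    <!-- Hosts from which agentuser can impersonate other users; "*" allows all hosts -->
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.agentuser.groups</name>
    <!-- Groups whose members agentuser can impersonate -->
    <value>*</value>
  </property>
</configuration>
```

Restricting the hosts and groups values to specific entries instead of "*" limits which machines and users the service account can impersonate.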