Data Engineering Streaming
The following table describes JDBC connection properties:

| Property | Description |
| --- | --- |
| Database Type | The database type. |
| Name | Name of the connection. The name is not case sensitive and must be unique within the domain. The name cannot exceed 128 characters, contain spaces, or contain the following special characters: ~ ` ! $ % ^ & * ( ) - + = { [ } ] \| \ : ; " ' < , > . ? / |
| ID | String that the Data Integration Service uses to identify the connection. The ID is not case sensitive. It must be 255 characters or less and must be unique in the domain. You cannot change this property after you create the connection. Default value is the connection name. |
| Description | The description of the connection. The description cannot exceed 765 characters. |
| User Name | The database user name. |
| Password | The password for the database user name. |
| JDBC Driver Class Name | Name of the JDBC driver class. For the driver class to use with a specific database type, see the vendor documentation. |
| Connection String | Connection string to connect to the database. Use the following format: `jdbc:<subprotocol>:<subname>`. For the connection string to use with a specific driver, see the vendor documentation. See the connection example after this table. |
| Environment SQL | Optional. Enter SQL commands to set the database environment when you connect to the database. The Data Integration Service executes the connection environment SQL each time it connects to the database. If you enable Sqoop, Sqoop ignores this property. See the environment SQL sketch after this table. |
| Transaction SQL | Optional. Enter SQL commands to set the database environment for each transaction. The Data Integration Service executes the transaction environment SQL at the beginning of each transaction. If you enable Sqoop, Sqoop ignores this property. |
| SQL Identifier Character | Type of character that the database uses to enclose delimited identifiers in SQL queries. The available characters depend on the database type. Select (None) if the database uses regular identifiers; the Data Integration Service then generates SQL queries without delimiter characters around identifiers. Select a character if the database uses delimited identifiers; the Data Integration Service then encloses delimited identifiers within this character in generated SQL queries. If you enable Sqoop, Sqoop ignores this property. See the identifier example after this table. |
| Support Mixed-case Identifiers | Enable if the database uses case-sensitive identifiers. When enabled, the Data Integration Service encloses all identifiers within the character selected for the SQL Identifier Character property. When the SQL Identifier Character property is set to (None), the Support Mixed-case Identifiers property is disabled. If you enable Sqoop, Sqoop honors this property when you generate and execute a DDL script to create or replace a target at run time. In all other scenarios, Sqoop ignores this property. |
| Use Sqoop Connector | Enables Sqoop connectivity for the data object that uses the JDBC connection. The Data Integration Service runs the mapping in the Hadoop run-time environment through Sqoop. You can configure Sqoop connectivity for relational data objects, customized data objects, and logical data objects that are based on a JDBC-compliant database. Select Sqoop v1.x to enable Sqoop connectivity. Default is None. |
| Sqoop Arguments | Enter the arguments that Sqoop must use to connect to the database. Separate multiple arguments with a space. To run the mapping on the Blaze engine with the Teradata Connector for Hadoop (TDCH) specialized connectors for Sqoop, you must define the TDCH connection factory class in the Sqoop arguments. The connection factory class varies based on the TDCH Sqoop connector that you want to use. To run the mapping on the Spark engine, you do not need to define the TDCH connection factory class in the Sqoop arguments. The Data Integration Service invokes the Cloudera Connector Powered by Teradata and the Hortonworks Connector for Teradata (powered by the Teradata Connector for Hadoop) by default. To run the mapping with a generic JDBC connector instead of the specialized Cloudera or Hortonworks connector, define the `--driver` and `--connection-manager` Sqoop arguments in the JDBC connection. If you define the `--driver` and `--connection-manager` arguments in the Read or Write transformation of the mapping, Sqoop ignores the arguments. If you do not enter Sqoop arguments, the Data Integration Service constructs the Sqoop command based on the JDBC connection properties. See the argument example after this table. |
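As an illustration of how the JDBC Driver Class Name, Connection String, User Name, and Password properties fit together, the following minimal Java sketch opens a JDBC session with the standard `DriverManager` API. The driver class, URL, host, database name, and credentials are hypothetical PostgreSQL-style placeholders, not values from this guide; substitute the values for your database vendor.

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class JdbcConnectionSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical placeholder values; replace with your vendor's driver class and URL.
        String driverClassName  = "org.postgresql.Driver";                  // JDBC Driver Class Name
        String connectionString = "jdbc:postgresql://dbhost:5432/salesdb";  // Connection String
        String userName         = "dbuser";                                 // User Name
        String password         = "secret";                                 // Password

        Class.forName(driverClassName); // load the vendor driver class by name
        try (Connection conn = DriverManager.getConnection(connectionString, userName, password)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
```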
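Environment SQL and transaction SQL are plain SQL commands that run on the open connection, either once per connection or at the beginning of each transaction. The sketch below shows the general idea with hypothetical PostgreSQL session commands; the actual statements depend on your database and are not prescribed by this guide.

```java
import java.sql.Connection;
import java.sql.Statement;

public class EnvironmentSqlSketch {
    // Analogous to Environment SQL: run once, right after the connection is opened.
    static void applyEnvironmentSql(Connection conn) throws Exception {
        try (Statement stmt = conn.createStatement()) {
            stmt.execute("SET search_path TO analytics"); // hypothetical session setting
        }
    }

    // Analogous to Transaction SQL: run at the beginning of each transaction.
    static void applyTransactionSql(Connection conn) throws Exception {
        try (Statement stmt = conn.createStatement()) {
            stmt.execute("SET TRANSACTION ISOLATION LEVEL READ COMMITTED"); // hypothetical setting
        }
    }
}
```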
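The effect of the SQL Identifier Character and Support Mixed-case Identifiers properties is simply how generated queries quote identifiers. The following is a minimal sketch, assuming the double-quote character and illustrative table and column names.

```java
public class DelimitedIdentifierSketch {
    // Encloses an identifier in the configured SQL identifier character,
    // mirroring how generated queries quote case-sensitive identifiers.
    static String delimit(String identifier, String sqlIdentifierChar) {
        if (sqlIdentifierChar == null || sqlIdentifierChar.isEmpty()) {
            return identifier; // SQL Identifier Character set to (None): regular identifiers
        }
        return sqlIdentifierChar + identifier + sqlIdentifierChar;
    }

    public static void main(String[] args) {
        // With the double-quote character: SELECT "OrderId" FROM "SalesOrders"
        System.out.println("SELECT " + delimit("OrderId", "\"")
                + " FROM " + delimit("SalesOrders", "\""));
        // With (None): SELECT OrderId FROM SalesOrders
        System.out.println("SELECT " + delimit("OrderId", "")
                + " FROM " + delimit("SalesOrders", ""));
    }
}
```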
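For the generic JDBC connector scenario described under Sqoop Arguments, the argument value would take a form such as `--driver org.postgresql.Driver --connection-manager org.apache.sqoop.manager.GenericJdbcManager`. The driver class shown here is a hypothetical PostgreSQL placeholder, and `org.apache.sqoop.manager.GenericJdbcManager` is Sqoop's generic JDBC connection manager class; verify both values against your database vendor and Sqoop documentation before use.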