Some third-party rules and guidelines apply to data processing, depending on the source or target connection that you use to ingest data.
Consider the following rules and guidelines when you ingest data:
The source connection must be a JDBC connection. Use the JDBC connection to access the source schema.
For example, to access an Oracle schema, configure a JDBC connection that uses an Oracle JDBC driver to connect to the Oracle database. You cannot use an Oracle connection. For an example driver and connection URL, see the JDBC connection sketch after this list.
JDBC connections must be enabled for Sqoop connectivity.
You cannot ingest a source table using a Sqoop connection if the table metadata contains special characters or if column names in the table contain spaces.
You cannot ingest Oracle LONG and NUMBER data types using a Sqoop connection. To resolve this issue, append NumberPrecisionScale=1 to the JDBC connection string, as shown in the connection string sketch after this list.
You cannot ingest a source table into Hive if the table metadata contains UTF-8 characters. To resolve this issue, configure the Hive metastore for UTF-8 data processing; one possible approach is outlined in the metastore sketch after this list.
You cannot ingest a source table to an Avro file in a Hive target if the source table contains a column with a timestamp data type. To ingest timestamp data to an Avro file, the third-party Hive JDBC driver must use Hive version 1.1 or later.
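The JDBC connection sketch below illustrates the kind of driver and connection URL that an Oracle JDBC connection uses. It is a minimal standalone check, assuming the Oracle thin driver and placeholder host, port, service name, and credentials; substitute the values for your Oracle database and make sure the Oracle JDBC driver jar is on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class OracleJdbcConnectionSketch {
    public static void main(String[] args) throws SQLException {
        // Placeholder values: replace the host, port, service name, and
        // credentials with the details of your Oracle database.
        String url = "jdbc:oracle:thin:@//dbhost:1521/orcl";
        String user = "ingest_user";
        String password = "ingest_password";

        // Requires the Oracle JDBC driver (for example, ojdbc8.jar) on the classpath.
        try (Connection conn = DriverManager.getConnection(url, user, password)) {
            // Confirm which schema the connection resolves to before ingesting from it.
            System.out.println("Connected to schema: " + conn.getSchema());
        }
    }
}
```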
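The exact syntax for appending a property to the connection string depends on the JDBC driver. The connection string sketch below is illustrative only: the base URL uses placeholder host, port, and service name values, and the semicolon delimiter follows a common JDBC convention; check your driver documentation for the syntax it expects.

```java
public class ConnectionStringSketch {
    public static void main(String[] args) {
        // Placeholder base connection string.
        String baseUrl = "jdbc:oracle:thin:@//dbhost:1521/orcl";

        // Append the property. The ";Property=Value" form is an assumption here;
        // your driver may expect a different delimiter or a separate property field.
        String urlWithProperty = baseUrl + ";NumberPrecisionScale=1";

        System.out.println(urlWithProperty);
    }
}
```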
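How you configure the Hive metastore for UTF-8 depends on the database that backs it. The metastore sketch below assumes a MySQL-backed metastore and shows one commonly used approach: converting the metadata columns that store comments and parameter values to the UTF-8 character set. The host, credentials, and column definitions are assumptions based on a typical metastore schema; verify them against the schema that ships with your Hive version, and back up the metastore before changing it.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class MetastoreUtf8Sketch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials for a MySQL-backed Hive metastore.
        // Requires the MySQL JDBC driver on the classpath.
        String url = "jdbc:mysql://metastore-host:3306/hive";
        try (Connection conn = DriverManager.getConnection(url, "hive", "hive_password");
             Statement stmt = conn.createStatement()) {
            // Convert comment and parameter columns to UTF-8. Table names and
            // column sizes reflect a typical metastore layout; confirm them
            // against your Hive version before running these statements.
            stmt.executeUpdate("ALTER TABLE COLUMNS_V2 MODIFY COLUMN COMMENT "
                    + "VARCHAR(256) CHARACTER SET utf8");
            stmt.executeUpdate("ALTER TABLE TABLE_PARAMS MODIFY COLUMN PARAM_VALUE "
                    + "MEDIUMTEXT CHARACTER SET utf8");
            stmt.executeUpdate("ALTER TABLE PARTITION_PARAMS MODIFY COLUMN PARAM_VALUE "
                    + "VARCHAR(4000) CHARACTER SET utf8");
            stmt.executeUpdate("ALTER TABLE PARTITION_KEYS MODIFY COLUMN PKEY_COMMENT "
                    + "VARCHAR(4000) CHARACTER SET utf8");
        }
    }
}
```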