Verify Run-time Drivers

Verify run-time drivers for mappings that access JDBC-compliant databases in the Hadoop environment. Use any Type 4 JDBC driver that the database vendor recommends.
  1. Download the Type 4 JDBC drivers associated with the JDBC-compliant databases that you want to access. A sample driver check appears after this list.
  2. To use the Sqoop TDCH Hortonworks Connector for Teradata, perform the following task:

      Download all .jar files in the Hortonworks Connector for Teradata package from the following location: http://hortonworks.com/downloads/#addons

      The package has the following naming convention:
      hdp-connector-for-teradata-<version>-distro.tar.gz

  3. To optimize the Sqoop mapping performance on the Spark engine while writing data to an HDFS complex file target of the Parquet format, download the following .jar files:
  4. Copy all of the .jar files to the following directory on the machine where the Data Integration Service runs (a sample copy script appears after this list):
    <Informatica installation directory>\externaljdbcjars
    Changes take effect after you recycle the Data Integration Service. At run time, the Data Integration Service copies the .jar files to the Hadoop distribution cache so that the .jar files are accessible to all nodes in the cluster.
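
To confirm that a downloaded .jar actually contains a loadable Type 4 driver before you copy it, you can run a small standalone check such as the following Java sketch. It is not part of the Informatica installation, and the driver class name, JDBC URL, and credentials shown are placeholders; replace them with the values that your database vendor documents.

    // Load a Type 4 JDBC driver from the classpath and open a test connection.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class VerifyJdbcDriver {
        public static void main(String[] args) throws Exception {
            // Placeholder values; substitute the vendor-documented driver class and URL.
            String driverClass = "org.postgresql.Driver";
            String jdbcUrl = "jdbc:postgresql://dbhost:5432/sales";
            String user = "dbuser";
            String password = "dbpassword";

            // A Type 4 driver is pure Java, so loading the class is enough to confirm
            // that the .jar is on the classpath and registers itself with DriverManager.
            Class.forName(driverClass);

            try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password)) {
                System.out.println("Driver: " + conn.getMetaData().getDriverName()
                        + " " + conn.getMetaData().getDriverVersion());
                System.out.println("Connection is valid: " + conn.isValid(5));
            } catch (SQLException e) {
                System.err.println("Driver loaded, but the connection failed: " + e.getMessage());
            }
        }
    }

Compile the class and run it with the driver .jar on the classpath, for example: java -cp .:<driver>.jar VerifyJdbcDriver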
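
If you prefer to script step 4, the following Java sketch copies every .jar file from a download directory into the externaljdbcjars directory. The source and target paths are assumptions for illustration; point them at your actual download location and <Informatica installation directory>, and remember to recycle the Data Integration Service afterwards.

    // Copy downloaded driver .jar files into <Informatica installation directory>/externaljdbcjars.
    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    import java.util.stream.Stream;

    public class CopyJdbcJars {
        public static void main(String[] args) throws IOException {
            Path downloads = Paths.get("/tmp/jdbc-drivers");              // assumed download location
            Path target = Paths.get("/opt/informatica/externaljdbcjars"); // assumed installation path

            Files.createDirectories(target);

            try (Stream<Path> files = Files.list(downloads)) {
                files.filter(p -> p.toString().endsWith(".jar"))
                     .forEach(jar -> {
                         try {
                             Files.copy(jar, target.resolve(jar.getFileName()),
                                     StandardCopyOption.REPLACE_EXISTING);
                             System.out.println("Copied " + jar.getFileName());
                         } catch (IOException e) {
                             throw new UncheckedIOException(e);
                         }
                     });
            }
            // The new .jar files take effect after the Data Integration Service is recycled;
            // at run time the service pushes them to the Hadoop distribution cache.
        }
    }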
