Performance Tuning and Sizing Guidelines for Informatica® Big Data Management 10.2.2

TDCH for Sqoop Import and Export Guidelines

Spark jobs scale linearly during Sqoop import and export, so you can tune them based on the available cluster resources. Configure the following advanced property for Spark in the Hadoop connection:
spark.executor.instances=<number of executor instances>
The following formula determines the total number of running containers:
Total running containers = (Number of cores per executor) x (Number of executor instances)
The Spark engine uses two executor instances by default, so with the default two cores per executor, only four containers run in parallel. For better performance, fine-tune the spark.executor.instances property.
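For example, the following property values show one hypothetical way to size a Sqoop job on a cluster with capacity for 32 parallel containers. The instance, core, and memory values are illustrative assumptions, not recommendations, and must be adjusted to your cluster resources:
spark.executor.instances=16
spark.executor.cores=2
spark.executor.memory=4G
With these values, the formula yields 2 cores x 16 executor instances = 32 total running containers.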
