Monitoring Data Integration Jobs

Code task API execution parameters

The execution parameters for each code task instance display the API parameters used in the task.
The following execution parameters are available for the code task:

Override Code Task Timeout (Optional)
Overrides the code task timeout value for this execution. A value of -1 signifies no timeout.

Log Level (Optional)
The log level for session logs, the agent job log, and the Spark driver and executor logs. Valid values are none, terse, normal, verboseInitialization, and verboseData. The default value is normal.
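
As a rough illustration of these settings, the following Scala sketch models them as a typed configuration object. The type and field names are assumptions for illustration only, not the actual Data Integration API schema:

// Illustrative only: mirrors the execution parameters described above.
sealed trait LogLevel
object LogLevel {
  case object None extends LogLevel
  case object Terse extends LogLevel
  case object Normal extends LogLevel                 // the default
  case object VerboseInitialization extends LogLevel
  case object VerboseData extends LogLevel
}

final case class ExecutionParameters(
  overrideCodeTaskTimeout: Int = -1,   // -1 signifies no timeout
  logLevel: LogLevel = LogLevel.Normal
)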
The following Spark properties are available for the code task:

Main Class (Required)
The entry point of the Spark application. For example:
org.apache.spark.examples.company.SparkExampleApp

Main Class Arguments (Optional)
Ordered arguments sent to the Spark application main class. For example:
--appType SPARK_PI_FILES_JARS --classesToLoad com.company.test.SparkTest1Class

Primary Resource (Required)
The Scala JAR file that contains the code task.

JAR File Path (Optional)
The directory and file name of the JAR file that is uploaded to the cluster and added to the Spark driver and executor classpaths.

Spark File Path (Optional)
The directory and file name of the Spark file that is uploaded to the cluster and made available under the current working directory.

Custom Properties (Optional)
Spark properties or other custom properties that Data Integration uses.
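
To show how these properties fit together, here is a minimal sketch of a Spark application that the Main Class property could reference. The package, object name, and argument names reuse the examples above; the argument parsing is an illustrative assumption, not behavior prescribed by Data Integration:

package org.apache.spark.examples.company

import org.apache.spark.sql.SparkSession

object SparkExampleApp {
  def main(args: Array[String]): Unit = {
    // Read the ordered "--key value" pairs passed as Main Class Arguments.
    val options: Map[String, String] = args.sliding(2, 2).collect {
      case Array(key, value) if key.startsWith("--") => key.drop(2) -> value
    }.toMap

    // Start a Spark session; the driver and executor logs mentioned above
    // are produced while this application runs on the cluster.
    val spark = SparkSession.builder()
      .appName(options.getOrElse("appType", "SparkExampleApp"))
      .getOrCreate()

    try {
      // Application logic would go here.
      println(s"Started with options: $options")
    } finally {
      spark.stop()
    }
  }
}

Packaged into the JAR file supplied as the Primary Resource, this application would be referenced by the fully qualified name org.apache.spark.examples.company.SparkExampleApp in the Main Class property.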
