Monitoring Data Integration Jobs

Spark application task details

The Spark application task details for each code task display under Spark Application Task Results.
Each Spark application task includes the following details:
Status
    Status of the Spark task. The Spark task can have one of the following statuses:
      • Running. The task is running.
      • Success. The task completed successfully.
      • Failed. The task did not complete because it encountered errors.
      • Stopped. The task was stopped.
      • Unknown. The status of the task is unknown.
    If the Secure Agent fails while the job is running, the status of the Spark tasks continues to display Running. You must cancel the job and run the job again.

Start time
    Date and time when the Spark task started.

End time
    Date and time when the Spark task ended.

Duration
    Amount of time that the Spark task ran.

Memory Per Executor
    Amount of memory that each Spark executor uses.

Cores Per Executor
    Number of cores that each Spark executor uses.

Driver and Agent Job Logs
    Select Download to download the Spark driver and agent job logs.

Advanced Log Location
    The log location that is configured in the advanced configuration for the advanced cluster. You can navigate to the advanced log location to view and download the agent job log, Spark driver log, and Spark executor logs.

Error Message
    Error message, if any, that is associated with the job.
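Of these statuses, only Running and Unknown are non-terminal. As a minimal sketch of how a monitoring script might consume them, the Python loop below polls a status endpoint until the task reaches a terminal state. The URL, the JSON response shape, and the helper name are illustrative assumptions, not the actual Informatica Intelligent Cloud Services monitoring API.

    import json
    import time
    import urllib.request

    # Terminal statuses from the table above. Running is in flight, and
    # Unknown may resolve to another status on a later poll.
    TERMINAL_STATUSES = {"Success", "Failed", "Stopped"}

    def poll_spark_task(status_url, interval_seconds=30, timeout_seconds=3600):
        """Poll a hypothetical status endpoint until the Spark task is terminal."""
        deadline = time.monotonic() + timeout_seconds
        while time.monotonic() < deadline:
            with urllib.request.urlopen(status_url) as response:
                status = json.load(response)["status"]  # assumed response field
            if status in TERMINAL_STATUSES:
                return status
            # If the Secure Agent fails mid-run, the status can stay Running
            # indefinitely. Per the note above, cancel the job and run it again.
            time.sleep(interval_seconds)
        raise TimeoutError("Spark task did not reach a terminal status in time")

A timeout such as this is one way to surface the stuck-in-Running case that the Secure Agent note above describes.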
Each Spark application task is translated into Spark jobs, which are further broken down into stages; a sketch after the table below illustrates this breakdown. You can view the following details for each Spark job and stage:
Job Name
    Name of the Spark job or stage.

Duration
    Amount of time that the Spark job or stage ran.

Total Tasks
    Number of tasks the Spark job or stage attempted.

Failed Tasks
    Number of tasks that the Spark job or stage failed to complete.

Input Size / Records
    Size of the file and number of records input by the Spark job or stage.

Output Size / Records
    Size of the file and number of records output by the Spark job or stage.

Status
    Status of the Spark job or stage. The status can be one of the following values:
      • Running. The job or stage is running.
      • Success. The job or stage completed successfully.
      • Failed. The job or stage did not complete because it encountered errors.
      • Aborted. The job or stage did not complete because the user aborted the code task.

After you abort a code task, there might be some lag time before the Monitor service shows the status as Aborted.
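For background on this breakdown: in Spark, each action submits a job, and the job is divided into stages at shuffle boundaries, with each stage running many parallel tasks. The PySpark sketch below is a generic illustration of that hierarchy, not code from any particular code task; the grouping forces a shuffle, so the single collect() action submits one job with two stages.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("job-stage-demo").getOrCreate()

    df = spark.range(1_000_000)  # narrow work: runs in the first stage
    buckets = df.groupBy((F.col("id") % 10).alias("bucket")).count()  # shuffle boundary
    buckets.collect()  # action: submits one Spark job with two stages

    # Duration, task counts, and input/output size and records are tracked
    # per job and per stage, matching the properties in the table above.
    spark.stop()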
