Table of Contents

  1. Preface
  2. Monitoring jobs
  3. Monitoring Data Integration jobs
  4. Data Integration job log files
  5. Monitoring Data Accelerator for Azure jobs
  6. Monitoring Data Profiling jobs
  7. Monitoring imports and exports
  8. Monitoring file transfer jobs
  9. Monitoring Mass Ingestion jobs
  10. Monitoring advanced clusters
  11. Monitoring source control logs

Monitor


Job results


The job results for each subtask that runs on an advanced cluster display the status of the job, a download link to the Spark execution plan, and an error message, if any.
The job results include the following properties:
Status
Job status. A job can have one of the following statuses:
  • Starting. The job is starting.
  • Running. The job is either queued or running.
  • Success. The job completed successfully.
  • Failed. The job did not complete because it encountered errors.
If the advanced cluster is not running when you run a job, the job waits for the cluster to start. During this time, the job status is Starting.
If the Secure Agent fails while the job is running, the job status continues to display Running. Cancel the job and run it again.
A queued job also displays the Running status. To find out whether a job is queued or running, check the session log.
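The status rules above imply a simple distinction between terminal statuses (Success, Failed) and non-terminal ones (Starting, Running). The following is a minimal, hypothetical sketch of how a monitoring script might interpret these status strings; the helper functions are illustrative only and are not part of any Informatica API.

```python
# Hypothetical helpers for interpreting job statuses reported by Monitor.
# The status names (Starting, Running, Success, Failed) come from the
# documentation above; everything else here is an assumption for
# illustration.

TERMINAL_STATUSES = {"Success", "Failed"}


def should_keep_polling(status: str) -> bool:
    """Return True while the job may still make progress.

    Note: a queued job also reports Running, so the Running status alone
    does not distinguish queued from executing -- check the session log
    for that. Similarly, a job waiting for the advanced cluster to start
    reports Starting.
    """
    return status not in TERMINAL_STATUSES


def is_successful(status: str) -> bool:
    """Return True only when the job completed without errors."""
    return status == "Success"
```

A script built on these helpers would poll until `should_keep_polling` returns False, then branch on `is_successful`. Keep in mind the caveat above: if the Secure Agent fails mid-run, the status stays Running indefinitely, so a real poller should also enforce a timeout.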
Execution Plan
Allows you to download the Spark execution plan, which shows the runtime Scala code that the advanced cluster uses to run the data logic in the mapping. You can use the Scala code to debug issues in the mapping.
Error Message
Error message, if any, that is associated with the job.