Table of Contents

  1. Preface
  2. Monitoring jobs
  3. Monitoring Data Integration jobs
  4. Data Integration job log files
  5. Monitoring Mass Ingestion jobs
  6. Monitoring Data Accelerator for Azure jobs
  7. Monitoring Data Profiling jobs
  8. Monitoring imports and exports
  9. Monitoring file transfer jobs
  10. Monitoring elastic clusters
  11. Monitoring source control logs

Monitor

Spark task details

For mapping tasks that are based on elastic mappings, the mapping logic is translated into Spark tasks that process the data in parallel. You can view details for each Spark task.
Each Spark task includes the following details:

Name
    Name of the Spark task.

Status
    Status of the Spark task. The Spark task can have one of the following statuses:
      • Running. The task is running.
      • Succeeded. The task completed successfully.
      • Failed. The task did not complete because it encountered errors.
      • Stopped. The task was stopped.
      • Unknown. The status of the task is unknown.
    If the Secure Agent fails while the elastic job is running, the status of the Spark tasks continues to display Running. You must cancel the job and run it again.

Start time
    Date and time when the Spark task started.

End time
    Date and time when the Spark task ended.

Error message
    Error encountered when running the Spark task, if any.

Actions
    Actions that you can take on the Spark task. You can perform the following actions:
      • Download the Spark driver and agent job logs.
      • View the advanced log location. The advanced log location is the log location that is configured in the elastic configuration for the elastic cluster. You can navigate to the advanced log location to view copies of the agent job log, the Spark driver log, and the Spark executor logs.
      • Download advanced logs. If you use a serverless runtime environment, you can download the agent job log, the Spark driver log, and the Spark executor logs.
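The status values above matter when you script your own monitoring around these jobs: three of the statuses are terminal, Running means keep waiting, and Unknown (or a Running status that never changes after a Secure Agent failure) calls for manual attention. The sketch below is purely illustrative — the function and its return values are hypothetical and are not part of any Informatica API; only the status names come from the table above.

```python
# Hypothetical sketch: how a monitoring script might react to each Spark
# task status listed in the table. Status names are from the documentation;
# everything else (function name, return values) is illustrative only.

TERMINAL_STATUSES = {"Succeeded", "Failed", "Stopped"}

def classify_status(status: str) -> str:
    """Map a Spark task status to the action a watcher should take."""
    if status in TERMINAL_STATUSES:
        return "done"
    if status == "Running":
        # Caveat from the docs: if the Secure Agent fails mid-run, the task
        # can remain stuck at Running. A watcher should apply a timeout and
        # flag a long-running task for cancellation rather than wait forever.
        return "keep-polling"
    # "Unknown", or any unexpected value, needs a human to look at it.
    return "investigate"

if __name__ == "__main__":
    for s in ("Running", "Succeeded", "Failed", "Stopped", "Unknown"):
        print(s, "->", classify_status(s))
```

A real watcher would combine this classification with a per-task timeout, since a stale Running status is indistinguishable from normal progress without one.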


Updated August 03, 2020