Data Integration
| Property | Description |
|---|---|
| Status | Status of the Spark task. If the Secure Agent fails while the job is running, the Spark task status continues to display Running. You must cancel the job and run it again. |
| Start time | Date and time when the Spark task started. |
| End time | Date and time when the Spark task ended. |
| Duration | Amount of time that the Spark task ran. |
| Memory Per Executor | Amount of memory that each Spark executor uses. |
| Cores Per Executor | Number of cores that each Spark executor uses. |
| Driver and Agent Job Logs | Select Download to download the Spark driver and agent job logs. |
| Advanced Log Location | Log location that is configured in the advanced configuration for the advanced cluster. You can navigate to the advanced log location to view and download the agent job log, Spark driver log, and Spark executor logs. |
| Error Message | Error message, if any, that is associated with the job. |
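For context, the Memory Per Executor and Cores Per Executor values presumably correspond to the standard Spark executor settings spark.executor.memory and spark.executor.cores. The following minimal sketch shows how the equivalent properties would be set on a Spark session created directly; the application name and the values "4g" and 2 are placeholders, and in an advanced cluster these settings come from the advanced configuration rather than from user code.

```python
# Minimal sketch: the equivalent Spark executor settings, set directly on a
# SparkSession. The values "4g" and 2 are placeholders, not product defaults.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("executor-sizing-example")       # hypothetical application name
    .config("spark.executor.memory", "4g")    # assumed mapping to Memory Per Executor
    .config("spark.executor.cores", "2")      # assumed mapping to Cores Per Executor
    .getOrCreate()
)

# Confirm the settings that the executors will use.
print(spark.conf.get("spark.executor.memory"))
print(spark.conf.get("spark.executor.cores"))
spark.stop()
```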
| Property | Description |
|---|---|
| Job Name | Name of the Spark job or stage. |
| Duration | Amount of time that the Spark job or stage ran. |
| Total Tasks | Number of tasks that the Spark job or stage attempted. |
| Failed Tasks | Number of tasks that the Spark job or stage failed to complete. |
| Input Size / Records | Size of the file and number of records that the Spark job or stage read. |
| Output Size / Records | Size of the file and number of records that the Spark job or stage wrote. |
| Status | Status of the Spark job or stage. After you abort a code task, there might be some lag time before the Monitor service shows the status as Aborted. |
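For context only, the job- and stage-level values in this table resemble the metrics that Spark itself exposes through its monitoring REST API. The sketch below reads comparable fields (name, status, task counts) for each job of a running Spark application; the driver host, port, and application ID are placeholders, and this is not how the Monitor service collects the values.

```python
# Minimal sketch: reading comparable per-job metrics from Spark's monitoring
# REST API on a running application. Host, port, and application ID are
# placeholders; the Monitor service gathers and displays these values for you.
import requests

BASE_URL = "http://spark-driver-host:4040/api/v1"   # hypothetical driver address

# Look up the first running application, then list its jobs.
apps = requests.get(f"{BASE_URL}/applications", timeout=10).json()
app_id = apps[0]["id"]

for job in requests.get(f"{BASE_URL}/applications/{app_id}/jobs", timeout=10).json():
    # Each entry carries the job name, status, and task counts, similar to the
    # Job Name, Status, Total Tasks, and Failed Tasks properties above.
    print(job["jobId"], job["name"], job["status"],
          job["numTasks"], job["numFailedTasks"])
```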