When it submits a job to the cluster, the Secure Agent generates an execution plan that divides the data logic in the mapping into multiple Spark tasks. The cluster launches Spark drivers and Spark executors to process the tasks in parallel.
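Conceptually, the plan partitions the mapping's data so that independent workers can each process a slice at the same time. The sketch below is a plain-Python illustration of that split-then-parallelize idea only; it does not use the Secure Agent or Spark APIs, and the partition count and transform function are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical input rows; a real job reads from the mapping's sources.
rows = list(range(100))

def transform(row):
    # Stand-in for the data logic in the mapping, applied to one row.
    return row * 2

# "Execution plan": divide the data into partitions, one per task.
num_tasks = 4
partitions = [rows[i::num_tasks] for i in range(num_tasks)]

def run_task(partition):
    # Each worker (playing the role of a Spark executor) processes
    # its partition independently of the others.
    return [transform(r) for r in partition]

# Process the tasks in parallel.
with ThreadPoolExecutor(max_workers=num_tasks) as pool:
    results = list(pool.map(run_task, partitions))

total = sum(len(p) for p in results)
print(total)  # all 100 rows processed across 4 parallel tasks
```

In the real system this division of labor happens inside Spark: the driver schedules the tasks and the executors run them on cluster nodes.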
As developers run additional jobs, the cluster provisions and deprovisions resources to adapt to the size and number of jobs. For example, the cluster can provision additional cluster nodes and cluster storage during processing bursts.
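A toy way to picture that elasticity is a rule that sizes the cluster to the current backlog. The snippet below is illustrative only; the platform manages scaling policies itself, and the tasks-per-node ratio and node bounds here are invented for the example.

```python
# Hypothetical capacity: how many Spark tasks one cluster node can run.
TASKS_PER_NODE = 8

def desired_nodes(pending_tasks, min_nodes=1, max_nodes=10):
    # Provision enough nodes for the current burst, within the
    # cluster's configured minimum and maximum.
    needed = -(-pending_tasks // TASKS_PER_NODE)  # ceiling division
    return max(min_nodes, min(needed, max_nodes))

print(desired_nodes(3))    # light load: stays at the minimum -> 1
print(desired_nodes(40))   # processing burst: scales out -> 5
print(desired_nodes(200))  # heavy burst: capped at max_nodes -> 10
```

When the burst subsides, the same rule shrinks the node count back toward the minimum, which is the deprovisioning half of the behavior described above.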
Each job generates a session log, a Spark driver log, Spark executor logs, and an agent job log.