Data Integration
To get the details of a code task, use the following URI:

GET <server URL>/disnext/api/v1/CodeTask/<Code task ID>

The response includes the following fields:
Field | Type | Description
---|---|---
codeTaskName | String | Name of the code task.
codeTaskId | Numeric | The code task identifier.
agentGroupId | String | Runtime environment that contains the Secure Agent used to run the code task.
overrideTaskTimeout | String | Overrides the code task timeout value for this execution. A value of -1 signifies no timeout.
logLevel | String | Log level for the session logs, agent job log, and Spark driver and executor logs. Valid values are: none, terse, normal, verboseInitialization, or verboseData. The default value is normal.
sparkMainClass | String | Entry point of the Spark application. For example: org.apache.spark.examples.company.SparkExampleApp
sparkMainClassArgs | List<String> | Ordered arguments sent to the Spark application main class. For example: --appType SPARK_PI_FILES_JARS --classesToLoad com.company.test.SparkTest1Class
sparkPrimaryResource | String | Scala JAR file that contains the code task.
sparkJars | List<String> | The directory and file name of the JAR file that is uploaded to the cluster and added to the Spark driver and executor classpaths.
sparkFiles | List<String> | The directory and file name of the Spark file that is uploaded to the cluster and available under the current working directory.
advancedCustomProperties | String | Spark properties or other custom properties that Data Integration uses.
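You can also call this endpoint from code. The following is a minimal sketch using Python and the requests library; the server URL, code task ID, and session ID values are placeholder assumptions, and obtaining an IDS-SESSION-ID through a login call is not shown.

```python
import requests

# Minimal sketch of the GET request described above. The server URL, code
# task ID, and session ID are placeholder assumptions; obtain a real
# IDS-SESSION-ID from a prior login call (not shown here).
SERVER_URL = "https://example.informaticacloud.com"  # stands in for <server URL>
CODE_TASK_ID = 3                                     # stands in for <Code task ID>
SESSION_ID = "REPLACE_WITH_IDS_SESSION_ID"

response = requests.get(
    f"{SERVER_URL}/disnext/api/v1/CodeTask/{CODE_TASK_ID}",
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json",
        "IDS-SESSION-ID": SESSION_ID,
    },
)
response.raise_for_status()

code_task = response.json()
print(code_task["codeTaskName"], code_task["logLevel"])
```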
Sample request:

GET <server URL>/disnext/api/v1/CodeTask/<Code task ID>
Content-Type: application/json
Accept: application/json
IDS-SESSION-ID: {{IDS_SESSION_ID}}
{ "codeTaskName": "CODETASK_API", "codeTaskId": 3, "agentGroupId": "01000025000000000003", "overrideTaskTimeout": null, "logLevel": "normal", "sparkMainClass": "org.apache.spark.examples.infa.sparkdirect.SparkDirectExampleApp", "sparkMainClassArgs": ["6"], "sparkPrimaryResource": "spark-examples_2.12-3.0.0.jar", "sparkJars": [], "sparkFiles": [], "advancedCustomProperties": "{\"spark.driver.memory\": \"1G\", \"spark.executor.memory\": \"1G\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_TASK_SLEEP\": \"600\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_APP_TYPE\": \"SPARK_PI\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_KMS_ENCRYPTED_PROPS\": \"spark.sparkdirect.kms.prop\", \"spark.sparkdirect.kms.prop\": \"5pkOjS0HILDwSaW6eyxtiwB3g2TBYayjKLRFSSyxn5M=0p6v3eCvrtFkw6K78Buwal\", \"advanced.custom.property\": \"infa.spark.local=false\"}" }