B2B Gateway
```
POST <server URL>/disnext/api/v1/CodeTask
```
| Field | Type | Required / Optional | Description |
|---|---|---|---|
| codeTaskName | String | Required | Name of the code task. |
| runtimeEnvironmentName | String | Optional | Runtime environment used for the code task. Either the runtimeEnvironmentName or the agentGroupId is required. If both are provided, the agentGroupId is used. |
| codeExecutionParameters | Object | Required | Parameters for the code task. |
| agentGroupId | String | Required | Runtime environment that contains the Secure Agent used to run the code task. Either the runtimeEnvironmentName or the agentGroupId is required. If both are provided, the agentGroupId is used. |
| overrideTaskTimeout | Long | Optional | Overrides the code task timeout value for this execution. A value of -1 signifies no timeout. |
| logLevel | String | Optional | Log level for session logs, agent job log, Spark driver, and executor logs. Valid values are: none, terse, normal, verboseInitialization, or verboseData. The default value is normal. |
| sparkMainClass | String | Required | Entry point of the Spark application. For example: org.apache.spark.examples.company.SparkExampleApp |
| sparkMainClassArgs | List<String> | Optional | Ordered arguments sent to the Spark application main class. For example: --appType SPARK_PI_FILES_JARS --classesToLoad com.company.test.SparkTest1Class |
| sparkPrimaryResource | String | Required | Scala JAR file that contains the code task. |
| sparkJars | List<String> | Optional | The directory and file name of the JAR file that is uploaded to the cluster and added to the Spark driver and executor classpaths. |
| sparkFiles | List<String> | Optional | The directory and file name of the Spark file that is uploaded to the cluster and made available under the current working directory. |
| advancedCustomProperties | String | Optional | Spark properties or other custom properties that Data Integration uses. For example: "{\"spark.driver.memory\": \"2G\", \"spark.executor.instances\": \"4\"}" |
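The either/or rule between runtimeEnvironmentName and agentGroupId can be sketched in a small payload builder. This is a minimal illustration under stated assumptions, not part of the API: `build_code_task_payload` is a hypothetical helper, and the precedence behavior it encodes follows the field descriptions above.

```python
def build_code_task_payload(code_task_name, spark_main_class, spark_primary_resource,
                            runtime_environment_name=None, agent_group_id=None,
                            log_level="normal"):
    """Hypothetical helper: assemble a CodeTask request body.

    Either runtime_environment_name or agent_group_id must be given;
    per the field table, agentGroupId is used when both are provided.
    """
    if runtime_environment_name is None and agent_group_id is None:
        raise ValueError("Either runtimeEnvironmentName or agentGroupId is required")

    params = {
        "logLevel": log_level,
        "sparkMainClass": spark_main_class,
        "sparkPrimaryResource": spark_primary_resource,
        "sparkJars": [],
        "sparkFiles": [],
    }
    # agentGroupId lives inside codeExecutionParameters, as in the sample request.
    if agent_group_id is not None:
        params["agentGroupId"] = agent_group_id

    payload = {
        "codeTaskName": code_task_name,
        "codeExecutionParameters": params,
    }
    if runtime_environment_name is not None:
        payload["runtimeEnvironmentName"] = runtime_environment_name
    return payload
```

The helper fails fast when neither location field is supplied, mirroring the validation the service itself would perform.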
| Name | Response Value | Note |
|---|---|---|
| CODE_TASK_ID | codeTaskId | Used in the start and view code task resources. |
Example request:

```
POST <server URL>/disnext/api/v1/CodeTask
Content-Type: application/json
Accept: application/json
IDS-SESSION-ID: {{IDS_SESSION_ID}}

{
  "codeTaskName": "CODETASK_API",
  "runtimeEnvironmentName": "{{RTE_NAME}}",
  "codeExecutionParameters": {
    "agentGroupId": "{{AGENT_GROUP_ID}}",
    "logLevel": "normal",
    "sparkMainClass": "org.apache.spark.examples.infa.sparkdirect.SparkDirectExampleApp",
    "sparkMainClassArgs": ["6"],
    "sparkPrimaryResource": "spark-examples_2.12-3.0.0.jar",
    "sparkJars": [],
    "sparkFiles": [],
    "advancedCustomProperties": "{\"spark.driver.memory\": \"1G\", \"spark.executor.memory\": \"1G\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_TASK_SLEEP\": \"600\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_APP_TYPE\": \"SPARK_PI\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_KMS_ENCRYPTED_PROPS\": \"spark.sparkdirect.kms.prop\", \"spark.sparkdirect.kms.prop\": \"5pkOjS0HILDwSaW6eyxtiwB3g2TBYayjKLRFSSyxn5M=0p6v3eCvrtFkw6K78Buwal\", \"advanced.custom.property\": \"infa.spark.local=false\"}"
  }
}
```
Example response:

```
{
  "summary": "Code Task created successfully",
  "codeTaskId": 3,
  "codeTaskName": "CODETASK_API"
}
```
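The request and response above can be driven from a short client sketch. This is a minimal sketch, assuming a session ID was already obtained from a prior login call; `create_code_task` and `extract_code_task_id` are hypothetical helpers, not part of the documented API surface.

```python
import json
import urllib.request


def extract_code_task_id(response_body):
    """Pull codeTaskId from the parsed response; it is used by the
    start and view code task resources."""
    return response_body["codeTaskId"]


def create_code_task(server_url, session_id, payload):
    """Hypothetical helper: POST the payload to the CodeTask resource
    with the headers shown in the example request above."""
    req = urllib.request.Request(
        f"{server_url}/disnext/api/v1/CodeTask",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
            "IDS-SESSION-ID": session_id,
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return extract_code_task_id(json.load(resp))
```

The returned codeTaskId would then be passed to the start and view code task endpoints.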