REST API Reference


Create a code task

Use the CodeTask resource to create a code task. The response includes the code task ID that you can use in subsequent API calls.

POST request

Use the following URI to create a code task:

POST <server URL>/disnext/api/v1/CodeTask

Use the following fields in the POST request:
codeTaskName (String, Required)
Name of the code task.

runtimeEnvironmentName (String, Optional)
Runtime environment used for the code task. Either the runtimeEnvironmentName or the agentGroupId is required. If both are provided, the agentGroupId is used.

codeExecutionParameters (Object)
Parameters in the code task. In the request body, this object contains the fields that follow, as shown in the request example.

agentGroupId (String, Required)
Runtime environment that contains the Secure Agent used to run the code task. Either the runtimeEnvironmentName or the agentGroupId is required. If both are provided, the agentGroupId is used.

overrideTaskTimeout (Long, Optional)
Overrides the code task timeout value for this execution. A value of -1 signifies no timeout.

logLevel (String, Optional)
Log level for session logs, the agent job log, and the Spark driver and executor logs. Valid values are: none, terse, normal, verboseInitialization, or verboseData. The default value is normal.

sparkMainClass (String, Required)
Entry point of the Spark application. For example: org.apache.spark.examples.company.SparkExampleApp

sparkMainClassArgs (List<String>, Optional)
Ordered arguments sent to the Spark application main class. For example:
--appType SPARK_PI_FILES_JARS --classesToLoad com.company.test.SparkTest1Class

sparkPrimaryResource (String, Required)
Scala JAR file that contains the code task.

sparkJars (List<String>, Optional)
The directory and file name of the JAR file that is uploaded to the cluster and added to the Spark driver and executor classpaths.

sparkFiles (List<String>, Optional)
The directory and file name of the Spark file that is uploaded to the cluster and made available under the current working directory.

advancedCustomProperties (String, Optional)
Spark properties or other custom properties that Data Integration uses. For example:
"{\"spark.driver.memory\": \"2G\", \"spark.executor.instances\": \"4\"}"

POST response

The following variable is set from the response attributes:

CODE_TASK_ID (response value: codeTaskId)
Used in the start and view code task resources.

POST request example

Use the following sample as a reference to create a code task:

POST <server URL>/disnext/api/v1/CodeTask
Content-Type: application/json
Accept: application/json
IDS-SESSION-ID: {{IDS_SESSION_ID}}

{
  "codeTaskName": "CODETASK_API",
  "runtimeEnvironmentName": "{{RTE_NAME}}",
  "codeExecutionParameters": {
    "agentGroupId": "{{AGENT_GROUP_ID}}",
    "logLevel": "normal",
    "sparkMainClass": "org.apache.spark.examples.infa.sparkdirect.SparkDirectExampleApp",
    "sparkMainClassArgs": ["6"],
    "sparkPrimaryResource": "spark-examples_2.12-3.0.0.jar",
    "sparkJars": [],
    "sparkFiles": [],
    "advancedCustomProperties": "{\"spark.driver.memory\": \"1G\", \"spark.executor.memory\": \"1G\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_TASK_SLEEP\": \"600\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_APP_TYPE\": \"SPARK_PI\", \"spark.kubernetes.driverEnv.SPARK_DIRECT_KMS_ENCRYPTED_PROPS\": \"spark.sparkdirect.kms.prop\", \"spark.sparkdirect.kms.prop\": \"5pkOjS0HILDwSaW6eyxtiwB3g2TBYayjKLRFSSyxn5M=0p6v3eCvrtFkw6K78Buwal\", \"advanced.custom.property\": \"infa.spark.local=false\"}"
  }
}
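The same call can be scripted. The following Python sketch, using only the standard library, builds the headers and body and sends the POST request. The server URL, session ID, and field values passed in are assumptions that you would supply from your own login flow and environment:

```python
import json
import urllib.request

def build_create_code_task_request(server_url, session_id, body):
    """Assemble the URL, headers, and encoded payload for the CodeTask POST call."""
    url = f"{server_url}/disnext/api/v1/CodeTask"
    headers = {
        "Content-Type": "application/json",
        "Accept": "application/json",
        "IDS-SESSION-ID": session_id,
    }
    return url, headers, json.dumps(body).encode("utf-8")

def create_code_task(server_url, session_id, body):
    """Send the POST request and return the parsed JSON response."""
    url, headers, data = build_create_code_task_request(server_url, session_id, body)
    request = urllib.request.Request(url, data=data, headers=headers, method="POST")
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

A call such as `create_code_task("https://<your pod URL>", session_id, body)`, with `body` shaped like the JSON sample above, would return the parsed response, from which `codeTaskId` can be read.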

POST response example

A successful POST response returns a summary, the code task ID, and the code task name, similar to the following example:

{
  "summary": "Code Task created successfully",
  "codeTaskId": 3,
  "codeTaskName": "CODETASK_API"
}
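The CODE_TASK_ID variable described in the POST response section can be captured from the response body. A minimal Python sketch, using the sample response values shown here:

```python
import json

def extract_code_task_id(response_body):
    """Return the codeTaskId from a successful create-code-task response."""
    return json.loads(response_body)["codeTaskId"]

# Capture the ID from the sample response body above for later use.
sample = ('{"summary": "Code Task created successfully", '
          '"codeTaskId": 3, "codeTaskName": "CODETASK_API"}')
CODE_TASK_ID = extract_code_task_id(sample)  # -> 3
```

The captured value is then passed to the start and view code task resources in subsequent API calls.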
