Data Engineering Integration
- Data Engineering Integration 10.4.0
<RESTOperationsHubService_Host>:<RESTOperationsHubService_Port>/RestOperationsHub/services/v1/MappingService/MappingExecutionPlans('jobId')
Field | Type | Description
---|---|---
userName | String | Required. User name to connect to the domain. You can pass the input value as a header.
encryptedpassword | String | Required. Password for the user. Encrypt the password with the pmpasswd command line program. You can pass the input value as a header.
securityDomain | String | Optional. The security domain to which the domain user belongs. You can pass the input value as a header.
jobId | String | Required. Entity argument that contains the job ID for the mapping. You can pass the input value as a query.
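The request above can be sketched with Python's standard library. The host, port, job ID, and credential values below are placeholders, not real values; the header names and URL template follow the tables above.

```python
# Sketch of a GET request to the MappingExecutionPlans resource.
# Host, port, and credential values are placeholder assumptions.
import urllib.request

host = "resthub.example.com"   # <RESTOperationsHubService_Host> (placeholder)
port = 7443                    # <RESTOperationsHubService_Port> (placeholder)
job_id = "wf_123"              # job ID for the mapping (placeholder)

url = (f"https://{host}:{port}/RestOperationsHub/services/v1/"
       f"MappingService/MappingExecutionPlans('{job_id}')")

req = urllib.request.Request(url, method="GET")
# userName, encryptedpassword, and securityDomain are passed as headers.
req.add_header("userName", "Administrator")
req.add_header("encryptedpassword", "EncryptedFromPmpasswd==")  # pmpasswd output
req.add_header("securityDomain", "Native")

# response = urllib.request.urlopen(req)   # uncomment to send the request
```

Note that the job ID travels inside the URL segment `MappingExecutionPlans('jobId')`, while the credentials travel as request headers.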
Field | Type | Description
---|---|---
name | Integer | Name of the script.
content | String | For Spark, the Data Integration Service translates the mapping to a Scala program and an optional set of commands. The execution plan shows the commands and the Scala program code. For Blaze, the content consists of the session task, instances, and type.
depends | String | Tasks that the script depends on. Tasks include other scripts and Data Integration Service tasks, such as the Start task.
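Because each script names the task it depends on, a client can recover the execution order by following the `depends` chain. A minimal sketch, assuming a response shaped like the table above (the script names and values here are invented for illustration):

```python
# Hypothetical response fragment: field names follow the table above,
# but the sample values are invented.
scripts = [
    {"name": "Post_Task", "content": "...", "depends": "Main_Task"},
    {"name": "Main_Task", "content": "...", "depends": "Start"},
    {"name": "Start",     "content": "...", "depends": ""},
]

def execution_order(scripts):
    """Order scripts so each one appears after the task it depends on."""
    by_name = {s["name"]: s for s in scripts}
    ordered, seen = [], set()

    def visit(name):
        if name in seen or name not in by_name:
            return                          # already placed, or an external task
        seen.add(name)
        visit(by_name[name]["depends"])     # resolve the dependency first
        ordered.append(name)

    for s in scripts:
        visit(s["name"])
    return ordered

print(execution_order(scripts))  # ['Start', 'Main_Task', 'Post_Task']
```

Dependencies on tasks that are not scripts themselves (such as the Start task) simply terminate the chain.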
Field | Description
---|---
Session Task | The response can include three types of session tasks: pre-session, main session, and post-session. The main session task can contain submappings.
Type | Type of the session task, containing a set of segments and DAG vertices.
Instances | Transformation or object name.
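To illustrate how a client might read the Blaze fields above, here is a sketch that pulls the transformation instances out of the main session task. The dictionary shape, key names, and values are invented assumptions for illustration; only the Session Task / Type / Instances vocabulary comes from the table.

```python
# Hypothetical Blaze content structure; the shape and values are assumptions.
blaze_content = {
    "sessionTasks": [
        {"sessionTask": "pre session task",  "type": "CMD", "instances": []},
        {"sessionTask": "main session task", "type": "grid segment",
         "instances": ["src_Customer", "exp_Cleanse", "tgt_Customer"]},
        {"sessionTask": "post session task", "type": "CMD", "instances": []},
    ]
}

def main_session_instances(content):
    """Return the transformation or object names from the main session task."""
    for task in content["sessionTasks"]:
        if task["sessionTask"] == "main session task":
            return task["instances"]
    return []

print(main_session_instances(blaze_content))
```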