REST API Reference

job resource

Use the job resource to start a file ingestion job. You can also use the job resource to retrieve job status or job logs for a file ingestion task. Use the file ingestion REST API version 1 task resource to retrieve the ID and name of the task.

RUN Request

To start a file ingestion task job, use the following URI:
mftsaas/api/v1/job
Include the following information in the request:
Field      Type     Required   Description
taskId     String   Yes        File ingestion task ID.
taskName   String   -          File ingestion task name.
Use the following source directory and target directory keys for the specified connectors when you start a file ingestion job:
Connector         srcDir               tgtDir
local             sourceDirectory      targetDirectory
ftp, ftps, sftp   sourceDirectory      targetDirectory
gcs               sourceDirectory      gcsTargetLocation
hdfs              sourceDirectory      hdfsTargetLocation
adlsgen2          sourceDirectory      adlsGen2TargetLocation
s3                s3SourceLocation     s3TargetLocation
blob              blobSourceLocation   blobContainer
You can overwrite the following parameters using the job resource REST API:
Category   Parameter                        ID
General    Source Connection                sourceConnection
General    Target Connection                targetConnection
General    Parallel Batch                   parallelBatch
General    Log Level                        logLevel
Source     Source Directory                 sourceDirectory
Source     File Pattern                     filePattern
Source     Batch Size                       batchSize
Source     Include files from sub-folders   includeSubfolder
Source     Skip Duplicate Files             checkDuplicate
Source     Check File Stability             fileStability
Source     Stability Check Interval         stabilityCheckInterval
Target     Target Directory                 targetDirectory
You must pass the connection ID to overwrite the source and target connection parameters.
Use the following sample as a reference to start a file ingestion task job:
{ "taskId": "k1YHA1blhcBjbJvCIRQX2s", "taskName": "localtolocal_param2" }
Use the following sample request to overwrite the source option values that were passed in the user interface:
"variables": [{ "variable": "<string>", "value": "<string>" }]
In the following example, the parameter values that were passed in the user interface are overwritten with the corresponding values provided in the JSON POST request to the job resource REST API:
{ "taskId": "0efdVUEZeV2cB0quomeksd", "taskName": "localtolocal_param2", "parameters": { "category": [{ "id": "General", "parameter": [ { "id":"sourceConnection", "value":"AdvancedSFTPV2" }, { "id":"targetConnection", "value":"AdvancedSFTPV2" }, { "id":"parallelBatch", "value":"10" }, { "id":"logLevel", "value":"DEBUG" } ] },{ "id": "Source", "parameter": [{ "id": "sourceDirectory", "value": "/root/test1" }, { "id":"filePatternType", "value":"reg" }, { "id": "filePattern", "value": "*.txt" }, { "id": "batchSize", "value": "5" }, { "id":"includeSubfolder", "value":"true" }, { "id":"checkDuplicate", "value":"true" }, { "id":"fileStability", "value":"true" }, { "id":"stabilityCheckInterval", "value":"30" } ] }, { "id": "Target", "parameter": [{ "id": "targetDirectory", "value": "/root/test2" }] } ] } }
The following example shows how to override a file ingestion task with filename as a variable:
{ "taskId": "4m24k3UFWMkkqd55YDefIB", "taskName": "R41_Local_Local", "parameters": { "category": [ { "id": "Source", "parameter": [ { "id": "sourceDirectory", "value": "/${Parentfolder}" }, { "id": "filePickupFilePath", "value": "${filename}" }, { "id": "batchSize", "value": "5" } ] }, { "id": "Target", "parameter": [ { "id": "targetDirectory", "value": "/${Parentfolder}/Target" } ] } ] }, "variables": [ { "variable": "Parentfolder", "value": "root/Arun" }, { "variable": "filename", "value": "filepath.txt" } ] }
The following example shows how to override a file ingestion task with filelist as a variable:
{ "taskId": "4m24k3UFWMkkqd55YDefIB", "taskName": "R41_Local_Local", "parameters": { "category": [ { "id": "Source", "parameter": [ { "id": "sourceDirectory", "value": "/${Parentfolder}" }, { "id": "filePickupFileList", "value": "${filelist}" }, { "id": "batchSize", "value": "5" } ] }, { "id": "Target", "parameter": [ { "id": "targetDirectory", "value": "/${Parentfolder}/Target" } ] } ] }, "variables": [ { "variable": "Parentfolder", "value": "root/Arun" }, { "variable": "filelist", "value": "File1.txt,File2.txt,File3.txt,File4.txt" } ] }

RUN Response

If successful, file ingestion returns the run ID for the job. Use the run ID to monitor the job status and request log files for the job.
If unsuccessful, the response includes a reason for the failure.
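Continuing the earlier Python sketch, the run ID can be read from the RUN response. The field name used below is an assumption; verify it against the actual response body returned for your organization.

# Assumption: the run ID is returned in a JSON field named "runId".
# Check the actual RUN response body and adjust the field name if needed.
run_id = response.json().get("runId")
print("Run ID:", run_id)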

GET Status Request

To retrieve the status of a specific file ingestion task job, use the following URI:
mftsaas/api/v1/job/<runId>/status
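Continuing the same Python sketch, a status request can be issued with the run ID returned by the RUN request; the base URL and session header are the same assumptions as before.

# Retrieve the status of the job identified by run_id.
status_response = requests.get(
    f"{BASE_URL}/mftsaas/api/v1/job/{run_id}/status",
    headers=headers,
)
status_response.raise_for_status()
print(status_response.json())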

GET Status Response

If successful, file ingestion returns the job status and the job details, including a list of files and the status and details of each file.
If unsuccessful, the response includes a reason for the failure.

GET Job Logs Request

To retrieve the log files for a specific file ingestion task job, use the following URI:
mftsaas/api/v1/job/<runId>/logs
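The log request follows the same pattern; the sketch below reuses the assumed base URL and session header from the earlier examples and writes the response body to a local file.

# Retrieve the log files for the job identified by run_id and save them locally.
logs_response = requests.get(
    f"{BASE_URL}/mftsaas/api/v1/job/{run_id}/logs",
    headers=headers,
)
logs_response.raise_for_status()

with open(f"job_{run_id}_logs.txt", "wb") as log_file:
    log_file.write(logs_response.content)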

GET Job Logs Response

If successful, file ingestion returns the log files for the job.
If unsuccessful, the response includes a reason for the failure.
