Mappings

Parameter file location
When you use a parameter file, save the parameter file on a local machine or in a cloud-hosted directory based on the task type. You enter details about the parameter file on the Schedule tab when you create the task.

By default, Data Integration uses the following parameter file directory on the Secure Agent machine:

<Secure Agent installation directory>/apps/Data_Integration_Server/data/userparameters
When you use a parameter file in a synchronization task, save the parameter file in the default directory.
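As an illustration, the default directory can be exercised with a short shell sketch. The installation root below is a placeholder, and the $$name=value parameter syntax inside the file is an assumption for illustration, not something this section specifies:

```shell
#!/bin/sh
# AGENT_HOME stands in for <Secure Agent installation directory>;
# adjust it to your actual install root. (Placeholder value.)
AGENT_HOME="${AGENT_HOME:-/tmp/infaagent}"

# Default parameter file directory, per the documentation above.
PARAM_DIR="$AGENT_HOME/apps/Data_Integration_Server/data/userparameters"
mkdir -p "$PARAM_DIR"

# Write a minimal parameter file. The $$name=value convention is an
# assumed example; check your task's actual parameter names.
cat > "$PARAM_DIR/sync_task.param" <<'EOF'
$$SourceTable=CUSTOMERS
$$TargetTable=CUSTOMERS_STAGE
EOF

# List the directory to confirm the file is in place.
ls "$PARAM_DIR"
```

A synchronization task configured with the file name sync_task.param would then pick the file up from this default location.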
For mapping tasks, you can also save the parameter file in one of the following locations:

A local machine

Save the file in a location that is accessible by the Secure Agent. You enter the file name and directory on the Schedule tab when you create the task. Enter the absolute file path. Alternatively, enter a path relative to a $PM system variable, for example, $PMSessionLogDir/ParameterFiles.
You can use the following system variables:
  • $PMRootDir
  • $PMTargetFileDir
  • $PMSourceFileDir
  • $PMLookupFileDir
  • $PMCacheDir
  • $PMSessionLogDir
  • $PMExtProcDir
  • $PMTempDir
To find the configured path of a system variable, see the pmrdtm.cfg file located in the following directory:
<Secure Agent installation directory>\apps\Data_Integration_Server\55.0.<version>\ICS\main\bin\rdtm
You can also find the configured path of any variable except $PMRootDir in the Data Integration Server system configuration details in Administrator.
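A quick way to look up one of these paths from the command line is sketched below. The sample file stands in for pmrdtm.cfg, and the simple name=value layout is an assumption about the file's contents, not a documented format:

```shell
#!/bin/sh
# Sketch: extract the configured path of a $PM system variable.
# /tmp/pmrdtm.cfg is a stand-in for the real pmrdtm.cfg under
# the Secure Agent installation; the entries are assumed examples.
CFG=/tmp/pmrdtm.cfg
cat > "$CFG" <<'EOF'
$PMRootDir=/opt/infaagent/apps/Data_Integration_Server/data
$PMSessionLogDir=$PMRootDir/logs
EOF

# Print the configured value of $PMSessionLogDir.
grep '^\$PMSessionLogDir=' "$CFG" | cut -d= -f2
```

Against the sample entries above, this prints $PMRootDir/logs, showing that a variable's configured value may itself reference another system variable.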
If you do not enter a location, Data Integration uses the default parameter file directory.
A cloud platform

You can use a connection stored with Informatica Intelligent Cloud Services. You can use the following connection types, each with its own configuration requirements:

Amazon S3 V2
You can use a connection that was created with the following credentials:
  • Access Key
  • Secret Key
  • Region
The S3 bucket must be public.

Azure Data Lake Store Gen2
You can use a connection that was created with the following credentials:
  • Account Name
  • ClientID
  • Client Secret
  • Tenant ID
  • File System Name
  • Directory Path
The storage point must be public.

Google Storage V2
You can use a connection that was created with the following credentials:
  • Service Account ID
  • Service Account Key
  • Project ID
The storage bucket must be public.
Create the connection before you configure the task. You select the connection and file object to use on the Schedule tab when you create the task.

Data Integration displays the location of the parameter file and the value of each parameter in the job details after you run the task.