Data Integration-Free and PayGo
- All Products
| Field | Description |
|---|---|
| Pre-Processing Commands | Commands to run before the task. |
| Post-Processing Commands | Commands to run after the task completes. |
| Maximum Number of Log Files | Number of session log files to retain. By default, Data Integration stores each type of log file for 10 runs before it overwrites the log files for new runs.<br>If a dollar sign ($) is present in a custom session log file name, for example, MyLog_$CurrentTime, the file name is dynamic. If you customize the session log file name using a dynamic name, this property doesn't apply. To purge old log files, delete the files manually. |
| Schema Change Handling | Determines how Data Integration picks up changes to the object schema. Select one of the following options:<br>Default is Asynchronous. |
| Dynamic Schema Handling | Determines how Data Integration applies schema changes from upstream transformations to the target object. Available when the schema change handling is dynamic and the field mapping is automatic.<br>For each target, select how Data Integration updates the target schema. The options available are based on the target connection.<br>For more information, see Schema change handling or the help for the appropriate connector. |
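Because dynamically named session logs (for example, MyLog_$CurrentTime) are never purged automatically, old files must be deleted by hand. A minimal sketch of such a cleanup, assuming logs sit in one directory and share a name prefix (the `MyLog_` prefix and the retention count of 10 are illustrative, not product defaults):

```python
import glob
import os


def logs_to_purge(log_paths, keep=10):
    """Return the log files to delete, keeping the `keep` most recent.

    Sorts paths by modification time, newest first; everything past
    the first `keep` entries is a purge candidate.
    """
    newest_first = sorted(log_paths, key=os.path.getmtime, reverse=True)
    return newest_first[keep:]


def purge_dynamic_logs(log_dir, pattern="MyLog_*", keep=10):
    """Delete all but the `keep` newest logs matching `pattern` in `log_dir`."""
    for stale in logs_to_purge(glob.glob(os.path.join(log_dir, pattern)), keep):
        os.remove(stale)
```

Run it from a scheduled job on the Secure Agent host, pointed at the session log directory.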
| Field | Description |
|---|---|
| Parameter File Directory | Path for the directory that contains the parameter file, excluding the parameter file name. The Secure Agent must be able to access the directory.<br>You can use an absolute file path or a path relative to one of the following $PM system variables:<br>By default, Data Integration uses the following parameter file directory: |
| Parameter File Name | Name of the file that contains the definitions and values of user-defined parameters used in the task. You can provide the file name or the relative path and file name in this field. |
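For orientation, a parameter file is a plain text file of name-value pairs, optionally grouped into sections that scope parameters to a specific task. A hedged sketch of what one might contain (the section, connection, and parameter names here are illustrative only):

```
#USE_SECTIONS
[Global]
$$sourceConnection=MySourceConn
[MyProject].[MyFolder].[mt_load_orders]
$$startDate=2024-01-01
```

Parameters in a task-specific section override same-named parameters in the Global section for that task; check the parameter file documentation for the exact section and precedence rules your version supports.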
| Field | Description |
|---|---|
| Connection | Connection where the parameter file is stored. You can use the following connection types: |
| Object | Name of the file that contains the definitions and values of user-defined parameters used in the task. |
| Property | Description |
|---|---|
| SQL ELT Optimization Type | Type of SQL ELT optimization. Use one of the following options:<br>When you use SQL ELT optimization, do not use the Error Log Type advanced session property.<br>Default is Full.<br>SQL ELT optimization functionality varies depending on the support available for the connector. For more information, see the help for the appropriate connector. SQL ELT optimization doesn't apply to mapping tasks that are based on mappings in SQL ELT mode because those mappings are automatically configured to push transformation logic to the cloud data warehouse. |
| Optimization Context Type | Provides context about the mapping configuration for SQL ELT optimization. If you select an option other than None, Data Integration constructs a single query for SQL ELT optimization by combining multiple targets in the mapping based on the target configurations. If you select None, the query is not optimized.<br>If Data Integration cannot apply the selected context, it uses the default SQL ELT optimization behavior.<br>Select one of the following options:<br>Default is None.<br>For more information, see the help for the appropriate connector. |
| SQL ELT Optimization Fallback Option | Determines how Data Integration handles SQL ELT optimization when full SQL ELT optimization is not available for the connection. Choose one of the following options:<br>Default is disabled. |
| Create Temporary View | Allows the task to create temporary view objects in the database when it pushes the task to the database.<br>Use when the task includes an SQL override in the Source Qualifier transformation or Lookup transformation.<br>Default is enabled.<br>Disabled when the SQL ELT optimization type is None. |
| Create Temporary Sequence | This option is not used. |
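To see why temporary views matter for SQL ELT optimization, consider a source with an SQL override: the override can't be embedded directly into a pushed-down statement, so it is wrapped in a view and the whole load runs as one INSERT ... SELECT inside the database. A conceptual sketch, assuming a single source and target (the view name and SQL shape are illustrative, not the statements Data Integration actually generates):

```python
def pushdown_statements(source_sql_override, target_table, view_name="T_VIEW_1"):
    """Illustrate full pushdown with a temporary view: wrap the source
    SQL override in a view, load the target from it in-database, then
    drop the view. Purely a teaching sketch, not product behavior.
    """
    return [
        f"CREATE VIEW {view_name} AS {source_sql_override}",
        f"INSERT INTO {target_table} SELECT * FROM {view_name}",
        f"DROP VIEW {view_name}",
    ]
```

With Create Temporary View disabled, a task that depends on an SQL override has no way to express the override in-database, which is why pushdown can fall back to partial or no optimization.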