Configuring runtime options

On the Runtime Options page, configure optional runtime options for the task. Expand each area to see the options you can configure.
  1. Specify whether to run the task on a schedule or without a schedule. Choose one of the following options:
    • To run the task on a schedule, click Run on a schedule. Select the schedule you want to use, or click New to create a schedule.
    • To run the task without a schedule, click Do not run on a schedule.
  2. Configure email notification options for the task.
  3. Optionally, in the Advanced Options area, configure the advanced options that are displayed for your connection:
    Pre-Processing Commands
      Commands to run before the task. For an example, see the sketch after this list.
    Post-Processing Commands
      Commands to run after the task completes.
    Maximum Number of Log Files
      Number of session log files to retain. By default, Data Integration stores each type of log file for 10 runs before it overwrites the log files for new runs.
      If a dollar sign ($) is present in a custom session log file name, for example, MyLog_$CurrentTime, the file name is dynamic. If you customize the session log file name using a dynamic name, this property doesn't apply. To purge old log files, delete the files manually.
    Schema Change Handling
      Determines how Data Integration picks up changes to the object schema. Select one of the following options:
      • Asynchronous. Data Integration refreshes the schema when you update the mapping or mapping task, and after an upgrade.
      • Dynamic. Data Integration refreshes the schema every time the task runs.
      Default is Asynchronous.
    Dynamic Schema Handling
      Determines how Data Integration applies schema changes from upstream transformations to the target object. Available when the schema change handling is dynamic and the field mapping is automatic.
      For each target, select how Data Integration updates the target schema. The available options depend on the target connection.
      For more information, see Schema change handling or the help for the appropriate connector.
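    For example, a pre-processing command might stage a source file before the task runs. The following is a minimal sketch for a Secure Agent running on Linux; the paths and file names are hypothetical:

      cp /data/incoming/orders.csv /data/staging/orders.csv

    A post-processing command could similarly clean up after the run, for example by moving the processed file to an archive directory.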
  4. Optionally, to create a parameter file based on the parameters and default values specified in the mapping on which the task is based, click Download Parameter File Template in the parameter file location area.
    For more information about parameter file templates, see Mappings.
  5. Optionally, if the mapping task contains parameters, you can use parameter values from a parameter file. Choose one of the following options (for a sample parameter file, see the sketch after this list):
    • To use a parameter file on a local machine, select Local. Enter the following information:
      Parameter File Directory
        Path for the directory that contains the parameter file, excluding the parameter file name. The Secure Agent must be able to access the directory.
        You can use an absolute file path or a path relative to one of the following $PM system variables:
        • $PMRootDir
        • $PMSourceFileDir
        • $PMLookupFileDir
        • $PMCacheDir
        • $PMSessionLogDir
        • $PMExtProcDir
        • $PMTempDir
        By default, Data Integration uses the following parameter file directory:
        <Secure Agent installation directory>/apps/Data_Integration_Server/data/userparameters
      Parameter File Name
        Name of the file that contains the definitions and values of user-defined parameters used in the task. You can provide the file name or the relative path and file name in this field.
    • To use a cloud-hosted file, select Cloud Hosted. Enter the following information about the file:
      Connection
        Connection where the parameter file is stored. You can use the following connection types:
        • Amazon S3
        • Google Storage V2
        • Azure Data Lake Store Gen2
      Object
        Name of the file that contains the definitions and values of user-defined parameters used in the task.
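    As a reference, the following is a minimal sketch of a parameter file with sections. The project path and parameter names here are illustrative assumptions; the template you download in the previous step shows the exact format and parameter names for your task:

      #USE_SECTIONS
      [MyProject].[MyFolder].[MyMappingTask]
      $$srcConnection=OracleDev
      $$startDate=2024-01-01
      [Global]
      $$region=EMEA

    A value defined in a task-specific section takes precedence over a value for the same parameter in the [Global] section.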
  6. Choose whether to run the task in standard or verbose execution mode, if this option is visible.
    If you select verbose mode, the mapping generates additional data in the logs that you can use for troubleshooting. Select verbose execution mode only for troubleshooting purposes. Verbose execution mode impacts performance because of the amount of data it generates.
  7. Optionally, configure SQL ELT optimization. The following properties control SQL ELT optimization:
    SQL ELT Optimization Type
      Type of SQL ELT optimization. Use one of the following options:
      • None. The task processes all transformation logic for the task.
      • To Source. The task pushes as much of the transformation logic to the source database as possible.
      • To Target. The task pushes as much of the transformation logic to the target database as possible.
      • Full. The task pushes all of the transformation logic to the source and target databases. The source and target schema must be the same.
      • $$PushdownConfig. The task uses the SQL ELT optimization type specified in the user-defined parameter file for the task. When you use $$PushdownConfig, ensure that the user-defined parameter is configured in the parameter file (see the example after this list).
      When you use SQL ELT optimization, do not use the Error Log Type advanced session property.
      Default is Full.
      SQL ELT optimization functionality varies depending on the support available for the connector. For more information, see the help for the appropriate connector. SQL ELT optimization doesn't apply to mapping tasks that are based on mappings in SQL ELT mode because mappings in SQL ELT mode are automatically configured to push transformation logic to the cloud data warehouse.
    Optimization Context Type
      Provides context about the mapping configuration for SQL ELT optimization. If you select an option other than None, Data Integration constructs a single query for SQL ELT optimization by combining multiple targets in the mapping based on the target configurations. If you select None, the query is not optimized.
      If Data Integration cannot apply the selected context, it uses the default SQL ELT optimization behavior.
      Select one of the following options:
      • None
      • SCD Type 2 merge
      • Multi-insert
      Default is None.
      For more information, see the help for the appropriate connector.
    SQL ELT Optimization Fallback Option
      If full SQL ELT optimization is not available for the connection, choose how Data Integration handles SQL ELT optimization. Choose one of the following options:
      • Partial SQL ELT. Data Integration pushes as much transformation logic as possible to the source and target database. The task processes any transformation logic that it can't push to a database.
      • Non SQL ELT. The task runs with no SQL ELT optimization.
      • Fail Task. Data Integration fails the task.
      Default is Partial SQL ELT.
    Create Temporary View
      Allows the task to create temporary view objects in the database when it pushes the task to the database. Use when the task includes an SQL override in the Source Qualifier transformation or Lookup transformation.
      Default is enabled. Disabled when the SQL ELT optimization type is None.
    Create Temporary Sequence
      This option is not used.
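    If you select $$PushdownConfig, the parameter file for the task needs a matching entry. The following is a minimal sketch, assuming the sectioned parameter file format shown earlier and a hypothetical task path; the set of accepted values varies by connector, so confirm them in the help for the appropriate connector:

      [MyProject].[MyFolder].[MyMappingTask]
      $$PushdownConfig=Full

    You can then switch the optimization type between runs by editing the parameter file instead of editing the task.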
  8. Choose whether to enable cross-schema SQL ELT optimization.
    Cross-schema SQL ELT optimization doesn't apply to mapping tasks that are based on mappings in SQL ELT mode.
  9. If you want to run multiple instances of the task at the same time, enable simultaneous runs of the mapping task.
    Some mapping features might produce unexpected results in simultaneous task runs.
  10. Click Finish.
