The last step in the Data Archive Project is to configure a Data Archive run. The following page appears on your screen:
Internally, during Data Archive job execution, data is not copied directly from the data source to the data target. Instead, Data Archive uses a staging schema and interim (temporary) tables to validate the archivable data and the associated table structures during the Archive and Purge process.
The following steps are involved in the process:
Generate Candidates. Generates interim tables based on entities and constraints specified in the previous step.
Build Staging. Builds the table structure in the staging schema for archivable data.
Copy to Staging. Copies the archivable rows from the source to the staging tablespace.
Validate Destination (when project type demands data archive). Validates the table structure and data in the target, and generates the DML needed to modify the table structure and/or add rows of data.
Copy to Destination (when project type demands data archive). Copies data to the target. If your target is the Data Vault, this step moves data to the staging directory. To move the data from the staging directory to the Data Vault, run the Data Vault Loader job after you publish the archive project.
Delete from Source (when project type demands data purge). Deletes data from source.
If the source is Informix and you set the Delete Wait Time property, Data Archive waits for the specified time period after deleting each batch in a table and after deleting each table.
Purge Staging. Deletes interim tables from the staging schema.
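The staged flow above can be sketched in miniature with SQL against an in-memory database. This is an illustrative sketch only, not Data Archive's actual implementation: the table names (`src_orders`, `stg_orders`, `tgt_orders`), the `status = 'CLOSED'` candidate rule, and the use of SQLite are all assumptions made for the example.

```python
# Hypothetical sketch of the archive-and-purge pipeline using sqlite3.
# Table names and the candidate criterion are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source table with some rows eligible for archiving (closed orders).
cur.execute("CREATE TABLE src_orders (id INTEGER PRIMARY KEY, status TEXT)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, "CLOSED"), (2, "OPEN"), (3, "CLOSED")])

# Build Staging: create an interim table mirroring the source structure.
cur.execute("CREATE TABLE stg_orders AS SELECT * FROM src_orders WHERE 0")

# Generate Candidates / Copy to Staging: copy only the archivable rows.
cur.execute(
    "INSERT INTO stg_orders SELECT * FROM src_orders WHERE status = 'CLOSED'")

# Validate Destination: ensure the target structure exists before loading.
cur.execute(
    "CREATE TABLE IF NOT EXISTS tgt_orders (id INTEGER PRIMARY KEY, status TEXT)")

# Copy to Destination: load the staged rows into the target.
cur.execute("INSERT INTO tgt_orders SELECT * FROM stg_orders")

# Delete from Source: remove only rows confirmed present in the target.
cur.execute(
    "DELETE FROM src_orders WHERE id IN (SELECT id FROM tgt_orders)")

# Purge Staging: drop the interim table.
cur.execute("DROP TABLE stg_orders")

print([r[0] for r in cur.execute("SELECT id FROM src_orders")])  # [2]
print([r[0] for r in cur.execute("SELECT id FROM tgt_orders")])  # [1, 3]
```

Validating and loading through an interim table, rather than writing to the target directly, is what lets the job verify structure and data before any row is deleted from the source.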