Table of Contents

  1. Preface
  2. Starting Data Archive
  3. System Configuration
  4. Database Users and Privileges
  5. Source Connections
  6. Target Connections
  7. Archive Store Configuration
  8. Datatype Mapping
  9. Database Optimization
  10. SAP Application Retirement
  11. z/OS Source Data Retirement
  12. Seamless Data Access
  13. Data Discovery Portal
  14. Security
  15. SSL Communication with Data Vault
  16. LDAP User Authentication
  17. Auditing
  18. Running Jobs from External Applications
  19. Salesforce Archiving Administrator Tasks
  20. Upgrading Oracle History Data
  21. Upgrading PeopleSoft History Data
  22. Data Archive Maintenance
  23. Appendix A: Datetime and Numeric Formatting
  24. Appendix B: Data Archive Connectivity

Administrator Guide

Informix Source Connections

Define archive source connection properties to connect to applications on Informix databases.
Depending on the connection type, you can configure the following source connection properties:
Host
IP address of the source application database server.
Port
Port of the source application database server.
Server Name
Unique identifier or system identifier for the source application database server.
You must enter both the server name and the database name. Use the following format:
<server_name>;databasename=<dbname>
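For example, with a hypothetical server named ol_informix1210 and a database named sales_db, you would enter:
ol_informix1210;databasename=sales_db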
ILM Repository Administrator User
Default administration database user for the source database server, such as SYSTEM.
The administration database user has DBA rights to the database, including the ability to run DDL and access system-level objects.
Admin Login Name
Login name for the administration database user. This user does not require special permissions as it is only used to connect to the source database. You can provide any user name, such as a database connection user or a read-only user.
Password
Password for the administration login name.
Application User Name
Database user that owns the application tables that you want to archive, such as APPS for Oracle applications.
The production application user is either the owner of the tables that you want to archive or partition, or the user that has full rights to the tables that you want to archive or partition. During the archive process, you can use the production application user to delete from the source. Alternatively, you can configure the source connection to use the staging user to delete data from the source.
Use the default database user for the application, or provide another value if you do not use the default user. If you provide a user that does not own or does not have full access to the application tables, the archive job fails because it cannot find the table or view.
The archive job uses the number of schemas in the source application to determine schema names. The archive job uses this attribute value for applications that have one schema, such as PeopleSoft and Siebel. The archive job uses the schema name from the mined application for applications that have multiple schemas, such as Oracle applications.
The ILM engine uses this field to generate the SQL SELECT statement to read data from the application tables. For example, if you enter SYSADMIN, the archive job generates the following SQL:
SELECT * from SYSADMIN.<OBJECT NAME>
Application Login Name
Login name that connects to the source database that contains the data you want to archive. This user does not require special permissions as it is only used for the database connection. You can provide any user name, such as a database connection user or a read-only user.
Password
Password for the application login name.
Staging User Name
Staging database user for the source database.
The staging database user stores interim tables for business rule validation and generates candidates during the archive and partitioning processes. If you enable staging for the archive source connection, the staging user also creates staging tables. The staging tables temporarily store a copy of the archived data before the archive job moves the data to the archive or history database. The staging user typically resides in the source database.
Staging Login Name
Login name for the staging database.
Password
Password for the staging login name.
Staging Tablespace
Tablespace in the staging database user that stores the interim and staging tables when you run an archive cycle. The archive job always stores the interim tables in this tablespace. The archive job stores staging tables if you enable staging.
Note: For Informix sources, Data Archive uses "exclusive" mode locking for staging and interim tables.
Use Copy To Staging
Determines whether the archive job temporarily stores data in a staging area on the archive source before it moves data to the target. The archive job includes heavy select and delete operations on the tables that are selected in the archive job. The operations may impact performance on any user operations on the tables. For example, if you run an archive job that archives purchase orders, users may see a performance impact when they create a purchase order. The main benefit of staging is to reduce the number of operations on the production tables.
If enabled, the archive job copies data to the staging area after the archive job evaluates the business rules. The archive job copies data that meets the business rules from the archive source to the staging area. Then, the job deletes data from the source and moves data from the staging area to the target system.
If disabled, the job extracts data directly from the archive source to the archive target. The extraction and deletion occur simultaneously. You may want to disable staging if you do not want to create additional storage space on the production database.
Enable staging if you want to use the "Delete Commit Interval" option when you create and run an archive and purge project for an Informix source.
For Informix sources, Data Archive uses "exclusive" mode locking for staging and interim tables.
JDBC Fetch Size
Controls the number of rows that the system fetches in one batch from one table if you use JDBC to connect to the source database.
Only change the default value if you receive an out-of-memory error when the archive job copies data to the destination. You may get the out-of-memory error if each row of the source table contains large objects, such as CLOB or BLOB data types. If you receive this error, reduce the fetch size.
Default is 1000.
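For example, if the archive job fails with an out-of-memory error on a table that stores BLOB data in each row, you might reduce the value from the default of 1000 to a smaller batch size, such as 100, and rerun the job.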
Database Link to Production (Restore Only)
Database link name that connects the history database to the production database. This attribute is required when you create the history database as a source and you use transaction or cycle restore. For restore scenarios, the history database is the source, and the production database is the target.
During the restore job, the system uses this attribute for the database link value. If you do not provide a value, then the system dynamically creates the database link from the source to the ILM repository. After the restore job completes, the system drops the database link.
If the system needs to create the database link, then the administration or application database users need to have create and drop database link permissions.
Parallel Entity Run
Determines the interim table names in the entity if you want to run the entity in parallel archive jobs. For example, you may extract data in parallel from two different organizations in the same human resource entity. The system creates interim tables in the staging database user. However, the tables need to be differentiated so the archive jobs know which interim tables are relevant for that job.
By default, the system creates interim table names with the same name in the EDM. If you enable this attribute, then the system creates the interim table names with the following syntax:
<table name>_<job ID>
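For example, for a hypothetical interim table named XA_123_INTERIM and a job with ID 45, the system creates:
XA_123_INTERIM_45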
If you enable this attribute and the staging database user does not have authorization to create tables, then the archive job fails.
SAP Host
Host of the SAP application that you want to retire.
Required for SAP application retirement only.
SAP Client
Client in which the user logs in. Note that all clients in the SAP application are retired.
Required for SAP application retirement only.
SAP System Number
System number in which the user logs in.
Required for SAP application retirement only.
SAP Language
Language in which the user logs in. Note that all languages in the SAP application are retired.
Required for SAP application retirement only.
SAP User
User that logs in to the SAP application. The user must be assigned to the ZINFA_RETIREMENT_PREPARATION role and must have RFC connection authorizations.
Required for SAP application retirement only.
SAP User Password
Password for the SAP user.
Required for SAP application retirement only.
FTP User
User name to connect to the FTP server.
Required for SAP application retirement if you set up an FTP connection between the SAP application server and Data Archive.
FTP Password
Password for the FTP user.
Required for SAP application retirement if you set up an FTP connection between the SAP application server and Data Archive.
FTP Host
Host name of the FTP server.
Required for SAP application retirement if you set up an FTP connection between the SAP application server and Data Archive.
FTP Port
Port number of the FTP server. Default port is 21.
When you specify a port number, enable that port number for FTP on the host machine.
Required for SAP application retirement if you set up an FTP connection between the SAP application server and Data Archive.
FTP Folder Location
Name of the FTP folder on the Data Archive server. For example, ERP\.
Required for SAP application retirement if you set up an FTP connection between the SAP application server and Data Archive.
Compressed
For SAP application retirement projects, if you want Data Archive to save the BCP files in a compressed file format, enable the check box. When you enable the check box, Data Archive saves the BCP files as .gz files in the staging area.
Delete Wait Time
Used by the Delete from Source step to determine the number of seconds to wait before deleting the next batch in a table and before deleting the next table.
Specify the number of seconds or leave blank.
For example, if you specify 300, Data Archive waits 5 minutes between deleting batches in a table and between deleting tables.
If you do not specify a number or if you enter 0, Data Archive deletes all rows continuously without a time delay.
Select First Max Rows
Determines the number of records, starting from the first record, to archive in the current archive job. Specify the number of rows or leave blank.
For example, if you specify 900, Data Archive copies the first 900 rows from the driving table to the interim table and archives these rows.
If you do not specify a number or if you enter 0, Data Archive copies all the rows from the driving table to the interim table and archives these rows.

INTERVAL Data Types

If the Informix data contains INTERVAL data types, you must use the Informix native driver to load data to Data Vault. Once you have uploaded the driver, run the following commands in the ILM repository:
update am_platforms set DRIVER_NAME = 'com.informix.jdbc.IfxDriver' where platform_id = 24;
update am_platforms set JDBC_URL = 'jdbc:informix-sqli://$HOST:$PORT/$DBNAME:INFORMIXSERVER=$SID;DELIMIDENT=Y' where platform_id = 24;
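Optionally, before you restart the server, you can verify the changes with a query such as the following, which reads back the same am_platforms columns:
select DRIVER_NAME, JDBC_URL from am_platforms where platform_id = 24;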
Then restart the Data Archive server.
