Table of Contents

  1. Preface
  2. Starting Data Archive
  3. System Configuration
  4. Database Users and Privileges
  5. Source Connections
  6. Target Connections
  7. Archive Store Configuration
  8. Datatype Mapping
  9. Database Optimization
  10. SAP Application Retirement
  11. z/OS Source Data Retirement
  12. Seamless Data Access
  13. Data Discovery Portal
  14. Security
  15. LDAP User Authentication
  16. Auditing
  17. Running Jobs from External Applications
  18. Upgrading Oracle History Data
  19. Upgrading PeopleSoft History Data
  20. Data Archive Maintenance
  21. Storage Classifications
  22. Appendix A: Datetime and Numeric Formatting
  23. Appendix B: Data Archive Connectivity

Administrator Guide

Step 3. Create the Target Connection

In Data Archive, create a target connection to HDFS and set the archive store type to Hadoop HDFS.
The following list describes the properties that you need to set for the target connection:
Staging Directory
Directory in which the Data Vault Loader temporarily stores data as it completes the archive process. Enter the absolute path for the directory.
The directory must be accessible to the ILM application server.
For SAP application retirement, based on the type of connection between the SAP application server and staging area on the Data Archive server, enter one of the following paths:
  • If the connection is through FTP, enter the absolute path for the FTP folder on the Data Archive server.
  • If the connection is through an NFS mount point, enter the absolute path of the staging folder on the SAP application server.
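As a quick sanity check before you save the connection, a small script can verify that a staging directory value meets the requirements above: an absolute path that exists and is writable. This is an illustrative sketch only; the function name is invented and the check runs on whatever machine executes it, so run it on the ILM application server host.

```python
import os

def validate_staging_directory(path):
    """Check that a staging directory value is an absolute path
    that exists and is writable, as the Data Vault Loader requires."""
    if not os.path.isabs(path):
        return False, "path must be absolute"
    if not os.path.isdir(path):
        return False, "directory does not exist or is not accessible"
    if not os.access(path, os.W_OK):
        return False, "directory is not writable"
    return True, "ok"
```

For an NFS-mounted SAP staging folder, run the same check against the mount point as seen from the SAP application server.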
Number of Rows Per File
Maximum number of rows that the Data Vault Loader stores in a file in the Data Vault. Default is 1 million rows.
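This setting determines how many files the Data Vault Loader produces for a table: the file count is a ceiling division of the table's row count by the rows-per-file value. A minimal sketch (the function name is invented for illustration):

```python
def files_needed(row_count, rows_per_file=1_000_000):
    """Number of Data Vault files produced for a table at a given
    Number of Rows Per File setting (ceiling division)."""
    if row_count <= 0:
        return 0
    # Negated floor division gives ceiling division without math.ceil.
    return -(-row_count // rows_per_file)
```

For example, a 2.5-million-row table at the default setting produces 3 files; lowering the setting produces more, smaller files.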
Data Vault Data Directory
Directory in which the Data Vault Loader creates the archive. Enter the absolute path for the directory. You can set up the directory on a local storage or use Network File System (NFS) to connect to a directory on any of the following types of storage devices:
  • Direct-attached storage (DAS)
  • Network-attached storage (NAS)
  • Storage area network (SAN)
You can specify a different directory for each Data Vault target connection. The directory must be accessible to the ILM application server and the Data Vault Service.
If you select an archive store in the Archive Store Type property, the Data Vault Loader archives data to the archive store, not to the location specified in the Data Vault Data Directory property. Instead, the Data Vault Loader uses the Data Vault data directory as a staging location when it writes data to the archive store.
Data Vault Archive Folder Name
Name of the folder in the Data Vault in which to store the archived data. The Data Vault folder corresponds to the database in the archive source.
Data Vault Host
Host name or IP address of the machine that hosts the Data Vault Service.
Data Vault Port
Port number used by the ssasql command line program and other clients such as the Data Vault SQL Tool and ODBC applications to connect to the Data Vault. Default is 8500.
Data Vault Administration Port
Port number used by the Data Vault Agent and the Data Vault Administration Tool to connect to the Data Vault. Default is 8600.
Data Vault User
Name of the administrator user account to connect to the Data Vault Service.
You can use the default administrator user account created during the Data Vault installation. The user name for the default administrator user account is dba.
Data Vault User Password
Password for the administrator user account.
Confirm Password
Verification of the password for the administrator user account.
Add-On URL
URL for the Data Vault Service for External Attachments component. The Data Vault Service for External Attachments converts external attachments from the archived format to the source format. Required to restore encrypted attachments from the Data Vault to the source database.
Maintain Imported Schema Name
Use schema names from the source data imported through the Enterprise Data Manager.
By default, this option is enabled. The Data Vault Loader creates a schema structure in the Data Vault folder that corresponds to the source schema structure imported through the Enterprise Data Manager. It adds the transactional tables to the schemas within the structure. The Data Vault Loader also creates a dbo schema and adds the metadata tables to the dbo schema.
The imported schema structure is based on the data source. If source connections contain similar structures but use different schema names, you must import the source schema structure for each source connection. For example, you import the schema structure from a development instance. You export metadata from the development instance and import the metadata into the production instance. If the schema names are different in development and production, you must import the schema structure from the production instance. You cannot use the schema structure imported from the development instance.
If this option is not enabled, the Data Vault Loader creates the dbo schema in the Data Vault folder. The Data Vault Loader adds all transactional tables for all schemas and all metadata tables to the dbo schema.
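The effect of the option can be summarized in a small sketch that shows which Data Vault schema a table lands in under each setting. The function and the example schema name "HR" are invented for illustration, not values from an actual source:

```python
def target_schema(source_schema, is_metadata_table, maintain_imported_schema):
    """Return the Data Vault schema a table lands in.

    Metadata tables always go to the dbo schema. Transactional tables
    keep their imported source schema only when the option is enabled.
    """
    if is_metadata_table or not maintain_imported_schema:
        return "dbo"
    return source_schema
```

For example, a transactional table imported from a hypothetical HR schema stays in HR when the option is enabled, but lands in dbo when it is disabled.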
Archive Store Type
Storage platform for the Data Vault. Select the Hadoop HDFS archive store.
HDFS URL
Host name or IP address of the HDFS server.
HDFS Port
Port number to connect to HDFS. The default HDFS port number is 54310.
Command
Path to the directory for the Data Vault in HDFS. Do not include the HDFS prefix or host name.
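Conceptually, the HDFS URL, HDFS Port, and this path combine into a full hdfs:// location, which is why the path itself must not repeat the prefix or host name. A hedged sketch of how the pieces fit together (the function name and the host, port, and path values are placeholders, not values from this guide):

```python
def hdfs_location(host, port, data_path):
    """Combine the HDFS URL, HDFS Port, and Data Vault directory path
    into a full hdfs:// URI, for illustration only."""
    if "://" in data_path:
        raise ValueError("do not include the HDFS prefix or host name in the path")
    if not data_path.startswith("/"):
        raise ValueError("enter an absolute path")
    return f"hdfs://{host}:{port}{data_path}"
```

For example, a host of namenode.example.com, the default port 54310, and a path of /data_archive/vault resolve to hdfs://namenode.example.com:54310/data_archive/vault.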
