Table of Contents

  1. Preface
  2. Starting Data Archive
  3. System Configuration
  4. Database Users and Privileges
  5. Source Connections
  6. Target Connections
  7. Archive Store Configuration
  8. Datatype Mapping
  9. Database Optimization
  10. SAP Application Retirement
  11. z/OS Source Data Retirement
  12. Seamless Data Access
  13. Data Discovery Portal
  14. Security
  15. SSL Communication with Data Vault
  16. LDAP User Authentication
  17. Auditing
  18. Running Jobs from External Applications
  19. Salesforce Archiving Administrator Tasks
  20. Upgrading Oracle History Data
  21. Upgrading PeopleSoft History Data
  22. Data Archive Maintenance
  23. Appendix A: Datetime and Numeric Formatting
  24. Appendix B: Data Archive Connectivity

Administrator Guide

Archive Store Properties

When you select an archive store for the Data Vault target connection, you must specify the connection properties for that archive store.
The archive store that you select determines which connection properties you must set.

AWS S3

If Data Archive and Data Vault are installed in a Windows 64-bit or Red Hat Enterprise Linux 7 environment, you can set up keyless access to the Amazon Web Services (AWS) S3 archive store.
To access an AWS S3 archive store without specifying the access keys, select AWS S3 as the Archive Store Type.
The following list describes the connection properties that you set for keyless access to the AWS S3 archive store:
Command
The AWS region URL that includes the bucket name. For example:
https://s3.amazonaws.com/testbucket/
Profile Name
The profile name of the AWS S3 connection.
To connect a Data Archive instance that runs on Amazon EC2 to AWS S3, add an Identity and Access Management (IAM) role. Set Profile Name to default.
When Data Archive uses a service other than Amazon EC2, connect to AWS S3 in one of the following ways:
  • Set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY properties as environment variables. Set Profile Name to default.
  • Create a credentials file using the AWS CLI. This file contains the user-defined profile and the values for the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY properties. See the example after the following note for a sketch of both options.
You do not have to enter the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY values in the ssa.ini file of the Data Vault server or plugin.
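For reference, the following is a minimal sketch of the two keyless options described above. The placeholder values and the file location are illustrative only; substitute the access key ID, secret access key, and profile name for your environment.
Environment variables, set in the environment of the Data Archive process (Linux shell example):
export AWS_ACCESS_KEY_ID=<your access key ID>
export AWS_SECRET_ACCESS_KEY=<your secret access key>
Credentials file, typically created with the aws configure command of the AWS CLI and stored at ~/.aws/credentials:
[default]
aws_access_key_id = <your access key ID>
aws_secret_access_key = <your secret access key>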

EMC Atmos

The following list describes the connection properties that you set for the EMC Atmos archive store:
POOL_ADDR
Pool address for the Atmos storage pool in the following format:
<Storage IP address>?<Path and filename of the PEA file>
The Pool Entry Authorization (PEA) file controls client application access to the Atmos storage. You must get the pool address from the Atmos system administrator.
The following example shows a pool address:
168.159.214.13?C:/SHARE/HW/CAS64/user1.pea

EMC Centera

The following list describes the connection properties that you set for the EMC Centera archive store:
POOL_ADDR
Pool address for the Centera storage pool in the following format:
<Storage IP address>?<Path and filename of the PEA file>
The Pool Entry Authorization (PEA) file controls client application access to the Centera storage. You must get the pool address from the Centera system administrator.
The following example shows a pool address:
168.159.214.13?C:/SHARE/HW/CAS64/user1.pea

Hitachi Content Archive Platform

The following list describes the properties that you set for the Hitachi Content Archive Platform archive store:
HCP Authentication Token
Authentication token for the namespace and tenant in the HCP server. Get the authentication token from the HCP administrator.
Command
Path to the Data Vault folder in HCP.
For example, the following path shows the location of the Data Vault archive folder named infa_archive in the REST space of the namespace ns0 and tenant ten1 in the HCP server hcp.archivas.com:
ns0.ten1.hcp.archivas.com/rest/infa_archive

Hadoop HDFS

The following list describes the properties that you set for the Hadoop HDFS archive store:
HDFS URL
Hostname or IP address for the HDFS server.
HDFS Port
Port number to connect to HDFS. The default HDFS port number is 54310.
Command
Path to the directory for the Data Vault in HDFS. Do not include the HDFS prefix or host name.
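For example, if the Data Vault directory resides at hdfs://<HDFS URL>:54310/data/infa_archive, enter only the path portion. The directory name /data/infa_archive is a hypothetical illustration:
/data/infa_archive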

Microsoft Azure Storage

The following list describes the connection properties that you set for the Microsoft Azure Storage archive store:
Azure Key
Key for the account that has read and write access to the buckets.
Command
The location of the Microsoft Azure container that hosts the sct files. For example:
https://idv.blob.core.windows.net/idvcontainer1/
In this example, https://idv.blob.core.windows.net/ is the URL to the Microsoft Azure Storage account and idvcontainer1 is the name of the container.

S3 Storage

You can configure a simple storage service (S3) provided by Amazon or EMC as the Data Vault archive store. To configure the connection, you must provide the Access Key ID and the Secret Access Key.
The following list describes the connection properties that you must set for the S3 archive store:
AWS Key
The AWS access key ID and secret access key, in the following format: AWS-key-ID:AWS-secret-key
For example:
AKIAI62XJDYYYXXXXXX:MFx94W+ZlGTdEXom+21BKBh4Y41y11x1x1x1x1
Command
The AWS region URL along with the bucket name. For example:
https://s3.amazonaws.com/testbucket/
In addition to the AWS Key and Command properties, you must install the libcurl API (libcurl.so.4.2.0) and the dependent libraries in the following directories:
Data Vault Service
On Windows, install the files in the root folder of the Data Vault Service directory.
On UNIX, install the files in the /<File Archive Service Directory>/odbc directory.
Data Vault Service Agent
On Windows or UNIX, install the files in the root folder of the Data Vault Service agent directory.
If the Data Vault Service agent is installed on multiple machines, install the files on each machine that hosts a Data Vault Service agent.
You must also configure the corresponding connection properties in the ssa.ini file of the Data Vault Server. For more information, see the Informatica Data Vault Administrator Guide.
