Table of Contents

  1. Preface
  2. Starting Data Archive
  3. System Configuration
  4. Database Users and Privileges
  5. Source Connections
  6. Target Connections
  7. Archive Store Configuration
  8. Datatype Mapping
  9. Database Optimization
  10. SAP Application Retirement
  11. z/OS Source Data Retirement
  12. Seamless Data Access
  13. Data Discovery Portal
  14. Security
  15. LDAP User Authentication
  16. Auditing
  17. Running Jobs from External Applications
  18. Upgrading Oracle History Data
  19. Upgrading PeopleSoft History Data
  20. Data Archive Maintenance
  21. Storage Classifications
  22. Appendix A: Datetime and Numeric Formatting
  23. Appendix B: Data Archive Connectivity

Administrator Guide

Step 1. Install the libhdfs API Files

The libhdfs API provides access to files in a Hadoop file system. Data Archive requires the libhdfs API files to access an archive in HDFS. The Hadoop installation includes the libhdfs API.
The Data Vault Service requires the following libhdfs files:
  • commons-logging-api-1.0.4.jar
  • hadoop-0.20.2-core.jar
  • libhdfs.so
To install the libhdfs API, copy the libhdfs files to each machine where the following Data Vault Service components are installed:

Data Vault Service
On Windows, copy the files to the root of the Data Vault Service directory. On UNIX, copy the files to <Data Vault Service Directory>/odbc.

Data Vault Agent
On Windows or UNIX, copy the files to the root of the Data Vault Agent directory. If the Data Vault Agent is installed on multiple machines, copy the libhdfs API files to every machine that hosts a Data Vault Agent.

Data Vault Service plug-in for Data Archive
On Windows, copy the files to <Data Archive Directory>\webapp\file_archive. On UNIX, copy the files to <Data Archive Directory>/webapp/file_archive/odbc.
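On UNIX, the copy step above can be sketched as a short shell script. All directory paths below are placeholders created under a temporary directory for illustration; substitute your actual Hadoop, Data Vault Service, and Data Archive installation directories.

```shell
# Sketch of the UNIX copy step. The paths are placeholders, not the
# real installation locations; this script builds a stand-in layout
# under a temporary directory purely to illustrate the copy targets.
BASE=$(mktemp -d)
HADOOP_LIB="$BASE/hadoop/lib"    # assumed source of the libhdfs files
DV_SERVICE="$BASE/datavault"     # stand-in for the Data Vault Service directory
DA_HOME="$BASE/dataarchive"      # stand-in for the Data Archive directory

mkdir -p "$HADOOP_LIB" "$DV_SERVICE/odbc" "$DA_HOME/webapp/file_archive/odbc"

# Copy the three required libhdfs files to each component's target
# directory (stand-in files are created here for demonstration).
for f in commons-logging-api-1.0.4.jar hadoop-0.20.2-core.jar libhdfs.so; do
  touch "$HADOOP_LIB/$f"
  cp "$HADOOP_LIB/$f" "$DV_SERVICE/odbc/"                  # Data Vault Service (UNIX)
  cp "$HADOOP_LIB/$f" "$DA_HOME/webapp/file_archive/odbc/" # Data Vault plug-in (UNIX)
done

ls "$DV_SERVICE/odbc"
```

On a real system you would replace `$HADOOP_LIB`, `$DV_SERVICE`, and `$DA_HOME` with the actual installation paths and omit the `mkdir`/`touch` scaffolding.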
After you copy the files, verify that the CLASSPATH environment variable includes the locations of the libhdfs files.
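A quick way to check the CLASSPATH is to test whether each required jar name appears in it. The CLASSPATH value below is a hypothetical example; check against your environment's actual value.

```shell
# Sketch: confirm each required libhdfs jar appears in CLASSPATH.
# This CLASSPATH value is an illustrative example, not a real one.
CLASSPATH="/opt/datavault/odbc/commons-logging-api-1.0.4.jar:/opt/datavault/odbc/hadoop-0.20.2-core.jar"

for f in commons-logging-api-1.0.4.jar hadoop-0.20.2-core.jar; do
  case ":$CLASSPATH:" in
    *"$f"*) echo "$f: found in CLASSPATH" ;;
    *)      echo "$f: MISSING from CLASSPATH" ;;
  esac
done
```

In practice, drop the assignment line and run the loop against the CLASSPATH already set in the Data Vault Service environment.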
