Table of Contents

  1. Preface
  2. Starting Data Archive
  3. System Configuration
  4. Database Users and Privileges
  5. Source Connections
  6. Target Connections
  7. Archive Store Configuration
  8. Datatype Mapping
  9. Database Optimization
  10. SAP Application Retirement
  11. z/OS Source Data Retirement
  12. Seamless Data Access
  13. Data Discovery Portal
  14. Security
  15. SSL Communication with Data Vault
  16. LDAP User Authentication
  17. Auditing
  18. Running Jobs from External Applications
  19. Salesforce Archiving Administrator Tasks
  20. Upgrading Oracle History Data
  21. Upgrading PeopleSoft History Data
  22. Data Archive Maintenance
  23. Storage Classifications
  24. Appendix A: Datetime and Numeric Formatting
  25. Appendix B: Data Archive Connectivity

Administrator Guide

Hadoop Distributed File System Configuration

You can use a Hadoop Distributed File System (HDFS) that runs on Linux as an archive store in Data Archive.
To create an archive in HDFS, complete the following tasks:
  1. Install the libhdfs API files.
  2. Create a directory in HDFS.
  3. Create a Data Vault target connection.
  4. Run the Create Archive Folder job.
  5. Copy the connection to other Data Vault Service configuration files.
  6. Validate the connection to HDFS.
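Step 2 above, creating a directory in HDFS, can be done with the standard Hadoop filesystem CLI. The sketch below is a minimal, hedged example: the path /user/dataarchive/archive and the permission mode are hypothetical placeholders, not values prescribed by Data Archive, and the script skips gracefully on hosts where the Hadoop client is not installed.

```shell
# Sketch: create a directory in HDFS to use as the archive store.
# The path below is a hypothetical example; choose a path that fits
# your own HDFS layout and point the Data Vault target connection at it.
ARCHIVE_DIR=/user/dataarchive/archive

# 'hdfs dfs' is the standard Hadoop filesystem CLI; run this only on a
# host with the Hadoop client installed and configured for your cluster.
if command -v hdfs >/dev/null 2>&1; then
    # -p creates intermediate directories as needed.
    hdfs dfs -mkdir -p "$ARCHIVE_DIR"
    # Restrict access to the owning user and group (example mode only).
    hdfs dfs -chmod 770 "$ARCHIVE_DIR"
    # Confirm the directory exists.
    hdfs dfs -ls "$(dirname "$ARCHIVE_DIR")"
else
    echo "hdfs client not found; skipping directory creation"
fi
```

The remaining tasks (the Data Vault target connection, the Create Archive Folder job, and copying the connection into the other Data Vault Service configuration files) are performed through Data Archive itself, as described in the sections that follow.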
