Implementing a Disaster Recovery Strategy for Informatica® Big Data Management 10.2 on Amazon AWS

Verify Prerequisites

Verify the following prerequisites before you begin to set up Pilot Light disaster recovery:

Domain Prerequisites

  • Install an Informatica domain in both zones.
  • The Informatica domain stores metadata on Amazon RDS configured for multi-AZ.
  • The Model repository and the Model repository database are stored on Amazon RDS configured for multi-AZ.
  • The Data Integration Service is configured to use the Amazon EFS that stores source and target data, cache files, and log files.
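
To verify the multi-AZ prerequisite, you can check the `MultiAZ` flag that Amazon RDS reports for each instance. The sketch below keeps the check itself as a pure function so it can be exercised against any `describe_db_instances` response; the instance identifiers (`infa-domain-db`, `infa-mrs-db`) are placeholder assumptions, not Informatica-mandated names.

```python
from typing import Iterable

def non_multi_az(response: dict, expected: Iterable[str]) -> list[str]:
    """Return the identifiers in `expected` that are missing or not Multi-AZ."""
    status = {db["DBInstanceIdentifier"]: db.get("MultiAZ", False)
              for db in response.get("DBInstances", [])}
    return [name for name in expected if not status.get(name, False)]

# Fetching the live response requires boto3 and AWS credentials:
#   import boto3
#   resp = boto3.client("rds").describe_db_instances()
#   bad = non_multi_az(resp, ["infa-domain-db", "infa-mrs-db"])

# Example against a canned response:
resp = {"DBInstances": [
    {"DBInstanceIdentifier": "infa-domain-db", "MultiAZ": True},
    {"DBInstanceIdentifier": "infa-mrs-db", "MultiAZ": False},
]}
print(non_multi_az(resp, ["infa-domain-db", "infa-mrs-db"]))  # ['infa-mrs-db']
```

Any instance the function returns must be modified for Multi-AZ before the pilot light setup can meet the prerequisite.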

Cluster Prerequisites

  • Availability Zones 1 and 2 are set up in the VPC.
  • Amazon Elastic Load Balancer monitors all gateway nodes in AZ1 and AZ2.
  • A Route53 alias is configured to point to the Elastic Load Balancer.
  • Create and maintain an AMI that contains the Big Data Management binary package. Apply patches on a regular basis, and store the updated AMI in Amazon S3 so that the image stays current.
  • Back up nodemeta.xml.
  • Create a backup scheduler for connector contents, and store backup files and snapshots in an Amazon S3 bucket.
  • A Hadoop cluster is installed in each availability zone.
  • Define backup policies for Informatica embedded Hadoop components, such as HDFS, Hive, and HBase. Create snapshots regularly and store the snapshots in an Amazon S3 bucket.
  • Prepare a validation test suite.
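
The snapshot and backup items above imply a scheduler that names snapshots consistently and prunes old ones from the S3 bucket. The sketch below shows one possible key layout and retention check; the bucket name, key format, and 30-day retention window are assumptions, and the actual upload (not shown) would use a tool such as boto3 or DistCp.

```python
from datetime import datetime, timezone

BUCKET = "bdm-dr-backups"  # assumed bucket name

def snapshot_key(component: str, taken_at: datetime) -> str:
    """Build an S3 key that partitions snapshots by component and date."""
    ts = taken_at.strftime("%Y%m%dT%H%M%SZ")
    date_prefix = taken_at.strftime("%Y/%m/%d")
    return f"backups/{component}/{date_prefix}/{component}-{ts}.snapshot"

def expired(key: str, now: datetime, retention_days: int = 30) -> bool:
    """Flag keys older than the retention window for pruning."""
    ts = key.rsplit("-", 1)[1].removesuffix(".snapshot")
    taken = datetime.strptime(ts, "%Y%m%dT%H%M%SZ").replace(tzinfo=timezone.utc)
    return (now - taken).days > retention_days

# A daily job (cron or EventBridge) would call snapshot_key for each
# component (HDFS, Hive, HBase, nodemeta.xml, connector contents),
# upload the snapshot to BUCKET, then delete keys for which expired() is True.
key = snapshot_key("hdfs", datetime(2024, 1, 15, 3, 0, tzinfo=timezone.utc))
print(key)  # backups/hdfs/2024/01/15/hdfs-20240115T030000Z.snapshot
```

Partitioning keys by component and date keeps restores simple: the most recent snapshot of any component is the lexicographically last key under its prefix.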
