
Step 2. Configure a Policy for the Target S3 Bucket

Configure the target S3 bucket with a policy that specifies the AWS account ID, the IAM role, and the bucket name.
You can create a new policy or add the statements to an existing policy.
  1. Copy the following statements:

    {
      "Sid": "AWS Databricks Policy for s3 bucket - put, get, delete",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<aws-account-id-databricks>:role/<iam-role-for-s3-access>"
      },
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::<s3-bucket-name>/*"
    },
    {
      "Sid": "AWS Databricks Policy for s3 bucket - list, get bucket location",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<aws-account-id-databricks>:role/<iam-role-for-s3-access>"
      },
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::<s3-bucket-name>"
    }
  2. Replace the following placeholders in the statements:

    aws-account-id-databricks
      The ID of the AWS account in which you are configuring this integration.
    iam-role-for-s3-access
      The IAM role that you created in Create an IAM Role and Policy to Access an S3 Bucket.
    s3-bucket-name
      The name of the target S3 bucket.
  3. Click Save.
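If you prefer to script this step rather than paste the statements into the console, the two statements above can be assembled into a complete bucket-policy document and applied with the AWS SDK. A minimal Python sketch, assuming illustrative account-ID, role, and bucket values (substitute your own):

```python
import json

# Hypothetical values -- replace with your account ID, IAM role, and bucket name.
aws_account_id = "123456789012"
iam_role = "databricks-s3-access-role"
bucket = "my-databricks-bucket"

principal_arn = f"arn:aws:iam::{aws_account_id}:role/{iam_role}"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AWS Databricks Policy for s3 bucket - put, get, delete",
            "Effect": "Allow",
            "Principal": {"AWS": principal_arn},
            "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
            # Object-level actions target the bucket contents (note the /*).
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
        {
            "Sid": "AWS Databricks Policy for s3 bucket - list, get bucket location",
            "Effect": "Allow",
            "Principal": {"AWS": principal_arn},
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            # Bucket-level actions target the bucket itself (no /*).
            "Resource": f"arn:aws:s3:::{bucket}",
        },
    ],
}

policy_json = json.dumps(policy, indent=2)
print(policy_json)

# To apply the policy programmatically instead of saving it in the console:
# import boto3
# boto3.client("s3").put_bucket_policy(Bucket=bucket, Policy=policy_json)
```

Note the difference in the two Resource ARNs: object actions (PutObject, GetObject, DeleteObject) require the `/*` suffix, while bucket actions (ListBucket, GetBucketLocation) apply to the bucket ARN itself; mixing these up is a common cause of AccessDenied errors.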
