Set permissions in Ranger

Create custom Ranger policies to enable the CDP user to read and write to the source and target buckets.

From your RAZ-enabled environment, access the Ranger service for your cloud provider. Then create a policy to give the CDP user read and write access to the source bucket, and create another policy to give the CDP user read and write access to the target bucket.

  1. From your Data Hub cluster, select Ranger from the list of services and log into Ranger.
    The Ranger Service Manager page displays.
  2. Select your cluster under the service folder for your cloud provider.
    The following image shows an Amazon S3 service folder with a Data Hub cluster.
    The List of Policies page appears.
  3. Click Add New Policy.
    The Create Policy page appears.
  4. Add the following details to allow the user to access the source bucket:
    1. Enter a unique name for the policy. For example, Logs input.
    2. Specify your source bucket name. For example, s3a://my-input-bucket-nifi.
    3. In the Path field, specify the path to a specific directory or file. Or, to indicate any path, enter /.
    4. In the Allow Condition section, specify the CDP user name in the Select User field. For example, srv_nifi-logs-ingest.
    5. In the Permissions field, select Read and Write.
    6. Click Add to save the policy.
  5. Add the following details to allow the user to access the target bucket:
    1. Enter a unique name for the policy. For example, Logs output.
    2. Specify your target bucket name. For example, s3a://my-output-bucket-nifi.
    3. In the Path field, specify the path to a specific directory or file. Or, to indicate any path, enter /.
    4. In the Allow Condition section, specify the CDP user name in the Select User field. For example, srv_nifi-logs-ingest.
    5. In the Permissions field, select Read and Write.
    6. Click Add to save the policy.
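If you prefer to script these policies rather than use the Ranger UI, steps 4 and 5 can also be expressed as calls to the Ranger REST API (`POST /service/public/v2/api/policy`). The sketch below is a minimal, hedged example that only builds the JSON payload for the source-bucket policy; the service name (`cm_s3`), the Ranger endpoint, and the resource keys (`bucket`, `path`) are assumptions you must verify against your own Ranger service definition. The bucket, path, user, and permission values come from the steps above.

```python
import json

RANGER_URL = "https://<ranger-host>:6182"  # assumption: replace with your Ranger endpoint
SERVICE_NAME = "cm_s3"                     # assumption: name of your Ranger S3 service

def build_s3_policy(name, bucket, path, user, accesses=("read", "write")):
    """Build a Ranger policy payload granting `user` the given accesses
    on `bucket` under `path`. The resource keys below are assumptions;
    check them against your Ranger service definition."""
    return {
        "name": name,
        "service": SERVICE_NAME,
        "isEnabled": True,
        "resources": {
            "bucket": {"values": [bucket], "isRecursive": False},
            "path": {"values": [path], "isRecursive": True},
        },
        "policyItems": [
            {
                "users": [user],
                "accesses": [{"type": a, "isAllowed": True} for a in accesses],
            }
        ],
    }

# Payload matching step 4 (the source bucket). To create the policy, POST
# this JSON to f"{RANGER_URL}/service/public/v2/api/policy" as a Ranger admin.
source_policy = build_s3_policy(
    name="Logs input",
    bucket="my-input-bucket-nifi",
    path="/",
    user="srv_nifi-logs-ingest",
)
print(json.dumps(source_policy, indent=2))
```

The same helper covers step 5 by passing the target-bucket values (`Logs output`, `my-output-bucket-nifi`).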
When you start the data flow, the processors that use the CDP user credentials can list and fetch objects in the source bucket, and put and delete objects in the target bucket.
Start the data flow.