Configuring Cloud Data Access

Test access from HDP to S3

To test access to S3 from HDP, SSH to a cluster node and run a few hadoop fs shell commands against your existing S3 bucket.

First, SSH to any cluster node and switch to the hdfs user by running sudo su hdfs.
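For example, with placeholder values for the SSH user and cluster node hostname (substitute values for your own environment):

# Replace your-user and your-cluster-node with values for your environment
ssh your-user@your-cluster-node
sudo su hdfs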

The Amazon S3 access path syntax is:

s3a://bucket/dir/file

For example, to access a file called “mytestfile” in a directory called “mytestdir”, which is stored in a bucket called “mytestbucket”, the URL is:

s3a://mytestbucket/mytestdir/mytestfile

The following FileSystem shell commands demonstrate access to a bucket named “mytestbucket”. The -put command assumes that a file named testFile exists in your local working directory, and the final line shows the output of the -cat command:

hadoop fs -ls s3a://mytestbucket/
hadoop fs -mkdir s3a://mytestbucket/testDir
hadoop fs -put testFile s3a://mytestbucket/testFile
hadoop fs -cat s3a://mytestbucket/testFile
test file content
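
Optionally, you can remove the test data when you are done. The following commands are a minimal cleanup sketch using the same example bucket and object names as above:

# Delete the test file and directory created during the access test
hadoop fs -rm s3a://mytestbucket/testFile
hadoop fs -rm -r s3a://mytestbucket/testDir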

For more information about configuring the S3 connector for HDP and working with data stored on S3, refer to the Cloud Data Access documentation for HDP.