Using the S3Guard Command to List and Delete Uploads
The hadoop s3guard uploads command can also be used to list all outstanding uploads under a path, and to delete them.
hadoop s3guard uploads s3a://guarded-bucket/
tests3ascale/scale/hugefile nB3gctgP34ZSpzJ5_o2AVb7kKRiicXfbkBqIflA0y.IH7CimH1O0VUzohRCj3QfFstoQIdcge478ZidXu764yN0vuGI1j5kcOV3rDhsrc.RBZ5skZ93jVCN9g2c21QgB
Total 1 uploads found.
Either all outstanding uploads can be aborted, or only those older than a certain age.
hadoop s3guard uploads -abort -days 1 -force s3a://guarded-bucket/
Deleting: tests3ascale/scale/hugefile nB3gctgP34ZSpzJ5_o2AVb7kKRiicXfbkBqIflA0y.IH7CimH1O0VUzohRCj3QfFstoQIdcge478ZidXu764yN0vuGI1j5kcOV3rDhsrc.RBZ5skZ93jVCN9g2c21QgB
2018-06-28 20:58:18,504 [main] INFO s3a.S3AFileSystem (S3AFileSystem.java:abortMultipartUpload(3266)) - Aborting multipart upload nB3gctgP34ZSpzJ5_o2AVb7kKRiicXfbkBqIflA0y.IH7CimH1O0VUzohRCj3QfFstoQIdcge478ZidXu764yN0vuGI1j5kcOV3rDhsrc.RBZ5skZ93jVCN9g2c21QgB to tests3ascale/scale/hugefile
Total 1 uploads deleted.
While a bucket lifecycle rule is the best way to guarantee that all outstanding uploads are deleted, this command-line tool is useful for verifying that such cleanup is taking place, or for explicitly cleaning up after a failure.
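As a minimal sketch of such a lifecycle rule, the following uses the AWS CLI to attach a rule that aborts incomplete multipart uploads one day after initiation. The rule ID and the one-day threshold are illustrative choices, and the bucket name is taken from the examples above; adjust both for your deployment.

# Write an illustrative lifecycle configuration to a local file.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "abort-stale-multipart-uploads",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 1 }
    }
  ]
}
EOF
# Attach the rule to the bucket used in the examples above.
aws s3api put-bucket-lifecycle-configuration \
    --bucket guarded-bucket \
    --lifecycle-configuration file://lifecycle.json

With a rule like this in place, running hadoop s3guard uploads against the bucket should show that uploads older than the configured age are being removed automatically.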