FROM AWS S3 MISCONFIGURATION TO SENSITIVE DATA EXPOSURE
2021-02-19 21:15:21 · Source: infosecwriteups.com

Mase289


Photo by Markus Spiske on Unsplash

Companies often deploy third-party applications to store various media content, usually in file formats such as images, documents, HTML, JavaScript, and SQL files. During my engagements on bug bounty programs, it isn't uncommon to find references to Amazon AWS S3 buckets disclosed in various places in the web application, such as the website source code, or through a particular operation such as a file upload.

RESPONSE

200 OK "Successfully uploaded to s3://testbucket/profilepictures/user/fancy_avatar.jpg"

These buckets can also be found using the Google dork: "site:s3.amazonaws.com" "target.com"

It is also not uncommon to find that these cloud storage applications were misconfigured during deployment. A case in point was a particular target I was hunting on a few days before the new year. Typical operations to check while testing access control permissions on an AWS S3 bucket include the following:

aws s3 ls s3://test-bucket to list folders and objects within the bucket

aws s3 ls s3://test-bucket/test-folder/ to list folders and objects within the /test-folder/

aws s3 cp test.txt s3://test-bucket to copy a file (test.txt) to the bucket

aws s3 mv test.txt s3://test-bucket to move a file (test.txt) to the bucket

aws s3 rm s3://test-bucket/test.txt to delete the file (test.txt) from the bucket
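As a quick sketch, the checklist above can be captured in a small helper that emits each probe command for a given bucket, so the commands can be reviewed before running them (the function name and bucket name here are placeholders, not part of the original writeup):

```python
# Hypothetical helper: builds the standard permission-probe commands for a
# bucket so they can be reviewed (or fed to a runner) before testing.
# The commands mirror the checklist above exactly.

def s3_probe_commands(bucket: str, local_file: str = "test.txt") -> list[str]:
    """Return the AWS CLI commands used to probe list/write/delete access."""
    target = f"s3://{bucket}"
    return [
        f"aws s3 ls {target}",               # list top-level prefixes/objects
        f"aws s3 ls {target}/test-folder/",  # list inside a prefix
        f"aws s3 cp {local_file} {target}",  # write probe (copy)
        f"aws s3 mv {local_file} {target}",  # write probe (move)
        f"aws s3 rm {target}/{local_file}",  # delete probe
    ]

for cmd in s3_probe_commands("test-bucket"):
    print(cmd)
```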

Amazon S3 access control lists (ACLs) enable you to manage access to buckets and objects. Each bucket and object has an ACL attached to it as a subresource. It defines which AWS accounts or groups are granted access and the type of access.
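For illustration (owner values here are placeholders), a bucket ACL that grants anonymous read access — the kind of misconfiguration at issue in this writeup — looks roughly like this when fetched with `aws s3api get-bucket-acl`:

```json
{
  "Owner": { "DisplayName": "bucket-owner", "ID": "EXAMPLE-CANONICAL-ID" },
  "Grants": [
    {
      "Grantee": {
        "Type": "Group",
        "URI": "http://acs.amazonaws.com/groups/global/AllUsers"
      },
      "Permission": "READ"
    }
  ]
}
```

A `READ` grant to the `AllUsers` group is what allows unauthenticated users to list the bucket's contents.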

Now, for my most recent target, testing for all the typical OWASP-type bugs turned up none; the target was hardened for the most part. However, when uploading a profile picture to the site, the file was saved to an S3 bucket and loaded from there whenever it was accessed. Additionally, the site provided the end user with a feature to upload, view, and update their curriculum vitae. Performing these actions also revealed the URL of the S3 bucket.

s3://vulnerable-bucket.s3-ap-southeast-1.amazonaws.com/

The site was basically a platform that linked job seekers to recruiters and ensured that candidates were correctly matched with available job opportunities. I noted down this S3 URL, to which both CVs and profile pictures were uploaded, and continued with my testing.
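Assuming the virtual-hosted-style addressing shown above (the bucket, key, and region here are placeholders), a small helper can translate such a bucket reference into the HTTPS URL at which a public object would be fetched:

```python
# Hypothetical helper: maps a bucket/key reference plus a region to the
# virtual-hosted-style HTTPS URL a browser would use for a public object.

def public_object_url(bucket: str, key: str, region: str) -> str:
    return f"https://{bucket}.s3-{region}.amazonaws.com/{key}"

print(public_object_url("vulnerable-bucket", "profilepic/user1.jpg", "ap-southeast-1"))
# → https://vulnerable-bucket.s3-ap-southeast-1.amazonaws.com/profilepic/user1.jpg
```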

It was only much later that I launched the AWS CLI, a command-line utility anybody can use to perform various operations on an S3 bucket, including the creation, deletion, uploading, and synchronization of buckets, folders, and objects. I began my testing by trying to copy and move a file into the bucket.

I created a simple text file containing my identifier alias and tried:

aws s3 cp test.txt s3://vulnerable-bucket

That returned an Access Denied error, and so did the following attempt:

aws s3 mv test.txt s3://vulnerable-bucket

The access control settings properly restricted copy and move operations on the bucket. I then attempted a list operation to check whether it was restricted as well. To my surprise, the following folder structure was returned, including sensitive folders containing both company and user data:

PRE JD_2.0/
PRE candidatebulkupload/
PRE candidatefiles/
PRE companylogo/
PRE feed/
PRE marketmapresumes/
PRE medias/
PRE parsed_resumes/
PRE printableprofiles/
PRE production-logs/
PRE profilepic/
PRE metrics/
PRE resume/
PRE resumes/
PRE uploads/
PRE whatsapp/
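The `PRE` lines in `aws s3 ls` output denote common prefixes ("folders"). As a sketch, a small parser can separate prefix names from object keys in that output (the object-line format assumed here is the CLI's usual date/time/size/key layout, and the sample data is illustrative):

```python
# Sketch: split `aws s3 ls` output into prefixes ("PRE foo/") and object keys.
# Object lines are assumed to look like "2021-02-19 21:15:21   1234 file.txt".

def parse_s3_ls(output: str):
    prefixes, objects = [], []
    for line in output.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("PRE "):
            prefixes.append(line[4:])
        else:
            # the last whitespace-separated field is the object key
            objects.append(line.split()[-1])
    return prefixes, objects

sample = """ PRE resumes/
 PRE uploads/
2021-02-19 21:15:21   1234 notes.txt"""
prefixes, objects = parse_s3_ls(sample)
print(prefixes)  # ['resumes/', 'uploads/']
print(objects)   # ['notes.txt']
```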

I immediately went through the folders I assumed to be the most sensitive to determine whether I could actually access the files themselves, and not just the directory/file structure. After confirming this, I stopped my testing and submitted a report to the affected program, which was quick to respond and implement a fix. The impact of this could have been devastating, considering that the contents of the S3 bucket dated back to the inception of the company. It is unclear how long this bucket was left open to the public, so unauthorized access to it could already have occurred.

TIP: You can use the command aws s3 cp s3://test-bucket/file.txt ./ to download a file contained within the bucket to your current working directory. Useful in cases where you need to perform further analysis on the files.

Hope you enjoyed reading through this writeup. Till next time, happy hacking!
