***DISCLAIMER — S3 Buckets and other data storage solutions may contain extremely sensitive information. I do not condone the abuse of any data or infrastructure you find.***
In this article, you will learn:
How to Find S3 Buckets
How to Test S3 Bucket Security
S3 Buckets and equivalent data storage solutions can hold sensitive data that you don't want the public to see, and yet these buckets are sometimes mistakenly left open to the public. There is actually a website dedicated to providing a searchable database of publicly accessible buckets (Amazon, Google, Azure, and DigitalOcean Spaces), and some of the data left public is terrifying (social security numbers, financial documents, etc.). We'll talk about that website later in the article.
Why Test S3 Buckets?
As a Bug Bounty Hunter, my goal is to find vulnerabilities and show the impact of what happens if a vulnerability is exploited by a hacker with bad intentions. S3 buckets can hold sensitive data, host proprietary code, etc. Hackers with malicious intent love getting their hands on sensitive data, and they might also love the opportunity to host malware on an unsecured S3 bucket. There are many reasons to secure your cloud storage solutions, which is why I test S3 buckets while bug bounty hunting.
First of all, unless you are running legal security testing of some sort, you probably should not look for S3 Buckets to test. If you are Bug Bounty Hunting and looking for vulnerabilities, make sure S3 Buckets/Cloud Storage Infrastructure is in-scope for the target you are testing. In other words, make sure it’s legal before you test the security of an S3 bucket.
Techniques For Finding S3 Buckets
1. Use a scanner to find S3 Buckets. There are tons of them on GitHub. Here is an example of a scanner written in Ruby by GitHub user nahamsec, aka Behrouz Sadeghipour. The scanner is called lazys3, and you can find it on GitHub here. Reference the image below to see lazys3 in action scanning for S3 Buckets containing a specific keyword. I redacted the actual keyword, but the syntax to run the tool is: ruby lazys3.rb [keyword]
When lazys3 finds a bucket, you can view it in a browser at: [bucket-name].s3.amazonaws.com OR s3.amazonaws.com/[bucket-name]
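To make the scanning idea concrete, here is a minimal sketch of the wordlist-permutation approach that tools like lazys3 use: take a keyword, append common suffixes, and produce both URL styles for each candidate. The suffix list here is an illustrative assumption, not lazys3's actual wordlist.

```python
# Sketch: generate candidate S3 bucket URLs from a keyword,
# similar in spirit to wordlist scanners like lazys3.
# COMMON_SUFFIXES is an assumed, illustrative list.
COMMON_SUFFIXES = ["", "-dev", "-staging", "-prod", "-backup", "-assets"]

def candidate_bucket_urls(keyword):
    """Return both S3 URL styles for each keyword permutation."""
    urls = []
    for suffix in COMMON_SUFFIXES:
        bucket = f"{keyword}{suffix}"
        # Virtual-hosted style: [bucket-name].s3.amazonaws.com
        urls.append(f"https://{bucket}.s3.amazonaws.com")
        # Path style: s3.amazonaws.com/[bucket-name]
        urls.append(f"https://s3.amazonaws.com/{bucket}")
    return urls

for url in candidate_bucket_urls("example"):
    print(url)
```

A real scanner would then request each URL and check the response code, but generating good candidates is the core of the technique.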
More Scanners
Here is a GitHub repository from user "mxm0z" that lists multiple scanners for existing Amazon AWS S3 buckets. This is the link: mxm0z awesome-sec-s3 Bucket Scanner.
2. Use the GrayHatWarFare website to instantly find publicly accessible S3 Buckets from multiple different storage providers (not just Amazon). GrayHatWarFare is the website I mentioned earlier that displays thousands of publicly accessible S3 buckets. There is a free version, but the amount of bucket results you get will be limited compared to the paid version. Here is the link to the website. Reference the image below where I utilize the free version of GrayHatWarFare to keyword search for specific buckets:
You can also keyword search for publicly accessible files, instead of buckets. Reference the image below where I use GrayHatWarFare to keyword search for publicly accessible files.
3. Google Dorking for S3 buckets is also possible. This means you can utilize Google's search engine to look for S3 buckets indexed by Google. The good thing about Google Dorking is that you can look for other providers' buckets as well. For example, to find Amazon S3 buckets, you can search "s3.amazonaws.com", and to find Google Cloud Platform Storage buckets, you can search "storage.googleapis.com". Reference example syntax for Google Dorking for cloud storage (and other S3-related solutions) below.
Example Google Dorking Syntax
site:websitename.com inurl:".s3.amazonaws.com"
site:websitename.com inurl:"storage.googleapis.com"
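If you dork across several targets and providers, it can help to generate the query strings programmatically. A minimal sketch, where the host list is an illustrative assumption (the Azure host is one I added; the first two come from the syntax above):

```python
# Sketch: build site:-scoped Google dork strings for common
# cloud storage hostnames. Host list is illustrative.
STORAGE_HOSTS = [
    "s3.amazonaws.com",        # Amazon S3
    "storage.googleapis.com",  # Google Cloud Storage
    "blob.core.windows.net",   # Azure Blob Storage (assumed addition)
]

def build_dorks(site):
    """Return one dork string per storage host."""
    return [f'site:{site} inurl:"{host}"' for host in STORAGE_HOSTS]

for dork in build_dorks("websitename.com"):
    print(dork)
```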
4. Using BurpSuite: While you are directly browsing the website you are testing, use BurpSuite HTTP History to search for requests and responses made to Cloud Storage Services. This can be done in BurpSuite Community Edition (the free version) because you can filter the HTTP History for certain strings. Reference the images below to see me filter my BurpSuite HTTP History to search for the string “amazonaws” and then manually search the filtered results for instances of the string “s3”.
You can now manually go through the filtered results and search for the text “s3” in the requests and responses.
Look at all those S3 buckets just waiting to be tested.
Tip: Sometimes, while you are browsing a website, you might notice requests and responses from S3 buckets that do not belong to the website you are testing. In that case, you can check to see if the company that owns the S3 bucket has a Bug Bounty Program (or a Vulnerability Disclosure Program).
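If you export your Burp HTTP history (or any traffic log) to a text file, a quick regex pass can pull out S3-style hostnames in bulk instead of eyeballing the filtered results. A minimal sketch; the sample_history string is made-up illustrative data:

```python
import re

# Sketch: extract S3-style hostnames from exported HTTP history text.
# Matches both legacy (bucket.s3.amazonaws.com) and regional
# (bucket.s3.us-east-1.amazonaws.com) virtual-hosted styles.
S3_HOST_RE = re.compile(r"[\w.-]+\.s3[\w.-]*\.amazonaws\.com")

def extract_s3_hosts(text):
    """Return unique S3-style hostnames found in the text, sorted."""
    return sorted(set(S3_HOST_RE.findall(text)))

# Made-up sample of what exported proxy history lines might look like.
sample_history = """
GET https://cdn-assets.s3.amazonaws.com/app.js HTTP/1.1
POST https://uploads-example.s3.us-east-1.amazonaws.com/resume.pdf HTTP/1.1
GET https://example.com/index.html HTTP/1.1
"""

print(extract_s3_hosts(sample_history))
```

Note this only catches virtual-hosted-style URLs; path-style references (s3.amazonaws.com/bucket-name) would need a second pattern.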
There is an issue you might run into when scanning for buckets or searching for buckets on GrayHatWarFare: it is hard to prove that a company owns an S3 Bucket unless you find the bucket while directly browsing the company's website. That is why I generally only test buckets I find while directly browsing the target website and noticing the bucket in my BurpSuite Proxy History.
Here's an example: when I use lazys3 to scan for buckets with the keyword "indeed", even if I find a public bucket, it still might not belong to Indeed.com. Therefore I cannot prove that the S3 bucket is in-scope for the Indeed Bug Bounty Program. On the other hand, if I were browsing Indeed.com, uploaded my resume, and noticed in BurpSuite that a POST request was made to "indeed-resumes.s3.amazonaws.com", then that S3 bucket can most likely be tested under Indeed's Bug Bounty Program. Always read the individual Bug Bounty Program's brief to be certain!
Better Ways To Find S3 Buckets?
I'm 1000% certain there are better ways to find S3 Buckets, especially if you have a paid version of BurpSuite. I would appreciate any techniques you'd like to share in the comments!
Techniques For Testing S3 Buckets
I will specifically talk about testing Amazon AWS S3 Buckets in this section. To test an Amazon S3 Bucket:
1. See if you can access the bucket. This means manually traveling to the bucket in your browser: [bucket-name].s3.amazonaws.com OR s3.amazonaws.com/[bucket-name]
Sometimes you can list (view) the contents of the bucket but cannot access any of the files inside; in that case, an attacker listing the bucket contents would most likely not have much negative impact. On the other hand, if you can list the contents of the bucket and view some of the sensitive files inside, then this demonstrates real impact.
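When a bucket is publicly listable, its root URL returns a ListBucketResult XML document. A minimal sketch of parsing the object keys out of such a response; the sample XML below is made-up illustrative data, not a real bucket:

```python
import xml.etree.ElementTree as ET

# S3's ListBucketResult responses use this XML namespace.
S3_NS = "{http://s3.amazonaws.com/doc/2006-03-01/}"

def list_keys(xml_text):
    """Return the object keys from a ListBucketResult response."""
    root = ET.fromstring(xml_text)
    return [c.findtext(f"{S3_NS}Key") for c in root.iter(f"{S3_NS}Contents")]

# Made-up sample response for illustration.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>example-bucket</Name>
  <Contents><Key>reports/q1.pdf</Key></Contents>
  <Contents><Key>backups/db.sql</Key></Contents>
</ListBucketResult>"""

print(list_keys(sample))
```

In practice you would fetch the XML with an HTTP GET to the bucket URL first; each listing response is capped at 1000 keys, which is why the recursive AWS CLI approach below is useful.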
2. Use the AWS CLI to try to gather more information about the S3 bucket and the objects within it. For example, you can try to view the access control list of an object (file) in the S3 bucket. You can also use the AWS CLI to try to recursively view the contents of the bucket without truncation (through the browser, you can only list 1000 files inside the bucket).
Using AWS CLI to recursively list the contents of a public S3 Bucket
You can recursively view the contents of an S3 bucket by using the AWS CLI "ls" command. You can try this command with credentials (provide your profile name) or without credentials (--no-sign-request).
Syntax
aws s3 ls s3://bucket-name --recursive --profile <yourprofilename>
aws s3 ls s3://bucket-name --recursive --no-sign-request
Using AWS CLI To View The Access Control List Of A Bucket Object
You can also try to use the AWS CLI to view the access control list of a file inside the bucket. Using this command, I was able to gather more information about the owner of the bucket.
Syntax
aws s3api get-object-acl --profile <your profile name> --bucket <bucket-name> --key <name of file>
3. You can also attempt to upload a file to the S3 Bucket. If an attacker can upload files to your S3 bucket, they can do things like increase your storage costs, try to host malware, etc. Reference the images below where I use the AWS CLI tool to upload a file to an open S3 bucket.
Using AWS CLI To Upload A File To An S3 Bucket
Syntax
With Credentials: aws s3 cp your-file.txt s3://bucket-name --profile <your profile name>
Without Credentials: aws s3 cp your-file.txt s3://bucket-name --no-sign-request
Ensuring The File Was Uploaded To The S3 Bucket
Now that we have uploaded the test file to the S3 bucket, we can verify the upload by using the AWS CLI "ls" command to list the bucket contents. Notice my file "bugbounty_test.txt" is listed as part of the S3 bucket. Reference the image below.
Accessing The Uploaded File In the Browser
Now that I have uploaded the file into the insecure S3 bucket, it can be accessed through the browser. Reference the image below.
Here is a great video by NetworkChuck on YouTube that shows you how to setup and use the AWS CLI tool.
“In conclusion, the accessibility and security of S3 Buckets and other cloud storage solutions pose significant challenges, often leading to inadvertent exposure of sensitive data. Despite the potential risks, it’s crucial to understand the methods for finding and testing these buckets to safeguard against malicious exploitation. Various tools and techniques, such as scanners like lazys3 and resources like GrayHatWarFare, provide avenues for identifying publicly accessible buckets. However, verifying ownership and ensuring legality are paramount considerations before conducting any security testing. Furthermore, while scanning tools offer efficiency, manual inspection through tools like BurpSuite remains vital for thorough assessment. Ultimately, proactive testing and responsible disclosure are essential steps in mitigating risks and enhancing the security posture of cloud storage infrastructure.” — ChatGPT
Thanks ChatGPT. Now that you’re finished reading, go make sure your S3 Buckets are secure!
If you enjoyed this article, please press the clap button and share it with your people who are trying to learn #Cybersecurity and #CloudSecurity. If you have better techniques for finding S3 buckets, leave a comment!
Thank You For Reading And Thank You For Learning!