Mastering Unauthenticated XSS Detection: Best Burp Suite Configurations for XSS Hunting


Mayank Kumar Prajapati

Burp Suite, created by PortSwigger, is one of the most popular tools for vulnerability scanning and manual testing. It is commonly used by bug bounty hunters to detect and exploit web application vulnerabilities, and it offers a number of capabilities that enable both automated and manual security testing. In this blog, we will explore the best Burp Suite scan configuration for finding XSS vulnerabilities without putting too heavy a load on the server.

Benefits of hunting XSS vulnerabilities on public-facing pages
Preparation before setting up the scan configuration
Custom scan configurations for crawling the target
Custom scan configurations for auditing the target
Analyzing the result and finding XSS
Mitigations against XSS vulnerability

XSS vulnerabilities in public-facing pages can have an immediate and widespread impact, affecting all site visitors without any authentication.

Browse the target website while the Burp proxy is running in the background. For demonstration purposes, we will be using the deliberately vulnerable site http://testphp.vulnweb.com/
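If you prefer to generate some initial traffic from a script rather than the browser, a minimal sketch like the one below also makes the requests appear in Proxy > HTTP history. The listener address 127.0.0.1:8080 is Burp's default and the example paths are simply pages from the public demo site; adjust both for your own setup.

import requests

# Burp's proxy listener address; 127.0.0.1:8080 is the default and is an
# assumption here - adjust if your listener runs elsewhere.
burp_proxy = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

# Visit a few pages of the demo target so they appear in Proxy > HTTP history.
for path in ["/", "/categories.php", "/listproducts.php?cat=1"]:
    resp = requests.get("http://testphp.vulnweb.com" + path,
                        proxies=burp_proxy, timeout=10)
    print(resp.status_code, path)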

Setting up scope

Navigate to Target > Scope settings, enable Advanced scope control and click the Add button.
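The Add dialog under advanced scope control asks for a protocol, a host or IP range (regex), a port (regex) and a file (regex). The sketch below shows roughly how such a rule could be filled in for the demo target; the values are my illustration, not a copy of the configuration in the screenshot.

# One "include in scope" rule under advanced scope control (illustrative values).
scope_rule = {
    "protocol": "http",                    # Any / HTTP / HTTPS
    "host": r"^testphp\.vulnweb\.com$",    # host or IP range (regex)
    "port": r"^80$",                       # port (regex), optional
    "file": r"^/.*",                       # file (regex), optional
}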

Under the filter settings, select Show only in-scope items so that we can focus only on our target.

Right-click your target as shown in the screenshot above and select Scan. When the Scan dialog appears, set the scan type to Crawl.

Scan Configuration

Navigate to Scan Configuration and choose Select from Library.

Select the “Crawl Strategy — Faster” option. For demonstration purposes, we have chosen this strategy. However, if time allows, you may choose “Crawl Strategy — Most Complete,” as it is the most thorough. The drawback of that approach is that it requires a significant amount of time to complete.

Resource Pool Settings🏊

Create a new resource pool that sends fewer requests while crawling, since heavy load on the server may get us blocked. We can use the configuration below for this. Name this pool Server Friendly.
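The exact numbers are in the screenshot, but the idea of a Server Friendly pool is simply one request at a time with a pause in between. A rough Python equivalent of that throttling behaviour, where the 500 ms delay is an assumed value rather than the article's exact setting:

import time
import requests

DELAY_BETWEEN_REQUESTS = 0.5  # assumed 500 ms pause between requests

urls = [
    "http://testphp.vulnweb.com/",
    "http://testphp.vulnweb.com/categories.php",
]

# Sending requests one at a time with a fixed pause means the server never
# sees a burst of traffic - the same effect the Server Friendly pool has in Burp.
for url in urls:
    resp = requests.get(url, timeout=10)
    print(resp.status_code, url)
    time.sleep(DELAY_BETWEEN_REQUESTS)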

Once crawling has started, you can see it on the Dashboard under Tasks.

Right-click your target as shown in the screenshot above and select Scan. When the Scan dialog appears, set the scan type to Audit selected items.

We can also consolidate the items and perform operations such as Remove duplicate items and Remove items with no parameters so that the scan takes less time (a rough sketch of this consolidation logic follows below).
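For illustration, the consolidation Burp performs boils down to something like the sketch below; inside Burp you simply use the right-click Consolidate items options, and the example URLs are just pages from the demo site.

from urllib.parse import urlparse, parse_qs

crawled = [
    "http://testphp.vulnweb.com/listproducts.php?cat=1",
    "http://testphp.vulnweb.com/listproducts.php?cat=2",  # same page, same parameter
    "http://testphp.vulnweb.com/index.php",               # no parameters at all
]

seen = set()
audit_items = []
for url in crawled:
    parts = urlparse(url)
    param_names = tuple(sorted(parse_qs(parts.query)))
    if not param_names:          # "Remove items with no parameters"
        continue
    key = (parts.path, param_names)
    if key in seen:              # "Remove duplicate items"
        continue
    seen.add(key)
    audit_items.append(url)

print(audit_items)  # only one listproducts.php entry remains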

Scan Configuration

Navigate to Scan Configuration and click New, as shown in the screenshot below.

Name the configuration Reflected Input Custom Scan and apply the configuration below under Audit Optimization.

In the Issues Reported settings, choose Select individual issues and select the four issues listed below (a quick manual version of the reflected check follows the list).
Input returned in response (stored)
Input returned in response (reflected)
Suspicious input transformation (stored)
Suspicious input transformation (reflected)
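These checks all ask one question: does the value we supply come back in a response, either immediately (reflected) or later (stored)? A minimal manual version of the reflected case is sketched below, using a hypothetical endpoint and parameter from the demo site; in practice you would use whichever request Burp actually flags.

import uuid
import requests

# A unique marker avoids false positives from text that is already on the page.
marker = uuid.uuid4().hex

resp = requests.get("http://testphp.vulnweb.com/listproducts.php",
                    params={"cat": marker}, timeout=10)

if marker in resp.text:
    print("Input returned in response (reflected) - worth testing for XSS")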

In Handling Application Errors During Audit, apply the configuration below.

In JavaScript Analysis, the configuration below works fine.

Now it's time to save this scan configuration. Always tick the Save to library option so that we can reuse the same settings on other targets as well.

Resource Pool Settings🏊

Use the same resource pool we created during crawling. Increase or decrease the concurrent requests and delay according to the server's load-handling capacity.

With the above audit scan, we see the results shown below on the Burp Suite Dashboard.

Choose one of the requests and analyze it. Burp Suite automatically highlights the input that is reflected in the response.

Send this request to Repeater and replay it. Observe the Content-Type response header: it should be text/html, since XSS requires the input to be rendered into the DOM (Document Object Model).
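The same header check can be scripted; a small sketch using the hypothetical endpoint from the earlier example:

import requests

resp = requests.get("http://testphp.vulnweb.com/listproducts.php",
                    params={"cat": "test"}, timeout=10)

# Reflection into an HTML response can lead to script execution; reflection into
# application/json or text/plain usually cannot be exploited directly.
content_type = resp.headers.get("Content-Type", "")
print(content_type)                              # e.g. text/html; charset=UTF-8
assert content_type.startswith("text/html")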

Analyze the reflection point.

The reflection point is inside a heading tag. Let's try injecting our basic XSS payload.
Payload used: mayank<script>alert(111)</script>

It seems it got executed 😍. Let's confirm it from the browser as well.
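A script can double-check the same thing by verifying that the payload comes back without HTML encoding, again using the hypothetical endpoint from the earlier sketches:

import requests

payload = "mayank<script>alert(111)</script>"
resp = requests.get("http://testphp.vulnweb.com/listproducts.php",
                    params={"cat": payload}, timeout=10)

if "<script>alert(111)</script>" in resp.text:
    print("Payload reflected unencoded - reflected XSS confirmed")
elif "&lt;script&gt;alert(111)&lt;/script&gt;" in resp.text:
    print("Payload reflected but HTML-encoded - output encoding is in place")
else:
    print("Payload not reflected in this response")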

Mitigations against XSS vulnerability:
(i). Output encoding (see the sketch after this list)
(ii). WAF (Web Application Firewall)
(iii). Server side validation
(iv). Browser headers such as X-XSS-Protection, Cache-Control
(v). Cookie attributes (HttpOnly)
(vi). Content Security Policy
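As a quick illustration of the first item, output encoding means HTML-escaping user input before it is written into the page. A generic Python sketch, not the demo site's actual code:

from html import escape

def render_search_heading(user_input):
    # escape() turns < > & " into HTML entities, so the browser renders the
    # payload as plain text instead of executing it.
    return "<h2>Results for: {}</h2>".format(escape(user_input))

print(render_search_heading("mayank<script>alert(111)</script>"))
# <h2>Results for: mayank&lt;script&gt;alert(111)&lt;/script&gt;</h2>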

Hope you enjoyed reading this writeup😊
