Recon. Useful Tips for Bug Bounty — PART 1

I start by identifying subdomains related to the target domain. Subdomains can reveal hidden services or entry points into the infrastructure, such as staging, development, or backup environments, which are often overlooked. There are many tools to simplify this work, for example, subfinder.

To discover subdomains, you can use subfinder. Here is the easiest command:

subfinder -d example.com -o subdomains.txt

Alternatively, you can pass additional options to get more subdomains: -all queries every available passive source, and -recursive uses only sources that support recursive subdomain enumeration:

subfinder -d example.com -all -recursive > subdomains.txt

After discovering subdomains, the next step is to check if the subdomains are live and responding. This helps you focus on the active subdomains for further testing.

Use httpx to check live subdomains:

httpx -l subdomains.txt -o live_subdomains.txt

This command will check each subdomain in subdomains.txt and output the live ones into live_subdomains.txt.

A few more httpx options are worth knowing.

Detect technologies used by the website: the -td (tech-detect) option fingerprints the technologies used on the target site, such as web servers, programming languages, and frameworks:

httpx -u <target_url> -td

Include additional ports (8080, 8888, 8000): many web applications expose dashboards, control panels, or login pages on non-standard ports like 8080, 8888, or 8000. To include these ports in the scan, use this command:

cat subdomains.txt | httpx -ports 80,8080,8000,8888 -threads 200 > subdomains_alive.txt

URL normalization: to standardize URLs (removing extra slashes, index pages like index.html, and duplicate parameter variations), pipe them through uro, a separate tool commonly paired with httpx:

cat urls.txt | uro

Scanning for known CVEs: note that httpx itself does not perform vulnerability checks; once you have a list of live hosts, that job is usually handed to a dedicated scanner such as nuclei (for example, nuclei -l subdomains_alive.txt).

Now that you have the live subdomains, it’s time to gather URLs associated with these subdomains. This is useful because vulnerabilities often exist in specific parts of the web application (e.g., login forms, API endpoints).

Tool: GAU (Get All URLs)

GAU (getallurls) gathers URLs from multiple public sources, such as AlienVault OTX, the Wayback Machine, Common Crawl, and URLScan. To gather URLs for the live subdomains:

cat live_subdomains.txt | gau > 1.txt

This gathers URLs associated with the subdomains listed in live_subdomains.txt and stores them in 1.txt.

Tool: Waybackurls

The Waybackurls tool retrieves URLs archived by the Wayback Machine (Internet Archive). To fetch archived URLs for the live subdomains:

cat live_subdomains.txt | waybackurls > 2.txt

This command gathers URLs archived in the Wayback Machine and stores them in 2.txt.

After collecting URLs from GAU and Waybackurls, you can combine both lists and remove duplicates using this command:

cat 1.txt 2.txt | sort -u > sorturls.txt

This combines 1.txt and 2.txt, sorts the URLs, and removes duplicates, saving the unique URLs in sorturls.txt.
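As a quick sanity check that the merge works as expected, here is a tiny self-contained version of the same step, with made-up sample URLs standing in for the real GAU and Waybackurls output:

```shell
# Two overlapping URL lists standing in for 1.txt and 2.txt (sample data)
printf '%s\n' 'https://a.example.com/login' 'https://a.example.com/api' > 1.txt
printf '%s\n' 'https://a.example.com/api'   'https://b.example.com/'   > 2.txt

# Same merge-and-deduplicate command as above
cat 1.txt 2.txt | sort -u > sorturls.txt

wc -l < sorturls.txt   # 4 input lines collapse to 3 unique URLs
```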

The next step is to discover parameters within the URLs, such as query parameters (e.g., ?id=, ?user=) and form parameters. These are common attack vectors.
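Before running a dedicated tool, a quick grep already surfaces the URLs that carry a query string. The sample file below is made up for illustration; in practice you would run the same grep on sorturls.txt directly:

```shell
# Stand-in for the merged URL list (sample data for illustration)
printf '%s\n' \
  'https://example.com/index.html' \
  'https://example.com/search?q=test' \
  'https://example.com/item?id=1&user=2' > demo_urls.txt

# Keep only URLs containing a ?key=value query string
grep -E '\?[^=]+=' demo_urls.txt > urls_with_params.txt

cat urls_with_params.txt   # the two parameterized URLs remain
```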

Tool: ParamSpider

To identify parameters in the URLs you’ve gathered, use ParamSpider:

paramspider -l sorturls.txt

This will identify parameters in the URLs listed in sorturls.txt.

You can also specify specific file extensions to focus on certain types of URLs:

paramspider -d example.com -e jpg,png,svg,css

ParamSpider's -e (exclude) option filters out URLs ending in the listed extensions, so this skips static-file URLs (jpg, png, svg, css), which rarely carry injectable parameters, when crawling example.com.

Tool: Arjun

Another tool to enumerate parameters is Arjun. To use Arjun to find parameters on a specific page:

arjun -u "http://example.com/search"

This will scan the search URL of example.com for potential parameters.

Now that you have gathered a comprehensive attack surface with live subdomains, URLs, and parameters, you can begin using vulnerability scanners to identify potential weaknesses in the application.
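The steps above can be stitched into one small wrapper script. This is only a sketch: recon.sh is a hypothetical name, and it assumes subfinder, httpx, gau, waybackurls, and paramspider are installed and that you are authorized to test the target.

```shell
# Write the workflow to recon.sh (hypothetical name); the heredoc keeps it
# from running until you invoke it yourself against an authorized target.
cat > recon.sh <<'EOF'
#!/bin/sh
# Usage: ./recon.sh example.com
set -eu
domain="$1"

subfinder -d "$domain" -all -recursive > subdomains.txt           # 1. enumerate subdomains
httpx -l subdomains.txt -ports 80,443,8000,8080,8888 -threads 200 \
      -o live_subdomains.txt                                      # 2. probe for live hosts
cat live_subdomains.txt | gau         > 1.txt                     # 3. URLs from public sources
cat live_subdomains.txt | waybackurls > 2.txt                     # 4. URLs from the Wayback Machine
cat 1.txt 2.txt | sort -u > sorturls.txt                          # 5. merge and de-duplicate
paramspider -l sorturls.txt                                       # 6. enumerate parameters
EOF
chmod +x recon.sh
```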
