Mastering Subdomain Enumeration



“In the Name of Allah, the Beneficent, the Merciful”

Mastering the art of subdomain enumeration is a crucial skill for those seeking to unlock the full potential of web architecture. By systematically discovering and mapping subdomains, researchers, security professionals, and curious enthusiasts can gain valuable insights, unravel digital footprints, and fortify cyber defenses. The process entails employing various techniques and tools to navigate through the labyrinthine structures of the internet, connecting the dots that form the intricate tapestry of online presence.

This is the undetailed version of my write-up; the detailed one:

When talking about subdomain enumeration, we have two phases: vertical and horizontal domain correlation.

Vertical VS Horizontal

Discovering the IP space

1. ASN Enumeration

# get the ASN from websites like

# find out the IP ranges that reside inside that ASN
apt-get install whois
whois -h -- '-i origin AS8983' | grep -Eo "([0-9.]+){4}/[0-9]+" | uniq -u > ip_ranges.txt
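The grep stage of that one-liner just extracts CIDR blocks from the whois route objects. A quick offline check of the extraction, with a made-up RADB-style snippet standing in for real whois output:

```shell
# sample route object output; the grep pulls out only the CIDR block
printf 'route:      203.0.113.0/24\norigin:     AS8983\n' \
  | grep -Eo "([0-9.]+){4}/[0-9]+" | uniq -u
# 203.0.113.0/24
```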

2. PTR records (Reverse DNS)

cat ip_ranges.txt | mapcidr -silent | dnsx -ptr -resp-only -o ptr_records.txt
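A PTR lookup queries the reverse name built from the IP's octets in reverse order under in-addr.arpa; dnsx just does this in bulk. A minimal sketch of how that query name is formed (203.0.113.10 is a documentation address, not a real target):

```shell
ip="203.0.113.10"
# reverse the octets and append the in-addr.arpa zone
ptr_name=$(echo "$ip" | awk -F. '{print $4"."$3"."$2"."$1".in-addr.arpa"}')
echo "$ptr_name"   # 10.113.0.203.in-addr.arpa
# with network access, dig +short -x "$ip" would return the PTR record itself
```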

3. Favicon Search

cat urls.txt | python3 -o output

4. Finding related domains/acquisitions

use ChatGPT, Google, Wikipedia, [ subfinder -d -o passive2.txt -all ] (it is important to add APIs to get better results; go to the detailed blog if you didn't get it)

Internet Archive → waybackurls

Github Scraping → github-subdomains

GitLab Scraping → gitlab-subdomains

→ it is like a database or something; here you can get all subdomains for public bug bounty programs, so it is useless when you work on a private one.

1. DNS Brute Forcing [using puredns]

# Prerequisites
git clone
cd massdns
sudo make install

#Installing the tool
go install

# Download Resolvers List

# You even can make yours
git clone
cd dnsvalidator/
pip3 install -r requirements.txt
pip3 install setuptools==58.2.0
python3 setup.py install
dnsvalidator -tL -threads 100 -o resolvers.txt

# Download dns wordlist

# Brute Forcing
puredns bruteforce best-dns-wordlist.txt -r resolvers.txt -w dns_bf.txt
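Under the hood, DNS brute forcing is just candidate generation plus mass resolution: every wordlist entry is prefixed to the apex domain and handed to the resolvers (puredns adds wildcard filtering and massdns speed on top). A miniature sketch of the generation step, with example.com standing in for the target:

```shell
domain="example.com"               # placeholder target, replace with yours
printf '%s\n' dev staging api > words.txt
# prefix every word to the apex; these candidates are what gets resolved
while read -r w; do echo "$w.$domain"; done < words.txt
# dev.example.com
# staging.example.com
# api.example.com
```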

2. Permutations

# Permutation words Wordlist
# Run
gotator -sub subdomains.txt -perm dns_permutations_list.txt -depth 1 -numbers 10 -mindup -adv -md | sort -u > perms.txt
# DNS resolve them and check for valid ones.
puredns resolve perms.txt -r resolvers.txt > resolved_perms.txt
# Hint: collect the subdomains that did not resolve, generate permutations from them as well, then resolve those; you may get valid, hard-to-find subdomains this way
gotator -sub not_valid_subs.txt -perm dns_permutations_list.txt -depth 1 -numbers 10 -mindup -adv -md | sort -u > perms2.txt
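What gotator does is mechanical: it splices permutation words into known names (word-sub, sub-word, word.sub, numbered variants, and so on). The core idea in miniature, assuming api.example.com is a known subdomain and three permutation words:

```shell
sub="api" ; domain="example.com"     # illustrative names, not a real target
for w in dev stage v2; do
  echo "$w-$sub.$domain"             # dev-api.example.com ...
  echo "$sub-$w.$domain"             # api-dev.example.com ...
  echo "$w.$sub.$domain"             # dev.api.example.com ...
done | sort -u
```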

3. Google Analytics

git clone
cd AnalyticsRelationships/Python
sudo pip3 install -r requirements.txt
python3 -u
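The trick behind the Google Analytics approach: pages belonging to one organization often embed the same Analytics / Tag Manager ID, so extracting the ID and searching for other sites carrying it reveals related domains. The extraction itself is a one-liner (the HTML below is a made-up sample, not a real page):

```shell
# sample page source with an embedded Analytics ID
cat > page.html <<'EOF'
<script>ga('create', 'UA-12345678-1', 'auto');</script>
EOF
grep -Eo "UA-[0-9]+-[0-9]+" page.html   # UA-12345678-1
```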

4. TLS, CSP, CNAME Probing

go install
cero | sed 's/^\*\.//' | grep -e "\." | sort -u
cat subdomains.txt | httpx -csp-probe -status-code -retries 2 -no-color | anew csp_probed.txt | cut -d ' ' -f1 | unfurl -u domains | anew -q csp_subdomains.txt
# cname
dnsx -retry 3 -cname -l subdomains.txt
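The CSP probe works because Content-Security-Policy headers enumerate the hosts a page may load resources from, and those allow-lists often leak extra subdomains. Splitting a header value is enough to see the idea (sample header with example.com hosts):

```shell
# a sample CSP header value; the pipeline extracts each allowed host
csp="default-src 'self' https://cdn.example.com https://internal-api.example.com"
echo "$csp" | tr ' ' '\n' | grep -Eo '([a-z0-9-]+\.)+[a-z]+' | sort -u
# cdn.example.com
# internal-api.example.com
```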

5. Scraping(JS/Source code)

# Web probing subdomains
cat subdomains.txt | httpx -random-agent -retries 2 -no-color -o probed_tmp_scrap.txt

# Now, that we have web probed URLs, we can send them for crawling to gospider
gospider -S probed_tmp_scrap.txt --js -t 50 -d 3 --sitemap --robots -w -r > gospider.txt

#Cleaning the output
sed -i '/^.\{2048\}./d' gospider.txt
cat gospider.txt | grep -Eo 'https?://[^ ]+' | sed 's/]$//' | unfurl -u domains | grep "$" | sort -u > scrap_subs.txt

# Resolving our target subdomains
puredns resolve scrap_subs.txt -w scrap_subs_resolved.txt -r resolvers.txt
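The gospider → unfurl chain boils down to: pull every URL out of crawled HTML/JS, then reduce each URL to its hostname. The same extraction on a one-line JS sample (the hostname is illustrative):

```shell
# a crawled JS fragment referencing an internal host
echo 'fetch("https://internal.example.com/api/v1/users");' > js_snippet.txt
grep -Eo 'https?://[^"/ ]+' js_snippet.txt | sed -E 's|https?://||' | sort -u
# internal.example.com
```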


6. Recursive Enumeration

go install -v

for sub in $( ( cat subdomains.txt | rev | cut -d '.' -f 3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 && cat subdomains.txt | rev | cut -d '.' -f 4,3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 ) | sed -e 's/^[[:space:]]*//' | cut -d ' ' -f 2);do
subfinder -d $sub -silent -max-time 2 | anew -q passive_recursive.txt
assetfinder --subs-only $sub | anew -q passive_recursive.txt
amass enum -timeout 2 -passive -d $sub | anew -q passive_recursive.txt
findomain --quiet -t $sub | anew -q passive_recursive.txt
done
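The ugly subshell in that for-loop header is doing frequency analysis: it counts which parent names (at the third and fourth level) recur most across your findings and feeds the top ten back into passive enumeration. The counting step in isolation, on three made-up subdomains:

```shell
# count recurring 3rd-level parents; rev+cut keeps the last three labels
printf '%s\n' a.dev.example.com b.dev.example.com c.prod.example.com \
  | rev | cut -d '.' -f 3,2,1 | rev | sort | uniq -c | sort -nr
# dev.example.com (count 2) sorts first, so it gets enumerated recursively
```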

cd subs/
cat horizontal/ptr_records.txt | sort -u > horizontal.txt
cat Vertical/Active/* | sort -u > active.txt
cat Vertical/Passive/* | sort -u > passive.txt
cat * | sort -u > all_subs.txt
cat all_subs.txt | httpx -random-agent -retries 2 -no-color -o filtered_subs.txt