There are a number of tools we can use to spider a site, such as OWASP ZAP or WebScarab, but such tools can be very noisy and offer little stealth. Any security device, or any security engineer worth their overpaid salary, will likely notice the additional traffic and rapid requests. One way to avoid this detection is to scan the archive of the website at archive.org (the Wayback Machine, which maintains historical snapshots of a huge portion of the web). Of course, these archives will not likely be identical to the live site, but they will likely have enough in common to minimize false positives while, at the same time, not alerting the site owner.
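Under the hood, this approach amounts to mining the Wayback Machine for old URLs and harvesting the query-string parameter names they contain. As a rough sketch of the idea (the CDX endpoint is archive.org's real query interface, but example.com, the inline sample URLs, and the regex are placeholders for illustration, not ParamSpider's actual code):

```shell
# Extract distinct GET-parameter names from a list of URLs. Here a small
# inline sample stands in for CDX output; the live feed would come from
# the Wayback Machine CDX endpoint, e.g.:
#   curl -s "https://web.archive.org/cdx/search/cdx?url=example.com/*&output=text&fl=original&collapse=urlkey"
printf '%s\n' \
  'http://example.com/item?id=1&page=2' \
  'http://example.com/search?q=test' \
  'http://example.com/item?id=9' \
| grep -oE '[?&][A-Za-z0-9_]+=' \
| tr -d '?&=' \
| sort -u
```

The same pipeline works on the live CDX feed: replace the printf sample with the curl command shown in the comment, and no request ever touches the target's own servers.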
There is an excellent tool called ParamSpider for finding these parameters in archived websites. Let's take a look at it in this tutorial.
Our first step is to download ParamSpider. We can use git clone to copy it into our system.
kali > sudo git clone https://github.com/devanshbatham/ParamSpider
Now, navigate to the new directory ParamSpider and list the contents of the directory.
kali > cd ParamSpider
kali > ls -l
Note the requirements.txt file. We can use that file to install all of this tool's dependencies with pip:
kali > pip install -r requirements.txt
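With the requirements installed, a typical run simply points the tool at a target domain. The flag below is taken from the project's README at the time of writing and may differ between versions, so confirm the exact options with the tool's help output first; example.com is a placeholder for a domain you are authorized to test:

```shell
# Hypothetical invocation -- verify the exact flags for your version with:
#   python3 paramspider.py -h
python3 paramspider.py --domain example.com
```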