Tips

NTLM cracking with --remove

Extract fields 1 and 4 (user and NT hash) from the dump so it can be fed to hashcat with --remove, which drops hashes from the hash file as they crack:

cat .\hash.txt | .\cut.exe -d ":" "-f1,4" | Out-File -FilePath ntlm.txt -Encoding utf8
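If cut.exe is not at hand, the same field extraction can be sketched in Python — a minimal stand-in, assuming the usual secretsdump `domain\user:rid:lmhash:nthash:::` layout (`extract_user_nt` and `convert` are hypothetical helper names):

```python
def extract_user_nt(line):
    """Keep fields 1 and 4 (user and NT hash) from a
    secretsdump-style 'domain\\user:rid:lmhash:nthash:::' line."""
    parts = line.strip().split(":")
    return f"{parts[0]}:{parts[3]}" if len(parts) >= 4 else None

def convert(src_path, dst_path="ntlm.txt"):
    # File-to-file variant mirroring the cut.exe pipeline above
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            pair = extract_user_nt(line)
            if pair:
                dst.write(pair + "\n")
```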

Script to sort the cracked NTLM hashes

python3 script_passwords.py hashes.txt cracked.txt

#!/usr/bin/env python3

import sys

print("---start process---")

file_hashes = sys.argv[1]
file_cracked = sys.argv[2]

with open(file_hashes) as f:
    hashes = [line.rstrip() for line in f]

with open(file_cracked) as f:
    cracked = [line.rstrip() for line in f]

# For every dumped hash, write the matching hash:password line.
# One output line per occurrence, so repetitions are kept on purpose.
with open("output.txt", "w") as out:
    for dumped in hashes:
        for crack in cracked:
            if crack.split(":")[0] in dumped:
                out.write(crack + "\n")

The output.txt file is generated with all the cracked hash:password lines, including repetitions.

Finally, the top passwords can be seen with:

cat output.txt | sort | uniq -c | sort -nr
or
Get-Content .\output.txt | Group-Object | Sort-Object Count -Descending | Select-Object Name, Count
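The same frequency count can also be done in Python; a minimal sketch (`top_passwords` is a hypothetical helper):

```python
from collections import Counter

def top_passwords(lines):
    """Count duplicate hash:password lines, most common first;
    equivalent to: sort | uniq -c | sort -nr"""
    return Counter(line.strip() for line in lines if line.strip()).most_common()

# Usage: top_passwords(open("output.txt"))
```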

Create a customized dictionary from rockyou

# grep -Ei 'batman|arkham|joker|alfred|bruce' /usr/share/wordlists/rockyou.txt > batman.txt
# wc -l batman.txt
5532 batman.txt
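When grep is not available, the same case-insensitive keyword filter can be sketched in Python (`matches` and `build_wordlist` are hypothetical helpers):

```python
def matches(line, keywords):
    """Case-insensitive keyword test, like grep -Ei 'kw1|kw2|...'."""
    low = line.lower()
    return any(k.lower() in low for k in keywords)

def build_wordlist(keywords, source_path, out_path):
    # Mirrors: grep -Ei 'batman|arkham|...' rockyou.txt > batman.txt
    with open(source_path, errors="ignore") as src, open(out_path, "w") as dst:
        for line in src:
            if matches(line, keywords):
                dst.write(line)
```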

Password Profiling / Skweez && CEWL

.\skweez.exe https://xxxx/pt-pt https://xxx/pilotos -n 16 -m 1 -o teste.txt
cewl www.megacorpone.com -m 6 -w megacorp-cewl.txt
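As a rough illustration of what these spidering tools collect, here is a minimal word-extraction sketch — not the real CeWL logic, just a hypothetical `extract_words` helper mirroring the effect of `-m 6` on already-fetched page text:

```python
import re

def extract_words(text, min_len=6):
    """Pull unique candidate words of at least min_len letters
    from page text, keeping first-seen casing."""
    words = re.findall(r"[A-Za-z]{%d,}" % min_len, text)
    seen, out = set(), []
    for w in words:
        if w.lower() not in seen:
            seen.add(w.lower())
            out.append(w)
    return out
```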

Hashcat cracking methodology (including custom charsets)

  • Wordlist-based cracking

  • Cracking with rules

  • Hybrid cracking: wordlist + mask && mask + wordlist.

Hint: use ?d (several ...) and ?s?d?d ..n

  • Brute-force cracking with a mask: ?sTarget?s?d?d?d?d (incremental, lengths 7 - 15)

  • Pure brute-force with a custom charset: -3 ?l?u and mask ?3?3?3?3?3?3?3
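Before launching a mask attack, it helps to estimate the keyspace; a minimal sketch, assuming hashcat's built-in charset sizes (?l=26, ?u=26, ?d=10, ?s=33, ?a=95; `keyspace` is a hypothetical helper):

```python
# Built-in hashcat charset sizes (assumed): ?l=26 ?u=26 ?d=10 ?s=33 ?a=95
SIZES = {"l": 26, "u": 26, "d": 10, "s": 33, "a": 95}

def keyspace(mask, custom=None):
    """Number of candidates a hashcat mask generates.
    custom maps a custom charset id ('1'..'4') to its size,
    e.g. -3 ?l?u  ->  {'3': 52}."""
    sizes = dict(SIZES, **(custom or {}))
    total, i = 1, 0
    while i < len(mask):
        if mask[i] == "?":
            total *= sizes[mask[i + 1]]
            i += 2
        else:  # literal character: exactly one fixed choice
            i += 1
    return total
```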

Password analysis (Active Directory)

Normalize the dump so that the several "domain\user" entries that belong to the same domain all use one consistent prefix.

Bonus: run powershell script to get target groups ;)

# Import the PowerView module first!
# Pick the target groups manually or filter on *admin*
Get-DomainGroup -Properties name | Out-File -FilePath domaingroups.txt

# Create the target file with the domain groups: groups.txt

# Execute the script below, changing the vars!

-------------------------------------------------------------------------
$domain="org_domain_xpto.pt\"
$workdir_files=".\workdir_name\"
$dc_dump_file="hash.txt"
$cracked_file="cracked.txt"
$regex="admin"   # group-name filter applied below; adjust as needed

foreach($group in Get-Content .\groups.txt) {
    if($group -match $regex){
        Write-Host "building group file: $group" -ForegroundColor red -BackgroundColor white
        Get-DomainGroupMember $group |
            Where-Object { $_.MemberObjectClass -eq "user" } |
            ForEach-Object { $domain + $_.MemberName } |
            Out-File -FilePath "$group.txt" -Encoding utf8
    }
}

Write-Host "DPAT groups:"
Write-Host "python.exe .\dpat.py -n $workdir_files$dc_dump_file -c $workdir_files$cracked_file -g " -NoNewline
foreach($group in Get-Content .\groups.txt) {
    if($group -match $regex){
        If ((Get-Content "$group.txt")) {
          Write-Host " '$workdir_files$group.txt'" -NoNewline
        }
    }
}
Write-Host ""
Write-Host "Done! ;)" -ForegroundColor red -BackgroundColor white

Via BloodHound or Neo4j

//get all groups
MATCH (g:Group) WHERE g.name CONTAINS 'ADMIN'
RETURN g.name AS GroupName

//get all users from group
Match (u:User)-[:MemberOf]->(g:Group) WHERE g.name CONTAINS "DOMAIN ADMINS" return u.name,g.name

//my favorite: get all users from groups whose name contains "ADMIN"
Match (u:User)-[:MemberOf]->(g:Group) WHERE g.name CONTAINS "ADMIN" return u.name,g.name

The Neo4j output can be exported as CSV: user,group.

The following script generates group files with target users. We just need to execute it:

python3 scrypt.py export.csv domain_ntds.txt
import sys
import os
import shutil

def copy_file_to_folder(source_file, destination_folder):
    # Copy the dump file next to the generated group files
    file_name = os.path.basename(source_file)
    shutil.copy2(source_file, destination_folder)

    # Print the name of the copied file
    print(f"File '{file_name}' copied to '{destination_folder}'")

def load_hashes_users_domains(file_path):
    hashes_users_domains = []
    with open(file_path, 'r') as file:
        for line in file:
            if '\\' not in line:
                continue
            parts = line.strip().split(':')
            domain, user = parts[0].split('\\')
            hash_value = parts[3]
            hashes_users_domains.append((domain, user, hash_value))
    return hashes_users_domains

def change_domain_prefix(input_folder, hashes_users_domains):
    # Rewrite the "DOMAIN\user" prefix in every group file with the real
    # domain taken from the NTDS dump, matched on the username
    for file_name in os.listdir(input_folder):
        if not file_name.endswith('.txt'):
            continue
        file_path = os.path.join(input_folder, file_name)
        with open(file_path, 'r') as file:
            lines = file.readlines()

        with open(file_path, 'w') as file:
            for line in lines:
                parts = line.strip().split("\\")
                if len(parts) < 2:
                    file.write(line)
                    continue
                for domain, user, hash_value in hashes_users_domains:
                    if parts[1].strip().lower() == user.strip().lower():
                        line = domain + "\\" + parts[1] + "\n"
                        break
                file.write(line)

def group_users(source_file, folder_name):
    # Dictionary to store users grouped by group names
    group_users_dict = {}

    created_files = []  # List to store the names of created files

    with open(source_file, 'r') as file:
        for line in file:
            # Splitting each line into user and group
            user, group = line.strip().split(',')

            # Remove everything after "@" character
            user = user.split('@')[0]

            # Add the domain prefix to the user
            user_with_prefix = f"{prefix}\\{user}"

            # Check if the group already exists in the dictionary
            if group in group_users_dict:
                # Append the user to the existing group
                group_users_dict[group].append(user_with_prefix)
            else:
                # Create a new group entry in the dictionary
                group_users_dict[group] = [user_with_prefix]

    # Writing users to separate output files named after each group inside the folder
    for group, users in group_users_dict.items():
        output_file = os.path.join(folder_name, f"{group}.txt")
        with open(output_file, 'w') as out:
            for user in users:
                out.write(user + '\n')
        created_files.append(output_file)

    # Generating the command with the -g parameter containing the names of the files created
    command_files = ' '.join(f'"{file}"' for file in created_files)
    command = f"python3 dpat.py -n '{folder_name}/{dump_file}' -c '{folder_name}/cracked.txt' -g {command_files}"
    print("\n\n\n" + command + "\n\n\n")

    print(f"All users grouped by the prefix '{prefix}' have been saved to '{folder_name}' folder.")

if __name__ == "__main__":
    if len(sys.argv) != 3:
        print("Usage: python script.py <csv_neo4j> <dump_ntds> ")
        sys.exit(1)
    
    input_file = sys.argv[1]
    dump_file = sys.argv[2]
    prefix = input("Enter DOMAIN prefix for all users: ")
    folder_name = f"{prefix}_user_files"
    os.makedirs(folder_name, exist_ok=True)
    group_users(input_file, folder_name)
    
    hashes_users_domains = load_hashes_users_domains(dump_file)
    
    change_domain_prefix(folder_name, hashes_users_domains)
    print("Domain prefixes have been updated in the files.")
    
    copy_file_to_folder(dump_file, folder_name)
    

NTDS Active Users

Once we have the NTDS file, we can export the active users from the Neo4j database and build a new NTDS file containing only those active users. The following script does this.

# Define the file paths
file1_path = 'ntds_file.txt'
file2_path = 'sam_active_users.txt'
output_path = 'matching_users.txt'

# Read the password dump file
with open(file1_path, 'r') as file1:
    dump_data = file1.readlines()

# Read the file with the active SAM users
with open(file2_path, 'r') as file2:
    active_users = [line.strip() for line in file2.readlines()]

# Open the output file to write the results
with open(output_path, 'w') as output_file:
    # Iterate over each line of the password dump
    for line in dump_data:
        # Extract the sam-user from each dump line
        parts = line.split(':')
        if len(parts) >= 2:
            domain_user = parts[0]
            sam_user = domain_user.split('\\')[-1]
            # Check whether the sam-user is in the active SAM users list
            if sam_user in active_users:
                # Write the matching line to the output file
                output_file.write(line)

print(f"Matching results saved to {output_path}")

AD_Miner

AD Miner is an Active Directory audit tool that leverages Cypher queries to crunch data from the BloodHound graph database and uncover security weaknesses.

BONUS
