Don’t Leave Money on the Table: My Automated Hunt for $50-$500 Info Disclosure Bugs

Aman Sharma

Let’s be honest: we all dream of the big RCE. But while you’re chasing that unicorn, there’s a steady stream of smaller, often overlooked bounties waiting. I’m talking about information disclosure bugs. They might seem “Low” or “Medium” severity ($50-$500), but finding a bunch of them consistently? That’s how you build your bug bounty bankroll.

The secret? Automation. Stop manually clicking through every page. Stop squinting at every header. Let your tools do the grunt work. This isn’t theoretical fluff; this is about turning those boring scans into actual cash.

Imagine a company’s hidden admin dashboard, sensitive API keys, or even just old, forgotten backup files sitting exposed. That’s information disclosure. It’s like finding a treasure map. While it might not let you take over an account directly, these leaks are goldmines because they can:

  • Lead to bigger bugs: Those exposed API keys might give you access to internal services.
  • Help with recon: Knowing server versions or internal paths makes chaining easier.
  • Be a direct bounty: Many programs pay for PII leaks (even if minor) or clear misconfigurations.

These bugs often slip past developers because they’re not obvious code flaws; they’re usually configuration mistakes or leftover files.

You don’t need to be a coding wizard to automate this. Basic scripting and powerful open-source tools are your best friends.

1. Mapping the Battlefield: Subdomains & URLs

You can’t find exposed info if you don’t know where to look. These tools help you discover every corner of your target.

  • Tools: subfinder, httpx, gau (Get All URLs), waybackurls.

How I use it: I start by finding all subdomains, then check which ones are live. After that, I pull every URL ever recorded for those live sites from historical archives. That old beta.target.com might still be alive with exposed files!

# Find all subdomains for your target
subfinder -d example.com -silent > example_subs.txt

# Check which subdomains are live (HTTP 200, 302, 403 responses)
httpx -l example_subs.txt -mc 200,302,403 -tech-detect -title -silent > example_live_sites.txt

# Get all known URLs from historical data (Wayback Machine, Common Crawl, etc.)
# This finds old, forgotten paths that might be vulnerable
gau --subs example.com | tee example_all_urls.txt
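
A quick follow-up worth automating here (a minimal sketch; the extension list is my own assumption, and example_all_urls.txt is the gau output from above) is to grep the archived URLs for file types that tend to leak:

# Pull out archived URLs pointing at backups, configs, dumps, etc.
grep -iE '\.(env|bak|old|sql|zip|tar|gz|log|config|json)(\?|$)' example_all_urls.txt | sort -u > example_interesting_urls.txt

Feed anything that turns up back into httpx or your browser; most archive entries are dead, but the live ones are exactly the forgotten files this hunt is about.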

2. Knocking on Every Door: Directory & File Enumeration

Companies often leave backup files, config files, or even entire Git repositories exposed. These are often easy pickings.

  • Tools: ffuf (my favorite), dirsearch, or gobuster.
  • Wordlists: directory-list-2.3-medium.txt (for general paths), raft-large-files.txt (for common file names). You can find these in the SecLists repository.
  • How I use it: I run ffuf against my live URLs, trying common directory and file names, and specific extensions.
# Fuzzing a live subdomain for common sensitive files/directories
# -w: wordlist of common paths (e.g., from SecLists)
# -u: target URL, FUZZ is where the wordlist items go
# -e: extensions to try (.git, .env, .bak, .zip, .yml, .json)
# -mc: match status codes (200 OK is a hit, 403 Forbidden might mean a directory exists)
# -recursion: go deeper into found directories
ffuf -w /path/to/SecLists/Discovery/Web-Content/directory-list-2.3-medium.txt -u https://dev.example.com/FUZZ \
-e .git,.env,.bak,.zip,.yml,.json -mc 200,403 -recursion -rate 50 -of json -o dev_leaks.json

# For API-specific endpoints (e.g., /v1/users, /graphql)
# Build a custom wordlist by extracting from JS files (LinkFinder) or GitHub searches.
ffuf -w my_api_wordlist.txt -u https://api.example.com/FUZZ -mc 200 -H "Content-Type: application/json"
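
If you don’t want to set up LinkFinder, a rough substitute is to download the JavaScript files from your gau output and grep them for quoted path-like strings. This is only a sketch: the regex and file names are my assumptions and the output will be noisy, but it seeds my_api_wordlist.txt well enough to start fuzzing.

# Fetch JS files seen in the archives and extract quoted absolute paths as endpoint candidates
grep -iE '\.js(\?|$)' example_all_urls.txt | while read -r js; do curl -sk "$js"; done \
  | grep -oE '"/[a-zA-Z0-9_./-]+"' | tr -d '"' | sort -u > my_api_wordlist.txt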

Pro Tip: Fuzzing with Timestamps! Backups are often named by year (backup_2023.tar) or by full date (backup_20231001.tar). Generate the candidate names and fuzz for them; nest month/day loops the same way if you want full dates.

for i in {2020..2023}; do echo "backup_${i}"; done > dates.txt
ffuf -w dates.txt -u https://target.com/FUZZ.tar

3. Peeking at the Server: Headers & Misconfigurations

Servers love to brag about themselves, and sometimes that bragging leaks valuable info.

  • Tools: nmap, nikto, curl, custom Python scripts.
  • How I use it:
  • Nmap: Great for identifying server versions, open ports, and dangerous HTTP methods.
# Basic Nmap scan for service versions and HTTP headers
nmap -sV --script=http-headers,http-title -p 80,443 target.com

# Check for dangerous HTTP methods (PUT, DELETE, TRACE)
nmap -p 80,443 --script http-methods --script-args http-methods.url-path='/admin' target.com

Nikto: Automates checks for thousands of common web server vulnerabilities and insecure headers.

# Basic Nikto scan
nikto -h https://target.com

# Evade detection (tries to be stealthier)
nikto -h target.com -Tuning 1 -evasion 8

Quick Header Check (curl):

curl -I https://target.com
# Look for X-Powered-By, Server, X-Debug-Token, X-AspNet-Version

Python Script for Header/Content Leaks:

import requests

def check_for_info_leaks(url):
    print(f"Checking: {url}")
    try:
        response = requests.get(url, timeout=5)

        # Check headers for versions, debug info
        for header, value in response.headers.items():
            if header.lower() in ['server', 'x-powered-by', 'via', 'x-debug-token', 'x-aspnet-version']:
                print(f" [+] Header Leak: {header}: {value}")

        # Check content for sensitive keywords
        content = response.text.lower()
        if ".env" in content or "api_key" in content or "secret_key" in content:
            print(" [+] Content Leak: Found sensitive keywords!")
        if "internal_ip" in content or "192.168." in content or "10." in content:
            print(" [+] Content Leak: Found internal IP patterns!")
        if "stack trace" in content or "error at" in content:
            print(" [?] Content Leak: Found potential stack trace/verbose error!")

    except requests.exceptions.RequestException as e:
        print(f" Error fetching {url}: {e}")
    print("-" * 30)


if __name__ == "__main__":
    # Example: scan a list of live URLs from your httpx output
    urls_from_httpx = ["https://www.example.com", "https://api.example.com"]
    for url in urls_from_httpx:
        check_for_info_leaks(url)

Secret Tips from Real-World Hunts

  • The .git/.svn Heist: Found a .git or .svn directory? Use git-dumper or svn-dumper to download the entire repository (see the sketch after this list). You'll often find API keys, hardcoded secrets, or internal documentation.
  • Backup File Extensions: Always test common backup names like www.zip, backup.tar, or index.php~ (the tilde ~ is a common editor backup).
  • Proxy Everything: Route all your traffic through Burp Suite. Even when automating, manually inspect interesting responses. Burp’s Logger++ is a lifesaver for this.
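
For the .git case in particular, here’s a minimal sketch of how I follow up on a hit. It assumes git-dumper is installed (e.g. via pip install git-dumper); the URL and directory names are placeholders.

# Reconstruct the exposed repository locally
git-dumper https://dev.example.com/.git/ dumped-repo/

# Grep the working tree and the commit history for obvious secrets
grep -rniE "api[_-]?key|secret|password|token" dumped-repo/ | head -n 50
cd dumped-repo/ && git log -p | grep -iE "api[_-]?key|secret|password|token" | head -n 50

Even if the current tree looks clean, old commits often still contain keys that were “removed” later.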

This isn’t a complex setup. Think of it as a funnel (a short script tying the steps together follows the list):

  1. Expand Your Surface: Run subfinder, httpx, and gau to get a massive list of potential hunting grounds.
  2. Probe for Leaks: Use ffuf with targeted wordlists and extensions. Simultaneously, run your Python script to automatically check headers and content for quick wins.
  3. Analyze & Report: Review the findings from your tools. Look for clear text secrets, internal IPs, sensitive file contents, or detailed error messages.
  • Example: If ffuf finds target.com/.git/config with a 200 OK status, that's a direct bounty.
  • Example: If your Python script shows X-Powered-By: PHP/7.2.1 and Server: Nginx/1.18.0 in headers, or api_key in the page content, that's valuable info disclosure.
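
Tied together, the funnel is just a short script. Here’s a minimal sketch of how I chain the steps; it assumes bash, the SecLists path from earlier, and that subfinder, httpx, and ffuf are on your PATH, so treat it as a starting point rather than a finished framework.

#!/usr/bin/env bash
# Usage: ./hunt.sh example.com
set -euo pipefail
domain="$1"

# 1. Expand your surface: subdomains -> live hosts
subfinder -d "$domain" -silent > "${domain}_subs.txt"
httpx -l "${domain}_subs.txt" -mc 200,302,403 -silent > "${domain}_live.txt"

# 2. Probe for leaks: fuzz every live host for sensitive files and extensions
while read -r url; do
  out="leaks_$(echo "$url" | sed 's|https\?://||; s|[/:]|_|g').json"
  ffuf -w /path/to/SecLists/Discovery/Web-Content/raft-large-files.txt \
       -u "${url}/FUZZ" -e .git,.env,.bak,.zip -mc 200,403 -rate 50 \
       -of json -o "$out" || true
done < "${domain}_live.txt"

# 3. Analyze & report: review the JSON results (e.g. jq '.results[].url' leaks_*.json) for hits worth reporting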

I used to ignore these bugs. Now, they’re a consistent source of smaller payouts. It’s not about being a genius; it’s about being methodical and letting automation find the needles in the haystack. You’d be surprised how many companies pay for exposed .env files or misconfigured S3 buckets.

So, next time you’re feeling stuck on a critical RCE, switch gears. Set up your automation for info disclosure. You might just find that a few $100 bounties are far more reliable than waiting for that single unicorn. It’s all about playing the numbers game smartly.

Follow & subscribe for daily write-up updates via mail on Medium

