Bitcoin Forum
Author Topic: Is there any way to scan a website directory for files?  (Read 3767 times)
SgtSpike (OP) | Legendary
January 04, 2013, 05:38:11 PM | #1

Just curious, is there any way to find out what files are stored in a particular website directory?  For instance, could someone figure out the filenames of all the files I have stored in the base directory of bitcoinfeedback.com?
yogi | Legendary
January 04, 2013, 05:48:50 PM | #2

Generally it is not considered safe to allow access to a web server's directory system, even if it's read-only. That doesn't rule out the possibility that someone may have left a door open.

Crawling the web for links may uncover some of the files on your server. An exhaustive search could also uncover some of them, but not all files on a server are directly readable, and those will not succumb to this technique.

SgtSpike (OP) | Legendary
January 04, 2013, 05:52:33 PM | #3

Quote from: yogi
Generally it is not considered safe to allow access to a web server's directory system, even if it's read-only. That doesn't rule out the possibility that someone may have left a door open.

Crawling the web for links may uncover some of the files on your server. An exhaustive search could also uncover some of them, but not all files on a server are directly readable, and those will not succumb to this technique.
Thanks for the answer.

Let me draw up a few assumptions then:
The directory is left "open" (i.e., all files within it are accessible as long as the URL to said file is known), but has an index page to prevent a directory listing.  The file types are directly readable, and are NOT linked anywhere.

Would it be possible to discover the filenames of those files?
exxe | Full Member
January 04, 2013, 05:54:36 PM | #4

In your case, yes, if someone knows the directory name: http://www.bitcoinfeedback.com/images/
You can disable this by removing the "Indexes" argument from the Apache directory options (and I would recommend that you do this).

If you have an index.html in a directory, listings are disabled on most web servers and the content of index.html is shown instead.
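The same thing can usually be done per directory with a .htaccess file if you don't control the main Apache config. A minimal sketch, assuming the host permits the override via AllowOverride:

Code:
# .htaccess in the directory you want to protect
# (requires AllowOverride Options=Indexes or AllowOverride All)
Options -Indexes

With this in place, Apache answers a bare directory request with 403 Forbidden instead of a file listing.
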
exxe | Full Member
January 04, 2013, 06:00:55 PM | #5

Quote from: SgtSpike
Let me draw up a few assumptions then:
The directory is left "open" (i.e., all files within it are accessible as long as the URL to said file is known), but has an index page to prevent a directory listing.  The file types are directly readable, and are NOT linked anywhere.

Would it be possible to discover the filenames of those files?

No. (A bot could brute-force them, but there is no direct way.)
SgtSpike (OP) | Legendary
January 04, 2013, 06:05:19 PM | #6

Quote from: exxe
In your case, yes, if someone knows the directory name: http://www.bitcoinfeedback.com/images/
You can disable this by removing the "Indexes" argument from the Apache directory options (and I would recommend that you do this).

If you have an index.html in a directory, listings are disabled on most web servers and the content of index.html is shown instead.

Haha, this discussion did lead me to discover this... uh... problem on my own site. ;) I do not have direct access to the Apache config, but hopefully I can find a way to accomplish this indirectly. At the very least, I could throw up an index page in each directory.


Quote from: exxe
Quote from: SgtSpike
Let me draw up a few assumptions then:
The directory is left "open" (i.e., all files within it are accessible as long as the URL to said file is known), but has an index page to prevent a directory listing.  The file types are directly readable, and are NOT linked anywhere.

Would it be possible to discover the filenames of those files?
No.

Thank you!
yogi | Legendary
January 04, 2013, 06:07:27 PM | #7

Quote from: SgtSpike
Let me draw up a few assumptions then:
The directory is left "open" (i.e., all files within it are accessible as long as the URL to said file is known), but has an index page to prevent a directory listing.  The file types are directly readable, and are NOT linked anywhere.

Would it be possible to discover the filenames of those files?

You could use an exhaustive search by trying all possible file names, or even a dictionary attack. If you name your files with long random character sequences, such as '8s6g86sfshf7h0fdaf73toh3i3ih.html', then an exhaustive search becomes impractical.
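To put numbers on that: a 28-character name drawn from lowercase letters and digits has 36^28 (roughly 10^43) possibilities, so guessing one is hopeless. A minimal Python sketch for generating such names; the function below is illustrative, not something from this thread:

Code:
import secrets
import string

ALPHABET = string.ascii_lowercase + string.digits  # 36 symbols

def random_filename(length=28, ext=".html"):
    """Return an unguessable name like '8s6g86sfshf7h0fdaf73toh3i3ih.html'.

    36**28 is about 1e43 possibilities, far beyond any feasible scan.
    """
    return "".join(secrets.choice(ALPHABET) for _ in range(length)) + ext

print(random_filename())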

SgtSpike (OP) | Legendary
January 04, 2013, 06:10:43 PM | #8

Quote from: yogi
Quote from: SgtSpike
Let me draw up a few assumptions then:
The directory is left "open" (i.e., all files within it are accessible as long as the URL to said file is known), but has an index page to prevent a directory listing.  The file types are directly readable, and are NOT linked anywhere.

Would it be possible to discover the filenames of those files?
You could use an exhaustive search by trying all possible file names, or even a dictionary attack. If you name your files with long random character sequences, such as '8s6g86sfshf7h0fdaf73toh3i3ih.html', then an exhaustive search becomes impractical.
How many filenames could be practically searched for on the average server over, say, a month's time?  Or how often can a person feasibly make an HTTP request?

Quote from: exxe
In your case, yes, if someone knows the directory name: http://www.bitcoinfeedback.com/images/
You can disable this by removing the "Indexes" argument from the Apache directory options (and I would recommend that you do this).

If you have an index.html in a directory, listings are disabled on most web servers and the content of index.html is shown instead.
Quote from: SgtSpike
Haha, this discussion did lead me to discover this... uh... problem on my own site. ;) I do not have direct access to the Apache config, but hopefully I can find a way to accomplish this indirectly. At the very least, I could throw up an index page in each directory.
you probably can use .htaccess files to do it.

Thanks, you're right about that!
Herodes | Hero Member
January 04, 2013, 06:46:17 PM | #9

On a shared hosting environment, it might be possible to use SSH to access the same shared web host and look at your files that way.
SgtSpike (OP) | Legendary
January 04, 2013, 06:52:55 PM | #10

Quote from: Herodes
On a shared hosting environment, it might be possible to use SSH to access the same shared web host and look at your files that way.

Will keep that in mind, thanks!
Bitsky | Hero Member
January 04, 2013, 06:54:17 PM | #11

Quote from: SgtSpike
How many filenames could be practically searched for on the average server over, say, a month's time?  Or how often can a person feasibly make an HTTP request?

A simple HEAD request would be enough for a scan. Speed mostly depends on your server: good hardware and configuration can serve hundreds or thousands of requests per second.
Of course, if the scan gets out of control you'll notice, because your site will start lagging and your logs will blow up.
Even if you think you locked things down (like setting "Options -Indexes"), you can run into problems with, e.g., a buggy script. If you pay attention, you'll notice that security issues pop up all the time in the most common CMSes, like WordPress, Typo3, Joomla, Drupal, etc. If an attacker finds a bug that makes a script return a directory listing, you have lost.

Simple fix? Never store files you want to keep private in a location that is accessible by the webserver. Period.
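For scale: at a sustained 1,000 requests per second, a month is about 2.6 billion requests, enough to try every name of up to six lowercase-plus-digit characters (36^6 is about 2.2 billion) but hopeless against long random names. A minimal sketch of such a probe in Python, assuming the third-party requests library; the URL and wordlist are placeholders:

Code:
# Dictionary scan using HEAD requests (headers only, no body transfer).
import requests

BASE_URL = "http://www.example.com/images/"          # placeholder target
WORDLIST = ["backup.zip", "data.csv", "notes.txt"]   # stand-in dictionary

for name in WORDLIST:
    resp = requests.head(BASE_URL + name, timeout=5)
    if resp.status_code == 200:  # 200 means the file exists and is readable
        print("found:", BASE_URL + name)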

SgtSpike (OP) | Legendary
January 04, 2013, 06:55:53 PM | #12

Quote from: Bitsky
A simple HEAD request would be enough for a scan. Speed mostly depends on your server: good hardware and configuration can serve hundreds or thousands of requests per second.
Of course, if the scan gets out of control you'll notice, because your site will start lagging and your logs will blow up.
Even if you think you locked things down (like setting "Options -Indexes"), you can run into problems with, e.g., a buggy script. If you pay attention, you'll notice that security issues pop up all the time in the most common CMSes, like WordPress, Typo3, Joomla, Drupal, etc. If an attacker finds a bug that makes a script return a directory listing, you have lost.

Simple fix? Never store files you want to keep private in a location that is accessible by the webserver. Period.

Wouldn't hundreds or thousands of requests per second be deemed a DDoS attack by any semi-competent webhost?
Herodes | Hero Member
January 04, 2013, 07:01:04 PM | #13

Quote from: Herodes
On a shared hosting environment, it might be possible to use SSH to access the same shared web host and look at your files that way.
Quote from: SgtSpike
Will keep that in mind, thanks!

Just to be clear: this would mean the attacker gets access to the server through a web hosting account, which he could of course hack, or buy legitimately. It's not trivially easy, but if you have a system that you want few people to look at, you should go for a VPS.
Bitsky | Hero Member
January 04, 2013, 07:10:45 PM | #14

Quote from: SgtSpike
Wouldn't hundreds or thousands of requests per second be deemed a DDoS attack by any semi-competent webhost?

Depends. What if you launched a big ad campaign and suddenly got a mass of requests? You would not want your webhost to shut your site down because he thinks something is wrong.
Granted, that many requests from a single host would be suspicious, but if the attacker has resources (i.e., a botnet) he can spread those requests over many hosts.
If you run your own server, you could lay traps, like letting fail2ban monitor the logs for requests for, e.g., the infamous info.php, and block that IP.
If you absolutely have to store sensitive information on your webserver (data exchange, backups, etc.), at least put it into a TrueCrypt container and upload that instead.
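A minimal sketch of such a trap, assuming fail2ban's usual filter-plus-jail layout; the filter name, regex, and log path are illustrative:

Code:
# /etc/fail2ban/filter.d/probe-trap.conf  (hypothetical filter name)
[Definition]
# Match any client that requests the bait script info.php
failregex = ^<HOST> .* "(GET|HEAD) /info\.php

# /etc/fail2ban/jail.local
[probe-trap]
enabled  = true
filter   = probe-trap
logpath  = /var/log/apache2/access.log
maxretry = 1
bantime  = 86400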

Quote from: Herodes
you should go for a VPS.

Like Linode? :P

Herodes | Hero Member
January 04, 2013, 07:30:02 PM | #15

Quote from: Bitsky
Like Linode? :P

No, somebody competent!
SgtSpike (OP) | Legendary
January 04, 2013, 08:48:41 PM | #16

Quote from: Bitsky
Quote from: SgtSpike
Wouldn't hundreds or thousands of requests per second be deemed a DDoS attack by any semi-competent webhost?
Depends. What if you launched a big ad campaign and suddenly got a mass of requests? You would not want your webhost to shut your site down because he thinks something is wrong.
Granted, that many requests from a single host would be suspicious, but if the attacker has resources (i.e., a botnet) he can spread those requests over many hosts.
If you run your own server, you could lay traps, like letting fail2ban monitor the logs for requests for, e.g., the infamous info.php, and block that IP.
If you absolutely have to store sensitive information on your webserver (data exchange, backups, etc.), at least put it into a TrueCrypt container and upload that instead.
Quote from: Herodes
you should go for a VPS.
Quote from: Bitsky
Like Linode? :P

Alright, I'll keep that in mind, thank you!