Title: Is there any way to scan a website directory for files? Post by: SgtSpike on January 04, 2013, 05:38:11 PM Just curious, is there any way to find out what files are stored in a particular website directory? For instance, could someone figure out the filenames of all the files I have stored in the base directory of bitcoinfeedback.com?
Title: Re: Is there any way to scan a website directory for files? Post by: yogi on January 04, 2013, 05:48:50 PM Generally it is not considered safe to allow access to a web server's directory system, even if it's read-only. This doesn't rule out the possibility that someone may have left a door open.
Crawling the web for links may uncover some of the files on your server. An exhaustive search could also uncover some of these files, but not all file types on servers are directly readable, and those will not succumb to this technique.

Title: Re: Is there any way to scan a website directory for files? Post by: SgtSpike on January 04, 2013, 05:52:33 PM Quote from: yogi: "Generally it is not considered safe to allow access to a web server's directory system [...] not all file types on servers are directly readable and will not succumb to this technique." Thanks for the answer. Let me draw up a few assumptions, then: the directory is left "open" (i.e., all files within it are accessible as long as the URL to said file is known), but has an index page to prevent a directory listing. The file types are directly readable, and are NOT linked anywhere. Would it be possible to discover the filenames of those files?

Title: Re: Is there any way to scan a website directory for files? Post by: exxe on January 04, 2013, 05:54:36 PM In your case, if someone knows the directory name, yes: http://www.bitcoinfeedback.com/images/
You can disable this by removing the "Indexes" argument from the Apache directory options (and I would recommend that you do so). If a directory contains an index.html, listings are disabled on most web servers and the content of index.html is shown instead.

Title: Re: Is there any way to scan a website directory for files? Post by: exxe on January 04, 2013, 06:00:55 PM Quote from: SgtSpike: "Let me draw up a few assumptions then: the directory is left "open" [...] Would it be possible to discover the filenames of those files?" No. A bot could brute-force them, but there is no direct way.

Title: Re: Is there any way to scan a website directory for files? Post by: SgtSpike on January 04, 2013, 06:05:19 PM Quote from: exxe: "In your case, if someone knows the directory name, yes: http://www.bitcoinfeedback.com/images/" Haha, this discussion did lead me to discover this... uh... problem on my own site. ;) I do not have direct access to the Apache config, but hopefully I can find a way to accomplish this indirectly. At the very least, I could throw up an index in each directory.

Title: Re: Is there any way to scan a website directory for files?
Post by: yogi on January 04, 2013, 06:07:27 PM Quote from: SgtSpike: "Let me draw up a few assumptions then: [...] Would it be possible to discover the filenames of those files?" You could use an exhaustive search, trying all possible file names, or even a dictionary attack. If you name your files with long random character sequences, such as '8s6g86sfshf7h0fdaf73toh3i3ih.html', this would render an exhaustive search impractical.

Title: Re: Is there any way to scan a website directory for files? Post by: SgtSpike on January 04, 2013, 06:10:43 PM Quote from: yogi: "You could use an exhaustive search [...] this would render an exhaustive search impractical." Quote from: exxe: "You can disable this by removing the "Indexes" argument in the Apache directory options [...] the content of index.html is shown."
You probably can use .htaccess files to do it.

Title: Re: Is there any way to scan a website directory for files? Post by: Herodes on January 04, 2013, 06:46:17 PM In a shared hosting environment, it might be possible to use SSH to access the same shared web host and look at your files that way.
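For reference, the .htaccess route mentioned above is typically a one-liner. This sketch assumes the host permits `Options` overrides (`AllowOverride Options` or `All` in the server config):

```apacheconf
# .htaccess placed in the directory to protect:
# disable automatic directory listings
Options -Indexes
```

Where overrides are disabled, dropping an index.html into the directory achieves the same visible effect, as noted earlier in the thread.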
Title: Re: Is there any way to scan a website directory for files? Post by: SgtSpike on January 04, 2013, 06:52:55 PM Quote from: Herodes: "In a shared hosting environment, it might be possible to use SSH to access the same shared web host and look at your files that way." Will keep that in mind, thanks!

Title: Re: Is there any way to scan a website directory for files? Post by: Bitsky on January 04, 2013, 06:54:17 PM Quote: "How many filenames could be practically searched for on the average server over, say, a month's time? Or how often can a person feasibly make an HTTP request?" A simple HEAD request would be enough for a scan. Speed mostly depends on your server: good hardware and configuration can serve hundreds or thousands of requests per second. Of course, if the scan gets out of control you'll notice, because your site will start lagging and your logs will blow up. Even if you think you locked things down (like setting "Options -Indexes"), you can run into problems with, e.g., a buggy script. If you pay attention, you'll notice that security issues pop up all the time in the most common CMSes, like WordPress, TYPO3, Joomla, Drupal, etc. If an attacker finds a bug which makes a script return a directory listing, you have lost. Simple fix? Never store files you want to keep private in a location that is accessible by the web server. Period.

Title: Re: Is there any way to scan a website directory for files? Post by: SgtSpike on January 04, 2013, 06:55:53 PM Quote from: Bitsky: "A simple HEAD request would be enough for a scan [...] your site will start lagging and your logs will blow up."
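The HEAD-request dictionary scan described here might be sketched as follows. This is a minimal illustration, not a real scanner: the target host and wordlist are placeholders, and the helper names (`candidate_urls`, `url_exists`) are invented for the example.

```python
import urllib.error
import urllib.request

def candidate_urls(base, words, exts=(".html", ".php", ".txt")):
    """Build the guess list a dictionary scan would try."""
    return [f"{base}/{w}{e}" for w in words for e in exts]

def url_exists(url, timeout=5):
    """Send a HEAD request; True if the server answers without a 404."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout):
            return True
    except urllib.error.HTTPError as err:
        # A 403 still reveals that something is there; 404 means no file.
        return err.code != 404
    except urllib.error.URLError:
        return False

# Placeholder target and wordlist -- not real paths from the thread:
urls = candidate_urls("http://www.example.com", ["backup", "admin", "test"])
```

Each guess costs one small request, which is why the thread's point about server rate limits and log noise is the practical bound on such a scan.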
Title: Re: Is there any way to scan a website directory for files? Post by: Herodes on January 04, 2013, 07:01:04 PM Quote from: SgtSpike: "Will keep that in mind, thanks!" Just to be clear: this would mean that the attacker gets access to the server through a web hosting account, which he could of course hack, or buy legitimately. It's not trivially easy, but if you have a system that you want few people to look at, you should go for a VPS.

Title: Re: Is there any way to scan a website directory for files? Post by: Bitsky on January 04, 2013, 07:10:45 PM Quote: "Wouldn't hundreds/thousands of requests per second be deemed a DDoS attack by any semi-competent webhost?" Depends. What if you launched a big ad campaign and suddenly got a mass of requests? You would not want your webhost to shut your site down because he thinks something is wrong. Granted, that many requests from a single host would be suspicious, but if the attacker has resources (i.e. a botnet), he can spread those requests over many hosts. If you run your own server, you could lay traps, like letting fail2ban monitor the logs for requests for, e.g., the infamous info.php, and block that IP. If you absolutely have to store sensitive information on your webserver (data exchange, backups, etc.), at least put it into a TrueCrypt container and upload that one instead. Quote from: Herodes: "you should go for a VPS." Like Linode?
:P

Title: Re: Is there any way to scan a website directory for files? Post by: Herodes on January 04, 2013, 07:30:02 PM

Title: Re: Is there any way to scan a website directory for files? Post by: SgtSpike on January 04, 2013, 08:48:41 PM Quote from: Bitsky: "Depends. What if you launched a big ad campaign and suddenly you get a mass of requests? [...] If you run your own server, you could lay traps, like letting fail2ban monitor the logs for requests for the infamous info.php and block that IP. [...] Like Linode? :P"
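To put numbers on the two questions raised above, "how many filenames could be searched in a month" and "do random names really help", the arithmetic is simple. A sketch, with the caveat that the 100 requests-per-second rate is an assumption for illustration, not a figure from the thread:

```python
# Assumed sustained scan rate; the thread mentions "hundreds/thousands"
# of requests per second as what a good server can handle, so 100/s is
# a modest rate an attacker might sustain without tripping alarms.
RATE_PER_SECOND = 100
SECONDS_PER_MONTH = 60 * 60 * 24 * 30

guesses_per_month = RATE_PER_SECOND * SECONDS_PER_MONTH  # 259,200,000

# Search space of a 28-character name over [a-z0-9], as in yogi's
# '8s6g86sfshf7h0fdaf73toh3i3ih.html' example: 36**28 possibilities.
NAME_SPACE = 36 ** 28

# Fraction of the space covered after a month of scanning
coverage = guesses_per_month / NAME_SPACE
```

A month of scanning covers hundreds of millions of guesses, plenty for short or dictionary-based names, but a vanishingly small fraction of a long random name's search space, which is why the random-name advice holds up.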