Apache is the world’s most popular web server, serving almost 50% of all websites on the internet. By default, every file under its document root is available for download, so it’s a good idea to review the possible configuration settings and make sure that no unwanted information is made public.
There are a few basic configuration options that go a long way in providing a safety net and protecting against accidental information disclosure.
Configuration structure, .htaccess files
Most of the configuration directives we are going to look at today are available to both the system administrator and the casual user. System-level configuration files live under the /etc/apache2 folder; they cannot be changed by the casual user and are meant to be read when the server starts. Some Apache settings can also be changed by creating or editing a special file called “.htaccess”. This file contains configuration directives that apply to the folder the “.htaccess” file is uploaded into and to all of its subfolders.
These options are called run-time configuration directives and their documentation is available at the following link: https://httpd.apache.org/docs/2.4/mod/quickreference.html
Each configuration option has a list of available contexts that tells you where it can be used:
- Server config (anywhere in the Apache config)
- Virtual host (inside the virtual host definition: <VirtualHost>)
- Directory (inside a directory block: <Directory>)
- .htaccess (in .htaccess files)
Performance of run-time configuration
There is a minor performance hit when using .htaccess files, because every time a file is downloaded, Apache needs to traverse all the folders back to the root folder and check each one for the existence of “.htaccess” directives. High-performance configurations sometimes disable .htaccess completely but that may render various website engines unusable because they blindly assume that “.htaccess” would work.
Options in system-level Apache config files are read and parsed only once, when the web server is started, so they provide better performance compared to “.htaccess” options, which need to be opened and parsed by the web server every time a new HTTP request comes in.
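If .htaccess support is not needed at all, it can be switched off entirely in the system configuration. A minimal sketch, assuming the document root is /var/www (adjust the path to your own setup):

```apache
# Ignore .htaccess files entirely so Apache skips the per-request
# directory traversal; all configuration must then live in /etc/apache2.
<Directory /var/www>
    AllowOverride None
</Directory>
```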
Corrupting a “.htaccess” file will instantly cause the whole website (or the part of the website under the subfolder it was uploaded into) to display a “500 Internal Server Error” instead. The Apache error log will tell you which option is causing this, and fixing the file (or removing it, if it was just uploaded) will quickly resolve the error.
If .htaccess is disabled on the server, simply adding the following block to the Apache2 virtual host will enable them:
<Directory /var/www/webroot/here>
    AllowOverride All
</Directory>
It is also possible to limit .htaccess changes to specific directive types only; you can find more information about it here: https://httpd.apache.org/docs/2.4/mod/core.html#allowoverride
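For example, a policy that lets .htaccess files control only authentication and directory-index directives could be sketched like this (the path is the same example directory as above; the chosen directive groups are just an illustration):

```apache
# Permit only authentication (AuthConfig) and directory-index (Indexes)
# directive groups in .htaccess; all other directives cause an error.
<Directory /var/www/webroot/here>
    AllowOverride AuthConfig Indexes
</Directory>
```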
Securing folder indexes
As a rule of thumb, only files that are meant to be public should ever be uploaded into the Apache document root folder (the folder that is the root of the website it is serving). It’s a typical mistake to upload files into random folders assuming that no one will find them. Looking at Apache access logs, it’s apparent that automated robots scan typical folder names daily.
By default, any folder that has no index file in it (index.html / index.php / index.htm) will display a list of its files and folders instead. This is called an “open directory” and it provides easy access to download all files and folders from a published directory. To disable it, add the following snippet to the “.htaccess” file in the root folder of the website:

Options -Indexes
This will make Apache display a 403 Forbidden error instead of serving the list of files to the visitor every time there is no index available. Because of the way .htaccess works, this will apply to the whole website (subfolders included).
It’s also possible to disable this at the system level for all websites, by putting the same directive into the system Apache configuration in /etc/apache2. As always, test configuration changes to make sure they work.
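A system-level sketch of the same setting, assuming the websites live under /var/www (this would go into a file in /etc/apache2, for example the virtual host definition):

```apache
# Disable automatic directory listings for everything under /var/www;
# folders without an index file will return 403 Forbidden instead.
<Directory /var/www>
    Options -Indexes
</Directory>
```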
Banning the download of dotfiles
In Linux (and any *nix/BSD derivative system, including macOS), files whose names start with a dot are “hidden” files. They sometimes contain sensitive information (such as developer environment preferences, including usernames and passwords), so it’s useful to disable the serving of files and folders starting with “.” altogether.
By default, Apache is set to disallow the download of any file whose name starts with “.ht”, to protect .htaccess and related files from public access, but it is better to extend this to all dotfiles.
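The stock Apache 2.4 configuration contains a block similar to the following (the exact form varies by distribution):

```apache
# Default protection shipped with Apache 2.4: deny access to anything
# whose name starts with ".ht" (i.e. .htaccess and .htpasswd files).
<FilesMatch "^\.ht">
    Require all denied
</FilesMatch>
```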
This system-level configuration snippet disables access to dotfiles and dot-folders altogether:
<FilesMatch "^\.">
    Order allow,deny
    Deny from all
    Satisfy All
</FilesMatch>

<DirectoryMatch "^(.*/)*\..*">
    Order allow,deny
    Deny from all
    Satisfy All
</DirectoryMatch>

<LocationMatch "\/\..*">
    Order allow,deny
    Deny from all
    Satisfy All
</LocationMatch>
There is one common case where this may cause problems: Let’s Encrypt SSL certificates are issued using challenge files placed in a dot-folder, so it should be added as an exception:
<LocationMatch "^\/\.well-known\/acme-challenge\/.*">
    Allow from all
</LocationMatch>
Unfortunately, these settings only work at the system level, because the <LocationMatch> and <DirectoryMatch> blocks are not available inside .htaccess files. The same effect can be achieved with a rewrite rule in “.htaccess”; this requires mod_rewrite to be enabled:
RewriteEngine On
RewriteRule "(^|/)\." "-" [F,L]
This enables rewrites, then checks whether the URL matches the pattern “<beginning of URL>.” or “/.” and blocks access using the [F] (forbidden) flag. If mod_rewrite is not enabled, this block will cause a 500 error. It’s possible to wrap it in the conditional directives “<IfModule mod_rewrite.c>” and “</IfModule>” so that it only runs when mod_rewrite is enabled, but that would mean that disabling the module silently re-enables access to dotfiles, and in this case, showing a 500 error is preferable.
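For completeness, the conditional variant described above would look like this; remember that with this wrapper, a disabled mod_rewrite silently exposes dotfiles again:

```apache
# Only applied when mod_rewrite is loaded; silently skipped otherwise.
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteRule "(^|/)\." "-" [F,L]
</IfModule>
```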
Limit access to IP addresses
While web servers often exist to serve files to the general public, sensitive parts or folders can easily be protected from exploitation and information leaks. Using .htaccess files and the Deny/Allow directives, any folder can be restricted to specific IP addresses only, by adding the following code block:
Order Deny,Allow
Deny from all
Allow from 184.108.40.206
Allow from 220.127.116.11
Allow from 192.168.0.0/16
This blocks access to all files and folders and only allows requests from the IP addresses listed in the “Allow” directives, denying everyone else by default. Allow and Deny also accept whole IP ranges or network blocks in CIDR notation (the line ending in /16). Always test access controls, at least by trying to access the site from a different location, or by changing the list of allowed IPs to a different IP, to verify that it blocks access as it should.
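On Apache 2.4, the Order/Deny/Allow directives are provided by the mod_access_compat compatibility module; the modern equivalent uses Require directives. A sketch using the same addresses as the block above:

```apache
# Apache 2.4 syntax: allow the listed addresses/ranges, deny everyone else.
<RequireAny>
    Require ip 184.108.40.206
    Require ip 220.127.116.11
    Require ip 192.168.0.0/16
</RequireAny>
```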
In the second part of this article, we’ll look at the more advanced examples, including
- blocking TOR access to specific files or the whole website
- limiting access to a specific country using geolocation lookups
- client-side SSL certificates
- password protecting folders
Update: Part 2 is available here: Protect files with Apache, part 2