Using .htaccess files on a Webfusion shared service

This is a subsidiary article to Using PHP applications on a Webfusion hosted service (Linux). In it I discuss how to use .htaccess files on a Webfusion Linux shared service to implement sub-domains within your own domain, and to improve performance.

Apache checks every directory in a file’s path for an .htaccess file and processes each one that it finds. There is therefore a performance overhead in using them, so I prefer to keep a single .htaccess file in my root public_html directory and include any required directives in that.

Enhancing User Response

Whilst the core of a web page is its HTML content (typically generated by a PHP script in most PHP-based applications), there is usually a lot of supplementary content used to render it: style sheets, locally executed JavaScript, images, etc. Such supplementary content is nearly always static, so response times for the end user will be a lot better if this content is not downloaded on every page fetch. This also reduces network traffic and server load.

The HTTP protocol includes response headers which enable browsers to determine when any locally cached copies can be used. There are two levels of caching / query avoidance:

  • Unconditional. Here the server defines an explicit lifetime for the file using either the Expires or Cache-Control response header, and the browser can reuse its cached copy without contacting the server until that lifetime expires.
  • Conditional. Here the browser makes a conditional fetch from the server based on the ETag or Last-Modified response headers. This still involves querying the server, but the content itself is only downloaded if it has changed since it was cached. All modern browsers will perform a conditional fetch if previously cached content has expired.

The risk with any cache is that it becomes out of date and the browser doesn’t detect this, which can mean that the user ends up seeing stale data without realising it. For this reason, I prefer to set the maximum age to 1-7 days where the content is not truly static (for example, jpeg images that might be updated). This means that in practice such content will be cached for a browsing session, but the browser will still do occasional conditional fetches to validate it. Here are the relevant lines from my .htaccess file:

# mod_expires emits matching Expires and Cache-Control headers
ExpiresActive On
ExpiresByType text/css        "access plus 1 month"
ExpiresByType text/javascript "access plus 1 month"
# "set" rather than "add" avoids emitting duplicate Cache-Control headers
Header set Cache-Control "max-age=604800"

Because the HTTP protocol offers several mechanisms for cache control, there is no single correct way to do this. A Google search on “browser caching htaccess” will give you more details and alternative examples.
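As an illustration (these MIME types are examples, not lines from my live configuration), images that change occasionally could be given the shorter one-week lifetime discussed above:

# Cache images for 7 days; they may occasionally be updated
ExpiresByType image/jpeg "access plus 7 days"
ExpiresByType image/png  "access plus 7 days"
ExpiresByType image/gif  "access plus 7 days"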

Note that the Apache core FileETag directive controls the ETag response header used to achieve conditional caching. However, you won’t normally need to set it, as the default (FileETag INode MTime Size) does what is needed here.
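To make the conditional fetch concrete, here is a sketch of the header exchange when cached content has expired but is still current (the file name and header values are illustrative):

GET /styles/main.css HTTP/1.1
If-Modified-Since: Tue, 05 Jan 2010 10:00:00 GMT
If-None-Match: "2af180-1f3-47c0f2b0"

HTTP/1.1 304 Not Modified

The 304 response carries no body, so the browser reuses its cached copy and only the headers cross the network.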

Protecting against access to forbidden content

Many applications use local directories for storing data which should not be directly accessible by an HTTP request under any circumstances. One technique for this is to use per-directory .htaccess files to deny such access. I use a simpler convention: I prefix any such private directories or files with an underscore character, and I also follow the Linux convention of hiding files by prefixing them with a full stop. The following rule implements this policy:

RewriteRule (^_|/_|^\.|/\.)   -    [forbidden]
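Note that rewrite rules only take effect once the rewrite engine is enabled, so a minimal sketch of this part of the .htaccess file looks like this (the /_data path in the comment is a hypothetical example):

RewriteEngine On
# Any path component starting with "_" or "." (e.g. /_data/config.php,
# /.htaccess) is refused with a 403 Forbidden response
RewriteRule (^_|/_|^\.|/\.)   -    [forbidden]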

Implementing subdomains

There are two key components to this: first, understanding how the Webfusion Apache configuration maps domains onto your public_html directory hierarchy and the environment context; and second, how to use rewrite rules within .htaccess files to map these onto subdirectories. In my case, I have four subdomains on my website, so I have the following rewrite rules:

RewriteCond %{REQUEST_URI}  !^/(blog|b|janetscurtains|files|terry)/ 
RewriteCond %{HTTP_HOST}    ^(blog|janetscurtains|files|terry)\.ellisons\.org   [nocase]
RewriteRule ^(.*)           %1/$1                                               [skip=1]

The second condition checks the host for one of the four subdomains blog, janetscurtains, files and terry; at the same time, the match variable %1 is set to the subdomain name. The problem with per-directory rewrite rules is that Apache automatically reruns the rule set until no rule matches, and this can cause a runaway loop; hence the first condition, which suppresses the substitution once the first path component is already one of the subdomain directories. Note the additional check for the sub-directory b, which is used as a mapping destination for the blog as discussed in the following section.
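To trace an illustrative request through these rules, suppose the browser asks for http://blog.ellisons.org/article-33:

HTTP_HOST   = blog.ellisons.org   matches condition 2, so %1 = "blog"
REQUEST_URI = /article-33         is not excluded by condition 1
Rule fires  : article-33  ->  blog/article-33, then the rule set reruns
Rerun       : /blog/article-33 now fails condition 1, so no further change

On the rerun, the first condition is what stops the prefix being applied a second time.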

Implementing application based mapping

Like the MediaWiki engine, my blog dispatches all URIs to a single script, /b/index.php. So the URI of this article, for example, /article/shared-hosting/using-htaccess-files-on-a-webfusion-shared-service/, first gets mapped to /blog/article-33 by the above subdomain rewrite rules. This is then processed by the following rules:

RewriteRule ^blog/$                             /b/index.php?page=index         [skip=2,QSA]
RewriteRule ^blog/(themes|images|includes)/(.*) /b/$1/$2                        [skip=1]
RewriteRule ^blog/(.*)                          /b/index.php?page=$1            [last,QSA]

The first handles the default home page. The second provides direct access to non-script directories that contain styles, images, etc. The third handles the remainder, which will include /b/index.php?page=article-33. Also note the use of the skip directives and the QSA flag to merge in any GET parameters.
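Continuing the illustrative trace, here is how each rule handles a sample path (the themes/main.css path is a hypothetical example):

/blog/                 ->  /b/index.php?page=index        (rule 1; skip=2 jumps past rules 2 and 3)
/blog/themes/main.css  ->  /b/themes/main.css             (rule 2; skip=1 jumps past rule 3)
/blog/article-33       ->  /b/index.php?page=article-33   (rule 3; last ends processing)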
