Getting htaccess Mod-rewrite rules working locally with XAMPP

After spending a whole 2 hours of my life trying to get Apache mod_rewrite rules working with XAMPP on a local computer, I thought I’d share my results, as I seemingly tried everything. The problem: I have a simple mod_rewrite rule in my .htaccess file. When I upload this to my online web host everything is fine – here is the working .htaccess file for my online host:

RewriteBase /
RewriteEngine on
RewriteRule amnesia/resetpass(.*) recover-password.php$1 [PT]

So typing in www.domain.com/amnesia/resetpass does a simple rewrite to www.domain.com/recover-password.php, without the user ever knowing. All is fine. However, when I tried to get this seemingly simple rule to work with XAMPP I ran into problems, getting 404 and 500 responses from the server – obviously quite a pain, as this essentially means I can’t test the site using my own web server (e.g. localhost). The site is hosted from my computer via the normal setup, e.g. xampp/htdocs/mysite. I’ll jump straight to the solution and then explain exactly what was changed – the working .htaccess file is below:

RewriteEngine on
RewriteBase /mysite
Options +FollowSymLinks
RewriteRule amnesia/resetpass(.*) recover-password.php$1 [PT]

Firstly, the extra line that uses the Options +FollowSymLinks directive was added. To explain this I’ll quote straight from the Apache documentation:

To enable the rewriting engine for per-directory configuration files, you need to set “RewriteEngine On” in these files and “Options FollowSymLinks” must be enabled. If your administrator has disabled override of FollowSymLinks for a user’s directory, then you cannot use the rewriting engine. This restriction is needed for security reasons.

The rewrite base has been changed to the relative path of the website directory. To finish up, open the httpd.conf file (the default settings for XAMPP, which get overridden by your .htaccess file rules on a per-directory basis), located by default at C:\xampp\apache\conf\httpd.conf. Find all occurrences of AllowOverride None and change them to AllowOverride All. After restarting XAMPP everything should work. In a nutshell, the AllowOverride directive in httpd.conf declares which directives in .htaccess files are allowed to override directives from httpd.conf – this is discussed in more depth over here, but basically, by having this directive set to None, you’re stopping individual .htaccess files from working locally.
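For reference, the relevant <Directory> block in httpd.conf ends up looking roughly like this (the path and exact list of options will vary with your XAMPP version – this is just a sketch):

<Directory "C:/xampp/htdocs">
    Options Indexes FollowSymLinks Includes ExecCGI
    # was "AllowOverride None" – change it so .htaccess files are actually read
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>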

SEO Friendly URLs With Mod Rewrite

So-called ‘dirty URLs’ (e.g. www.domain.com/products.asp?id=45&cat=34&mode=view) not only look untidy but also pose a security risk, as they expose the underlying technology used – in this case, ASP. A much preferred URL would be www.domain.com/product/45 or, even better, www.domain.com/product/product_keywords-here. The latter URL structure not only improves usability for your site (the URL makes more sense to the user) but is also argued to improve search engine rankings. There is a lot of debate on this subject, but everyone agrees that these so-called pretty URLs don’t hurt anything and mainly improve user experience. Google has also recently posted a video (obviously not giving much away) saying that SEO-friendly URLs do in fact make a small difference and don’t hurt SERPs.

Take the example of this very blog. Pretty URLs are used to display the post title and ID within the URL. There is the option to simply include the title, but this has been proven to slow down the general performance of your blog. I digress – let’s get onto some examples where simple URL rewriting with mod_rewrite is useful.
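As a quick illustration (the script and parameter names here are made up for the example, not what this blog actually uses), a rule along these lines would map the pretty URL www.domain.com/product/45 back onto the real script without the user ever seeing the query string:

RewriteEngine on
RewriteBase /
# Internally rewrite /product/45 to products.php?id=45
RewriteRule ^product/([0-9]+)/?$ products.php?id=$1 [L]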

Continue reading SEO Friendly URLs With Mod Rewrite

CSS Sprites and Website Optimization

One of the latest and most well-established design practices is CSS sprites. This is the practice of combining multiple images into a single, larger composite image. By using the CSS background-position property, selected portions of that master image (or sprite) are displayed. The main question here is how a larger image with a larger file size can be beneficial, especially when compared to several smaller images. The answer lies in HTTP requests, and Yahoo’s 80/20 rule explains this much better than I could! To summarise, the number of HTTP requests to the website is drastically cut, so the page loads much faster in a single request. Another major benefit is that no JavaScript is required for mouseover code, so you can make image rollovers easily. I have used this technique in the past, but like a lot of people never knew it was called sprites.
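As a rough sketch of the idea (the class name, image name and pixel offsets are invented for illustration), a two-state rollover built from one sprite might look like this:

/* one master image, sprite.png, holds both button states stacked vertically */
a.nav-icon {
    display: inline-block;
    width: 16px;
    height: 16px;
    background: url(sprite.png) no-repeat 0 0;  /* normal state: top half of the sprite */
}
a.nav-icon:hover {
    background-position: 0 -16px;  /* rollover: show the lower half of the sprite – no extra HTTP request */
}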

In fact, sprites are so effective that many of the internet’s biggest sites are using them, all in slightly different ways. On such sites a truly huge number of requests are saved every day. For example, YouTube, Google, AOL, Amazon and Apple all use CSS sprites. Take the minimalist example of Google (below):

[Image: Google’s CSS sprite]

YouTube does things slightly differently, using a very simple sprite and applying the background-position property to each link class. Here’s the simple HTML used for the list:

Continue reading CSS Sprites and Website Optimization

Reasons to let Google Host your jQuery Files

It’s often the case that I see busy sites hosting copies of the jQuery library locally, e.g.

<script src="/js/jQuery.min.js" type="text/javascript"></script>

The preferred way is to serve jQuery through Google, e.g.

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js" type="text/javascript"></script>

So, why is this better? Well, there are several valid reasons:

CDN (Content Delivery Network) – Google’s datacenters are spread over a range of locations, and when a user requests content the closest location is automatically chosen. This is better because it does not force users to download from a single server location (e.g. your server), and the chances are Google will be able to serve content faster than your web host. A similar approach is used for the popular web-based game Quake Live. Usually CDNs are a service you pay for, but you’re getting this free through Google!

Less server load – When all your website’s files are located on a single server, downloading them simultaneously increases server load and some users will experience delays while files download. By having an external location for your jQuery library, the latter is not an issue.

Improved caching – This is the biggest benefit, as users will not have to re-download content. Hosting jQuery on your own server forces a first-time visitor to download the whole file, even if they already have several copies of the same file cached from other sites. Through Google’s CDN, the response for the file tells the browser to cache it for up to one year, so repeat requests for the same file – from your site or any other site using the same URL – are served straight from the cache.
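If you inspect the response for the Google-hosted file, the caching headers look roughly like this (illustrative only – exact values may differ):

Cache-Control: public, max-age=31536000
Expires: (a date roughly one year after the request)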

Local bandwidth savings – By letting Google host the file for you, you are in essence saving bandwidth. For personal sites this may not be an issue, but busy sites will notice significant bandwidth savings.

Google actually suggests using its google.load() function to load the library (see below), but this not only interferes with jQuery’s killer feature (document.ready), it also causes an extra HTTP request. Personally I prefer the old-fashioned script method, even though there are several other valid reasons to use the google.load() method.

<script type="text/javascript" 
        src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
  google.load("jquery", "1.3.2");
  google.setOnLoadCallback(function() {
  });
</script>

Fixing Common W3C Validation Errors and SEO

Yet another thing to check when doing SEO is that your site validates via the W3C validation checker. A site that is XHTML valid will receive more frequent search engine crawls and, more importantly, longer crawl times. I won’t bore you with further details about why validation is a good thing (it’s a huge subject), but if you must, there is a great article about the subject right here. Creating a site to an XHTML-valid standard encourages better coding practice and more semantic coding – making your site easier to crawl. You are also giving your site a better chance of displaying the same across multiple and future browsers.

Another lesser-known theory is that spiders ‘get full’ when crawling a page; semantic coding practice allows for cleaner and more lightweight code. For instance, when crawling a badly coded page with lots of inline styles and JavaScript (i.e. content not useful to a spider), the spider may become full too quickly and leave – missing your important content contained further on within the page.

Validating your site to at least XHTML 1.0 Transitional (the less strict version, compared to XHTML 1.0 Strict) is highly encouraged and is an area often ignored by developers. Below, I’ll quickly outline some of the common validation errors and how to easily fix them:

cannot generate system identifier for general entity X – 99% of the time this relates to errors with entity references, such as raw ampersands in URLs. E.g. having a URL like product.php?id=2&mode=view would result in this error, as the ‘&’ wasn’t escaped as &amp; within the URL.
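The fix is simply to escape the ampersand in your markup:

<!-- invalid: raw ampersand inside the href -->
<a href="product.php?id=2&mode=view">View product</a>

<!-- valid: ampersand written as an entity -->
<a href="product.php?id=2&amp;mode=view">View product</a>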

required attribute “alt” not specified – simply find the line number and add an alt attribute to the image. The presence of an alt attribute is required for both Transitional and Strict doctypes.
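For example (the image name here is just for illustration):

<!-- invalid: no alt attribute -->
<img src="logo.png" />

<!-- valid: alt attribute added (it can be left empty for purely decorative images) -->
<img src="logo.png" alt="Company logo" />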

XML Parsing Error: Opening and ending tag mismatch – Depending on how organised your coding is, this fix can take a matter of seconds or a lot longer. It relates to unclosed block-level tags, such as a table or div. One plus point is that fixing such an error often results in several validation errors being fixed at once.
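A typical (made-up) case looks like this:

<!-- invalid: the outer div is opened but never closed -->
<div id="content">
  <p>Some text</p>

<!-- valid: add the missing closing tag -->
<div id="content">
  <p>Some text</p>
</div>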

Continue reading Fixing Common W3C Validation Errors and SEO