Enhanced Visitor Event Tracking With Google Analytics and jQuery

Google Analytics has fast become the industry standard for tracking a plethora of web-based information about your website. While it is totally free and easy to set up, you are limited to tracking elements that physically render in the browser – so items such as PDF files, ZIP archives and RSS feed links are not tracked, because Google Analytics relies heavily on JavaScript. However, tracking such links can be achieved with a small amount of extra work.

Personally, I wasn’t aware you could track specific links with Analytics and only considered it when a client asked ‘why doesn’t Google show me the number of times my marketing report (read: a PDF file) has been clicked?’ – a totally valid request that I wanted to investigate.

Use jQuery to improve Google Analytics and track downloads, RSS, email and external links

First things first, make sure you have a Google Analytics account, the latest version of jQuery and the latest version of the Analytics tracking code running on your website 🙂

As with the majority of jQuery magic, everything happens within the document ready event listener – this will be used to capture clicks on selected elements.

Tracking Download Link Clicks (PDF, ZIPs etc.)

$(document).ready(function() {
	
	$("a[rel='download']").click( function() {
		var fileName = $(this).attr("href");
		pageTracker._trackPageview(fileName);
		return true;
	});

});

Then, on every link you wish to track, simply add the rel attribute to your non-HTML file links as follows:

<a href="myData.zip" rel="download">Download My ZIP Data File</a>

Tracking Downloads of Specific File Types (E.g. PDF Files)

The dollar sign selector is used to match links whose href ends in .pdf (or any other extension you wish to track):

$(document).ready(function() {

	$("a[href$='.pdf']").click( function() {
		var myPDF = "/pdfDownloads/" + $(this).attr("href");
		pageTracker._trackPageview(myPDF);
		return true;
	});

});

The /pdfDownloads/ prefix is used to identify and separate the report data within Google Analytics.
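To make the prefixing step explicit, here is the same idea as a standalone function (trackedPdfPath is a hypothetical helper name, not part of jQuery or Analytics) – it builds the virtual pageview path that will show up under /pdfDownloads/ in your Content reports:

```javascript
// Build the virtual pageview path reported to Google Analytics.
// The /pdfDownloads/ prefix groups all PDF clicks together in reports.
function trackedPdfPath(href) {
	// Strip any leading slash so we never produce a double slash.
	return "/pdfDownloads/" + href.replace(/^\//, "");
}

// trackedPdfPath("marketing.pdf") → "/pdfDownloads/marketing.pdf"
```

You would then pass the result straight to pageTracker._trackPageview() as in the snippet above.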

Tracking the click of a specific link such as an RSS feed

Simply add an identifier to your RSS feed link (in this example the link was given an id of ‘rssFeed’):

$(document).ready(function() {
	
	$("a#rssFeed").click( function() {
		pageTracker._trackEvent("RSS", "RSS Subscriber Link Clicked");
		return true;
	});

});

Tracking mailto: Link Clicks

$(document).ready(function() {
	
	$("a[href^='mailto:']").click( function() {
		pageTracker._trackEvent("Mail", "User clicked on mailto link");
		return true;
	});

});

Tidying up…

You should also stop the clicked element firing again, to prevent multiple events being recorded, and provide some feedback to the user. To do this, simply add the following at the start of each click handler – preventing further clicks from being tracked and changing the cursor to an egg timer (although you could display a small graphic to make things look prettier):

$(this).css("cursor", "wait");
$(this).unbind("click");

Improve SEO Through Home Menu Anchor Text Optimisation

It’s a widely known fact that link anchor text is rated highly by search engines and is often the deciding factor in your SERP position for competitive terms, mainly because it gives meaningful information to both users and search engines. Correct use of anchor text (on both inbound and outbound links) will give your page increased meaning.

I’m sure you use this fact throughout your website while performing onsite SEO. Any tutorial will rightly tell you that keyword relevancy is of utmost importance here. However, a lot of the time you end up with a ‘Home’ link on your menu, linking to your main page.

This is bad for SEO for a number of reasons. Firstly, the anchor text ‘home’ is a very poor choice of word, as its meaning is highly diluted nowadays. Unless you have a site relating to homes, the keyword isn’t very useful at all, as you don’t want to rank highly for the term ‘home’. At the same time, however, users are familiar with such a link and it makes sense to label the link to your main page this way. The situation is worsened if your site is large with a great number of internal links. Imagine having a 100-page site, all with the anchor text ‘home’ – that is a lot of internal links telling search engines your main page is about ‘home’! Furthermore, menu links are usually towards the top of the page, giving them increased relevancy to search engines.

The solution is a compromise: use ‘home’ along with your main keyword(s), making sure to avoid obvious stop words like ‘and’. For example, ‘Graphic Design Home’. Better still, use your main keyword directly as the anchor text, i.e. ‘Graphic Design’. This is a simple yet powerful SEO trick that is easy to implement on your site.
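As a quick illustration (the keyword here is just an example for a hypothetical design agency site):

```html
<!-- Weak: tells search engines this page is about "home" -->
<a href="/">Home</a>

<!-- Better: keyword-rich anchor text that users still recognise -->
<a href="/">Graphic Design Home</a>
```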

Tracking Twitter Performance Using Google Analytics

If you use the ever-popular Twitter, there’s a high chance you’ll be linking to your company website or personal blog in your tweets or profile link. As the aim is to use Twitter as a marketing tool to drive traffic, you can use Google Analytics to track the link you place in your Twitter profile – just like an email campaign or PPC advert.

If you use Twitter as a marketing tool to drive traffic to your site then you should treat it in exactly the same way as you would a newsletter, a PPC advert or a banner and track each Tweet’s performance beyond simple click data. How many visits do you get, how long do they stay on your site, how deep do they go, what is the bounce rate like and how much revenue do they generate?

The benefit of ‘tagging’ this link is that Google Analytics will record more than just basic click data – you can record a whole host of advanced user data, such as how visitors navigate your site and the length of their visit. By default Google will track such links, but traffic from URL shorteners such as bit.ly will be dumped into the direct traffic section of Google Analytics. The steps to get the latter up and running are quite simple:

1: Go to Google’s URL builder to generate a URL. Enter the following information:

Website URL: your website address

Campaign Source: enter a relevant source here to identify where the traffic comes from, e.g. twitter

Campaign Medium: enter the marketing medium, e.g. social

Campaign Name: enter a name used to identify the campaign in Google Analytics, e.g. twittertrack

2: Click generate URL and something similar to the following will be created: https://www.web-design-talk.co.uk/?utm_source=twitter&utm_medium=social&utm_campaign=twittertrack


3: If posting to Twitter you can paste this URL directly into the tweet box, as Twitter will automatically shorten the URL.

4: After approximately 24 hours, data will appear in your Analytics account. Simply navigate to Traffic Sources. If you’ve used the same terms to build the URL as above, you’ll see an entry called ‘twitter / social’. You can also view information by navigating to Traffic Sources > Campaigns, where you can click the campaign name (‘twittertrack’ was used in the example above).
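What the URL builder does can be reproduced in a few lines of JavaScript – a sketch only (buildCampaignUrl is a hypothetical helper, not part of any Google library), showing how the utm_* parameters are appended to the base address:

```javascript
// Sketch of Google's URL builder: append utm_* query parameters
// to a base URL so Analytics can attribute the visit to a campaign.
function buildCampaignUrl(baseUrl, source, medium, campaign) {
	var params = [
		"utm_source=" + encodeURIComponent(source),
		"utm_medium=" + encodeURIComponent(medium),
		"utm_campaign=" + encodeURIComponent(campaign)
	];
	// Use "?" if the URL has no query string yet, "&" otherwise.
	var separator = baseUrl.indexOf("?") === -1 ? "?" : "&";
	return baseUrl + separator + params.join("&");
}

// buildCampaignUrl("https://www.web-design-talk.co.uk/", "twitter", "social", "twittertrack")
// → "https://www.web-design-talk.co.uk/?utm_source=twitter&utm_medium=social&utm_campaign=twittertrack"
```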

Google Analytics Once Tracking is Installed

Fixing Common W3C Validation Errors and SEO

Yet another thing to check when doing SEO is that your site validates via the W3C validation checker. A site that is XHTML valid is said to receive more frequent search engine crawls and, more importantly, longer crawl times. I won’t bore you with further details about why validation is a good thing (it’s a huge subject). Creating a site to a valid XHTML standard encourages better coding practice and more semantic markup – making your site easier to crawl. You are also giving your site a better chance of displaying the same across multiple current and future browsers.

Another lesser-known theory is that spiders ‘get full’ when crawling a page; semantic coding practice allows for cleaner and more lightweight code. For instance, when crawling a badly coded page with lots of inline styles and JavaScript (i.e. content not useful to a spider), the spider may become full too quickly and leave – missing important content contained further down the page.

Validating your site to at least XHTML 1.0 Transitional (the less strict version, compared to XHTML 1.0 Strict) is highly encouraged and is an area often ignored by developers. Below, I’ll quickly outline some common validation errors and how to easily fix them:

cannot generate system identifier for general entity X – 99% of the time this relates to errors with entity references, such as ampersands in URLs. E.g. having a URL like product.php?id=2&mode=view would result in this error because the ‘&’ wasn’t escaped as &amp; within the URL.

required attribute “alt” not specified – simply find the line number and add an alt attribute to the image. The alt attribute is required for both Transitional and Strict doctypes.

XML Parsing Error: Opening and ending tag mismatch – depending on how organised your coding is, this fix can take a matter of seconds or a lot longer. It relates to unclosed block-level tags, such as a table or div. One plus point is that fixing such an error often results in several validation errors being fixed at once.
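For reference, here is what each of the above errors looks like once fixed (example markup only):

```html
<!-- Ampersands in URLs must be escaped as &amp; -->
<a href="product.php?id=2&amp;mode=view">View product</a>

<!-- Every img needs an alt attribute -->
<img src="logo.png" alt="Company logo" />

<!-- Block-level tags must be closed in the order they were opened -->
<div><p>Some content</p></div>
```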


301 Redirects for SEO Using htaccess

301 Redirects Prevent 404 Errors

Google treats www.website.com and website.com as two totally different websites. This is very bad for your (or a client’s) website, as it may lead to duplicate content and different PageRank values for each version. Settling on one version is how Google “canonicalizes” the URL, and leaving the issue unresolved is very bad from an SEO standpoint.

In essence, a web server could return totally different results for each of those addresses. I have also encountered the situation where clients have set their preferred domain in Google Webmaster Tools, have given out the opposite version for SEM, and wonder why they don’t see results 🙂 You can easily check the above by using the “site:” operator in Google search, e.g. site:www.website.com and site:website.com

You can use mod_rewrite rules as a powerful method for redirecting many URLs from one location to another. This is a simple server-level technique for handling redirects. How you handle this canonicalization issue is purely a personal choice, although the method below can be altered to redirect to the non-www version of the URL.

The .htaccess file is simply an ASCII file created with any normal text editor. You need to save the file as ‘.htaccess’ (there is no filename before the dot – .htaccess is the entire name). Open your newly created .htaccess file in your favoured text editor and add the following lines, replacing domain.com with your domain:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

Upload the .htaccess file to the root folder of your website and you’re done. All traffic will be permanently redirected from the non-www version of your website to the www version. To do the opposite (redirect all traffic to the non-www version), use the code below in the .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]