Improve SEO Through Home Menu Anchor Text Optimisation

It’s a widely known fact that link anchor text is rated highly by search engines and is often the deciding factor in your SERP position for competitive terms, mainly because it gives meaningful information to users (amongst other things). Correct use of anchor text (on both inbound and outbound links) will give your page increased meaning.

I’m sure you use this fact throughout your website while performing onsite SEO. Any tutorial will rightly tell you that keyword relevancy is of utmost importance here. However, a lot of the time you end up with a ‘Home’ link on your menu, linking to your main page.

This is bad for SEO for a number of reasons. Firstly, the anchor text ‘home’ is a very poor choice of word as its meaning is highly diluted nowadays. Unless you have a site relating to homes, the keyword isn’t very useful at all, as you don’t want to rank highly for the term ‘home’. At the same time, users are familiar with such a link and it makes sense to use it for your main page. The situation is worsened if your site is large with a great number of internal links. Imagine a 100 page site, every page linking back with the anchor text ‘home’ – that’s a lot of inbound links telling search engines each page is related to ‘home’! Furthermore, menu links are usually towards the top of the page, giving them increased relevancy to search engines.

The solution is a compromise: use ‘home’ along with your main keyword(s), making sure to avoid obvious stop words like ‘and’. For example, ‘Graphic Design Home’. Better still, use your main keyword directly in the anchor text, i.e. ‘Graphic Design’. This is quite a powerful and simple SEO trick that is easy to implement on your site.
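As a quick illustration (assuming ‘Graphic Design’ is your main keyword), the menu link would change from:

<a href="/">Home</a>

to:

<a href="/">Graphic Design Home</a>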

Process Custom eCommerce data using Paypal IPN

More often than not it’s hard to visualise how you can send custom information to Paypal during checkout. The list of available hidden field variables initially seems very specific and restrictive at best. Granted, you can easily send over simple things such as your shipping amount and tax rate. However, during order processing (done in your IPN script, which Paypal sends the transaction’s IPN post data to) you often want to record more information when creating and storing order information.

For example, let’s say the user can enter a specific coupon code during checkout. You would want to make sure the Paypal transaction has been successful before marking the voucher as used; to do this you would need to know what code was entered during checkout. In the following simple example we’ll use the ‘custom’ field variable – this is an optional field, whereby the data is never presented to the shopper and can be up to 256 characters long. Whatever is placed in this field before clicking your checkout button will be invisibly sent to Paypal and posted back to your IPN script (assuming you have your ‘rm’, or return method, set to 2, i.e. ‘POST’). The HTML for the hidden field is very simple:

<input type="hidden" name="custom" value="YOUR CUSTOM INFORMATION HERE">
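As noted above, the return method also needs to be set to 2 (POST) for the data to be posted back to your IPN script – this is done with another standard Paypal hidden field:

<input type="hidden" name="rm" value="2">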

Now, let’s say when you’re recording all the order information you want to record the exact coupon code, delivery method id, the method by which the customer found your site (e.g. another id) and referrer id number. For convenience and for the sake of the example, I’ll assume you’ve done all the necessary processing to get your four pieces of information. The code to create our hidden field data is as follows:

/* ...logic to get the below variables goes here! */

$shipping_method_id = '33';
$coupon_code = '45895';
$found_out_method = '20';
$referrer_id = '9';

$custom_info = array(
	'shipping_method_id' => $shipping_method_id,
	'coupon_code' => $coupon_code,
	'found_out_method' => $found_out_method,
	'referrer_id' => $referrer_id
);

/* Initialise field data and looping variable */
$field_data = '';
$i = 0;

/* Loop through the info array to build the data string,
   adding a dash between each value */
foreach ($custom_info as $key => $field) {

	$field_data .= $field;
	$i++;

	if ($i !== count($custom_info)) {
		$field_data .= '-';
	}

}

/* Create the hidden field.
   The generated HTML is <input type="hidden" name="custom" value="33-45895-20-9"> */
echo '<input type="hidden" name="custom" value="' . $field_data . '">' . "\n";

Printing the value of $field_data will give you the string '33-45895-20-9'. This translates to shipping method id 33, coupon code 45895, found out id 20 and referrer id 9. The dash symbol has been used to delimit the values for convenience, as we need to split this string up later on.
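As an aside, PHP’s implode() function would build exactly the same dash-delimited string in a single line, should you prefer it over the loop:

$field_data = implode('-', $custom_info);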

To process these variables in your IPN script (see the Paypal PHP sample script) you simply use PHP’s explode function to split the data back into an array:

/* $_POST['custom'] contains the custom information we initially sent to Paypal */
$data = explode('-',$_POST['custom']);

/* Convert $data array into variables for further processing */
$shipping_method_id = $data[0];
$coupon_code = $data[1];
$found_out_method = $data[2];
$referrer_id = $data[3];

So you’ve now posted a string of custom information to Paypal and got this custom data back via IPN during the transaction processing. Now you have assigned each piece of the array to a variable, you can easily continue to create your order header and save it to a database.
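To tie this back to the coupon example, a minimal sketch of that last step might look like the following – note that payment_status is a standard IPN variable, while mark_coupon_used() is a hypothetical helper standing in for your own database logic:

/* Only mark the coupon as used once Paypal reports the payment as complete */
if ($_POST['payment_status'] == 'Completed') {
	/* mark_coupon_used() is a hypothetical helper - replace with your own DB code */
	mark_coupon_used($coupon_code);
}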

Tracking Twitter Performance Using Google Analytics

If you use the ever popular Twitter there’s a high chance you’ll be linking to your company website or personal blog in your tweets or profile link. As the aim is to use Twitter as a marketing tool to drive traffic, you can use Google Analytics to track the link you placed in your Twitter profile – just like an email campaign or PPC advert.

If you use Twitter as a marketing tool to drive traffic to your site then you should treat it in exactly the same way as you would a newsletter, a PPC advert or a banner and track each Tweet’s performance beyond simple click data. How many visits do you get, how long do they stay on your site, how deep do they go, what is the bounce rate like and how much revenue do they generate?

The benefit of ‘tagging’ this link is that Google Analytics will record more than just basic click data – you can record a whole host of advanced user data, such as how visitors navigate your site and the length of their visit. By default Google will track such links, but traffic from shortening services such as bit.ly will be dumped into the direct traffic area of Google Analytics. The steps to get tagging up and running are quite simple:

1: Go to Google’s URL builder to generate a URL. Enter the following information:

Website URL: your website address

Campaign Source: enter a relevant source to identify where the traffic comes from E.g. twitter

Campaign Medium: enter the medium of your campaign, this builds the utm_medium part of the URL E.g. social

Campaign Name: enter a name used to identify the campaign in Google Analytics E.g. twittertrack

2: Click generate URL and something similar to the following will be created: https://www.web-design-talk.co.uk/?utm_source=twitter&utm_medium=social&utm_campaign=twittertrack


3: If posting to Twitter you can paste this URL directly into the tweet box, as Twitter will automatically shorten the URL.

4: After approximately 24 hours data will appear in your analytics account. Simply navigate to Traffic Sources. If you’ve used the same terms to build the URL as above you’ll see an entry called ‘twitter / social’. You can also view information by navigating to Traffic Sources > Campaigns, where you can click the campaign name (‘twittertrack’ was used in the example above).

Google Analytics Once Tracking is Installed

MySQL Cheatsheet – Useful MySQL Queries

Getting htaccess Mod-rewrite rules working locally with XAMPP

After spending a whole 2 hours of my life trying to get Apache mod-rewrite rules working with XAMPP on a local computer, I thought I’d share my results, as I seemingly tried everything. The problem: I have a simple mod-rewrite rule in my htaccess file. When I upload this to my online web host everything is fine – the working htaccess file for my online host is below:

RewriteBase /
RewriteEngine on
RewriteRule amnesia/resetpass(.*) recover-password.php$1 [PT]

So typing in www.domain.com/amnesia/resetpass does a simple re-write to www.domain.com/recover-password.php, without the user ever knowing. All is fine. However, when I tried to get this seemingly simple rule to work with XAMPP I ran into problems, getting 404 and 500 responses from the server – obviously quite a pain, as this essentially means I can’t test the site using my own web server (E.g. localhost). The site is hosted from my computer via the normal setup E.g. xampp/htdocs/mysite. I’ll jump straight to the solution and then explain exactly what was changed – the working htaccess file is below:

RewriteEngine on
RewriteBase /mysite
Options +FollowSymLinks
RewriteRule amnesia/resetpass(.*) recover-password.php$1 [PT]

Firstly, the extra line that uses the +FollowSymLinks directive was added. To explain this I’ll quote straight from the Apache documentation:

To enable the rewriting engine for per-directory configuration files, you need to set “RewriteEngine On” in these files and “Options FollowSymLinks” must be enabled. If your administrator has disabled override of FollowSymLinks for a user’s directory, then you cannot use the rewriting engine. This restriction is needed for security reasons.

The re-write base has been changed to the relative path of the website directory. To finish up, open the httpd.conf file (the default settings for XAMPP, which get overridden by your .htaccess file rules on a per-directory basis), located by default at C:\xampp\apache\conf\httpd.conf. Find all occurrences of AllowOverride None and change them to AllowOverride All. After restarting XAMPP everything should work. In a nutshell, the AllowOverride directive in the httpd.conf file declares which directives in .htaccess files can override directives from httpd.conf (this is discussed in more depth over here) – by having this directive set to None, you’re stopping individual htaccess files from working locally.
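For reference, after the change the relevant section of httpd.conf will look something like the following – treat this as a sketch, as the exact contents of the Directory block vary between XAMPP versions:

<Directory "C:/xampp/htdocs">
    Options Indexes FollowSymLinks Includes ExecCGI
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>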

SEO Friendly URLs With Mod Rewrite

So called ‘dirty URLs’ (E.g. www.domain.com/products.asp?id=45&cat=34&mode=view) not only look untidy but also pose a security risk, as they expose the underlying technology used – in this case, ASP. A much preferred URL would be www.domain.com/product/45 or, even better, www.domain.com/product/product_keywords-here. The latter URL structure not only improves usability for your site (the URL makes more sense to the user) but is also argued to improve search engine rankings. There is a lot of debate on this subject, but everyone agrees that these so called pretty URLs don’t hurt anything and mainly improve user experience. Google has also recently posted a video (obviously not giving much away) saying that SEO friendly URLs do in fact make a small difference and don’t hurt SERPs.

Take the example of this very blog. Pretty URLs are used to display the post title and id within the URL. There is the option to simply include the title, but this has been shown to slow down the general performance of your blog. I digress – let’s get onto some examples of where simple URL rewriting with mod rewrite is useful.
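As a quick taster, a rule along the lines of the following (a sketch – adjust the pattern and target script to suit your own setup) would rewrite www.domain.com/product/45/your-keywords-here to products.php?id=45 behind the scenes:

RewriteEngine on
RewriteRule ^product/([0-9]+)(/[a-z0-9-]+)?/?$ products.php?id=$1 [L]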

Continue reading SEO Friendly URLs With Mod Rewrite

Getting Multiple Array Form Values With PHP

PHP Arrays

Further to my article on using jQuery to dynamically append form elements, I have come across situations where multiple items should be appended to the form each time, as opposed to the single input in my article (I did this for simplicity). For example, at work I’m currently working on an internal system whereby a user needs to add an unlimited number of client contacts for a client. Pressing the ‘add contact’ link will append 3 fields – one each for contact name, contact telephone and contact email, as sketched below. Each of these fields is named exactly the same way as before (using square brackets at the end of the name E.g. ‘name[]’) and appended the same way using jQuery.
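For clarity, each press of the ‘add contact’ link appends a group of three fields along these lines (the names match the retrieval example further down; the markup itself is just a sketch):

<input type="text" name="cname[]">
<input type="text" name="ctel[]">
<input type="text" name="cemail[]">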

There are lots of articles floating about explaining how to add fields, but I’ve not yet seen anything explaining how to retrieve multiple elements like this.

The only difference arises when retrieving these multiple values from PHP’s POST array. In the example I have appended 3 inputs, named cname[], cemail[] and ctel[]. The values of each can be retrieved using a slightly enhanced for loop:

if (isset($_POST['cname'])) {
	for ($i = 0; $i < count($_POST['cname']); $i++) {
		/* Each field group shares the same index, so validate
		   and save each contact inside the loop */
		$contactname = $_POST['cname'][$i];
		$contactemail = $_POST['cemail'][$i];
		$contacttel = $_POST['ctel'][$i];
	}
}

That’s really all there is to it, and I’m finding that this technique comes in useful quite regularly in everyday projects.

CSS Sprites and Website Optimization

One of the latest and most well established design practices is CSS sprites. This is the practice of combining multiple images into a single, larger composite image. By using the CSS background-position property, selected portions of that master image (or sprite) are displayed. The main question here is how can a larger image with a larger file size be beneficial, especially when compared to several smaller images? The answer lies in HTTP requests, and Yahoo’s 80/20 rule explains this much better than I could! To summarise, the number of HTTP requests to the website is drastically cut, thus loading the page much faster in a single request. Another major benefit is that no JavaScript is required for mouseover code, so you can make image rollovers easily. I have used this technique in the past but, like a lot of people, never knew it was called sprites.
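Here’s a minimal sketch of the technique, assuming a sprite.png containing two 16px icons sitting side by side (the file and class names are made up for illustration):

.icon { width: 16px; height: 16px; background: url(sprite.png) no-repeat; }
.icon-home { background-position: 0 0; }
.icon-search { background-position: -16px 0; }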

In fact, using sprites is so effective that many of the internet’s biggest sites are using them, all in slightly different ways. On such sites a truly huge number of requests are saved every day. For example, Youtube, Google, AOL, Amazon and Apple all use CSS sprites. Take the minimalist example of Google (left):

Google CSS Sprite

Youtube does things slightly differently, using a very simple sprite and applying the background-position property to each link class. Here’s the simple HTML used for the list: Continue reading CSS Sprites and Website Optimization

Reasons to let Google Host your jQuery Files

It’s often the case that I see busy sites hosting copies of the jQuery library locally, E.g.

<script src="/js/jQuery.min.js" type="text/javascript"></script>

The preferred and better way is to host your jQuery through Google, E.g.

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js" type="text/javascript"></script>

So, why is this better? Well there are several valid reasons:

CDN (Content Delivery Network) – Google’s datacenters are spread over a range of locations, and when a user requests content the closest location is automatically chosen. This is better because it does not force users to download from a single server location (E.g. your server), and the chances are Google will be able to serve content faster than your webhost. A similar theory is used for the popular web based game Quake Live. Usually CDNs are a service you pay for, but you’re getting this free through Google!

Less server load – When all your website’s files are located on a single server, downloading them simultaneously increases server load and some users will receive delays while files download. By having an external location for your jQuery library the latter is not an issue.

Improved caching – This is the biggest benefit, as users will not have to re-download content. Hosting jQuery on your own server will cause a first time visitor to download the whole file, even if they have several copies of the same file from other sites. Through Google’s CDN, re-requests for the same file will result in a response to cache the file for up to one year, as it is recognised as a repeat request for the same file.

Local bandwidth savings – By letting Google host the file for you, you are in essence saving bandwidth. For personal sites this may not be an issue, but busy sites will notice significant bandwidth savings.

Google actually suggests using a .load() function to load the library (see below), but this not only interrupts jQuery’s killer feature (document.ready) but also causes an extra HTTP request. Personally I prefer the old fashioned script method, even though there are several other valid reasons to use the .load() method.

<script type="text/javascript" 
        src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
  google.load("jquery", "1.3.2");
  google.setOnLoadCallback(function() {
    // jQuery is now loaded and ready to use
  });
</script>

Fixing Common W3C Validation Errors and SEO

Yet another thing to check when doing SEO is that your site validates via the W3C validation checker. A site that is XHTML valid is said to receive more frequent search engine crawls and, more importantly, longer crawl times. I won’t bore you with further details about why validation is a good thing (it’s a huge subject), but if you must, there is a great article about the subject right here. Creating a site to an XHTML valid standard encourages better coding practice and more semantic coding – making your site easier to crawl. You are also giving your site a better chance of displaying the same across multiple and future browsers.

Another lesser known theory is that spiders get ‘full’ when crawling a page; semantic coding practice will allow for cleaner and more lightweight code. For instance, when crawling a badly coded page with lots of inline styles and JavaScript (E.g. content not useful to a spider) the spider may become full too quickly and leave – missing your important content contained further on within the page.

Validating your site to at least XHTML 1.0 Transitional (the less strict version, compared to XHTML 1.0 Strict) is highly encouraged and is an area often ignored by developers. Below, I’ll quickly outline some of the common validation errors and how to easily fix them:

cannot generate system identifier for general entity X – 99% of the time this relates to errors with entity references, such as ampersands in URLs. E.g. having a URL like product.php?id=2&mode=view would result in this error, as the ‘&’ wasn’t escaped as ‘&amp;’ within the URL.

required attribute “alt” not specified – simply find the line number and add an alt attribute for the image. The presence of an alt attribute is required for both transitional and strict doctypes.

XML Parsing Error: Opening and ending tag mismatch – Depending on how organised you are when coding, this fix can take a matter of seconds or a lot longer. It relates to unclosed block level tags, such as a table or div. One plus point is that fixing such an error often results in several validation errors being fixed at once.

Continue reading Fixing Common W3C Validation Errors and SEO