Web Site Optimisation Essentials
It is our company policy to work through the checklist of web site optimisation elements below before a web site is launched.
Some items directly affect search engine rankings; others are value-adding supplements to online branding.
1. HTML Validation
HTML validation is the process of checking a web page you have created to make sure its HTML code complies with the standards set by the W3C (the World Wide Web Consortium, the organisation that issues the HTML standards) and that the page is free of errors.
While there is no evidence that invalid HTML code negatively affects search engine rankings, it may prevent search engine spiders from "deep" crawling your website.
Use the W3C HTML and CSS validators to check the code of each web page of your site, and if there are errors, ask your webmaster to fix them.
If you are using templates, then validating and fixing template pages will be enough.
2. Fixing Broken Links
Internal links refer to links published on a web page that point to other web pages within the same website.
Broken links may degrade the overall quality of a web page, resulting in lower search engine rankings.
Furthermore, broken links prevent PageRank from being passed on from one web page to another.
Xenu is an excellent free tool for checking broken links.
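The first step of any link checker, including tools like Xenu, is harvesting a page's internal links. A minimal sketch of that step using only Python's standard library (the sample HTML and the example.com domain are illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags as the page is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(base_url, html):
    """Return absolute URLs of links that stay on the same host."""
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    absolute = (urljoin(base_url, href) for href in parser.links)
    return [url for url in absolute if urlparse(url).netloc == host]

sample = '<a href="/about.html">About</a> <a href="http://other.com/">Out</a>'
print(internal_links("http://www.example.com/", sample))
# → ['http://www.example.com/about.html']
```

Each collected URL would then be fetched (for example with urllib.request) and any response with a 404 or 5xx status reported as a broken link.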
3. 301 Re-directions
Each domain usually has two default URLs: a "www" and a "non-www" version. To avoid duplicate content problems, we recommend redirecting the "non-www" URLs to the "www" URLs (or the other way around, depending on your preference).
For example http://pro-internet-marketing.com.au is re-directed to http://www.pro-internet-marketing.com.au.
According to the HTTP standard there are several status codes for redirection. Google recommends using a 301 redirection (moved permanently).
301 Redirection Code for Mod_Rewrite, PHP, ASP, .NET, Perl and ColdFusion
The Duplicate Content Penalty Myth
Information on 301 redirects by Google
Check Server Headers Tool allows you to check the HTTP status returned by a server.
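As an illustration, the non-www to www redirect can be configured on an Apache server in an .htaccess file. This is only a sketch: it assumes Apache with mod_rewrite enabled, and the domain is the example used above.

```apache
# Permanently (301) redirect non-www requests to the www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^pro-internet-marketing\.com\.au$ [NC]
RewriteRule ^(.*)$ http://www.pro-internet-marketing.com.au/$1 [R=301,L]
```

The R=301 flag is what makes the redirect permanent; without it, mod_rewrite defaults to a 302 (temporary) redirect, which does not pass link value the same way.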
4. Creating a robots.txt
A robots.txt file is a plain text file that instructs search engine spiders (robots) not to crawl all, or specific, pages of a site.
You need a robots.txt file if your site includes web pages that you don't want search engines to index.
Please note that the robots.txt file must be placed in the root directory of the web site.
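For example, a minimal robots.txt that blocks all robots from one directory while leaving the rest of the site open might look like this (the directory name is illustrative):

```
User-agent: *
Disallow: /private/
```

An empty Disallow line (or no robots.txt at all) means the whole site is open to crawling.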
5. Creating an XML site map
XML Sitemaps are XML files that list all URLs (individual web pages) available for indexing on a web site.
While there is no benefit to having an XML site map from a ranking perspective, XML sitemaps help search engines find and index all the pages on a web site more easily. They are particularly important for larger web sites.
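A minimal sitemap following the sitemaps.org protocol looks like this (the URL and date are placeholders; a real sitemap would list every indexable page):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-01-01</lastmod>
  </url>
</urlset>
```

The finished file is usually placed in the site root and submitted to search engines via their webmaster tools.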
6. Creating custom 404 pages
A "404 Not Found" page is shown when the server has not found anything matching the requested URL.
A custom 404 page helps retain online visitors within a web site instead of having them leave.
It also allows search engine spiders to continue crawling the web site.
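On an Apache server, pointing the server at a custom 404 page is a one-line directive, typically in .htaccess (assumes Apache; the /404.html path is an example):

```apache
# Serve the custom error page, keeping the 404 status code
ErrorDocument 404 /404.html
```

It is important that the custom page is still served with a 404 status rather than a 200, so search engines do not index the error page itself.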
7. Installing Google Analytics
Monitoring and measuring a web site's performance are critical components of online success.
Install Google Analytics for free.
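The exact tracking snippet is generated in the Google Analytics admin interface for your account; the sketch below is the classic asynchronous version, with a placeholder account ID, pasted before the closing head tag of each page:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);  // placeholder: your GA account ID
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```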
8. Creating a Favicon
A favicon is a small image that appears in a browser next to your web site's URL and in bookmarks.
A Favicon has no impact on search engine rankings, but is a value-adding supplement to your online brand.
Free web tools are available for creating still and animated favicons from regular images.
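Once the icon file exists, it is referenced from the head of each page; /favicon.ico in the site root is the conventional location that most browsers also check automatically:

```html
<!-- Placed in the <head> of each page -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />
```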
We wanted this page to appear in search results for the term "web site optimisation", so we called it Web Site Optimisation Essentials.