The SEO Checklist
Here you’ll find a comprehensive list of Search Engine Optimisation considerations for just about any website. The following covers fundamentals such as keyword research, title tags, duplicate content, site health issues and beyond.
Identify Important Keywords:
Search Engine Optimisation starts with identifying target keywords.
What search queries should the website be visible for on search engines?
What search queries will drive the appropriate audience to the website?
Head Term: Getting newer websites to rank for single keywords is very hard. Google generally favours older, established websites for these competitive head terms.
example: Jeans <head term>
Long Tail: Although they may get less traffic, long tail keywords attract more qualified users and are great for location-specific targeting.
example: Buy Jeans Dublin <long tail>
▷ 3 or more long tail keywords have been identified for the website.
Title Tags:
The title tag element of a web page is one of the most important factors evaluated by Google, and is meant to be an accurate, concise description of a webpage’s content. It is also what search users read and click on from SERPs (search engine results pages).
- Take full advantage of the available length (around 70 characters before Google truncates the title).
- Include the most important or highest search volume keyword on the homepage title tag.
- Include the brand or website.
- Create unique title tags for each webpage.
example: My Jeans Shop | Buy Mens Jeans Dublin | Black Skinny Jeans
example: Buy Mens Jeans Dublin | Cheap Skinny Jeans | www.MyJeansShop.com
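The example titles above can be sketched in markup. This is a hypothetical snippet; the shop name and keywords are illustrative, not from a real site:

```html
<head>
  <!-- Hypothetical: brand name plus long tail keywords, within the ~70 character limit -->
  <title>My Jeans Shop | Buy Mens Jeans Dublin | Black Skinny Jeans</title>
</head>
```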
▷ The home page has a unique title tag taking advantage of the most important long tail keyword and the brand / website name.
▷ Every webpage has a unique title tag.
Meta Descriptions:
A webpage’s meta description is the short snippet of text, around 160 characters, that serves as the advertising copy for your business when listed on search engine results pages. A meta description should summarise what the page is about.
Where no meta description is predefined, Google displays around 160 characters of the webpage content that best serves the search query. On the SERP, the keyword (search query) is shown in bold in both the title of the webpage and the description. It is therefore important to create keyword-rich meta descriptions that entice search users to visit the webpage. Increased click-through rates due to attractive meta descriptions can lead to better rankings.
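As a sketch, the meta description sits alongside the title tag in the page’s <head>. The copy below is hypothetical: keyword-rich, under 160 characters, with a call to action:

```html
<head>
  <title>My Jeans Shop | Buy Mens Jeans Dublin</title>
  <!-- Hypothetical: summarises the page, uses long tail keywords, ends with a call to action -->
  <meta name="description" content="Buy mens jeans in Dublin at My Jeans Shop. Skinny, straight and slim fits in all sizes. Order online today for free delivery.">
</head>
```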
▷ Every webpage has a unique and enticing meta description taking advantage of important long tail keywords. Each meta description is no more than 160 characters.
▷ All meta descriptions for key landing pages have a strong call to action.
Usage of HTML Headings: H1, H2, H3
Taking advantage of HTML heading and sub-heading elements serves two purposes:
1) Breaking up content (better, easier to read content)
2) Incorporating keywords into headings is effective and helps avoid the common problem of plain keyword stuffing.
Google is known to put a high value on content in elements such as <h1>, <h2> and <strong> when evaluating a webpage.
Use <h2> and <h3> as subheadings, ordered by importance: a webpage should always be top heavy, with its most important content first.
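A minimal sketch of a top-heavy heading structure, using a hypothetical jeans shop page:

```html
<!-- Hypothetical: a single keyword-led h1, with h2/h3 subheadings ordered by importance -->
<h1>Buy Mens Jeans in Dublin</h1>
<h2>Skinny Jeans</h2>
<p>Our most popular fit, including <strong>black skinny jeans</strong> in all sizes.</p>
<h3>Delivery Information</h3>
<p>Free delivery across Dublin on orders over €50.</p>
```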
▷ At least one H1 heading incorporating an important, relevant keyword exists on all landing pages.
SEO friendly URLs:
Including keywords in URLs is a proven method of improving search visibility.
SEO friendly URLs (aka permalinks) should be helpful to humans and search engines alike in being readable from a content hierarchy / website structure perspective.
The standard out-of-the-box WordPress format is: http://www.mysite.com/?p=123
examples of SEO friendly URLs: http://www.mysite.com/mens-jeans/ and http://www.mysite.com/blog/how-to-wear-skinny-jeans/
▷ The website uses SEO friendly permalink URLs.
Duplicate Content:
Never assume a website has no duplicate content. Examples include:
1) Pages that are http before login and https after.
2) Printer-friendly pages created by your CMS that have exactly the same content as your web pages.
3) If the home page has multiple URLs serving the same content, for example: http://yoursite.com, http://www.yoursite.com and http://www.yoursite.com/index.htm.
4) Pages that have been duplicated due to session ids and URL parameters, such as http://yoursite.com/product and http://yoursite.com/product?sessionid=5486481
5) Any webpage that includes a filter or sorting method that essentially delivers the same content (consider a clothing store, and sorting by t-shirt size or colour) will have duplicate content issues.
6) Any websites that use the same paragraph of text, in site-wide areas such as sidebars and footers, will have duplicate content issues.
Where possible, consider expanding webpages to include distinct, relevant content per each URL.
Where duplicate content issues can’t be avoided, apply the rel="canonical" tag within the <head> to direct Google to the “best page” for this content (i.e. the priority for indexing).
An alternative to using rel="canonical" on duplicate content pages is the meta robots tag. Use the meta robots tag with a noindex value and this will stop a duplicate page from being indexed by search engines.
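Both remedies can be sketched as follows; the URLs are hypothetical. Each tag goes in the <head> of the duplicate page, not the preferred one:

```html
<!-- Option 1 (hypothetical URL): point search engines at the preferred version of the content -->
<link rel="canonical" href="http://www.myjeansshop.com/mens-jeans/">

<!-- Option 2: keep this duplicate page out of the search index entirely -->
<meta name="robots" content="noindex">
```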
If the website is mature, has received a facelift or has inherited old content for whatever reason, ensure the best quality, original content is presented on landing pages closest to the home page. As a site changes or evolves, use 301 redirects to “move” old, defunct or unnecessary webpages to their current equivalents. Create a sitemap to use as a reference for content hierarchy and uniqueness.
Also, maintain consistency when linking internally throughout the website (see the Internal Linking Structure section below).
▷ The website has no duplicate content issues.
▷ Any identified duplicate content issues have been addressed with a canonical, redirect, meta robots or creative solution.
Internal Linking Structure:
Webpage linking considerations go beyond just visitor navigation. Thoughtfully linking webpages within the website is very important for Search Engine Optimisation, and how Google evaluates a website as a whole.
Creating a HTML sitemap page such as http://www.apple.com/sitemap/ is advantageous for crawl bots. The sitemap URL can also be fetched via the webmaster tools “Fetch as Googlebot” function to encourage crawling.
It is important to ensure that the home page and key landing pages link off to fewer pages. The home page and key landing pages should be the primary target of links coming from external websites, and so carry the most “Google Juice” / PageRank from those external websites. The more links on these key pages, the more thinly that “Google Juice” is spilled. Search visibility is served best when the juice flows primarily to the most important landing pages (the pages where search users should land and first experience the website), more so than to any other pages.
Outer webpages that are further away from the home page / key landing pages should internally link back to the home page and any key landing pages more aggressively.
Ways to maximise internal linking:
Take advantage of a helpful (in terms of navigation), keyword driven footer links system, but ensure the home page / key landing pages differ from outer webpages. Footer links have been devalued due to over-optimisation tactics, and it is important that footer links on important pages do not link to those same pages.
Incorporating “related posts” and sidebar links on news articles / blogs is another positive way to encourage Google’s evaluation of important pages from linking pages.
Internal links are links that go from one page on a domain to a different page on the same domain.
Internal Links fall into a variety of descriptive categories:
Top Menu Navigation – Links within the main menu of the website.
Contextual / In-Content Links – Links that take advantage of keywords within webpage content.
Footer Links – Traditionally sitewide links that appear in the footer section at the bottom of a website.
Sidebar Links – Traditionally sitewide links that appear in a column to the side of normal webpage content.
Sitewide Links – Links that appear on every single webpage of a website.
Due to old-school over-optimisation tactics, sitewide links can be a contentious issue. With the exception of top menu navigation links, footer or sidebar links that appear exactly the same on every single webpage should be avoided.
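As an illustration of the categories above, here is a hypothetical contextual / in-content link with descriptive, keyword driven anchor text:

```html
<!-- Hypothetical contextual link: the anchor text describes the destination page -->
<p>Our full range of <a href="/mens-jeans/skinny/">mens skinny jeans</a> is available to order online.</p>
```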
▷ The website has an up to date HTML sitemap page.
▷ The anchor text used in Internal links is not repeated between the various top menu navigation, sidebar, footer link areas on any individual webpage.
▷ Footer links on the Home Page & Key Landing pages are fewer in number / capped and don’t link to themselves.
▷ There are no links on the Home Page & Key Landing Pages pointing to external websites.
Navigation & Content Hierarchy:
Websites generally have a pyramid structure where the most important page (in terms of search visibility), such as the home page, has thinner content, and outer pages go into more specifics and greater detail. Search engines want to serve the search user the best content for their search query.
If the best, most comprehensive content is in outer pages, and the best landing experience for visitors is closer to the home page, how can these landing pages be optimised to rank higher on SERPs?
A truly optimised website will take full advantage of properly introducing and summarising its topics, using important keywords on the home page and top level landing pages, which are also the pages that organically acquire the most back-links from external websites.
An example of an optimised content hierarchy:
▷ Key Landing Pages introduce or summarise the purpose of the website’s products or services by using important target keywords. OR
▷ Key Landing Pages introduce or summarise the topics discussed on sub-pages by using important target keywords.
Images:
Always use original images. Incorporate keywords and the website / brand name into image filenames and image alt tag descriptions. Web crawlers cannot see or evaluate images, but they can read their textual descriptive elements.
Consider a content delivery network if the website’s images are causing slower page load times.
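A sketch of a descriptive image filename and alt tag (both hypothetical):

```html
<!-- Hypothetical: keywords and brand name in both the filename and the alt description -->
<img src="/images/my-jeans-shop-black-skinny-jeans.jpg"
     alt="Black skinny jeans from My Jeans Shop Dublin">
```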
▷ All Images use descriptive / keyword laden image alt tags.
International Targeting & Domains:
.com website domains can have their geographic target set for search indexing via Google webmaster tools. Leave the geographic target unset if the website is targeting multiple countries/regions.
.com websites are affected by the location of hosting.
Country specific TLDs such as .ie, .co.uk and .com.au do not have the flexibility of webmaster tools geographic targeting, and will always organically index in their own country specific Google search index. Hosting location for these domains has less of an impact on SEO.
Essentially, if you’re hosting in Germany, you get a stronger association with Germany. Add to that a .de domain name and you’ve further reinforced your association with the country. The country specific association or relationship is also bolstered by off-page links coming from the target country.
Every domain and subdomain must establish its own search authority individually. This is a key consideration that should precede web development / deployment for international websites.
Although subdomains can look attractive on SERPs, in that visitors can identify uk.versionofthesite.com at a glance, subdomains do not inherit a great deal of search authority from the primary domain. Also, subdomains typically get mis-linked (or don’t organically get linked to) because the primary domain www.versionofthesite.com attracts most of the back-linking attention.
Subfolders such as www.mysite.com/ie/ and www.mysite.com/uk/ offer the ideal solution for already established, and expanding websites. However, hosting and country specific servers/IP address flexibility is limited.
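For a subfolder setup like the one above, each version of a page can also declare its alternates with hreflang annotations in the <head> — a technique not named explicitly in this checklist, sketched here with hypothetical URLs:

```html
<!-- Hypothetical: English/Ireland, English/UK, and a default for everyone else -->
<link rel="alternate" hreflang="en-ie" href="http://www.mysite.com/ie/">
<link rel="alternate" hreflang="en-gb" href="http://www.mysite.com/uk/">
<link rel="alternate" hreflang="x-default" href="http://www.mysite.com/">
```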
▷ The website is .com with subfolders for international landing pages taking advantage of country/language specific redirects (lazy option).
▷ There are country specific TLD versions of the website per geographic target (only a few geographic targets exist, and the list is not growing).
Taking Search Engine Optimisation Seriously:
A common recommendation is to ensure the website is designed/developed for users and not search engines. This is fair, so long as you assume every user is 10 years of age (or of similar intellectual capacity), and accept that you must describe and detail the purpose of the website, the website’s subpages and the links therein as consistently as possible.
Web crawlers are bots, and are in no way as dynamic, or as blessed with billion-dollar, Mars-rover-landing artificial intelligence, as Google or schoolboy white hat SEOs suggest.
Broken Links stop crawlers dead in their tracks. Always monitor, and check for broken link errors. This is especially important where internal-links are concerned.
Missing Page (404) errors are an indication of the quality & reliability of a given website. Numerous missing page errors tell search engines the site is poorly maintained. During site redesign it is especially advantageous to ensure any webpage with a back-link from another website doesn’t return a 404 error after new-site-deployment.
Validating the markup of HTML and CSS files will help ensure there are no human-introduced code errors or visually unnoticed code issues that can impact the crawlability of the website.
Slow websites annoy users. Slow webpages hinder goal funnels. And poor page load times have been shown to be the difference between prime ranking positions and 2nd page ranking positions for target keywords. Google values the user experience of its most profitable product, the Google search engine, and serving up slow, unresponsive, broken websites damages that experience.
User experience matters, and easy to navigate websites with strong calls to action generally enjoy longer visitor durations and lower bounce rates.
▷ Google webmaster tools has been setup and verified.
▷ The website has been indexed by Google.
▷ Google Analytics has been setup and tested.
▷ Broken links are monitored, identified and corrected.
▷ Missing page errors are corrected with 301 redirects / creative content.
▷ CSS / HTML files have been validated with W3C.
▷ Page load times are consistently below 4 seconds.