Why You Should be Using Google Webmaster Tools
Google Webmaster Tools (GWT) is an important, but often overlooked, part of your SEO strategy. This is partly because GWT has traditionally surfaced information that mattered more to Google than to webmasters. The reality, however, is that GWT provides data that Google Analytics doesn’t, and with a little knowledge webmasters can read between the lines and leverage this information to their benefit.
Though there is a wealth of data and many tools to get to grips with, I’ll cover some of the fundamentals to help you use them effectively. You may hear differing opinions about which aspects of GWT matter most. I’m going to play the diplomat here and say they’re all important: which ones matter most depends on your business needs. If you want a great site, you shouldn’t be cutting corners; you should be making sure everything is handled.
When applied properly, GWT can be a great assistant to your SEO campaign and, given that it is free, it is at least worth a look. Unfortunately, small mistakes can have a significant negative impact on how your site is indexed, which is why it is important to have a solid understanding of each tool before you use it.
Your first step is to verify your site, which can be done quite easily through your domain name provider. Alternatively, if you're already using Google Analytics, you can verify through your Analytics account. Google Webmaster Tools is categorised into four sections: Traffic, Optimisation, Labs and Health. Here are some important tools that you should get to grips with.
Messages
These are the notifications you receive from Google, covering problems such as site hacking and links pointing to your site that fall outside the Google Webmaster Guidelines. Whilst in the past Google rarely sent direct messages, far more information now passes between Google and webmasters, so it is important to check this section frequently.
Manual Actions
This is a new tool that allows you to see when Google’s manual webspam team has applied any manual actions to your site. This may occur if you have pure spam, keyword stuffing or unnatural links pointing at your site, amongst other things. Google has compiled a full list of manual actions that you can see here, and it’s worth becoming familiar with the list in its entirety. Manual actions are split into site-wide matches, meaning your whole site has been affected, and partial matches, meaning only part of the site has been affected.
Links to your site
This is another important aspect of Google Webmaster Tools. This section provides an analysis of the links pointing to your site, including the number of links, which sources link to you most, which of your pages are linked to most frequently and the anchor text used. The internal links section, meanwhile, analyses how pages are linked within your own website; for most businesses these will be product, services or contact pages. To make sure you have the most effective navigation for visitors, review these figures often.
Tip: The Google Penguin algorithm is cracking down on webspam, so it is important to manually audit your backlink profile in order to minimise poor quality links.
Search Queries
This tool shows you the search terms that are performing effectively for your site, including some information that Google Analytics doesn’t provide. This is potentially the most immediately applicable data for your campaign. You can see a more detailed breakdown of the search queries here.
Sitemaps
A sitemap is essentially a map of all the pages on your site. You should submit one to ensure that Google can see all of your pages, as the search engine’s crawler can otherwise miss some.
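As a sketch of what a submitted sitemap contains, the standard format follows the sitemaps.org protocol, and a minimal one can be generated with a few lines of Python (the URLs below are placeholders, not from the original article):

```python
# Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs.
# The URLs here are illustrative placeholders -- substitute your own pages.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/services",
])
print(sitemap)
```

Save the output as sitemap.xml at your site root and submit it under the Sitemaps section of GWT.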
HTML Improvements is a brief report that flags common errors, including duplicate meta descriptions and problematic title tags; fixing these allows your pages to be indexed more efficiently.
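To illustrate the kind of issue this report flags, here is a small sketch that finds pages sharing the same meta description; the page data and function name are illustrative assumptions, not part of GWT itself:

```python
# Find pages that share an identical meta description -- the kind of
# duplication GWT's HTML Improvements report highlights.
# The (url, description) pairs below are illustrative placeholders.
from collections import defaultdict

def duplicate_descriptions(pages):
    by_desc = defaultdict(list)
    for url, description in pages:
        # Normalise whitespace and case so near-identical copies match.
        by_desc[description.strip().lower()].append(url)
    # Keep only descriptions used on more than one page.
    return {d: urls for d, urls in by_desc.items() if len(urls) > 1}

dupes = duplicate_descriptions([
    ("https://www.example.com/", "Quality widgets since 1999."),
    ("https://www.example.com/about", "Quality widgets since 1999."),
    ("https://www.example.com/contact", "Get in touch with our team."),
])
print(dupes)
```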
Crawl Errors shows you the problems Googlebot encountered while crawling your site, listing the pages with broken links or error response codes so that you can fix them as soon as they arise.
Tip: You can also graph the errors over time to better spot reasons behind broken links.
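As a sketch of how exported crawl error data might be triaged (the URL/status pairs and function name are assumptions for illustration), separating "not found" pages from server errors helps you fix the right things first:

```python
# Group crawl report entries by error class so broken links can be
# fixed in priority order. The (url, status) pairs are placeholders.
def triage_crawl_errors(entries):
    not_found = [url for url, status in entries if status == 404]
    server_errors = [url for url, status in entries if status >= 500]
    return {"not_found": not_found, "server_errors": server_errors}

report = triage_crawl_errors([
    ("https://www.example.com/old-page", 404),   # broken link: fix or redirect
    ("https://www.example.com/contact", 200),    # healthy page
    ("https://www.example.com/api", 503),        # server problem: check hosting
])
print(report)
```

404s usually call for a fix or a 301 redirect, while 5xx codes point at server or hosting problems rather than broken links.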
Crawl Stats
These crawling stats tell you how Googlebot has crawled your site over the past 90 days. The benefit here is that you can see the number of pages that have been crawled, and the speed at which Googlebot crawls each page.
Tip: If you have pages that are crawled at a slower rate, you may want to re-optimise aspects of those pages in order to get them indexed faster. Two pieces of software that can be very helpful here are Firebug for Firefox and Google’s PageSpeed add-on.
Index Status
This reveals statistics on how many of your URLs have been crawled and/or indexed by Google. Make sure your number of crawled and indexed pages is steadily increasing over time; a sharp decrease may suggest your server is down or overloaded. The index status is split into three groups:
- Total indexed - the number of your URLs currently indexed by Google. These are all eligible to appear in search results and will likely change over time.
- Ever crawled - the total number of URLs on your site that Google has ever crawled.
- Blocked by robots - the number of URLs Google could not crawl because they are disallowed in your robots.txt file.
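For context, a URL ends up in the "blocked by robots" count when it matches a Disallow rule in robots.txt; a minimal example looks like this (the paths and domain are placeholders):

```
User-agent: *
Disallow: /admin/
Disallow: /search-results/
Sitemap: https://www.example.com/sitemap.xml
```

A rising "blocked by robots" figure is only a problem if pages you actually want indexed are matching these rules.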
Sitelinks
Sitelinks are the sub-links that appear under your domain name in a Google search. They are determined based on the authority of your domain and are eye-catching on a Google results page, so they are likely to drive more traffic to your site than a plain old domain and description. This section allows you to remove sitelinks if, for whatever reason, you don’t want them to appear in Google searches.
Tip: Be cautious when removing sitelinks. There have been stories of people removing one sitelink, losing them all, and going through a long process with Google to get them back.
URL Parameters allow you to manually restrict what Googlebot crawls. This is useful if you are having problems with duplicate content, but don’t use it unless you are sure about what you are restricting. As with many of the tools in GWT, a small mistake can have big consequences, so make sure your webmaster is very familiar with your website's structure before adjusting the parameters.
Tip: Some additional ways to reduce duplicate content before adjusting URL parameters include:
- Reviewing HTML Improvements suggestions
- Being consistent with canonical URLs
- Using 301 redirects when appropriate
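As a sketch of how the last point might look on an Apache server with .htaccess (the domain and paths are placeholders; other servers use different syntax), a 301 tells Google the duplicate URL has permanently moved:

```
# Permanently redirect an old/duplicate URL to the canonical one.
Redirect 301 /old-services https://www.example.com/services

# Redirect the non-www host to the www version site-wide
# (requires mod_rewrite to be enabled).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```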
Google Webmaster Tools ultimately provides a unique set of data that other tools, including Google Analytics, won’t give you. Start utilising these free tools today to maximise your SEO campaign.