The first stepping stone to getting yourself online

Google Webmaster Tools will probably be your first stepping stone to getting your website online. Registering a site with it tells Google that your new website exists, and it is a very helpful tool for learning the basic things you need to look out for if your site is to perform well in organic search. If any problems are listed in Webmaster Tools, fix them immediately; don't let them pile up. You can also use the data provided to uncover other problems that may occur on your site. Crawling issues, server trouble, or your site going offline will usually show up as big spikes or drops in the various graphs. If you see something like a page load time 10 or 20 times higher than usual, chances are you have had some sort of server issue and the Google crawler could not access your site. Google Webmaster Tools acts as a good warning and optimisation system, and there isn't a massive amount to keep track of, which is exactly why you should fix any problems quickly.

You should also tie Webmaster Tools to any other Google products you use, such as Analytics and AdWords. Sharing traffic and conversion data across platforms helps you build more effective marketing campaigns and gives you access to more data for complex analytics reports.

One last thing: don't submit a half-done website. Before you launch a site, ensure it's as polished as it can be. Use the latest schema to mark up your page content, write good meta for all your pages, set up your indexing rules, spell check all your content, structure your headings, and of course optimise your page speed.
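
To make that pre-launch checklist concrete, here is a minimal sketch of a page head covering the meta and indexing basics; the title, description, and store name are placeholder examples:

    <head>
      <title>Handmade Leather Wallets | Example Store</title>
      <meta name="description" content="Browse our range of handmade leather wallets, crafted in small batches.">
      <!-- Indexing rule: allow this page to be indexed and its links followed -->
      <meta name="robots" content="index, follow">
    </head>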

Search appearance

This section primarily deals with how your site is displayed in the SERP (search engine results page). You can click the info icon for a preview of how you can integrate schema into your webpage for a better display in the SERP.

Structured data

This section shows any schema tags you have across your site. It does take Google quite a long time to fully scan the schema across all your pages, but if your schema still isn't appearing here after a while, you may have implemented it incorrectly. In that case, run your page through the Google structured data testing tool to make sure there are no errors. You can also use this section to get an idea of which pages Google has fully crawled within your site.
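
As an illustration, a minimal JSON-LD block marking up a hypothetical organisation might look like this; the name and URLs are placeholders:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Organization",
      "name": "Example Store",
      "url": "http://www.example.com",
      "logo": "http://www.example.com/logo.png"
    }
    </script>

Once Google has crawled pages carrying markup like this, the items should start appearing in the structured data report.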

Data highlighter

You can use the data highlighter to simplify adding schema to your webpage. Be aware, however, that incorrectly nested schema may not display properly. You can technically only have one top-level schema item per page, so if you need extra schema you must nest the different schema elements within your page, as sketched below.
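
As a rough sketch of nesting, a Product item can carry its Offer as a nested element rather than as a second top-level item; the product and prices are placeholders:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Handmade Leather Wallet",
      "offers": {
        "@type": "Offer",
        "price": "49.00",
        "priceCurrency": "EUR"
      }
    }
    </script>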

HTML improvements

This section can help you identify duplicate content issues. Generally you will see meta here that is duplicated across pages. This may mean Google is indexing duplicates of pages within your site, or it may simply mean you are using the same meta on different pages, when every page should have its own meta, even if only slightly different. Realistically this section should contain no errors; if it does, work to fix them and ensure you have no duplicate meta titles, meta descriptions, or similar across your site.
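
For instance, two hypothetical category pages should each carry their own title and description rather than sharing one:

    <!-- /wallets -->
    <title>Leather Wallets | Example Store</title>
    <meta name="description" content="All our handmade leather wallets in one place.">

    <!-- /belts -->
    <title>Leather Belts | Example Store</title>
    <meta name="description" content="Full-grain leather belts, made to order.">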

Sitelinks

A lot of the time, when people search for your website, it will appear in the SERP with extra "sitelinks" below it. These are simply links to different pages within your site, generated automatically by Google based on the information crawled from your website. Generally they show when you come back as number 1 in the search results, although smaller sitelinks can show when you are not in the top position. You can use this section to demote links that you don't wish to appear. Your most important pages should be showing as sitelinks, not pages you deem unimportant, such as a terms and conditions page. Be very careful when using this feature: you absolutely must know what you are excluding.

Search traffic

Search queries

The most important part of Google Webmaster Tools is probably the search queries report. This report covers the last 3 months of data; ranking change data, however, only covers one month. There are reports both for the keywords your site has shown for in organic search and for the pages that have shown. The keyword report lists the keywords your website has shown for in Google search, and you can use this data to optimise your site and find keywords you want impressions for but are not yet getting.

In the search terms report you want to see lots of impressions, lots of clicks, a high click-through rate, and a high position for the terms you are showing for. The ultimate goal for any keyword is number 1, and the more keywords you have there the better. You can drill into individual keywords and see which page of the search results they have been appearing on. Generally, when your website starts off, you want to see a continuous rise in the search terms graph, with constant rises in impressions and clicks. You also want the number of keywords in the report to rise fairly substantially over time, along with the impressions for each keyword. If you check the report often, it doesn't take long to work out which high-volume keywords you should be targeting for your particular site. If you are appearing in a very bad position, say page 3 or beyond, for a term that is getting a lot of impressions, then chances are that getting onto page 1 for it will bring a massive increase in impressions, clicks, and traffic.

The keywords and pages reports show only desktop traffic and data by default, but you can click the x beside "web" at the top to remove the web-only filter and show mobile data as well.

Links to your site

Depending on how your site evolves over time, you may need the backlink data here very rarely or very frequently. Where this report comes in very handy is when you are removing / 404'ing pages within your site. If you are removing pages that have a lot of highly ranked backlinks, you may want to either keep the page or use a 301 redirect to pass all those great backlinks on to another page. This report is great for seeing which pages on your site are heavily backlinked. The one thing to watch out for is that backlinks won't appear in this report for a long time after Google first discovers them, usually months. You should also periodically check the links here and make sure no spam sites are linking back to your site in a way that may harm your site's performance in search.
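
As a sketch, a 301 redirect for a removed page can be set up in an Apache .htaccess file along these lines; the paths are placeholders:

    # Permanently redirect a removed page so its backlinks pass to the replacement
    Redirect 301 /old-wallets-page /wallets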

Internal links

The internal links report can be used to get a real idea of whether you have structured your website the right way. The most important pages within your site should have the most links, and the least important pages should have fewer. You can also use this report to identify crawl problems and to get an idea of how much of your site Google has crawled. If pages that should be listed here are missing, it means either that Google has not crawled them yet or that crawl issues are preventing Google from reaching them. Very new pages may not show straight away, but if very old pages don't show here, it could indicate an issue you need to investigate.

Manual actions

The manual actions report should never have any content; nothing should be listed here other than "No manual webspam actions found". If you see any sort of issue listed here, find a resolution immediately. This report details serious issues and policy violations that Google has found within your site. These violations could lead to serious consequences for your site in Google search, including poor performance, de-indexing, or removal from Google entirely.

International targeting

The international targeting report is fairly straightforward: it gives you the option to specify different language versions of your pages and the geographic location you wish to target. While it may seem like a good idea to select nothing and allow worldwide targeting, a website will generally need to be more specific than that. And while you can set the preferred geographic location in this section, Google will over time determine the right location for your website anyway.
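
On the language side, alternate language versions are declared with hreflang link tags. A sketch for a hypothetical page with English and German versions:

    <link rel="alternate" hreflang="en" href="http://www.example.com/page">
    <link rel="alternate" hreflang="de" href="http://www.example.com/de/page">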

Mobile targeting

Coming soon…

Google index

Index status

Google indexing today is extremely fast. If you make Google aware of new content within your site and it is of high quality, it will usually be indexed within a few hours. Pay careful attention to the graph here. If you always keep your sitemap up to date, the number of indexed pages should reflect the number of pages in your sitemap. If the numbers differ greatly, for example 1,000 URLs in your sitemap but only 500 indexed here, it may mean the unindexed pages have poor quality content that Google has not deemed worthy of showing in search results. If, on the other hand, you have 1,000 URLs in your sitemap and you are seeing 10,000 indexed URLs here, it may mean serious duplicate content problems across your site.

Content keywords

The content keywords report can be used to tailor and tweak your website to target specific relevant keywords. At the very top of this report, the keyword with the biggest bar and the most occurrences should be the most relevant, generic keyword across your site. As you go down, keywords should be listed in order of relevance and importance. If you are seeing a lot of useless keywords that are not really relevant to your site and won't bring in organic search traffic, you may have to look at the site-wide targeting of keywords and restructure your pages so that the primary keywords appear more throughout them.

Remove URLs

There may be times when you accidentally index pages within your website that you do not wish to be indexed. A perfect example is the pages of your admin system. If these pages somehow get indexed, you would not want to wait for Google to de-index them automatically, but rather remove them manually here as quickly as possible. This is another feature you need to be very careful with, as any page removed here cannot be included in the Google index again for months.
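
To avoid this situation in the first place, a minimal sketch of keeping a hypothetical admin page out of the index:

    <!-- On each admin page: tell crawlers not to index it -->
    <meta name="robots" content="noindex">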

Crawl

Another very important part of Webmaster Tools is its collection of tools and features for ensuring a crawler can index your website effectively.

Crawl errors

You can break this section up into two sets of errors. 404 errors are technically not errors, as 404'ing / removing pages is a natural process most websites go through. The other errors will generally be true errors, such as faulty redirects or server errors. Soft 404 errors belong in this second set too, as they are separate from normal 404 errors.

If you see 404 errors here, you will probably find yourself in one of two places. Either you have recently removed pages from your website, a 404 error naturally shows here, and everything is good, or 404 errors are showing for pages that should not have them. In the latter case, immediately test the page in question and see whether it is accessible. If you visit it and all looks good but the errors keep showing, it means Google, and perhaps others, are having trouble accessing it. The cause could be as simple as having restricted access to only your IP address, or it could be far more complicated to figure out. You can also examine the pages listed here, see the backlinks to them, and redirect those before the page is completely removed from the Google index. How long it takes Google to completely remove a page from its index depends on many things, including how old the page is and how highly ranked it is; sometimes it can take months upon months for a page to actually drop out.

You should be more concerned about the other errors that can be listed here. If you are on any sort of shared hosting, it can be difficult to get access to the server error logs, and although you may see a drop in hits in something like Google Analytics when your site is experiencing server errors, you don't get much other detail. Here, however, server error codes are listed, and you can use this information to track down issues with your server connectivity. If you have a mobile site that is not a responsive version of your website but a completely separate subdomain, mobile issues will be listed separately from your main site, which makes identifying which version of your site is having problems very easy.
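
On the soft 404 point: a soft 404 is a "not found" page that returns a 200 status instead of a real 404. In Apache, a sketch of serving a custom error page while still returning the correct status (the path is a placeholder):

    # Serve a custom page for missing URLs while returning a real 404 status
    ErrorDocument 404 /not-found.html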

Crawl stats

The only real thing you need to look out for in crawl stats is consistency. Although there will be variation in the graph of pages crawled per day, you should not see any massive spikes unless you are adding a lot of new pages to your website. A lot of inconsistency in this graph may indicate poor crawling of your website, and you may need to examine your linking structure. If you are consistently adding new pages, you should see a consistent rise in the crawling of your pages. If Google is only crawling a few of your pages every day, it may point to issues with your page content. Updating a page's content frequently will make Google crawl your pages more frequently.

Fetch as Google

If you are experiencing any sort of issue with Google crawling your pages, you can use the Fetch as Google tool to fetch a page from your website as Google and see what is returned. You don't get a huge amount of information here, but the main things to look for are that the HTTP headers returned are all okay and that the page itself and its HTML come back. Fetch as Google is also very handy for submitting new content to Google. Instead of constantly re-submitting your sitemap every time you add a new page, you can fetch the page here and submit it to Google. After submission, the page will usually be crawled within minutes.
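
As a rough idea of what "okay" looks like, a healthy fetch should come back with something along these lines (values are illustrative):

    HTTP/1.1 200 OK
    Content-Type: text/html; charset=UTF-8

    <!DOCTYPE html>
    <html>
    ...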

Robots.txt tester

If you rely on your robots.txt file to block parts of your website from being crawled, check this section to make sure the file is working as expected.
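
A sketch of a simple robots.txt, assuming a hypothetical /admin/ area you want kept out of the crawl:

    # Applies to all crawlers
    User-agent: *
    # Keep the admin area out of the crawl
    Disallow: /admin/
    # Point crawlers at the sitemap (placeholder URL)
    Sitemap: http://www.example.com/sitemap.xml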

Sitemaps

The first thing you will probably want to do after registering your site is submit a sitemap. A sitemap gives the Google crawler a clear blueprint of your website and speeds up the process of getting all your pages indexed. Generally you want as many relevant, good-content pages on your website at launch as possible. You don't need to keep submitting your updated sitemap to Google; you can just update the sitemap and Google will download the new version after a while. If, however, you change your site structure substantially, by removing a large number of pages, adding a large number, or both, you should re-submit the sitemap to speed up crawling of the new structure.
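
A minimal sitemap sketch with placeholder URLs and dates:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2015-01-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/wallets</loc>
        <lastmod>2015-01-01</lastmod>
      </url>
    </urlset>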

URL parameters

If you have any sort of URL parameters on your website, you must add them here and specify what they do. Google's documentation gives a more comprehensive overview of setting URL parameters.
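
For example, with hypothetical URLs, a sort parameter only reorders content while a page parameter changes it, and Google should be told which is which:

    http://www.example.com/wallets?sort=price   (reorders the same content)
    http://www.example.com/wallets?page=2       (serves different content)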
