

Published: 20th Dec 2017 | Words: 1,489 | Reading Time: 8 minutes

Categories / Tags:

SEO
Crawling

Webmaster Tools URL parameters

Proper configuration of Google URL parameters is one of the key ways to avoid duplicate content across your website. Setting them correctly also avoids wasting crawler resources and leads to better crawling of your site overall. Misconfiguring URL parameters in Google Webmaster Tools, however, can block parts or even all of your website from being crawled and eventually get pages de-indexed. There are hundreds of tweaks you can make to your website to improve its SEO, but setting URL parameters sits near the top of the list, and on certain large websites it may be the single most important one. To understand why setting Google URL parameters and Bing URL parameters matters so much, you first have to understand exactly what search engines classify as a page.

https://ddmseo.com/page
https://ddmseo.com/page?param=1 // A URL param format we all know
https://ddmseo.com/page?a // A URL param can be any addition to a URL

If a parameter is added to a page's URL from any source, a link, a form submitted using GET and so on, a search engine can follow that URL, treat it as a separate page and index it. This is where it all starts, and it can end in thousands of duplicate pages with a hugely negative impact on your SEO. It is the basis not only for duplicate content but for a range of other indexing issues. Search engines will generally pick up most or all URLs with parameters on your website, which also means you can configure the crawling of pretty much every duplicate URL. Things really get out of control when a site has many URL parameters, each parameter takes many different values, and they combine in many different orders.
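To make that concrete, here is a minimal sketch, using Python's standard library, of how the three example URLs above parse. This is only an illustration of the idea, not how any search engine is actually implemented: the path is identical in all three, but each full URL string is unique, so a crawler that indexes by URL sees three separate pages.

from urllib.parse import urlparse, parse_qs

# The example URLs above: one path, three different query strings.
urls = [
    "https://ddmseo.com/page",
    "https://ddmseo.com/page?param=1",
    "https://ddmseo.com/page?a",
]

for url in urls:
    parts = urlparse(url)
    # keep_blank_values=True so the bare "?a" still counts as a parameter
    print(parts.path, parse_qs(parts.query, keep_blank_values=True))

# Prints:
# /page {}
# /page {'param': ['1']}
# /page {'a': ['']}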

https://ddmseo.com/catalogue?page=1&sort=A_Z
https://ddmseo.com/catalogue?page=1&sort=Z_A
https://ddmseo.com/catalogue?sort=A_Z&page=1
// and more

This problem can get so bad that 90% or more of your indexed pages are duplicates. That figure is what you get if every page on a site has 9 duplicates, and certain pages on real websites can have hundreds or thousands. If you run a large website and analyze your URL parameters and the URLs they generate, I think you will find that a large percentage of what is indexed is duplicate content, which is why this is one of the most important aspects of SEO. With these problems a search engine will never crawl or return everything you want in search results, because it treats all of your URLs as separate pages carrying a lot of repeated content. You can configure Webmaster Tools to avoid these URLs or to crawl them in a certain way, and on large websites this is the way to remove hundreds of thousands of duplicate pages.
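To get a feel for the scale, you can multiply out the parameters a single catalogue template accepts. The parameter names and value counts below are invented purely for illustration; swap in your own from a crawl export and the number is usually sobering.

from itertools import permutations, product
from urllib.parse import urlencode

# Hypothetical catalogue parameters and the values each can take.
params = {
    "page": [str(n) for n in range(1, 11)],             # 10 pages
    "sort": ["A_Z", "Z_A", "price_low", "price_high"],  # 4 sort orders
    "view": ["grid", "list"],                           # 2 layouts
}

urls = set()
for combo in product(*params.values()):
    pairs = list(zip(params.keys(), combo))
    # The same pairs in a different order are still a different URL string.
    for ordering in permutations(pairs):
        urls.add("https://ddmseo.com/catalogue?" + urlencode(ordering))

print(len(urls))  # 480 crawlable URLs for what is essentially 10 pages of products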

Setting Google Webmaster Tools URL parameters

Go to Webmaster Tools and navigate to Crawl > URL parameters

If you see big numbers here, you probably have problems. Click Edit next to any parameter in the table. The first question you will see is:

Does this parameter change page content seen by the user?

Click > Show example URLs

Be careful when these parameters are combined with other URL parameters; you don't want to block any other parameters inadvertently. Analyze each one carefully and make a decision.

No: Doesn't affect page content (ex. tracks usage)

You may need to set this when some sort of id or arbitrary number is used within your URLs. Values like these aren't used to change page content but to pass a specific id for a user, session or similar. They have no interaction with the content and don't change it in any way.

https://ddmseo.com/cart?product_id=24134
https://ddmseo.com/browse?tracking_id=1000001
https://ddmseo.com/user?user_id=4
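When a parameter really is passive like this, all of its variations can be collapsed back to one canonical URL. Below is a rough sketch of that idea in Python; the list of passive parameters is an assumption based on the examples above and would have to match what your own site actually uses.

from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters we have verified do NOT change page content (assumed list).
PASSIVE_PARAMS = {"tracking_id", "user_id", "session_id"}

def canonical(url):
    """Strip passive parameters so every tracking variant maps to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in PASSIVE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical("https://ddmseo.com/browse?tracking_id=1000001&sort=A_Z"))
# https://ddmseo.com/browse?sort=A_Z

Telling Webmaster Tools that a parameter doesn't affect page content achieves much the same thing from the search engine's side.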

Yes: Changes, reorders, or narrows page content

Dynamic content via URL parameters

URL parameters can be used to generate different content on a page: they pass different values, and those values are used in some way to produce new content. You would usually want the pages with different parameter values indexed, because they may hold different content and are in essence separate pages. However, you can also have URL parameters that don't change a page's content at all and only pass values for tracking purposes or some kind of id. If you don't understand the difference between these two kinds of parameter, setting the rules yourself can be dangerous, because you may end up blocking the parameter that generates different content and, in turn, your pages. If you don't know what you are doing, just leave "Let Googlebot decide" selected for the parameter.
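If it helps, here is a toy example, with made-up parameter names, of the two kinds of parameter side by side: sort genuinely changes what the page renders, while session_id is only ever read for tracking.

PRODUCTS = ["Anvil", "Bucket", "Crate"]

def render_catalogue(query):
    """Toy renderer: 'sort' changes the output, 'session_id' never does."""
    # A real page would log 'session_id' for analytics; it is never rendered.
    return ", ".join(sorted(PRODUCTS, reverse=(query.get("sort") == "Z_A")))

print(render_catalogue({"sort": "A_Z"}))                      # Anvil, Bucket, Crate
print(render_catalogue({"sort": "Z_A"}))                      # Crate, Bucket, Anvil
print(render_catalogue({"sort": "A_Z", "session_id": "42"}))  # Anvil, Bucket, Crate

The first and third URLs are duplicates of each other; the second is a genuinely different page.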

URL parameter issues

A page with a URL parameter that takes many different values causes issues because each value gets treated as a separate page: in essence, multiple versions of the same page are created through the different URLs. When a page has a URL parameter with many values, you can end up with dozens, thousands or even more versions of the same page indexed in a search engine. Setting URL parameters in Webmaster Tools lets you block specific URLs from being crawled and therefore indexed. It also lets you specify that a URL parameter doesn't change page content and only passes different values for tracking or similar purposes, so that all the variations are in essence one page. Once you have a good understanding of all the parameters on your site, it is possible to set everything up to more or less completely avoid duplicate pages caused by URL parameters.
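A quick way to measure how bad this is on your own site is to group every crawled or indexed URL by its path and count the variants. A rough sketch, assuming you can export a list of URLs from a crawl, your server logs or Webmaster Tools:

from collections import Counter
from urllib.parse import urlparse

# Hypothetical export of indexed URLs.
indexed_urls = [
    "https://ddmseo.com/catalogue?page=1&sort=A_Z",
    "https://ddmseo.com/catalogue?page=1&sort=Z_A",
    "https://ddmseo.com/catalogue?sort=A_Z&page=1",
    "https://ddmseo.com/about",
]

variants = Counter(urlparse(u).path for u in indexed_urls)
for path, count in variants.most_common():
    if count > 1:
        print(f"{path}: {count} URL variants")  # /catalogue: 3 URL variants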

What rules to set

What rules to set really depends on the type of URL parameter. As stated before, you may sometimes want Google to crawl every different parameter value, but most of the time you will probably want the opposite. With a good URL structure, parameters will generally only be used for things like re-ordering a page's content or passing values that don't change the content at all. A basic example of a page you would not want indexed at all, nor any of its parameters, is the cart page on an e-commerce site. Firstly, you would not want a cart page showing up in Google search results; it isn't a useful content page, so you should exclude it completely. The page will also probably use URL parameters to pass cart product ids identifying the products to display in the cart, and if the page and its parameters were crawled you could end up with a lot of useless pages indexed. A good practice is to exclude it by setting URL parameter rules that never crawl its parameters, as in the sketch below.
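Before touching the tool, it can help to write the decision down for every parameter you use. The table below is only a documentation sketch with invented parameter names; the crawl choices echo the options Webmaster Tools presents, such as "Let Googlebot decide", "Every URL" and "No URLs".

# Decision table agreed before editing anything in Webmaster Tools.
# Parameter names are illustrative; adapt them to your own site.
PARAMETER_RULES = [
    # (parameter,   changes content?, chosen setting,            reason)
    ("tracking_id", "No",             "one representative URL",  "passes analytics ids only"),
    ("sort",        "Yes",            "No URLs",                 "reorders the same products"),
    ("page",        "Yes",            "Every URL",               "each page lists different products"),
    ("product_id",  "Yes",            "No URLs",                 "cart page should never be indexed"),
]

for name, changes, setting, reason in PARAMETER_RULES:
    print(f"{name:12} changes content: {changes:3}  setting: {setting:25} ({reason})")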

When is setting URL parameters essential?

If you have a good understanding of URL parameters, you can set up new ones before the pages that use them go live, which ensures they are crawled, or not, right from the start. You will almost certainly need to learn how to set parameters in Google Webmaster Tools if you run an e-commerce site or anything displaying a catalogue of products or similar, because those sites typically use URL parameters for things like sorting options and navigating between catalogue pages. URL parameters also need to be configured if you're using AJAX to change the content within your page; you will almost certainly have to set those parameters yourself, so if you are building an AJAX application you want crawled effectively, work out the parameters and get them all set properly.

Set parameters across platforms

If you are also set up with Bing Webmaster Tools, go in there and configure URL parameters in the same way: add the parameters as necessary and set the same rules. And of course, if you don't know what you are doing, leave them alone, just as you would in Google Webmaster Tools.
