How indexing works

If you want to build a well optimised website, how search engines index your pages is extremely important, and there is much more to it than submitting as many pages as possible to Google and the other search engines. In fact, quite the opposite is true: you have to avoid getting large numbers of duplicate pages indexed. For those who are new to web design, indexing in a nutshell means a search engine crawling (reading) your page, storing it, and allowing it to appear in that engine's results for relevant searches and keywords.

It is very easy to get indexed by search engines like Google. It is really just a case of registering your site and submitting a sitemap, and you will usually be included in the Google index within a couple of days. But that is only the beginning; from there you have to climb through the rankings in order to show higher in search results.

A general rule to live by is to keep the number of indexed pages as small as possible by cutting out duplicate pages, pages with no useful information, and old pages that give you no benefit whatsoever. Everything that is indexed should be indexed for a reason that benefits you. The whole point of indexing a page is so that it shows in search results, so if there are pages you do not want to show in search results, do not index them.
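The usual way to keep a page out of the index is a robots meta tag (or an X-Robots-Tag header) containing "noindex". As a rough illustration only, here is a minimal Python 3 sketch that fetches a page and checks for such a tag; the URL is hypothetical, and the simple pattern match assumes the name attribute comes before the content attribute and ignores directives set in HTTP headers.

    # Minimal sketch (assumptions: Python 3, hypothetical URL, plain HTML page).
    # Fetches a page and reports whether a robots meta tag asks search engines
    # not to index it. This is a rough check, not a full HTML parser.
    import re
    import urllib.request

    def robots_directive(url):
        """Return the content of the robots meta tag, or None if absent."""
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
        match = re.search(
            r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
            html, re.IGNORECASE)
        return match.group(1) if match else None

    directive = robots_directive("https://www.example.com/duplicate-page.html")
    if directive and "noindex" in directive.lower():
        print("Page asks search engines not to index it:", directive)
    else:
        print("No noindex directive found in the page's meta tags.")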

Automatic indexing

Most search engine spiders, as they are known, automatically crawl through pages by following the links on them to other pages. For indexing, this means that when you add new pages to your site a search engine can find them without you having to submit them. If you have a large, well linked site, any new content you add will probably be indexed within a few hours, and it can rank well fairly quickly after indexing if your site is reasonably established. A well built site can pretty much index itself.

Search bots will, after a while, work out which pages are updated frequently and how often they should be crawled, but it is still good practice to submit an updated sitemap now and again if you have a very large site. Although you can give search bots significant direction over how they crawl your site, remember that they have their own way of crawling a website, and if your site is not built carefully (for example, returning proper 404 headers for links that lead to non-existent pages) you can give search bots a lot of problems. Indexing problems waste crawler resources, hurt your search rankings, and can take a very long time to sort out.
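One practical way to catch the 404 problem mentioned above is to check that removed pages really return a 404 status rather than a "soft 404" (a normal 200 page that just says "not found"). Below is a minimal Python 3 sketch of such a check; the URLs listed are hypothetical.

    # Minimal sketch (assumptions: Python 3, hypothetical URLs).
    # Confirms that pages which should no longer exist return a real 404
    # status code instead of a 200, which would mislead search bots.
    import urllib.error
    import urllib.request

    def status_code(url):
        """Return the HTTP status code for a URL."""
        try:
            with urllib.request.urlopen(url) as response:
                return response.getcode()
        except urllib.error.HTTPError as err:
            return err.code

    removed_pages = [
        "https://www.example.com/old-page.html",
        "https://www.example.com/deleted-category/",
    ]

    for url in removed_pages:
        code = status_code(url)
        marker = "OK" if code == 404 else "CHECK"
        print(marker, code, url)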

Manual indexing

Through tools like Google Webmaster Tools you can manually submit a page or pages, either through a sitemap or individually. You will find that Google will usually index the content almost immediately if your site is well built and ranks moderately high. Submitting a sitemap now and again is good practice and can guide search engine spiders better than leaving them to crawl automatically, because you are giving them an accurate blueprint of your entire site structure. Manually submitting a single URL can come in handy when you want a page to show straight away; for example, if you run a blog and put up a post about today's events, it may only be relevant today, so you want and need it to show as quickly as possible.
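A sitemap itself is just an XML file listing your URLs. As a minimal sketch, assuming Python 3 and a few hypothetical pages, the following writes a basic sitemap.xml that you could then submit through Google Webmaster Tools or reference from robots.txt.

    # Minimal sketch (assumptions: Python 3, hypothetical URLs).
    # Writes a basic XML sitemap for a handful of pages.
    import xml.etree.ElementTree as ET

    NAMESPACE = "http://www.sitemaps.org/schemas/sitemap/0.9"

    pages = [
        ("https://www.example.com/", "daily"),
        ("https://www.example.com/about.html", "monthly"),
        ("https://www.example.com/blog/todays-events.html", "daily"),
    ]

    urlset = ET.Element("urlset", xmlns=NAMESPACE)
    for loc, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "changefreq").text = changefreq

    ET.ElementTree(urlset).write("sitemap.xml",
                                 encoding="utf-8",
                                 xml_declaration=True)
    print("Wrote sitemap.xml with", len(pages), "URLs")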
