How long does Google's sitemap indexing take?


First of all, the Google Sitemaps program is by no means a free ticket to Google's search index. It helps Google discover new resources, and it leads Google's crawlers to updates faster than ever before, but it does not guarantee crawling or indexing. If Google refuses to crawl and/or index a Web page during the regular crawling process, a sitemap submission will not change that fully automated decision. Regardless of how Google learns of a URL, be it via sitemap submission or by following links on the Web, the same rules apply: spam filters, duplicate content handling, ignoring unpromoted sites, and so on. Googlebot does not eat unpopular spider food.


A new Web site is considered unpopular because it lacks link popularity. That is, if no other page on the Web donates a link to a new page, Google usually will not index it. Googlebot may fetch it every once in a while, but it will not be visible on the SERPs. This makes sound sense. First, if nobody except the site owner considers the site's content important enough to link to, why should it be important enough to debut on the Web via Google's search results? Second, because of the overwhelming amount of spam Google has to deal with, many (not all!) new sites will not be placed on the SERPs during a probation period, which can last six months, a year, or even longer. This probation period is also known as the sandbox, and the only escape is reputation. A Web site gains reputation when it receives well-formed, natural links from trusted resources.

Maintaining a Google Sitemap for a new Web site makes sense, but it will not result in indexing until Google has collected enough reputable votes for the new site. During the probation period, a Webmaster should publish fresh, unique content all day long and acquire valuable links.

Established sites must ensure they provide (at least navigational) links to all (new) pages listed in a Google Sitemap, because the chances of getting unlinked pages indexed are like a snowball's chance in hell. Given reasonable linkage and unique content, updated pages can make it into the index within a few hours, and new pages get indexed within two days at the latest.
If a Google Sitemap contains a bunch of URLs which are very similar, for example differing only in a query string variable's value, sitemap-based crawls omit a few URLs every now and then, but those usually get fetched later on, probably during the regular crawls. For a brand new Google Sitemap it may take a while until the Googlebot sisters donate the first full deep crawl. The duration of this initialization seems to depend on a site's popularity, reputation, size as per sitemap vs. number of indexed pages, and other factors (no detailed research done here). Once the sitemap(s) are downloaded frequently without resubmits, the submit machine runs smoothly. Adding URLs and changing the lastmod attribute then results in prompt fetches and instant indexing.
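To illustrate the lastmod mechanism mentioned above, here is a minimal sketch that generates a sitemaps.org-style sitemap with a lastmod date per URL, which is what tells Google a page has changed. The URL and date are placeholders, not values from this article.

```python
# Minimal sketch: build a sitemaps.org-protocol sitemap with a <lastmod>
# element per URL. The example URL and date below are placeholders.
import xml.etree.ElementTree as ET


def build_sitemap(urls):
    """urls: list of (loc, lastmod_iso_date) tuples -> sitemap XML string."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # Bump lastmod whenever the page changes, then let Google re-fetch
        # the sitemap; stale lastmod values defeat the purpose.
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")


sitemap_xml = build_sitemap([("http://www.example.com/", "2006-01-15")])
print(sitemap_xml)
```

Regenerating this file with fresh lastmod dates on every content update is what keeps the "submit machine" described above running smoothly.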
