
Site map


More information defining the field operations and other Sitemap options is available at http://www.sitemaps.org (Sitemaps.org: Google, Inc., Yahoo, Inc., and Microsoft Corporation).
See also robots.txt, which can be used to identify sitemaps on the server.
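
For example, a sitemap can be announced to crawlers by adding a Sitemap directive to robots.txt, as defined at sitemaps.org. A minimal sketch, with http://www.example.com standing in as a placeholder URL:

    Sitemap: http://www.example.com/sitemap.xml

The directive is independent of any User-agent section, so it can be placed anywhere in the robots.txt file.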


External links



  • Official website (http://www.sitemaps.org/) - Website jointly maintained by Google, Yahoo, and MSN for the XML sitemap format.

  • Sitemap generators (http://www.dmoz.org/Computers/Internet/Searching/Search_Engines/Sitemaps) at the Open Directory Project

  • Tools and tutorial (http://www.scriptol.com/seo/simple-map.html) - Helps to build a cross-systems sitemap generator.


Sitemaps


The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for
crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional
information about each URL: when it was last updated, how often it changes, and how important it is in relation to
other URLs in the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion
protocol and complement robots.txt, a URL exclusion protocol.
Sitemaps are particularly beneficial on websites where:


  • some areas of the website are not available through the browsable interface, or

  • webmasters use rich Ajax, Silverlight, or Flash content that is not normally processed by search engines.

The webmaster can generate a Sitemap containing all accessible URLs on the site and submit it to search engines. Since Google, Bing, Yahoo, and Ask use the same protocol, having a Sitemap lets the biggest search engines use the updated page information.
Sitemaps supplement, and do not replace, the existing crawl-based mechanisms that search engines already use to discover URLs. Using this protocol does not guarantee that web pages will be included in search indexes, nor does it influence the way that pages are ranked in search results.
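
As a rough sketch of the format described above, a Sitemap listing a single URL with the optional lastmod, changefreq, and priority fields could look like this (http://www.example.com is a placeholder):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- loc is the only required field for each URL -->
        <loc>http://www.example.com/</loc>
        <!-- optional hints for crawlers -->
        <lastmod>2007-05-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Only the loc field is required; lastmod, changefreq, and priority are optional hints for crawlers rather than directives.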


History


Google first introduced Sitemaps 0.84 [1] in June 2005 so web developers could publish lists of links from across
their sites. Google, MSN and Yahoo announced joint support for the Sitemaps protocol [2] in November 2006. The
schema version was changed to "Sitemap 0.90", but no other changes were made.
In April 2007, Ask.com and IBM announced support [3] for Sitemaps. Also, Google, Yahoo, and Microsoft announced
auto-discovery for sitemaps through robots.txt. In May 2007, the state governments of Arizona, California, Utah and
Virginia [4] announced they would use Sitemaps on their websites.
The Sitemaps protocol is based on ideas[5] from "Crawler-friendly Web Servers".[6]