Digital Marketing Handbook

Cloaking


Cloaking is a search engine optimization (SEO) technique in which the content presented to the search engine spider
is different from that presented to the user's browser. This is done by delivering content based on the IP address or
the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a
server-side script delivers a different version of the web page, one that contains content not present on the visible
page, or that is present but not searchable. The purpose of cloaking is sometimes to deceive search engines so they
display the page when it would not otherwise be displayed (black hat SEO). However, it can also be a functional
(though antiquated) technique for informing search engines of content they would not otherwise be able to locate
because it is embedded in non-textual containers such as video or certain Adobe Flash components.
As of 2006, better accessibility methods, such as progressive enhancement, are available, so proponents of those
methods no longer consider cloaking necessary. Cloaking is often used as a spamdexing technique to try to trick
search engines into giving the relevant site a higher ranking; it can also be used to trick search engine users into
visiting a site on the strength of its search engine description, only for the site to turn out to have substantially
different, or even pornographic, content. For this reason, major search engines consider deceptive cloaking to be a
violation of their guidelines, and they delist sites when deceptive cloaking is reported.[1][2][3][4]
Cloaking is a form of the doorway page technique.
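For illustration, the following minimal sketch (in Python, assuming the Flask framework) shows the kind of server-side
check described above: the request's User-Agent header is matched against a list of crawler signatures, and a different
page body is returned when a spider is suspected. The signature strings, route and page contents are placeholder
assumptions, not a description of any real site's implementation.

    # Illustrative sketch of User-Agent-based cloaking (assumes the Flask framework).
    # The crawler signatures and page bodies are placeholder assumptions.
    from flask import Flask, request

    app = Flask(__name__)

    # Substrings often found in crawler User-Agent headers (illustrative, not exhaustive).
    SPIDER_SIGNATURES = ("googlebot", "bingbot", "slurp", "duckduckbot")

    def is_spider(user_agent: str) -> bool:
        """Classify a request as a search engine spider from its User-Agent header."""
        ua = user_agent.lower()
        return any(sig in ua for sig in SPIDER_SIGNATURES)

    @app.route("/")
    def index():
        ua = request.headers.get("User-Agent", "")
        if is_spider(ua):
            # Keyword-rich version served only to crawlers.
            return "<html><body><h1>Spider-only content</h1></body></html>"
        # Version shown to ordinary visitors' browsers.
        return "<html><body><h1>Visitor content</h1></body></html>"

    if __name__ == "__main__":
        app.run()

In practice such a header check is usually combined with IP-based lookups against known crawler address ranges,
since the User-Agent header alone is trivial to spoof.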
A similar technique is also used on the Open Directory Project web directory. It differs in several ways from search
engine cloaking:


  • It is intended to fool human editors, rather than computer search engine spiders.

  • The decision to cloak or not is often based upon the HTTP referrer, the user agent or the visitor's IP address; but
    more advanced techniques can also be based upon analysis of the client's behaviour after a few page requests: the
    raw quantity of, the ordering of, and the latency between subsequent HTTP requests sent to a website's pages, plus
    the presence of a request for the robots.txt file, are some of the parameters in which search engine spiders differ
    heavily from natural user behaviour (a decision routine combining these signals is sketched after this list). The
    referrer gives the URL of the page on which a user clicked a link to reach the current page. Some cloakers will give
    the fake page to anyone who comes from a web directory website, since directory editors will usually examine sites
    by clicking on links that appear on a directory web page. Other cloakers give the fake page to everyone except those
    coming from a major search engine; this makes it harder to detect cloaking, while not costing them many visitors,
    since most people find websites by using a search engine.
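A hypothetical decision routine combining these signals might look like the Python sketch below; the referrer hosts,
IP prefixes, User-Agent tokens, thresholds and the visitor_log structure are all invented for illustration.

    # Hypothetical routine combining the signals above: the HTTP referrer, the
    # User-Agent, the client IP, and simple behavioural traces (a robots.txt
    # request and the recent request rate). Host lists, IP prefixes, thresholds
    # and the visitor_log structure are assumptions made for illustration.
    from urllib.parse import urlparse

    DIRECTORY_HOSTS = {"dmoz.org"}                # assumed web-directory referrers
    SEARCH_ENGINE_HOSTS = {"www.google.com"}      # assumed search-engine referrers
    SPIDER_IP_PREFIXES = ("66.249.",)             # assumed crawler address range
    SPIDER_UA_TOKENS = ("googlebot", "bingbot")   # assumed crawler User-Agent tokens

    def serve_decoy(ip, user_agent, referrer, visitor_log):
        """Return True to serve the decoy ("fake") page, False to serve the real content."""
        ref_host = urlparse(referrer).hostname if referrer else None

        # Directory editors usually arrive by clicking links on the directory itself.
        if ref_host in DIRECTORY_HOSTS:
            return True
        # Visitors arriving from a major search engine get the real content, which
        # makes the cloaking harder to detect without losing much traffic.
        if ref_host in SEARCH_ENGINE_HOSTS:
            return False
        # IP- and User-Agent-based matches against assumed crawler signatures.
        if ip.startswith(SPIDER_IP_PREFIXES) or any(t in user_agent.lower() for t in SPIDER_UA_TOKENS):
            return True
        # Behavioural heuristics: spiders request robots.txt and issue many requests
        # with little latency between them, unlike natural users.
        return bool(visitor_log.get("fetched_robots_txt")) and visitor_log.get("requests_per_minute", 0) > 60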


Black hat perspective


Increasingly, webmasters whose pages lack the natural popularity that compelling or rewarding content earns in the
search engines may be tempted to design pages solely for the search engines. This results in pages stuffed with
keywords and other factors that might be search engine "friendly" but that make the pages difficult for actual
visitors to consume. As such, black hat SEO practitioners consider cloaking to be an important technique that allows
webmasters to split their efforts and separately target the search engine spiders and human visitors.
In September 2007, Ralph Tegtmeier and Ed Purkiss coined the term "mosaic cloaking", whereby dynamic pages are
constructed as tiles of content and only portions of the page, its JavaScript, and its CSS are changed, simultaneously
decreasing the contrast between the cloaked page and the "friendly" page while increasing the capability for targeted
delivery of content to various spiders and human visitors.
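As a rough illustration of the mosaic idea, the following Python sketch assembles a page from independent tiles and
swaps only selected tiles for suspected spiders, so the crawler-facing and visitor-facing versions remain mostly
identical; the tile names and contents are invented for this example.

    # Rough sketch of the "mosaic" idea: the page is assembled from independent
    # tiles, and only selected tiles differ between the crawler-facing and the
    # visitor-facing renderings. Tile names and contents are invented for
    # illustration.
    BASE_TILES = {
        "header":  "<header>Site name</header>",
        "article": "<article>Shared article body</article>",
        "footer":  "<footer>Contact details</footer>",
    }

    # Only this tile is swapped, so the two versions of the page stay mostly identical.
    SPIDER_OVERRIDES = {
        "article": "<article>Keyword-dense variant of the article body</article>",
    }

    def render_page(for_spider: bool) -> str:
        tiles = dict(BASE_TILES)
        if for_spider:
            tiles.update(SPIDER_OVERRIDES)
        return "<html><body>{}</body></html>".format("".join(tiles.values()))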