What is cloaking?
Cloaking is a black hat search engine optimization (SEO) technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP address or the User-Agent HTTP header of the client requesting the page. When a requester is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page. The purpose of cloaking is to deceive search engines so they display the page when it would not otherwise be displayed.
According to Google, cloaking is "a website that returns altered webpages to search engines crawling the site." In other words, a human reading the site would see different content or information than the Googlebot or other search engine robot reading the site. Most of the time, cloaking is implemented in order to improve search engine ranking by misleading the search engine robot into thinking the content on the page is different than it really is.
In cloaking, two different sets of pages are developed for the website. The first set is the one that crawlers see; the other set is the one that visitors to the website see. Thus the search-engine-optimized pages, which might not be visually appealing, are seen only by the crawlers, while the visually appealing pages remain visible to website visitors.
There are two main ways of cloaking:
· IP delivery
· User-Agent delivery
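To make the mechanism concrete, here is a minimal sketch of User-Agent delivery in Python. It is illustrative only (cloaking is penalized by search engines, so this is for understanding, not use); the crawler signatures and page bodies are hypothetical examples, not real served content.

```python
# Illustrative sketch of User-Agent-based cloaking. The bot tokens below are
# assumed common crawler identifiers; real detection lists vary.
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "slurp")

def select_page(user_agent: str) -> str:
    """Return the keyword-stuffed page for crawlers, the visual page otherwise."""
    ua = user_agent.lower()
    if any(sig in ua for sig in CRAWLER_SIGNATURES):
        # Version served only to spiders: optimized for ranking, not looks.
        return "<html><!-- keyword-dense version served to spiders --></html>"
    # Version served to human visitors: visually appealing.
    return "<html><!-- visually rich version served to visitors --></html>"
```

IP delivery works the same way, except the branch is taken on the requester's IP address (matched against known crawler IP ranges) instead of the User-Agent header.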
Many webmasters see cloaking as a remedy for non-spiderable pages on their sites. These “problem” pages can include graphics-heavy pages, pages using Flash animations heavily, or dynamically generated pages. Unfortunately, since the search engines won’t reveal how many sites they punish for cloaking, there’s a common perception that it’s low-risk. Rhodes challenges this assumption.
One technique for determining if a web page is using cloaking is to view Google's cache of that web page. Unless the web page has turned off Google caching, the Google cache will show you how Googlebot sees the web page. If the web page has turned off Google caching, that in itself is a possible indicator that cloaking is in use.
It is also possible to set your own user agent to spoof the user agent of a well-known web robot, such as Googlebot. If the web page is using user agent cloaking, you will then see the cloaked page that is normally shown to Googlebot.
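This check can be sketched with Python's standard library: fetch the same URL twice, once with a normal browser User-Agent and once spoofing Googlebot, then compare the responses. The User-Agent strings below are illustrative, and the comparison is a crude heuristic (legitimate pages may also vary between requests, e.g. due to ads or timestamps).

```python
import urllib.request

# Illustrative User-Agent strings; Googlebot's real string may differ by version.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL with a specific User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(browser_html: str, bot_html: str) -> bool:
    """Crude heuristic: flag the page if the two versions differ at all."""
    return browser_html.strip() != bot_html.strip()

# Usage (requires network access):
#   cloaked = looks_cloaked(fetch(url, BROWSER_UA), fetch(url, GOOGLEBOT_UA))
```

Note that sophisticated cloakers key on IP address rather than User-Agent, so a spoofed User-Agent alone will not always reveal the cloaked page.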
Source: Free Articles from ArticlesFactory.com