Cloaking is a search engine optimization technique in which the content presented to the user differs from the content presented to search engine crawlers (i.e. spiders or bots) for better indexing. In other words, the web server is specially programmed to return different content to search engines than it returns to regular users, in an attempt to distort search engine rankings. As you might have guessed, cloaking is a deceptive practice and is considered black hat SEO. Search engines may permanently ban any sites or site authors that engage in cloaking from their indexes.
Here’s what Google has to say on Cloaking:
Cloaking is a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.
Some examples of cloaking include:
- Serving a page of HTML text to search engines, while showing a page of images or Flash to users
- Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor
Cloaking is implemented to improve search engine rankings by misleading the search engine robot into thinking the content on the page is different from what it really is. It is often used as a spamdexing technique (search engine spam, search engine poisoning) to trick search engines into giving the site a higher ranking.
How is Cloaking done?
Now, the question is: how is cloaking done? How is content displayed differently to users and search engines? On the internet, every device (computer, mobile, tablet, etc.) is identified by an IP address, and every request carries a User-Agent HTTP header. With cloaking, content is delivered based on the IP address or the User-Agent HTTP header of the visitor requesting the page. On an Apache server, cloaking is typically implemented by modifying a file called .htaccess, using Apache’s “mod_rewrite” module. Webmasters gather the IP addresses and User-Agent strings of search engine crawlers. If mod_rewrite detects that a request’s IP address or user agent belongs to a search engine, a server-side script delivers a different version of the web page. If the IP address doesn’t belong to any crawler, the visitor is treated as a regular user and receives the normal web page.
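To make the mechanism concrete (and only to illustrate it, not to recommend it), a user-agent based rule in .htaccess might look like the sketch below. The file name and bot pattern are illustrative assumptions, not a working recipe:

```apache
# Hypothetical sketch of user-agent cloaking via mod_rewrite.
# Shown only to illustrate the mechanism; doing this risks a ban.
RewriteEngine On
# If the requesting user agent looks like a known crawler...
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot) [NC]
# ...silently serve a crawler-only version of the page instead.
RewriteRule ^index\.html$ /crawler-version.html [L]
```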
White Hat Cloaking?
A very common question is: is there any such thing as white hat cloaking? Matt Cutts has said:
“White hat cloaking is a contradiction in terms at Google. We’ve never had to make an exception for “white hat” cloaking. If someone tells you that — that’s dangerous.”
He further said that if a site includes code that special-cases Googlebot by user agent or IP address, Google considers that cloaking and may take action against the site.
What are some common Cloaking Practices?
So what kinds of content do webmasters show differently to users and search engine crawlers? Search engines index a web page and rank it in the SERPs based on META tags, titles, and other on-page signals, so these are the elements webmasters manipulate.
- Invisible or Hidden text
With this approach, webmasters stuff extra keywords or duplicate content into a page in a way that is invisible to users. Typically, the webmaster renders this text in the same color as the page background, so it cannot be seen by the human eye but is still read by crawlers.
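As a minimal illustration (again, a practice to avoid, and the keywords here are made up), hidden text is usually nothing more than markup like this, with the text colored to match the background:

```html
<!-- Hypothetical hidden-text example: white keywords on a white
     background, invisible to visitors but indexed by crawlers. -->
<body style="background-color: #ffffff;">
  <p style="color: #ffffff;">cheap flights best cheap flights buy cheap flights</p>
</body>
```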
- Flash Based Websites
As you know, Flash is not recommended under SEO guidelines, yet many websites are built entirely in Flash or depend on it heavily and cannot avoid it. So rather than rebuilding the site and rewriting everything in plain HTML, their owners create content-rich HTML pages for search engine crawlers and serve the Flash pages to visitors.
- Cloaking in E-Mail
In e-mail distribution, cloaking is the act of masking the name and address of the sender so that the recipient does not know who sent the e-mail.
- HTML Rich Websites
Good SEO practice recommends keeping the “text to HTML ratio” as high as possible. In other words, your web page should contain more text (content) relative to its HTML tags. But if you write short articles or posts, your text-to-HTML ratio will be very low. To avoid redesigning the website, some folks choose cloaking to meet SEO guidelines.
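To make the ratio concrete, here is a rough Python sketch of how a text-to-HTML ratio check might work. The tag-stripping regex is deliberately naive (real SEO tools use proper HTML parsers), and there is no official "good" threshold:

```python
import re

def text_to_html_ratio(html: str) -> float:
    """Rough ratio of visible text length to total page length.

    Uses a naive regex to strip tags, so treat this as an
    illustration only, not a production-grade measurement.
    """
    text = re.sub(r"<[^>]+>", "", html)       # drop anything that looks like a tag
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return len(text) / len(html) if html else 0.0

page = "<html><body><h1>Hi</h1><p>Short post.</p></body></html>"
print(round(text_to_html_ratio(page), 2))  # a short post yields a low ratio
```

A page that is mostly markup scores low, which is exactly the situation that tempts some webmasters toward cloaking instead of simply writing more content.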
- Image Gallery Websites
Image gallery websites have more images than actual text content on their pages, and crawlers cannot read the text inside images. So webmasters think cloaking could help them get top placement for relevant keywords.
What other practices that present different content to users are allowed?
There are several accepted practices that present different content or information to different users. Below are some content delivery methods acceptable to Google that might once have been considered cloaking:
- Geo-Location
Take Google itself as an example. When you open Google.com from outside the US, you will see a different page than someone inside the US sees: Google redirects users based on their geo-location. For example, if a user in India opens google.com in his browser, it will redirect him to google.co.in. This is not considered cloaking; it is done to provide a better experience to users, and it is perfectly fine for any webmaster to do on their own website.
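A legitimate geo-based redirect can be sketched like this. The country-to-domain mapping and function name are made up for illustration; the point is that the choice depends only on location, never on whether the visitor is a bot:

```python
# Hypothetical sketch of geo-based redirection. This is acceptable
# because every visitor (crawler or human) from the same location
# gets the same content; nothing is special-cased for search bots.
COUNTRY_DOMAINS = {
    "IN": "https://google.co.in",
    "GB": "https://google.co.uk",
}

def redirect_target(country_code: str) -> str:
    """Return the regional domain for a visitor's country,
    falling back to the global .com site."""
    return COUNTRY_DOMAINS.get(country_code, "https://google.com")

print(redirect_target("IN"))  # https://google.co.in
print(redirect_target("US"))  # https://google.com
```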
- First Click Free
Users clicking through from Google’s search results to a listed page can read that page without having to pay or register with the hosting site. You let Googlebot through as if it were a registered member, and you also let through anyone arriving from Google’s search listings.
- URL Rewriting
This practice involves removing unnecessary parameters and otherwise tidying up URLs (sometimes called URL cloaking). The key point is that you are not changing the underlying content.
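For example, a rewrite that strips tracking parameters while leaving the underlying content untouched might look like this sketch (the parameter list is an assumption; the exact set is site-specific):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters commonly treated as noise; the exact list varies by site.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def clean_url(url: str) -> str:
    """Drop tracking parameters from a URL without changing the
    page it points to: rewriting the address, not the content."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(clean_url("https://example.com/post?id=7&utm_source=news"))
# https://example.com/post?id=7
```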
- Replacement of JavaScript
If a visitor’s browser (or a crawler) cannot execute JavaScript, serving equivalent fallback content (for example, inside noscript tags) is acceptable, as long as it matches what script-enabled users see.
If you still ask me whether you should use cloaking to fool crawlers and rank higher, my answer is a firm NO. It’s a trick, and search engines don’t like to be tricked.
Most search engines are against cloaking. Search engine algorithms are more sophisticated than you might think, and they are updated regularly to discourage cloaking and other techniques that create a negative user experience. Sooner or later they will find that your site is using this technique, and your site will be banned and dropped from their index (in the worst case, it can be blacklisted too). In fact, Google blacklisted the German car manufacturer BMW for breaching its guidelines in February 2006.
Also, Matt Cutts, ex-head of the webspam team at Google, says all cloaking is considered by Google to be ‘black hat’; Google sees none of it as ‘white hat,’ or harmless. If Google finds these or other illicit practices, it will often remove your site entirely from its index. As new methods of hacking, redirecting, cloaking, and other deceptive practices are developed, Googlebot is constantly updated to find them. This means that while you might enjoy higher rankings for a time, ultimately you’ll be caught and lose your rankings completely. So rather than focusing on cloaking or other deceptive practices, focus on writing quality, unique content that helps visitors 🙂
Do share your feedback and views in the comment section below 🙂