BlackHat vs. WhiteHat SEO

Joseph Jayson

What are the techniques used by these two SEO communities? How do you differentiate between them? Read to understand…

If you own a web site or a web business, you have probably been tempted by emails from companies that promise you top rankings in Google, Yahoo and other search engines for a small fee. They call themselves specialists in Search Engine Optimization.
Search Engine Optimization (SEO) is very big business. Almost 81% of all e-commerce revenue is said to result from a web search. It is a multi-billion-dollar industry, and companies such as eBay, Amazon, and HP spend millions on getting their sites optimized.
All this has resulted in a number of vendors who claim to be Search Engine Optimization experts. These so-called SEO gurus try several tricks of the trade, some ethical, some not so, some legal and some illegal.

The SEO community has divided these practitioners into BlackHat and WhiteHat webmasters.

But what do they do? How do they differ?

WhiteHat SEOs, as the name suggests, play the game by the rules. They may not give you immediate results. But as the good book says, the righteous path is often tough and covered in thorns, yet the end result is worth all the effort.
If you talk to a WhiteHat SEO, he or she will recommend having good content on your web site, making sure all pages are indexed, making sure there are no broken links, having proper headers, sub-headers and tags, and so on. They play by the rules set by the search engines, and they will never recommend anything that is considered unfair practice.

BlackHat SEOs suggest and implement tricks not recommended by the search engine vendors, but they deliver immediate results. The techniques are often unethical, and sometimes illegal. They are designed to lure search engines, and may make no sense to someone visiting your web page.

When using black hat SEO, the content on a page and the links both on and to a page are developed for search engines to see. Humans aren't supposed to see them at all, and various black hat techniques can be used to hide them. If humans do see them, their experience is degraded because, for example, the content may be machine-generated garbage. When using white hat techniques, the content and links are designed for both humans and search engines to see, which means they must be at least coherent. The mistake that many white hat practitioners make is to produce visible yet ugly pages that don't read well, and therefore don't convert well.

Since black hat practices are designed to be hidden from humans, the quality of the work produced by the practitioner is also hidden from humans. This causes a couple of problems:

  • Clients often can't tell what has been done on their behalf or in their name
  • The work is not open to peer review, so it would be difficult for professional organisations to assess its quality. This is likely to become more of a problem as the industry evolves.

White hat methods are visible to humans. Therefore, the quality of the work can be seen straight away, both by the client and by peers of the practitioner.

Black hat techniques, by contrast, will always increase the risk that a site will be deliberately removed from a search engine's index. Better black hat practitioners know this, will warn their clients of the danger and will have a strategy to cope with that danger. They may treat domains as disposable items. This is obviously not a suitable tactic if the work is being performed under the client's primary domain. Recently, problems were caused when Google dropped a number of domains from its index for using black hat techniques, when those domains were the primary domains of the clients of a particular SEO firm. These kinds of cases don't help to give the SEO industry a good reputation.

Some of the Black Hat Techniques

Spamdexing or search engine spamming is the practice of deliberately creating web pages which will be indexed by search engines in order to increase the chance of a website or page being placed close to the beginning of search engine results, or to influence the category to which the page is assigned. Many designers of web pages try to get a good ranking in search engines and design their pages accordingly. The word is a portmanteau of spamming and indexing.

  • Hidden or invisible text
    • Disguising keywords and phrases by making them the same (or almost the same) color as the background, using a tiny font size, or hiding them within the HTML code, such as "no frame" sections, ALT attributes and "no script" sections. This makes a page appear relevant to a web crawler in a way that makes it more likely to be found. A classic example is a web site selling crayons that hides text on a hot topic, such as music by The Beatles, in the same color as the background. The owner places hidden text appropriate for a fan page of a popular band on the page, hoping that the page will be listed as a fan site and receive many visits from music lovers. However, hidden text is not always spamdexing: it can also be used to enhance accessibility.
  • Keyword stuffing
    • This involves the insertion of hidden, random text on a webpage to raise the keyword density, the ratio of keywords to other words on the page. Older versions of indexing programs simply counted how often a keyword appeared, and used that to determine relevance. Most modern search engines can analyze a page for keyword stuffing and determine whether the frequency is above a "normal" level; a sketch of such a frequency check appears after this list.
  • Meta tag stuffing
    • Repeating keywords in the Meta tags, and using keywords that are unrelated to the site's content.
  • Gateway or doorway pages
    • Creating low-quality web pages that contain very little content but are instead stuffed with very similar keywords and phrases. They are designed to rank highly within the search results. A doorway page will generally have "click here to enter" in the middle of it.
  • Scraper sites
    • Scraper sites, also known as Made for AdSense sites, are created using various programs designed to 'scrape' search engine results pages or other sources of content and create 'content' for a website. These types of websites are generally full of advertising, or redirect the user to other sites.
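
To make the frequency check concrete, here is a minimal sketch in Python of the kind of analysis described under keyword stuffing above. The 20% threshold and the crude tokenizer are illustrative assumptions; real engines use far more sophisticated statistical models.

import re
from collections import Counter

def is_stuffed(text, threshold=0.2):
    # Flag a page when any single word exceeds the given share of the text.
    words = re.findall(r"[a-z]+", text.lower())
    if not words:
        return False
    top_count = Counter(words).most_common(1)[0][1]
    return top_count / len(words) > threshold

print(is_stuffed("buy cheap widgets " * 100))                         # True
print(is_stuffed("an ordinary page about several different topics"))  # False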

Link spam

Link spam takes advantage of link-based ranking algorithms, such as Google's PageRank algorithm, which gives a higher ranking to a website the more other highly-ranked websites link to it. These techniques also aim at influencing other link-based ranking techniques such as the HITS algorithm.
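
To illustrate the idea, here is a minimal sketch of a PageRank-style computation in Python. It is a toy power iteration over a hand-built link graph, not Google's production algorithm; the damping factor of 0.85 is the value quoted in the original PageRank paper.

def pagerank(links, damping=0.85, iterations=50):
    # links maps each page to the list of pages it links to.
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        # Every page keeps a small base score, modelling a surfer who
        # sometimes jumps to a random page instead of following links.
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, targets in links.items():
            if targets:
                # A page's score is shared evenly among its outgoing links.
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # A dangling page spreads its score over the whole graph.
                for other in pages:
                    new_rank[other] += damping * rank[page] / n
        rank = new_rank
    return rank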

  • Link farms
    • Involves creating tightly-knit communities of pages referencing each other, also known humorously as mutual admiration societies. A toy demonstration of the effect on ranking follows this list.

  • Hidden links
    • Putting links where visitors will not see them in order to increase link popularity.
  • Sybil attack
    • This is the forging of multiple identities for malicious intent, named after Sybil, the pseudonym of Shirley Ardell Mason, the subject of a famous case study in multiple personality disorder. A spammer may create multiple web sites at different domain names that all link to each other, such as fake blogs known as spam blogs.
  • Wiki spam
    • Using the open editability of wiki systems to place links from the wiki site to the spam site. Often, the subject of the spam site is totally unrelated to the page on the wiki where the link is added. While many powerful tools exist to filter or block email spam, there are very few tools for blocking wiki spam.
  • Spam in blogs
    • This is the placing or solicitation of links randomly on other sites, placing a desired keyword into the hyperlinked text of the inbound link. Guest books, forums, blogs and any site that accepts visitors' comments are particular targets, and are often victims of drive-by spamming, where automated software creates nonsense posts with links that are usually irrelevant and unwanted.
  • Spam blogs
    • A spam blog, by contrast, is a fake blog created exclusively with the intent of spamming. They are similar in nature to link farms.
  • Referer log spamming
    • When someone accesses a web page (the referee) by following a link from another web page (the referer), the browser passes the referer's address along with the request. Some websites keep a referer log that shows which pages link to the site. By having a robot access many sites with a chosen message or address given as the referer, the spammer gets that address to appear in the referer logs of all those sites. Since some search engines judge the importance of a site by the number of different sites linking to it, referer-log spam can raise the search engine rankings of the spammer's sites by making the referer logs of many sites appear to link to them.
  • Buying expired domains
    • Some link spammers monitor DNS records for domains that will expire soon, then buy them when they expire and replace the pages with links to their pages.
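
To see why link farms and Sybil attacks work, here is a toy demonstration that reuses the pagerank() function from the sketch above. The page names are hypothetical: "honest" earns one genuine inbound link, while "spam" is boosted by a ten-page farm whose members all link to it and to each other.

# Assumes the pagerank() function defined in the earlier sketch.
web = {
    "honest": [],
    "fan": ["honest"],  # one genuine inbound link
    "spam": [],
}
farm = ["farm%d" % i for i in range(10)]
for page in farm:
    # Every farm page links to the spam target and to every other farm page.
    web[page] = ["spam"] + [p for p in farm if p != page]

ranks = pagerank(web)
print(ranks["honest"])  # roughly 0.04 in this toy graph
print(ranks["spam"])    # roughly 0.09: the farm more than doubles the score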

Some of these techniques may be applied to create a Google bomb, that is, to cooperate with other users to boost the ranking of a particular page for a particular query.

Other types of spamdexing

  • Mirror websites
    • Hosting of multiple websites all with the same content but using different URLs. Some search engines give a higher rank to results where the keyword searched for appears in the URL.
  • URL redirections
    • Taking the user to another page without his or her intervention, e.g. using META refresh tags, CGI scripts, Java, JavaScript, server-side redirects or other server-side techniques.
  • Cloaking
    • Cloaking refers to any of several means of serving a different page to the search-engine spider than the one human users see. It can be an attempt to mislead search engines regarding the content of a particular web site. However, cloaking can also be used ethically, to increase the accessibility of a site to users with disabilities, or to provide human users with content that search engines aren't able to process or parse. It is also used to deliver content based on a user's location; Google itself uses IP delivery, a form of cloaking, to deliver results. A rough detection sketch follows.
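
As an illustration of how user-agent cloaking might be spotted, here is a sketch in Python using only the standard library. The URL and user-agent strings are examples, and the byte-for-byte comparison is naive: legitimately dynamic pages differ between any two fetches, and IP-based cloaking is invisible to this check.

import urllib.request

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
CRAWLER_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
              "+http://www.google.com/bot.html)")

def fetch(url, user_agent):
    # Request the page while presenting the given User-Agent string.
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request) as response:
        return response.read()

def looks_cloaked(url):
    # Crude check: does the server give a crawler a different page
    # than it gives a browser?
    return fetch(url, BROWSER_UA) != fetch(url, CRAWLER_UA)

print(looks_cloaked("http://example.com/"))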

A form of this is 'code swapping', that is: optimizing a page for top ranking, then swapping another page in its place once the top ranking is achieved.

These are some of the ideas used by BlackHat SEOs. While these ideas may appear cool and appealing, be warned: Google and the other search engine vendors are working overtime to ensure that their algorithms are not tricked by these techniques. And they do block sites that are known to be search engine spammers.







