Jan 5, 2010

SEO Best Practices – Avoid the Google Blacklist

Google is running a business.  Their business depends on returning the most relevant and useful search results to web surfers, so that those surfers will continue to use the Google search engine, click on Google ads, and generate revenue.  Because quality search results are so important, Google takes people who try to cheat the system very seriously, and it maintains well-documented rules that it uses to identify search engine “spammers.”

Break any of these rules and your rankings will suffer; even worse, your site can be blacklisted… removed entirely from Google.  These tactics may earn you a brief boost in search engine rankings, but Google always catches up. Trust me; you do not want to fight Google in order to get your website re-listed.

Rule 1 – No Duplicate Content or Mirror Sites

We all know that copying someone else’s text and images is wrong, and we’d never do it.  But Google also penalizes you when you copy your own content.  Say, for instance, you have a corporate website that has been around for 10 years, but now you want to build a focused mini-site for a featured product or service.  There is a temptation to post the same information that already appears on your existing site – product information, company info, etc.  However, Google views this as duplicate content, and since the existing site is older, your new site will be dismissed as a copy and won’t be indexed.

How to avoid this issue:

  • Re-word and re-organize text.  A good rule of thumb is to be at least 60% unique.
  • Add exclusive content to the newer website, so that a significant portion of the site is unique.
  • For content on the new site that is not relevant to the search engines – like charts, technical specs, testimonials, etc. – consider presenting it as images, so it won’t be read as duplicate text.
  • Be sure meta tags and title tags are also unique, as in the sketch following this list.
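
For instance (the product and company names here are invented), the mini-site’s pages should carry their own title and description tags rather than reusing the corporate site’s:

    <!-- Corporate site's product page -->
    <title>Acme Corp | Products | Widget Pro 3000</title>
    <meta name="description" content="Acme Corp manufactures the complete Widget Pro line of industrial widgets.">

    <!-- Mini-site's page for the same product, worded from scratch -->
    <title>Widget Pro 3000: Features, Specs and Pricing</title>
    <meta name="description" content="See what sets the Widget Pro 3000 apart, with full specifications, photos, and current pricing.">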

Rule 2 – No Keyword Stuffing

Google knows all of the “tricks” that black hat, or disreputable, SEO practitioners try to use in order to rank for keywords and key phrases.  One of the most common techniques is keyword stuffing, or adding the keyword or phrase you want to rank for over and over and over to your pages.  There are many keyword stuffing techniques (see below), and Google’s algorithms will identify and disregard them all.  At best, you will have wasted time (and money, if you paid someone else to do it for you) and your rankings will not improve in the long run; at worst, Google might identify you as a spammer and remove your site entirely from search results.  Warning: keyword stuffing may seem to work in the short term, but Google always catches up with you…

  • Gibberish Text – Have you ever visited a website that has introductory text on the Home Page that makes no sense? Instead you find yourself wading through paragraphs of keyword-stuffed text, typically with lots of links. Not only does this kind of text turn off your website visitors and reduce the usability of your site, but Google will also notice that your keyword density is unreasonably high.
  • Invisible, Low-Contrast or Very Small Text – Instead of filling your pages with gibberish text that might look bad to your visitors, you may be tempted to stuff some keywords onto the bottom of your pages in very small print, or in a color that matches (or nearly matches) the page’s background.  Google is on to this trick and can easily identify it.  In fact, you need to be careful that your well-intentioned web designs don’t have any elements, like captions or legitimate small print, that might flag Google accidentally.
  • Alt-Tag Stuffing – Another shady method is to overload your images’ alt and title attributes (the descriptive text attached to an image, which some browsers display on hover) with your keywords. Not only is this an easy method for Google to identify, but most search engines virtually ignore alt and title text these days because the technique has been so widely abused.  The same holds true for other HTML tags that visitors never actually see rendered on the page.  Most experts agree that meta keywords and meta descriptions are no longer factored into Google’s rankings.
  • Invisible Links – One final method that some “experts” like to use is to create invisible links in a page’s HTML code to keyword-relevant pages.  They do this in the hopes of building what’s called “link juice,” or link relevance. The theory is that search engine spiders will read these links and factor them into your page’s ranking, while website visitors won’t be able to see them.  Google doesn’t take kindly to this kind of cheating, though, and will penalize your page’s rank accordingly.  The snippet after this list shows what these stuffing tricks look like in the HTML.
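
To make these tricks easier to recognize (and to root out of your own pages), here is a contrived snippet; the file names and keywords are hypothetical, and it is shown only so you can spot the patterns, not use them:

    <!-- Tiny, background-colored text stuffed with keywords
         (this example assumes a white page background) -->
    <p style="font-size:2px; color:#ffffff;">cheap widgets best widgets buy widgets online</p>

    <!-- An alt attribute overloaded with keywords instead of describing the image -->
    <img src="widget.jpg" alt="widgets cheap widgets discount widgets widget sale">

    <!-- An "invisible" link: no anchor text for visitors to see,
         but spiders still read the href -->
    <a href="cheap-widgets.html"></a>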

The methods above can sometimes show up in your pages accidentally or without malice.  Knowing about these potential hazards to your page rank can help you avoid unexpected results.  Duplicate content is the most likely to get you blacklisted and booted from the Google index; however, all of them will be detrimental to your ranking in the long run.

The “tricks” below, however, are widely recognized as one-way tickets to the Google blacklist.  They are employed by disreputable, black hat SEO “experts” who make outlandish promises of #1 rankings.  They’ll use these techniques to give you a quick and dramatic boost.  Then, after you’ve already paid for their services, your website will suddenly disappear from Google.  Getting yourself re-listed will be an uphill battle and can sometimes take years.  So proceed with extreme caution if a service provider is using, or proposes to use, any of the following techniques:

Rule 3 – No Doorway Pages

Doorway pages are created specifically for spamming search engine indexes, or “spamdexing.”  These pages contain keyword-optimized text that attracts the search engines, but when a visitor clicks on the indexed link, they are quickly redirected to a different page entirely. Doorway pages are also referred to as bridge pages, jump pages, or portal pages.
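
For illustration only (the addresses and keywords are made up), a doorway page typically pairs keyword-stuffed copy for the spiders with an instant redirect for human visitors:

    <!-- A doorway page: search engines index the stuffed text below,
         but a zero-second meta refresh whisks real visitors away
         before they ever read it -->
    <html>
    <head>
      <title>cheap widgets discount widgets buy widgets online</title>
      <meta http-equiv="refresh" content="0; url=http://www.example.com/order.html">
    </head>
    <body>
      <p>Cheap widgets, discount widgets, best widgets, buy widgets online...</p>
    </body>
    </html>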

Rule 4 – No Link Farming

Link Farms, or Link Farm Exchanges, attempt to take advantage of the weight search engines place on in-bound links when determining a site’s rank.  Generally, the more in-bound links your site has, the more “popular” your site appears, and the higher you rank.  Google in particular identifies link farms easily, and penalizes the websites whose links appear on them, typically by banning those sites from the Google index.

Rule 5 – No Blog Comment Spam

This technique tries to exploit the same weight placed on in-bound links as the link farming above. However, instead of placing your site in a link farm, it adds links to your site by commenting on posts on relevant blogs.  In general, becoming active on relevant blogs and commenting thoughtfully is a respected and recommended technique for improving your search engine ranking.  However, using automated posts, making nonsense and/or unrelated comments, and abusing the blog’s community by posting only promotional comments are all considered spamming, and Google takes a dim view of them all.
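
It is worth knowing that most blogging platforms already undercut this tactic: WordPress, for one, adds a rel="nofollow" attribute to links in comments by default, which tells Google not to pass ranking credit through them.  A typical comment link renders like this:

    <!-- rel="nofollow" tells search engines not to count this link toward
         the target site's ranking, so spammy comments gain nothing -->
    <a href="http://www.example.com/" rel="nofollow">commenter's website</a>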

Want to Learn More About Search Engine Optimization?

Read about NetSource’s Search Engine Optimization services
Learn about Content Strategy and Copywriting
Check out our other Online Marketing services

6 thoughts on “SEO Best Practices – Avoid the Google Blacklist”

  1. Craig Stevenson

    I am new to this and just finished building my site.
    After I finished, a friend mentioned “blacklisting” and provided this link.
    I am now worried that over 75 pages will need to be re-written.
    I sell parts via the web and listed a page for each part.
    On each page I used basically the same keywords.
    Is that keyword stuffing?
    I certainly did not intend to do anything wrong and merely wanted to use what I was allowed.
    Do I need to enter different keywords on each page?

    1. It doesn’t sound like what you did is keyword stuffing precisely, although you probably won’t get the best results possible from simply duplicating keywords across your site.

      It was a great idea to have a separate page for each part. To capitalize the most on that strategy, you’ll want to optimize each of those pages for the specific keywords relevant to that part and that part only. In particular, your title tag, page headline (h1), and descriptive text should all contain words and phrases relevant to that part.
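
      As a rough sketch (the part name and numbers below are invented for illustration), each part’s page might be structured like this, with the title, headline, and copy all targeting that one part:

        <!-- A page for a single part; every keyword element is specific to it -->
        <html>
        <head>
          <title>AX-100 Carburetor Gasket | Your Parts Store</title>
          <meta name="description" content="Replacement AX-100 carburetor gasket, in stock and ready to ship.">
        </head>
        <body>
          <h1>AX-100 Carburetor Gasket</h1>
          <p>This AX-100 carburetor gasket fits 2001-2009 engines and includes...</p>
        </body>
        </html>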

      If you’d like to send a link to your website, we can take a quick look and send you specific recommendations.

      Thanks for the comment!

  2. Google keeps changing all the time. All we need to do is make our websites useful to humans; that will last forever. But it takes a lot of work, such as continually updating the information to attract more visitors. A robust method.
