Saturday, November 1, 2008

Getting Slammed By Google: It's Easy

It doesn’t take much to make a Googlebot angry. And it certainly doesn’t take much to confuse one of these script-bits that swarm the web like killer ants. While there is no absolute consensus on the negative ranking factors Google employs, there is general agreement on how to avoid getting slammed by the search engine that controls 46% of all web searches – the proverbial 800-pound gorilla.

So here are some common, agreed-upon slams Google can give you.

1. Lack of site access. If your host server is down, your site is down, and if your site is down, visitors can’t reach you. Google won’t send its users to an inaccessible site. To avoid trouble: (1) go with a reputable host and (2) don’t launch until the site is complete.
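One way to catch downtime before Google does is a simple availability check. This is a minimal sketch using only the Python standard library; the URL and timeout are illustrative, not a recommendation:

```python
# Minimal uptime-check sketch (hypothetical URL; stdlib only).
from urllib import request, error

def status_ok(code):
    """Treat 2xx and 3xx responses as 'site is reachable'."""
    return 200 <= code < 400

def site_is_up(url, timeout=5):
    """Return True if the URL answers with a healthy status code."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return status_ok(resp.status)
    except error.URLError:
        # DNS failure, refused connection, timeout, etc.
        return False

# site_is_up("https://www.example.com")
```

Run something like this on a schedule and you’ll know your host is flaky long before your rankings tell you.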

2. All text appears in a graphics format like gif, jpg or bmp. Spiders are as dumb as a box of rocks. They can’t read anything stored in a graphics format. To avoid the problem, keep critical text in HTML and give every graphic a descriptive alt attribute.
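In practice, that means the alt attribute on your img elements. A quick illustration (the filenames and copy here are made up):

```html
<!-- Bad: the headline is locked inside an image; spiders see nothing. -->
<img src="sale-banner.gif">

<!-- Better: real HTML text, plus a descriptive alt attribute. -->
<h1>Spring Clearance Sale</h1>
<img src="sale-banner.gif" alt="Spring clearance sale: 40% off all widgets">
```

The spider reads the h1 and the alt text; the human still sees the pretty banner. Everybody wins.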

3. You’re living in a bad neighborhood. You’re known by the company you keep on the web – in two ways. First, by your inbound and outbound links. Too many low-quality links give you a bad name.

Second, though this factor is contested: if you’re using a shared hosting account, your site sits on the same server as 1,264 other client sites. A server stuffed with porn and overseas drug-company sites doesn’t exactly make your site shine, does it?

4. Keyword stuffing is bad for site health. You can overstuff the HTML keywords meta tag, on-page text (keep keyword density at no more than 3%), other meta data, headers and headlines. Any overuse of keywords is a bad sign to spiders.
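Keyword density is just the share of your words that are the keyword, so it’s easy to check yourself. A rough sketch (the tokenizing regex and sample copy are my own, and real SEO tools count phrases and stems more carefully):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that are `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

copy = "Cheap widgets! Our widgets are the best widgets around."
density = keyword_density(copy, "widgets")  # 3 of 9 words = 0.333...
```

That sample copy clocks in at about 33% – eleven times the 3% guideline – which is exactly the kind of page that reads like spam to humans and spiders alike.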

5. Redirects raise suspicions. Not all redirects are bad. Some serve useful purposes. For example, when you submit an information form online, you might immediately be redirected to a confirmation page carrying a short “If this page doesn’t redirect you, click here” message.
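That benign kind of redirect is often just a meta refresh on the confirmation page. A hypothetical fragment (the delay and destination URL are placeholders):

```html
<!-- Confirmation page: waits 5 seconds, then sends the visitor on. -->
<meta http-equiv="refresh" content="5; url=/thank-you.html">
<p>Thanks! If this page doesn't redirect you,
   <a href="/thank-you.html">click here</a>.</p>
```

The redirect is visible, slow, and goes exactly where the link says it goes – nothing sneaky about it.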

That’s fine. This isn’t: a page designed for one purpose only, to appeal to spiders. It’s a perfectly optimized page buried deep within the site. Because it’s built for crawling, there are no graphics and no useful information – it’s simply optimized site code.

Because the page is highly optimized, it ranks highly on Google SERPs. That means it pulls in a great deal of organic traffic. However, as soon as a visitor clicks on the SERP link, s/he is immediately redirected to a different page designed for humans. It happens so fast the visitor won’t even notice. This kind of sneaky redirect – a form of cloaking – is bad form to spiders. It’s not nice to fool with Google.

There are lots of other missteps Googlebots look for: invisible text, too much cross-linking within one site, dynamic pages – the list goes on and on. The fact is, it’s easy to get slammed – and not even know why!

To learn more, visit Google’s Webmaster Central and get the information straight from the source.
