Saturday, January 24, 2009

Five Negative Ranking Factors From the Googlistas



So what do the cyber-pros identify as the most damaging negative ranking factors within Google’s current algorithm? They’re listed below, but note: take these Google negatives with a grain of salt.

 

It could all change tonight while you sleep.

 

Negative Ranking Factor #1: Googlebots can’t access your server.

If a site is down for more than 48 hours, which is often the case with low-rent web hosts halfway around the world, its Google ranking drops like a stone.

 

If your host server is down a lot, search engines don’t want to recommend the site to visitors, who will be met with an error message, or no response at all, because the site can’t be reached.

 

The solution? Find a host that delivers not only 99.9% uptime but also local tech support, backup emergency generators and multiple layers of server-side security. You’ll spend about $7.00 a month for quality shared hosting. Double that amount for quality dedicated hosting if cross-server attacks are a concern. Don’t let a few bucks a month keep your site from higher rankings; skimping there just isn’t cost-effective.
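
Want proof of how reliable your host really is? You don’t need a monitoring service for a rough read. Here’s a minimal sketch in Python; the example.com address and the five-minute interval are placeholders to swap for your own:

```python
import time
import urllib.request
import urllib.error

SITE = "http://www.example.com/"  # placeholder; use your own site's URL
INTERVAL = 300                    # seconds between checks; tune to taste

def check(url: str) -> str:
    """Return a short status string for one availability check."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return f"up (HTTP {resp.status})"
    except urllib.error.HTTPError as err:
        return f"server error (HTTP {err.code})"
    except urllib.error.URLError as err:
        return f"unreachable ({err.reason})"

if __name__ == "__main__":
    while True:  # poll forever; stop with Ctrl-C
        print(time.strftime("%Y-%m-%d %H:%M:%S"), check(SITE))
        time.sleep(INTERVAL)
```

Let it run for a few days and the log tells the story.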

 

Note: Server availability as a ranking factor is one of the most contested topics among SEO professionals, who spend much of their time trying to out-think Googlebots, so even the experts can’t agree on this one.

 

Negative Ranking Factor #2: Duplicate or Similar Content.

Most experts do agree on this one.

 

Repetitious content is a stone-cold killer. Now, that doesn’t mean you can’t pick up a useful piece of syndicated content of interest to your readers. The warning here has to do with your site’s own text. A programmer can always upload a syndicated article, but body text should change from page to page, providing a more useful visitor experience.

 

Of course, duplicate content can be tagged with a “noindex” designation, but too many of these “do not enter” signs is also a negative ranking factor. Bots want to be able to crawl pages, and when you keep them off of critical content pages, it’ll have a negative impact on your ranking in Google’s SERPs.
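
If you’d like a quick, do-it-yourself way to spot repeated body text before the bots do, hashing each page’s stripped-down text is enough for a first pass. A rough sketch, assuming your site’s pages are saved locally under a hypothetical site/ folder:

```python
import hashlib
import pathlib
import re
from collections import defaultdict

def body_fingerprint(html: str) -> str:
    """Strip tags, collapse whitespace, then hash what's left."""
    text = re.sub(r"<[^>]+>", " ", html)             # crude tag removal
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.md5(text.encode("utf-8")).hexdigest()

pages = defaultdict(list)
for path in pathlib.Path("site").glob("**/*.html"):  # hypothetical folder
    pages[body_fingerprint(path.read_text(errors="ignore"))].append(path)

for digest, paths in pages.items():
    if len(paths) > 1:  # identical body text found on more than one page
        print("duplicate body text:", ", ".join(str(p) for p in paths))
```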

 

Negative Ranking Factor #3: Links to low-quality sites.

SEO survey contributor Lucas Ng sums it up nicely: “Linking out to a low quality neighborhood flags you as a resident of the same neighborhood.”

 

It’s not just about links, and plenty of them; it’s about the quality of the links on a site. So link out to sites in good neighborhoods. On the web, Googlebots know you by the company you keep.
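
Auditing the company you keep starts with simply listing your outbound links. Here’s a minimal sketch using Python’s standard-library HTML parser; the example.com domain and the index.html filename are stand-ins for your own:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinks(HTMLParser):
    """Collect href targets that point off-site."""

    def __init__(self, own_domain: str):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        if host and self.own_domain not in host:  # crude same-site test
            self.outbound.append(href)

# Stand-in domain and file; substitute your own.
parser = OutboundLinks("example.com")
with open("index.html", encoding="utf-8", errors="ignore") as f:
    parser.feed(f.read())
for link in parser.outbound:
    print(link)  # review each neighborhood by hand
```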

 

Negative Ranking Factor #4: Link Schemes and Link Selling.

Google’s algorithm employs probability modeling to identify bought-and-paid-for links, which doesn’t always yield an accurate view of a site’s actual linking activity. Even so, Googlebots act on the assumptions programmed into the algorithm.

 

A site with a broad menu of links to diverse, unrelated sites won’t fare well come spidering time. These link farms are easy for bots to spot. The key to avoiding being mis-indexed by Googlebots is to avoid excessive links, to link to higher-quality, more-visited sites, and to never buy or sell links. Ignoring that advice could mean another web site fatality.

 

Negative Ranking Factor #5: Duplicate Title/Meta Tags.

Search engine algorithms employ numerous filters to identify everything from questionable links to duplicate content that appears on numerous site pages. The same is true of a site’s HTML code: too many duplicate title tags and too much duplicate meta data can hurt you.

 

Survey participant Aaron Wall stated, “If a site does not have much content and has excessive duplication, it not only suppresses rankings, but it may also get many pages thrown in the supplemental results.”

 

Bots read code, and if the same title tag shows up on page after page, if title tags don’t match page text, or if meta data is cut and pasted into every site page, these crawlers take offense, according to some experts.

 

However, there’s a whole other school of thought here. Many SEO pros and site designers believe just the opposite: that unique title tags on each page create numerous entry points to a site, and because each page is indexed separately, the site maintains a larger presence on SERPs.

 

The key appears to be the duplication itself. If the content doesn’t change on a particular page, that page doesn’t call for yet another copied title tag. However, when topics and functions do change from page to page within a site, distinct title tags do help spiders identify each page’s purpose and do provide greater site access to potential visitors.
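
Checking your own pages for copy-and-pasted titles takes only a few lines. Another rough sketch, again assuming pages saved locally under a hypothetical site/ folder; the same idea extends to meta description tags:

```python
import pathlib
import re
from collections import defaultdict

titles = defaultdict(list)
for path in pathlib.Path("site").glob("**/*.html"):  # hypothetical folder
    match = re.search(r"<title>(.*?)</title>",
                      path.read_text(errors="ignore"),
                      re.IGNORECASE | re.DOTALL)
    if match:
        titles[match.group(1).strip()].append(path)

for title, paths in titles.items():
    if len(paths) > 1:  # the same <title> reused across pages
        print(f"“{title}” appears on {len(paths)} pages")
```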

 

What NOT To Do With This Information

The wheels are spinning, aren’t they?

 

You and a million other site owners are weighing negative ranking factors and the impact these factors have on your SERP position on Google.

 

Forget it. Let it go. The time you’d spend trying to reverse engineer your site to appeal to the perceptions of a collection of 31 SEO professionals would be better spent on search engine marketing: promoting to humans.

 

Oh, sure, you can migrate your site to a host with much-improved uptime, and in this case you should, regardless of what Googlebots like and dislike. Migrate not because bots will like you better, but because your customers will like you better when you’re there when they need you.

 

Same with cheesy links. Disconnect from garbage sites, link farms and any site that ranks lower than yours in PageRank (PR). That’ll take five minutes of your time, and it’s something you should do. Again, forget the bots; do it for the site visitors seeking to further their web searches through links on your site. Helping out site visitors is just good business.

 

But if you’ve got duplicate content on site, perhaps as RSS feeds, content syndication or hosted content, it seems counter-productive to remove this useful information from the site. Bots recognize these ephemeral links and their time-saving value to visitors, who get good content all in one place, even if it does appear on a few other sites.

 

There are a couple of lessons to be learned here. Lesson #1: Even really smart people who study the activities of Googlebots under controlled conditions cannot, ultimately, agree on which negative ranking factors are programmed into that passing Googlebot.

 

Lesson #2 (and the most important lesson du jour): Don’t try to outwit a Googlebot. Don’t rebuild your site to mitigate rumored negative ranking factors. Take the obvious steps, like going with a reliable host and cutting links to unattractive sites, but don’t spend time reverse engineering your site based on the opinions of SEO pros.

 

Spend your time promoting your site to humans. Do it ethically. And over time, your site will receive an improved rank on Google’s SERPs. Guaranteed.

 

Guaranteed? You betcha. “Length of time a site has been up” is one of the positive ranking factors. The longer you remain hooked into the web, the higher your Google ranking.

 

It’s just a matter of time. 
