Thursday, September 24, 2009

Make On-Site Links Work For You

Don't let 'em leave. Use on-site links to keep visitors around longer.

Link Your Way to Site Success

No, Not Those Kinds of Links, The Other Kind

You can’t swing a comatose web head without running into the stalest advice in all of SEO: get quality inbound links to improve site ranking with search engines. Yawn! What else ya got?

Okay, inbound links work in a lot of ways – creating credibility, trust and the chance to be designated an authority site. So, yeah, inbound non-reciprocal links help, and there isn’t an SEO pro or newbie who doesn’t know it.

What you don’t hear a lot about is on-page links – the links found on every page of a web site that connect visitors to other pages within that site.

Redirects and On-Page Links

Redirects are not held in high regard by search engines. The long-held impression that redirects are a black-hat tactic persists, and there are still hackers trying to hijack sites using invisible, on-page redirects. As soon as a visitor accesses the hacked site from the link provided in the SERPs, s/he is redirected to another page or even another web site entirely. Redirects, such as a 301 (permanent redirect) or 302 (temporary), are cause for suspicion and can mean instant death for a web site.

There are plenty of legitimate uses for redirects. A blog, for instance, may send out a confirmation of post receipt before redirecting the visitor back to the blog and the post itself. This kind of redirect is beneficial to visitors, providing useful and reassuring confirmation, so not all redirects are bad.
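As a sketch of that legitimate pattern, a confirmation page can thank the reader and then send them back to the post with a plain meta refresh – visible, delayed, and obviously benign to a bot (the file name and URL here are hypothetical):

```html
<!-- thanks.html: confirm the comment was received, then return the
     reader to the post after a five-second pause (URL hypothetical) -->
<head>
  <meta http-equiv="refresh" content="5; url=/blog/my-post.html">
  <title>Comment received</title>
</head>
<body>
  <p>Thanks! Your comment was received. Returning you to the post&hellip;</p>
</body>
```

Because the confirmation text is right there on the page, nothing about this redirect looks hidden or deceptive.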

Here’s the deal: if the redirect has a valid purpose – one that an SE bot understands – redirects aren’t a problem. In fact, on-page links are nothing more than redirects and your body text should use them to help visitors navigate.

Embed Text Links Deep In The Site

It’s easy to optimize a site page for bots. The SEO industry still debates some search engine weighting factors, but many enjoy almost universal acceptance among SEO pros.

That’s why some site owners optimize a page for bots and bots only. 5% keyword density, perfect title and alt tags, perfectly balanced informational content – the kind of content bots like to see. This page is then buried deep in the site with lots of links to more user-friendly pages within the site.

The deep site page, perfectly optimized for bots, won’t necessarily be attractive to humans: keyword-dense text, no graphics (bots don’t read graphics files) and a perfect title tag. This is a high-ranking page according to metrics analysis because the content is informational rather than sales copy and, again, it’s bot-o-mized in the page’s HTML.

Once the visitor reaches this highly optimized page, he or she is automatically redirected to a page that’s designed to appeal to humans rather than bots. These automatic redirects are usually permanent (301s) and invite bot scrutiny and even a page penalty.

Use On-Page Links to Avoid the Appearance of Impropriety

Use links to redirect visitors. Links are, in fact, redirects, and they can be used to help visitors find the information, goods or services they need, and to help index a site faster and with greater accuracy. If you do it right, you can get all desired pages indexed on the first pass by a Googlebot. For human visitors, it’s all about on-site link placement that strikes a chord or hits a nerve and prompts them to take action.

Example: A fire extinguisher site publishes an informational piece on home safety, providing good, quality advice. Quality, high-ranking content. This page is one of the high-ranking, deeply placed pages that draw visitors in. Now, instead of using automatic redirects, the savvy site designer will use contextual links to trigger a response from the site visitor.

Within the article, of course, is the recommendation to keep a fire extinguisher in the house. (Completely off the subject, you should have a fire extinguisher on hand. It saved my house.)

Anyway, the article provides a link in context to (1) generate a response and (2) compel action on that response. So, to move the visitor off the highly-ranked page, a short paragraph, built around the keywords the visitor entered to reach that page, is used. For instance:

“Fire danger in the modern home is a reality, putting you and your family at risk every day. A small, properly-charged fire extinguisher can save your home and the lives of your loved ones.”

This deeply-embedded link then takes the reader directly to the products page for home fire extinguishers. The highly-ranked informational content draws attention from bots. The links draw the attention and direct the flow of visitor traffic once the site has been accessed, leading visitors to the precise page they need.
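In markup, that contextual paragraph is nothing exotic – just an ordinary anchor wrapped around the trigger phrase, pointing deeper into the site (the path here is hypothetical):

```html
<p>Fire danger in the modern home is a reality, putting you and your
   family at risk every day. A small, properly-charged
   <a href="/products/home-fire-extinguishers.html">fire extinguisher</a>
   can save your home and the lives of your loved ones.</p>
```

To a human it reads as a natural next step; to a spider it’s one more crawlable path into the product pages.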

Use On-Page Links at All Site Access Points

A visitor can reach a well-connected site any number of ways – via directory, search engine results page or an inbound link from another site.

Obviously, the more access (doorways) to a site the better. However, how a visitor got there is indicative of what the prospect is searching for. If the prospect reached the site through the Directory of Insurance Brokers, that visitor may or may not land on a home page depending on the query words used in the directory search.

“Low-cost high risk car insurance,” as the query phrase, displays a link with that exact headline. The searcher clicks the top-ranked link, reads a short “Let us show you how to save $$$ on high-risk pool insureds” blurb, and a click takes the visitor to the car insurance zone page, where additional links continue to direct the pathway taken by the visitor, e.g.:

“High risk insurance will cost more, depending on just how complicated your driving history is” – a link that’s especially valuable if you’re a local broker looking for local business.

Directions for Humans, Street Signs for Bots

These on-page links direct visitors to precisely the information they’re looking for. These links also provide pathways for search engine spiders that are trained (programmed) to follow links.
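To see how mechanical that link-following is, here’s a minimal sketch – plain Python standard library, with an invented page snippet and hypothetical URLs – of the first thing a spider does on any page: collect the same-site links it will crawl next.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class InternalLinkAuditor(HTMLParser):
    """Collect same-host links from a page, the way a spider discovers paths."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.base_host = urlparse(base_url).netloc
        self.internal = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative hrefs against the page URL, keep same-host links only
        absolute = urljoin(self.base_url, href)
        if urlparse(absolute).netloc == self.base_host:
            self.internal.append(absolute)


page = """
<p>Read our <a href="/home-safety.html">home safety guide</a> or
browse <a href="/products/extinguishers.html">fire extinguishers</a>.
External: <a href="https://example.org/other">elsewhere</a>.</p>
"""

auditor = InternalLinkAuditor("https://www.example.com/articles/fire-safety.html")
auditor.feed(page)
print(auditor.internal)
```

Every on-page link you place is one more entry in that list – one more pathway the spider (and the visitor) can follow deeper into the site.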

Links direct spiders to the far corners of a site, deep into the corpus. However, it’s just as important to make it clear which pages are off limits to Googlebots and other spidering programs.

Keeping Spiders Out

Spiders don’t just crawl. They follow the mathematics within the algorithm that directs their movements. They follow commands as well.

You can designate certain pages as off limits to keep spiders out of your private business, or to keep bots from indexing pages that are still in beta and not quite optimized for indexing.
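The standard per-page control is a robots meta tag in the page’s head; compliant spiders that see it will skip indexing the page and won’t follow its links (where you put it – say, a beta page still being optimized – is up to you):

```html
<!-- In the <head> of any page you want kept out of the index -->
<meta name="robots" content="noindex, nofollow">
```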

Or, if you want to close off large sections of a site to spiders, create a robots.txt file that identifies the pages of a site that are NOT to be indexed or accessed by spiders. The fact is, Googlebots are unleashed on any site visited by a user with a Google toolbar, so there’s nothing you can do to keep bots from crashing the party.

A robots.txt file, placed in the site’s root directory, will make it clear to spiders what they can and cannot see. It’s the safest way to keep the relentless, “Terminator”-like Googlebot from reforming from liquid into a dangerous cyborg once again. And believe it, bots “…will be back.”
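A robots.txt file is just plain text: each block names a user agent and lists the paths it may not crawl. A short sketch (all paths here are hypothetical):

```text
# robots.txt, served from the site root, e.g. example.com/robots.txt

# Rules for all well-behaved crawlers
User-agent: *
Disallow: /beta/
Disallow: /private/

# Stricter rules for one specific crawler
User-agent: Googlebot
Disallow: /print-versions/
```

Remember it’s a request, not a lock – polite spiders honor it, but it hides nothing from anyone who types the URL.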

Each page of a site should be analyzed from both the bot and the human perspective. Use embedded links instead of automatic redirects to avoid raising the suspicions of bots who think redirects are “icky.”

And place these on-page, intra-site links for maximum effect: at the point where a user need is identified, at all entrance points to the site, on the order form and on the contact page.

On-site links are invaluable for helping visitors and helping bots. And together, that’s very helpful to the success of your site.
