Showing posts with label Google.

Wednesday, September 28, 2011

Take Care Of Those Who Hold The Checkbook


Managing Customer Care:
“It’s easier to keep a client than find a new one.”

Yeah, it’s an old cliché, but it’s a cliché because it’s true. The key to long-term site success is an expanding customer/client base – repeat buyers of your goods or services.

Chances are, she won't be back
Keeping the customer satisfied, especially for web-based businesses, isn’t a walk in the park but there are things you, as webmaster, can do to manage client care, keeping the customer satisfied and coming back for more.

1. Maintain an accurate order tracking system.  If you use a delivery service like FedEx or UPS, you’ll get tracking software with your account. But, if you’re trucking 37 ceramic figurines to the post office every day, you’ll need an order tracking system – preferably one that can identify “downstream” problems like: “Hey, you’re going to run out of hula girl bobble heads next Thursday. Time to reorder.”

2. Stay involved. You may be using a drop shipper to manage inventory storage, shipping and handling, and it’s not always easy during the rush of the day to check tracking data – even if you’re shipping out of a spare room. Track all problem orders yourself.

3. Provide updates to the buyer. Send an auto-responder as soon as a problem is identified, with an opt-out link to cancel the sale. You may lose that one order, but your straightforwardness will make a positive impression.

Under the FTC’s Mail or Telephone Order Rule, orders must generally be shipped within 30 days; miss that deadline and the buyer must be offered the chance to cancel. Don’t ship a delayed order without renewed buyer approval.

4. Provide US-based customer support 24/7. In this global marketplace, someone is always buying, and someone always has a question. Also, empower telephone reps to accept returns when the customer has a receipt. This saves on call-backs and significantly lowers buyers’ stress levels.

5. Use dynamic pages designed specifically for each visitor.  Best example? Amazon. My home page is different from your home page based on our past buying histories. All of this data is stored in Amazon’s database and when I log on, I’m bombarded with recommendations based on items purchased five years ago.

But Amazon still calls me by my first name. That’s nice.

6. Last key point. Overdeliver. If prudent, drop a personalized email or even make a telephone call. People are really pleasant when the company CEO calls and promises satisfaction.
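A quick illustration of the reorder alert from point 1 – a back-of-the-envelope sketch in Python, with made-up item names, stock counts and daily sales rates:

```python
# Hypothetical inventory: units on hand and average units sold per day.
inventory = {
    'hula girl bobble head': {'on_hand': 42, 'daily_sales': 7},
    'ceramic figurine':      {'on_hand': 200, 'daily_sales': 5},
}

def days_of_stock(item):
    """Days until the item runs out at the current sales rate."""
    data = inventory[item]
    return data['on_hand'] / data['daily_sales']

def reorder_alerts(horizon_days=7):
    """Items projected to run out within the given horizon – time to reorder."""
    return [item for item in inventory if days_of_stock(item) <= horizon_days]

print(reorder_alerts())
```

With these numbers, the bobble heads (six days of stock left) trip the one-week alert while the figurines (forty days) do not.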

The whole point of quality customer care is to create word of mouth (WOM) viral marketing. Treat your customers or clients right, and viral WOM will do the rest in growing that client base bigger and bigger.


Webwordslinger
editor@webwordslinger.com

Wednesday, May 11, 2011

What You Don't Know About Google Can Hurt You

Making the Most of Google's SE

The world of e-commerce depends on Google. Even though there are more than 4,000 search engines available, including biggies like Yahoo, AltaVista and Ask Jeeves, the name people know is Google. The word itself has become part of everyday speech, as in, "Let me Google that" (a verb) or "I want the Google on our competition" (the complete picture). So, as a site owner or designer, it pays to get the Google on Google - more specifically, how potential visitors to your site might use this mighty SE with over 100 billion pages currently in its database.

In addition, Google offers a lot of search options that will enable site owners to see their sites the way the Googlebot sees them. Something as simple as revising your site's title tag can make a significant difference in how Google's SE picks you up, views your site, ranks it and, subsequently, places your URL on its SERPs - all in one-tenth of a second.

Most users simply go to the Google site, enter their query (in the form of keywords), hit the enter key and wait to see what pops up. This is a default search, and it delivers every site in which the entered keywords appear. In other words, the user gets pages and pages of search results that are only marginally related to his or her search topic.

By using a few common symbols, more sophisticated users can narrow their searches, isolating the sites that are truly relevant. Adding a minus sign (-) in front of a keyword tells Google's SE to exclude results containing that keyword. So, let's say you're looking for a recipe for apple pie. The last thing you want is 118 useless SERPs about Apple, the company. You might enter: 'apple pie -computer' to eliminate pages of information about Steve Jobs. The tilde (~) tells the SE to search for the entered keyword and its synonyms. Put quotation marks around a phrase and only pages containing that exact phrase will be delivered to the user's screen.
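For the curious, these refinements are easy to assemble programmatically. Here's a quick Python sketch - not any official Google API, just a string-building helper with a made-up name:

```python
def build_query(terms, exclude=(), synonyms=(), exact=()):
    """Assemble a Google-style query string from basic refinement rules.

    terms    -- plain keywords, searched as-is
    exclude  -- keywords to suppress (prefixed with '-')
    synonyms -- keywords to expand with synonyms (prefixed with '~')
    exact    -- phrases that must match exactly (wrapped in quotes)
    """
    parts = list(terms)
    parts += ['-' + t for t in exclude]
    parts += ['~' + t for t in synonyms]
    parts += ['"{}"'.format(p) for p in exact]
    return ' '.join(parts)

print(build_query(['apple', 'pie'], exclude=['computer']))
```

The apple pie example above comes out as 'apple pie -computer', ready to paste into the search box.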

All of these basic search techniques improve the quality of search results for users, making users happy and Google shareholders even happier. But then there are Google Search operators - in fact, specified keywords that the SE recognizes as directions rather than words to be searched. And some of these operators will be extremely useful to the owners of e-commerce sites by enabling them to optimize their sites while conducting e-espionage on competitor sites - and it's all free.

Here's a for instance: want to find out how many inbound links are pointing to your site? Try this: link:www.yoursitename.com. Obviously, type your site's name where it says 'yoursitename'. You'll get SERPs listing URLs that point to your site. And as most site owners know, quality, non-reciprocal links are like gold when it comes to improving your PageRank. You can also identify links that aren't helping your site. In other words, this Google tool lets you monitor inbound links - an ability that's grown in importance now that Google weighs inbound links heavily in its ranking algorithm.

Want to know what the competition is doing? It's simple enough. All you have to do is enter: related:www.yoursitename.com and sites that are, in some way, related to yours, will appear. Not only is this a good means of tracking competitor activities, it's also a great way to find sites that might be interested in some link swapping - always a good thing, especially for the owner of a small or brand new site.    

Now let's turn to SEO and how Google's search services can help determine whether your site is, indeed, fully optimized. Check this out: use the search engine to see how your site ranks when the SE is instructed to find keywords only in page titles. Enter: allintitle: fruit baskets (of course substituting your own keywords for the example keywords, fruit baskets, unless you're in the fruit basket business). If your site's ranking tanks on this search, a bit of tweaking of your title tag just might be in order. If your title tag reads 'Rosie's Little Bit of Home' - no mention of fruit baskets - simply adding the words 'fruit baskets' to your title tag may well increase your visitor traffic.

You can also check out your site's level of optimization by conducting the following Google searches:

1. To have Google search for keywords - your keywords - in the text of the site, type in:
            allintext: fruit baskets

This search will identify if keyword density and placement are sufficient to make the Google SE sit up and take notice. 

2. To ask Google to search for your keywords in URLs only, type in:
            allinurl: fruit baskets

A search of URLs will reveal sites similar to yours (since these sites have the same keywords as part of their address, i.e. fruitbaskets.com, yourfruitbasket.com and, of course, the ever-popular fruitbaskets'r'us.com). If these sites are ranking higher than your site, check out what the competition is doing better than you. E-espionage is legal, so do a little spying on the competition and learn from them.

3. How about a search of the anchor text of all sites that mention your keywords? Type in:
            allinanchor: fruit baskets

This will indicate sites that mention fruit baskets somewhere in their anchor text, which might also indicate sites interested in reciprocal links.

4. When Google discovers your site (or you submit your URL for spidering), the SE takes a snapshot of every indexed page and places them into a cache. To search the pages in your site's cache, type:
            cache:www.fruitbaskets.com

To further refine the search of cached pages, you can also conduct a keyword search within the cache. Simply type:
            cache:www.fruitbaskets.com web

And finally,

5. One of the most useful tools Google offers for no-cost marketing research is the info search. It returns whatever information Google keeps on your site (or any other site, for that matter). Type:
            info:www.fruitbasket.com

This will produce a general profile of your site and the sites of your competitors, at least from the Google perspective. Much of this information - everything from inbound links to meta tag text - can help you (or your web designer) deliver more visitor traffic and a higher conversion rate, because visitors are actually looking for your product, not something like your product. In short, better results all around.

Google's objective is to deliver the highest quality search results to its users, which is one of the reasons it offers this variety of search tools to knowledgeable users and, of course, site owners. Knowing how the leading SE views your site and compares it to similar (competitive) sites is a critical part of making adjustments to everything from keyword density to fresh anchor text.

To learn even more about the tools Google offers, click on the links below and get your site Googlized. (See, another new word!)



Friday, February 5, 2010

DIE, KEYWORDS! DIE!

Mushroom Grave by MrBobDobolina.

The Agonizingly Slow Death 
of Keyword Density:
What Worked Yesterday Won’t Work Today

If you’re about to launch a new website, there’s something you should know right now. Search engine marketing – in fact the entire online marketplace – is evolving so quickly that the marketing strategy you developed yesterday is out of date.

Keywords Are Born
Prior to the mid-90s, we were all stumbling around the web. There was no map. No address book. So, if you wanted to buy an elephant online, the only way you’d find one is if the local elephant franchise posted its URL (uniform resource locator – your web address) in a flyer or newspaper ad.

Then, the folks at Yahoo (not much more than a good idea back then) came up with the idea of a search engine – a piece of programming built around a mathematical formula called an algorithm – that would assess web sites and sort them into categories (the search engine taxonomy) so that, if you typed in ‘elephant,’ you got links to various elephant retailers. Cool. It was like the sun came up in Webville, or at least we had a flashlight!

Keyword Abuse
That primitive search engine relied, primarily, on reading letter strings. And the more a particular letter string was repeated on the web page, the easier it was to index the web site back at search engine headquarters.

Well, it didn’t take site owners long to discover the inner workings of Yahoo’s search engine and the Dark Ages of search engine marketing descended upon the virtual landscape.

Keywords were scattered all over a site – along with keywords that had nothing to do with the site’s subject. So, lots of site owners started adding keywords like “sex” and “porn” to their site text – even if the text was invisible to the human eye. (White text against a white background is invisible to humans but search engines can read it just fine.) The site could be selling dietary supplements but the owner (or search engine optimizer) could add a bunch of other words to the site text. No problem.

And what the search engine user saw amounted to keyword babble. Just a bunch of keywords (letter strings) designed to fool search engines, not to help site visitors.

Other abuses followed as technology marched on until we entered the era in which we now function online – an era of SEOs trying to anticipate what Yahoo, Google and Inktomi programmers are hatching and those programmers trying to stay one step ahead of keyword abuse and subterfuge. Why? The purpose of the search engine is to deliver relevant results to the user. If site owners undermine the process through the use of keyword abuse, the relevance of search engines diminishes.

Other forms of keyword abuse included (and still include) keyword stuffing – stuffing the HTML keyword tag with every word in your kid’s dictionary. At one time, these keyword tags were given significant weight by search engines. Today, they’re considered much less important because of the on-going practice of keyword stuffing.

Then, finally, we come to keyword density – a loathsome concept to copywriters, search engine programmers and web site visitors. Simply put, keyword density is the percentage of words in a body of text that are keywords. Example:

The Ski Hut carries the latest in skis, ski gear and ski wear. Snow skis from around the world, along with ski boots, ski poles, ski goggles and ski outerwear line the aisles at The Ski Hut where, if it has to do with skiing, we sell it.

Can anyone guess what The Ski Hut sells? Hands, please? That’s right, they sell skis and, in this example, the keyword density is about 23%, i.e. of the 48 words in the example, 11 mention skis, skiing and other ski keywords. Unfortunately, at that density the text isn’t much more than a continuous letter string of ski ski ski.
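Don’t trust my count? The density math is easy to automate. Here’s a rough Python sketch (the simple “starts with ski” match is a stand-in for real stemming):

```python
import re

def keyword_density(text, stems):
    """Percentage of words whose stem matches one of the keyword stems."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if any(w.startswith(s) for s in stems))
    return 100.0 * hits / len(words)

sample = ("The Ski Hut carries the latest in skis, ski gear and ski wear. "
          "Snow skis from around the world, along with ski boots, ski poles, "
          "ski goggles and ski outerwear line the aisles at The Ski Hut "
          "where, if it has to do with skiing, we sell it.")
print(round(keyword_density(sample, ['ski'])))
```

Run it on the Ski Hut copy and you get 11 ski-words out of 48, roughly 23% – deep in letter-string territory.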

The Evolution of the Search Engine
Yahoo’s algorithm builders gradually developed increasingly sophisticated search engine algorithms to thwart keyword abuse, adding numerous other criteria to assess the nature and quality of each spidered site. And, when Google came along with an even more sophisticated algorithm the flood gates opened and the search engine wars began. Yahoo, Google and Inktomi (used by MSN and many other search engines), along with My Simon, Ask Jeeves (now just plain Ask.com) and a bunch of other search engines, big and small, all went at search engine building a little differently.

Some, like My Simon and Ask Jeeves, focused on retail sites. Type in wristwatches and the SE would deliver sites and pictures of wristwatches. (That picture thing is making a comeback, btw.) Other search engines, like Google, focused on sheer size, spidering and indexing literally billions and billions of web pages so that no matter what a user enters as a keyword or phrase, something will show up.

Keywords Now
Yes, they’re still marginally relevant. HTML code still allows for a keyword tag and spiders still count up letter strings so, yes, selecting keywords that count is a critical part of the optimization process.

But keywords ain’t what they used to be. Fact is, contemporary search engine algorithms are designed to detect keyword stuffing and abusive keyword density so their users don’t have to plow through page after page of repetitive, nonsensical keyword garbage. So, choose your keywords carefully. They still count.

As far as keyword density is concerned, dial it back. Some search engine marketing professionals will tell you a 5% keyword density (5 keywords per 100 words of site text) is still acceptable, but the majority of SEOs recommend a keyword density of 2% to 3% to keep search engine spiders from slamming a site for abuse.

The Agonizingly Slow Death of Keywords
There are still site owners, and even SEO experts, who believe that keywords are the be-all and end-all. Soooo not true.

Keywords are great for delivering organic traffic to a site – organic traffic being the links on the left side of Google’s search engine results pages (SERPs). These are sites that the programmed-to-be-impartial search engine has determined will be relevant to the user. Organic SERP traffic is rock solid. It’s a good thing. But think about your own search engine habits.

How often do you look beyond the first or second page of Yahoo’s SERPs? Occasionally? And how often do you search for a site on page 145 of Google’s SERPs? Never. SERPs are sorted by relevance to the user’s query (keywords), so anything on page 145 is going to have much less relevance than links that appear on page one.

And, bottom line? The chances of your creating a page-one SERP site are pretty remote (though don’t give up – it happens every day).

Today, search engine algorithms are much more sophisticated, looking for fresh, informational content, non-reciprocal inbound links, interactivity between site and visitor and other criteria, again giving some, but not much, weight to keywords that appear in the HTML keyword tag, or how many times you can stuff the word ‘kumquat’ into the text of your kumquat site.

And it’s only a matter of time before site-owner-selected keywords fade away altogether. Search engine spiders will assess and sort each site according to increasingly sophisticated and expansive criteria.

It’s only a matter of time and that time can’t come soon enough. Kumquat.



Monday, February 1, 2010

GET NOTICED SMALL TIME: IT'S A START

My last minute entry by SharkeyinColo.
Think Small:
Grow Big


Your Site May Appear on Digg
But So What If No One Diggs You Except Your Mom? “Hi, Mom! I’m on the WEB!”

Print media – newspaper, magazines, catalogs and such – are losing relevance in the digital world in which news is up to the minute – with videos. You can read all about it in tomorrow’s newspaper – minus the video, of course.

With the proliferation of social search engines like Digg and Reddit, the future of dailies, weeklies and monthlies seems pretty bleak, unless…

…with the advent of more and more sophisticated RSS aggregators and bigger pipes, many print outlets are turning to the web to stay afloat – but just barely at the moment. The New York Times, the grande dame of national newspapers, is stuffing RSS feeds with content, changing its tag line from “All the News That’s Fit To Print” to “All the News That Fits We Click.” Time marches on.

But if the NYT, Washington Post, LA Times, Boston Herald and other information powerhouses are giving it away for free, what chance do you, owner of Fred’s Newsletter, stand against such formidable competition?

Think Small Web Communities
If you have something worthwhile to say, worthwhile people will listen – if we can just hook up writer and reader. But if humongous social search engines like Digg, del.icio.us, StumbleUpon, CiteULike, Reddit, Spurl and other social SEs are growing bigger and accessing content from reliable sources like The LA Times, what chance have you got to have your daily news and views picked up and, yes, actually read?

Simple. Think niche.

The big, egalitarian search engines collect feeds from readers interested in everything from taxidermy to tax shelters. Not so with the smaller, more focused site communities with search engines powered by the people.

Getting Noticed Small Time
There are three steps involved in getting your news picked up by smaller socially-driven search engines – and they’re all free, which in itself makes the strategy appealing to the new site owner with a very limited marketing budget.

Step One: Identify your reader. What segments of the web community would actually be interested in your thoughts from Squirrel News Today? Well, you’d be surprised. People who feed the squirrels, watch their antics in the park, companies that remove squirrels from the attic (you know, Squirrel Be Gone), nature lovers, hikers – just use your imagination.

Step Two: Give the reader something of value. A free subscription works great if the quality of your squirrel news isn’t pure nuts and berries. Squirrel aficionados want their rodent updates straight up, without a lot of editorial opinion.

Step Three: Add an open blog to your feed and let the readers post their own squirrelly thoughts. This builds a community, provides unusual perspectives (From squirrel lovers? Who’d a thunk it?) and expands your publication’s name and reputation, making Squirrel News Today the #1 online publication among the (growing?) ranks of squirrel lovers.

Popular Niche Communities
Squirrel lovers or not, these are the people you have to reach to be heard. So, let’s look at some of the more popular niche communities. Hmmm. Not a squirrel-loving bunch among them.

Deviantart.com caters to the artistic community with exhibitions and an outlet for would-be artists. Some of the posts are pretty good. Some look like they were done on black velvet, but there’s no accounting for taste. If your feed caters to the coffee house crowd, get hooked up and get noticed.

Urbis.com is another arty site – avant garde to old guard, but if you’re a struggling artist or graphic designer, and you want some exposure within the art community, these subscribers and readers are the people you want to meet.

Yelp.com is the site for business travelers. There’s information on cool restaurants, city sights, how to get around, the night life, spas – all there delivered to the subscribers’ RSS reader each morning. How sweet is that? It’s free marketing, people. FREE!

Realestatevoices.com employs the Digg “reader votes” model, only instead of covering all the news, this social search engine focuses on – you guessed it – real estate. So, if you’re an agent, a professional investor, a contractor, rehabber or flipper, you’ll want to get caught up on the land business over morning coffee. And your feed is right there to read with the java.

Personally, I like BakeSpace.com. It’s a great feed for swapping recipes and, yes, the search engine is growing. There are more recipes for three bean salad on this one spot than in all of Mongolia, which either says something about three bean salad or Mongolia. Not sure.

WebMD.com is a well-known web brand and, like many other large sites, it maintains a community zone where visitors can pick up the latest on what bug is making the rounds in schools and where. Good information when your kid comes down with the sniffles.

And speaking of kids, check out imbee.com. Think of it as MySpace for the younger crowd. If this is part of your market – or their parents are – it’s a great place to feed into. It’s a teeny-tiny community of teeny-tiny readers.

Finally, for the serious news junkie, visit Helium.com. If you have something newsworthy, or happen to glance out the window and see an EF5 tornado heading your way, snap a picture, add it to your Helium feed and you’re a reporter. (P.S. After taking the twister shot, head for the basement. You’re about to lose the roof.)

No, you might not get the exposure you’re looking for on Digg. My best post has been dugg six times in six months so the chances of you finding it are about zero. But take heart.

Make your views, news and cookie recipes heard by feeding to smaller sites looking for information on niche topics. Who knows, you could become the next Lenny Slobotnik, publisher of Squirrel News Today and recognized authority on rabies vaccinations.

Monday, January 18, 2010

Will Your Website Take Off Like a Rocket?

3…2…1…Launch
The Dos and Don’ts of a Successful Site Launch

 Ares I-X Rocket: A Beautiful Launch (NASA, 10/28/09) by nasa1fan/MSFC.




You’ve selected your template and color palette and written site text that’s irresistible to any visitor who happens on your site. Good for you. Unfortunately, if you don’t prepare for a successful site launch, not many visitors will happen upon your site. They won’t even know it’s there.

So how are visitors, potential buyers or clients, going to find you among the millions of other web sites covering the cyber terrain? Well, the answer, at least in part, is SEO – search engine optimization – making your site more easily recognizable to search engine spiders.

It can take weeks, months and even years to have a site indexed by Google, Yahoo, MSN and other search engines – unless you go proactive and make your site spider friendly. It also helps if you invite these crawlers to stop by for a look. So, here are some dos and don’ts to ensure your site launch doesn’t go unnoticed.

Avoid These Common SEO Black Holes

Search engine spiders aren’t smart. They don’t think, and they have to be led from place to place within a site. They’re unable to “read” certain kinds of information, and they don’t make connections between a body of text and the image associated with it.

Black Hole #1
Spiders cannot read text in graphics – any kind of graphics. So, if you’ve loaded up your site with Flash animations or graphics frames to appeal to human eyeballs, these images won’t appeal to spiders. They won’t even be noticed.

Black Hole #2
Spiders don’t bounce around a site randomly (thank goodness), they track links from page to page. Links can be embedded in site text to direct spiders to each page of your site to ensure that it’s completely and accurately indexed. No links, or too few links, and you won’t get the recognition you need for long-term site success.

Black Hole #3
Keywords still count, though not as much as they once did. Spiders crawl text strings looking for repeating words and phrases. They count the number of keywords per block of text to determine keyword density. You’ll need keyword-rich text that also appeals to human readers.

Don’t select keywords willy-nilly. Search engines employ a taxonomy – a system of classification – to place your newly launched site into one category or another. Select keywords that don’t fit the context of the entire site and you’re sure to have search engine problems.

So, remember, no critical text in graphics, add some embedded links and select keywords with care to avoid being sucked into an SEO black hole.

Take These Positive SEO Steps

There are lots of low- and no-cost steps you can take to gain the attention of search engine spiders – even if your marketing budget is the change you find in the sofa.

Link Up
One thing search engines like to see is links from other sites – especially from sites with page ranks (PR) higher than yours. PageRank is measured on a scale of 0 – 10, 10 being the highest. So Yahoo – the most visited site on the W3 – has a PageRank of 10. The Open Directory Project has a PR of 9.

By submitting your site to various directories and portals like these, you gain prestige in the “eyes” of SE spiders. Some directories, like the Open Directory Project at www.dmoz.org, are free – a good thing. Google’s directory is free as well, but it costs about $300 to get listed in Yahoo’s directory of “selected” sites. It’s a great way to get some respect right from the start. Link up with directories. To find a listing of free and paid directories, visit www.stronglinks.com/directories.php.

Spread the Word
The world wide web requires a ton of new content every day. Think about it – there are a lot of site pages to fill each day, so many sites are looking for free content. And you can give it to them.

Press releases, keyword optimized without sounding like gibberish, will get you noticed as more sites post your release with links to your site – a wonderful thing in the eyes of a spider. There are on-line companies that distribute press releases. Some do it for free, others charge for the service. Some companies to look at are: www.pr.com, www.prleap.com and www.clickpress.com. There are lots of others.

Another way to spread the word is through article syndication. Are you an authority on the products or services you market on-line? If so, you can write 10 or 20 articles on various aspects of your expertise and put them out for syndication. These articles will be picked up by sites and displayed with a link back to your site. You get to establish your creds as an authority first, then provide a link back to your site. It works.

Link Exchanges

Good, but time-consuming and a bit humiliating. A link exchange is simply asking a site related to yours to swap links: “I link to you; you link to me.” This creates various pathways for web users to find what they’re looking for. Do note, however, that if you link up with any old site – sites unrelated to your own – SEs will lower your marks, because those links aren’t helpful to visitors. So stay within your field with link exchanges, and always try to link with a site that has a higher PR than yours.

Blogs and Forums
A lot of sites have blogs today – places where visitors can post articles and other useful information, or respond to articles that have been posted by the site owners.

Contribute to these info outlets if you can string words together into sentences. It doesn’t have to be great art (though well-written and without mistakes is nice), it just has to be relevant, useful to readers and written with authority. The one problem with blogs is that they’re updated regularly on big sites so your insightful analysis of long-chain polymer molecules might disappear quickly into the blog’s archives.  On the other hand, because blogs and forums are updated frequently, they always need new content, some of which could be yours.

Pay-Per-Click (PPC)
You know those sponsored links you see on search engine results pages – Ads by Goooooooogle? Well, somebody paid money to have that ad and link placed there. Google’s Adwords program allows advertisers to bid on different keywords. The highest bidder for a given keyword receives the most prominent placement on the SERPs. You can bid as low as five cents for some keywords (not the best of them) and many bucks for really good keywords in hot market segments.

Of course, this requires some marketing capital. If you can afford programs like Adwords or Yahoo’s Search Marketing, you can give your site a running start with PPC marketing. The good thing is you only get charged when a web user clicks on your paid link. The bad thing is you pay every time a user clicks on the link whether s/he buys something or not.
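Here’s the basic “highest bidder gets the best slot” model from the paragraphs above, sketched in Python. (The real AdWords auction also factors in ad quality, and these sites and per-click bids are invented for illustration.)

```python
# Hypothetical advertisers and their per-click bids, in dollars.
bids = {'acme.com': 0.05, 'bigspender.com': 1.25, 'midrange.com': 0.40}

# Rank advertisers by bid, highest first; position 1 is the most
# prominent placement on the SERP.
placements = sorted(bids, key=bids.get, reverse=True)
for pos, site in enumerate(placements, start=1):
    print(pos, site, bids[site])
```

The nickel bidder lands at the bottom of the sponsored links; the deep pockets land on top.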

Google Sitemap
Log on to Google, click on Business Solutions and follow the path to the Google Sitemap service. Here, you can upload your own sitemap, basically “telling” Google to come take a look at you.

Spiders love site maps because that’s where the links are and they can find a lot of information quickly all in one spot. By uploading your site map to the largest, most popular search engine within the known solar system, you’ll get noticed faster. And just as importantly, your site will be completely and accurately indexed from the start.
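For the record, a sitemap is just an XML file in the sitemaps.org format, listing the pages you want indexed. A bare-bones Python sketch, with made-up URLs:

```python
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Render a list of page URLs as a minimal sitemaps.org XML sitemap."""
    entries = '\n'.join(
        '  <url><loc>{}</loc></url>'.format(escape(u)) for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + '\n</urlset>')

xml = make_sitemap(['http://www.example.com/',
                    'http://www.example.com/products'])
print(xml)
```

Save the output as sitemap.xml at your site root and that’s the file you upload – one link per page, all in one spot, just the way spiders like it.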

It takes much more than a good looking website to find success on the web. It’s a lot of hard work and it won’t happen overnight. But, by following some of these steps your site will start showing up in SERPs, generating “organic” visitors – the best kind to have.

So, even if you’re working with a non-existent promotion budget, there are plenty of free steps you can take to move your recently launched site closer to profitability.


Monday, September 28, 2009

LOCALIZED SEARCH: ARE YOU LOOKING FOR BUSINESS FROM MOZAMBIQUE?


YOUR NEXT BUYER IS RIGHT HERE.



YOUR NEXT BUYER


Local Site SEO/SEM:

Think Small. Win BIG.

Of course GE has a huge website and Microsoft’s digital space is the size of a couple of football fields – everything from download patches to sales to rights-free clip art. A web site, for any well-known business, is a must. So is a HUGE web presence. That’s why these global conglomerates have SEO professionals on staff. It’s also why these sites rank so highly on Google. They’re enormous and optimized to the nth degree.

But what about your little boutique on Main Street, Anytown, USA, or your car dealership out on Route 81? Can you compete with the big guys, and if so, how? Here’s a quick primer on SEO for local businesses looking for traffic within a 40-mile radius of the business’s brick-and-mortar storefront or office building.

You don’t need a big budget and you don’t have to be an SEO to rank highly on local searches. But you do have to design your site for local search and optimize the site skin for the highest conversion ratios. So, here’s how to put your little hometown business on the web map – and actually drive traffic.

Google Webmaster

Google Webmaster Central is a treasure trove of Google-based tools designed to provide data and tips on improving your site’s page rank (PR) on Google’s search engine results pages or SERPs.

Google provides tools to improve local site traffic, including its Keyword Generator and its Diagnostics, which identify problems encountered during the last Googlebot visit – everything from an old home page still on your server to broken links. If bots have a problem, you have a problem.

This site also provides analytics: what your site looks like to a Googlebot (remember, bots never see the site skin, just the HTML code under it), plus a Site Status Wizard to determine how many pages of your site are actually indexed in Google’s database of over 100 billion web pages. Google Analytics provides a breakdown of visitor traffic – who, what, where, when and sometimes even why – all in one place.


Google Gadgets

Free stuff and especially useful for two purposes: (1) site stickiness and (2) local search. These doo-dads and gizmos keep visitors coming back and customers walking through your front doors. So what can you get from the Google Gadget goodie bag? Local weather, calendars and local time, all perfect for local search for local businesses. These Google gadgets enhance the visitor’s perception that you really are local and that’s a very good thing.


Google Gadgets also includes an online to-do list function. Ideal for local search. TO DO: go to dry cleaners – your dry cleaners. You can also pick up complete mapping functions – another must-have for site localization. How do people get to your outlet?

Google’s To-Do List

Using Google maps, you can provide written directions and even a printable map. And all of these features are free when you open a Google account – which you should do ASAP.

“Hey, we’re right at the intersection of Maple and Main!”

Local SEO

You don’t want business 400 miles away from your shop. You want people who live in your area to stop by to make a purchase (unless you want to go global and get into drop shipping, which is another topic altogether). Adding gadgets like maps and local weather helps convert visitors and lowers your cost-per-acquisition (CPA) by delivering highly-qualified local buyers – web users who have located your store or office through a local search.

To improve the likelihood of being found locally, here are some tips to ensure that Googlebots and other crawlers get it right. Oh, and btw, if you don’t think this is critical to the long-term success of your business, check out these stats:

Seven out of 10 web users employ local search – that’s 70% of potential prospects!

68% of surfers call the telephone number provided by a local business to ask about product availability, directions, help when “some assembly is required” and so on. People like searching globally but they love buying locally. It’s so much more personal.

One-half of search engine users add a geographic modifier to their query words, e.g., certified public accountant dallas texas. It’s easy to understand why. There are some things (like a tax audit) that you want to deal with face-to-face, so you call a local CPA – or at least 50% of web users do.

So, in no particular order of effectiveness (do them all) here are some suggestions for upping local traffic through local search:

  • Add the name of your community to HTML tags – keyword tag, title tag, description tag and other HTML code. Remember, this is what spiders spider – not the site skin – so SEO is all about optimizing your code, NOT the site skin. However, you should also add local contact information on the site itself. This text syncs up with tag content, adding validity to the site’s code. (No funny stuff goin’ on.)

  • Make your URL visible within the local community. If your business advertises in the local newspaper, make sure the site URL is prominently displayed. Think of your web site as an opportunity to tell local buyers why they should buy locally and, more germane, why they should buy from you.

  • List your business with Google Maps. When visitors access mapping from Google, Google Earth or MapQuest, your little business shows up as a push-pin online and in a printable format.

  • Develop a list of long-tail keywords that would be used by local search engine users. If you go with the top-used keywords, you’ll get buried in Google’s SERPs. How well will “Bonnie’s Art Supplies” stand out against Dick Blick and the hundreds of other mega-crafts stores that sell art supplies? Bonnie will be lucky if her site ever sees the light of day using the most popular keywords.

  • Instead, create a list of long-tail keywords – keyword phrases that locals would use, e.g., restaurants boothbay harbor maine or tourist information boothbay harbor maine. This narrows the number of search engine users who actually employ these long-tails, but moves your site to page one of the SERPs when a user does employ one of your localized, long-tail keywords. Make sure these keywords are topic-city specific.

  • In your site text (which will be part of the site’s code and, therefore, spidered) use words that describe your service region. Words like “near,” “around,” and “vicinity of” can be used to expand or contract your service area.

  • Some SEO experts claim that people don’t search by zip code. I do. And it doesn’t hurt to have your zip in your body text a bunch of times.

  • When creating long-tail, region-specific keywords, spell out the name of the state and avoid abbreviations, e.g. Boothbay Harbor Maine, not Boothbay Harbor ME.

  • Link to other businesses maintaining websites within your service community. This includes community sites, tourism sites and local business directory sites. Also, privately-owned sites specifically targeted at a local community are popping up like weeds. They provide local news, weather, some local reporting and links to local businesses – hey, like yours!

  • Post informational content on local blogs. Many communities maintain blogs – places where people can sell an old couch or ask for volunteers for the upcoming May Fest. These “local spots” are fast turning into the web log-on page – the place to get the local news and download a coupon for a free pizza when you buy eight. (Good deal.)

  • Don’t fool with Google. You can try multi-listing under different names when you register with Google Maps, but if you get caught employing this gray-hat tactic, your site can be sanctioned – sent to the back of the line – or banned altogether.
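Putting several of these tips together, the head section of a localized page might look like the sketch below. The shop name and town come from the examples above; the street address and zip code are invented for illustration:

```html
<head>
  <!-- Localized title and meta tags: community name in the title,
       description and keyword tags, synced with on-page contact info -->
  <title>Bonnie's Art Supplies - Boothbay Harbor Maine</title>
  <meta name="description"
        content="Art supplies in Boothbay Harbor Maine - paints, brushes
                 and canvas at 123 Main Street, 04538.">
  <meta name="keywords"
        content="art supplies boothbay harbor maine, craft store 04538">
</head>
```

Note the state is spelled out, the long-tail phrases are town-specific, and the zip code appears – exactly the points covered in the list.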

Local search is here to stay, and more and more web users are employing the local search options now offered by the big search engines. Locals don’t want their eyes tested by someone 3,000 miles away. They want your eyeglass emporium. They want to make an appointment by telephone, print out a map to your specs store, and they want the personalized service they only get locally.

Optimize your site for local search and the local community. Who knows? If you supply enough good, local information, your home page may become the log-on for hundreds, even thousands of local prospects.

And you know that’s going to be good for business.

Localized search delivers powerful results in a short time - and it's growing more popular every day. Drop me a line. Your next customer is right down the street. No, really.

Webwordslinger.com

Sunday, September 13, 2009

LET GOOGLE TELL YOU WHAT IT WANTS. THEN LISTEN CAREFULLY!

YOU WANT GOOGLE TO LOVE YOUR SITE, AND THEY'RE HAPPY TO TELL YOU HOW TO BUILD A LOVING RELATIONSHIP.

YOU JUST HAVE TO LISTEN TO THE GOOLISTAS.


Google How-Tos:

Listen to the Googlistas

You want Google to love your website. This search engine alone accounts for 46% of all searches, so when you consider that there are thousands of search engines (granted, many topic-specific), controlling a 46% share of all search engine users makes you “the cat that everybody’s rapping ’bout.” And they are.

The webmaster community and Google don’t always get along, and that’s understandable. For most webmasters, Google is a prime source of site traffic, so if there are too many obstacles to Google success, of course there’s going to be feuding between the search engine and the professionals who rely on it for their livelihoods. Every time Google tweaks an algorithm, some sites gain, some lose ground – and the reasons are rarely clear.

So, Google put together Webmaster Central, a blog for site owners to post gripes, offer suggestions, identify glitches and otherwise interact with the people behind the search engine. (We can only assume there are people behind Google. Verifiable proof is slow in coming. The entire company could be bot-run for all we know.)

The Google Webmaster Blog and You

Google knows it must keep site owners happy, and who- or whatever is running the company recognizes the need to interact with professional SEOs, SEMs, coders, designers, graphic artists and every new technology that takes a giant leap forward – such as really simple syndication (RSS), which changed the way information was distributed seemingly over a weekend.

So, this is where you go to ask questions and get answers – from Google and from other site owners, regular users like you: the owner of a small, once-active site that has mysteriously disappeared from Google SERPs overnight. What happened? And how are you going to pay the rent if your e-store has disappeared from Google’s ever-expanding index?

Posting to the Webmaster Central Blog is a good place to go for quick answers from real people. And that usually means you’ll get an answer you can actually understand rather than an earful of techno-babble from some chip head.

This is also the place where Google introduces new features for webmasters. Just a while back, Google let loose improvements to iGoogle Gadgets for Webmaster Tools.

Here’s how the Googlistas explain it: “After our initial release, we saw clear interest in the gadgets, and plenty of suggestions for improvement. So we've spent the past several weeks working on various areas. The biggest improvements are probably for those of you with more than one site: when you add a new tab of gadgets, your gadgets will now default to the site you were viewing when you added them to your iGoogle page. Additionally, gadgets now retain settings as a group, so if you change the site for any gadget in a group, the next time you refresh that page, all the gadgets will show data for that site. And gadgets now resize dynamically, so they take up less room.”

Functionality has also been improved with the addition of Top Search Queries for your site, very helpful in refining a keyword list. “The data from the Top Search Queries allows you to quickly pinpoint what searches your site appears for and which of those searches are resulting in clicks,” according to Google.

Other new features that improve site performance analysis include a smart, geo-targeting function. This enables you to create several site skins for regions around the world if you choose. This geo-targeting gadget also produces a map overlay of where your visitors are coming from – right down to street level if you’re only seeking local business or referrals. Your site may be hot in Australia but bombing in the UK. There’s got to be a reason. This Google gadget helps isolate what’s working where, by region, with incredible specificity.

And if you’d like Google’s opinion of your numbers and your conclusions, click on Analytics Help Center for a ton of Google-centric info. All good in determining what Google likes and dislikes about your site.

Another tool from Google is the URL Remover. You log on to your administrator’s console over coffee and scan through your stats for the overnight, and you discover that your “Content by Title” section – a back office only function – has been inadvertently Googled, indexed and displayed on Google SERPs, giving anyone (including competitors) more than a quick peek at your business. They can read everything because it’s been spidered and indexed.

Using the new URL Removal Tool, you can quickly remove those private pages from Google’s index and tell spiders that this information is off limits as in DO NOT SPIDER.
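The removal tool handles pages that are already indexed. To keep spiders away from private sections going forward, the usual “DO NOT SPIDER” mechanism is a robots.txt file at your site root – a quick sketch, with hypothetical directory names:

```text
# robots.txt - ask crawlers to skip the back-office pages
User-agent: *
Disallow: /admin/
Disallow: /content-by-title/
```

For a single page, a `<meta name="robots" content="noindex">` tag in the page head does the same job. Fair warning: robots.txt is a request, not a lock – well-behaved bots honor it, but it won’t keep humans out.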

Google Webmaster Help

This is one very cool tool. One that is certainly bookmark-worthy.

Google Webmaster Help provides tips and suggestions for improving your site in the eyes of what Google calls “benevolent Googlebots.” Hey Boys, those bots ain’t so benevolent if they mis-index my site because of your messed up classification taxonomy. Even so, when you have as much influence over online success as Google does, you get to call your bots “benevolent” even if they are mindless snippets of programs that chew through letter strings.

Don’t get me wrong, there’s plenty of good, useful information here from the people who make the algos. So, it’s worth a visit just to see what’s new, what’s working and what you should do about your precipitous loss of PR when you changed the home page text. Something Happened. And this is the place to find out what.

The Google Webmaster Help section also has a very robust, informed community able to answer FAQs from other site owners. You don’t have to wait for Google to get back to you. Ask an SEO or other web professional using the Webmaster blog for fast facts fast.

Today, there are 107,738 messages, questions and answers on crawling, indexing and ranking; 14,019 posts on the new Google gadgets listed above, so you see that these help pages see a lot of activity and should be a part of your daily web scan.

Go To the Source

Google sets the rules and no matter how strongly these rules are debated among site designers, SEOs and other web professionals, the rules are the rules. One way to stay current is by joining the regular online discussions that Google offers. You can check the schedule of upcoming discussions and mark them so you don’t forget. It’s a great way to meet the Googlistas and your counterparts who are trying to figure out how to perform better in the search engine sweepstakes.

Visit webmaster blogs like the one you’re reading now, especially targeted at those just venturing into ecommerce. Some webmaster blogs are highly technical (more for coders than site owners, actually) while other webmaster blogs provide information on everything from digital selling to site design tips.

But if you want the skinny – the unvarnished truth – go to the source. Go to Google and become a member of the Google webmaster community. Download the free Google analytics and join in with companion site owners to let Google know when a problem arises.

With Google controlling almost half of all searches, it’s good business practice to learn what Google wants.



Saturday, June 27, 2009

Accessibility: It's What a Blog is All About

Is your web site accessible?


Accessibility, when discussing web sites, includes a number of factors: easy navigation, understandable site text, and no dead ends that force a click on the browser’s back button to escape (lots of users don’t even know browsers HAVE a back button).

Let’s start with the bottom line – yours: the easier it is for a site visitor to perform the most desired action (MDA), the more times that MDA will be performed.

Let’s Start With Navigation

Whether you go with a navigation bar at the top of the screen or a menu list in the first column far left, your navigation must be:

  • simple
  • unambiguous
  • truthful
  • always available
  • always in the same location

Avoid numerous tabs, drop-down or flyout menus. Keep it simple. If visitors are faced with too many choices too soon on arriving at the site, chances are they’ll bounce.

Keep the navigation unambiguous. It’s routine to have a “Contact Us” page on a web site. If you label the contact link “Company Authority,” visitors are going to be totally confused. And again, bounce.

Truthful is just what it says. If the link says “Product Descriptions,” don’t make the visitor read through another landing page of sell copy. Deliver what the link says and go directly to the products.

Always available is an aspect of keeping visitors on site longer, and the longer they stick around, the more likely they are to perform the MDA. So, the navigation bar or menu should be available from every page so the visitor can surf at will, unencumbered by what YOU think the visitor wants to know.

Finally, keep the nav tabs in the same place. Don’t move them from bar to menu and back to bar. The last thing you want is a visitor trying to figure out how to return to the contact page to make contact.

Keep it simple. The fewer clicks required to get the visitor to perform the MDA, the better. So, go through the process and eliminate every unnecessary side road, dead end and yet another landing page.

Accessible Content

If your client site is for a professional medical dispenser, you can assume that the visitors have some knowledge of the subject, i.e. you don’t have to start from square one. But you still have to stay on target pointing out the benefits of buying the client’s medical products.

On the other hand, if you’re writing text for a hearing aid retail outlet, accessible text means text the average reader can understand. So first, toss the thesaurus. Find the simplest, shortest way to say what needs to be said about products and services.

Be helpful and supportive to the new visitor. Make things simple to find, simple to learn and simple to bookmark. Returning visitors are gold. Eventually they buy something so earning a bookmark is a very good thing.

Skip the hype. Educate the visitor using simple terms, no jargon and listing benefits rather than features. This is the stuff site visitors want to know.

Finally, lay out the text so it can be scanned rather than read. No big, long paragraphs. Visitors scan from upper left to lower right so put your most important info upper left on the screen.

The easier it is to buy something, opt-in for a newsletter, or to complete a form, the more often those MDAs are performed. So make it as simple as possible (why do you think Amazon offers a one-click checkout? How easy can it be?).

Accessibility benefits both site owner and site visitor – a win-win. Also a no brainer.


Need some help with your website? NP. Visit my site and let's get down to work.

Saturday, November 1, 2008

Getting Slammed By Google: It's Easy




It doesn’t take much to make a Googlebot angry. And it certainly doesn’t take much to confuse one of these script-bits that swarm the web like those killer ants. And while there is no absolute consensus on the negative ranking factors employed by Google, there is general agreement on how to avoid getting slammed by the search engine that controls 46% of ALL web searches – the proverbial 800-pound gorilla.

So here are some common, agreed-upon slams Google can give you.

1. Lack of site access. If your host server is down, your site is down and if your site is down, visitors can’t reach you. Google won’t send its users to an inaccessible site. To avoid trouble: (1) go with a reputable host and (2) avoid launching until the site is complete.

2. All text appears in a graphics format like gif, jpg or bmp. Spiders are as dumb as a box of rocks. They can’t read anything in a graphics format. To avoid the problem, keep critical information in HTML format and provide alt text descriptions for all graphics.
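For instance – filename and wording invented here – the image carries its message in an alt attribute, and the headline stays in plain HTML text where a spider can read it:

```html
<!-- Spiders can't read the pixels, but they can read the markup -->
<img src="hula-girl-bobblehead.jpg"
     alt="Hula girl bobblehead figurine, hand-painted ceramic">

<!-- HTML text, not a logo graphic, so the headline gets indexed -->
<h1>Hand-Painted Ceramic Figurines</h1>
```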

3. You’re living in a bad neighborhood. You’re known by the company you keep on the web – in two ways. First, by your inbound and outbound links. Too many low-quality links give you a bad name.

Further, though contestable, if you’re using a shared hosting account, your site is on the same server as 1,264 other client sites. A server that’s stuffed with porn and overseas drug company sites doesn’t exactly make your site shine, does it?

4. Keyword stuffing is bad for site health. You can overstuff an HTML keyword tag, on-site text (keep keyword density at no more than 3%), HTML meta data, headers and headlines. Any overuse of keywords is a bad sign to spiders.
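That 3% guideline is easy to check yourself. Here’s a rough sketch – the counting rules are a simplification of what any real crawler does, and the sample page text is invented:

```python
import re

def keyword_density(text, keyword):
    """Rough share of a page's words taken up by a keyword phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    kw = keyword.lower().split()
    if not words:
        return 0.0
    # Count occurrences of the full phrase, then weight by its length.
    hits = sum(
        1 for i in range(len(words) - len(kw) + 1)
        if words[i:i + len(kw)] == kw
    )
    return hits * len(kw) / len(words)

page = ("Art supplies for every artist. Our art supplies ship daily, "
        "and art supplies orders over $50 ship free.")
print(f"{keyword_density(page, 'art supplies'):.0%}")  # prints 35%
```

That invented snippet comes out around 35% – more than ten times the guideline, and exactly the kind of stuffed copy that makes spiders suspicious.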

5. Redirects raise suspicions. Not all redirects are bad. Some serve useful purposes. For example, when you submit an information form online, you might immediately be redirected to a confirmation page with a short note stating, “If this page doesn’t redirect you, click here.”
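That sort of honest, visible confirmation-page redirect might look like this (the URL is a placeholder):

```html
<html>
  <head>
    <!-- Send the visitor on after 5 seconds, with a visible fallback -->
    <meta http-equiv="refresh"
          content="5; url=http://www.example.com/thanks.html">
  </head>
  <body>
    <p>Thanks for your order! If this page doesn't redirect you
       in 5 seconds,
       <a href="http://www.example.com/thanks.html">click here</a>.</p>
  </body>
</html>
```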

That’s fine. This isn’t: a site page is designed for one purpose only, to appeal to spiders. It’s a perfectly optimized, single site page buried deep within the site. Because the page is hyper-optimized for crawling, there are no graphics, there is no useful information – it’s simply a highly-optimized page of site code.

Because the page is highly optimized, it ranks highly on Google SERPs. That means it pulls in a great deal of organic traffic. However, as soon as a visitor clicks on the SERP link, s/he is immediately redirected to a page designed for humans. It happens so fast, you won’t even notice. This kind of redirect is bad form to spiders. It’s not nice to fool with Google.

There are lots of other missteps Googlebots look for: invisible text, too much cross-linking within one site, dynamic pages – the list goes on and on. The fact is, it’s easy to get slammed – and not even know why!

To learn more, visit Google’s Webmaster Central and get the information straight from the source.




editor@webwordslinger.com