Using Google Sitemaps:
Show ‘Em What You Really Got
Search engine bots have long relied on site maps to simplify their mindless collection of letter strings. Site maps make it easy for spiders to assess the nature of a site, check the accuracy of HTML tags and detect new content faster and more easily. So, a site map has always been useful to bots and site visitors alike.
The importance of developing a site map has increased recently with the introduction of Google’s cool new feature – Google Sitemaps, which is pretty much what it sounds like. Today any webmaster or site owner can quickly develop a site map and submit it to Google’s SE for indexing. The benefits? Many.
Help Google. Help Yourself.
The proliferation of new websites has caused the Google gods to rethink the processes they use to spider and index the billions and billions of new pages of text coming online each year. It can’t be done, not even with all of Google’s resources. “Too much information.”
The solution? Let webmasters index their own sites. That’s what’s behind the new Google Sitemaps program. It’s a wonderful opportunity for e-biz owners to ensure that their sites are assessed and indexed properly on Google’s big ol’ hard drive, in effect taking ‘botness’ and placing it in the hands of site owners – humans with the capacity for rational thought – something no SE spider brings to the table.
How Hard Is This?
Not very. To get the most out of Google Sitemaps, you’ll need an XML file on your site. XML stands for Extensible Markup Language, and you can find XML tutorials everywhere. Just Google XML and you’re on your way.
If you don’t know a thing about XML or writing code, no worries. This is something your programmer can develop in less than an hour, delivering better SE results and more highly qualified visitors to your site.
The generated XML file is the ‘transmitter’ that communicates site changes, updates, new content and other data to Google. No doubt, you’ve seen the ubiquitous red XML logo on the most up-to-date sites. Kind of a status symbol to those in the know. These XML-enabled sites earn more accurate SERP listings, though it’s important to note that an XML generator won’t do a thing for your PageRank. That’s a whole other story.
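To make that concrete, here’s a minimal sketch of what such a file looks like under the publicly documented Sitemaps XML format; the address is a placeholder, so swap in your own URLs:

<?xml version="1.0" encoding="UTF-8"?>
<!-- A bare-bones site map: one <url> block per page you want crawled -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>

The loc entry is the only required piece per page; everything else the protocol offers is optional gravy, as we’ll see below.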
XML Syndication
XML is used by most blogs because it’s a language that adapts easily to syndication. RSS – Really Simple Syndication – employs XML to deliver your site’s content to other interested, like-minded sites in the form of XML/RSS feeds. It’s all part of the new dynamic of Web 2.0 – content syndication with lots of arrows (links) pointing back to your site.
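If you’ve never peeked inside one of those feeds, here’s a skeletal RSS 2.0 document for illustration – every title, link and description is a placeholder:

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://www.example.com/</link>
    <description>A placeholder feed, for illustration only</description>
    <!-- One <item> per post; syndicating sites display these and link back to you -->
    <item>
      <title>A New Post</title>
      <link>http://www.example.com/a-new-post.html</link>
      <description>A short summary of the post.</description>
    </item>
  </channel>
</rss>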
In the case of Google Sitemaps, all the webmaster is doing is syndicating content directly to Google’s SE. This puts more control in the hands of webmasters to ensure that their sites are crawled accurately and frequently.
Even better, this program gives site owners control over several key crawl signals – modification dates, change frequency, page priority and the XML tags and attributes that carry them.
Your XML Generator
It’s important that your XML site map be kept up to date, since freshness is a key consideration when the bots come calling. So, you need an XML generator. A what?
It’s easy. An XML generator simply spiders your site, identifies new content, lists URLs, notes any other changes and writes it all into a site map file ready for submission to the Google index. There are a slew of XML generators, some free, all low-cost, available on the web. Just Google XML generator and take your pick. They all do the same basic thing and there’s one for every level of programming expertise.
Which Generator For You?
http://enarion.net/google/ offers a PHP generator that you install on your host server. Once it’s run, your site is spidered and an XML sitemap automatically produced. Upload the new sitemap to your server, then submit it to Google for indexing. It’s easy, even for the digitally challenged.
But there are dozens of XML generators available – from point-and-click easy to more-than-most-of-us-need-to-know-in-a-lifetime. So, to make things even simpler, and to encourage more self-indexing, Google has developed a list of third-party XML generators that hook up nicely with Google’s submission process. To see Google’s list of XML generators, simply click here.
Telling Google What You Want It To Know
The XML file created as part of your site will provide Google with lots of useful information – information you should submit only after every letter and symbol has been checked for accuracy. It includes:
The location of each of your web pages, i.e. their URLs. (Hello, I’m over here!)
Dates of content modification – what was added, deleted and/or changed since the last time you were spidered. This is very important for sites that change content frequently – especially informational or editorial content. Spiders love fresh content, as long as it isn’t page after page of hype.
Frequency of Content Change – SEs may view frequent content changes with suspicion, “thinking” that changing content may indicate gray-hat marketing tactics. Problem solved when the webmaster can indicate how frequently content changes from page to page. You can declare a change frequency anywhere from never to hourly – or even always, for pages that change on every visit – giving the bots a heads-up.
Site Page Priority – Some pages of your website are more important than others. Obviously the homepage ranks high in importance, but so do high profit pages, opt-in pages and, of course, ordering pages.
Google’s Sitemap program allows webmasters to declare the importance of each page on the site! Pages are prioritized on a scale of 0.0 (least important) to 1.0 (most important). The higher the priority you assign to a page, the more weight Google gives it relative to your other pages. No bump in PageRank, however, since your priority rankings are relative only to the pages of your own site. The sample entry below pulls all of these fields together.
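Here’s a sketch of a single site map entry with all four pieces of information filled in; the URL, date and values are placeholders you’d replace with your own:

<url>
  <!-- Location: the page's full URL -->
  <loc>http://www.example.com/order.html</loc>
  <!-- Date of last modification, in YYYY-MM-DD format -->
  <lastmod>2005-11-01</lastmod>
  <!-- Frequency of change: always, hourly, daily, weekly, monthly, yearly or never -->
  <changefreq>weekly</changefreq>
  <!-- Priority relative to your own pages, 0.0 to 1.0 (0.5 is the default) -->
  <priority>0.8</priority>
</url>

A high-profit ordering page like this one might merit a 0.8, while stale archive pages could sit at 0.2 – again, relative only to your own site.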
Extreme Trust
Extreme trust is a new www concept, sort of like the honor system. Wikipedia, the online, ever-evolving encyclopedia, is a project based on extreme trust. Anyone (even you) can make an entry on the Wikipedia site. Other readers can then go in and add, update and edit your information. It’s a means of harnessing the world’s knowledge in real time, and it’s one of the most exciting uses of the web currently underway.
Now back to Google. Google has a well-established record of protecting its reputation and the quality of its search results. Sites have been gray-barred (banned) for seemingly minor infractions – sometimes even unintentional ones. You don’t mess with Google. Period. Even big-time car maker BMW got penalized for less-than-exemplary reporting to Google. If it can happen to BMW, it can happen to you.
Get banned from Google and you might just as well start over. Recovering your previous Google rank is all but impossible.
Now, apply this hard-edged business philosophy to the new Google Sitemap program. Believe this – you don’t get carte blanche. Spiders will still spider and bots will still bot. And if the information in your XML site map is out of sync with site text, purpose, components, tags or any other aspect of your site’s description, you better believe that your site is going to be slammed – and there won’t be a thing you can do about it.
Play It Straight
The black and gray hats out there are always trying to beat the system but the system grows more sophisticated with each passing minute. That’s why it’s best to use the Google Sitemap tool to your advantage – to ensure that your site is presented the way it should be and the way you want it to be.
To learn more about Google Sitemaps, click here.
And remember, when it comes to Google and every other search engine, play it straight. It’s the only way to go.