A well-optimized site makes it easy for search engines to spider and index it quickly – the first time through. And Googlebot and the other crawlers have no brain, no soul, no conscience; they feel no pain and they never give up – ever. Sounds a lot like The Terminator, which is not a bad comparison, actually.
Bots and spiders (two names for the same thing) see one thing. Humans see something different. And the features you develop and add to your site to appeal to human thoughts and emotions may totally confuse a spider, creating chaos back at search engine headquarters.
On the other hand, the clever site owner can use the mindlessness of letter-string gobbling bots to some advantage, delivering content visible to humans, but unreadable by bots. So there are pluses and minuses to any decision you make in the design of your site. Bots or bodies?
What Humans See Isn’t What Spiders See
When you visit a web site, you see the site skin, sometimes called the presentation layer. This is all “front of the curtain” stuff designed specifically to appeal to human visitors. Color and design motifs, placement of content, graphic elements and other stylistic considerations are all for human consumption. Bots wouldn’t know a good-looking site from one that’s uglier than a mud fence. And they don’t even care!
Instead, look at your HTML code: the boring substructure made up of metadata, line after line of markup, strange glyphs and indecipherable computer-speak. This is what bots see. This is what spiders crawl – the underbelly of your drop-dead-gorgeous website. Print out the HTML, XML and CSS and it’s not the least bit pleasing to the human eye. In fact, it’s just black-and-white text. But spiders love it.
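To make that concrete, here is the kind of raw markup a spider actually reads. This is a hypothetical page head – the site name and content are invented for the example:

```html
<!-- What the spider sees: no colors, no layout, just tags and text -->
<head>
  <title>Handmade Oak Furniture | Example Store</title>
  <meta name="description" content="Handcrafted oak tables and chairs, built to order.">
</head>
```

Everything the search engine learns about the page, it learns from plain text like this.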
Using The Work Habits of an SEO Spider to Your Advantage
If you know what a spider knows about your site, you can take advantage of the crawler’s significant intellectual limitations.
Spiders crawl the underlying code of a web site. In doing so, they follow links. Their movements aren’t random. They’re directed. So how can you use this to your advantage?
Embed text links throughout the site body text – the text meant for human consumption. These embedded links should take spiders and humans to other relevant information on the site. Using these embedded text links ensures that spiders stick around longer, index more site pages and accurately assess the scope and value of your site to search engine users.
The downside of embedded text links is that, if the content a link points to doesn’t sync up with the link’s anchor text, the spider is easily confused. It may determine that you’re intentionally misdirecting site traffic and penalize you. Thus, it’s important that embedded links actually lead to more expansive, albeit related, content. It’s equally important that spiders get the connection, something that can be handled on the coding side through the use of title tags and other individual page descriptors.
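In practice, that means descriptive anchor text that matches the destination page, reinforced by that page’s own title tag. A minimal sketch – the URLs and page names here are hypothetical:

```html
<!-- Anchor text tells both humans and spiders what the target page is about -->
<p>Our <a href="/oak-tables/" title="Handmade oak tables">handmade oak tables</a>
are built to order.</p>

<!-- On /oak-tables/ itself, the title tag confirms the connection -->
<title>Handmade Oak Tables | Example Store</title>
```

When the anchor text, the link’s title attribute and the destination page’s title tag all agree, the spider has no reason to suspect misdirection.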
Another means of licitly exploiting the limitations of search engine spiders is to identify pages that should not be crawled. You don’t want spiders crawling the back office and posting your payroll records as part of the presentation layer, so you put up a “KEEP OUT” sign on the pages you want left uncrawled – and consequently unindexed by the search engine.
Now, spiders are programmed to be suspicious, and too many “Keep Out” signs will set off alarm bells. Unscrupulous site owners employ this tactic to hide hype for a product or service from the spider’s view. Those obnoxious, long-form sales letters, for example, can be excluded simply by informing the passing crawler that the page is off limits.
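The standard “KEEP OUT” signs are a robots.txt file at the site root and the robots meta tag on individual pages. A minimal sketch, with hypothetical paths:

```text
# robots.txt: ask all crawlers to skip these directories
User-agent: *
Disallow: /back-office/
Disallow: /sales-letter/
```

Or, on a single page, a meta tag in the head asks engines not to index it or follow its links:

```html
<meta name="robots" content="noindex, nofollow">
```

Note that these are requests, not locks – well-behaved crawlers honor them, but they are no substitute for real access control on genuinely sensitive pages.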
Bots never see the site skin. They’re incapable of ‘reading’ or indexing graphic elements, including pictures, charts, graphs, Flash animations and other non-text elements. Further, search engine bots won’t know that a body of text is associated with the picture next to it, e.g. product descriptions and product pictures. (Bots are utterly without nuance or guile. They’re more like sledgehammers than calculators.)
You can use this limitation to your advantage by uploading text you don’t want spidered in a graphics format. For example, if you want to launch a trial balloon for a new product, you might not want that product spidered until your market testing is complete.
No problem. Create the text in Flash and upload it as a Flash file. Any graphics format will do – GIF, JPEG, etc. Visitors will see the text, but spiders will just “know” there’s some kind of graphic in that location.
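A sketch of the difference, with hypothetical file names: the first paragraph is live text a spider can index; the second is the same words baked into an image, which a spider cannot read.

```html
<!-- Indexable: spiders read this text -->
<p>Introducing the new WidgetPro 3000 – available this fall.</p>

<!-- Not indexable: the announcement exists only as pixels -->
<img src="widgetpro-announcement.gif" alt="">
```

Keep in mind that filling in the alt attribute would hand the text right back to the spider – which cuts both ways: use alt text for images you want indexed, and leave it empty for text you’re deliberately hiding.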
A couple of cautions here. First, even though bots can’t read graphics files, that doesn’t mean you can or should mislead site visitors with useless or deceitful information disguised as a graphic. These “black hat” tactics eventually catch up with site owners who are less than straightforward with visitors and spiders.
Caution number two: because bots can’t assess graphics, any text in a graphic format is invisible to spiders. That means information critical to accurate search engine indexing may go unread, leaving a site that is only partially indexed, misclassified within the index or, worst-case scenario, banned from the search engine for perceived infractions. Then the only thing you’ll see coming through your site is tumbleweeds.
Structure Your Site Skin for Humans and Your Site Code for Spiders
Functionally, that’s the difference between SEO (search engine optimization) and SEM (search engine marketing). But to achieve online commercial success, you need to know both: your buyers, with their likes and dislikes, and spiders, with what they look for as they crawl the code underlying your site.
This may sound like a simple task, but it isn’t always. For example, web frames are useful to humans because they speed up interactivity, but frames can also corral spiders, trapping them and preventing them from reporting your site data back to home base accurately.
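One common defense, sketched below with hypothetical file names, is to give a framed page a noframes fallback containing crawlable text and links, so a spider that can’t follow the frameset still has something to index:

```html
<frameset cols="25%,75%">
  <frame src="nav.html">
  <frame src="content.html">
  <!-- Spiders (and frame-less browsers) see this instead of the frames -->
  <noframes>
    <p>Browse our <a href="content.html">product catalog</a>
    or use the <a href="nav.html">site map</a>.</p>
  </noframes>
</frameset>
```

The fallback gives the crawler a plain-text path into the same content the frames present to human visitors.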
In an ideal, digital world, spiders would be more intuitive, more refined in their search and indexing skills, and able to distinguish honest site owners who use the limitations of bots from the scammers who abuse these same limitations.