While it is impossible to know the underlying algorithms that produce search results in major engines like Google, Yahoo and MSN, I often try to put myself in the spider's shoes, so to speak. If you can visualize your website the way a search engine might "see" it, then you can make adjustments and tweaks that will help your site rank well.

A conversation with Google
A conversation with Google's automated crawling software (or "spider"), otherwise known as the Googlebot, might sound something like this.
You: Excuse me, Googlebot, why doesn't mysite.com rank well in Google for the keyword "help me"?
Googlebot: (raises a harried eyebrow and looks annoyed) Where shall I begin? First of all, your code is a mess. You have more lines of code than actual text and so many nested tables it makes my head spin.
Your home page has a keyword density of 24 percent, which is suspiciously high compared with the top-ranking sites in my database, all of which average about seven percent. Are you keyword stuffing? You know I don't like doorway pages!
You only have 12 backlinks going to your home page that I recognize, and six of them are from within your domain. The top ten sites have an average of 300 backlinks in my database and literally thousands of backlinks in Yahoo and MSN (not that I care about those hacks).
I've slapped you with a duplicate content penalty because I noticed that www.anysite.com has the same exact home page copy as you. Don't look so surprised - I don't care whose fault it is! On average, your site is 60 percent slower to download than every other site in my database and all your dynamic URLs are giving me a headache. Honestly, do you really need so many variables?
You don't have a site map so I can't easily crawl through the pages of your site, and all of your navigation is represented in images without meaningful ALT tags, so I don't know where I am when I click away from the home page. Your link partners are abysmal - they are not contextually relevant (which makes me suspicious) and you repeat the same exact words in the linking text, which makes me think you're doing automated link swapping. I've been here three times in the past month and your content has not been refreshed once. I can't be bothered with you and your stale, over-optimized content. I will be back to crawl you again sometime this century.
You: (sobbing)

So you've been dismissed by the Googlebot. Get yourself a pint of Rocky Road and join the club.
SEO Tools that Can Help You
My theoretical response from Googlebot is based on a combination of the things I look at as an SEO and the tools that are freely available online to help me analyze a site. Google's assessment of your site is obviously proprietary, but there are certain things you can look for when your site is in trouble, or when you want a better ranking on Google. These matters are fairly common knowledge in SEO circles. Let's break down the response a little.
Your code is a mess
You have a lot of code compared with actual text (e.g., nested tables, JavaScript)
Your keyword density is high compared with your competitors
You're keyword stuffing
Your home page looks like a doorway page
You have fewer backlinks than your competitors
You have poor link partners
You're linking to a site that's banned
Your backlink text is repetitive
You have no fresh content
You have duplicate content
Your site is slower to download compared to your competitors
You have dynamic URLs
You don't have a site map
Your navigation is image-based
You have no ALT tags or meaningless ALT tags
The above list represents an amalgamation of variables that can affect your positioning in Google. It does not represent the full list of search engine faux pas that can be committed by unwary or unknowing webmasters (e.g., frames and Flash are not mentioned here). It's a good start, though. Simply diagnosing the problem is half the battle toward getting better rankings, and all of the above information is freely available using tools that are either Web-based or part of your browser software.
Problems:
Your code is a mess
You have a lot of code compared with actual text (e.g., nested tables, JavaScript)
Google doesn't see your Web page the way you do. Google sees the code. Most browsers have a function that allows you to view the source code of the page at which you are looking. Internet Explorer and Firefox, for example, enable you to right click on the page and "view source." Pick a spot on any Web page and give it a try (make sure the mouse pointer isn't on an image).
Not too pretty, is it? Code that is messy or profuse can hinder your search positioning. A good way to clean it up is via HTML Tidy, an open source program created by Dave Raggett and available via download from Sourceforge.net (http://sourceforge.net/projects/tidy). HTML Tidy cleans up the code produced by WYSIWYG editors or poor coders (like myself), and it's completely free.
When viewing HTML code you'll also want to evaluate the quantity of code versus actual text. Search engines like Google seem to put more weight on keywords the higher they appear in the HTML document. If your text is buried under hundreds of lines of code, you'll be at a disadvantage compared to the top-ranking, well-optimized websites that compete for your keyword. There are many ways around this; first and foremost is to choose your programming language wisely. I'm not a programmer, so I can't recommend the best language to use for SEO; I can only flag it as something to consider when analyzing your Web page.
Here is a tool that simulates what a spider "sees" when it visits your site: http://www.stargeek.com/crawler_sim.php. If you're not seeing a lot of text when you enter your Web page's URL, then neither is the search engine spider. It's time to add some.
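If you want to go a step beyond the Web-based simulator, the same check can be approximated with a short script. Here's a minimal sketch in Python (my own illustration, not one of the tools above) that fetches a page, strips the markup the way a spider might, and reports how much of the download is actual text. The URL is a placeholder; point it at your own page.

from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    """Collects the visible text a crawler would index, skipping markup."""
    def __init__(self):
        super().__init__()
        self.in_ignored = False   # True while inside <script> or <style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_ignored = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.in_ignored = False

    def handle_data(self, data):
        if not self.in_ignored and data.strip():
            self.chunks.append(data.strip())

source = urlopen("http://www.example.com/").read().decode("utf-8", "ignore")
parser = TextExtractor()
parser.feed(source)
text = " ".join(parser.chunks)

# Rough text-to-code ratio: how much of the download is indexable copy?
ratio = len(text) / len(source) * 100
print(f"Visible text: {len(text)} of {len(source)} characters ({ratio:.1f}%)")

A low percentage is the script's way of telling you what the simulator shows visually: lots of code, not much copy.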
Problems:
Your keyword density is high compared with your competitors
You're keyword stuffing
Your home page looks like a doorway page
The above three problems are related. If your keyword density is too high, Google may interpret this as a spam tactic called "keyword stuffing." Likewise, Google may interpret a page with very high keyword density as a doorway page. A doorway page sticks out to Google in that it is optimized for a number of terms that are only loosely connected, or not connected at all, to a site's main theme.
The best way to find out whether your keyword density is too high compared to your competitors is through a keyword density analyzer tool. I use GoRank.com or SEOChat.com's own keyword density tool to analyze the top ten ranking pages in Google for my desired keyword. I generally take an average of the keyword density of the top page and compare it to my own page. If my page is much higher than the top-ranked pages, I will revise the copy and tags (ALT, Title, Meta) and tone down the frequency of the keyword in question.
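The arithmetic behind those density tools is simple enough to sketch yourself. Here's a minimal Python version (my own illustration) that handles a single-word keyword and plain text only; the real tools also weigh Title, META and ALT text, and the sample copy below is just a placeholder.

import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that match `keyword` exactly."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words) * 100 if words else 0.0

page_text = "help me help you help me again"   # placeholder copy
print(f"Density of 'help': {keyword_density(page_text, 'help'):.1f}%")

Run the same calculation against your page and against the top-ranked pages, then compare the numbers, just as described above.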
Problems:
You have fewer backlinks than your competitors
You have poor link partners
You're linking to a site that I've banned
Your backlink text is repetitive and/or bad
You have no fresh content
Google is the best tool to use to diagnose the above problems. The Google "link:" operator allows you to check your backlinks and evaluate the sites that link to your page. You can tell whether Google has banned a site if its URL is not in the index at all; use the "site:" operator to check.
You probably know whether the content on your site is fresh or not, but if you want to know what Google thinks, then click on the "cache" link next to your listing to see the last time Google paid your site a visit. If it was over a week ago, Google got bored and wandered to greener content pastures. It's time to add some new content. You can also use the "cache:" operator to get cache information. Here's a complete list of Google's operator commands (what they mean and how to use them). You can also download and utilize the Google Toolbar to check PageRank and view your backlinks.
Google may show only a handful of backlinks when you actually have thousands. The reasons for this are not entirely certain, though it may have to do with how Google weighs each incoming link in terms of popularity and/or relevancy. With this in mind, I recommend using one of the free link popularity tools available online. A couple of my favorites include the link popularity tool on Mikes-Marketing-Tools.com, MarketLeap's Link Popularity Checker and SEOChat.com's own tool to evaluate link popularity. If you have a lot of backlinks, it will quickly get tedious to read all the link text and check for duplication in language. The best tool I've found for this is SEO Elite, which isn't free but will save you hours of time (and time is money, folks!).
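If you'd rather eyeball the repetition problem yourself before buying a tool, a few lines of Python will tally anchor text for you. The anchor list and the 50 percent threshold below are placeholder assumptions; in practice you would paste in the link text gathered from a backlink report.

from collections import Counter

anchor_texts = [
    "cheap widgets", "cheap widgets", "cheap widgets",
    "widget reviews", "cheap widgets", "best widget shop",
]

counts = Counter(anchor_texts)
total = len(anchor_texts)
for text, n in counts.most_common():
    share = n / total * 100
    flag = "  <- suspiciously repetitive?" if share > 50 else ""
    print(f"{text!r}: {n} links ({share:.0f}%){flag}")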
Problems:
You have duplicate content
Your site is slower to download compared to your competitors
You have dynamic URLs
You don't have a site map
Your navigation is image-based
You have no ALT tags or meaningless ALT tags
The above is a miscellaneous list of problems that can be diagnosed as follows. Check CopyScape for duplicate content, or perform a search for an exact line of text from the page you are evaluating. Alexa.com will tell you how fast your website downloads compared with others competing for your key term (assuming you are in the Alexa database). You probably know whether your site uses dynamic URLs, but if you're not sure, click into an interior page and check for odd characters in the URL, such as question marks or equals signs; any browser will show you the URL string of a particular page on your site. Google has been indexing dynamic URLs, but if the string is particularly long and the variables particularly profuse, Google may not index the entire site as well as it would if the URLs were search engine friendly and/or contained fewer variables.
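The eyeball test for dynamic URLs can also be automated. Here's a minimal Python sketch that counts query-string variables using only the standard library; the URLs and the three-variable cutoff are placeholder assumptions for illustration.

from urllib.parse import urlparse, parse_qs

urls = [
    "http://www.example.com/products/blue-widget.html",
    "http://www.example.com/catalog.php?cat=7&id=1432&sid=9f3a&sort=price",
]

for url in urls:
    params = parse_qs(urlparse(url).query)
    if len(params) >= 3:
        print(f"{url}\n  -> {len(params)} variables; consider a rewrite")
    else:
        print(f"{url}\n  -> looks spider-friendly")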
A site map is self-explanatory. It's a page that lists links to all the pages of your site. If you don't have one, create one so that Google can find all of your relevant pages easily.
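If your site is a folder of static HTML files, you can even generate the site map with a short script rather than maintain it by hand. Here's a rough Python sketch; the folder name and output file are assumptions for illustration.

from pathlib import Path

site_root = Path("public_html")          # assumed local copy of the site
links = []
for page in sorted(site_root.rglob("*.html")):
    href = page.relative_to(site_root).as_posix()
    links.append(f'<li><a href="/{href}">{href}</a></li>')

sitemap = ("<html><body><h1>Site Map</h1><ul>\n"
           + "\n".join(links)
           + "\n</ul></body></html>")
Path("sitemap.html").write_text(sitemap)
print(f"Wrote sitemap.html with {len(links)} links")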
If you use images for all of your navigation and don't assign meaningful ALT tags to them, a site map is especially critical. Googlebot can't read images; it just sees code. If you scroll over a navigation image and no text appears, you have not assigned an ALT tag to that image. You can also view the source code and review your images that way. Assigning meaningful ALT tags to images helps with usability as well as search engine friendliness (for people with slow connections or browsers that have images turned off, for example), though the best-case scenario is to use text-based navigation in place of image-based navigation.
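Scrolling over every image gets old quickly on a large site, so here's a minimal Python sketch that flags img tags with missing or empty ALT text; the sample HTML at the bottom is a placeholder.

from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Prints the SRC of any <img> whose ALT is missing or empty."""
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            alt = (attributes.get("alt") or "").strip()
            if not alt:
                print("Missing/empty ALT:", attributes.get("src", "?"))

AltChecker().feed('<img src="nav-home.gif"><img src="logo.gif" alt="Acme logo">')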
Conclusion
The tools available to help you analyze your search engine friendliness are profuse and often free. This article just scratches the surface of what's out there. Read the forums to see what the experts use, and try the tools yourself to find your favorites. Proper diagnosis of search engine friendliness is the building block of a comprehensive, competent search engine optimization strategy that will give you an edge over the competition.
Keep in mind that while it is helpful to approach SEO from the search engine's perspective, you are not writing for the search engines. You are writing for your visitors. So don't overdo it.
