Analysis: desktop CPUs in 2007

What a difference a year makes. One year ago, we were dazed, dazzled, and beguiled by the arrival of dual-core processors. Offerings from Intel and Advanced Micro Devices had analysts, journalists, creative IT professionals and enthusiasts all gushing with praise for a bright new multitasking future.

Amazingly, both Intel and AMD were able to deliver on the potential of dual-core processing. Throughout 2006, desktop PCs played host to a series of processors that, while slower at the clock-speed level, were faster in real-life usage, allowing for unprecedented amounts of multitasking.

As the calendar flips to 2007, we are firmly entrenched in the world of multicore processors. And, based upon the confidential road maps of both Intel and AMD, it is clear that dual-core CPUs are only the launching point for the future of the microprocessor. In 2007, quad cores and even eight-core CPUs will be available. By 2009, there's a good chance that sixteen-core processors will be on the market.

As we enter 2007, five key questions regarding the pending year's CPU battle are on our minds:

1. Will AMD be able to continue its dominance in the desktop market?
2. How will Intel capitalize upon the success of Core 2?
3. Will AMD be able to match the success of Intel's Core 2 processors?
4. When will the market see true quad-core and even eight-core processors?
5. What surprises do the chip makers have up their sleeves?

With all this in mind, we're taking an extended look at the processors and processor trends you can expect to see in 2007. Not surprisingly, neither AMD nor Intel was willing to divulge many specifics regarding their CPU releases for the coming year. So we scoured the Net, pored over statements from both companies and dug into reports from the host of analysts and experts who cover them.

It's worth noting that much of the information in this road map is preliminary and code-name-level information. As such, the specifics of the processors could change in coming months.

All secrets are revealed within.

Intel advances

Extensive digging has revealed a good portion of Intel's plan for increasing desktop market share in the coming year. Not surprisingly, the bulk of the company's processor road map revolves around the Core microprocessor architecture, formerly code-named "Merom." One of the smashing success stories of 2006, Core 2 processors offer unparalleled levels of performance per watt of energy consumed and may allow Intel to recapture market share lost to AMD over the past three years. (Core 2 processors are based on the new Core architecture; the earlier so-called Core processors, the Core Duo and Core Solo, were based on the company's previous Pentium M architecture.)

In an attempt to round out its desktop CPU portfolio in the first half of 2007, Intel will focus on several new processor families based on the Core 2 architecture at every performance level, including a new value line that brings Core 2 down to the Celeron tier. Here are the details.

Early 2007 brings new Core 2 processors

At the high-end performance level, Intel will release three new quad-core CPUs at the beginning of the year, dubbed the Core 2 Quad Q6600, Q6400 and Q6300. These will be dual-die, quad-core processors, meaning that each will essentially be two dual-core Core 2 dies joined together in a single package.

Scheduled for release in the first week of January, the Q6600 will have a clock speed of 2.4 GHz, the Q6400 will have a clock speed of 2.13 GHz, and the Q6300 will operate at 1.86 GHz. Each processor will operate on a 1,066-MHz front-side bus and have 8MB of total Level 2 cache, with 4MB of shared cache on each die. (A large L2 cache allows for faster retrieval of frequently accessed data, thereby speeding up overall system performance.)

In the first half of 2007, Intel will also release a new series of Core 2 Duo processors aimed at the midrange market. These dual-core, single-die processors will reside in the newly introduced Core 2 Duo E4000 series, and the initial release will consist of three CPUs: the 2-GHz E4400, the 1.8-GHz E4300 and the 1.6-GHz E4200.

This category of CPUs will operate on an 800-MHz front-side bus and will likely come with a 2MB shared L2 cache. The E4300 will be the first processor in this family released and could be in desktop PCs as soon as February. It is widely expected that E4000 processors will come with virtualization and 64-bit support.

Finally, in an attempt to make significant inroads in the value CPU sector -- one that has traditionally been dominated by AMD -- Intel is trickling its Core 2 CPU line down to the low-cost market. Intel has not yet made it clear whether these processors will be single-core versions of the Core 2 Duo or dual-core chips with one core disabled.

In the second quarter, Intel plans to release a number of processors in this value category. Around this same time, the chipmaker will probably phase out the Pentium 600 series, specifically the Pentium 4 651, 641 and 631.

To avoid confusing CPU buyers, Intel will use the Pentium and Celeron brand names for these new CPUs, even though they are based on the Core architecture.

In the Pentium bracket, we'll see releases of the E1060, E1040 and E1020. The E1060 will have a clock speed of 1.8 GHz, the E1040 will run at 1.6 GHz, and the E1020 will run at 1.4 GHz. Each will have 1MB of L2 cache with a front-side bus speed of 800 MHz. While these processors will support Intel's 64-bit extensions, none of the E1000 line will support virtualization or Hyper-Threading, a technology that allows single-core CPUs to behave as if they were dual-core ones.

In the Celeron bracket, CPU buyers will likely see a wide range of clock speeds. At press time, no specific model numbers or clock speeds were available, but it appears that this series of processors will be the Celeron 400 series and that these processors will have 512KB of L2 cache. It is not clear whether these processors will support 64-bit extensions, virtualization or Hyper-Threading.

'Bearlake' chip set boosts front-side bus speeds

As Intel shifts to multicore processing, the bus speed becomes a more pressing concern because of the increased volume of data traffic generated by separate CPU cores. The front-side bus (FSB) is the primary channel of data communication between the CPU and other devices on the system, such as RAM and hard drives. It's essentially a single-lane highway with limited bandwidth. As CPU manufacturers stack more processing cores onto a single processor, the risk that this data channel will become full increases, hence the need for faster FSB speeds.
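The arithmetic behind that concern is simple. As a rough sketch only (it assumes the standard 64-bit-wide Intel FSB and treats the quoted MHz figures as effective, quad-pumped transfer rates), peak bus bandwidth scales directly with the FSB speed:

```python
# Back-of-the-envelope peak FSB bandwidth. Assumes the standard 64-bit-wide
# (8-byte) Intel front-side bus and treats the quoted "MHz" figures as
# effective (quad-pumped) transfer rates.
BUS_WIDTH_BYTES = 8

def peak_fsb_bandwidth_gbps(effective_mhz: float) -> float:
    """Theoretical peak bandwidth in GB/s for a given effective FSB speed."""
    return effective_mhz * 1_000_000 * BUS_WIDTH_BYTES / 1e9

for mhz in (800, 1066, 1333):
    print(f"{mhz}-MHz FSB -> ~{peak_fsb_bandwidth_gbps(mhz):.1f} GB/s, shared by all cores")
```

Because that bandwidth is shared by every core in the package, each additional core shrinks the per-core share, which is one reason the faster buses discussed below matter.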

Thus, one of the most significant releases Intel will make in 2007 is a brand-new chip set foundation code-named "Bearlake." This chip set is the successor to the 975X chip set and will feature a number of upgrades and improvements. The P35 Express will be released first in the second quarter of 2007 and will feature two key upgrades: an all-new 1,333-MHz FSB and support for DDR2-800 and DDR3-1066 memory.

Intel recently announced official names for the first wave of Bearlake chip sets. The G35 and G33 monikers will be attached to mainstream consumer desktop chip sets. The G35 chip set will feature an integrated DirectX 10-compatible graphics processor.

The P35 Express and X38 Express will be Intel's performance-oriented, high-end versions of Bearlake. The X38 will feature the same 1,333-MHz FSB and DDR2-800/DDR3-1066 memory support found in the P35 Express, and it will also feature two PCI Express x16 slots and PCI Express 2.0, which is twice as fast as PCI Express 1.0 (a 5-GT/s signaling rate, compared with 2.5 GT/s).

1,333-MHz front-side bus CPUs by midyear

At the same time it releases the Bearlake chip set described above, Intel will also release three speedy new Core 2 processors that are compatible with the Bearlake chip set's 1,333-MHz FSB and other new features. The model numbers of these processors are the E6850, E6750 and E6650. (The "50" designator in the model number indicates an FSB speed of 1,333 MHz.)

The clock speeds of the E6850, E6750 and E6650 will be ratcheted up to 3 GHz, 2.66 GHz, and 2.33 GHz, respectively. For reference's sake, 3 GHz is the current high mark for Intel's Core 2 CPUs and can currently be found in only one Core 2 processor -- the Extreme X6800. Each of these new Core 2 CPUs is a dual-core, single-die processor that utilizes 4MB of shared L2 cache.

At the same time it releases the E6850, E6750 and E6650, Intel will also release a non-Bearlake CPU -- the E6800. The E6800 will have a clock speed of 3 GHz and a 4MB shared cache, but will run at a bus speed of only 1,066 MHz.

The road to 45nm begins here

Initially, based upon statements Intel released in the middle of 2006, it appeared that the company would be fairly conservative in terms of CPU releases in 2007. Its stated goals for the year were to release several new processors based upon the Core 2 architecture while focusing its technology and design priorities on improving its fabrication process, with the aim of producing 45nm-process CPUs by 2008.

However, toward the end of 2006, Intel indicated that it was ahead of schedule for reaching a 45nm fabrication process. In late November, the company stated that it had already produced a prototype of a 45nm processor, and that it was now hoping to release 45nm processors by the second half of 2007.

Why the emphasis on the shift to 45nm? Beyond the simple metrics of cost -- a 45nm process is smaller than today's state-of-the-art 65nm process, which allows for more CPUs to be manufactured per wafer of silicon -- smaller fabrication processes allow for performance boosts via shorter distances for electrons to travel, faster clock speeds, larger cache sizes and reduced energy consumption.
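To put a number on the "more CPUs per wafer" claim, here is the idealized arithmetic only; it assumes a perfect linear shrink and ignores yield, I/O pad area and design changes:

```python
# Idealized die-shrink arithmetic: assumes a perfect linear shrink and ignores
# yield, I/O pad area and any microarchitectural changes.
def area_fraction(old_nm: float, new_nm: float) -> float:
    """Fraction of the original die area after a linear process shrink."""
    return (new_nm / old_nm) ** 2

shrink = area_fraction(65, 45)
print(f"Relative die area at 45nm: {shrink:.2f} of the 65nm die")
print(f"Dies per wafer (ideal):    roughly {1 / shrink:.1f}x as many")
```

In other words, an ideal shrink from 65nm to 45nm cuts die area roughly in half, so a wafer could in theory yield about twice as many dies; real-world gains depend on yields and design changes.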

Another important reason behind the frenzied pursuit of 45nm CPUs: Intel wields a significant advantage over AMD when it comes to fabrication process technology. AMD just released its first 65nm processors in December 2006, and it is not expected to move to a 45nm process until mid- to late 2008. Given the theoretical performance-per-watt advantages that 45nm-process CPUs will possess, Intel's rapid transition to 45nm could place considerable pressure on AMD to catch up.

Enter the 'Penryn' processors

Intel's 45nm process will manifest itself in a microprocessor architecture known only by the code name "Penryn." Not surprisingly, Intel has kept a fairly tight lid on Penryn, but based on rumors and speculation by analysts and experts, it appears that these processors will be based on the Core 2 architecture but will take advantage of the 45nm process to provide larger L2 caches and increased performance. (It's worth noting that Penryn will also serve as Intel's mobile processor architecture, with laptop CPUs scheduled for release in early 2008.)

In terms of specific processor releases, Computerworld has heard of a few Penryn-based CPUs that should be released in late 2007. Two dual-core, single-die processors known as "Ridgefield" and "Wolfdale," respectively, could be released as early as the third quarter of 2007. There has been no concrete information regarding the clock speeds of these two processors, but reliable early information has indicated that the Ridgefield processor will have 3MB of shared L2 cache, while the Wolfdale variant will have 6MB of shared L2 cache.

One of Intel's most potentially exciting desktop CPUs is code-named "Yorkfield" and appears to be a 45nm-process quad-core processor that uses a single die (referred to as "native" quad-core) and has an astonishing 12MB of shared L2 cache. When combined with the performance-per-watt advantages of the 45nm process, this could be Intel's extreme high-end CPU of the year if it is released on schedule at the end of Q3 or the beginning of Q4 2007.

Finally, although the company has not confirmed this in any way, it's entirely plausible that Intel could combine two Yorkfield processors at the end of 2007 to create an octo-core, dual-die, 24MB L2 cache monster.

All the Penryn processors described above will be compatible with Intel's new Bearlake chip set.

As a teaser to what may come beyond 2007, rumors have swirled around a future-gen Intel microprocessor architecture code-named "Nehalem" that will be released in 2008. No details on this architecture have been revealed to date.

AMD battles back (and takes the eight-core lead)

At the beginning of last year, AMD was the CPU darling in terms of performance and the all-important price-performance ratio. This run of dominance ended with a thud in the summer of 2006 when Intel released its stunning new Core microprocessor architecture. Based on a highly efficient 65nm fabrication process -- a process AMD just reached at the end of 2006 -- this new architecture produced results that swiftly relegated AMD CPUs to also-ran status. Much to AMD's chagrin, benchmark result after benchmark result declared Core 2 processors the winners.

Interestingly, while AMD was left scrambling to keep up with Intel on the performance and performance-per-watt fronts for both desktop and laptop CPUs, the company experienced one of its best years ever. In January 2006, reports indicated that AMD CPUs were dominating market share on PC desktops to the tune of 85% to Intel's 15%. Even longtime Intel stalwart Dell Inc. got into the movement, inking a deal to use AMD CPUs in some Dell PCs.

But this success was largely fueled by price-performance advantages that existed prior to the release of the Core 2 Duo line. How will AMD respond in the coming year to what appears to be a clear technology advantage on Intel's part?

Part of the answer to this question appears to reside outside of the realm of CPUs. In July, AMD announced a whopper of an acquisition as it took over venerable graphics and chip set manufacturer ATI Technologies Inc. It's not likely that this acquisition will have a significant impact upon AMD's 2007 CPU forecast above and beyond the growing pains and distractions that a large acquisition can create.

Read on to find out the details about AMD's push to a smaller fabrication process, an all-new Socket AM3 and the alluring potential of eight- and sixteen-core processors.

The push to 65nm

One of the chief advantages Intel wields over AMD is the ability to deploy new technology at a more rapid pace. This was made clear in the early part of 2006, when Intel pushed out a 65nm series of processors many months ahead of AMD.

In December, AMD finally caught up with the release of four new 65nm dual-core processors in its X2 line: the Athlon 64 X2 5000+, 4800+, 4400+ and 4000+. These processors operate at clock speeds of 2.6 GHz, 2.5 GHz, 2.3 GHz and 2.1 GHz, respectively. Each has 1MB of total L2 cache (512KB per core) and support for AMD's virtualization and 64-bit technologies.

In the second quarter of 2007, AMD will release two more 65nm processors at the high end of this product series. The X2 5200+ will run at 2.7 GHz, while the 5400+ will operate at 2.8 GHz. Like the rest of the processors in this lineup, the 5400+ and 5200+ will support virtualization and 64-bit technology.

It's highly likely that AMD will release more 65nm processors into this lineup throughout the year. In the second half of 2007, and possibly sooner, buyers and systems integrators will likely see 65nm X2 5600+, 5800+ and 6000+ parts, as well as 65nm conversions of lower-end CPUs such as the 3800+.

As soon as January or February, AMD will also release single-core 65nm Athlon processors. Code-named "Lima," these chips will be introduced as the Athlon 64 3800+ and 3500+. In the second quarter of 2007, AMD will release an Athlon 64 4000+ CPU on this same 65nm process. All of these processors will have 512KB of L2 cache.

Also in the second quarter, AMD plans to release four single-core Sempron processors fabricated on the new 65nm process: the 2.2-GHz Sempron 3800+, the 2-GHz 3600+, the 1.8-GHz 3500+ and the 1.8-GHz 3400+. These processors will have 256KB of L2 cache, with the exception of the Sempron 3500+, which will have only a 128KB cache.

The high end: Quad FX rumbles in

In early December of 2006, AMD released three new performance-oriented processors -- the 3-GHz Athlon 64 FX-74, the 2.8-GHz FX-72 and the 2.6-GHz FX-70 -- under the chipmaker's newly introduced Quad FX line. Based upon a new dual-socket Socket 1207 motherboard and AMD's enterprise-class Opteron CPU architecture, Quad FX processors are purchased in pairs, one per socket. Early performance benchmarks have indicated that these CPUs are indeed suitable for the "megatasking" environments AMD has constructed them for.

In 2007, AMD will continue to build out this Quad FX line with the Q2 release of the Athlon 64 FX-76. The FX-76 will have a clock speed of 3.2 GHz and 1MB of L2 cache per core. This, like the FX-74, FX-72 and FX-70, will be fabricated on the older 90nm process. AMD has indicated that these will be the last of the FX CPUs to be built on a 90nm process.

65nm 'Agena' makes its debut

In early Q3 2007, AMD is planning to release a brand-new performance-oriented 65nm CPU architecture code-named "Agena," and it sounds like a high-performance dream. This new processor line will be the first "native" quad-core processor released by either AMD or Intel. (When used with a multicore processor, the term native refers to a processor with all the individual CPU cores integrated on a single die. To date, all previous quad-core processors have essentially been two dual-core processors attached together.)

One other impressive attribute of the Agena FX processor is that it will operate at a bus speed of 4 GHz, thanks to the 3.0 iteration of AMD's HyperTransport link that will debut at the same time. This doubles the bus speed of previous FX and other Athlon 64 processors. The Agena FX quad core will feature 2MB of total L2 cache and 2MB of shared L3 cache. (L3 cache functions in a similar manner to L2 cache, but it's a little slower and consequently less expensive.)

Preliminary information has revealed that Agena FX processors will run at clock speeds between 2.7 GHz and 2.9 GHz. It's likely that we'll see two or three different Agena FX processors when they're released, possibly under Quad FX-8x model numbers.

It's too early to say for sure, but the native single-die nature of these CPUs and the shift to 65nm should result in a massive performance boost. One other interesting attribute of the Agena FX processors is that using them with the Quad FX platform, which uses two CPU sockets, will likely allow AMD to be the first chipmaker to release an eight-core platform. By the end of the summer, high-end enthusiasts will be able to run two Agena FX processors at the same time.

CPUs for the masses: Socket AM2+ and Kuma

AMD's rapid embrace of quad-core processing at the high end of CPU performance does not mean that the chipmaker is leaving mainstream dual-core computing out in the cold.

In the middle of 2007, AMD will revise Socket AM2 to increase energy efficiency and bus speed. Currently scheduled for release at the end of Q2 2007, this revision will be named Socket AM2+.

Finally, in Q3 2007, AMD will release a new series of 65nm native dual-core processors aimed squarely at the mainstream consumer market. Currently code-named "Kuma," these processors, which emphasize low power consumption and high performance per watt, will operate at clock speeds from 2 GHz to 2.9 GHz and will contain 1MB of total L2 cache and 2MB of shared L3 cache.

These processors will be compatible with the all-new Socket AM2+ and as such will feature bus speeds of 4 GHz. At press time, AMD had yet to reveal model names or numbers for Kuma-based CPUs.


Maybe this is what Captain Obvious would say, but it gives me great joy to report that the Xbox 360 has surpassed both the Nintendo Wii and the Sony PS3 in terms of sales.

What it lacks in other areas, Microsoft makes up for in marketing and bundles, and that is how the 360 registered larger sales than both of its competitors. Of course, you could argue that the PS3 and Wii being nowhere to be found in stores was the main reason for this increase in Xbox sales, but who is to blame for that, if not Sony and Nintendo?

As msn.com reports, Microsoft Game Studios Canada had a special Xbox 360 offer to counteract the effect of the Wii and PS3 launches: three games, a 20GB Xbox 360 and an extra controller for just US$449.99. Not to mention Amazon's US$100 offer on an Xbox Core package.

The release of Gears of War also helped, but we have to give this one to Microsoft for knowing exactly when to react. Keep in mind that the Xbox 360 was released almost a year ago, so we can’t consider it “hot property”. During a time when the whole U.S. is in a Wii and PS3 frenzy, the Xbox 360 manages to outsell them all. Well done!



LinuxQuestions.org announced that Ubuntu has been chosen by its readers as the best Linux distribution of the year. The community site ran a poll in which 488 of the 2,504 total votes (19.05%) went to Ubuntu Linux. Next in line were Slackware, SUSE and, ironically, Debian, with 477, 330 and 265 votes respectively.

A few years ago, Red Hat would certainly have topped the poll; this time RHEL got only 34 votes (1.36%) and Fedora 235 (9.38%). Far behind, in last place, is Novell Linux Desktop with 6 votes.

Even though it's very young (less than two years old), Ubuntu managed to gather an impressive following. In November 2005, Ubuntu Linux 5.10 was named Editor's Choice for small businesses by ZDNet UK and Best Debian Derivative Distribution in a ceremony at the Linux World Expo in Germany. In October, it was awarded the Reader's Choice Award by Linux Journal and the Reader Award at the UK Linux & Open Source Awards dinner.

The Ubuntu team puts its success down to regular, predictable releases and out-of-the-box usability. Although businesses may be reluctant to adopt the distribution because its support options don't yet match those of larger vendors like Red Hat or Novell, this poll shows that it has already won over the home user.


Google has continued its round of acquisitions with the purchase of wiki site JotSpot.

Google's acquisition was announced on both the Google and JotSpot web sites on Tuesday. JotSpot and Google did not divulge the value of the completed deal.

Wikis allow users to edit and update web pages themselves. JotSpot's wiki allows users to create web-based spreadsheets, calendars, documents and photo galleries. The company claims that "thousands of businesses" use JotSpot to manage projects, build intranets, and share files with colleagues and customers.

Tuesday's purchase is the latest in a string of Google acquisitions in the Web 2.0 space. Google bought JotSpot competitor Writely, a hosted word-processing package, in March.

JotSpot was founded in 2004 by Excite.com co-founders Joe Kraus and Graham Spencer.


Search this, search that. All you hear about is search engines in the web marketing publications, forums and events. New owners of web sites are often told, "You're going to have to wait a year before you can rank in Google. There's a sandbox, you know. You're stuck using pay per click until your organic rankings pick up." BALONEY!

Before I get on too much of a hypocritical rant, I must admit that I am one of those search engine optimization practitioners who used to give the above advice. It's true for some types of sites, but for many IT IS NOT.

Marketing channels like social media marketing and new media press releases are generating an increasing amount of buzz, and there's a reason for that. There is a variety of alternative channels, some new and innovative, others older and tried and true. Here's a list of five:

  1. Social Networking - Build a social network of influential bloggers and marketers. When you announce news to this group, they'll "sneeze it" to their networks and so on and so on.
  2. Ride the Digg Wave - Subscribe to relevant category RSS feeds and watch those that are rising. Be one of the first to make relevant comments if it's a blog and you'll benefit from all the traffic as that post hits the first page of Digg. Same goes for del.icio.us and slashdot.
  3. Tag it up - Watch your web logs for keyword trends in the referrer data and use a tool like HitTail to identify long tail phrases. Use these phrases as your Technorati tags and for the tags you use with social bookmarking and news sites like del.icio.us and Digg.
  4. New Media Press Release - The more media you have, the wider the net you can cast for capturing an audience. One of the best tools for distributing information with multiple media formats is the new media press release. Create a compelling news announcement, then record a 5-10 min interview (audio or video) talking about the announcement. Post the announcement details to your blog. Combine all this along with subscriptions and social bookmark options into a social media release and distribute through a savvy service like PRWeb. A strong call to action, a compelling landing page and metrics are key to the success of such a press release.
  5. Links - Yeah, that's right: links. Get links from every relevant source you can, whether it's a directory, a blog or a news web site. The great thing about links is that they can send traffic and influence your search engine rankings. But for this post, we only care about the traffic. How do you get the best links? Create something clever, useful or controversial. Create content that is really worth linking to and then use the ideas above to promote it. The best links come from others who recognize on their own that your resource is truly valuable.
There you go: five alternative ways to drive traffic to your web site without relying on search engines. Are there more? Of course there are. But that's what companies hire search engine optimization services for.

There’s a long-standing debate in the search marketing industry about links versus content. Which is better?

On the one hand there’s the perspective that if you create great content, people will link to you naturally. That’s true, but it’s a bit misleading.

On the other hand there are those that say links are the answer. You can get pages to rank well based purely on links. Again, that’s entirely possible, but such a statement does not give you all the facts.

When I read or hear people ask whether links or content are better, I liken such a question to asking, “What’s better, air or water?”. Links and content are both necessary for competitive search marketing efforts. Emphasizing one over the other depends on the situation. Excelling at both is the ideal.

The thing to understand about optimizing for search engines is that there are many ways to solve the visibility or ranking problem. There is no “one right way” to SEO. There are fundamental concepts that persist as being true, such as the need for a site to be crawler friendly and all content reachable via links, a logical site structure with relevant content and the need for inbound links from relevant sources. What differs over time and as the search engines update ranking methodologies is the execution.

The links attracted by wide distribution of quality content create a very desirable link “footprint” that is rewarded by search engines. To think this will happen naturally in any reasonable amount of time is shortsighted. Trying to create that link footprint automatically, on the other hand, is easily detected by search engines as manipulation. Unless you’re in the MFA (made-for-AdSense) and “churn and burn” business, automated linking solutions have no place in a search marketing program.

If you create great content and no one knows about it well enough to link to it, you’re spinning your wheels. Content combined with social networking, link networking, public relations, editorial visibility, and viral and individual link solicitations will work together synergistically. Building a community of consumers of your content, along with relationships with the media in your industry, is the distribution network necessary to get the most link value out of great content.

While it is impossible to know the underlying algorithms that produce the search engine results in the major engines like Google, Yahoo and MSN, I often try to put myself in the spider's shoes, theoretically speaking. If you can visualize your website the way a search engine might "see" it, then you can make adjustments and tweaks that will help your site rank well.

A conversation with Google
A conversation with Google's automated crawling software (or "spider"), otherwise known as the Googlebot, might sound something like this.
You: Excuse me, Googlebot, why doesn't mysite.com rank well in Google for the keyword "help me"?
Googlebot: (raises a harried eyebrow and looks annoyed) Where shall I begin? First of all, your code is a mess. You have more lines of code than actual text and so many nested tables it makes my head spin.
Your home page has a keyword density of 24 percent, which is suspiciously high compared with the top-ranking sites in my database, all of which average about seven percent. Are you keyword stuffing? You know I don't like doorway pages!
You only have 12 backlinks going to your home page that I recognize, and six of them are from within your domain. The top ten sites have an average of 300 backlinks in my database and literally thousands of backlinks in Yahoo and MSN (not that I care about those hacks).
I've slapped you with a duplicate content penalty because I noticed that www.anysite.com has the same exact home page copy as you. Don't look so surprised - I don't care whose fault it is! On average, your site is 60 percent slower to download than every other site in my database and all your dynamic URLs are giving me a headache. Honestly, do you really need so many variables?
You don't have a site map so I can't easily crawl through the pages of your site, and all of your navigation is represented in images without meaningful ALT tags, so I don't know where I am when I click away from the home page. Your link partners are abysmal - they are not contextually relevant (which makes me suspicious) and you repeat the same exact words in the linking text, which makes me think you're doing automated link swapping. I've been here three times in the past month and your content has not been refreshed once. I can't be bothered with you and your stale, over-optimized content. I will be back to crawl you again sometime this century.
You: (sobbing)

So you've been dismissed by the Googlebot. Get yourself a pint of Rocky Road and join the club.

SEO Tools that Can Help You


My theoretical response from Googlebot is based on a combination of things that I look at as an SEO, and tools that are freely available online to help me analyze a site. Google's assessment of your site is obviously proprietary, but there are certain things you can look for when your site is in trouble and/or if you want to get a better ranking on Google. These matters are fairly common knowledge in SEO circles. Let's break down the response a little.
- Your code is a mess
- You have a lot of code compared with actual text (e.g., nested tables, JavaScript)
- Your keyword density is high compared with your competitors
- You're keyword stuffing
- Your home page looks like a doorway page
- You have fewer backlinks than your competitors
- You have poor link partners
- You're linking to a site that's banned
- Your backlink text is repetitive
- You have no fresh content
- You have duplicate content
- Your site is slower to download compared to your competitors
- You have dynamic URLs
- You don't have a site map
- Your navigation is image-based
- You have no ALT tags or meaningless ALT tags
The above list represents an amalgamation of variables that can affect your positioning in Google. It does not represent the full list of search engine faux pas that can be committed by unwary or unknowing webmasters (e.g., frames and Flash are not mentioned here). It's a good start, though. Simply diagnosing the problem is half the battle toward getting better rankings, and all of the above information is freely available using tools that are either Web-based or part of your browser software.
Problems:
- Your code is a mess
- You have a lot of code compared with actual text (e.g., nested tables, JavaScript)
Google doesn't see your Web page the way you do. Google sees the code. Most browsers have a function that allows you to view the source code of the page at which you are looking. Internet Explorer and Firefox, for example, enable you to right click on the page and "view source." Pick a spot on any Web page and give it a try (make sure the mouse pointer isn't on an image).
Not too pretty, is it? Code that is messy or profuse can hinder your search positioning. A good way to clean it up is via HTML Tidy, an open source program created by Dave Raggett and available via download from Sourceforge.net (http://sourceforge.net/projects/tidy). HTML Tidy cleans up the code produced by WYSIWYG editors or poor coders (like myself), and it's completely free.
When viewing HTML code you'll also want to evaluate the quantity of code versus actual text. Search engines like Google seem to put more weight on keywords the higher they are in the HTML document. If your text is buried under hundreds of lines of code, then you'll be at a disadvantage compared to the top-ranking and well-optimized websites that compete for your keyword. There are many ways to get around this; first and foremost is to choose your programming language wisely. I'm not a programmer, so I can't recommend the best programming language to use for SEO. I can only flag this as an issue, as it is something to consider when analyzing your Web page for SEO.
Here is a tool that simulates what a spider "sees" when it visits your site: http://www.stargeek.com/crawler_sim.php. If you're not seeing a lot of text when you enter your Web page's URL, then neither is the search engine spider. It's time to add some.
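If you'd rather measure the code-to-text ratio yourself, a short script can approximate it. The sketch below uses Python's standard html.parser module and a placeholder example.html file; the ratio it reports is only a rough indicator, not anything Google has published:

```python
# Rough code-to-text ratio for a saved HTML page. The file name example.html
# is a placeholder; text inside <script> and <style> is excluded, since a
# search spider ignores it too.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

with open("example.html", encoding="utf-8") as f:
    html = f.read()

parser = TextExtractor()
parser.feed(html)
visible_text = " ".join("".join(parser.chunks).split())

ratio = len(visible_text) / max(len(html), 1)
print(f"Visible text: {len(visible_text)} of {len(html)} characters ({ratio:.0%})")
```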
Problems:
- Your keyword density is high compared with your competitors
- You're keyword stuffing
- Your home page looks like a doorway page
The above three problems are related. If your keyword density is too high, Google may interpret this as a spam tactic called "keyword stuffing." Likewise, Google may interpret a page with very high keyword density as a doorway page. A doorway page sticks out to Google in that it is optimized for a number of terms that are only loosely connected, or not connected at all, to a site's main theme.
The best way to find out whether your keyword density is too high compared to your competitors is through a keyword density analyzer tool. I use GoRank.com or SEOChat.com's own keyword density tool to analyze the top ten ranking pages in Google for my desired keyword. I generally take an average of the keyword density of the top page and compare it to my own page. If my page is much higher than the top-ranked pages, I will revise the copy and tags (ALT, Title, Meta) and tone down the frequency of the keyword in question.
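If you want to see the arithmetic behind those analyzers, keyword density is simply the number of occurrences of the phrase divided by the total word count. Here's a minimal sketch with made-up page copy; where the "safe" threshold lies is the analyzers' (and Google's) business, not something this script can tell you:

```python
# Keyword density = occurrences of the phrase / total number of words.
# The sample copy and the keyword "widgets" are made up for illustration.
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[\w']+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return hits * len(keyword.split()) / max(len(words), 1)

copy = ("We sell the best widgets. Our widgets are cheap widgets, "
        "and our widgets ship free.")
print(f"Density of 'widgets': {keyword_density(copy, 'widgets'):.0%}")
```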
Problems:
- You have fewer backlinks than your competitors
- You have poor link partners
- You're linking to a site that's banned
- Your backlink text is repetitive and/or bad
- You have no fresh content
Google itself is the best tool for diagnosing the above problems. The Google "link:" operator lets you check your backlinks and evaluate the sites that link to your page. You can tell whether Google has banned a site by checking whether its URL appears in the index at all; use the "site:" operator for this.
You probably know whether the content on your site is fresh or not, but if you want to know what Google thinks, then click on the "cache" link next to your listing to see the last time Google paid your site a visit. If it was over a week ago, Google got bored and wandered to greener content pastures. It's time to add some new content. You can also use the "cache:" operator to get cache information. Here's a complete list of Google's operator commands (what they mean and how to use them). You can also download and utilize the Google Toolbar to check PageRank and view your backlinks.
Google may show only a handful of backlinks when you actually have thousands. The reasons for this are not entirely certain, though it may have to do with how Google weighs each incoming link in terms of popularity and/or relevancy. With this in mind, I recommend using one of the free link popularity tools available online. A couple of my favorites include the link popularity tool on Mikes-Marketing-Tools.com, MarketLeap's Link Popularity Checker and SEOChat.com's own tool to evaluate link popularity. If you have a lot of backlinks, it will quickly get tedious to try to read all the link text to check for duplication in language. The best tool I've found to do this is SEO Elite, which isn't free but will save you hours of time (and time is money, folks!)
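That said, if you can export your anchor text to a plain file, a few lines of Python will flag the worst repetition for free; the anchors.txt file name below is hypothetical:

```python
# Count repeated backlink anchor text. Assumes a file named anchors.txt
# (hypothetical) holding one anchor-text string per line, e.g. pasted from
# whatever link report you already have.
from collections import Counter

with open("anchors.txt", encoding="utf-8") as f:
    anchors = [line.strip().lower() for line in f if line.strip()]

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common(5):
    print(f"{anchor!r}: {n} of {total} links ({n / max(total, 1):.0%})")
```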
Problems:
- You have duplicate content
- Your site is slower to download compared to your competitors
- You have dynamic URLs
- You don't have a site map
- Your navigation is image-based
- You have no ALT tags or meaningless ALT tags
The above is a miscellaneous list of problems that can be diagnosed as follows. Check CopyScape for duplicate content or perform a search for an exact line of text on the page you are evaluating. Alexa.com will tell you how fast your website downloads compared with others competing for your key term (assuming you are in the Alexa database). You probably know whether your site uses dynamic URLs, but if you're not sure, click into an interior page and check for odd characters in the URL, such as question marks or equal signs. You can use any browser to see the URL string of a particular page in your site. Google has been indexing dynamic URLs, but if the string is particularly long and the variables particularly profuse, Google may not index the entire site as well as it would if the URLs are search engine friendly and/or do not contain as many variables.
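Spot-checking interior URLs for those "odd characters" is also easy to script. This sketch uses the standard urllib.parse module and invented example URLs to flag query strings and count their variables:

```python
# Flag "dynamic" URLs (those carrying a query string) and count their
# variables. The example URLs are invented for illustration.
from urllib.parse import parse_qs, urlparse

urls = [
    "http://www.example.com/products/blue-widget.html",
    "http://www.example.com/catalog?cat=12&item=987&sessionid=abc123&sort=price",
]

for url in urls:
    query = urlparse(url).query
    if query:
        print(f"dynamic: {url} ({len(parse_qs(query))} variables)")
    else:
        print(f"static:  {url}")
```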
A site map is self-explanatory. It's a page that lists links to all the pages of your site. If you don't have one, create one so that Google can find all of your relevant pages easily.
If you use images for all of your navigation and don't assign meaningful ALT tags to them, a site map is especially critical. Googlebot can't read images; it just sees code. If you scroll over a navigation image and no text appears, it means that you have not assigned an ALT tag to that image. You can also view the source code and review your images that way. Assigning meaningful ALT tags to images helps with usability as well as search engine friendliness (for people with slow connections or browsers that have images turned off, for example), though the best case scenario is to use text-based navigation in place of image-based navigation.
Conclusion
The tools that are available to help you analyze your search engine friendliness are profuse and often free. This article just scratches the surface of what's out there. Read forums to see what the experts use and try out the tools yourself to find your favorites. Proper diagnosis of search engine friendliness is the building block for creating a comprehensive, competent search engine optimization strategy that will definitely give you an edge over the competition.
Keep in mind that while it is helpful to approach SEO from the search engine's perspective, you are not writing for the search engines. You are writing for your visitors. So don't overdo it.

Few things frighten a website owner more than the possibility of being assessed a penalty from Google. The reason for that fear is uncertainty. Not everyone is entirely certain what will invoke the wrath of the search engine giant. Because of that unknown factor, many legitimate and entirely honest webmasters often believe they have been subjected to a penalty, even when they are not. Part of the confusion over Google penalty policy is not knowing what will cause a penalty. Another area of concern is what exactly Google's various punishments involve and how they are applied. Knowing how to avoid penalties entirely by only using above board optimization techniques is the best policy for every legitimate website owner to utilize.

Avoiding the many known penalty triggers will provide peace of mind and far better results in the search engine results pages (SERPs). Severe abuses of the Google search engine policies and terms of service can lead to outright banning of your website from the search engine giant. Because of that potentially devastating loss of revenue to your online business, it pays to use only the best search engine practices.

The question, then, is what is considered a penalty, and how can you avoid being handed one?

Like any other disciplinary body, Google's terms of use enforcement department has a range of penalties. The punishments range from light ones, for first offenders and relatively minor terms of use infractions, up to more severe ones for repeat offenders and more serious violations.

Penalties vary in length. More minor penalties may last from one to three months, while the most severe banning from the Google index penalty may even be permanent.

A Google penalty may be minor, if any punishment can ever be considered as such. A loss of a point of Google PageRank (the measure of the importance of a web page on the Internet) is one of the less severe penalties. Of course, that too is relative.

Loss of PageRank

A loss of one PageRank point may not hurt as badly, if it moves a page (remember, PageRank is for a page and not a site) down from PR3 to PR2. It can really sting a website owner if the PageRank drop is from PR6 to PR5. It is much harder to move up from PR5 to PR6, than it is to recover from PR2 to PR3. That is because the Google PageRank scale is not linear, but exponential like the earthquake Richter Scale. It takes many times more and stronger incoming links to move up to the next PageRank level, with every succeeding step.
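To illustrate the Richter-style comparison with numbers, the sketch below assumes a purely hypothetical base of 5 per PageRank step; Google has never published the real base, so treat the output as the shape of the curve, not as fact:

```python
# Purely hypothetical illustration of a logarithmic toolbar-PageRank scale.
# The base (5 per step here) is an assumption; Google has never published it.
BASE = 5

def relative_link_weight(pr: int) -> float:
    """Relative 'link weight' needed to reach a given toolbar PR (PR1 = 1)."""
    return BASE ** (pr - 1)

for pr in range(1, 8):
    print(f"PR{pr}: roughly {relative_link_weight(pr):,.0f}x the weight behind a PR1 page")
```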

A more severe punishment would be the loss of all PageRank entirely. Regardless of what your website's current PageRank, to be moved down to a PR0 is a bitter pill to swallow. It is quite likely that Google PageRank will be harder to achieve in the future as well, although that is debatable. No one is certain if PageRank dampers continue for penalized sites into the future. As always, Google isn't talking.

As we move up the punishment scale, using PageRank drops as the punishment, the grayed-out bar is the most severe. When Google grays out a site's PageRank, that site is marked off as a major problem. Linking to that site can even result in a penalty for the linking site. The gray-bar site is a Google pariah.

Other Penalties

Other punishments involve the search engine results placements (SERPs). A penalized site may find themselves dropped either slightly, or perhaps even very dramatically, in the search results for their most important targeted keywords.

Since achieving top rankings for those keyword terms is the ultimate goal of terms of service violators, the resulting loss of revenue really hurts them. A loss of top positioning in Google costs the website owner money, as searchers seldom go beyond the first three pages of results, if they even go that deep.

The ultimate penalty, of course, is complete banning of a website from the Google search engine index entirely. The length of a website ban may vary, but often, if such extreme measures are taken by Google, the ban is permanent. In less stringent cases, following correction of the violations, the ban may be lifted and the site restored to the index. There is probably a probation period involved for re-indexed sites as well.

What activities can result in a Google penalty?



A penalty can be invoked by Google for violations of its search engine Terms of Service (http://www.google.com/terms_of_service.html). As a service and a guide to webmasters in the proper development of websites, Google provides a Google Information For Webmasters page (http://www.google.com/webmasters/guidelines.html). The guidelines set out by Google provide web developers with many explicit rules that are meant to be followed.

Within Google's webmaster guidelines are some very specific don'ts.

The overall theme of Google's webmaster guidelines is to not attempt to trick the search engines. With that in mind, they provide a list of strong suggestions.

One major reason for being assessed a Google penalty is engaging in schemes to increase your site's position in the search engine results or to raise your PageRank. Google specifically warns against what it calls "link spammers" and "bad neighborhoods".

Link Farms

The main culprits of this problem are the so-called "link farms". A link farm exists solely to increase PageRank by requiring the exchange of links between itself and otherwise entirely unrelated websites. Should one of them, unknown to you, link to your site, Google won't penalize you for that; Google believes you have no real control over who links to your site.

On the other hand, Google feels you have complete control over where you point your own links. Should you link your site to a link farm, clearly with the goal of increasing your PageRank and link popularity, Google will most likely respond negatively to your action. Google uses the "does it help your visitors, and if search engines didn't exist, would you still do it" test. That seems a reasonable question to ask yourself about any technique you use on your own website.
Hidden Text

Google specifically tells you not to use "hidden text" or "hidden links". It also disapproves of "cloaking" and "sneaky redirects". Hidden text is usually in the form of keywords written very small, in the same color as the web page background, or both. The idea is to have the text read by the search engine spider but not by the site visitor. Hidden links point out to other sites but are not seen by the visitor. They are often used as a trick in link exchanges, to prevent visitors from leaving the site once they're there. Sneaky redirects send a web surfer to an entirely different web site from the one intended. These are often used in affiliate programs.
Other Don'ts from Google

Google disapproves of unauthorized automated programs for page and site submissions and for checking search rankings. The guidelines specifically mention avoiding the popular web tool Web Position Gold (TM). Because these programs use a large amount of computing time and resources, they are a violation of the terms of service. For that reason, automated queries are banned.

Heavy use of keywords that clearly do not belong in the context of the web page is strongly discouraged as well. Such keywords are clearly an attempt to rank well for search terms that have nothing to do with the website, except to steer traffic to the site from irrelevant search results. For example, filling web pages with the most popular search terms, when none of them have anything to do with your site content, would fall under that category.

Duplicate content, whether individual pages or entire websites, is specifically against the prescribed guidelines. Identical pages are used to add more pages of content and perhaps to have more results appear for searches. The Google algorithm attempts to avoid indexing identical pages and duplicate sites by indexing only one.

The so-called "doorway" pages, that are packed with every imaginable keyword to attract the search query, and then steer that traffic via a "sneaky redirect" to another site are expressly forbidden. Often used by affiliate programs, to avoid providing additional useful content, these techniques often lead to penalties from Google.

There are many more techniques that search engine spammers employ, and Google is attempting to stop them in their tracks. While the methods are not specifically named, Google reserves the right to change its algorithm to combat them as they arise. The goal of the search engine is to provide the best and most relevant results possible. The inclusion of spam-type websites prevents good webmasters from getting their honest sites as high in the results as they deserve. Google offers a spam report page where suspected illegal and unethical practices can be reported. The page requests the name and URL of the site and the specific violations that are suspected. In the report page information, Google claims to investigate each reported case. While that may not always be possible in practical terms, the information provided can assist Google in reworking its algorithm to prevent the abuses in the future. Changes in the way Google returns results that specifically target and filter out spam-laden pages are good for everyone.

Good search engine practices benefit everyone, whether you are a webmaster or conducting a search for information or products. Penalizing and even removing deceptive, spam-filled websites from the Google index ensures that everyone plays by the same set of search engine rules.

By developing a good quality website that observes the Google best practices guidelines, you will never have to worry about being handed a penalty. On the other hand, it's good news that Google intends to reprimand website owners who engage in unethical search engine practices.


"Why has Google banned my website?"



As an SEO company, we get this question very often. There are few bigger problems for an Internet business or search engine marketer than to find that their website has disappeared from Google's search rankings. Sometimes their website doesn't even rank for its own name. How did this happen? Read below to find out some of the reasons why a site may have been banned from Google and what to do to get back in the rankings.



Usually there is no warning that you have been banned or penalized by Google except for a steady drop in sales and visitors to your site. Many site owners and search engine optimization firms have little to no idea why they were removed and are left scratching their heads as to how to get back in. While there are many reasons why a site may be banned, here are a few of the more common ones. If your site has been banned, contact your SEO company or give Big Oak a call to help you get back on the right track to high Google rankings.



1. Robots and Meta Tags



The first and simplest explanation may be that your robots.txt file has been changed to prevent search engines from entering your site. Or your meta tags could be directing the search engine robots to exclude your site. While this would be highly unlikely, it is best to rule it out. So check your robots.txt file (if you have one) and your meta tags. Unless you want your site hidden, you should never see this in your meta tags: <meta name="robots" content="noindex,nofollow">. If you see this, you are blocking your site from Google.
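A quick supplementary check is Python's built-in urllib.robotparser, which reads a robots.txt file and reports whether a given crawler may fetch a page; the domain and path in this sketch are placeholders:

```python
# Check whether robots.txt blocks Googlebot from a page, using only the
# standard library. The domain and path are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

page = "http://www.example.com/products/index.html"
print("Googlebot allowed:", rp.can_fetch("Googlebot", page))
```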



2. Cloaking (A Big Google No-No)



Straight from Google's website: "The term "cloaking" is used to describe a website that returns altered web pages to search engines crawling the site. In other words, the web server is programmed to return different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. This can mislead users about what they'll find when they click on a search result. To preserve the accuracy and quality of our search results, Google may permanently ban from our index any sites or site authors that engage in cloaking to distort their search rankings."



If your website or web pages are set up to display different information to a search engine spider than to a real person, then you are cloaking. Cloaking delivers one version of a page to an Internet user and a different version to a search engine. The cloaked page is packed with the keywords and terms that the site wants to rank highly for, so, in essence, the site is cheating. There are legitimate reasons for cloaking as well, such as targeted advertising, but if you are trying to manipulate your rankings you should put an end to this immediately.



3. Duplicate Content or Websites

If Google finds that multiple web pages have the same content, it may penalize each website. Of course, someone may have copied your content, and Google may have banned you even though it was your original content that was taken. Make sure no other site is using your content. You can do this by performing a Google search using some of your text with quotation marks (") around it. If you do find someone is using your original copy, visit this page to learn more about copyright infringement: http://www.google.com/dmca.html.



4. Hidden Text and/or Links

How can text be hidden? Well, there are a variety of ways - some are more sneaky than others. But it boils down to this: text or a link is considered hidden if it is invisible to the website visitor but can be seen by search engine spiders. This used to be done quite often, such as by making your text white on a white background or using cascading style sheets (CSS) to hide your text, but search engines can easily spot this today, so it is best to avoid it altogether.



5. Keyword Spam and Keyword Stuffing

Ever seen a web page with a very awkwardly written first paragraph where a certain word is repeated ad nauseam? Here's an example:



"We sell the best father's day gifts for father's day. If you like to celebrate father's day we can help with the best father's day gifts for father's day."



Care to guess which keywords are being targeted? This is keyword spamming or stuffing, but it is just the tip of the iceberg. That's just the content on the page; there is probably keyword stuffing happening in the code as well: in the meta tags, invisible text, alt tags, title tags, comment tags and so on. If a word or phrase is repeated too often, Google can apply a filter to reduce the site's rankings or simply ban the site. Keyword density can be tricky, but as a general rule, Big Oak shoots for 3% to 12% of all text on a page to be our targeted keywords.



6. Doorway Pages

Defining a doorway page can be difficult, so here is our definition of the kind of page that could potentially get your site banned from Google: a page created purely to attract search engine spiders and rank highly for its targeted keywords. Real visitors find this page and then continue to the "real" website from there. Hence the name "doorway page". These pages aren't in the navigation most of the time. If you come across a page where much of the information is duplicated from other pages on the site but differs only in its keywords, it is most likely a doorway page.



As you can see, this can be a gray area. Some pages on a website may focus on a particular subject and be innocent of trying to lure search engine spiders purely for high rankings. Err on the side of caution and make sure the page is useful and part of your site's navigation.



7. Redirect Pages

Sneaky redirection pages are set up in groups of anywhere from five to hundreds of pages. They all target similar and related keywords or phrases. Usually, the only links on these pages are links to other pages in the same family, creating a false sense of related linking.



These pages don’t necessarily contain content that any human would be interested in. They may show up high in search engine results pages (SERPs), but when you click on one of them from the SERPs, you are redirected to another page. In other words, the page you click to see is not the page you actually get to read.



The redirect can be automatic, done with a meta refresh command, or triggered through other means, such as the mouse moving while on the redirect page.



8. Buying Links

While buying links may not get you banned, it can certainly hurt your PageRank. Google has slowly been catching on to this fad and has measures in place to put your site in limbo for 6-8 months (known as the "sandbox effect") so you can't instantly benefit from buying links to your website. Many sites that sell links are being devalued by Google, making an investment in this strategy a waste of money and time. Ultimately, stay away from buying links to increase your ranking.



9. Linking to Bad Neighborhoods

Link campaigns are a good thing when done correctly; we would say they are a necessity in today's SEO world. But linking to bad neighborhoods is a sure way to lose your rank in Google. If you aren't careful about who you are linking to, you can easily disappear overnight. Basically, while you may be ethical and do everything right, linking to someone who isn't can be considered guilt by association. Always verify your links to other sites. Make sure they have PageRank in Google and are indexed by Google. Try searching for their URL to see if they are indexed. Avoid linking to any sites that use spamming techniques to increase their search engine rankings. Regularly checking the outbound links from your site and removing any offenders is a good idea.
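Auditing your own outbound links doesn't have to be a manual chore. The sketch below pulls every external link out of a saved copy of a page using Python's standard html.parser; the file name and domain are placeholders:

```python
# List the external outbound links from a saved copy of one of your pages
# (page.html and www.example.com are placeholders) so the destinations can
# be reviewed by hand.
from html.parser import HTMLParser
from urllib.parse import urlparse

YOUR_DOMAIN = "www.example.com"

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.external = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            host = urlparse(href).netloc
            if host and host != YOUR_DOMAIN:
                self.external.add(href)

with open("page.html", encoding="utf-8") as f:
    collector = LinkCollector()
    collector.feed(f.read())

for link in sorted(collector.external):
    print(link)
```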



A few site types to avoid:



- Free-for-all link farms
- Adult sites
- Gambling sites

10. Code Swapping

Code swapping is the practice of optimizing a page for a top ranking and then, once that ranking is achieved, swapping a different page into its place.



What does Google say?

"Don't deceive your users, or present different content to search engines than you display to users," Google says, and they list some bullet points on avoiding being banned.



- Avoid hidden text or hidden links.
- Don't employ cloaking or sneaky redirects.
- Don't send automated queries to Google.
- Don't load pages with irrelevant words.
- Don't create multiple pages, subdomains, or domains with substantially duplicate content.
- Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.

Google also states:



"Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, 'Does this help my users? Would I do this if search engines didn't exist?'"



While creating a page without a single thought for search engines is probably going a little too far, optimizing your site for organic search is perfectly acceptable as long as it conforms to Google's guidelines.



We pride ourselves on being an ethical SEO company. We follow the guidelines and do things the right way. There is no easy path or shortcut to high rankings in Google or any other search engine.



We welcome any questions and are always willing to share information with clients and prospective clients. Give Big Oak a call today for a free consultation and search engine report to find out where you rank in the major search engines. Whether you feel you have been banned or just want to show up higher in Google's search results, we can help you.



Beware of Google Ban!



If you are carrying out search engine optimization on your site to gain high positions in major search engines such as Google, Yahoo! and MSN, you need to make sure you don't get banned by accident...



In the spirit of fair play and providing depth in its results, Google frowns on duplicate content. Some web site owners purchase multiple domains and copy their content for both domains. They mistakenly think that having another domain with the same content is just going to replicate their success, but unfortunately it isn't that easy!



Another more common mistake is to have multiple domains pointing at the same site. This can be as simple and unintentional as having www.refreshedmedia.co.uk point to www.refreshedmedia.com, but the outcome is two copies of your site in the search engines, and they don't like it.



What's the solution? In most cases, a 301 ("moved permanently") redirect is your best bet. It's a server-side redirect most administrators can handle in a few minutes, and it's the proper way of telling search engines to ignore the redirected address and index the content only at the original.
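
What that looks like depends on your server. On Apache it is usually a one-line Redirect 301 or RewriteRule in .htaccess; on an IIS/classic ASP site, a minimal sketch (the page names here are just placeholders) looks something like this:

<%
' old-page.asp: permanently redirect visitors and search engine spiders
' to the page that now holds this content.
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/new-page.asp"
Response.End
%>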



You may have many good reasons for owning multiple domains, including brand protection. For example, we own refreshedmedia.com AND refreshedmedia.co.uk for the simple reason that we don't want someone else feeding off our success and diverting our customers away simply by purchasing one domain for a few pounds.



So how do you keep Google and friends happy?



If you secure more than one domain, redirect the other sites to your main website using a 301 redirect, or use the extra domains for unique content (perhaps showcasing other products or services).
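
For the whole-domain case, the redirect can key off the requested host name. The classic ASP sketch below (our own domain names are used purely as an example) bounces any request that arrives on a secondary domain to the same path on the main one with a single 301; on Apache you would achieve the same thing with mod_rewrite.

<%
' Canonical-host redirect: if the request came in on any host other than
' the main www.refreshedmedia.com, send a 301 to the same path there so
' search engines only ever index one copy of the site.
Dim host, target
host = LCase(Request.ServerVariables("HTTP_HOST"))
If host <> "www.refreshedmedia.com" Then
    target = "http://www.refreshedmedia.com" & Request.ServerVariables("URL")
    If Request.ServerVariables("QUERY_STRING") <> "" Then
        target = target & "?" & Request.ServerVariables("QUERY_STRING")
    End If
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", target
    Response.End
End If
%>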

How to get back into Google



Once you have cleaned up your website (and we mean really cleaned it up as you may only get one chance to get back in), you should contact Google. Explain that you made a mistake, you have corrected it and certainly won't do it again.



You don't have to contact Google, but it can't hurt. Google will eventually spider your site again and see that you have cleaned it up. You may have to wait a few months for the site to be re-indexed, so be patient and don't tinker with it too much unless your products or content genuinely need updating.



The worst case scenario is to start a new site. Sometimes this can be necessary but only in the most extreme cases.


Helpful Google & Yahoo Links

* Google webmaster guidelines
* Google's warning about search engine optimization companies
* What Yahoo! Considers unwanted
* Yahoo! Spam Report Form

The developers at Google have been kind enough to offer a web API that uses the SOAP protocol. When you run a search on Google, you may have noticed that you are prompted with possible alternatives to any words you have misspelled.
The Google web API's "spell check" allows you to send a string of text and receive alternatives for any misspelled words. The power of this web API is that the Google dictionary includes technology words that turn up in web searches but may not appear in a standard English dictionary.
Setup

Setting up the Google SDK on your server is as simple as downloading the API from http://www.google.com/apis/. You'll need to register with the website, which will give you your own key; Google needs this key to accept SOAP connections from your server.
There are some limitations to mention as well. The Google web API accepts only 10 words at a time and allows 1,000 connections per key per day. The following script works around the 10-word limit, but it is still bound by the 1,000-connection limit.
The Code

<%
'Spell check using the Google web API (doSpellingSuggestion)

'Define variables for the script
Dim objSoapClient, results, strResults
Dim googleAPIKey
Dim query
Dim i 'loop counter

'Keep a rough count of calls made by this application
'(each key is limited to 1,000 calls per day).
application("googleSearchNumber") = application("googleSearchNumber") + 1

'Set the Google API key you received when registering.
googleAPIKey = "00000000000000000000000000000000"

'Set the string to send to Google for spell checking.
query = "The Lazy Bron Dog Jumpd Over The Fnce"

'The Google API will only accept the first 10 words of a string,
'so break the query into blocks of no more than 10 words each.
Dim arrTemp
Dim strTemp
Dim word
Dim wordCount
Dim dicWords

i = 0
strTemp = ""
wordCount = 0
arrTemp = Split(query, " ")
Set dicWords = Server.CreateObject("Scripting.Dictionary")

If UBound(arrTemp) < 10 Then
    'Ten words or fewer: send the whole query as a single block.
    dicWords.Add i, query
Else
    For Each word In arrTemp
        strTemp = strTemp & word & " "
        wordCount = wordCount + 1
        If wordCount = 10 Then
            'Block is full: store it and start a new one.
            dicWords.Add i, Trim(strTemp)
            i = i + 1
            strTemp = ""
            wordCount = 0
        End If
    Next
    'Store any leftover words.
    If strTemp <> "" Then
        dicWords.Add i, Trim(strTemp)
    End If
End If

strTemp = ""
strResults = ""

'Set up a SOAP client object from the Microsoft SOAP Toolkit.
Set objSoapClient = Server.CreateObject("MSSOAP.SoapClient30")

'Initialise the SOAP object with the WSDL file from Google.
objSoapClient.mssoapinit Server.MapPath("GoogleSearch.wsdl")

'If you need to, set proxy settings here; otherwise leave this commented out.
'objSoapClient.ConnectorProperty("ProxyServer") = "my.proxy.com:8000"

'Call the Google API once per block. Remember: 1,000 calls per key per day.
For i = 0 To dicWords.Count - 1
    results = objSoapClient.doSpellingSuggestion(googleAPIKey, dicWords.Item(i))
    If results = "" Then
        'No suggestion returned: keep the original block as-is.
        strResults = strResults & dicWords.Item(i) & " "
    Else
        strResults = strResults & results & " "
    End If
Next

'Display the corrected string, if any.
Response.Write Trim(strResults)
%>
Note: If the script gives you an error about not being able to create the MSSOAP.SoapClient30 object (the Server.CreateObject line for the SOAP client), you may not have the SOAP Toolkit 3.0 installed. It's a free download and doesn't even require a reboot. Everything else you need (the WSDL file, etc.) is in the API you downloaded from http://www.google.com/apis/.
Conclusion

Although limitations exist in the Google web API, it makes a pretty handy spell checker for small sites or intranet applications. The dictionary and algorithm are stored off site, so you won't need to maintain word lists. The best part is that technology words are handled as well: "Google," for example, is an error in standard English, but the web API sees it as a legitimate word.

Google recently launched its Google Customized Search Engine, which allows webmasters to easily integrate Google search results into their sites while also giving them editorial control to bias the results.
Webmasters can bias the results by harnessing the power of Topic-Sensitive PageRank, tag relevant results themselves or allow editors and users to do so, and select a seed set of sites to search against or bias the results toward (as well as sites to remove from the results).
Surely some shifty outfits will use this as a way to show their ranking success, but this also makes me wonder what the net effect on Google's brand will be if people see powered by Google on sites which provide terrible relevancy, or results that are obviously biased toward racism or other horrific parts of humanity. Will searchers learn to trust search less when they start seeing different Google results all over the web? Or will anyone even notice?
Will most people be willing to subscribe to relevancy which reinforces their current worldview?
This release essentially will make Google the default site search on millions of websites, which is great for Google given the volume of site level search. I still think Google's stock is priced ahead of itself trading on momentum and short covering, but this release gives Google a bunch more inventory and further establishes them as the default search platform.
By allowing webmasters to easily integrate results biased toward internal content, backfilling the results with other content when the site does not meet all of a searcher's needs, and then allowing the delivery of profitable, relevant ads near the content, Google is paying webmasters in numerous highly automated ways that build great value by being layered on top of one another.
I also have to think this is going to further place a dent in the business model of running directories, or other sites with thin content that do not add much editorial value to the subject they talk about. This blend of editorial and algorithms is invariably going to kill off many editorial only listing companies.
As an SEO, I think this customized tool can also be used to test the depth and authority of a site relative to others in its group, by letting you bias the results toward multiple similar seed sites and see which pages on those sites Google promotes most. It could even help you determine which domain has more ranking potential if you are comparing a couple of domains you are thinking of buying.

Trying to make a living from your blog is a great idea if you are up for a challenge. Some articles will make it seem like a piece of cake to sit back and watch the dough roll in. The truth is that making money from your blog takes time and effort on your part. There are a few things that can make your task a little easier however. Google Adsense is one of those things for many. If you are curious about whether or not Google Adsense is right for your blog, take a look at these characteristics of their advertising program.




The first thing to consider when looking for a program like Google Adsense is its track record. You can rest assured that Adsense is the oldest program of its type, with a larger base of publishers than any comparable program. Knowing this, you can feel secure in becoming a member, and by using Adsense you will know you are in the good hands of a company that certainly knows what it is doing.

Another thing to consider when signing up with an advertising program is the appearance of the program on your blog. You don’t want to sign up with a program that has annoying flashy ads that will scare people off from your site. Adsense provides ads that are mostly text. They have around 10 different sizes with various dimensions so you can place them in the most appropriate places on your blog, without sacrificing your content and overall site appearance. If there is an image advertisement run in your advertising space, it typically fills up the whole ad block and is less annoying than most.

One thing to remember when considering Google Adsense is that it has strict terms of service. While some may see them as too restrictive, they are not abnormal by any means. Basically, to be an approved Adsense member you must agree not to publish pornographic material or conversation on your blog; you need to keep it pretty family friendly, which is not a huge problem for most people. You are also not allowed to change the Adsense code.

Another portion of the terms of service explains that publishers cannot specify the keywords their ads are targeted to: you pretty much surrender the choice of ads to Google when you agree to the terms. You may have control over the content of your blog, but Google has control over the ad content. Lastly, if you work with the Adsense program, you must agree to run no more than three blocks of ads on any one page. If you are looking for a program that also lets you see which pages earn more for you than others, Adsense is a great fit; some programs will not allow this, so if you are interested in that option, Adsense is a great choice.

Again, Adsense is based on the context of your page. Whatever you allow on your page will have an effect over what type of ads are presented to your readers. Keep this in mind when you are posting on your blog. With Adsense you cannot afford to hide your main points within a bundle of text. You have to be clear and use defining vocabulary to ensure your ads will fit just right.

If you are a blogger who would love to get paid for referring others to a program like Google Adsense, then you will be glad to know that for each referral you have that signs up and earns $100, you will get $100 as well. This can be great for someone with many contacts or a blog building business. Overall, there are quite a few rules that you must agree to follow in order to become an Adsense publisher. The rules are not so restrictive so that you cannot make money. In fact, they are made so that the Google reputation is preserved and yours as well. By using an honest system such as Adsense, you will know that you are in the best possible position out there. You can make a bundle if you put in the time, so go ahead and check it out today.

Google has just released a new Google Talk (GTalk) version, 1.0.0.104. This release does not really add anything over existing Google Talk versions except support for Microsoft’s latest operating system, Windows Vista.
Here are some other interesting features added to Google Talk versions prior to 1.0.0.104.
Google Talk Orkut integration

Google Talk now enables you to chat online or place a call to your Orkut friends. Additionally, Google Talk automatically adds your Orkut friends to your Google Talk account, and notifies you if someone writes a scrap entry on your profile.
Support for Google Talk online messages

Just enable chat history in your Google Talk account and you’ll be able to receive messages when you’re offline. The system stores the messages you’ve received while offline and displays them the next time you log into your Google Talk account.
Google Talk File Transfer

The file transfer feature was introduced in version 1.0.0.95, along with voicemail and music status and music trends display.
File transfers are unlimited in number and size in Google Talk: you can send files of virtually any size to one or more Google Talk friends (Google Talk uses peer-to-peer transfer).
Voicemail support was introduced in Google Talk version 1.0.0.95. It basically allows you to record audio messages and send them to your friend’s e-mail address as an audio file attachment.
Music status display is also a feature available as of Google Talk version 1.0.0.95. It changes your status message by displaying the audio file (e.g.: song) that you currently play on your PC.
If you are interested in the type of music other Google Talk users are listening to, feel free to visit Google Music Trends. There is also an embedded feature in Google Talk that lets you send information about the music you are playing to Google Music Trends.

In an exclusive tip-off from inside Google, Tech’s Message has learned that the search giant is sitting on a huge announcement that it intends to make public around the date that coincides with the consumer launch of Microsoft Windows Vista, January 30th 2007.
Google has a history of stealing Microsoft’s thunder. For example, when Microsoft announced that its MSN Search database held records for 5 billion web pages, Google responded by announcing that it in fact held records for 8 billion. Could this be another such event?
Perhaps…
Possibilities
1) The announcement is likely to be the unveiling of a new Google product, the most hotly anticipated of which is the Google OS - a collection of web-based applications that provide free, networked alternatives to Microsoft’s Office suite. Obviously Google Docs and Spreadsheets already compete with Word and Excel, but alternatives to desktop email, FrontPage and the like could easily be deployed by the Google machine.
2) Another large possibility could be the fabled GDrive. This would be an easy way to consolidate all of a user’s Google files (Google Docs, Spreadsheets, Video etc) in one place, in a desktop environment. The GDrive may also offer a certain number of gigabytes of backup space, though this is less likely given the sheer exploitability factor.
3) Let’s not overlook the YouTube acquisition. It’s unlikely anything major would come of it so soon, but there’s always a chance.
4) There’s also the possibility of a more competitive offering in the VoIP market. Google Talk is not a widely used application; it met with a lukewarm response from users and continues to be only lightly adopted in comparison to Skype. Skype, however, offers the ability to call physical phone lines, something Google Talk does not do (yet). Adding “calling out” features would certainly increase the mass appeal and would make the service less of an IM underdog and more of a competitor to Skype. This would undoubtedly create tension, however: Google has a multi-year partnership agreement with Skype’s parent eBay to provide text-based adverts on the auction site outside the United States.
Would a call-out service work from Google? Probably not. Could you imagine having a conversation with a friend about your recent contraction of herpes, only to get an automated announcement on the phone from the “Google system” about related anti-herpes offers from Google advertisers? No, certainly not a good system. Google would struggle to profit from a voice-based system, but keeping track of all your conversations would, in a way, organise your data and would therefore fit Google’s manifesto of organising the world’s data. I leave the thinking to you.
5) A blog on ZDNet gave some insights on potential Google releases in 2007, one of which is claimed to be 100% certain: an enterprise version of Google Apps For Your Domain. Currently the system lets Google applications run as if they were hosted on your own website’s domain, while actually being hosted on Google’s servers. Various features are expected with such a product and you can read some insights here.
6) The announcement may be something less interesting but no less notable, such as Google’s GMail finally coming out of beta after almost three years. The site recently opened up public sign-ups (previously, new users had to be invited by existing users). Whether or not this is a sign that the mail service is approaching its “finished” status, I wouldn’t like to say. It wouldn’t be huge news, but it’d at least beat Windows Live Mail to the punch (that said, WLM hasn’t been in beta for anything like as long as GMail).
In Closing
This news should be treated as a rumour, but a strong one, and from a trusted source.
Personally, I want a gPod with a gFox browser and integrated gTooth and g-Fi for sharing my songs wirelessly with other gPods. To avoid Apple lawsuits, Google may want to annoy Microsoft more, as referred to above, by calling it the gUne (Goon). Either way, I want one. It won’t happen, but I want one.

Everyone and their dog (yes, there are a few dogs out there with their own blogs) has started a blog these days, but many people just aren’t taking the steps needed to optimize their blogs for both readers and search engines. While blogs can be business related (another blog about mesothelioma, anyone?), they can also be personal, where you talk about the great ham sandwich you had for lunch today or the crappy service you got at that trendy restaurant last night.

But whether your blog is business or personal, you should ensure that you are optimizing your blog for both your readers (after all, you want to keep those readers coming back) and the search engines. Unfortunately, optimization is an important step that far too many blogs seem to be skipping over, even those that have a broad appeal to surfers and have the potential to be monetizable.

However, optimizing a blog is a bit different than your standard website search engine optimization (SEO), particularly because most blogs run off standard blog platforms, or worse, run as a hosted blog on someone else’s domain name. And there are design issues that can be unique to blogs which can impact your rankings.

Let’s face it, when you commission a stylin’ new blog template, most blog designers focus on making your blog look the way you want it to. But unfortunately for bloggers, not very many of those great blog designers are also SEOs by trade, meaning that the blog design you use could actually be hurting your search engine rankings. You may have a design that looks wonderful to readers, but new readers might not find you if your blog isn’t ranking well organically in the search engines.

Also, when you optimize your blog for the user experience, you make it easy for users to return and engage in your blog without dealing with any of the hassles that can cause them to abandon other sites or blog entries. Repeat visitors are the cream of your blog, so by following these tips you have given them the tools they need to return as well as the user experience that makes them want to come back.

Fortunately, if you are on the case to make your blog rank well while not hindering your visitor’s experience on your site, there are definitely things you can check – and fix – to prevent any indexing issues from occurring, and ensuring your blog a happy and healthy existence in the search engines.

So here is advice on how you can optimize that blog of yours for both users and search engines without alienating one or the other.

1) Dump The Default Template - Looks Count!
I cringe when I see a blog using the “out of the box” Wordpress or MovableType template. Hire a designer to create a unique look for your blog, or at the very least, take advantage of some of the free templates available and customize it a bit with a unique logo or a slight color upgrade.

2) Just Say No To Bad Color Schemes
While a hot pink with lime green color scheme might be your favorite, consider what your readers will be expecting. That color scheme might work perfectly on a teenage gossip site, but would look extremely out of place as the corporate blog for a men’s suit company. Likewise, gamers would think nothing of a black background on an Xbox 360 blog, but it would look horrendous on a parenting or pregnancy site. So while you should experiment with colors to find a good mix for your blog, keep in mind user experience and their expectations.

3) RSS Me!
Make sure you have RSS available. Many hosted blogging solutions don’t have RSS automatically available, so you will need to add it. And when you do add it, ensure you have those RSS links in an obvious spot. Don’t tuck them away at the very bottom of your index page after your most recent 20 entries, or hide them on a separate “About Us” page. Place all those handy subscribe links in your sidebar, which is exactly where people will look for them. If you use Feedburner currently, have a look at their new MyBrand option which allows you to host your own feeds for a seamless user experience.

4) Offer RSS & Feed Subscription Buttons
Yes, when people want to subscribe to a blog, they will often look for that orange RSS logo as well as the logos of the standard aggregators such as Bloglines. So it is worth the time to add the most popular ones to your blog so visitors can make their one-click subscriptions to your feed without it requiring much effort on their part. If you make it hard to subscribe, most just won’t bother. FeedButton offers a service that lets you present multiple RSS aggregator and feed reader buttons with a single expanding rollover button.

5) Offer Posts Via Email
Some people just don’t get RSS. So cater to them by offering them an option to get your blog posts by email instead. The most popular service to do this automatically is FeedBlitz, although there are also many other tools available to do this.

6) Decide On Full Or Partial Feeds
Do you offer full feeds or partial feeds? This is a personal preference, and is often dependent on what market space you are blogging in. One option is to offer two feeds, one being an ad-supported full feed, with an RSS ad included, and the other being an ad-free snippet copy of the feed, where readers won’t see ads but will have to actually view your blog in order to read your full entry. But this will often come down to personal preference, and the preferences of your readers.

7) Write Compelling Snippets/Descriptions
If you do use snippets for your RSS feed, be sure to make them compelling or leave readers with a cliffhanger to encourage them to click and read the full entry. This will get you many more readers to your entries than just using the default option of including the first X number of words in the blog post as the snippet. Use your excerpts to generate interest and clicks.

8) Pay Attention to How You Write.
One of my favorite bloggers has the unfortunate habit of writing detailed long entries… without a single paragraph break and with the double whammy of also writing with a font size smaller than usual. If I look up for a moment, it is hard to find my place again in her 1000 word entries. As a result, I don’t read it as often as I would like to, simply because reading it is such a painful experience.

9) Spelling Counts
Spelling is also worth mentioning. Add one of the many spell checkers to your internet browser and run a quick spell check before you publish your entry. Every word doesn’t have to be perfect, and I am certainly guilty myself of letting an occasional typo slip through unnoticed. But I also get annoyed when I am reading typo after typo after typo in an entry. And yes, if it happens enough, I will unsubscribe out of sheer frustration.

10) Fontography Counts
Make the font easy to read. Some bloggers think it is cool to have their handwriting turned into a customized font, or use a trendy font that would be better suited to a scrapbook layout. But not everyone has those wild and weird fonts installed, which means that those people will see a standard font such as Times New Roman, and it can really kill the look of your blog. So instead design the text of your blog entries to use a standard font in a standard size.

11) Don't Forget Navigation
Is this blog part of a larger site, such as a corporate blog on a site for a major company? Don’t just link to the main page of the blog. Syndicate your recent headlines in the sidebar to encourage visitors on the main site to check out the blog too.

12) How Fast is Your Host?
Another one of my favorite blogs has such a slow response time when I click from the snippet in my RSS to the full blog entry that I only actually end up waiting around for it to load about 10% of the time. Don’t lose readers because your hosting company thinks 30 seconds is a perfectly reasonable amount of time to load up a page.

13) Avoid Widget Overload!
Yes, there are definitely some cool widgets you can add to your blog, such as MyBlogLog or a Flickr photo box tied to your photo gallery. But be aware that having a large number of javascripts can slow down your site. So don’t sacrifice timely loading time for nice-but-not-all-that-necessary widgets.

14) Have Descriptive Titles
Some blog software makes your entry titles look pretty repetitious in the search engine results pages, which can result in a lower click-through rate than you might have had with highly optimized titles. If your titles say something like “Jason’s Tech Industry Rants & Ramblings Blog >> New Xbox 360 title announced for April release”, you should change them to “New Xbox 360 title announced for April release”. Unless you are well known as an authority blog in that market, the blog name simply wastes crucial space at the beginning of the title tag and causes the rest of the entry title to be truncated in the search results. Also make sure your titles actually enhance the entry and don't leave the reader wondering what on earth the blog entry could be about. Ensuring you have great titles when you have a small readership and depend on search engines to send you readers is one of the first steps you should take to optimize your blog.

15) Look at your Cascading Style Sheets.
Most blogs use a tremendous amount of CSS to create that custom look. While most of the “out of the box” designs that come with the installed template keep all the CSS in an external file, some blog designers put their CSS on the individual template pages instead. When you don’t place CSS in an external file, it clutters up your pages and pushes the most important part of the page – the entry text – much further down in the HTML code, after the mass of CSS.

16) Post Often
The more frequently you post, the more likely Googlebot and other bots are to stop by on a regular basis. If you only post once in a blue moon, expect that it might take a while for Google to come back and see that you actually have updated again. Google loves fresh, updated sites, so it makes sense to feed the bot what it wants.

17) Spread the Link Love
If you are blogging about a story, link to the original story as well as others’ commentary on the same topic. When you do so, you will often make those bloggers aware of your blog’s existence (if they weren’t already) when people click from your blog to theirs. It also increases the odds that they will link to you, either on that story or on something you blog about in the future.

18) Be Aware of Your Anchor Text
When you link to someone’s blog entry, or even a previous blog entry on your own site, make sure you link well. This means instead of linking to someone’s blog entry with the anchor text “click here”, you link to them using anchor text related to the blog entry, such as “Jason’s scoop on the new Widget Xbox 360 game”.

19) Create Unique Stories
Bloggers love to link to other bloggers. When you write original blog entries, rather than just rehashing something someone else has already said, you increase the odds that someone will find yours interesting enough to link to and talk about. And a reader of that blogger’s blog might read the entry and decide to write something about what you said as well, meaning yet another link as well. And if you are fortunate, it will go viral, meaning suddenly it seems like every blogger in your market space is talking about what you wrote. Rinse and repeat as often as possible for maximum exposure and link juice.

20) Use a Related Posts Plugin
Not only does this make sense to keep readers around for other articles on your site that are related to your current post, but it also allows you to deeplink from a current page on your blog to older entries. Often, older entries get buried several pages deep on an archive page, and this allows you to showcase entries written months or years previously and give those “oldies but goodies” an extra little kick in the search engines. There are several related post plugins available depending on which blog platform you use.

21) Ping Other Sites
When you add a new blog entry, you might want to ping sites such as Technorati and FeedBurner to let them know you have a brand new entry on your site. You can also now ping Google’s Blog Search for faster indexing in its blog search engine at blogsearch.google.com. Automatic pinging is an option in the control panel of most blog platforms, including WordPress and MovableType, and Ping-o-Matic offers a service that lets you quickly pick and choose what to ping.
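
Under the hood, most of these pings are a tiny XML-RPC call to the service's weblogUpdates.ping method with your blog name and URL. If you ever need to fire one by hand, here is a rough classic ASP sketch (the endpoint, blog name and URL are placeholders; check each service's documentation for its actual ping address):

<%
' Send a weblogUpdates.ping XML-RPC request to a ping service.
' Endpoint, blog name and URL below are placeholders for illustration.
Dim http, body
body = "<?xml version=""1.0""?>" & _
       "<methodCall><methodName>weblogUpdates.ping</methodName><params>" & _
       "<param><value><string>My Example Blog</string></value></param>" & _
       "<param><value><string>http://www.example.com/blog/</string></value></param>" & _
       "</params></methodCall>"

Set http = Server.CreateObject("MSXML2.ServerXMLHTTP")
http.open "POST", "http://rpc.pingomatic.com/", False
http.setRequestHeader "Content-Type", "text/xml"
http.send body

' The service replies with an XML-RPC response; by convention a
' flerror value of 0 means the ping was accepted.
Response.Write Server.HTMLEncode(http.responseText)
%>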

22) Buy Your Own Domain Name
Don’t assume your free blog hosting company will be around forever. What will you do if you build up a loyal readership and then one day discover yourblogname.examplebloghost.com no longer works because examplebloghost.com has gone out of business? You want to make sure the search engines have a URL they will always find your blog at, rather than have to worry about them re-indexing your previously well-ranked blog on an entirely new domain… that is, if you are even lucky enough to get your blog posts back from your free hosting company. Both Google's Blogger and WordPress allow you to use their hosted blog service while displaying it on your own domain instead of their branded one.

NOTE: See also our related story, Stay Master Of Your Feed Domain.

23) Manage Your Trackback & Comment Spam
You don’t want Google or Yahoo to find masses of spammy links on your site to all manner of less-than-quality sites submitted to your blog by a blog spammer. Use one of the many tools on the market for your blog platform to manage both comment and trackback spam.

24) Use a Good URL Structure
Don’t use “permalinks” such as www.yourblogsite.com/?p=123. Instead, use www.yourblogsite.com/2007/01/01/blog_entry_title_here. Most blogging platforms allow you to change from the standard numbered permalinks to this style of search-engine-friendly URL. And if the blog platform you use has funky dynamic URLs for each entry, you will want to ensure that the bots can crawl them easily, or use a mod_rewrite rule to create a good structure such as the one in the example.

25) Use Great Categories
When you write a post, place it in one to three categories related to the post. For example, an article on the television show Grey's Anatomy could go under "Grey's Anatomy" and "ABC". Avoid the temptation to add it to ten different categories, though, such as "drama," "hospital," "interns" and "Seattle": that is just overkill. But if you wrote something great on Grey's Anatomy, you have made it easy for your readers to find all your posts on the show, because they simply have to click on the category link at the top or bottom of the entry.

While some bloggers insist that search engine rankings will come naturally to those who wait, who really wants to wait for Google? A blogger can run into several unique challenges when it comes to optimizing for search engines, and it makes sense to get the jump on them now rather than simply hoping that if you write it, the bots will come. It is far easier to ensure you have a well-optimized blog now than to figure out what the issue is six months down the road when only your blog’s index page can be found in Google!

Does anyone else have tips they would have put in their own top 25 list of blog optimization tips? I had some that didn't make the cut for the top list, but am interested to hear what others feel are the most important tips.


By Jennifer Slegg