SEO Report Card: ShopWildPlanet.com

October 1st, 2006

by

Originally published in Practical Ecommerce

This month’s recipient of an SEO critique is www.shopwildplanet.com. Since my SEO how-to column this month is on RSS feeds (see page 18), I thought it would be only fitting that the website I critique have an RSS feed. Brian Almashie of 3D Joe Corporation, the firm that built the site for Wild Planet, believes they have done a pretty good job of optimizing it. Let’s see if his confidence is well founded.

  1. They certainly appear to have a large number of pages indexed in Google: 138,000. This number seemed unbelievable to me, and indeed it was massively over-inflated. As I started digging, I found that many of the pages in Google’s index don’t have titles and snippets. This is significant because it means the pages have not been indexed by Google, and, therefore, won’t show up for most searchers. I then used one of my super-secret SEO tricks to eliminate the bulk of the snippet-less unindexed pages from the results and discovered only 463 English pages. (Dying to know what my super-secret trick is? Okay, I’ll tell you. Go to the Preferences link on www.google.com and, under Search Language, select the option to search only those pages written in English, then conduct the search again. Most of the pages that have no titles or snippets should disappear.)
  2. Yahoo! shows only a few hundred pages indexed, a much more realistic estimate than Google’s. Included in those results, I found URLs with affiliate tracking parameters appended, which means duplicate content, something you’ll want to avoid. Furthermore, searchers who click on those affiliate-tagged pages will be counted as affiliate referrals and trigger a commission, probably not an intended consequence.
  3. Error pages (e.g., MIVA Merchant Fatal Error pages) have made their way into Google’s index. This is one of my pet peeves. Error pages should always return a 404 status code, thus ensuring the page will not end up in the SERPs (which stands for “Search Engine Results Pages,” an oft-used acronym amongst us SEO types).
  4. The HTML code is quite bloated with inline JavaScript, tables used for layout, comment tags and so forth. Bloated HTML, particularly high up in the page, pushes the keyword-rich body copy further down the page, thus lowering keyword prominence.
  5. The sister site www.wildplanet.com links to www.shopwildplanet.com in multiple places, which is good. However, most of those links have tracking tags appended to the URL. This passes link value to a different version of the home page, i.e., not to http://www.shopwildplanet.com but instead to http://www.shopwildplanet.com/?utm_medium=referral&utm_source=wildplanetcom&utm_term=shop+wild+planet+tab.
  6. There are two versions of the website in Yahoo!. One is at shopwildplanet.com and the other is at www.shopwildplanet.com. That is because a 301 permanent redirect has not been put in place on shopwildplanet.com to point to www.shopwildplanet.com. That redirect should be in place across all pages on shopwildplanet.com, not just the home page.
  7. The RSS feed features quite a number of items (119 in all); however, none of them has a <content:encoded> container. Therefore, no HTML is embedded in the RSS feed, which means no tracking of “opens” using “web bugs” (1-pixel GIFs with unique, trackable filenames that, when loaded, confirm that the item has been viewed). Furthermore, a summary feed like this isn’t as meaty to the RSS search engines, and it isn’t as attractive or useful to users when viewed in their RSS newsreader.
  8. The meta keywords are too long (currently 40 words, should be more like 10) and, thus, look spammy. This same list of meta keywords is used across the whole site, rather than being unique and relevant to the page on which it appears. Since meta keywords offer no real ranking benefit, it’s probably not worth the effort to tailor them to each page. Removing the meta keywords tag altogether is probably the easiest solution here.
  9. I was pleased to see that CSS was employed for the mouseover navigation, so the navigation choices underneath the mouseover are accessible to spiders.
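The English-only filter from point 1 can also be applied directly in the search URL rather than through the Preferences page. This is a minimal sketch assuming Google’s historical `lr` (language restrict) query parameter; the function name is my own:

```python
from urllib.parse import urlencode

def english_only_site_search(domain):
    """Build a site: search URL restricted to English-language pages.

    The lr=lang_en parameter mirrors the Search Language setting on the
    Preferences page described in point 1."""
    params = {"q": f"site:{domain}", "lr": "lang_en"}
    return "https://www.google.com/search?" + urlencode(params)

print(english_only_site_search("shopwildplanet.com"))
```

Comparing the result counts with and without `lr=lang_en` gives a quick read on how many of the “indexed” pages are really just snippet-less placeholders.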
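The 404 fix from point 3 comes down to the status line the error page sends. As a sketch (not MIVA Merchant’s actual mechanism, which I haven’t shown here), this is what a correct error response looks like in a WSGI-style Python handler:

```python
def fatal_error_response(message):
    """Return a proper error response: a 404 status code tells crawlers to
    drop the page, whereas a 200 OK lets the error page leak into the index."""
    status = "404 Not Found"
    headers = [("Content-Type", "text/html")]
    body = f"<html><body><h1>Error</h1><p>{message}</p></body></html>"
    return status, headers, body
```

The key point is that the status line, not the page copy, is what the search engines act on; an error message served with 200 OK is just another indexable page.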
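The duplicate-content problem in points 2 and 5 can be contained by canonicalizing URLs before they are linked or indexed, i.e., stripping the tracking parameters. A minimal sketch, assuming the Google Analytics `utm_*` tags seen above are the offenders (the site’s affiliate parameter names, which I don’t know, would be added to the same list):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Assumption: utm_* covers the tracking tags; extend with affiliate params.
TRACKING_PREFIXES = ("utm_",)

def canonical_url(url):
    """Drop tracking query parameters so all links consolidate on one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if not k.startswith(TRACKING_PREFIXES)]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

Run over the tagged home-page link from point 5, this collapses it back to the plain http://www.shopwildplanet.com/ that should be accumulating the link value.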
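The site-wide 301 from point 6 is normally a one-line rewrite rule in the web server’s configuration, but the logic it must implement is simple enough to sketch in Python. The function name and structure here are illustrative, not the site’s actual setup:

```python
CANONICAL_HOST = "www.shopwildplanet.com"

def canonicalize(host, path):
    """Return (status, Location) for a permanent redirect to the www host,
    or None if the request already uses the canonical hostname.

    Note the requested path is preserved, so every page on the bare
    domain redirects, not just the home page."""
    if host != CANONICAL_HOST:
        return 301, f"http://{CANONICAL_HOST}{path}"
    return None
```

Because the path rides along, deep links to shopwildplanet.com consolidate onto their www equivalents instead of splitting the site into two indexed copies.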
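To make point 7 concrete, here is a sketch of an RSS item that carries both a plain summary and full HTML content with a web bug. The item and image names are hypothetical, and the channel element would need to declare the content-module namespace (xmlns:content="http://purl.org/rss/1.0/modules/content/") for <content:encoded> to be valid:

```python
def rss_item(title, link, summary, html_body, bug_url):
    """Build an RSS <item> with <description> for the summary and
    <content:encoded> for the full HTML, wrapped in CDATA.

    A 1x1 image (the "web bug") is appended to the HTML; each request
    for its uniquely named file confirms the item was viewed."""
    html = f'{html_body}<img src="{bug_url}" width="1" height="1" alt="" />'
    return (
        "<item>"
        f"<title>{title}</title>"
        f"<link>{link}</link>"
        f"<description>{summary}</description>"
        f"<content:encoded><![CDATA[{html}]]></content:encoded>"
        "</item>"
    )
```

A full-content feed like this reads better in newsreaders, gives RSS search engines more to index, and makes open tracking possible, all three of the shortcomings noted in point 7.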