Entrepreneur Magazine interview

January 9th, 2004

by

Originally published in Entrepreneur Magazine

Hear the full interview at WSRadio.com

1) When it comes to search engines and search marketing, what are the major engines for consumers these days?

Of course there are quite a few search engines in existence, but at this time the big players are Google, AOL, Yahoo, and MSN. Together they control 94% of all searches.

Now, according to comScore Media Metrix, Google has 32% of the market, Yahoo 26%, AOL 19%, and MSN 17%, with Lycos, AltaVista, Ask Jeeves, AlltheWeb, and others rounding out the remainder.

What is significant about these stats is that Google powers AOL's and Yahoo's search queries, while Inktomi currently powers MSN's crawler-based results. This boils the marketplace down to two players: Google, which powers nearly 80% of the market, and Inktomi, which powers 17%.

These numbers are expected to shift this year, but for now Google remains the optimum engine for search marketers. Not since the early days of television have marketers had this kind of reach, and even then you had to advertise on three networks to attain that kind of penetration.

2) Why has Google been so successful?

It began with their methodology for determining relevant results. If you recall, five years ago search was an exercise in frustration; it was very difficult to find what you wanted. A large part of that was because the algorithms used by the first-generation engines were susceptible to manipulation by search engine optimizers. By repeating various keywords in your meta tags you could raise your rankings, thus polluting the overall result set. And that chaos worked in favor of the engines, because they could sell advertising where you could guarantee your position.

Google set out with a more user-friendly model. Their ranking methodology became based on reputation, or link popularity, which they call PageRank. PageRank evaluates the relative importance of each page on the web based on the quantity and importance of the inbound links pointing to that page. So a link from CNN is infinitely more valuable to you than a link from Jim Bob's personal home page.

So it became nearly impossible to spam Google and artificially inflate your positions; you had to actually contact other human beings and ask them to link to your site. Users loved the accuracy and speed of Google, and told their friends.

Google's methodology always worked to the advantage of web designers, because it meant they could focus on designing the site to perform well in Google from the ground up, legitimately, by providing good content and creating a spider-friendly layout.

This gave Google more useful content to spider, which helped them index vastly more content than other engines, and the self-reinforcing spiral was in full motion. That established their dominance and helped them land distribution and syndication deals with AOL, Yahoo, Netscape, and others.

3) What about Inktomi?

Well, Yahoo purchased the crawler company Inktomi in 2002 (as well as Overture, I might add) and expects to swap out the Google results that currently power Yahoo search for Inktomi's results in the next few weeks. At that point Inktomi will inherit 26% market share. Whether it or Yahoo can keep that share remains to be seen.

I'm skeptical. From a content-supply standpoint, Inktomi charges you a fee for the pages you get indexed, or a charge for each visitor delivered, usually around 20 to 40 cents per visitor.

In my view, that is what has prevented Inktomi from becoming as dominant as Google: for users, Google scours a wider universe, and for suppliers it costs nothing to get your pages indexed by Google, and the traffic is free.

4) As a marketer, how do you calculate the value search engines could have on your business?

Just guess. Of course not! It starts with keyword research. Tools like WordTracker help you understand how many people search for phrases like Levi's or pressure washers or men's wallets. Once you know how many searches occur globally, you can apply each engine's share to determine the size of the market that engine accounts for. For example, if there are 100,000 people searching for digital cameras every month, then 75,000 of those searches are occurring through Google. Then you can apply your conversion rate and average transaction amount to determine how big that pie is.

Of course, if you are on page 3 of Google for "digital camera," fewer people will find you than if you're on page 1. Research by Penn State showed that 55% of searchers click on a page 1 result, 19% on page 2, and only 10% on page 3. You can apply those percentages to your market value to help you decide whether it's worth focusing on a particular engine, given the competitiveness of the terms and so forth.
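The arithmetic described above can be sketched in a few lines. The search volume, Google share, and page click-through figures are the ones quoted in this interview; the conversion rate and average order value are hypothetical placeholders you would replace with your own numbers.

```python
# Market-sizing sketch based on the figures quoted in the interview.
# conversion_rate and avg_order_value are hypothetical placeholders.

monthly_searches = 100_000               # e.g. "digital cameras" searches per month
google_share = 0.75                      # portion of those searches occurring on Google
page_ctr = {1: 0.55, 2: 0.19, 3: 0.10}   # Penn State click-through rates by results page

conversion_rate = 0.02                   # hypothetical: 2% of visitors buy
avg_order_value = 300.00                 # hypothetical: $300 average transaction

def monthly_revenue_estimate(page: int) -> float:
    """Estimated monthly revenue if you rank on the given results page."""
    visitors = monthly_searches * google_share * page_ctr.get(page, 0.0)
    return visitors * conversion_rate * avg_order_value

for page in (1, 2, 3):
    print(f"Page {page}: ${monthly_revenue_estimate(page):,.0f}/month")
```

Running the sketch makes the page-rank stakes concrete: under these assumptions, the gap between a page 1 and a page 3 ranking is a factor of five and a half in revenue.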

Some of our retail clients are seeing anywhere from $10 to $80 per month in sales for each SKU indexed by Google. When you have a site that has few pages indexed, seeing a 400% jump in traffic is not unusual.

5) Do you have any tips for how to design your site to be friendly to Google and other engines?

Sure. There are over 30 site-design factors that influence how well your site performs in Google. I would say the first thing you must do is make sure that the spiders can crawl your entire site. If you have a dynamic site, as most retailers do, then chances are your product pages aren't being indexed by Google, especially if you have characters like ?, &, and = in your URLs. This is why some of the largest retailers, like Best Buy and Radio Shack, sell hundreds of thousands of products online yet have only a handful of them indexed by Google.

You’ll need to install a web-server mask that can rewrite the URLs on the fly to replace those characters with ones that engines can crawl.
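To make the idea concrete, here is a minimal sketch of what such a rewrite does: mapping a dynamic query-string URL into a static-looking path with none of the problem characters. In practice this mapping is performed by the web server itself (a rewrite module), not application code, and the function name and URL pattern here are illustrative only.

```python
# Illustration of URL "masking": rewriting a dynamic query-string URL
# into a spider-friendly path containing no ?, & or = characters.
# Real deployments do this in the web server; this only shows the mapping.
from urllib.parse import urlsplit, parse_qsl

def mask_url(dynamic_url: str) -> str:
    """Turn a URL like /product.asp?cat=5&id=123 into /product/cat/5/id/123."""
    parts = urlsplit(dynamic_url)
    path = parts.path.rsplit(".", 1)[0]       # drop the .asp/.php extension
    for key, value in parse_qsl(parts.query):  # each query parameter becomes
        path += f"/{key}/{value}"              # a pair of path segments
    return path

print(mask_url("/product.asp?cat=5&id=123"))  # -> /product/cat/5/id/123
```

The crawler then sees an ordinary-looking static path, while the server translates it back to the dynamic script behind the scenes.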

Also make sure none of your navigation is hidden behind JavaScript or Flash, which search spiders generally cannot read.

Beyond that, make sure your category and product pages (if you’re a merchant) are designed to “sing” about their topic. The important keywords should be built into your navigation, with plenty of cross linking and merchandising to help guide not only spiders, but users as well. It’s important to remember that a search engine friendly site is also a user friendly site.

It is also important to use header tags in your HTML, and to take advantage of your image alt tags and your title tags, which double as your call to action in the search listings themselves. Each title should contain the keyword and a reason to click on you. And make a concerted effort to build links with other sites.

6) How do you keep up with the changes the engines make to their algorithms?

Actually, that's a bit of a myth that many SEOs like to use to scare you into signing long-term contracts. We've found that if you design your site to be friendly to search engines from the ground up, it takes remarkably little maintenance time, perhaps a few hours every few months to touch up your keywords.

Now, Google does make periodic updates to its algorithm. Around Thanksgiving they came out with their "Florida" update, which threw the entire industry into a tailspin. At the Chicago SES conference you had people standing up and accusing Google of destroying their businesses. The change had to do with what's called stemming: including word derivatives in the search results. But again, if you build a well-designed site with good content that other people will link to, you will weather the storms.

7) Any sites marketers should study for best or worst practices?

Yes. In the State of Search Engine Marketing report we published last year, the most search-friendly websites we found were Crutchfield, Dell, IBM, Amazon, and Lands' End. And I would add some of the sites Netconcepts has developed, like vandykes.com.

The sites that fared the worst in our study were Williams-Sonoma, JCPenney, Talbots, J.Crew, and Best Buy.

Some common worst practices to avoid on your own site: splash pages that are devoid of content, the same title tags across the entire site, home page redirect scripts, and relying on drop-down menus.

8) How do you check whether your site is visible to Google or not?

A few ways. Type into Google "allinurl:" followed by your domain name. You can also augment this query with the filename of your product-page template to see how many product pages are indexed:

allinurl:jcpenney.com ProductList.aspx

Or if you email me at brian@netconcepts.com with your contact details and a few competitors, I will email you back a customized query for your site and your competitors to show you your Google visibility.

9) So what changes should marketers be aware of as we go through 2004?

Well the biggie is the landscape shift. Yahoo will be swapping out Google for Inktomi. And MSN will be launching its own crawler instead of relying on Inktomi.

And we’re also moving towards more localized search where the engines will know your zip code and be able to feed you listings that match your own geographic area.

Imagine being able to look up in the Yellow Pages not only a merchant's contact details, but to query their inventory instantly and determine who has what you need, how much it costs, its availability, and so forth.

I recommend that any website that does not currently have 100% visibility to Google plan on achieving it this year, because the risk of customers finding some other vendor, whether local or global, is increasing daily.

I don't know where search engine marketing will end up, but search definitely isn't going away. It is becoming more important as do-not-call, do-not-email, and do-not-mail lists proliferate.

I think all businesses need to consider how natural search marketing can be incorporated into their websites and begin to design them accordingly.