Driving Traffic to Your Website (Part 2)

July 1st, 2000


Originally published in Building Online Business magazine, July 2000.

Part two of a two-part series.

Last updated March 25, 2002.

Last month, part one of this article covered the basics of
driving more traffic to a website through the use of domain names,
directory listings, and better positioning in the major search engines.
Part two delves deeper into search engine placement tactics, with one
goal: top rankings in the top search engines.

The Internet’s top engines and directories account for more than 95
percent of all search traffic. Yahoo! alone commands more than half the
market, and a Compaq study found that 68 percent of 500 million users
only looked at the first page of results. For a top 10 search results
position, the focus must clearly be on the big players.

A company should start with its existing website. Each page
should be optimized with its own five- to 13-word title tag, H1 tag,
meta description (a good description of your page’s content but with a
call-to-action that compels the visitor to click through to your site
from the search results), and meta keywords (specifically relevant to
the page). Meta keywords needn’t be capitalized, as searchers usually
use lower case. Don’t separate meta keywords with both commas and
spaces — search engines typically ignore keywords after a certain
number of characters, so save space by using one or the other. Reduce
or eliminate meta keyword repetition by combining phrases. “Vacation
travel,” “travel agency” and “Caribbean vacation,” for example, can
become “Caribbean vacation travel agency.” Avoid potential lawsuits by
refraining from using competitors’ trademarks or trade names in the
meta tags.
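Put together, a page head following those guidelines might look like the
sketch below (the travel-agency values are illustrative, continuing the
keyword-combining example above):

```html
<head>
  <!-- Five- to 13-word title tag (8 words here) -->
  <title>Caribbean Vacation Travel Agency: Discount Island Travel Packages</title>
  <!-- Description of the page's content, with a call-to-action -->
  <meta name="description"
        content="Compare discount Caribbean vacation packages from an
                 established travel agency. Request a free quote today.">
  <!-- Lower-case keywords, commas only, phrases combined to avoid repetition -->
  <meta name="keywords"
        content="caribbean vacation travel agency,discount island packages">
</head>
```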

Minimize the amount of code — Javascript, tables, imagemap
definitions, etc. — above the body copy in the HTML source.
Javascripts can be moved to the bottom of the page, or better yet, to
separate .js files that search engines will skip.
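For example, a long inline rollover script can shrink to a one-line
reference near the end of the page (the filename here is hypothetical):

```html
<body>
  <h1>Keyword-rich heading and body copy come first in the source</h1>
  <p>...</p>

  <!-- Script reference at the bottom; spiders skip the external .js file -->
  <script type="text/javascript" src="rollovers.js"></script>
</body>
```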

As noted in part one, frames and dynamic pages can cause grief,
since many of the major search engines are not very frames-capable. If
frames are a must, though, include the noframes tag in the frameset
pages. Nested within that should be body tags, links, and keyword-rich
content. Some excellent examples of this
can be found at Search Engine Watch (www.searchenginewatch.com) and Search Engine Showdown (www.searchengineshowdown.com).
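A minimal frameset with the noframes fallback nested inside it might
look like this (the filenames and copy are illustrative):

```html
<frameset cols="200,*">
  <frame src="nav.html">
  <frame src="main.html">
  <!-- Spiders and frames-incapable browsers see this instead: -->
  <noframes>
    <body>
      <h1>Caribbean Vacation Travel Agency</h1>
      <p>Keyword-rich summary of the site, with regular links so the
         content pages can still be crawled:</p>
      <a href="main.html">Browse our Caribbean vacation packages</a>
    </body>
  </noframes>
</frameset>
```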

Dynamic (database-driven) content is a bit more difficult to get
around — the trick is to remove all question marks from the URLs
without breaking anything. Fortunately, there is a way to rewrite URLs
to make the pages appear static and without having to bother with any
programming code. To do this, use the mod_rewrite Apache module (www.engelschall.com/pw/apache/rewriteguide/).
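A sketch of such a rule (the paths and parameter name here are
hypothetical, not from the article):

```apache
# In httpd.conf: serve static-looking URLs from the dynamic script,
# so spiders never encounter a question mark in the URL.
RewriteEngine On
RewriteRule ^/products/([0-9]+)\.html$ /catalog.cgi?item=$1 [L]
```

Internal links on the site would then use the static-looking
/products/42.html form throughout.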

Besides HTML page content itself, a number of so-called “off the
page” criteria affect search ranking. Search engine Direct Hit (www.directhit.com),
for example, determines site popularity by the number of clickthroughs
from search results, and ranks sites accordingly. Inktomi (www.inktomi.com) considers link popularity (number of links from other sites) as a major determinant of search result placement. Google (www.google.com)
takes this a step further by differentiating among referring sites, so
that some referring sites carry more weight than others. Overture.com
has a pay-to-play model in which companies bid for top placement. Link
popularity on Google, AltaVista, Lycos, and HotBot can be checked with
a free service from Marketleap (linkpop.marketleap.com).

Try to have at least one link to each page of your site from
either the home page or another page close to the home page, and from
other domains that your company owns. This will increase link
popularity and help avoid being dropped from a search engine when it
“re-spiders” (revisits) the site. Hyperlinked text on those
pages should contain relevant keywords. Be careful about hiding these
links, though — search engines may interpret the technique as
“spamdexing” (i.e. search engine spamming), and drop the site.
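The difference is simply the anchor text (the page name here is
illustrative):

```html
<!-- Keyword-rich link text that search engines can weigh: -->
<a href="packages.html">Caribbean vacation packages</a>

<!-- rather than the generic: -->
<a href="packages.html">Click here</a>
```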

“Free for all” links pages yield substandard results. It’s
better for a company to establish reciprocal links with other sites, or
to develop an affiliate program where affiliates drive traffic through
links and merchants in return share revenue on sales directly
attributable to that traffic (pay-for-performance).

Careful thought must be put into the URLs of website pages and
how those pages are linked. Pages placed many subdirectories deep are
at a disadvantage as some search engines may not crawl that deep.
Including good keywords in the names of files and/or directories may
help rankings, although this is not proven. Some search engines start
spidering exclusively at the main index pages, or give preference to
index pages and pages found by the spider rather than by direct
submission. Remember to include links to every one of your web pages
from your site map page and from other locations in your site.

Sometimes a “hallway page” (www.webposition.com/mp-0799.htm)
is employed as an alternative to or to augment a site map in assisting
the search engine spiders to discover content pages. To avoid being
flagged as potential spam, the hallway page should include relevant
content in addition to links. Sometimes a hallway page is hosted on a
different domain from the main website. This tactic can improve
visibility in those search engines that limit the number of pages from
a domain name that are listed on a page of search results. However,
such a page will be at a disadvantage, too, if it has low link popularity.

Rather than working to build great content, some overzealous
search engine optimizers choose to build search engine bait known as
“doorway pages.” Such pages are typically devoid of meaningful content
for human visitors. A much better strategy is to make interesting and
useful content pages that are an integral part of your website.
Furthermore, most doorway pages are poorly designed and contain
unintelligible gibberish, which does not deliver a good first
impression. Web pages should be built ethically and have “real” content
for human visitors.

Don’t try to outsmart the search engines. Trouble may follow if your web pages are:

  • Machine-generated
  • Duplicated with minimal changes
  • Stolen from other sites
  • Targeted to obviously irrelevant keywords
  • Overstuffed with keywords
  • Filled with gibberish (“spamglish”)
  • Filled with invisible text and links the same color as the background
  • Set up to automatically redirect the user or to send vastly different page content to search engines than to users (“bait-and-switch”)

Google, Inktomi, AltaVista, and others are cracking down on
such techniques. Play by the rules — spamdexers are inevitably found
and banned from the search engines.

With keyword-rich content pages written, designed, optimized
and uploaded, it’s time to submit to the search engines. Or not. Some
search engines like Inktomi actually penalize your site if you submit
your URL to them through the free URL submission process. Thus it’s
often better to let the search engines discover your site on their own.
Of course this requires that your pages are well-linked, both
internally and from the outside. Turnaround time varies from several
days to several months, depending on the search engine. Do submit your
site to the directories. Open Directory is free, but Yahoo! and
LookSmart require commercial sites to pay a submission fee of $299/year
and $299 one-time, respectively.

If several months pass with no spidering by one of the search engines,
resubmit. Each search engine has its own submission limits: to avoid
spamdexing penalties, don’t do “deep submissions,” or simultaneous
submissions of multiple pages deep within the site. Submit to the
search engines manually. Indeed, there is no other way to submit to
AltaVista but manually. WebPosition Gold does have an automated
submission component that impersonates a web browser to the search
engines when submitting, but search engines are becoming more clever at
spotting WebPosition Gold submissions, so this little time-saver could
come back to haunt you.

Also worth considering is registering one or more “Internet keywords” with RealNames (www.realnames.com),
at $100 to $200 per year per keyword. RealNames is affiliated with
search engines such as AltaVista, meta-search engines such as
MetaCrawler, and even the popular web browser Internet Explorer.

After pages are submitted, monitor rankings periodically over
time and make changes as needed. Online services such as
PositionAgent.com and WebPosition Gold (www.webposition.com)
offer automated search position monitoring and reporting; however, the
search engines despise these tools, as automated position-checking
queries comprise as much as 30 percent of search engine usage. Instead,
check your rankings by hand and constrain such searches to non-peak
hours.

Identify the keywords people use in the search engines to
reach your site. WebTrends and other log analysis software packages can
provide this information.
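As a sketch of what such packages do under the hood, search keywords can
also be pulled straight from an Apache referrer log with standard shell
tools. The sample log lines below are fabricated, and the “q=” parameter
name is the one Google uses; other engines name it differently:

```shell
# Sample lines in Apache "combined" log format; the referrer field
# records the search-engine URL, including the visitor's query.
cat > sample_access_log <<'EOF'
1.2.3.4 - - [01/Jul/2000:10:00:00 -0500] "GET / HTTP/1.0" 200 1024 "http://www.google.com/search?q=caribbean+vacation" "Mozilla/4.0"
1.2.3.4 - - [01/Jul/2000:10:05:00 -0500] "GET / HTTP/1.0" 200 1024 "http://www.google.com/search?q=travel+agency" "Mozilla/4.0"
5.6.7.8 - - [01/Jul/2000:10:06:00 -0500] "GET / HTTP/1.0" 200 1024 "http://www.google.com/search?q=caribbean+vacation" "Mozilla/4.0"
EOF

# Extract the q= parameter, decode the +, then count and rank
# the keywords, most frequent first.
grep -o 'q=[^&"]*' sample_access_log | sed 's/^q=//;s/+/ /g' \
  | sort | uniq -c | sort -rn
```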

Just as a website is never truly finished, search engine
optimization is an ongoing process. A word of caution: don’t obsess
over creating the “perfect” page. Since search engines don’t release
information such as optimal keyword densities, it’s all an educated
guess based on empirical evidence, and those numbers can change as
quickly as they are established. Concentrate on creating and refining
great content, establishing more links to your pages, and ethically
employing the above tactics, then watch your web traffic increase
(standard disclaimers apply).

