Usability

Effective Tagging for Both Usability & SEO

November 15th, 2007

by

Originally published in Search Engine Land

“In this era of Web 2.0, it seems that blogs, mash-ups, RSS feeds, and wikis have been the buzzwords occupying most of the limelight. But personally, tagging is the Web 2.0 technology that excites me the most, because of its versatility and wide applicability,” writes Stephan Spencer, President and Founder of Netconcepts, in this article written for Search Engine Land. Find out how you can utilize effective tagging for your website, social bookmarks, or other Web 2.0 functionality to get the most out of tagging and SEO.

Continue reading »

Using Flickr to Optimize for Yahoo Image Search

September 19th, 2007

by

Originally published in Natural Search Blog

Google Blogoscoped reports that Yahoo’s Image Search now particularly likes Flickr content, so this may be incentive for webmasters to use Flickr “as a kind of Yahoo search engine optimization”. My frequent readers know that I’ve been advocating using Flickr for image search optimization for some time now, and I’ve been speaking on this subject at Search Engine Strategies conferences as well.

Continue reading »

New Google Analytics still poor experience

August 1st, 2007

by

Originally published in Natural Search Blog

Have you accessed the new Google Analytics package yet? Chris Smith gives us an inside look at its usability in this article from the Natural Search Blog. Chris describes the new analytics "upgrade" as "all glitz with little beneficial substance." Read more about the updated Google Analytics from an SEO expert's point of view.

Continue reading »

Options for Optimizing AJAX

March 2nd, 2007

by

AJAX-driven web applications are becoming increasingly popular on commercial websites. When used properly, AJAX can enrich and yet simplify the user's experience. AJAX can also provide a highly user-friendly interface that works smoothly, quickly, and often better than traditional programming.

AJAX is short for Asynchronous JavaScript and XML (Extensible Markup Language). Make no mistake about it — JavaScript and XML are not "new" technologies. Both programming models have been around for some time. However, the unique combination of JavaScript and XML is relatively recent, as are the problems AJAX presents for a site's search engine visibility.

The primary benefit of developing a site with AJAX is the ability to work invisibly in the background of a site. AJAX is used to supply data to the client browser that renders up as a relatively seamless “application” instead of the click-and-wait-to-load functionality associated with more conventional web page constructs.

How seamless is the user experience with AJAX? Check out Google Maps or Google Suggest to see world-class AJAX applications in motion. You can find what you want, when you want it, with relative ease and accuracy when AJAX is in use. What you can’t find is a unique URL or navigational links for search engine spiders to crawl and index, which brings us to our first SEO barrier to overcome — the “J” in AJAX.

JavaScript has been a stumbling block for search engine visibility for quite some time. None of the major search engines show any indication of overcoming these types of scripted data issues anytime soon. Consequently, the single greatest optimization issue with AJAX is the tendency to not generate unique, bookmarkable, linkable and therefore indexable URLs.
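One common workaround, sketched here with hypothetical names (this is an illustration of the technique, not any particular site's implementation), is to map each AJAX state to a unique, crawlable static URL:

```javascript
// Hypothetical helper: map an AJAX search state to a unique, crawlable URL.
// Spiders (and bookmarks) need a distinct address for every piece of content;
// a slugified static path provides one, while the AJAX layer can still fetch
// the same data asynchronously for script-enabled visitors.
function staticUrlFor(query) {
  var slug = query
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')  // collapse punctuation/whitespace into hyphens
    .replace(/^-+|-+$/g, '');     // trim leading/trailing hyphens
  return '/search/' + slug + '.html';
}

// staticUrlFor('Scion xB custom') -> '/search/scion-xb-custom.html'
```

Every search state then has a linkable, bookmarkable address that can be interlinked throughout the site for spiders to follow.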

The comparative shopping engine Become.com overcomes this barrier by creating and linking together static URLs of search results pages. A quick [site:www.become.com] search in Google reveals how well this AJAX workaround is indexed.

Meanwhile, sites like Scion.com fail to make the same programmatic leap to provide a similar search experience. Imagine how the carmaker could promote celebrity-built custom automobiles in the search engines if only static pages of a punked-out Ashton Kutcher or a blinged-out Usher-mobile were rendered and linked to throughout the site.

While AJAX can be a great way to enhance the user experience, visitors using non-JavaScript-enabled browsers won't have a great on-site experience. When it comes to site accessibility and SEO, it's imperative that an AJAX-alternate experience be provided.

Because AJAX relies on JavaScript, as well as Cascading Style Sheets (CSS) and XML, it's relatively easy to provide an alternate experience for non-JavaScript users. The key is to tap into your CSS and XML files to render other versions of the AJAX application. This tactic is known as "progressive enhancement."

Progressive enhancement is a web design strategy that emphasizes accessibility, semantic markup, external style sheets, and scripting technologies. By layering designs progressively, it allows all users, and search engine spiders, to access the basic content and functionality of any web page.

When implementing progressive enhancement, a basic markup document is created, geared toward the lowest common denominator of browser software functionality. The web designer then adds functionality or enhancements to the presentation and behavior of the page using CSS, JavaScript or other combinations of Flash or Java applets. In tandem with user-agent detection, progressive enhancement will automatically render both user- and search engine-friendly pages.
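As a rough sketch of this layering (the markup and function names here are illustrative assumptions, not code from any of the sites mentioned), the base document serves plain, crawlable links that work everywhere, and a script-enabled browser is then upgraded in place:

```javascript
// Illustrative progressive-enhancement sketch. The base markup is ordinary,
// crawlable anchor links; an enhancement layer (run only where JavaScript is
// available) would intercept clicks and fetch content asynchronously instead.
function buildBasicMarkup(categories) {
  // Lowest common denominator: every browser and every spider can follow these.
  return categories
    .map(function (c) {
      return '<a href="/browse/' + c.slug + '.html">' + c.name + '</a>';
    })
    .join('\n');
}

function enhance(markup, scriptingAvailable) {
  // In a real page this step would attach click handlers that swap in AJAX
  // loads; here we simply flag the enhanced state to keep the sketch minimal.
  return scriptingAvailable
    ? '<div class="ajax-enabled">' + markup + '</div>'
    : markup;
}

var links = buildBasicMarkup([{ slug: 'rings', name: 'Rings' }]);
// links -> '<a href="/browse/rings.html">Rings</a>'
```

Non-JavaScript users (and spiders) get the plain anchors unchanged; everyone else gets the richer AJAX behavior layered on top.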

You can observe progressive enhancement in motion by visiting Amazon's Create Your Own Ring page. Simply turn off your JavaScript capabilities to see how the program maintains its AJAX-like functionality for all users. Also note that the initial load of the AJAX application contains the optimized elements such as title attributes, header tags and meta description, as well as a crawlable static URL. All of this is visible in Google cache and revealed in the page's search engine snippet:


Amazon.com: Create Your Own Ring: Diamond Search
The Amazon.com Collection. Why Buy Jewelry & Watches at Amazon?
… More to Explore. Preset Engagement Rings … Create Your Own Ring …

www.amazon.com/gp/cyo/cyor-fork.html


To produce these particular SEO elements, server-side scripts and .htaccess rewrite rules are required. (If the site is not running on an Apache server, the rewrite module may not be an option, but there are always alternative solutions.)
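A minimal example of the kind of rewrite rule involved (the paths and parameter names here are illustrative, not taken from any site discussed above) might look like this in an .htaccess file:

```apache
# Illustrative .htaccess fragment: expose a script-driven search page
# under clean, static-looking URLs that spiders can crawl and users can link to.
RewriteEngine On
RewriteRule ^search/([a-z0-9-]+)\.html$ /search.php?q=$1 [L,QSA]
```

The visitor and the spider both see /search/some-term.html, while the server quietly maps the request to the underlying dynamic script.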

When optimizing AJAX it’s important to remember three things: Search engine results are affected by on-the-page, behind-the-page and off-the-page factors. It’s essential to provide an alternate way for users and spiders to navigate their way through to all of your great content without sacrificing usability, accessibility and linkability.

Resolve to Produce Great Content

January 3rd, 2007

by

Originally published in ClickZ

The best thing you can do to grow your search engine referrals this year is to focus on producing great content, says PJ Fusco, lead strategist with Netconcepts, in this article for ClickZ. After all, "content is king," and it's all about crowning that king by speaking to your audience in a language that appeals to them. And Pat advises that when writing articles for the web, short stories are better than novels.

Continue reading »

We’ve Googlized a client’s home page!

June 15th, 2006

by

I’m usually of the mind that home pages should be rich with textual content so the search engines have something to sink their teeth into. In most cases it’s your home page that gets the most weight of all the pages of your site, so you don’t want to squander that opportunity. However, there are (rare) exceptions to this — times when another approach is in order — where you strip away all but the most essential components (sometimes all the way down to just a search box).

[Screenshot: Trustcite.co.nz home page]

This is referred to in some circles as "home page Googlization." Usability guru Jared Spool recently blogged about home page Googlization. I pretty much agree with his take on this subject. However, we felt that the homepage of our client TrustCite was an exception that warranted Googlizing. The design is very minimalistic. Have a look at it. For this site, simplicity and responsiveness were of primary importance, because the site is meant to become a frequently used resource for New Zealanders. Its singular purpose is to help Kiwis find reputable tradespeople and service providers by relying on feedback from the user's social network. The primary method of locating these suppliers is through the search box, although there are strong trigger words on the page tucked away under the "Browse categories [+]" link.

Other examples of sites where I think home page Googlization would be in order:

  • Wikipedia (rarely are any of the trivia featured on the home page of interest to me, and never has this filler content been what I went to Wikipedia for)
  • most bank homepages (all I care about as a customer is the online banking login form… take me to my money!)

Usable and Findable: Optimising Search Rankings and User Experience

Usability Professionals Association Auckland Chapter Meeting — Auckland

September 27th, 2005

Seminar by

The marriage of search engine optimisation and usability can be a happy one. Granted, just creating a successful user experience can be a challenge. But to also cater to the search engines' algorithms concomitantly can seem downright daunting. Many companies, often inadvertently, choose one approach over the other. The goal, elusive as it may seem, is improved search engine rankings ALONG WITH greater accessibility and better overall usability. Get ready for a dose of insight, strategy, process, and well-considered opinion to cure what ails your site.

Join Stephan for an information-packed session covering:

  • Wordsmithing approaches
  • Benchmarking criteria
  • Contextual linking
  • Role of keyword analysis
  • Optimal site structure
  • Wielding the full power of CSS
  • Measuring Return On Investment
  • Best practices & worst practices

New eyetracking study: where Google searchers look and click

March 10th, 2005

by

[Aggregate heat map]

I found the eyetracking study from Enquiro and Did-It unveiled last week at Search Engine Strategies and covered in Search Day fascinating. The aggregate heat map shown on the right (larger version here) shows where participants focused their eyes (and their attention) the most. As you can see, the first listing not only drew the most attention; it was also read more fully from left to right than the other listings.

Visibility drops the further down the search results you go, and clickthroughs drop even more markedly (as you can see from the graphs below). This got me thinking about Zipf's Law. As Seth Godin explains, Zipf's Law applies to Top Ten Lists; perhaps it applies to the SERPs (search engine results pages) too? (In general terms, Zipf's Law states that being #1 is much, much better than being #2, which is much, much better than being #3, and so on. So dominating a Top 10 list is critical.) Although these graphs don't follow Zipf's Law exactly, given this data I'd consider it foolish to be complacent if your search listings are not at the very top of the SERPs.
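To make the Zipf intuition concrete, here is a quick back-of-the-envelope calculation. The 1/rank weighting is the textbook Zipf form, not data from the eyetracking study itself:

```javascript
// Zipf's Law sketch: position k gets weight proportional to 1/k.
// Normalizing over the top 10 positions shows how steeply attention decays.
function zipfShares(n) {
  var weights = [];
  var total = 0;
  for (var k = 1; k <= n; k++) {
    weights.push(1 / k);
    total += 1 / k;
  }
  // Divide each weight by the harmonic sum so the shares add up to 1.
  return weights.map(function (w) { return w / total; });
}

var shares = zipfShares(10);
// Under this model, position #1 gets exactly twice the share of #2 and three
// times the share of #3; with n = 10, #1 alone captures roughly a third of
// all attention.
```

Even though the study's curves don't match this distribution exactly, the shape of the decay is the same: the top spot is disproportionately valuable.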

What is it about searchers that makes them so blind to relevant results further down the page? Is this due to the “implied endorsement” effect, where searchers tend to simply trust Google to point them to the right thing? Or is it just the way humans are wired, to make snap decisions, as Malcolm Gladwell insightfully explains in his new book, Blink? According to the study, 72% of searchers click on the first link of interest, whereas 25.5% read all listings first, then decide. My guess is that both effects (“implied endorsement” and “rapid cognition”) play a role in searcher behavior.

A few other important take-aways from the study:

  1. 6/7 (85%) of searchers click on natural (“organic”) results (not 60/40 as the search engines and PPC (pay-per-click) vendors would have you believe).
  2. The top 4 sponsored slots are equivalent in views to being ranked at #7 – #10 natural.
  3. (corollary to #2): This means if you need to make a business case for natural search, then (assuming you can attain at least #3 rank in natural for the same keywords you bid on) natural search could be worth two to three times your PPC results.

In all, a superb research study. Great job Did-It, Enquiro, and EyeTools!

[Line graph of visibility]
[Line graph of clickthroughs]


Web content really IS critical!

August 26th, 2004

by

Today I had the pleasure to hear web content guru Gerry McGovern speak at a full-day workshop in Wellington, New Zealand. He’s got to be one of the very best speakers I’ve ever heard! His course material, his sense of humor, his thought-provoking insights, and especially his Irish accent — had everyone in the audience mesmerized. Here’s a sampling of the day’s take-aways:

Gerry covered so much more than this, but it would take a book to cover it all. Oh, wait a minute… there is a book covering it all. Buy Gerry’s book, Content Critical.


Your Web Site Should Not Need a Manual

February 1st, 2003

by

Originally published in Unlimited

Usability. Boring but crucial, it’s about making your website easy and intuitive to use. Users shouldn’t need to learn how to use your site. Put stuff where people expect it.

Continue reading »