Articles

Options for Optimizing AJAX

March 2nd, 2007

AJAX-driven web applications are becoming increasingly popular on commercial websites. Used properly, AJAX can enrich yet simplify a user’s experience. It can also provide a highly user-friendly interface that works smoothly, quickly, and often better than traditional programming.

AJAX is short for Asynchronous JavaScript and XML (Extensible Markup Language). Make no mistake about it — JavaScript and XML are not “new” technologies. Both programming models have been around for some time. However, the unique combination of JavaScript and XML is relatively recent, as are the problems AJAX presents for a site’s search engine visibility.

The primary benefit of developing a site with AJAX is its ability to work invisibly in the background. AJAX supplies data to the client browser that renders as a relatively seamless “application” instead of the click-and-wait-to-load functionality associated with more conventional web page constructs.
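
In its simplest form, the background request at the heart of AJAX looks something like the sketch below. (This is a minimal illustration only; the “/products.xml” endpoint and the “results” element id are hypothetical, not taken from any particular site.)

    function loadResults(url) {
        // Use the native XMLHttpRequest object where available,
        // falling back to the ActiveX equivalent for older versions of IE
        var xhr = window.XMLHttpRequest
            ? new XMLHttpRequest()
            : new ActiveXObject("Microsoft.XMLHTTP");
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                // Swap the fetched content into the page without a full reload
                document.getElementById("results").innerHTML = xhr.responseText;
            }
        };
        xhr.open("GET", url, true); // true = asynchronous
        xhr.send(null);
    }

    loadResults("/products.xml");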

How seamless is the user experience with AJAX? Check out Google Maps or Google Suggest to see world-class AJAX applications in motion. You can find what you want, when you want it, with relative ease and accuracy when AJAX is in use. What you can’t find is a unique URL or navigational links for search engine spiders to crawl and index, which brings us to our first SEO barrier to overcome — the “J” in AJAX.

JavaScript has been a stumbling block for search engine visibility for quite some time, and none of the major search engines show any indication of overcoming these types of scripted-data issues anytime soon. Consequently, the single greatest optimization issue with AJAX is its tendency not to generate unique, bookmarkable, linkable, and therefore indexable URLs.
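
One widely used workaround, sketched below rather than prescribed, is to mirror the application’s state into the URL fragment so that each AJAX view gets its own bookmarkable, linkable address. (The “page=2” state name is hypothetical, and the example reuses the loadResults() function sketched earlier.)

    // Record the current view in the fragment, e.g. http://example.com/#page=2
    function saveState(state) {
        window.location.hash = state;
    }

    // On arrival, rebuild the bookmarked view from the fragment
    function restoreState() {
        var state = window.location.hash.replace("#", "");
        if (state) {
            loadResults("/" + state + ".xml");
        }
    }

    window.onload = restoreState;

Fragments solve bookmarking and linking for users, but spiders still need real, static URLs to crawl, which is exactly what the best AJAX implementations provide.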

The comparative shopping engine Become.com overcomes this barrier by creating and linking together static URLs of search results pages. A quick [site:www.become.com] search in Google reveals how well this AJAX workaround is indexed.

Meanwhile, sites like Scion.com fail to make the same programmatic leap to provide a similar search experience. Imagine how the carmaker could promote celebrity-built custom automobiles in the search engines if only static pages of a punked-out Ashton Kutcher or a blinged-out Usher-mobile were rendered and linked to throughout the site.

While AJAX can be a great way to enhance the user experience, visitors using browsers without JavaScript enabled won’t have the same on-site experience. When it comes to site accessibility and SEO, it’s imperative that an AJAX-alternate experience be provided.

Because AJAX relies on JavaScript, as well as Cascading Style Sheets (CSS) and XML, it’s relatively easy to provide an alternate experience for non-JavaScript users. The key is to tap into your CSS and XML files to render other versions of the AJAX application. This tactic is known as “progressive enhancement.”

Progressive enhancement is a web design strategy that emphasizes accessibility, semantic markup, external style sheets, and scripting technologies. By layering designs progressively, it allows all users, including search engine spiders, to access the basic content and functionality of any web page.

When implementing progressive enhancement, a basic markup document is created, geared toward the lowest common denominator of browser functionality. The web designer then layers enhancements onto the page’s presentation and behavior using CSS, JavaScript, Flash or Java applets. In tandem with user-agent detection, progressive enhancement automatically renders both user- and search-engine-friendly pages.
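
In markup terms, that layering can be as simple as the sketch below (the URLs and element ids are hypothetical, and the script reuses the loadResults() function sketched earlier):

    <!-- The baseline: a plain link that works in any browser -->
    <a id="more-link" href="/reviews.html">Read customer reviews</a>

    <script type="text/javascript">
        // The enhancement: with JavaScript on, intercept the click and
        // fetch the content in place; with JavaScript off, the plain
        // link above still works for users and spiders alike
        document.getElementById("more-link").onclick = function () {
            loadResults("/reviews.xml");
            return false; // cancel the default page load
        };
    </script>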

You can observe progressive enhancement in motion by visiting Amazon’s Create Your Own Ring page. Simply turn off your JavaScript capabilities to see how the program maintains its AJAX-like functionality for all users. Also note that the initial load of the AJAX application contains optimized elements such as title attributes, header tags and a meta description, as well as a crawlable static URL. All of this is visible in Google cache and revealed in the page’s search engine snippet:

 

Amazon.com: Create Your Own Ring: Diamond Search
The Amazon.com Collection. Why Buy Jewelry & Watches at Amazon?
… More to Explore. Preset Engagement Rings … Create Your Own Ring …

www.amazon.com/gp/cyo/cyor-fork.html

 

To produce these particular SEO elements, server-side scripts and Apache’s .htaccess rewrite module (mod_rewrite) are required. (If the site is not running on an Apache server, the rewrite module may not be an option, but there are always alternative solutions.)
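
As a sketch of the rewrite half of that equation (with hypothetical URLs and script names), a rule like the following maps a crawlable static URL onto the server-side script that renders the same content the AJAX application fetches:

    RewriteEngine On
    # /rings/three-stone is served by /cyo/render.php?style=three-stone
    RewriteRule ^rings/([a-z0-9-]+)$ /cyo/render.php?style=$1 [L]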

When optimizing AJAX, it’s important to remember that search engine results are affected by three things: on-the-page, behind-the-page and off-the-page factors. It’s essential to provide an alternate way for users and spiders to navigate through to all of your great content without sacrificing usability, accessibility or linkability.

Good Cloaking, Evil Cloaking & Detection

March 1st, 2007

Originally published in Search Engine Land

Is cloaking evil? It’s one of the most heavily debated topics in the SEO industry, and people often can’t even agree on what defines cloaking. In this column, I look at an example of what even the search engines might consider “good” cloaking, explore the middle-ground territory that page testing introduces, and revisit how to detect when “evil” old-school page cloaking is happening.

Continue reading »

DIY SEO

February 28th, 2007

Originally published in ClickZ

Lead Strategist with Netconcepts, PJ Fusco states that SEO can be summed up in a four-step process: “set some ground rules; get your site right; post some great content; and earn inbound links.” Master these tactics and you are well on your way to building a better site and getting found…

Continue reading »

SEO Report Card: Escaping the Google Sandbox

February 19th, 2007

Originally published in Practical Ecommerce

New sites are always at a disadvantage when it comes to ranking well in Google, particularly when the domain name is new, too. This phenomenon, known by some as the “Google Sandbox” and by others as the “TrustBox,” is not a myth. It is very real and very much an issue for the subject of this issue’s SEO Report Card – the fair-trade-supporting merchant “Two Hands Worldshop.”

Continue reading »

Canonicalization Made Simple

February 14th, 2007

Originally published in ClickZ

PJ Fusco, Lead Strategist with Netconcepts, highlights canonicalization, “the process of converting data that has more than one possible representation into a ‘standardized’ canonical representation.” Easy, right?

To put this into a clearer context, canonicalization is the process search engines use to choose the cleanest URL to display in the SERPs.
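
As a toy illustration of the idea (and not any engine’s actual algorithm), a normalizing function might collapse several representations of one page like this:

    function canonicalize(url) {
        return url
            .toLowerCase()                    // case variants
            .replace("://www.", "://")        // www vs. non-www hosts
            .replace(/\/index\.html?$/, "/")  // default-document variants
            .replace(/\/+$/, "") + "/";       // normalize the trailing slash
    }

    // Each of these reduces to "http://example.com/":
    //   canonicalize("http://www.example.com/index.html")
    //   canonicalize("http://EXAMPLE.com")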

Continue reading »

SEO: Can Wikipedia Help Your Business?

February 12th, 2007

Originally published in Practical Ecommerce

In Google, Wikipedia is everywhere. Pretty much anything you type into Google seems to result in a Wikipedia entry being returned as a top-10 result. Wikipedia’s status in the search engines as an “authority site” is undisputed. Those lucky, well-connected, skillful or famous enough to be cited enjoyed the benefits of Wikipedia’s unique “golden link effect.” Then a new policy instituted in January changed all that. As a countermeasure to thwart spammers competing in an SEO contest, all external links within Wikipedia were “nofollowed.” This effectively cut off the outward flow of “link juice” (PageRank) to websites referenced in Wikipedia…

Continue reading »

Website Critique: Putting Jegs.com in Drive

February 1st, 2007

by Stephan Spencer and David Fry

This website critique was conducted by David Fry and Stephan Spencer. David Fry focused on the site’s content and functionality while Stephan Spencer, Founder and President of Netconcepts, tested Jegs.com’s search capabilities.

Continue reading »

SERPs and the Super Bowl

January 31st, 2007

Originally published in ClickZ

Can SERPs predict the outcome of Super Bowl XLI? Lead Strategist with Netconcepts, PJ Fusco keeps score as three major search engines tell all.

Looking only at indexation, back-links, result snippets, on-page content references, and engine popularity, MSN, Yahoo and Google are put to the ultimate Nostradamus Super Bowl test. Will Yahoo and Google level the playing field or will MSN come up with a flea-flicker at the last second?

Who will emerge triumphant in this battle of the Super Bowl SERPs? Click here to see PJ’s results.

Clearing the Clutter

January 30th, 2007

Originally published in MarketingProfs

Marketers, by the very nature of their job function, must juggle numerous campaigns, a range of portfolios, multiple channels, and various corporate, political, and personnel issues, all simultaneously.

Like so many folks, I made a New Year’s resolution: to get organized, to get the clutter out of my head and into a system where I don’t have to worry about it on a daily basis, but where it will pop up when the time is right for action. And I have discovered how to do it with Getting Things Done, the best-selling book by productivity guru David Allen.

Continue reading »

Stop, Thief! How to Protect Your Site from Copyright Infringement

January 23rd, 2007

Originally published in MarketingProfs

They say that “Imitation is the sincerest form of flattery.” Not if you are a Web site owner and you have a brand to protect, however!

I’ve seen designs copied, content copied, even entire sites copied. It’s so easy for infringers to “View Source” and take whatever they like, without regard to copyright.

You can locate copyright infringers pretty easily with Copyscape if they’ve lifted some of your page copy. It’s much more difficult if they’ve limited their sticky fingers to just your design.

Continue reading »


