From Good to Great Content

March 14th, 2007

Originally published in ClickZ

“In SEO, what’s the difference between good content and great content?” asks P J Fusco, lead strategist with Netconcepts. Aren’t both a win-win? Wrong. Good content makes your site visible to search engines, but the inspiration ends there. Search engines love good content, but users love great content: it engages the visitor to read further, subscribe, purchase, and more. Creating great content does not come easily.

However, here are a few tips and tricks to put you on the right path to increased visitor conversion…

Options for Optimizing AJAX

March 2nd, 2007

AJAX-driven web applications are becoming increasingly popular on commercial websites. Used properly, AJAX can enrich, yet simplify, a user’s experience, providing a highly user-friendly interface that works smoothly, quickly, and often better than traditional programming.

AJAX is short for Asynchronous JavaScript and XML (Extensible Markup Language). Make no mistake about it — JavaScript and XML are not “new” technologies. Both programming models have been around for some time. However, the unique combination of JavaScript and XML is relatively recent, as are the problems AJAX presents for a site’s search engine visibility.

The primary benefit of developing a site with AJAX is the ability to work invisibly in the background of a site. AJAX is used to supply data to the client browser that renders up as a relatively seamless “application” instead of the click-and-wait-to-load functionality associated with more conventional web page constructs.
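
The background-request pattern described above can be sketched in a few lines of JavaScript. This is a minimal illustration only; the `/search` endpoint and the `results` element ID are hypothetical, not part of any site mentioned in this article:

```javascript
// Build a query string for the background request (pure helper).
function buildQuery(params) {
  var pairs = [];
  for (var key in params) {
    if (params.hasOwnProperty(key)) {
      pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
    }
  }
  return pairs.join('&');
}

// Fire the asynchronous request and swap the result into the page.
// Because this happens in the background, the visitor never sees a
// full page reload -- the "seamless application" effect.
function loadResults(term) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/search?' + buildQuery({ q: term }), true); // true = asynchronous
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Replace part of the page in place instead of navigating to a new URL.
      document.getElementById('results').innerHTML = xhr.responseText;
    }
  };
  xhr.send(null);
}
```

Note that nothing in this flow produces a new URL: the address bar never changes, which is exactly the crawlability problem discussed below.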

How seamless is the user experience with AJAX? Check out Google Maps or Google Suggest to see world-class AJAX applications in motion. You can find what you want, when you want it, with relative ease and accuracy when AJAX is in use. What you can’t find is a unique URL or navigational links for search engine spiders to crawl and index, which brings us to our first SEO barrier to overcome — the “J” in AJAX.

JavaScript has been a stumbling block for search engine visibility for quite some time, and none of the major search engines show any indication of overcoming these scripted-data issues anytime soon. Consequently, the single greatest optimization issue with AJAX is its tendency not to generate unique, bookmarkable, linkable, and therefore indexable URLs.

The comparative shopping engine Become.com overcomes this barrier by creating and linking together static URLs of search results pages. A quick [site:www.become.com] search in Google reveals how well this AJAX workaround is indexed.

Meanwhile, sites like Scion.com fail to make the same programmatic leap to provide a similar search experience. Imagine how the carmaker could promote its celebrity-built custom automobiles in the search engines if only static pages of a punked-out Ashton Kutcher or a blinged-out Usher-mobile were rendered and linked to throughout the site.

While AJAX can be a great way to enhance the user experience, visitors using browsers without JavaScript enabled won’t have the same on-site experience. When it comes to site accessibility and SEO, it’s imperative that an AJAX-alternate experience be provided.

Because AJAX relies on JavaScript, as well as Cascading Style Sheets (CSS) and XML, it’s relatively easy to provide an alternate experience for non-JavaScript users. The key is to tap into your CSS and XML files to render other versions of the AJAX application. This tactic is known as “progressive enhancement.”

Progressive enhancement is a web design strategy that emphasizes accessibility, semantic markup, and external style sheet and scripting technologies. By layering designs progressively, it allows all users, including search engine spiders, to access the basic content and functionality of any web page.

When implementing progressive enhancement, a basic markup document is created, geared toward the lowest common denominator of browser functionality. The web designer then adds enhancements to the presentation and behavior of the page using CSS, JavaScript, or combinations of Flash or Java applets. In tandem with user-agent detection, progressive enhancement will automatically render both user- and search-engine-friendly pages.
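
A minimal sketch of this layering, assuming a hypothetical "next page" link (the element ID and URL are illustrative only): the baseline markup is an ordinary crawlable link that works for every user agent, and script, when available, upgrades it to an in-page AJAX load.

```javascript
// Baseline markup (works with no JavaScript at all, and is spiderable):
//   <a id="more-link" href="/products?page=2">Next page</a>

// Enhancement layer: if script runs, intercept the click and load the
// target in place instead of navigating. Returns true if enhancement
// was applied, false otherwise.
function enhanceLink(link, loadInPage) {
  if (!link) return false; // element missing -- nothing to enhance
  link.onclick = function () {
    loadInPage(link.href);  // fetch and render the content in-page
    return false;           // cancel the normal navigation
  };
  return true;
}

// Only runs in a browser. Spiders and script-less browsers simply
// follow the ordinary href and still reach the same content.
if (typeof document !== 'undefined') {
  enhanceLink(document.getElementById('more-link'), function (url) {
    // ...background fetch of `url` via XMLHttpRequest, then update the page...
  });
}
```

The design point is that the enhancement is additive: removing the script layer leaves a fully functional, indexable page rather than a broken one.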

You can observe progressive enhancement in motion by visiting Amazon’s Create Your Own Ring page. Simply turn off your JavaScript capabilities to see how the program maintains its AJAX-like functionality for all users. Also note that the initial load of the AJAX application contains the optimized elements such as title attributes, header tags and meta description, as well as a crawlable static URL. All of this is visible in Google cache and revealed in the page’s search engine snippet:

Amazon.com: Create Your Own Ring: Diamond Search
The Amazon.com Collection. Why Buy Jewelry & Watches at Amazon?
… More to Explore. Preset Engagement Rings … Create Your Own Ring …
www.amazon.com/gp/cyo/cyor-fork.html

To produce these particular SEO elements, server-side scripts and Apache’s rewrite module (typically configured via an .htaccess file) are required. (If the site is not Apache-based, the rewrite module may not be an option, but there are always solutions.)
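
As a hedged sketch of the kind of rewrite involved (assuming an Apache server with mod_rewrite enabled; the paths and script name here are hypothetical, not Amazon’s actual setup):

```apache
# Map a static-looking, crawlable URL to the server-side script that
# actually renders the AJAX application's content.
RewriteEngine On

# e.g. /rings/solitaire.html  ->  /cyo/render.php?style=solitaire
RewriteRule ^rings/([a-z-]+)\.html$ /cyo/render.php?style=$1 [L,QSA]
```

The visitor and the spider both see the clean static URL, while the application logic stays in one place behind it.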

When optimizing AJAX, it’s important to remember that search engine results are affected by three things: on-the-page, behind-the-page, and off-the-page factors. It’s essential to provide an alternate way for users and spiders to navigate through to all of your great content without sacrificing usability, accessibility, or linkability.

Good Cloaking, Evil Cloaking & Detection

March 1st, 2007

Originally published in Search Engine Land

Is cloaking evil? It’s one of the most heavily debated topics in the SEO industry, and people often can’t even agree on what defines cloaking. In this column, I look at an example of what even the search engines might consider “good” cloaking, the middle-ground territory that page testing introduces, and how to detect when “evil” old-school page cloaking is happening.

Continue reading »

DIY SEO

February 28th, 2007

Originally published in ClickZ

PJ Fusco, lead strategist with Netconcepts, states that SEO can be summed up in a four-step process: “set some ground rules; get your site right; post some great content; and earn inbound links.” Master these tactics and you are well on your way to building a better site and getting found…

Continue reading »

INNOVATION GOLD: GravityStream

February 20th, 2007

Originally published in The New Zealand Marketing Association

The New Zealand Marketing Association announces that Netconcepts’ patent-pending GravityStream technology, which optimizes the “long tail” of product-related natural search traffic and sales for online retailers, won Gold in the Innovation category.

Read the entry submission and GravityStream product overview published by the NZ Marketing Association and written by Netconcepts’ very own Chief Executive Officer, Nigel Varcoe.

SEO Report Card: Escaping the Google Sandbox

February 19th, 2007

Originally published in Practical Ecommerce

New sites are always at a disadvantage when it comes to ranking well in Google, particularly when the domain name is new, too. This phenomenon, known by some as the “Google Sandbox” and by others as the “TrustBox,” is not a myth. It is very real and very much an issue for the subject of this issue’s SEO Report Card – the fair trade supporting merchant “Two Hands Worldshop.”

Continue reading »

Interview with Wikipedia editor Jonathan Hochman

February 15th, 2007

Get an inside look into this valuable online encyclopedia in Stephan Spencer’s interview with SEO specialist Jonathan Hochman. They cover topics like: building knowledge about your brand, delivering traffic to your website, avoiding “linkspam” and things you should definitely not do to anger the Wiki editors.

Canonicalization Made Simple

February 14th, 2007

Originally published in ClickZ

P J Fusco, lead strategist with Netconcepts, highlights canonicalization, “the process of converting data that has more than one possible representation into a ‘standardized’ canonical representation.” Easy, right?

To put this into clearer context, canonicalization is the process search engines use to choose the cleanest URL to display in the SERPs.

Continue reading »

Screencast on link building with Stephan Spencer and Eric Ward

February 13th, 2007

Join our founder and president, Stephan Spencer, along with renowned link builder Eric Ward, in this archived webinar: an information-packed 90 minutes of link-building tips and tricks. The webinar, for MarketingProfs.com, was called “Inside Secrets to Building Links for Online Publicity, Buzz and Search Engine Optimization”. It was a follow-on to Stephan’s MarketingProfs webinar six months prior on the topic of boosting Google rankings through links (also available as a 90-minute screencast).

Watch Stephan and Eric’s webinar as a streaming Flash video »

Alternatively, download or watch it as a QuickTime (m4v) movie (169 MB) or as a Windows Media (wmv) file (59 MB).

Organic Search Interview with Elastic Path

February 13th, 2007

In this engaging interview with Dave Olson, marketing coordinator for Elastic Path Vancouver, Netconcepts’ VP of Search, Brian Klais, discusses his first-hand knowledge of organic search.

In this podcast, Brian focuses on the knowledge and ongoing expertise needed to succeed in the organic search market. SEO best practices can be found anywhere (just search Google). But how do you scale optimization across your entire website? Which techniques will light the most search engine bulbs? How can your keywords open the door to the long tail of natural search? Listen as Klais answers these questions and shares his expertise on the latest techniques used to get found through organic search.

Interview conducted by Dave Olson, Marketing Coordinator for Elastic Path (Elastic Path podcast 24) on Tuesday, February 13, 2007.