Forum Moderators: Robert Charlton & goodroi


Panda and Ajax-based applications serving static pages to crawlers.


Marfola

2:30 pm on Feb 1, 2012 (gmt 0)

10+ Year Member



We use Ajax throughout our website, and our Ajax apps were built prior to Google's hashbang (#!) solution.

To make our pages readable to search engines and non-JavaScript browsers, we set up a 'parallel universe': static URLs are served to search engines and non-JavaScript browsers, while JavaScript-enabled browsers get # fragment navigation. For example:

<a href="category/subcategory2/page.html" onClick="navigate('category/subcategory1/page.html#subcategory2'); return false">subcategory2</a>
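For reference, here is a rough sketch of how a navigate() handler like the one above might work. This is an assumption, not our actual code: splitTarget() is a hypothetical helper that separates the base URL from the fragment, and loadContent() stands in for whatever Ajax call actually fetches the subcategory markup. Crawlers and non-JS browsers simply follow the href to the static URL.

```javascript
// Hypothetical sketch of the navigate() handler used in the anchor above.
// splitTarget() is a pure helper; loadContent() is a placeholder for the
// real Ajax loader.
function splitTarget(url) {
  var i = url.indexOf('#');
  if (i === -1) return { base: url, fragment: '' };
  return { base: url.slice(0, i), fragment: url.slice(i + 1) };
}

function navigate(url) {
  var target = splitTarget(url);
  window.location.hash = target.fragment;     // update the URL, no page load
  loadContent(target.base, target.fragment);  // hypothetical Ajax fetch
}
```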


Until now we had considered the hashbang (#!) solution too costly to implement.

Does Panda change that?

In the 'parallel universe', Google sees separate pages and, unlike a user with a JavaScript-enabled browser, doesn't know how, or whether, the pages are related.

Google sees this:
www.example.com/category/subcategory1/page.html
www.example.com/category/subcategory2/page.html

Users with JavaScript-enabled browsers see this:
www.example.com/category/subcategory1/page.html
www.example.com/category/subcategory1/page.html#subcategory2

In a Panda world, do the following static URLs, sent to search engines, risk being interpreted as near-duplicate content?
www.example.com/category/subcategory1/page.html
www.example.com/category/subcategory2/page.html
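For comparison, under the hashbang scheme we've been avoiding, a single canonical URL would serve both audiences: Googlebot rewrites a #! URL into an _escaped_fragment_ query and requests that instead. A rough sketch of the mapping (the exact escaping rules in Google's spec are more detailed; encodeURIComponent is used here as an approximation):

```javascript
// Sketch of the crawler-side rewrite in Google's hashbang (#!) scheme:
// "page.html#!state" becomes "page.html?_escaped_fragment_=state".
function escapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url;  // no hashbang: crawled as-is
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```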

tedster

12:15 am on Feb 3, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I worked with one website that created such an "alternate universe" for Google in order to implement complex AJAX pages. It seemed to work OK in terms of getting the AJAX content into the index. However, it also seemed that the relationships between the various parts of the site were not being analyzed the way they thought they should be, and search traffic to the AJAX content seemed, well, low and not always appropriate. They moved to a hashbang solution because of that and saw improvement.

However, if you are not seeing any issues with your search traffic, then I'd see little motive in going through any redevelopment process.

Marfola

8:41 am on Feb 6, 2012 (gmt 0)

10+ Year Member



Thanks Tedster.

Our Ajax content was well and appropriately indexed until last November.

While much of our Ajax content is still well indexed, we've noticed a gradual de-indexing of some Ajax content over the last three months.

Hence my question about Google's ability to map the relationships between the parts without the hashbang.

If Google can't map the 'parallel universe', does that structure fall victim to Panda and get marked as duplicate content?

tedster

7:15 pm on Feb 6, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's a theory worth looking into more deeply, I'd say.

How does this parallel universe interact with natural backlinks, for example? Is it possible that the rest of the web doesn't really acknowledge the parallel URLs you serve to Google? That all the off-site signals are saying the parallel universe doesn't really count the way the "natural" URLs do?