Both domains point to the same website. Depending on the domain name in the URL, the script decides which prices to show:
companyname.com/page.asp (show US prices)
companyname.ca/page.asp (show CAN prices)
Everything is exactly the same except the prices, and in some product categories certain products are only available in Canada, so they are not shown at all on the .com site (maybe 10% of products).
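For reference, the host check is roughly along these lines (a simplified sketch, not the actual code, and assuming a JScript-style classic ASP page):

<%@ Language="JScript" %>
<%
// Simplified sketch of the host-based price switch described above (names illustrative).
var host = (Request.ServerVariables("HTTP_HOST") + "").toLowerCase();  // e.g. "companyname.ca"
var showCanadianPrices = (host.indexOf("companyname.ca") != -1);

if (showCanadianPrices) {
    Response.Write("CAN prices");   // placeholder for the Canadian price markup
} else {
    Response.Write("US prices");    // placeholder for the US price markup
}
%>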
Does Google recognize that these are country-specific pages and NOT label them as spam?
In a related situation, I was reading that Google may now be indexing JavaScript links. The site in question has a JavaScript menu system with links such as:
[page.asp?prodid=x] in the JavaScript,
and also a semi-static page named for the product:
[product-x.asp]
I did this because previously the JavaScript menu would not allow robots to find the pages. If G is now indexing those JavaScript links, will we be penalized for having 1,200 duplicate pages (one static, one dynamic for each page of the site)? Compound that by the .com/.ca split and we would have four duplicate pages for every page on our site.
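Roughly, the menu works something like this (simplified, names made up), with the URL sitting inside the script, which is what Google may now be picking up:

// Sketch of the menu structure; the dynamic URL is embedded in the JavaScript.
var productMenu = [
    { label: "Product X", url: "page.asp?prodid=x" }   // dynamic URL inside the script
];

function goTo(url) {
    window.location.href = url;   // the menu navigates via script instead of plain <a href> links
}

// The same product is also reachable directly at product-x.asp (the semi-static copy),
// so a spider that reads both ends up with two URLs for one page.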
Clearly, to the human eye this is not spamming. Are the robots that discerning?
Thanks for any advice.
I recently began to experiment with a scenario similar to the one you described: several domains all containing the same website, plus subdomains with the same HTML apart from modified page content. We have yet to determine whether changing the sites and subdomains to referral pages (“site has moved, click here”) and eliminating the “duplicate” content has resolved our problem. We have been very low in the SERPs since Florida: 800 to 80 to 12 and now 60-80 (with daily fluctuations of +/- 10). If we are indeed being penalized because of seemingly duplicate content, whether on different domains or subdomains, I will not be happy.
In my case some of our domains are .ca and some .com. We rectified our “problems” within the last month as a last resort. Since Florida we have tried everything to re-establish rankings, and hopefully these duplicate-content filters are the reason for the low rankings.
It is completely ridiculous that G could potentially be penalizing sites that make use of multiple domains or subdomains while maintaining the same website layout. Penalties should be reserved for obviously spammy sites; there should be no grey area where a margin of error is deemed acceptable.
Have you noticed any slippage in your client’s SERPs since the talk of JavaScript spidering?
We have had the .com/.ca configuration for many years, and I was still able to get almost 100% of the pages indexed; for dozens of pages we enjoyed very good rankings (1-4) for generic product terms. So, come to think of it, the .com/.ca setup is probably fine (although maybe we would have hundreds of pages with high rank without it :)
Those positions held for about three months; then, about a month ago, poof. The JavaScript indexing is pretty recent, so.....