Forum Moderators: Robert Charlton & goodroi
My question is: if you are banned by the Google translation engine, will Google Search also ban your site?
Thanks for reading my post.
I wouldn't have thought that a ban from Google Translator would affect your established ranking in organic Google Search. Certainly many sites use the plug-in without seeing that effect. However, there may be some tie-in here, since your server's IP address would appear in the referrer at Google Translator.
What's more likely to cause a problem, IMO, is the creation of 13X more urls for your site, all at one time. I would hesitate to let googlebot crawl all those auto-translations, and especially all of them at once. The profile for the site would look very much like that of autogenerated spam pages.
However, the timing of this makes me wonder. Seeing SERP trouble just one day after the installation seems too quick to me, and there may be another issue with your website that's complicating things.
1 day seems a little quick for a penalty, but if there is garbage html and/or garbage links, it could be possible.
It appears that Global Translator is automating this process (a guess on my part), and it's therefore likely that Google would balk at the use of Global Translator, or of any tool that in effect uses Google Translate as a content generator. Google objects to automated search queries, e.g., in part because of the extra load they place on Google's servers.
[edited by: Robert_Charlton at 11:31 pm (utc) on Jan. 24, 2009]
The combination of optimized permalinks and caches, though, certainly suggests spiderable "pages." All together, this suggests to me that if there were enough activity to get banned by the translation engines (by creating a set of translated pages, e.g.), there would then also be enough additional urls to create problems as tedster describes....
What's more likely to cause a problem, IMO, is the creation of 13X more urls for your site, all at one time.
If used only on demand, the plug-in might not cause any problems.
What's more likely to cause a problem, IMO, is the creation of 13X more urls for your site, all at one time. I would hesitate to let googlebot crawl all those auto-translations, and especially all of them at once. The profile for the site would be very much like autogenerated spam pages.

I think that is the culprit. Based on my observation, when I enabled it again and integrated it into my sitemap, the same thing happened. I had to remove those auto-generated pages from my sitemap. Now the problem is how to gradually add those pages to my sitemap, since the sitemap plugin in WordPress doesn't have that feature. When you hit update, all of your pages are included in the sitemap.
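Since the WordPress sitemap plugin regenerates everything at once, one workaround is to generate a supplementary sitemap by hand and only add a slice of the translated urls at a time. The sketch below is purely illustrative — the domain, URL lists, and release strategy are invented, not taken from the poster's site:

```python
# Hypothetical sketch: build a sitemap containing only a chosen subset of
# auto-translated URLs, so new pages can be exposed to crawlers gradually.
# Domain and URL lists here are invented for illustration.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML (as a string) listing only the given URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

english_pages = [
    "http://example.com/post-1/",
    "http://example.com/post-2/",
]
translated_pages = [
    "http://example.com/fr/post-1/",
    "http://example.com/fr/post-2/",
]

# Release only the first translated page for now; add more in later batches.
xml = build_sitemap(english_pages + translated_pages[:1])
print(xml)
```

The idea is simply to keep the batch of new auto-generated urls small on each update, rather than dumping 13X the site's page count into the sitemap in one go.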
I am working with 8 languages, but have restricted access with robots.txt to the domain.co.uk/lang/ folder. Thus far I have only enabled one extra language - French - despite the fact that I have all 8 live on the site.
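For anyone following along, a robots.txt rule along the lines this poster describes would look something like the fragment below (the exact folder name is assumed from the post; adjust to match your own permalink structure):

```
# Block all crawlers from the auto-translated folder
User-agent: *
Disallow: /lang/
```

Note this only stops compliant crawlers from fetching the pages; the urls themselves still exist and can be linked to.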
I would obviously like to have the site translated, as a lot of my potential market for this blog is international; however, I will not do that at the sacrifice of PR.
I will let you know my progress.