I have Googled this to no avail. Maybe you good folk can help me...
I run a series of 25 location-specific websites, each with its own members. I have also coded a basic forum which is shared across all of the sites: if someone posts in the forum on one site, that post appears on the forum on all 25 sites.
Of course, I know about Google and duplicate content. I imagine it won't be good for me to have large amounts of identical content appearing on 25 different websites. Google will think I am scraping content, or that none of it is relevant or good quality.
But as you know, forum posts are content-rich, so blocking the forum from Google altogether would be a real waste.
Therefore I would like all of the forum's content to be available to Google somewhere, but only once, and without favouring any one of the 25 sites over the others.
So... my proposed solution is to hide from Googlebot, on WebsiteLondon, the posts made by members of WebsiteManchester, and vice versa. That way every post gets found by Googlebot at some point, but only once (leading searchers to either WebsiteManchester or WebsiteLondon in the results), while human visitors to the forum on any of the sites still see all of the content.
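To make the idea concrete, here is a rough sketch of the filtering I have in mind. This is illustrative pseudocode, not my actual forum code: the `Post` fields, the site slugs, and the naive user-agent check are all just stand-ins.

```python
# Sketch of the per-site filtering idea. All names here are hypothetical
# stand-ins for whatever the real forum code uses.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    home_site: str  # the site the author registered on, e.g. "london"
    body: str

def visible_posts(posts, current_site, user_agent):
    """Return the posts to render for this request.

    Regular visitors see every post; a crawler only sees posts whose
    author belongs to the site being crawled, so each post would be
    crawlable on exactly one of the 25 sites.
    """
    is_crawler = "Googlebot" in user_agent  # naive check, for illustration
    if not is_crawler:
        return posts
    return [p for p in posts if p.home_site == current_site]

posts = [
    Post("alice", "london", "Hello from London"),
    Post("bob", "manchester", "Hello from Manchester"),
]

# A human browsing the London site sees both posts...
print(len(visible_posts(posts, "london", "Mozilla/5.0")))    # 2
# ...but Googlebot crawling the London site sees only Alice's.
print(len(visible_posts(posts, "london", "Googlebot/2.1")))  # 1
```

My worry is that serving different content to Googlebot than to visitors, as above, looks a lot like cloaking, which is why I'm asking whether there is a safe way to do this.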
Is there a way to block specific areas of content from Googlebot without getting blacklisted? Something like the way you can block individual links with nofollow.
Thanks in advance, fellow web chums.