I have a similar problem: our domain has been hijacked, and the attackers placed links [to adult content and pills] on our pages.
The affected domain is a sub-domain used as a community forum where people can sign up and join, running on the Community Server platform. We also link to this sub-domain from our main site's homepage.
On 20th July, we found hundreds of new pages under our sub-domain listed in the Google SERPs, with titles like...
"[product] - our brand name" or
"Order [product] online - our brand name"
Normally the title would be "Member name - our brand name", so we think some automated software performed bulk registrations using those keywords as account names.
The site had actually been inactive for a while, but the Sign-in and Join functions were still working; we had neglected to disable them because we didn't expect anything like this to happen.
Clicking the link in Google leads to a blank white page with two H1 links saying something like "Click Here For [product]" and "Buy Now", linking straight out to their site.
We checked the HTML and found that our page content is actually still there, but they overlaid a blank page with the two links on top of it.
So our sub-domain now appears to link out to hundreds of bad-neighborhood sites, and we immediately removed the link to it from our main site's home page.
We want to get rid of those listings in Google, but unfortunately we never verified this sub-domain in Google Webmaster Tools, so we can't request a domain removal there. For now we have taken the site down so it returns 404, and blocked the URLs in robots.txt, hoping Google will slowly drop the pages from its index.
Is there anything better we could do? Any advice is appreciated.
[edited by: tedster at 4:45 am (utc) on July 27, 2009]
[edit reason] remove specific spam products [/edit]
So now we took the site down showing 404 and block them in robots.txt
If the URLs are blocked in robots.txt, then they won't be spidered any more and Googlebot never sees the 404. I would go with a straight 404 to get the message through most quickly.
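That behavior can be demonstrated with Python's standard-library robots.txt parser (a sketch for illustration; `forum.example.com` is a placeholder for the spammed sub-domain):

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
# A robots.txt that blocks the whole sub-domain:
parser.parse([
    "User-agent: *",
    "Disallow: /",
])

# A compliant crawler checks robots.txt before requesting a URL.
# Because this URL is disallowed, the request is never made -- so the
# crawler never observes the 404 status the server would have returned.
print(parser.can_fetch("Googlebot", "http://forum.example.com/spam-page"))  # -> False
```

In other words, the robots.txt block stops the crawl before the server ever gets a chance to answer with 404, which is why serving the 404 without the block gets the message through faster.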
Odds are that the spammer has also linked to those URLs, so Googlebot may keep requesting them. After you see them drop out of the search results, you can add the URLs to robots.txt if you want to save the crawling cycles and bandwidth. Also, while they are backlinked by the spammer, even URLs blocked in robots.txt can show as a URL-only listing -- not what you want.
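For that later step, the robots.txt might look like this (a sketch; the assumption is that blocking the entire sub-domain is acceptable, since the forum is being taken down anyway):

```
User-agent: *
Disallow: /
```

This file would live at the root of the sub-domain (e.g. `http://forum.example.com/robots.txt` -- hostname illustrative), and again should only be added after the spam pages have already dropped out of the results.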
Also, be sure you've plugged that security problem.
One more thing: if the pages are returning 404, can I still verify the site in Webmaster Tools? My goal is to be able to request removal of the whole site from Google. Please advise.
Thanks again in advance.
All that can happen is that Google won't allow verification for an empty subdomain - and you're no worse off in that case.
What do you reckon?