DirigoDev - 6:09 pm on Nov 21, 2012 (gmt 0)
I have a site whose robots.txt disallows everything. The site is a portal used by our customers, and the URL is linked from 3rd-party websites. The URL has a listing in Google's SERPs in the following structure:
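(For context, the robots.txt in question is the standard block-everything form, something like:)

```
User-agent: *
Disallow: /
```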
Close this block - KW1 KW2
KW1 and KW2 are relevant and factual, and the domain is correct. But where does "Close this block -" come from? It is not in my code anywhere. My page doesn't show its real title because it was set to noindex. Could this be the issue?
I know that Google won't crawl or index the content of pages blocked by robots.txt, but that they still index the URLs if they find them on other pages on the web.
Anyone know how to fix this?
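(For what it's worth, I gather the usual remedy is the opposite of what I'm doing: let Googlebot crawl the page, and serve a noindex instead of blocking it in robots.txt, so Google can actually see the directive. If I'm on Apache with mod_headers, I believe it would look something like this, though I'm not certain this is the right direction:)

```
# Remove the Disallow rule from robots.txt first, then
# send a noindex header for the portal paths so Google
# drops the URL from the index once it recrawls.
<Location "/portal">
    Header set X-Robots-Tag "noindex, nofollow"
</Location>
```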