Looking for some expert opinions on the following:
I recently updated my website for my services-based business and have been actively marketing it with AdWords, but I've been having difficulty with the Ad Quality score (not my question here). After checking Google Webmaster Tools for the domain, I realized Google had indexed a subdirectory on my domain that contained a dev version of a client's website. As a result, that client's keywords were completely dominating Google's view of my domain's new website.
I've since updated the robots.txt file to exclude this directory (which GWT shows, and the robots.txt tester confirms).
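For reference, the exclusion I added is along these lines (the directory name below is a placeholder, not the actual path):

```
User-agent: *
Disallow: /client-dev/
```

The robots.txt tester in GWT confirms that URLs under that directory are now blocked from crawling.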
The source of the problem was a few links that the client didn't update when they published their website on their own domain, so those links still pointed back to my domain and that dev directory. The client has since removed them. They no longer appear in Google's cache, but a "site:" search of the client's domain for my domain still turns up those pages in the results (even though neither the cache nor the live site shows those links anymore).
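For clarity, the search I'm running has this shape (the domain names here are placeholders):

```
site:clientdomain.com mydomain.com
```

That is, restricting results to the client's domain and searching for mentions of my domain, which still returns those stale pages.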
So, my problem, I think, is that Google is still associating the keywords from that subdirectory/client website with my domain's website, which I believe is significantly skewing my AdWords quality valuation.
GWT now shows crawl errors for pages under that directory, and the new sitemap has been fetched, but the keywords list still shows the bad keywords.
Any idea how long it will take for Google to update its "analysis" of my website to exclude those keywords? Is there something else I should be doing to get Google to disassociate that subdirectory from my domain's website?
All help greatly appreciated. Thanks!