Forum Moderators: Robert Charlton & goodroi
So I started trying to find anything wrong, and I found that when a user searches my site with a search program I installed, a search results page is created. Many of these results pages were being indexed by Google, so I used robots.txt to block them and removed them all with the URL removal tool. At first I thought the only issue was the duplicate blocks of content being created...
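In case it helps, the kind of robots.txt rule I mean would look something like this (the /search/ path is just a made-up example - the real path depends on how the script builds its result URLs):

```
# hypothetical example - block crawling of the script's results pages
User-agent: *
Disallow: /search/
```

One thing I learned along the way: robots.txt only stops future crawling; it doesn't drop URLs that are already in the index, which is why I also had to use the URL removal tool.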
Now, a few days later, I'm realizing that I never fully understood how my search script works. Apparently if you enter a URL, it automatically creates a link to that URL on the results page, and I can see in my usage statistics that dozens of URLs to <bad neighborhoods> have been entered - but I can't check whether those results pages were indexed, because I already removed them.
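If I end up keeping the script, I'm guessing the right fix would be to have it mark those auto-generated links as nofollow, something like this (the URL here is just a placeholder, not a real entry from my logs):

```html
<!-- hypothetical auto-generated link on the results page, with nofollow added -->
<a href="http://example.com/some-entered-url" rel="nofollow">http://example.com/some-entered-url</a>
```

That way, even if someone enters a link to a bad neighborhood, the page wouldn't pass anything to it.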
So, is it likely that this is my problem?
[edited by: tedster at 12:36 am (utc) on June 17, 2008]
Maybe it takes more than a few days to see any changes, or maybe there is some other problem...
Should I completely remove the search program that added the unwanted links and then file a reconsideration request?