The error reads: "We're sorry... but your query looks similar to automated requests from a computer virus or spyware application. To protect our users, we can't process your request right now. We'll restore your access as quickly as possible, so try again soon. In the meantime, if you suspect that your computer or network has been infected, you might want to run a virus checker or spyware remover to make sure that your systems are free of viruses and other spurious software. We apologize for the inconvenience, and hope we'll see you again on Google."
I don't have spyware on my computer; I've always had anti-virus and anti-spyware software installed, and both are updated at least daily.
With my preferences set to 10 results per page, I can view the first 10 pages and get the error message on page 11; with preferences set to 100 results per page, I can view only one page and the error pops up on page 2.
There's no verification box to fill in to prove I'm not automated, and I've been getting this for the last two days.
Is it just me or is something at Google broken?
When it happens now, we shut down Internet Explorer and it works again.
I suspect they've put in some sort of filter to target scrapers.
Ken
Earlier in the week it was giving me a captcha to proceed, but now it's back to just blocking the request.
So I have no way to determine what is still supplemental, as I can't dig down that far. And with no captcha to input, you're out of business.
I'm logged in to my Webmaster Tools account; you'd think that would be good enough. It's not like I'm slamming Google with thousands of requests every day. In fact, I doubt I request even a hundred pages from Google. That certainly doesn't seem suspicious to me.
It's apparently not cookie-based, as I deleted all cookies and tried again. As I said, it took me over an hour to get to page 11, so I was hardly putting a load on Google.
I guess they don't want us to really have access to the tools they provide.
I would be very interested to see if anybody else can do an inurl: query and get beyond 11 pages.
(This is also while not logged in to GWTools.)
You're just flat dead in the water.
I'm a bit concerned that they think I'm using an automated process to do this, as I am not. It's just me and Internet Explorer.
I can't get past the first two clicks no matter how I do it, which prevents me from seeing which pages are supplemental. I hadn't attempted this for several days, and I was hoping that whatever the problem was, it would be fixed by now.
How ridiculous to provide a link to see pages on your site in Webmaster Tools, and then NOT ALLOW YOU TO SEE THE PAGES.
This must be IP-based or query-based. I'm clicking on the link that returns results for site:example.com, so there's nothing questionable about that.
Any ideas how to get around this?
Edit to add: I had it set to return 10 results per page. I clicked on page 10 first, then page 19, and that was when I got the error message.
The scraper protection applies to IP blocks, not individual IP addresses, so perhaps someone on an address near yours is scraping Google and you're the collateral damage?
Back to your question though, have you tried inurl:example.com?
It produces a slightly different set, with the supplementals excluded. So if something is not on the list, it's either supplemental or not even indexed.
That said, I haven't used it much, and it may be even less reliable than the site: operator.
Thanks for the tip. It worked, but I got the "Error" message when I clicked through to page 3 of 3. This is getting very annoying. I can't even view my own site's results when logged in to Webmaster Tools.
How does Google expect us to fix our sites so they meet their requirements when they won't even let us view the pages?
This particular site has under 1,000 pages total, so I just don't see why this is an issue for Google. I've checked for adware/spyware, and there's none on my computer. Also, at the rate I'm checking pages, there's no way Google could think I'm a bot.
Get your act together, Google.
It doesn't seem to matter if I'm logged in to Tools or not.
F - R - U - S - T - R - A - T - I - N - G ~!
Right from the Google guidelines:
"Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google"
[google.com...]
Unless Internet Explorer 7 counts as an automated program, I'm not using anything bad. I don't have any of those automated rank checkers; I just use IE7 and check using the site: link in Webmaster Tools.
This site has fewer than 1,000 pages, and I check it once a day when I have time to make changes to supplemental pages, sometimes twice a day, but that's it. At most, with preferences set to 100 results, I'd look at 7 or 8 pages. That doesn't seem like overdoing it in my book.
I do a lot of quick checks by hand for all kinds of reasons on almost any given day, and last week I got the spyware message much of the time. Not over the weekend or today, however. Perhaps there was a technical error that they've now fixed.
No, I'm not on the same IP as my site. My site has a static IP, and I usually check Tools from my home computer, which also has a static IP.
I've noticed that in regular surfing, just for personal use, I can click to higher SERP numbers and don't have a problem. So, it's definitely the use of the site: operator that is triggering it.
As I said, I would think the fact that I'm logged in to my Webmaster Tools account would somehow supersede this error message, since I've verified who I am.
This could be a reason why Google has pushed some sites down in the SERPs: the site is running some unknown spyware or traffic-generating script/program.
Recently I read about a hacked server that was pushed down in the SERPs due to email spam! Basically, the server was hacked and a spammer was sending out spam emails from it.