| 4:41 pm on Feb 10, 2007 (gmt 0)|
This happens to us also but very infrequently. It usually occurs when we are checking out our keyword positions, especially when there are common words in the keyword phrase.
When it happens now we shut down Internet Explorer and it works again.
I suspect they have put some sort of filter to target scrapers.
| 5:23 pm on Feb 10, 2007 (gmt 0)|
I have been getting a lot of these too in the last couple of days. Usually when I skip to the last visible page of results on a site: query for example.
Earlier in the week it was giving me a captcha to proceed but now it's back to just blocking the request.
[edited by: QuantumEntanglement at 5:24 pm (utc) on Feb. 10, 2007]
| 5:41 pm on Feb 10, 2007 (gmt 0)|
I get the feeling that Google doesn't want us to check our indexed URLs anymore. I just spent over an hour going through and updating pages. I had my preferences set to 10 results per page, and when I got to page 11, I got the error message.
So I have no way to determine what is still supplemental, as I can't dig down that far. And with no captcha to input, you're out of business.
I'm logged in to my Webmaster Tools account, you'd think that would be good enough. It's not like I'm slamming Google with thousands of requests every day. In fact, I doubt if I ask for even a hundred pages from Google. That certainly doesn't seem suspicious to me.
It's apparently not cookie based, as I deleted all cookies and tried again. As I said, it took me over an hour to get to page 11, so I was hardly putting a load on Google.
I guess they don't want us to really have access to the tools they provide.
| 5:31 pm on Feb 11, 2007 (gmt 0)|
Exactly my situation, Andy.
I got blocked again just now on the 11th page of a 10-result-per-page site: inurl: query.
| 7:26 am on Feb 12, 2007 (gmt 0)|
OK, I did a few more tests, and it seems the culprit is the inurl: operator. Any combination of other terms/operators with inurl: won't let me get past the first 110 results (11 pages).
I would be very interested to see if anybody else can do an inurl:query and get beyond 11 pages.
(this is also while not being logged in to GWTools)
| 1:08 pm on Feb 12, 2007 (gmt 0)|
I'm not able to get past the first 100 results, no matter how I try. The interesting thing is normally when this happens, you can input the letters/numbers displayed at the bottom to verify you aren't a bot, and then you can go on, but that isn't even appearing now.
You're just flat dead in the water.
| 1:16 pm on Feb 12, 2007 (gmt 0)|
I get these on certain "100 results" queries.
| 3:46 pm on Feb 12, 2007 (gmt 0)|
I hope this is some kind of a bug, and not a change Google has made. I would think being logged in to your Webmaster Tools account would be enough to prevent this from happening. After all, you are viewing the links on your site, and my login info is at the top of the SERPs, so they know who it is.
I'm a bit concerned that they think I'm using an automated process to do this, as I am not. It's just me and Internet Explorer.
| 2:43 pm on Feb 18, 2007 (gmt 0)|
Sorry to revive this thread, but does anyone know what to do to look at your pages using the site: operator?
I can't get past the first 2 clicks, no matter how it's done, which prevents me from seeing which pages are supplemental. I haven't attempted this for several days, and I was hoping that whatever the problem was, it would be fixed by now.
How ridiculous to provide a link to see pages on your site in Webmaster Tools, and then NOT ALLOW YOU TO SEE THE PAGES.
This must be IP based or query based. I'm clicking on the link that returns results for site:example.com, so there's nothing questionable about that.
Any ideas how to get around this?
Edit to add: I had it set to return 10 results; I clicked on page 10 first, then page 19, and that was when I got the error message.
[edited by: AndyA at 2:44 pm (utc) on Feb. 18, 2007]
| 4:52 pm on Feb 18, 2007 (gmt 0)|
Are you saying that this would happen for site: searches no matter what IP you send the query from? That would be some news.
The scraper protection is set to IP blocks, not individual IP addresses, so perhaps someone near your address is scraping Google, and you're the collateral damage?
Back to your question though, have you tried inurl:example.com?
It produces a slightly different set, with the supplementals excluded. So if something is not on the list, it's either supplemental or not even indexed.
Although I haven't used it too often, and it may be even less reliable than the site: operator.
| 8:52 pm on Feb 18, 2007 (gmt 0)|
Try this search string [ site:example.com *** -nichts ] -- I see only supplemental results this way. Play around with the character string following the minus sign as well.
I'm not sure exactly why this query is working the way it is right now, or how long it will remain usable.
| 11:32 pm on Feb 18, 2007 (gmt 0)|
Thanks for the tip. It worked, but I got the "Error" message when I clicked to page 3 of 3 pages. This is getting very annoying. I can't even view my own site when logged into Webmaster Tools.
How does Google expect us to fix our sites so they meet their requirements when they won't even let us view the pages?
This particular site has under 1,000 pages total, so I just don't see any reason that this is an issue for Google. And I've checked for adware/spyware and found none on my computer. Also, at the rate I'm checking pages, there's no way Google could think I'm a bot.
Get your act together, Google.
| 1:00 am on Feb 19, 2007 (gmt 0)|
Did you set your Google preferences to 100 per page? That should get you 300 results.
I've been getting this spyware challenge message a lot more, too. Sometimes I do better by directly changing the start parameter in the location-bar/address-field, sometimes by clicking on the links.
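For anyone who wants to try the address-bar approach: the page links at the bottom of the results just change the start parameter in the URL, so you can edit it by hand to jump to any offset. Here's a quick sketch of how the pieces fit together (the results_url helper and its defaults are my own; I'm assuming the usual q / num / start parameters that the result pages use):

```python
# Build a Google results URL for a given page by setting the "start"
# parameter directly, instead of clicking through the page links.
from urllib.parse import urlencode

def results_url(query, page, per_page=100):
    """Return the results-page URL for a 1-indexed page number."""
    params = {
        "q": query,                      # the search, e.g. a site: query
        "num": per_page,                 # results per page (10 or 100)
        "start": (page - 1) * per_page,  # offset of the first result shown
    }
    return "https://www.google.com/search?" + urlencode(params)

# Page 3 at 100 per page starts at result 200:
print(results_url("site:example.com", 3))
```

So with preferences at 100 per page, editing start=200 in the address field takes you straight to page 3 without clicking through pages 1 and 2, which is sometimes enough to dodge the block.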
| 2:54 am on Feb 19, 2007 (gmt 0)|
Yes, I had preferences set to no filtering for adult content and 100 per page. Perhaps Vanessa Fox could look into this. It would seem if you're logged in to your Webmaster Tools account, there shouldn't be any problem. Perhaps there's a way to circumvent the spyware challenge?
It doesn't seem to matter if I'm logged in to Tools or not.
F - R - U - S - T - R - A - T - I - N - G ~!
| 12:27 pm on Feb 19, 2007 (gmt 0)|
If you're using a ranking or link checker program that doesn't go through the Google API, you will see this message. When you send hundreds of queries to Google and search for your -950 pages, Google will flag your IP. Ranking and link checker programs look very similar to other attacks that attempt to slow a server down.
Right from the Google guidelines:
"Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google"
| 7:42 pm on Feb 19, 2007 (gmt 0)|
Unless Internet Explorer 7 is considered to be an automated program, I'm not using anything bad. I don't have any of those automated rank checkers, I just use IE7 and check using the site: link in Webmaster Tools.
This site has less than 1,000 pages, and I check it once a day when I have time to make changes to supplemental pages, sometimes twice a day, but that's it. At the most, with preferences set to 100, I would look at 7 or 8 pages. That doesn't seem to be overdoing it in my book.
[edited by: AndyA at 7:43 pm (utc) on Feb. 19, 2007]
| 8:26 pm on Feb 19, 2007 (gmt 0)|
Are you surfing from the same IP that your server is on?
| 8:40 pm on Feb 19, 2007 (gmt 0)|
trinorthlighting, I've been getting the same behavior lately - and definitely not browsing on the same IP as any domain.
I do a lot of quick checks by hand for all kinds of reasons during almost any day, and last week I got the spyware message a lot of the time. Not over the weekend or today, however. Perhaps there was a technical error that they've now fixed.
| 9:09 pm on Feb 19, 2007 (gmt 0)|
No, I'm not on the same IP as my site. My site has a static IP, and I usually check Tools from my home computer, which also has a static IP.
I've noticed that in regular surfing, just for personal use, I can click to higher SERP numbers and don't have a problem. So, it's definitely the use of the site: operator that is triggering it.
As I said, I would think the fact that I'm logged in to my Tools account would somehow supersede this error message, as I've verified who I am.
| 9:58 pm on Feb 19, 2007 (gmt 0)|
It would be very easy to pass spyware or some sort of hacking tool from your personal PC to your server when you log in. If someone used a keylogger to get your passwords, they could easily have gotten into your servers as well. You guys might want to thoroughly go through your PCs and servers and look for just that.
This could be a reason why Google has pushed some sites down in the SERPs: they host some unknown spyware or traffic-generating script/program.
Recently I read about a hacked server that was pushed down in the SERPs due to email spam. Basically the server was hacked, and the spammer was sending out spam emails from it.
| 12:14 am on Feb 20, 2007 (gmt 0)|
I get the same message when doing checks for our new landing pages (sometimes unranked). In the Firefox browser, clearing your cache and cookies fixes the problem, and I am able to resume checking.
| 3:23 am on Feb 20, 2007 (gmt 0)|
I've got the same thing a few times. It's always been on the last page of the results for me. Maybe just a coincidence. What page does everyone else get it on?