
Google is Now Asking for a Code if You Use a Rank Checker

   
6:10 am on Mar 5, 2005 (gmt 0)

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I just ran a rank checker and got a Google page telling me I have a virus and that I can't search until I type in a code shown in a picture on the page. I've run it last night, this morning, and many other times for over a year and never got that. Is this new?
11:37 pm on Mar 8, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Just to echo what lftl said--you get 1000 searches a day, and you can use those searches to dig way down past the top 10 if you want. It's just that each search can only fetch 10 results.
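The arithmetic behind that limit is worth making concrete. A rough Python sketch (the helper names are illustrative, not part of the actual SOAP API):

```python
# Sketch of the paging arithmetic GoogleGuy describes: each API call
# returns at most 10 results, so reaching position N takes N/10 calls,
# and the 1000-calls/day quota bounds how many keywords you can check
# and how deep you can look.

def page_offsets(depth):
    """Start offsets needed to fetch the top `depth` results, 10 at a time."""
    return list(range(0, depth, 10))

def queries_per_day(keywords, depth):
    """API calls consumed checking each keyword down to `depth` results."""
    return len(keywords) * len(page_offsets(depth))

# Checking 20 keywords down to position 100 costs 200 of the 1000 daily calls.
cost = queries_per_day(["widget wx"] * 20, 100)
```

So digging past the top 10 is possible, but depth trades off directly against the number of terms you can track in a day.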
12:43 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, querying webservers programmatically should be illegal, eh Google?
1:05 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



blaze, the collective weight of all those queries adds up; that's one of the main reasons we ask people not to do it. But the WebAPI is a great way to do 1000 queries/day of whatever you're interested in for your own personal purposes.
1:13 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Well blaze, I do see where you want to go with that, and I do agree that there should be a new standard - but until then, all we have is robots.txt. Unfortunately, Google has devalued it by using nonstandard syntax.

[google.com...]

1:31 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



... the collective weight of all those queries adds up ...

One might view that as the price to be paid for monetizing our content, on the way to becoming a $50 billion company.

OTOH, I'm sure that those wm queries do put a strain on things. Personally, I'm very respectful of G's servers. It's a citizen of the 'net thing. ;-)

Think it's time to go back into the cave for a while. :)

3:49 am on Mar 9, 2005 (gmt 0)

10+ Year Member



I have had the opinion for some time now that this Google API feature is being used heavily by site scrapers. The outright scrapers get banned on my servers right off, yet I see a lot of my text in the pseudo-scraped AdWords sites. There are a few other engines with similar features that are mostly being abused for purposes other than what was intended.
10:00 am on Mar 9, 2005 (gmt 0)

10+ Year Member



this Google API feature is being used heavily by site scrapers

My guess is that most site scrapers don't bother with it because it's too slow and clumsy. They probably use screen scraping because it is (was?) faster and easier.
11:00 am on Mar 9, 2005 (gmt 0)

10+ Year Member



It seems the API results often don't match the "real" Google results. Is this a datacenter thing? Do the API results catch up with what a user would see?
11:12 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I second that. Querying via the API and querying manually yields differing results in almost every case. Usually the API results are off by 1-2 positions (on the negative side).
12:55 pm on Mar 9, 2005 (gmt 0)

10+ Year Member



I saw that message the other day and I have never used any Rank checking or any other type of automated script of any kind whatsoever.

I immediately assumed that I had a virus or spyware; however, after extensive scanning, I have found nothing.

1:49 pm on Mar 9, 2005 (gmt 0)

10+ Year Member



The only tools I use to check my sites' rankings, apart from the odd query here and there, are the stats of my affiliate programs.
2:19 pm on Mar 9, 2005 (gmt 0)



then how do we get the data? Sites are often dependent on Google rankings for profitability, but we should just cross our fingers and hope we do well rather than seeing where we do well so we can make informed business decisions?

Relying on Google's results for profitability probably isn't an informed business decision. True, it comes in very handy, but only at the whim of the algorithm, which will change over time whether you like it or not.

But - given that we all take the Google gravy train from time to time - tracking Google traffic (or better, sales that convert from Google searches), rather than Google rankings, is a better plan.

Dixon.

3:31 pm on Mar 9, 2005 (gmt 0)

10+ Year Member



I've counted Google as a great tool since I read this cool research paper in the summer of 1998, and I don't see it getting less interesting anytime soon. As of this morning, Google accounted for 63.77% of leads to our site. And we shouldn't pay close attention? Like Receptional, I could tell you the conversion rate from those hits to sales. Or for any other engine, or partner, or a shopping site. It's not a lot of data if you really know what you're doing with SQL; I have a pile of interesting reports ready for me in the mornings.

And no, 10,000 results (10 results x 1000 queries) is not enough. It's not even in the right order of magnitude. There are lots of ways people search: "widget wx", "widget wx100", "widget w-100"... We want to be doing well for all of them. Just as GoogleGuy suggested, we mine our server logs to keep finding the new ways that customers find things. It's dead easy to use server logs and ranking results to figure out the clickthrough rate for a given position, and that told us it's not important to look deep into the results for one query; it's important to have something for all the variations.
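The clickthrough-rate-by-position idea can be sketched roughly as follows; the input shapes here are assumptions for illustration, not the poster's actual setup:

```python
# Illustrative sketch: join (query -> rank) data with (query -> visit)
# counts from the logs, then compute clicks per impression at each rank.
# The dict-based inputs are an assumption made for this example.
from collections import defaultdict

def ctr_by_position(rankings, clicks, impressions):
    """rankings: {query: position in the SERPs};
    clicks: {query: visits seen in the logs};
    impressions: {query: estimated search volume}."""
    hits = defaultdict(int)
    views = defaultdict(int)
    for query, pos in rankings.items():
        hits[pos] += clicks.get(query, 0)
        views[pos] += impressions.get(query, 0)
    return {pos: hits[pos] / views[pos] for pos in views if views[pos]}
```

Aggregating like this across many queries is what lets you estimate how much traffic a given position is actually worth.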

So, yeah, we do a fair bit of scraping. We do it as politely as we can: big delays between searches, more than one originating IP, spreading hits across the datacenters (which has also told us a fair bit about how updates happen), etc. That politeness has apparently mostly kept us under the captcha limits. I could invest time into looking less bot-like (more IPs, more browser-like headers, occasional clickthroughs, etc.), but I'd really rather just pay for access to the API.
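The "big delays between searches" part might look something like this in Python; the 60-second base interval is an arbitrary illustration, not a rate anyone in this thread has endorsed:

```python
# A minimal sketch of polite scraping pacing: randomized gaps between
# requests so the traffic doesn't look like a tight machine loop.
import random
import time

def jittered_interval(base_seconds=60.0, jitter=0.5):
    """A delay of base_seconds, plus or minus up to jitter*base_seconds."""
    return base_seconds * (1.0 + random.uniform(-jitter, jitter))

def polite_fetch(urls, fetch, base_seconds=60.0):
    """Call fetch(url) for each URL, sleeping a jittered interval between calls."""
    results = []
    for i, url in enumerate(urls):
        if i:
            time.sleep(jittered_interval(base_seconds))
        results.append(fetch(url))
    return results
```

The jitter matters as much as the delay itself: fixed, evenly spaced requests are exactly the signature automated-query detection looks for.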

Let me say that again: I'd rather pay for access to results than scrape them. We could have an arms race between bot detection and human emulation, but I'd rather just be a customer. As a businessman and generally nice human being I don't want to play cat and mouse, I want to have reliable transactions and I don't understand why Google doesn't want that, too.

3:31 pm on Mar 9, 2005 (gmt 0)

10+ Year Member



I think this is related to Google's new verification code system: when adding a URL you must now verify you are human by typing in the displayed characters. This lets Google know you are human and not a bot. I guess Google is fighting back.

Google has already said that it does not recommend programs like "WebPosition Gold" and the like, which automatically submit your URL to search engines and directories. Google has always turned its nose up at this and finally wants to fight back.

Something as simple as a verification code: I would have thought they would have thought of this sooner, but I guess it is not that easy for them, because I'm sure they work with some of the online submitters like addme.com, submitexpress.com, etc. I wonder if it has affected any of them.
4:31 pm on Mar 9, 2005 (gmt 0)

WebmasterWorld Administrator rogerd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



>>Everybody thought I was nuts.

C'mon, ogletree, it's going to take more than one data point to convince us of your sanity. ;)

5:50 pm on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member




I second that. Querying via the API and querying manually yields differing results in almost every case. Usually the API results are off by 1-2 positions (on the negative side).

Is it 1-2 positions off or a couple of weeks behind? I'm sure it's the latter, i.e., your API key is somehow only allowed to query outdated results.

11:10 pm on Mar 9, 2005 (gmt 0)

10+ Year Member



I'm 100% behind insight's comments here. Google needs an API that can be used for commercial purposes. Why isn't there a commercial option? I guess Google doesn't need the money nowadays :-)

In which case, the coder-vs-Google arms race can continue, and I'm okay with that, because Google aren't trying that hard. Or at least, it's always seemed easy to me to get the information required.

11:13 pm on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is it 1-2 positions off or a couple of weeks behind?

Uh, you got me there... need to check into it.

3:18 am on Mar 10, 2005 (gmt 0)

10+ Year Member



I don't see a problem with using the Web API to check rankings for your own sites--just don't sell that tool/service

Funny, that's what I feel about Autolink. I don't see a problem with changing webpage content for yourself-- just don't sell [or give away] a toolbar that does it.

3:32 am on Mar 10, 2005 (gmt 0)

10+ Year Member



blaze, the collective weight of all those queries adds up; that's one of the main reasons we ask people not to do it. But the WebAPI is a great way to do 1000 queries/day of whatever you're interested in for your own personal purposes.

What a shame all those programming resources have to be dedicated to preventing misuse of Google results. Publishers feel the same way about Autolink's misuse of their work.

3:32 am on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, what about those of us who mine our logs and then use rank checking software to keep up with what we find in the logs?

-s-

3:58 am on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Perhaps a better question would be: how much is too much? Would a minute apart be Google-friendly, or 30 seconds?

I'm certainly willing to be gentle on Google's resources, but I still need to do my job too.

Any thoughts?

-s-

11:49 am on Mar 10, 2005 (gmt 0)

10+ Year Member



inurl:showrefs.php
5:09 pm on Mar 10, 2005 (gmt 0)

10+ Year Member



Insight said:
> GoogleGuy: then how do we get the data [page rankings]? Sites are often dependent on Google rankings for profitability, but we should just cross our fingers and hope we do well rather than seeing where we do well so we can make informed business decisions?

GoogleGuy replied:
> insight, I'd start with your server logs--that's gold that's usually not mined nearly as much as rank checking. Personally, I don't see a problem with using the Web API to check rankings for your own sites--just don't sell that tool/service. :)

Forgive my ignorance, but what kind of "gold" is GG referring to? Could one determine the link popularity for each of one's pages from the server logs?

Thanks,

Ric

5:27 pm on Mar 10, 2005 (gmt 0)

10+ Year Member



"Forgive my ignorance, but what kind of "gold" is GG referring to? "

The kind you 'pan' for, as opposed to 'strip mining'.

5:55 pm on Mar 10, 2005 (gmt 0)

10+ Year Member



Forgive my ignorance, but what kind of "gold" is GG referring to? Could one determine the link popularity for each of one's pages from the server logs?

He means that there is a lot of information in your server logs that can help you improve your site. You can see where your referrals are coming from, what search terms people are using, what paths through the site they are taking, etc.

PageRank info is not available in the logs, but you can see what keywords people are using to find you in a given search engine.
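Pulling those keywords out of log referrers can be sketched in a few lines of Python. This assumes the classic `q=` parameter in Google referrer URLs (the format in use at the time); it's an illustration, not a universal parser:

```python
# Sketch of mining search terms from access-log referrers: pull the q=
# parameter out of any referrer URL that points at a Google domain.
from urllib.parse import urlparse, parse_qs

def search_terms(referrer):
    """Return the query phrase if the referrer is a Google search, else None."""
    parts = urlparse(referrer)
    if "google." not in parts.netloc:
        return None
    q = parse_qs(parts.query).get("q")
    return q[0] if q else None
```

Run that over the `%{Referer}i` field of each log line and tally the results, and you have the keyword report GoogleGuy is pointing at.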

10:30 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



What communitynews said. I'd avoid getting obsessed about backlinks/PR and concentrate on useful content/services you can add to your site that will attract links naturally. Server logs give you a great picture of what words/phrases people are using to find your site. Usually looking through those logs will give you ideas about other pages that would be good to add to your site, for example.
12:47 am on Mar 11, 2005 (gmt 0)

10+ Year Member



GoogleGuy,

I, and a slew of others I'm sure, am waiting with bated breath for a response to Insight's comments:

I'd rather pay for access to results than scrape them. We could have an arms race between bot detection and human emulation, but I'd rather just be a customer. As a businessman and generally nice human being I don't want to play cat and mouse, I want to have reliable transactions and I don't understand why Google doesn't want that, too.

I'm aware that silence is a response, but I thought I'd prod a bit more.

2:05 am on Mar 11, 2005 (gmt 0)

10+ Year Member



MoneyMan, your name contains something Google doesn't want for this. I think Google doesn't want others making money on something it may want to make money on itself.

The lack of response to the suggestion they charge money for using the service implies this.

Am I right, GoogleGuy? You don't have to answer if I am right.

2:43 am on Mar 11, 2005 (gmt 0)

WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



> Am I right, GoogleGuy? You don't have to answer if I am right

Hehe. I'll have to remember that one. ;-)
