

Does Google Block Web Position Gold Ranking Reports?

     
9:17 pm on Aug 5, 2008 (gmt 0)

10+ Year Member



Has anyone else noticed that, after years of threatening to do so, Google now appears to have blocked WP Gold from reporting rankings through their tool? We called the WP support line and they said they are waiting for Google to 'do' something and have NO ETA as to when it will be fixed.



WebPosition and Google - The SAGA:

Background history:

Penalise you for using webposition gold: (April 2003)
[webmasterworld.com...]

Google and position checking software: (Jan 2003)
[webmasterworld.com...]

What's the status of Google vs WPG Users? (July 2002)
[webmasterworld.com...]

What exactly gets banned using WebPositionGold: the website or the IP? (June 2004)
[webmasterworld.com...]

Web Position Gold & Google IP bans (May 2009)
[webmasterworld.com...]

(sorry, the original thread about Google banning WPG from 1999-2000 was lost)

[edited by: Brett_Tabke at 3:06 pm (utc) on Aug. 7, 2008]
[edit reason] added history links [/edit]

8:58 pm on Aug 8, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



the title is still misleading, as the big G is not blocking it

Well, some people are reporting that they are blocked. So to be fair to everyone's reports, I've changed the title of this thread into a question.

9:01 pm on Aug 8, 2008 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I'm glad that they are preventing it now. All these queries from programs used to check rankings have to inflate the search frequency numbers that Google uses in products like AdWords or Trends... Now we may start to get a better idea of the actual volume performed on a search term...
9:02 pm on Aug 8, 2008 (gmt 0)

WebmasterWorld Senior Member jimbeetle is a WebmasterWorld Top Contributor of All Time 10+ Year Member



he never answers the "why only this one company" part...

Well, as I learned in freshman philosophy, the answer to "Why?" is "Because." So...

...because of the sheer quantity of unwelcome queries that WP Gold used to send to Google to scrape Google’s rankings

...appears to be an answer. Maybe not the one you want, but one many understand (though some may not agree with it).

9:34 pm on Aug 8, 2008 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Who knows, it really makes my head spin and I'm just some noob in a cube.

Heh! A peace offering. We all feel like that some time.

You're here as a voice for WPG. You've got a bunch of snarling Webmasters nipping at your ankles. What do we do now?

When is it going to get fixed? We may want to check Emergency Rooms around the country to see just how many have fallen out due to this "broken HTML" thing.

And what about that rogue affiliate site that "snookered" me? I ain't real happy about that. The schmucks caused me to go off on a tangent! They are also causing WPG a bit of Brand Damage at the moment.

10:30 pm on Aug 8, 2008 (gmt 0)

5+ Year Member



Hello Scott, I'm not questioning your honesty, but perhaps you just lack full knowledge of your company.

You made the following comments about the site at 1stplacesoft.com, and the statements on that site:

I'm not sure what the deal is with [1stplacesoft.com...]
But the bits that they put on their site are not something that we would say.

You can find our URLs at:
[webposition.com...] and
[webtrends.com...]

Other than those two, anything else is a reseller or something else.

I'm sorry Scott, but both domains are owned by Webtrends located 851 SW 6th Ave., Suite 600, Portland OR, 97204 US. The phone # for 1stplacesoft.com is answered by the "Webposition by Webtrends" phone system.

Cheers,
Mike

10:38 pm on Aug 9, 2008 (gmt 0)

5+ Year Member



Google has changed the HTML of their SERPs, but the rollout seems to be taking some time. I'm sure that when the rollout is complete, rank checking software companies will be able to update their software and test that it works.
2:40 am on Aug 10, 2008 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I'm sure that when the rollout is complete, rank checking software companies will be able to update their software and test that it works.

For how long? Maybe these HTML updates are going to occur like the changes in the day to day SERPs. Maybe there is a reason for the change? Maybe? Just maybe...

9:59 pm on Aug 10, 2008 (gmt 0)

5+ Year Member



That would be the easiest way of outwitting ranking software.
11:44 pm on Aug 11, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Vanessa Fox has weighed in with various thoughts in another place.
9:02 am on Aug 12, 2008 (gmt 0)

5+ Year Member



I run a company in the UK that provides a managed rank reporting service for SEOs on our own servers.

We've been running since 2004, and haven't had any significant banning issues from any major search engine.

But then we're a bit different - we limit the search phrases to a sensible number, we limit the search depth to 30, we limit the report frequency, and we query as politely as we can. Software will let you query with no limit (leaving Google to set its limits by IP blocks or whatever), whereas we set limits on our customers at what we deem to be a sensible level. Using someone like us means you don't need to use software - and so, by our calculations, the number of queries to Google and others is reduced drastically.

We too believe that rank reports can't be the be-all and end-all like they were in the late 90s, but we believe that you must track rankings to monitor trends and spot weak spots.

Use becomes abuse when people overdo it, and this is the result.
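The limits karlwales describes (a sensible cap on phrases, a depth limit of 30, spaced-out query frequency) can be sketched as a simple query plan. The cap values, names, and structure below are illustrative assumptions, not his actual implementation:

```python
# Sketch of "polite querying": clamp the number of tracked phrases,
# cap the search depth, and spread queries out over time.
# MAX_PHRASES and MIN_INTERVAL_S are invented for illustration;
# MAX_DEPTH = 30 is the depth limit mentioned in the post.

from dataclasses import dataclass

MAX_PHRASES = 25      # assumed cap on phrases per customer
MAX_DEPTH = 30        # search depth limit from the post
MIN_INTERVAL_S = 60   # assumed minimum spacing between queries

@dataclass
class QueryJob:
    phrase: str
    depth: int
    start_offset_s: int  # seconds after the run begins

def build_query_plan(phrases):
    """Clamp the phrase list and depth, and space the queries out."""
    jobs = []
    for i, phrase in enumerate(phrases[:MAX_PHRASES]):
        jobs.append(QueryJob(phrase=phrase, depth=MAX_DEPTH,
                             start_offset_s=i * MIN_INTERVAL_S))
    return jobs
```

A customer asking for 40 phrases would get back 25 jobs, each limited to the top 30 results and scheduled a minute apart, rather than an unbounded burst of requests.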

6:03 pm on Aug 12, 2008 (gmt 0)

5+ Year Member



@jimbeetle
Maybe I'm asking the question incorrectly then. I know that there are programs that are more aggressive in how they search for results. There are programs that click through paid results (ack, click fraud) and organic results to make sure that you actually get to the destination URL. Why not list them? What with all of the anti-link buying messaging that Google puts out, why not list the most popular link buying company?

@karlwales
Karl, from your own experience, do you have to periodically re-program how your service checks rankings? I.e., looking for tokens and such in the HTML to begin/end reading sections and whatnot?

6:06 pm on Aug 12, 2008 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member



What with all of the anti-link buying messaging that Google puts out, why not list the most popular link buying company?

Oh-oh, treading on thin ice there scottg, be careful. Remember, you are here to represent WPG, which is a WebTrends company, and I think we need to confirm that you are actually a representative of WPG at this point. If you are, do your superiors know of your activities online right now? I think you may get yourself into more trouble at the current pace, so be very careful. ;)

7:51 pm on Aug 12, 2008 (gmt 0)

5+ Year Member



@pageoneresults
Like Google is going to listen to me? I don't think I have any pull w/them. I'm also not recommending that they include the name of any particular company. Just saying... it seems odd that they would call out one company, for one policy point, and not do the same with other policies. It reminds me of the Cnet reporter "ban" a few years ago. In your own, non-corporate, whatever opinion, was that "justified", as a third party seeing what happened, with just the little that we know?
8:28 pm on Aug 12, 2008 (gmt 0)

WebmasterWorld Senior Member jimbeetle is a WebmasterWorld Top Contributor of All Time 10+ Year Member



scottg, I really do appreciate your being here in this back and forth. However, I think one of the problems folks are having with your arguments and justifications is that they don't speak directly to the point. You've been throwing out a lot of red herrings for us to follow, such as Walmart, Net Neutrality, PPC, privacy, click fraud, paid links, Cnet, and whatever else that really don't apply to the conversation at hand.

Then, whether intentionally or not, you happen to overlook SnaptechSEO's post about the ownership of 1stplacesoft.com, whose "bits that they put on their site are not something that we would say."

I'm with P1R. I believe you should rethink your arguments and run them through your firm's PR department.

9:30 pm on Aug 12, 2008 (gmt 0)

5+ Year Member



@Scottg - we have a report which runs each day to check that any html changes in the results pages haven't affected parsing accuracy, but recently we've not had to do any Google updates at all.
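A daily "did our parser break?" check like the one described above can be sketched as parsing a results page and flagging the run if the extracted count falls outside the expected range. The HTML markup and thresholds below are invented for illustration; a real check would run against a freshly fetched page, not a canned sample:

```python
# Hypothetical daily parsing-accuracy check: extract result links
# from a sample page and verify the count is roughly what we expect.

import re

# Invented sample markup standing in for a fetched results page.
SAMPLE_SERP_HTML = "\n".join(
    '<div class="result"><a href="http://example.com/%d">Result %d</a></div>'
    % (i, i)
    for i in range(1, 11)
)

RESULT_PATTERN = re.compile(r'<div class="result"><a href="([^"]+)">')

def parsing_accuracy_ok(html, expected=10, tolerance=2):
    """Return True if the parser extracts roughly the expected count;
    a sudden drop to zero suggests the page layout changed under us."""
    found = len(RESULT_PATTERN.findall(html))
    return abs(found - expected) <= tolerance
```

If Google shipped new SERP markup, `found` would drop to zero overnight and the check would fail, which is exactly the signal a service like this needs before customers notice bad reports.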
10:18 pm on Aug 12, 2008 (gmt 0)

5+ Year Member



@jimbeetle
Sorry, I thought that I'd already addressed the central issue. WP is having problems reporting against Google at the moment. We are aware of this. We are working on an update.

No red herrings intended. Just thought that we had covered the main issue and was putting my 2 cents in about WP and Google related topics. Sometimes it's nice to vent and/or explore ideas.

@SnaptechSEO, I do not think this is our site. Wrong URL, for one thing. Looking at a public archive of the site, they used to have an affiliate URL when you clicked through to buy WP from that site. Thus... probably an old affiliate.

@karlwales - I'm not saying that you've had to recently make any significant changes in order to keep parsing Google results with your own software. What I was trying to ask/show was that in the past, you've had to adjust how your program works so that it can continue to parse. Right? That's all.

Talking to someone today, he said that every time Apple came out with a new flavor of their O/S, he knew programmers who went scrambling to create updates so that their programs would be compatible. By extension, because not everyone programs the same way, not everyone needs to create the same updates at the same time. When one of my competitors is having an issue with Google, this does not necessarily mean that WP is having the same issue... And when they are having a problem, it does not necessarily mean that we are.

4:05 am on Aug 13, 2008 (gmt 0)

5+ Year Member



@scottg - ah, now I see what you mean. Yes - we have made a few systematic changes, but these have been mainly in the timing and spacing of queries. We also recently changed to the AJAX API. We also used the SOAP API until it was pulled and we couldn't get new keys for new clients. I'm not sure how much detail I want to post on other changes we've made, but none of them are evil to Google, and we do try to query as politely as we can by tightly limiting keywords/frequency and search depth (30).
3:08 pm on Aug 13, 2008 (gmt 0)

5+ Year Member



Ahhh. There we go, thanks Karl. That is the thing. If you use an API like AJAX, you get a fairly consistent output format, so little, if any, change is needed.

It has been my experience, though, that you sometimes miss out on some of the IP/regional/etc. variations by using an API. I know that customers complained to me in the past that when using the SOAP API, the results weren't always the same as their manual Google searches. And some of my competitors' programs try to be "nice" to the engines by setting their search to download 100 results per page rather than 10, and thus end up with some really odd results. I would think that this is one reason why we try to grab the actual results from a normal, actual search from a customer's IP address. Thus... what they get should be very similar to what they get in a normal search.

But who knows, maybe something to look into in a future version of WP. No tool is 100% accurate because the output comes from many data centers, with various other biases from IP address, location, etc. IMO, the general idea is to get a general idea of how things are generally moving along with rankings. I look at search engine rankings as similar to the stock exchange, except that there is never a "closing price". Your stocks/rankings are constantly moving up/down and can vary from query to query.
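The stock-ticker analogy suggests treating any single rank reading as one noisy sample. One simple way to watch the trend rather than the jitter is a moving average over recent checks; the function and the sample data below are illustrative, not anything from WP:

```python
# Smooth noisy daily rank readings with a trailing moving average,
# so day-to-day data-center variation doesn't dominate the report.

def moving_average(ranks, window=3):
    """Average each reading with up to (window - 1) prior readings."""
    smoothed = []
    for i in range(len(ranks)):
        recent = ranks[max(0, i - window + 1): i + 1]
        smoothed.append(sum(recent) / len(recent))
    return smoothed

daily_ranks = [4, 7, 3, 5, 12, 6, 4]  # hypothetical daily positions
trend = moving_average(daily_ranks)
```

The raw series jumps between 3 and 12, but the smoothed trend stays in a narrow band, which is closer to the "general idea of how things are moving" that matters.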

3:37 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member



IMO, the general idea is to get a general idea of how things are generally moving along with rankings. I look at search engine rankings as similar to the stock exchange, except that there is never a "closing price". Your stocks/rankings are constantly moving up/down and can vary from query to query.

Now we're getting somewhere. :)

Don't quote me on this but, I really do believe Google is grooming us for something similar to WP within Google Webmaster Tools. I have this sneaky suspicion and my suspicions have been fairly accurate up to now. Me "Tin Hat" is constantly receiving signals from all over the universe.

If we want that data, we'll most likely have access to it via Google in the near future. ;)

4:52 pm on Aug 13, 2008 (gmt 0)

5+ Year Member



Problem appears to have gone away and everything is hunky-dory. Other online rank checkers I occasionally consult (multi-DC, blah blah) seem to still be having problems.
5:15 pm on Aug 13, 2008 (gmt 0)

5+ Year Member



Spoke too soon. I just didn't get any errors this time, but the problem remains.
9:01 pm on Aug 13, 2008 (gmt 0)

5+ Year Member



Yep, still working on it. I've had some that say they're now OK, with the last update, but that doesn't mean everyone is back up and running...
11:30 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



There isn't "one" page format to be parsed. There are obviously results with different types of universal results, and there are results with images and maps in them. There are the "split screen" results, and so on, too. Not only that, but sometimes results have extra JavaScript code on the links for tracking, so there are a variety of different code sets to parse - and I guess Google might track whether a real browser is seeing the results as a factor in how they handle this "additional stuff". They could also easily detect that you showed multiple UAs from the one IP concurrently.
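A scraper facing several coexisting page formats, as described above, typically keeps a chain of fallback extractors and tries each in turn. The patterns below match invented markup variants, not Google's actual HTML of the era:

```python
# Hypothetical fallback parsing: try each known layout's pattern in
# order and return the first non-empty set of extracted URLs.

import re

LAYOUT_PATTERNS = [
    re.compile(r'<h3 class="r"><a href="([^"]+)"'),     # "plain" layout (invented)
    re.compile(r'<a class="l" href="([^"]+)"'),         # variant layout (invented)
    re.compile(r'onmousedown="[^"]*" href="([^"]+)"'),  # JS-tracked links (invented)
]

def extract_urls(html):
    """Return URLs from the first layout pattern that matches anything."""
    for pattern in LAYOUT_PATTERNS:
        urls = pattern.findall(html)
        if urls:
            return urls
    return []
```

When the engine rolls out yet another variant, only a new pattern needs to be appended; the existing layouts keep parsing, which matches the observation that different users see different code sets at the same time.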
4:33 pm on Aug 14, 2008 (gmt 0)



I've talked before (including with scottg at least twice) about how rank checking against Google can consume server capacity that we would prefer to use for actual searchers.

Um, without dropping a link, you can read my "Generic Malware Debunking Post" to see the comments between Scott and me <moderator note: see link in next post>.

There was another conversation I've had with Scott about WPG and rank checking in Google Groups <moderator note: see next post>. Google's response on rank checking has been the same for years and years.

I wouldn't be surprised if our bot detection continued to get better over time for many types of software that scrape Google. We'd prefer to use that server capacity for real users, not someone who wants to check how they rank for tons of queries going down hundreds of pages.

[edited by: tedster at 4:42 pm (utc) on Aug. 14, 2008]

4:45 pm on Aug 14, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Here are the links that Matt mentioned above:

Generic Malware Debunking Post [mattcutts.com] on Matt's blog

Google Groups discussion [groups.google.com] between Scott and Matt

5:26 pm on Aug 14, 2008 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I wouldn't be surprised if our bot detection continued to get better over time for many types of software that scrape Google.

I always enjoy when you add those little pieces of information at the public level. :)

We'd prefer to use that server capacity for real users, not someone who wants to check how they rank for tons of queries going down hundreds of pages.

Ooh-Rah! If that ain't a shot over the bow. ;)

So, you wouldn't get mad at me in January 2009 when we start Blocking 75% of the Planet from accessing our regionally based websites? < Rhetorical question. ;)

scottg, I believe we have our answer. Your company's name is officially mentioned in the "do not" section of Google, and now you have Matt Cutts, the official Google representative, confirming why.

Best thing to do at this point is to take it underground like the others have and stop rubbing it in the faces of the search engines! Your company is charging people for those reports. And yet, the resources you scrape for that data are not charging you? Google has made millionaires out of many and not charged them one penny in the process. That concept appears to be changing...

5:50 pm on Aug 14, 2008 (gmt 0)

WebmasterWorld Administrator brotherhood_of_lan is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



take it underground

Sounds prudent, WPG is pretty much synonymous with 'rank checking'

Mind you, the argument about server resources wouldn't have held much water in 1998..but probably slightly off topic.

6:12 pm on Aug 14, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



rank checking against Google can consume server capacity that we would prefer to use for actual searchers

IMHO, if Google is concerned about WPG consuming server capacity, it seems to me they would be vulnerable to all sorts of malicious botnet activity, which I don't see being discussed ... anywhere ... ever ... but if Google wants me to help spread that message, I'll gladly get the word out :-)

A much more plausible reason is Google doesn't want you having/using the data. Since they are in business of scraping/crawling websites they need a little creative framing to change your point of view.

6:21 pm on Aug 14, 2008 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



IMHO if Google is concerned about WPG consuming server capacity, seems to me they would be vulnerable to all sorts of malicious botnet activity, which I don't see being discussed ... anywhere ... ever

Was waiting for that "spinned" excuse to be blown apart. TY graywolf

6:02 am on Aug 16, 2008 (gmt 0)



graywolf, whitenight, it's not spin at all. Bots send a lot of traffic to Google, not only for scraping and ranking checking but also to try to find email addresses, guestbooks to spam, and forums to linkbomb. That's not all, either. Anybody remember the Santy worm? Boy, I sure do. It spread to new webservers by searching on Google to find vulnerable webservers. In fact, Niels Provos wrote a whole paper about this stuff. Search for [search worms] to read it.

Does the name Niels Provos sound familiar? It should. :) Do a search on Google for the phrase The reason behind the "We're sorry..." message to find his post which was all about bots, DDoS attacks, search worms, and botnets.

graywolf, just because you don't hear about it doesn't mean that we don't have a team of people working to protect Google against various types of bots. Because we do. :)

This 132 message thread spans 5 pages.
