| 8:58 pm on Aug 8, 2008 (gmt 0)|
|the title is still misleading, as the big G is not blocking it |
Well, some people are reporting that they are blocked. So to be fair to everyone's reports, I've changed the title of this thread into a question.
| 9:01 pm on Aug 8, 2008 (gmt 0)|
I'm glad that they are preventing it now. All these queries from programs used to check rankings have to inflate the search frequency numbers that Google uses in products like AdWords or Trends... Now we may start to get a better idea of the actual volume performed on a search term...
| 9:02 pm on Aug 8, 2008 (gmt 0)|
|he never answers the "why only this one company" part... |
Well, as I learned in freshman philosophy, the answer to "Why?" is "Because." So...
|...because of the sheer quantity of unwelcome queries that WP Gold used to send to Google to scrape Google’s rankings |
...appears to be an answer. Maybe not the one you want, but one many understand (though some may not agree with it).
| 9:34 pm on Aug 8, 2008 (gmt 0)|
|Who knows, it really makes my head spin and I'm just some noob in a cube. |
Heh! A peace offering. We all feel like that sometimes.
You're here as a voice for WPG. You've got a bunch of snarling Webmasters nipping at your ankles. What do we do now?
When is it going to get fixed? We may want to check Emergency Rooms around the country to see just how many have fallen out due to this "broken HTML" thing.
And what about that rogue affiliate site that "snookered" me? I ain't real happy about that. The schmucks caused me to go off on a tangent! They are also causing WPG a bit of Brand Damage at the moment.
| 10:30 pm on Aug 8, 2008 (gmt 0)|
Hello Scott, I'm not questioning your honesty, but perhaps you just lack full knowledge of your company.
You made the following comments about the site at 1stplacesoft.com, and the statements on that site:
|I'm not sure what the deal is with [1stplacesoft.com...] |
But the bits that they put on their site are not something that we would say.
You can find our URLs at:
Other than those two, anything else is a reseller or something else.
I'm sorry Scott, but both domains are owned by Webtrends located 851 SW 6th Ave., Suite 600, Portland OR, 97204 US. The phone # for 1stplacesoft.com is answered by the "Webposition by Webtrends" phone system.
| 10:38 pm on Aug 9, 2008 (gmt 0)|
Google has changed the HTML of their SERPS, but the rollout seems to be taking some time. I'm sure that when the rollout is complete, rank checking software companies will be able to update their software and test that it works.
| 2:40 am on Aug 10, 2008 (gmt 0)|
|I'm sure that when the rollout is complete, rank checking software companies will be able to update their software and test that it works. |
For how long? Maybe these HTML updates are going to occur like the changes in the day to day SERPs. Maybe there is a reason for the change? Maybe? Just maybe...
| 9:59 pm on Aug 10, 2008 (gmt 0)|
That would be the easiest way of outwitting ranking software.
| 11:44 pm on Aug 11, 2008 (gmt 0)|
Vanessa Fox has weighed in with various thoughts in another place.
| 9:02 am on Aug 12, 2008 (gmt 0)|
I run a company in the UK who actually provide a managed rank reporting service for SEOs on our own servers.
We've been running since 2004, and haven't had any significant banning issues from any major search engine.
But then we're a bit different - we limit the search phrases to a sensible number, we limit the search depth to 30, we limit the report frequency, and we query as politely as we can. Software will let you query with no limit (leaving Google to set its limits by IP blocks or whatever), whereas we cap our customers at what we deem to be a sensible level. Using a service like ours means you don't need to run software yourself - and so, by our calculations, the number of queries to Google and the others is reduced drastically.
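The capping described above can be sketched as a small scheduler. To be clear, this is an illustration only: the class name, the default numbers (other than the depth of 30 mentioned in the post), and the spacing interval are all hypothetical stand-ins, not the actual limits of any real service.

```python
import time

class PoliteQueryScheduler:
    """Sketch of rate-limited rank checking: cap the phrase list,
    cap the search depth, and space queries out over time.
    Defaults are illustrative, not real service limits."""

    def __init__(self, max_phrases=20, max_depth=30, min_interval=10.0):
        self.max_phrases = max_phrases    # phrases allowed per report
        self.max_depth = max_depth        # only read the top N results
        self.min_interval = min_interval  # seconds between queries
        self._next_slot = None            # earliest time for the next query

    def accept(self, phrases):
        # Refuse oversized reports rather than hammer the engine.
        if len(phrases) > self.max_phrases:
            raise ValueError("too many phrases for one report")
        return phrases

    def wait_turn(self, now=None):
        # Return how long to sleep so queries stay evenly spaced.
        now = time.monotonic() if now is None else now
        if self._next_slot is None or now >= self._next_slot:
            self._next_slot = now + self.min_interval
            return 0.0
        delay = self._next_slot - now
        self._next_slot += self.min_interval
        return delay
```

The point of the design is that the limits live on the service side: a desktop tool leaves throttling to the user (or to Google's blocks), while a hosted service can enforce spacing for every customer at once.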
We too believe that rank reports can't be the be-all like they were in the late 90s, but we believe that you must track rankings to monitor trends and spot weak spots.
Use becomes abuse when people over-do it, and this is the result.
| 6:03 pm on Aug 12, 2008 (gmt 0)|
Maybe I'm asking the question incorrectly then. I know that there are programs that are more aggressive in how they search for results. There are programs that click through paid results (ack, click fraud) and organic results to make sure that you actually get to the destination URL. Why not list them? What with all of the anti-link-buying messaging that Google puts out, why not list the most popular link buying company?
Karl, from your own experience, do you have to periodically re-program how your service checks rankings? I.e., looking for tokens and such in the HTML to begin/end reading sections and whatnot?
| 6:06 pm on Aug 12, 2008 (gmt 0)|
|What with all of the anti-link buying messaging that Google puts out, why not list the most popular link buying company? |
Oh-oh, treading on thin ice there scottg, be careful. Remember, you are here to represent WPG, which is a WebTrends company, and I think we need to confirm that you are actually a representative of WPG at this point. If you are, do your superiors know of your activities online right now? I think you may get yourself into more trouble at the current pace, so be very careful. ;)
| 7:51 pm on Aug 12, 2008 (gmt 0)|
Like Google is going to listen to me? I don't think I have any pull w/them. I'm also not recommending that they include the name of any particular company. Just saying... it seems odd that they would call out one company, for one policy point, and not do the same with other policies. It reminds me of the Cnet reporter "ban" a few years ago. In your own, non-corporate, whatever opinion: was that "justified," as a third party seeing what happened, with just the little that we know?
| 8:28 pm on Aug 12, 2008 (gmt 0)|
scottg, I really do appreciate your being here in this back and forth. However, I think one of the problems folks are having with your arguments and justifications is that they don't speak directly to the point. You've been throwing out a lot of red herrings for us to follow, such as Walmart, Net Neutrality, PPC, privacy, click fraud, paid links, Cnet, and whatever else that really don't apply to the conversation at hand.
Then, whether intentionally or not, you happen to overlook SnaptechSEO's post about the ownership of 1stplacesoft.com, whose "bits that they put on their site are not something that we would say."
I'm with P1R. I believe you should rethink your arguments and run them through your firm's PR department.
| 9:30 pm on Aug 12, 2008 (gmt 0)|
@Scottg - we have a report which runs each day to check that any HTML changes in the results pages haven't affected parsing accuracy, but recently we've not had to make any Google updates at all.
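A daily parsing-accuracy check of the kind described might look roughly like this. The `<h3 class="r">` marker below is an assumption about the era's result markup, used purely for illustration; a real parser keyed off whatever the engine actually served, which is exactly what breaks when the HTML changes.

```python
import re

# Assumed marker for an organic result title link (illustrative only).
RESULT_LINK = re.compile(r'<h3 class="r"><a href="([^"]+)"')

def extract_result_links(html):
    """Pull the outbound result URLs out of a results page."""
    return RESULT_LINK.findall(html)

def parsing_still_accurate(html, expected_count=10):
    """Daily sanity check: a normal first page should yield about
    ten results; far fewer usually means the markup has changed
    and the parser needs an update."""
    return len(extract_result_links(html)) >= expected_count
```

Run against a known query each day, a sudden drop in extracted results flags a markup change before customers start seeing empty reports.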
| 10:18 pm on Aug 12, 2008 (gmt 0)|
Sorry, I thought that I'd already addressed the central issue. WP is having problems reporting against Google at the moment. We are aware of this. We are working on an update.
No red herrings intended. Just thought that we had covered the main issue and was putting my 2 cents in about WP and Google related topics. Sometimes it's nice to vent and/or explore ideas.
@SnaptechSEO, I do not think this is our site. Wrong URL, for one thing. Looking at a public archive of the site, they used to have an affiliate URL when you clicked through to buy WP from that site. Thus... probably an old affiliate.
@karlwales - I'm not saying that you've recently had to make any significant changes in order to keep parsing Google results with your own software. What I was trying to ask/show was that in the past, you've had to adjust how your program works so that it can continue to parse. Right? That's all.
Talking to someone today, he said that every time Apple came out with a new flavor of their OS, he knew programmers who went scrambling to create updates so that their programs would stay compatible. By extension, because not everyone programs the same way, not everyone needs to create the same updates at the same time. When one of my competitors is having an issue with Google, this does not necessarily mean that WP is having the same issue... And when they are having a problem, it does not necessarily mean that we are.
| 4:05 am on Aug 13, 2008 (gmt 0)|
@scottg - ah, now I see what you mean. Yes - we have made a few systematic changes, but these have been mainly in the timing and spacing of queries. We have also recently changed to the AJAX API. We also used the SOAP API until it was pulled and we couldn't get new keys for new clients. I'm not sure how much detail I want to post on other changes we've made, but none of them are evil to Google and we do try to query as politely as we can by tightly limiting keywords/frequency and search depth (30).
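For context, querying the AJAX Search API meant hitting a public JSON endpoint rather than scraping result pages. The endpoint and parameter names below match Google's public documentation for that API at the time (the service has long since been retired), so treat this as a historical sketch of the request shape, not a working integration.

```python
from urllib.parse import urlencode

# Endpoint of the (now retired) Google AJAX Search API web-search service.
AJAX_ENDPOINT = "http://ajax.googleapis.com/ajax/services/search/web"

def ajax_search_url(query, start=0):
    """Build a request URL for the JSON web-search endpoint.
    Responses came back as JSON, so no HTML parsing (and no
    re-parsing after markup changes) was required."""
    params = {"v": "1.0", "q": query, "start": start}
    return AJAX_ENDPOINT + "?" + urlencode(params)
```

This is why API users were insulated from the HTML change being discussed: the structured response format stayed stable even while the human-facing result pages were reworked.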
| 3:08 pm on Aug 13, 2008 (gmt 0)|
Ahhh. There we go, thanks Karl. That is the thing. If you use an API like the AJAX one, you get a fairly consistent output format. Few, if any, changes are needed.
It has been my experience, though, that you sometimes miss out on some of the IP/regional/etc. variations by using an API. I know that customers complained to me in the past that, when using the SOAP API, results weren't always the same as their manual Google searches. And some of my competitors' programs try to be "nice" to the engines by setting their search to download 100 results per page rather than 10, and thus end up with some really odd results. I would think that this is one reason why we try to grab the actual results from a normal, actual search from a customer's IP address. Thus, what they get should be very similar to what they get in a normal search.
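The two fetching styles contrasted above differ only in query parameters. `num` and `start` were Google's standard result-count and offset parameters; the trade-off is that ten `num=10` requests mimic what a real searcher's browser sends, while a single `num=100` request is fewer hits but, as noted, can come back with noticeably different rankings. A minimal sketch:

```python
from urllib.parse import urlencode

SEARCH_ENDPOINT = "http://www.google.com/search"

def serp_urls(query, depth, per_page):
    """Return the request URLs needed to read `depth` results,
    fetching `per_page` results per request."""
    return [
        SEARCH_ENDPOINT + "?" + urlencode(
            {"q": query, "num": per_page, "start": start})
        for start in range(0, depth, per_page)
    ]

# Mimicking a real searcher: ten small requests.
searcher_style = serp_urls("blue widgets", depth=100, per_page=10)
# Being "nice" on request count: one big request.
bulk_style = serp_urls("blue widgets", depth=100, per_page=100)
```

Either style still has to be issued from somewhere, which is the poster's point about using the customer's own IP: results fetched from the customer's address should match what that customer sees in a manual search.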
But who knows, maybe something to look into in a future version of WP. No tool is 100% accurate, because the output comes from many data centers, with various other biases from IP address, location, etc. IMO, the general idea is to get a general idea of how things are generally moving along with rankings. I look at search engine rankings as similar to the stock exchange, except that there is never a "closing price". Your stocks/rankings are constantly moving up/down and can vary from query to query.
| 3:37 pm on Aug 13, 2008 (gmt 0)|
|IMO, the general idea is to get a general idea of how things are generally moving along with rankings. I look at search engine rankings as similar to the stock exchange, except that there is never a "closing price". Your stocks/rankings are constantly moving up/down and can vary from query to query. |
Now we're getting somewhere. :)
Don't quote me on this but, I really do believe Google is grooming us for something similar to WP within Google Webmaster Tools. I have this sneaky suspicion and my suspicions have been fairly accurate up to now. Me "Tin Hat" is constantly receiving signals from all over the universe.
If we want that data, we'll most likely have access to it via Google in the near future. ;)
| 4:52 pm on Aug 13, 2008 (gmt 0)|
Problem appears to have gone away and everything is hunky-dory. Other online rank checkers I occasionally consult (multi-DC, blah blah) seem to still be having problems.
| 5:15 pm on Aug 13, 2008 (gmt 0)|
Spoke too soon. I just didn't get any errors this time, but the problem remains.
| 9:01 pm on Aug 13, 2008 (gmt 0)|
Yep, still working on it. I've had some that say they're now OK, with the last update, but that doesn't mean everyone is back up and running...
| 11:30 pm on Aug 13, 2008 (gmt 0)|
| 4:33 pm on Aug 14, 2008 (gmt 0)|
I've talked before (including with scottg at least twice) about how rank checking against Google can consume server capacity that we would prefer to use for actual searchers.
Um, without dropping a link, you can read my "Generic Malware Debunking Post" to see the comments between Scott and me <moderator note: see link in next post>.
There was another conversation I've had with Scott about WPG and rank checking in Google Groups <moderator note: see next post>. Google's response on rank checking has been the same for years and years.
I wouldn't be surprised if our bot detection continued to get better over time for many types of software that scrape Google. We'd prefer to use that server capacity for real users, not someone who wants to check how they rank for tons of queries going down hundreds of pages.
[edited by: tedster at 4:42 pm (utc) on Aug. 14, 2008]
| 4:45 pm on Aug 14, 2008 (gmt 0)|
Here are the links that Matt mentioned above:
Generic Malware Debunking Post [mattcutts.com] on Matt's blog
Google Groups discussion [groups.google.com] between Scott and Matt
| 5:26 pm on Aug 14, 2008 (gmt 0)|
|I wouldn't be surprised if our bot detection continued to get better over time for many types of software that scrape Google. |
I always enjoy when you add those little pieces of information at the public level. :)
|We'd prefer to use that server capacity for real users, not someone who wants to check how they rank for tons of queries going down hundreds of pages. |
Ooh-Rah! If that ain't a shot over the bow. ;)
So, you wouldn't get mad at me in 2009 January when we start Blocking 75% of the Planet from accessing our regionally based websites? < Rhetorical question. ;)
scottg, I believe we have our answer. Your company's name is officially mentioned in the "do not" section of Google and now you have Matt Cutts, the official Google Representative confirming why.
Best thing to do at this point is to take it underground like the others have and stop rubbing it in the faces of the search engines! Your company is charging people for those reports. And yet the resources you scrape for that data are not charging you? Google has made millionaires out of many and not charged them one penny in the process. That concept appears to be changing...
|brotherhood of LAN|
| 5:50 pm on Aug 14, 2008 (gmt 0)|
Sounds prudent, WPG is pretty much synonymous with 'rank checking'
Mind you, the argument about server resources wouldn't have held much water in 1998..but probably slightly off topic.
| 6:12 pm on Aug 14, 2008 (gmt 0)|
|rank checking against Google can consume server capacity that we would prefer to use for actual searchers |
IMHO if Google is concerned about WPG consuming server capacity, seems to me they would be vulnerable to all sorts of malicious botnet activity, which I don't see being discussed ... anywhere ... ever ... but if Google wants me to help spread that message I'll gladly get the word out :-)
A much more plausible reason is Google doesn't want you having/using the data. Since they are in business of scraping/crawling websites they need a little creative framing to change your point of view.
| 6:21 pm on Aug 14, 2008 (gmt 0)|
|IMHO if Google is concerned about WPG consuming server capacity, seems to me they would be vulnerable to all sorts of malicious botnet activity, which I don't see being discussed ... anywhere ... ever |
Was waiting for that "spinned" excuse to be blown apart. TY graywolf
| 6:02 am on Aug 16, 2008 (gmt 0)|
graywolf, whitenight, it's not spin at all. Bots send a lot of traffic to Google, not only for scraping and rank checking but also to try to find email addresses, guestbooks to spam, and forums to linkbomb. That's not all, either. Anybody remember the Santy worm? Boy, I sure do. It spread to new webservers by searching on Google to find vulnerable webservers. In fact, Niels Provos wrote a whole paper about this stuff. Search for [search worms] to read it.
Does the name Niels Provos sound familiar? It should. :) Do a search on Google for the phrase The reason behind the "We're sorry..." message to find his post which was all about bots, DDoS attacks, search worms, and botnets.
graywolf, just because you don't hear about it doesn't mean that we don't have a team of people working to protect Google against various types of bots. Because we do. :)
| 4:26 pm on Aug 16, 2008 (gmt 0)|
Disclaimer: I don't personally use WPG. I care because of the principle of the matter
|graywolf, whitenight, it's not spin at all. |
By definition, everything is "spin". It's just a matter of how much, and for whom, to whom.
if there's a Goog heaven you'll definitely make it. :P
As always, my gripe with how you handle things is the basic flaw in the understanding of YOUR relationship to US (webmasters)
Mr. Schmidt was on MadMoney the other day swooning over his love of CONTENT PROVIDERS. And then goes on to mention the NYT as an example?! What?!
Do you guys at the plex still not realize WEBMASTERS/SEOs are both your source AND your customers?!
And I'm talking about the same webmasters who think they need WPG to check their rankings every 4 hours...
THOSE are the webmasters writing topics on boring, obscure, silly, or unique content (read - not national news, gossip, or sports) that makes YOUR company so profitable.
I can seriously get updates on the War, Paris Hilton, and Yankees scores ANYWHERE. And rest assured, people NEVER needed Google for that.
What YOUR CUSTOMERS do need Google for, is the new page some smart SEO just wrote that people keep looking for information on, while Google keeps feeding up broad, unsatisfactory, or off-topic pages from Wiki, Answers.com (more scraped content), or A NYT article from 3 years ago.
It's the SEO community that's going to write the most comprehensive, thorough, user-friendly, etc page on any given subject precisely because we know what BOTH Goog wants and the CUSTOMERS want.
(and yes, most small time webmasters are going to CHECK if that page is ranking)
When are you guys over there, going to realize this?!...
And start working WITH US, to "make the world a better place" (quoting Mr. Schmidt) instead of perceiving every SEO activity as a "threat to your business model"
It's Neanderthal thinking (every Goog employee should be required to read Ken Wilber's work). Google itself works as an advanced-form of business BUT THEN you make the ultimately fatal mistake of slipping back into US vs. THEM thinking with everyone outside of Goog.