Penalise you for using WebPosition Gold: (April 2003)
Google and position checking software: (Jan 2003)
What's the status of Google vs WPG users? (July 2002)
What exactly gets banned using WebPosition Gold: the website or the IP? (June 2004)
Web Position Gold & Google IP bans (May 2009)
(sorry, the original thread about Google banning WPG from 1999-2000 was lost)
[edited by: Brett_Tabke at 3:06 pm (utc) on Aug. 7, 2008]
[edit reason] added history links [/edit]
he never answers the "why only this one company" part...
...because of the sheer quantity of unwelcome queries that WP Gold used to send to Google to scrape Google’s rankings
...appears to be an answer. Maybe not the one you want, but one many understand (though some may not agree with it).
Who knows, it really makes my head spin and I'm just some noob in a cube.
Heh! A peace offering. We all feel like that some time.
You're here as a voice for WPG. You've got a bunch of snarling Webmasters nipping at your ankles. What do we do now?
When is it going to get fixed? We may want to check Emergency Rooms around the country to see just how many have fallen out due to this "broken HTML" thing.
And what about that rogue affiliate site that "snookered" me? I ain't real happy about that. The schmucks caused me to go off on a tangent! They are also causing WPG a bit of Brand Damage at the moment.
You made the following comments about the site at 1stplacesoft.com, and the statements on that site:
I'm not sure what the deal is with [1stplacesoft.com...]
But the bits that they put on their site are not something that we would say.
Other than those two, anything else is a reseller or something else.
I'm sorry Scott, but both domains are owned by Webtrends, located at 851 SW 6th Ave., Suite 600, Portland OR, 97204 US. The phone # for 1stplacesoft.com is answered by the "Webposition by Webtrends" phone system.
I'm sure that when the rollout is complete, rank checking software companies will be able to update their software and test that it works.
For how long? Maybe these HTML updates are going to occur like the changes in the day to day SERPs. Maybe there is a reason for the change? Maybe? Just maybe...
We've been running since 2004, and haven't had any significant banning issues from any major search engine.
But then we're a bit different - we limit the search phrases to a sensible number, we limit the search depth to 30, we limit the report frequency and we query as politely as we can. Software will let you query with no limit (leaving Google to set its limits by IP blocks or whatever), whereas we set limits on our customers at what we deem to be a sensible level. Using someone like us means you don't need to use software - and so by our calculations the number of queries to Google and others is reduced drastically.
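The client-side throttling described above can be sketched roughly as follows. The specific limits and the `fetch_page` helper are hypothetical illustrations, not the service's actual numbers or code:

```python
import time

# Hypothetical limits, illustrating the kind of client-side
# throttling described above (not the service's actual numbers).
MAX_PHRASES = 20        # cap on search phrases per report
MAX_DEPTH = 30          # never read past result #30
RESULTS_PER_PAGE = 10   # query in normal 10-result pages

def polite_rank_check(phrases, fetch_page, min_interval=10.0):
    """Check rankings for each phrase while staying within the limits.

    `fetch_page(phrase, start)` is a placeholder for whatever actually
    retrieves one page of results; it should return a list of URLs.
    `min_interval` spaces successive requests out, in seconds.
    """
    results = {}
    last_request = 0.0
    for phrase in phrases[:MAX_PHRASES]:
        urls = []
        for start in range(0, MAX_DEPTH, RESULTS_PER_PAGE):
            # Space requests out instead of firing them back to back.
            wait = min_interval - (time.monotonic() - last_request)
            if wait > 0:
                time.sleep(wait)
            last_request = time.monotonic()
            urls.extend(fetch_page(phrase, start))
        results[phrase] = urls[:MAX_DEPTH]
    return results
```

The point of the sketch is only that the limits live on the client side, before any request is made, rather than leaving the engine to enforce them.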
We too believe that rank reports can't be the be-all like they were in the late 90s, but we believe that you must track rankings to monitor trends and spot weak spots.
Use becomes abuse when people over-do it, and this is the result.
Karl, from your own experience, have you had to periodically re-program how your service checks rankings? I.e., looking for tokens and such in the HTML to begin/end reading sections and whatnot?
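The kind of token-based parsing being asked about might look something like this. The begin/end markers here are made-up examples, since the real markup differs and changes whenever an engine updates its HTML, which is exactly why such parsers need periodic re-programming:

```python
# Hypothetical begin/end tokens marking the organic-results section
# of a results page; real markup differs and changes over time.
BEGIN_TOKEN = '<div id="results">'
END_TOKEN = '<div id="footer">'

def extract_results_section(html):
    """Return the slice of HTML between the begin and end tokens,
    or None if either token is missing (a sign the engine changed
    its markup and the parser needs updating)."""
    start = html.find(BEGIN_TOKEN)
    if start == -1:
        return None
    start += len(BEGIN_TOKEN)
    end = html.find(END_TOKEN, start)
    if end == -1:
        return None
    return html[start:end]
```

When the tokens stop matching, the parser fails loudly instead of silently mis-reading rankings, which is when the "scrambling to create updates" described later in the thread happens.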
What with all of the anti-link buying messaging that Google puts out, why not list the most popular link buying company?
Oh-oh, treading on thin ice there scottg, be careful. Remember, you are here to represent WPG, which is a WebTrends company, and I think we need to confirm that you are actually a representative of WPG at this point. If you are, do your superiors know of your activities online right now? I think you may get yourself into more trouble at the current pace so be very careful. ;)
Then, whether intentionally or not, you happen to overlook SnaptechSEO's post about the ownership of 1stplacesoft.com, whose "bits that they put on their site are not something that we would say."
I'm with P1R. I believe you should rethink your arguments and run them through your firm's PR department.
No red herrings intended. Just thought that we had covered the main issue and was putting my 2 cents in about WP and Google related topics. Sometimes it's nice to vent and/or explore ideas.
@SnaptechSEO, I do not think this is our site. Wrong URL for one thing. Looking at a public archive of the site, they used to have an affiliate URL when you click through to buy WP from that site. Thus... probably an old affiliate.
@karlwales - I'm not saying that you've had to recently make any significant changes in order to keep parsing Google results with your own software. What I was trying to ask/show was that in the past, you've had to adjust how your program works so that it can continue to parse. Right? That's all.
Talking to someone today, he said that every time Apple came out with a new flavor of their O/S, he knew programmers who went scrambling to create updates so that their programs would be compatible. By extension, because not everyone programs the same way, not everyone needs to create the same updates at the same time. When one of my competitors is having an issue with Google, this does not necessarily mean that WP is having the same issue... And when they are having a problem, it does not necessarily mean that we are.
It has been my experience, though, that you sometimes miss out on some of the IP/regional/etc. variations by using an API. I know that customers complained to me in the past that when using the SOAP API it wasn't always the same as their manual Google searches. And some of my competitors' programs try to be "nice" to the engines by setting their search to download 100 results per page rather than 10, and thus end up with some really odd results. I would think that this is one reason why we try to grab the actual results from a normal, actual search from a customer's IP address. Thus... what they get should be very similar to what they get in a normal search.
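The trade-off being described is simple arithmetic: requesting more results per page collapses several requests into one, at the cost of results that can differ from a normal search. As a rough illustration (the depths and page sizes here are just examples):

```python
import math

def requests_needed(depth, results_per_page):
    """Number of result-page requests needed to cover the top
    `depth` ranking positions."""
    return math.ceil(depth / results_per_page)

# Covering the top 30 positions takes 3 normal 10-result pages,
# but only 1 request at 100 results per page -- fewer queries,
# but not the page a typical searcher actually sees.
```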
But who knows, maybe something to look into in a future version of WP. No tool is 100% accurate because the output comes from many data centers, and various other biases from IP address, location, etc. IMO, the general idea is to get a general idea of how things are generally moving along with rankings. I look at search engine rankings as similar to the stock exchange, except that there is never a "closing price". Your stocks/rankings are constantly moving up/down and can vary from query to query.
Now we're getting somewhere. :)
Don't quote me on this but, I really do believe Google is grooming us for something similar to WP within Google Webmaster Tools. I have this sneaky suspicion and my suspicions have been fairly accurate up to now. Me "Tin Hat" is constantly receiving signals from all over the universe.
If we want that data, we'll most likely have access to it via Google in the near future. ;)
Um, without dropping a link, you can read my "Generic Malware Debunking Post" to see the comments between Scott and me <moderator note: see link in next post>.
There was another conversation I've had with Scott about WPG and rank checking in Google Groups <moderator note: see next post>. Google's response on rank checking has been the same for years and years.
I wouldn't be surprised if our bot detection continued to get better over time for many types of software that scrape Google. We'd prefer to use that server capacity for real users, not someone who wants to check how they rank for tons of queries going down hundreds of pages.
[edited by: tedster at 4:42 pm (utc) on Aug. 14, 2008]
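Rate-based bot detection of the kind described above can, as a purely generic illustration (this is not Google's actual method, and real systems weigh many more signals than query rate), be as simple as counting queries per client over a sliding window:

```python
import time
from collections import defaultdict, deque

class RateDetector:
    """Flag clients that exceed `max_queries` within `window` seconds.
    Illustrative thresholds only; real detection is far more involved."""

    def __init__(self, max_queries=100, window=60.0):
        self.max_queries = max_queries
        self.window = window
        self.history = defaultdict(deque)

    def record(self, client_ip, now=None):
        """Record one query; return True if the client looks automated."""
        now = time.monotonic() if now is None else now
        q = self.history[client_ip]
        q.append(now)
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_queries
```

A client running hundreds of deep ranking queries in a short burst trips a threshold like this long before any human searcher would.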
I wouldn't be surprised if our bot detection continued to get better over time for many types of software that scrape Google.
I always enjoy when you add those little pieces of information at the public level. :)
We'd prefer to use that server capacity for real users, not someone who wants to check how they rank for tons of queries going down hundreds of pages.
Ooh-Rah! If that ain't a shot over the bow. ;)
So, you wouldn't get mad at me in 2009 January when we start Blocking 75% of the Planet from accessing our regionally based websites? < Rhetorical question. ;)
scottg, I believe we have our answer. Your company's name is officially mentioned in the "do not" section of Google and now you have Matt Cutts, the official Google Representative confirming why.
Best thing to do at this point is to take it underground like the others have and stop rubbing it in the faces of the search engines! Your company is charging people for those reports. But yet, the resources you scrape for that data are not charging you? Google has made millionaires out of many and not charged them one penny in the process. That concept appears to be changing...
rank checking against Google can consume server capacity that we would prefer to use for actual searchers
IMHO if Google is concerned about WPG consuming server capacity, seems to me they would be vulnerable to all sorts of malicious botnet activity, which I don't see being discussed ... anywhere ... ever ... but if Google wants me to help spread that message I'll gladly get the word out :-)
A much more plausible reason is Google doesn't want you having/using the data. Since they are in the business of scraping/crawling websites, they need a little creative framing to change your point of view.
IMHO if Google is concerned about WPG consuming server capacity, seems to me they would be vulnerable to all sorts of malicious botnet activity, which I don't see being discussed ... anywhere ... ever
Was waiting for that "spinned" excuse to be blown apart. TY graywolf
Does the name Niels Provos sound familiar? It should. :) Do a search on Google for the phrase The reason behind the "We're sorry..." message to find his post which was all about bots, DDoS attacks, search worms, and botnets.
graywolf, just because you don't hear about it doesn't mean that we don't have a team of people working to protect Google against various types of bots. Because we do. :)