
Google Cloaking and Keyword Loading On Pages

Breaking their own rules?

     

arrowman

5:49 pm on Mar 8, 2005 (gmt 0)

10+ Year Member



Funny story and discussion on Slashdot:
[slashdot.org...]

shri

2:18 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I gotta buy me one of these appliances and ... *grin*

Had a good laugh though. :)

GoogleGuy

2:32 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



arrowman, you're right--the GSA does support search over metatags. I suspect that when those internal support pages are changed, any additional information in our database that can help the Search Appliance will be in the metatags--for both users and googlebots. :)

Brett_Tabke

2:44 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



The part I don't understand is: weren't those pages covered by a robots.txt ban? Or was it set to ignore that on its own site?

lovethecoast

2:47 am on Mar 9, 2005 (gmt 0)

10+ Year Member



Once the pages are fully changed, people will have to follow the same procedure that anyone else would (email webmaster at google.com with the subject "Reinclusion request" to explain the situation).

Wonder if they'll have to wait for months for reinclusion...

(Not a slam -- I've, thankfully, never had a site blacklisted, and as much as I respect you, this answer just isn't holding much water with me.)

WA_Smith

2:49 am on Mar 9, 2005 (gmt 0)

10+ Year Member



WOW

shri

3:27 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> weren't those pages covered by a robots.txt ban

And why is the GSA using the same UA as the regular bot? :)

stuntdubl

3:39 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Not 100% sure here Shri, but it could be that the GSA uses the same string: "googlebot", but not necessarily the same UA.

For instance: SA-Googlebot/2.1 (+http://www.googlebot.com/sa-bot.html) or something to this effect.
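
A minimal Python sketch of that idea, purely illustrative: "SA-Googlebot" is stuntdubl's made-up string and "gsa-crawler" is the appliance default mentioned later in this thread; neither the code nor these exact tokens come from Google.

def classify_crawler(user_agent):
    # Check the more specific appliance tokens first, since a string like
    # "SA-Googlebot/2.1" also contains the bare "googlebot" substring.
    ua = user_agent.lower()
    if "sa-googlebot" in ua or "gsa-crawler" in ua:
        return "search appliance"
    if "googlebot" in ua:
        return "public googlebot"
    return "other"

classify_crawler("SA-Googlebot/2.1 (+http://www.googlebot.com/sa-bot.html)")  # "search appliance"
classify_crawler("Googlebot/2.1 (+http://www.google.com/bot.html)")           # "public googlebot"

The ordering matters: if the bare substring were tested first, an appliance crawl would be misclassified as the public bot.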

Chris_D

4:05 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi Brett,

The [adwords.google.co.uk...] file says:

User-agent: *
Disallow: /

User-Agent: Googlebot
Allow: /
Allow: /support/
Disallow: /*?

The pages were in the /support/ folder.

[edited by: Chris_D at 4:06 am (utc) on Mar. 9, 2005]
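
If you want to sanity-check those rules, here is a minimal sketch using Python's standard urllib.robotparser. One caveat: the stdlib parser doesn't implement Google's wildcard extension, so the "Disallow: /*?" line is treated as a literal path here, and the example path is just a placeholder; this only approximates how Googlebot itself reads the file.

from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /

User-Agent: Googlebot
Allow: /
Allow: /support/
Disallow: /*?
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

rp.can_fetch("Googlebot", "/support/some-page.html")     # True: Googlebot is explicitly allowed
rp.can_fetch("SomeOtherBot", "/support/some-page.html")  # False: every other crawler is shut out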

msgraph

4:05 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"(the code only checks for "Googlebot") "

Hmmm that's funny, before your post and before it all of a sudden went poof, I tried some different versions with Googlebot but some didn't work. Like "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html) didn't work". If it checked for Googlebot wouldn't it have picked that up?
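
Just to illustrate msgraph's puzzle (hypothetical test strings, not Google's actual check): a bare substring test would match every one of these variants, including the full Mozilla-style UA, so whatever the cloaking code did was presumably stricter than that.

tested_agents = [
    "Googlebot",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
]

for ua in tested_agents:
    # A plain substring check says "yes" for all three of these strings.
    print("Googlebot" in ua, ua)

One guess is that the check was anchored at the start of the User-Agent header, which would pass the first two strings and fail the Mozilla-style one, but that is speculation.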

msgraph

4:20 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Also, I'd like to know from Google what happens if I make a database error on my sites where some keywords are accidentally stuffed into the title on some pages that are buried so deep that they get missed. If the sites get penalized for some reason, is there a rapid-response form I can fill out to have them de-penalized within the next few days, when the datacenters update?

chewy

4:32 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Did anyone catch a screenshot of the cached page?

Sorry I missed it.

Post it here?

WebFusion

4:52 am on Mar 9, 2005 (gmt 0)

10+ Year Member



Personally, I think all this "outrage" is laughable at best. Even if Google was cloaking, it IS, after all, THEIR engine. If they wanted to use all the dirty SEO tricks in the world to make their own pages show up first in a search, then more power to them.

I don't mean to trumpet the Google horn, as I'm not a big fan, but I think a lot of people have really lost perspective as to what an independent company "owes" them.

Filipe

5:00 am on Mar 9, 2005 (gmt 0)

10+ Year Member



eBay - some of the best SEOers on the net.

I know it! I interviewed there with a couple of others at their San Jose offices.

I didn't make the cut :(

GoogleGuy

6:31 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



msgraph, the process will be the same: an email to webmaster at google.com with a subject of "Reinclusion request." The report won't be treated differently compared to other requests.

GoogleGuy

6:33 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



P.S. I'm going to bed now, but tomorrow I'll re-check this thread, and also this one which is interesting: [webmasterworld.com...]

Essex_boy

7:30 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member essex_boy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



lost perspective as to what an independent company "owes" them

It owes us nothing, I agree, but at least follow your own rules and be loyal to your customers.

Anyway, did you notice that G's home page is PR 9?

mrMister

9:39 am on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it was a cheap publicity stunt to highlight the dangers of cloaking to the general public. :-)

keeper

10:42 am on Mar 9, 2005 (gmt 0)

10+ Year Member



the Google Search Appliance uses "Googlebot" as a user agent

I thought the default user-agent for the search appliance was "gsa-crawler"?

Or was this changed to be the same as the web crawler for some reason?

Just Guessing

11:27 am on Mar 9, 2005 (gmt 0)

10+ Year Member



Once the pages are fully changed, people will have to follow the same procedure that anyone else would (email webmaster at google.com with the subject "Reinclusion request" to explain the situation).

Interesting - a reinclusion request for individual pages that have been excluded, as opposed to a whole site. Is that new?

walkman

12:23 pm on Mar 9, 2005 (gmt 0)



I tend to agree with Brett: it's stupid, and whoever did it has nothing to do with the search people. Instead of adding all those keywords and cloaking, they should've just added a few keywords and asked for a link from a few good Google pages. That's how they make their money, so I doubt he/she would've had a problem getting it.

To GoogleGuy:
"msgraph, the process will be the same: an email to webmaster at google.com with a subject of "Reinclusion request." The report won't be treated differently compared to other requests."

I would treat it differently and include it ASAP. Everyone here would do the same, so let's stop pretending. It's your own page and your own search engine, and you can do as you wish. End of story!

mrMister

12:38 pm on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member




Walkman:

It's stupid and whoever did [this] has nothing to do with the search people.

GoogleGuy:

Those pages were primarily intended for the Google Search Appliances

The guys that write the search engine weren't involved. However, whoever did this is some kind of search person. They should really have known the side effects of stuffing extra keywords into the title served to any user presenting the UA substring "googlebot".

That said, as has been mentioned, all the other search engines are blocked from this page, so it only affects their own engine. Even so, they've gone so far as to punish themselves by banning the offending pages from the index.

It was a mistake, and they've received the same consequences that any other site would have received if it had made the same mistake. Seems fair enough to me.

[edited by: mrMister at 12:42 pm (utc) on Mar. 9, 2005]

walkman

12:41 pm on Mar 9, 2005 (gmt 0)



"Not the guys that write the search engine. But they're obviously people involved in search and should really have known that showing this page to any user presenting the UA substring "googlebot" is cloaking for Googlebot "

Look, someone screwed up. With 3,000 employees, for all they know, they might even have a serial killer at the Googleplex. I mean, one guy or a group did something stupid, embarrassed the company, and got caught; I'm sure it will be dealt with, and that does it for me.

mrMister

12:44 pm on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



main person who could have answered questions was flying back to the U.S. on a plane

Is this the guy who is responsible for the mistake? And Google bought him a return ticket? ;-)

mrMister

12:45 pm on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



for all they know, they might even have a serial killer on GooglePlex too.

/me looks suspiciously at GoogleGuy

Nah, can't be him, they always say it's the quiet ones you have to watch out for :-)

EBear

12:50 pm on Mar 9, 2005 (gmt 0)

10+ Year Member



A fun thread. I'm sorry I didn't find it while the cached pages were live. I take it all those pages were over two years old, since they got into the index?

Just Guessing

12:57 pm on Mar 9, 2005 (gmt 0)

10+ Year Member



Is this the guy that is responsible for the mistake? And Google bought him a return ticket?

I'd fire his boss. How many Google employees actually have the authority to publish web pages on their site? What sort of guidelines, training, education, and supervision are they given? Given Google's business, and their ethos, this is one area where you would keep close tabs on your employees.

walkman

1:27 pm on Mar 9, 2005 (gmt 0)



"I'd fire his boss."

Yep, fire him or whatever they decide. Employees come up with stupid ideas all the time. Once the boss gives the OK to implement them, he's responsible.

GG: since you mentioned you'll be watching another thread, please take a look at this too: [webmasterworld.com...]
It's hard for us to say how much our sites are being hurt by it (because other things might be in play too), but the problem exists and can be seen in cached pages.

GoogleGuy

4:02 pm on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Just Guessing, in emails to webmasters at google.com you can give as much detail as you want; usually people request reinclusion for a whole site at once, but you could request just for individual pages.

walkman, I'm happy to walk around and ask people about this more. Have you sent an email to webmaster at google.com with the keyword "canonicalpage"? That will help make sure that any reports about canonicalization (including redirects) get to the right engineers.

walkman

4:13 pm on Mar 9, 2005 (gmt 0)



"walkman, I'm happy to walk around and ask people about this more. Have you sent an email to webmaster at google.com with the keyword "canonicalpage"? That will help make sure that any reports about canonicalization (including redirects) get to the right engineers. "

GG, my reply will be posted in the above 302 thread, so as not to go way too off topic here.

caveman

5:20 pm on Mar 9, 2005 (gmt 0)

WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Two days ago, our son blatantly disregarded an important house rule, even though the house rule is posted in writing on the 'fridge. He had been disregarding the house rule for some time, and had been repeatedly warned to stop his bad behavior.

So we banned him from the cave.

He has submitted a reinclusion request, and it is under consideration. Fortunately for him, the weather is relatively warm right now.

========

No offense meant, G! This has got to be a tricky one. But you gotta admit, from the outside looking in, it's hard not to make a few jokes. Happily, we know you've got a sense of humor! :-)
