
Forum Moderators: mademetop


Showing a "light-weight" template to spiders (Googlebot), acceptable?

Is showing a lightweight template against Google's TOS?

7:27 pm on Jun 18, 2009 (gmt 0)

New User

5+ Year Member

joined:June 18, 2009
posts: 4
votes: 0


Suppose a website uses a very graphics-, HTML-, and advertisement-heavy template for all users. Is it acceptable to intercept Googlebot and show the search spider the exact same URL and content, but through a completely different, extremely lightweight template? It sounds black hat, but at the same time there is no malicious intent, just an attempt to improve Googlebot's experience on the site. Right? :)
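For context, the interception being described usually amounts to branching on the User-Agent header before choosing a template. A minimal Python sketch, where the template names and crawler tokens are invented for illustration; this is the cloaking technique under discussion, not a recommendation:

```python
# Hypothetical sketch of User-Agent interception. The template names,
# the crawler token list, and pick_template() itself are illustrative,
# not from any real site or framework. Serving different HTML to
# crawlers is cloaking and may violate Google's Webmaster Guidelines.

CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")

def pick_template(user_agent):
    """Return a template name based on the User-Agent header."""
    ua = (user_agent or "").lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        return "lightweight.html"   # stripped-down markup for spiders
    return "full.html"              # graphics- and ad-heavy template

print(pick_template("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # lightweight.html
print(pick_template("Mozilla/5.0 (Windows NT 6.0) Firefox/3.5"))  # full.html
```

Note that User-Agent strings are trivially spoofed, and Google is known to crawl from non-Googlebot user agents precisely to detect this kind of branching.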

Is this a bad idea?

8:48 pm on June 18, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Apr 30, 2009
posts:47
votes: 0


From personal trial and error, Google's spiders will figure it out. Serving one version of content to users and another to spiders is risky. It's up to you if you try it: you may get some top SERPs, but it won't last, and a Google penalty is not something your domain/IP really needs if it's your business.

Others may have different opinions.

8:57 pm on June 18, 2009 (gmt 0)

New User

5+ Year Member

joined:June 18, 2009
posts:4
votes: 0


Yes David, thanks for the reply. I'm thinking along the exact same lines; I guess I just wanted someone else to confirm it and get this idea out of my head.

9:38 pm on June 18, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Apr 30, 2009
posts:47
votes: 0


No problem. It's a long time on Google's blacklist, so do it ethically; it may take time, but it pays off.

As you say, it's heavy on images and advert-driven, so maybe write some articles about the niche, etc. One-way inbound links help (PR and niche permitting).

Good luck in the adventure.

12:32 pm on June 19, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 27, 2003
posts: 1642
votes: 0


Well, good design theory would put most of the decorative images in CSS, so bandwidth really shouldn't be impacted by a crawler anyway.

12:42 pm on June 19, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:May 31, 2008
posts:661
votes: 0


First of all: why should that be black hat? If you don't change the content of the page, only, say, the HTML, I don't see any unfair advantage here, just bandwidth savings and possibly reduced server load (e.g. Googlebot does not need the "who's online now?" sidebar).

That has ethical cloaking written all over it, imho. And unless Google starts to work with something similar to Yahoo's robots-nocontent, or starts improving its navigation extraction, I'd be very happy if site owners stopped sending the navigation to Google. Around 20-30% of my SERPs include sites that have one of the keywords I'm looking for in the navigation and the others in the content, but the navigational keyword is not related to the content, so it doesn't help me at all. A site that has Perl content and also has MySQL content does not necessarily have Perl + MySQL content, which is what I'm looking for when I search for "perl mysql".

Not saying that Google won't hand out penalties for that behaviour, just that they shouldn't, because it's not black hat and it's actually fixing their flaws.
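The "don't send the sidebar and navigation to the bot" idea boils down to filtering the markup before serving it to a known crawler. A toy Python sketch, where the class names and markup are invented and the regexes are used purely for illustration (regexes on HTML are fragile; a real implementation would use templates or an HTML parser):

```python
import re

# Hypothetical sketch: strip a "who's online" sidebar and the <nav>
# block from HTML served to a known crawler. The markup and the
# "whos-online" class name are invented for this example.

SIDEBAR = re.compile(r'<div class="whos-online">.*?</div>', re.S)
NAV = re.compile(r"<nav>.*?</nav>", re.S)

def strip_for_crawler(html):
    """Remove navigation and sidebar chrome, keeping the page content."""
    return NAV.sub("", SIDEBAR.sub("", html))

page = ('<nav>perl mysql linux</nav>'
        '<div class="whos-online">3 users</div>'
        '<p>Perl tutorial</p>')
print(strip_for_crawler(page))  # <p>Perl tutorial</p>
```

This illustrates the "perl mysql" complaint above: with the navigation gone, only the actual content keywords remain for the spider to index.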

12:50 pm on June 19, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Oct 3, 2008
posts:96
votes: 0


Pretty much everything I've read from Google, or from people like Matt Cutts, indicates it all comes down to one question:

"Are you trying to deceive anyone ?"

Now, since Google doesn't display screenshots in SERPs, it would seem there's no deception in taking out decorative elements.

However, since Google does display the file size of a page in SERPs, and showing Google a trimmed-down version of the page would alter that number, it could be considered deceptive, since the number shown in the SERPs would be much different from what the visitor actually sees.

12:55 pm on June 19, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:May 31, 2008
posts:661
votes: 0


However, since Google does display the file size of a page in SERPs, and showing Google a trimmed-down version of the page would alter that number, it could be considered deceptive, since the number shown in the SERPs would be much different from what the visitor actually sees.

You're right, but on the other hand: what number do they show there? The gzipped size that Googlebot might receive (iirc, it can handle compression), or the uncompressed size? Either way, a browser that does or does not receive a compressed version would transfer a drastically different amount of data.
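The gzip point is easy to demonstrate: the on-the-wire size of a compressed response can differ drastically from the size of the uncompressed markup. A quick Python illustration (the sample markup is made up, but repetitive HTML like this compresses very well):

```python
import gzip

# Compare the on-the-wire size of a gzip-compressed response with its
# uncompressed size, as a client sending "Accept-Encoding: gzip" would
# receive it. The HTML here is an invented, deliberately repetitive sample.
html = (b"<html><body>"
        + b"<p>repetitive page content</p>" * 200
        + b"</body></html>")

compressed = gzip.compress(html)
print(len(html))        # uncompressed byte count
print(len(compressed))  # far smaller for repetitive markup
```

So any "page size" number a search engine reports is ambiguous unless it says which of the two it measured.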

1:00 pm on June 19, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Oct 3, 2008
posts:96
votes: 0


Actually, I stand corrected: Google doesn't appear to show file sizes anymore.

3:49 pm on June 19, 2009 (gmt 0)

New User

5+ Year Member

joined:June 18, 2009
posts: 4
votes: 0


Thanks for the replies, all. So it sounds like this is perhaps somewhat "shady" but not necessarily against the terms. I think I'm going to experiment with this and report back.