
Home / Forums Index / Marketing and Biz Dev / General Search Engine Marketing Issues
Forum Library, Charter, Moderators: mademetop

General Search Engine Marketing Issues Forum

    
Showing a "light-weight" template to spiders (Googlebot), acceptable?
Is showing a lightweight template against Google's TOS?
bigbird

5+ Year Member



 
Msg#: 3936246 posted 7:27 pm on Jun 18, 2009 (gmt 0)

Suppose a website uses a very graphics-, HTML-, and advertisement-heavy template for all users. Is it acceptable to intercept Googlebot and show the search spider the exact same URL and content, but with a completely different, extremely lightweight template? It sounds black hat, but at the same time there is no malicious intent, just an attempt to improve Googlebot's experience on the site. Right? :)

Is this a bad idea?
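To make the question concrete, here is a minimal sketch of the kind of template-switching being asked about: serving a stripped-down template when the User-Agent looks like Googlebot. The function and template names are hypothetical, not from any real site, and (as the replies below warn) this is exactly the behaviour that risks a cloaking penalty.

```python
# Hypothetical sketch of User-Agent-based template switching.
# The names (pick_template, "light", "full") are illustrative only.

GOOGLEBOT_TOKENS = ("Googlebot",)

def pick_template(user_agent: str) -> str:
    """Return which template a request would receive under this scheme."""
    if any(token in user_agent for token in GOOGLEBOT_TOKENS):
        return "light"  # minimal HTML, same content, no ads/graphics
    return "full"       # the graphics- and ad-heavy template for human visitors

print(pick_template("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)"))
```

Note that real crawlers can (and do) fetch pages with ordinary browser User-Agent strings precisely to detect this kind of divergence, which is why simple User-Agent sniffing is easy for a search engine to catch.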

 

davidh6781

5+ Year Member



 
Msg#: 3936246 posted 8:48 pm on Jun 18, 2009 (gmt 0)

From personal trial and error, Google's spiders would figure it out. Serving one set of content to users and another to spiders is risky; it's up to you if you try it. You may get some top SERPs, but it won't last, and a Google penalty is not something your domain/IP really needs if it's your business.

Others may have different opinions.

bigbird

5+ Year Member



 
Msg#: 3936246 posted 8:57 pm on Jun 18, 2009 (gmt 0)

Yes, David, thanks for the reply. I was thinking along the exact same lines; I guess I just wanted someone else to affirm it and get this idea out of my head.

davidh6781

5+ Year Member



 
Msg#: 3936246 posted 9:38 pm on Jun 18, 2009 (gmt 0)

No problem. It's a long time on Google's blacklist, so do it ethically; it may take time, but it pays off.

As you say, it's heavy on images and advert-driven, so maybe write some articles about the niche, etc. One-way inbound links help (PR and niche permitting).

Good luck in the adventure.

leadegroot

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3936246 posted 12:32 pm on Jun 19, 2009 (gmt 0)

Well, good design practice puts most of the images in the CSS, so bandwidth really shouldn't be much impacted by a crawler anyway.

janharders

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3936246 posted 12:42 pm on Jun 19, 2009 (gmt 0)

First of all: why should that be black hat? If you don't change the content of the page, just, say, the HTML, I don't see any unfair advantage here, only bandwidth savings and possibly reduced server load (e.g. Googlebot does not need the "who's online now?" sidebar).
That has ethical cloaking written all over it, imho. And unless Google starts to work with something similar to Yahoo's robots-nocontent, or starts improving its navigation extraction, I'd be very happy if site owners didn't send the navigation to Google. Around 20-30% of my SERPs include sites that have one of the keywords I'm looking for in the navigation and the others in the content, but the navigational keyword is not related to the content, so it's just not helping me at all. For example, a site that has Perl content and also has MySQL content does not necessarily have Perl + MySQL content, which is what I'm looking for when I search for "perl mysql".

Not saying that Google won't hand out penalties for that behaviour, just saying they shouldn't, because it's not black hat and it's actually fixing their flaws.

jkovar

5+ Year Member



 
Msg#: 3936246 posted 12:50 pm on Jun 19, 2009 (gmt 0)

Pretty much everything I've read at Google, or by people like Matt Cutts, indicates it all comes down to one question:

"Are you trying to deceive anyone ?"

Now, since Google doesn't display screenshots in the SERPs, it would seem there's no deception in taking out decorative elements.

However, since Google does display the file size of a page in the SERPs, and showing Google a trimmed-down version of the page would alter that number, it could be considered deceptive, since the number shown in the SERPs will be much different from what the visitor actually sees.

janharders

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3936246 posted 12:55 pm on Jun 19, 2009 (gmt 0)

However, since Google does display the file size of a page in the SERPs, and showing Google a trimmed-down version of the page would alter that number, it could be considered deceptive, since the number shown in the SERPs will be much different from what the visitor actually sees.

You're right, but on the other hand: what number do they show there? The gzipped size Googlebot might receive (iirc, they can handle compression), or the uncompressed size? Either way, a browser that does or does not receive a compressed version would see a drastically different amount of data transferred.

jkovar

5+ Year Member



 
Msg#: 3936246 posted 1:00 pm on Jun 19, 2009 (gmt 0)

Actually I stand corrected, Google doesn't appear to show filesizes anymore.

bigbird

5+ Year Member



 
Msg#: 3936246 posted 3:49 pm on Jun 19, 2009 (gmt 0)

Thanks for the replies, all. So it sounds like this is perhaps somewhat "shady", but not necessarily against the terms. I think I am going to experiment with this and report back.

