Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Hide menus from googlebot
member22
msg:4400104
2:02 pm on Dec 21, 2011 (gmt 0)

Is there a way (a code) to hide my menus from googlebot, but not hide them from people who come to my website?

tedster
msg:4400200
5:04 pm on Dec 21, 2011 (gmt 0)

Showing googlebot something different from other visitors is called cloaking. If it's detected, you may lose all rankings.

Staffa
msg:4400279
9:49 pm on Dec 21, 2011 (gmt 0)

True, tedster, but with all of G's shenanigans, where do you draw the line?

dstiles
msg:4400301
10:44 pm on Dec 21, 2011 (gmt 0)

I do this with forms and a very few menu items (eg form links) for all bots, scrapers etc.

Simplest is to detect the bot name in the User-Agent (eg googlebot) and enclose the site code (eg menu, form etc) in an IF block.

Alternatively, and better if you have the patience, detect browser User-Agents (Firefox, MSIE etc) and allow known browsers rather than disallowing known bots.
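A minimal server-side sketch of the User-Agent check dstiles describes (the function name and token list are my own, not from the thread). As tedster notes above, serving different HTML by User-Agent is cloaking by Google's definition, so this carries real risk:

```javascript
// Tokens commonly found in crawler User-Agent strings (illustrative list).
const BOT_TOKENS = ["googlebot", "bingbot", "slurp", "baiduspider"];

// Returns true when the User-Agent looks like a known bot.
function isBotUA(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  return BOT_TOKENS.some((token) => ua.includes(token));
}

// In a request handler, the "IF block" he mentions would look like:
// if (!isBotUA(req.headers["user-agent"])) { html += renderMenu(); }
```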

netmeg
msg:4400314
11:20 pm on Dec 21, 2011 (gmt 0)

You draw the line at how much risk you are willing to live with.

tedster
msg:4400344
12:46 am on Dec 22, 2011 (gmt 0)

Back to the main question - what is so bad about your menus that you want to hide them from googlebot?

Planet13
msg:4400345
12:48 am on Dec 22, 2011 (gmt 0)

Back to the main question - what is so bad about your menus that you want to hide them from googlebot?


I second the question.

By understanding what your concerns are, we might be able to suggest a less radical (read "risky") solution.

enigma1
msg:4400513
2:20 pm on Dec 22, 2011 (gmt 0)

Is there a way (a code) to hide my menus from googlebot

You can use javascript and present the same HTML to both visitors and bots, so there is no cloaking risk. The difference is that bots won't be able to follow your menus, because they won't find any links, while people with default browser settings (ie js enabled) can navigate as before. There are other methods, but js is the simplest.

These methods can also be used for internal PageRank flow control, eliminating automated comment form spam, etc., without worrying about drawbacks such as search engines changing their algorithms every so often.
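A sketch of the js-menu idea enigma1 describes (function and data shapes invented for illustration): the page ships with an empty container and no anchor tags, and this script builds the links at load time, so a crawler that doesn't execute javascript finds nothing to follow.

```javascript
// Build menu markup from a data array; the HTML source itself contains
// no <a> tags for a non-js crawler to harvest.
function buildMenu(items) {
  return items
    .map((item) => '<a href="' + item.href + '">' + item.label + "</a>")
    .join("");
}

// In the browser, on DOMContentLoaded (hypothetical container id "menu"):
// document.getElementById("menu").innerHTML = buildMenu(menuItems);
```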

tedster
msg:4400585
5:47 pm on Dec 22, 2011 (gmt 0)

However, Google is executing more and more javascript (and forms) every year. Something to keep in mind if you want to "future-proof" your development.

I still want to hear what the perceived problems are with indexing the menu. In many cases, whatever they may seem to be, there may be straightforward approaches that run ZERO risk. And it's always possible that the concerns are not really something to worry about anyway.

My idea of Google SEO is more like surfing the Google wave than trying to funnel it into some tightly confined channel.

coachm
msg:4400609
6:58 pm on Dec 22, 2011 (gmt 0)

Tedster, we have ten different sites that are related, but provide original focused content that might be of interest across the network. So, for example (I've changed the topics), let's say I have a site about leadership, and I have a site about careers. We advertise products that might be of interest to visitors from both sites, so we have common element menus in a sidebar, that are...well, almost identical.

It's logical to do it that way for a bunch of reasons, and we've got our sites templated to look similar in terms of layout and function, but differing in end-user appearance (colors, banner, other details).

We don't want to be penalized for dupe content in the sidebars that is the same across domains, so what we've done is we are now starting to serve common elements that occur on several domains via DFP (iframes).

It's an experiment, but doing it that way allows us other flexibility for other things, and it "should" have the added benefit of getting rid of the dupe content.

We're in the ridiculous situation of trying to serve content that is relevant to visitors with different interests and demographics, but we've been slammed in the rankings, probably because of what google sees as dupe content.

So, that's our thinking. Could be wrong. Ya know, the black box of Panda.

(Then again, Google might see our use of DFP to serve content as plastering our pages with ads, which isn't the case in the sense of having adsense or affiliate ads all over the place.)

deadsea
msg:4400850
12:04 pm on Dec 23, 2011 (gmt 0)

Put the menus in an iframe and put the src of the iframe in robots.txt. Use ajax to load the menus and put the data loaded by ajax in robots.txt.

Using robots.txt to hide things from Googlebot isn't cloaking. It's a design feature of their crawler that they make available for us to use.

The only drawback is that your site preview may look wonky if googlebot can't render the page completely.
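deadsea's setup could look something like this (the fragment path is hypothetical): the menu lives in its own file pulled into every page via an iframe, and robots.txt blocks crawlers from fetching that file.

```
# robots.txt -- keep crawlers away from the menu fragment
User-agent: *
Disallow: /fragments/menu.html
```

Each page then includes the menu with `<iframe src="/fragments/menu.html"></iframe>`; the ajax variant is the same idea, with the Disallow rule covering the URL the script fetches.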

enigma1
msg:4400853
12:20 pm on Dec 23, 2011 (gmt 0)

However, Google is executing more and more javascript (and forms) every year. Something to keep in mind if you want to "future-proof" your development.

I believe they are still at square one. They can only decode some inline javascript and otherwise have to follow the html tags. In the logs I have kept for years, there is no access to side scripts by googlebot, not unless you provide a hard-coded link to a css or js file.

And even if they start accessing them and deploy a browser engine to parse content, what chance do they have to see what this code does?


<div class="my-link">A link</div>


Do you think they can ever tell whether the element is hooked to a click or hover handler and processed in a side script, perhaps activating a form action or loading another page? And that's without any encoding or encryption, or combining multiple tags.

That's a straightforward, clean approach. No cloaking, zero risk, and you get lots of advantages and flexibility vs a questionable nofollow attribute.
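One way the div above could be wired up (the class-to-URL mapping and destination are invented for illustration): the destination exists only in script, so there is no href in the HTML for a crawler to harvest.

```javascript
// Map of CSS class names to destinations, known only to this script.
const LINK_TARGETS = { "my-link": "/members/offers.html" };

// Resolve a clicked element's class to its navigation target, if any.
function resolveTarget(className) {
  return LINK_TARGETS[className] || null;
}

// In the browser, the handler enigma1 describes:
// div.addEventListener("click", () => {
//   location.href = resolveTarget(div.className);
// });
```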

member22
msg:4400894
3:22 pm on Dec 23, 2011 (gmt 0)

Thank you for all your replies. I like the idea of the robots.txt file with the iframe, but I'm not sure about the javascript, as I believe google can read that.

bumpski
msg:4402014
10:17 pm on Dec 28, 2011 (gmt 0)

Put the menus in an iframe and put the src of the iframe in robots.txt. Use ajax to load the menus and put the data loaded by ajax in robots.txt.

Even simpler: put the menus in iframes and mark the robots meta tag of the "iframed" html file(s) as "noindex, follow". The keywords in site-wide links that might "confuse" Google about the specific page's topic will be gone. (So might your "sitelinks", if you have them!)

Google is still allowed to see (crawl) your menus, but your meta tag suggests that Google not index (noindex) the content. And Google doesn't. I'm pretty sure Google will still follow the links in the iframed html file too. I don't know whether Google would honor a nofollow meta tag in this context.

I've done this for years without penalty; remember you are letting Google crawl your menus, but Google won't index the content of them.

You really don't need robots.txt to achieve your goal.
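bumpski's iframed menu file might look like this (filenames and links are hypothetical); the meta tag lets Google crawl the file and follow its links while keeping the menu text out of the index:

```
<!-- /menu.html, included on every page via <iframe src="/menu.html"> -->
<html>
<head>
  <meta name="robots" content="noindex, follow">
</head>
<body>
  <a href="/leadership/">Leadership</a>
  <a href="/careers/">Careers</a>
</body>
</html>
```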

I dislike giving up this tip, but lately Google has sort of gotten on my nerves. (How about you?!)

tedster
msg:4402018
10:41 pm on Dec 28, 2011 (gmt 0)

member22, if you don't answer the question I asked above, then I don't think you can depend on any other advice in this thread to really help you. You may be taking actions that do absolutely nothing - or even hurt you.

So, back to that question - what is so bad about your menus that you want to hide them from googlebot?

lucy24
msg:4402034
12:03 am on Dec 29, 2011 (gmt 0)

Using robots.txt to hide things from Googlebot isn't cloaking. It's a design feature of their crawler that they make available for us to use.

The only drawback is that your site preview may look wonky if googlebot can't render the page completely.

If it looks wonky, it won't be because of anything in robots.txt. Preview doesn't use the googlebot and is therefore exempt from any robots.txt restrictions. (The same applies to Translate.)

I think the thread's right next door.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved