Google News Archive Forum

    
Googlebot and JavaScript menus
Can Googlebot spider a page with a JavaScript/DHTML menu?
QNetwork
msg:63666 - 7:05 pm on Oct 16, 2002 (gmt 0)

I recently changed my site navigation to a JavaScript & DHTML combination. Previously it had GIF images with links, and all spiders did quite well with that. I am now worried whether the spiders (especially Google) will be able to crawl my whole site after this DHTML menu change. Otherwise, I will have to come up with some sort of sitemap pretty soon. If any of you have experience with a similar situation, please let me know. Thanks.

[Note to the moderator: I do not know whether it is okay to say "look at my profile for the example site". If it is not okay, please delete these lines.]

 

NeedScripts
msg:63667 - 7:24 pm on Oct 16, 2002 (gmt 0)

I think you will need a sitemap :)

jatar_k
msg:63668 - 7:26 pm on Oct 16, 2002 (gmt 0)

I think you may have a bit of a problem. I am no expert but I would say your side menu won't get followed at all.

Mark_A
msg:63669 - 8:07 pm on Oct 16, 2002 (gmt 0)

Hi QNetwork, welcome. Your site has a nice look to it.

I think if you used DHTML menus like hierarchical ones rather than JS menus, the main link from the menu bar would be spiderable. You would then create an index page for each section with fixed links to the content. Spiders and visitors with security set to high would use that; the rest would use the DHTML drop-down (hierarchical) menus.

e.g.
left menu item
"electronics" leads to page /electronics/index.html or whatever, with your content and links on it ... but a mouseover creates a pop-up menu for that section ...

Oops, that's not what your site does now, is it ... forget it ... I'm sorry I spoke now ... well heck, I'm posting anyhow, it took me all of a minute to write this :-)

I am sure you could create the same effect as you have with a more spider friendly method.
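
A rough sketch of what I mean (just an illustration I typed up; the section names, paths, and function names are made up):

<!-- The menu bar item is an ordinary link a spider can follow;
     the DHTML pop-up is layered on top for JS-capable browsers. -->
<a href="/electronics/index.html"
   onmouseover="showMenu('electronics')"
   onmouseout="hideMenu('electronics')">Electronics</a>

<!-- Hidden pop-up with the section's sub-links, shown by script on mouseover -->
<div id="menu-electronics" style="display:none">
  <a href="/electronics/cameras.html">Cameras</a><br>
  <a href="/electronics/phones.html">Phones</a>
</div>

<script language="JavaScript">
// Minimal show/hide helpers for the pop-up (illustrative only)
function showMenu(name) { document.getElementById('menu-' + name).style.display = 'block'; }
function hideMenu(name) { document.getElementById('menu-' + name).style.display = 'none'; }
</script>

That way the spider still gets /electronics/index.html (which has its own fixed links), and JS users get the pop-up.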

Someone with a clue will no doubt be along shortly :-)

Jon_King
msg:63670 - 3:09 am on Oct 17, 2002 (gmt 0)

I have used DHTML menus quite a bit and researched this topic a bit as well... I have proved to myself, through actual SE results over time, that they do not get spidered. My workaround is a duplicate text-only menu that I place at the bottom of each page.
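
Something along these lines, right before the closing </body> tag (the section names and paths below are just placeholders):

<!-- Plain text duplicate of the DHTML menu so spiders and no-JS
     visitors can still reach every section. -->
<p>
  <a href="/electronics/">Electronics</a> |
  <a href="/computers/">Computers</a> |
  <a href="/about.html">About</a> |
  <a href="/sitemap.html">Site Map</a>
</p>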

QNetwork
msg:63671 - 1:36 pm on Oct 17, 2002 (gmt 0)

Thanks for all the answers. I went ahead and added a site map link at the bottom of the home page. The site map has every link in plain text.

Allergic
msg:63672 - 1:50 pm on Oct 17, 2002 (gmt 0)

There is no problem with *your* DHTML menu. BTW, you can always test it with the spider simulator here: [searchengineworld.com...] to see how a robot sees your page. With more sophisticated "software style" DHTML menus, Google usually sees the first level of links but can't see the second level and deeper.

QNetwork
msg:63673 - 6:04 pm on Oct 17, 2002 (gmt 0)

Thanks Allergic. I just tried the spider simulator. It did not show any of the links from the menu. Anyway, I liked that tool.

Allergic
msg:63674 - 7:24 pm on Oct 17, 2002 (gmt 0)

Sorry for the mistake, QNetwork. I saw so many links that I assumed they were inside your DHTML menu. You should change your DHTML menu code, and also never put the JavaScript inline in your pages. Use an external file instead, like <script language=JavaScript src="your_dhtml_menu.js"></script>. That way you only have to change one file when you add a category, and it is better for the robots.
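
In other words, instead of the whole menu script sitting in every page, something like this (the file name is only an example):

<!-- Before: the entire menu pasted inline into every page -->
<script language="JavaScript">
  // ...all the menu arrays and positioning code repeated on each page...
</script>

<!-- After: one shared external file referenced from every page -->
<script language="JavaScript" src="your_dhtml_menu.js"></script>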

QNetwork
msg:63675 - 8:34 pm on Oct 17, 2002 (gmt 0)

Good suggestion. I'll make the changes.

TobyCow
msg:63676 - 2:13 pm on Oct 18, 2002 (gmt 0)

I think you could try an even simpler change: the <NOSCRIPT> tags - spiders pay attention to 'em, so links listed there should get picked up... I guess...
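
Something like this, maybe (the links are just examples):

<script language="JavaScript" src="your_dhtml_menu.js"></script>
<noscript>
  <!-- Plain links for spiders and visitors with scripting off -->
  <a href="/electronics/">Electronics</a>
  <a href="/sitemap.html">Site Map</a>
</noscript>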

Sasquatch
msg:63677 - 5:45 pm on Oct 18, 2002 (gmt 0)

It is also good to consider that a small percentage of people (along with all spiders) surf with JavaScript off, and your site has to REALLY interest them before they would turn JavaScript on.

What percentage of your visitors are you willing to lose because your menu is in Javascript?

jtoddv
msg:63678 - 6:34 pm on Oct 18, 2002 (gmt 0)

Your biggest issue is the dynamic pages.

QNetwork
msg:63679 - 6:45 pm on Oct 18, 2002 (gmt 0)

It is also good to consider that a small percentage of people (along with all spiders) surf with javascript off

I verified that the site works okay even with the "Low" security setting in IE. If you turn off active scripting, the menu does not appear. I have no clue how many average users go that far. You might be right about the spiders; that's why I added the site map.

QNetwork
msg:63680 - 6:47 pm on Oct 18, 2002 (gmt 0)

Your biggest issue is the dynamic pages.

Biggest issue in terms of what? Spidering? GoogleGuy mentioned in one of the threads that Google is spidering dynamic pages well. FAST does it pretty well too. Let me know if you think there is any other issue related to the dynamic pages.

Sasquatch
msg:63681 - 7:04 pm on Oct 18, 2002 (gmt 0)

But it doesn't work if you set it to "High".

Now that I think about it, you really don't have anything to worry about. The people who would be surfing with JavaScript off or security set to high would not want to go to your site anyway. They are the sort of people that would not be clicking on affiliate links. No offense intended.

It is a problem with some real store sites and manufacturer sites, though. They seem to be willing to throw out the 20% of users that do not use IE, or an even higher percentage when they start requiring lower security settings, or even Flash.

QNetwork
msg:63682 - 7:21 pm on Oct 18, 2002 (gmt 0)

Please accept my apologies. I meant "Medium", not "Low". Obviously it works with "Low". You put me in a thinking phase again. I got rid of the image-based menu so that the pages load faster; if that prevents users from viewing the pages correctly, I do not think it's worth it. I agree with you that people with the "High" security setting will rarely be frequent visitors of this site, but I still do not want to miss them.

QNetwork
msg:63683 - 3:51 am on Oct 19, 2002 (gmt 0)

Thanks everyone for providing me with so many important suggestions. I changed my home page menu to image links. It looks the same. Load time is a little bit longer, but eventually it will make the site much more spider-friendly (which counts a lot). Users with the "High" security setting should not have any problem. I'll eventually change the other pages over to this new image-based menu. Thanks again.

NeedScripts
msg:63684 - 3:59 am on Oct 19, 2002 (gmt 0)

<script language=JavaScript src="your_dhtml_menu.js"></script>

Instead of your_dhtml_menu.js, can I use [domain.com...]

Or is there any other way where I can use the same code so that it works across the different category levels?

cminblues
msg:63685 - 4:32 am on Oct 19, 2002 (gmt 0)

Use an external file instead, like <script language=JavaScript src="your_dhtml_menu.js"></script>. That way you only have to change one file when you add a category, and it is better for the robots.

This is not exactly a link ;)
Perhaps because of this, I've never seen Googlebot [and I mean 'Googlebot' he..] fetching a .js file "pointed to" by a src= attribute.

cminblues

NeedScripts
msg:63686 - 8:51 pm on Oct 19, 2002 (gmt 0)

So what would be a way where I can call the JavaScript code from the same subdirectory (one subdirectory under the root for all .js files), but use it everywhere on the web site (1100+ subdirectories)?
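
For example, could every page just point at a root-relative path, something like this (the /js/ directory is only an example of what I have in mind)?

<!-- A root-relative src resolves to the same shared file from any page,
     no matter how deep the page's subdirectory is. -->
<script language="JavaScript" src="/js/your_dhtml_menu.js"></script>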
