
Forum Moderators: open


Googlebot and Javascript menu

Can Googlebot spider a page with javascript/DHTML menu?

     

QNetwork

7:05 pm on Oct 16, 2002 (gmt 0)

10+ Year Member



I recently changed my site navigation to a JavaScript & DHTML combination. Previously it had GIF images with links, and all spiders did quite well with that. I am now worried about whether the spiders (especially Google) will be able to crawl my whole site after this DHTML menu change. Otherwise, I will have to come up with some sort of sitemap pretty soon. If any of you have experience with a similar situation, please let me know. Thanks.

[Note to the moderator: I do not know whether it is okay to say "look at my profile for the example site". If it is not okay, please delete these lines.]

NeedScripts

7:24 pm on Oct 16, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think you will need a sitemap :)

jatar_k

7:26 pm on Oct 16, 2002 (gmt 0)

WebmasterWorld Administrator jatar_k is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I think you may have a bit of a problem. I am no expert, but I would say your side menu won't get followed at all.

Mark_A

8:07 pm on Oct 16, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi QNetwork, welcome. Your site has a nice look.

I think if you used DHTML menus like hierarchical ones rather than JS menus, the main link from the menu bar would be spiderable. You would then create an index page for each section with fixed links to the content. Spiders, and visitors with security set to high, would use those index pages; everyone else would use the DHTML drop-down (hierarchical) menus.

e.g. a left menu item "electronics" leads to the page /electronics/index.html (or whatever) with your content and links on it, but a mouseover creates a pop-up menu for that section.

Oops, that's not what your site does now, is it? Forget it... I'm sorry I spoke... well heck, I'm posting anyhow; it took me all of a minute to write this :-)

I am sure you could create the same effect as you have now with a more spider-friendly method.
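A minimal sketch of that idea, with purely hypothetical section names and paths: the top-level entry is an ordinary <a href> that spiders can follow, while the mouseover only toggles the visibility of the pop-up sub-menu.

```html
<!-- Top-level link is a plain <a href>, so spiders can follow it to     -->
<!-- /electronics/index.html, which carries fixed links to the section.  -->
<div onmouseover="document.getElementById('elec-sub').style.display='block'"
     onmouseout="document.getElementById('elec-sub').style.display='none'">
  <a href="/electronics/index.html">Electronics</a>
  <!-- Pop-up sub-menu, shown only on mouseover for JS-enabled visitors. -->
  <div id="elec-sub" style="display:none">
    <a href="/electronics/cameras.html">Cameras</a>
    <a href="/electronics/audio.html">Audio</a>
  </div>
</div>
```

Visitors with scripting off (and spiders) still reach /electronics/index.html and can follow the fixed links from there.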

Someone with a clue will no doubt be along shortly :-)

Jon_King

3:09 am on Oct 17, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have used DHTML menus quite a bit and have researched this topic as well... I have proved to myself through actual SE results over time that they do not get spidered. My workaround is a duplicate text-only menu I place at the bottom of each page.
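For example, the fallback could be as simple as a block of plain links at the foot of the page template (the link targets here are placeholders):

```html
<!-- Text-only duplicate of the DHTML menu, placed at the bottom of -->
<!-- every page; ordinary links that any spider can follow.         -->
<p>
  <a href="/index.html">Home</a> |
  <a href="/electronics/index.html">Electronics</a> |
  <a href="/sitemap.html">Site Map</a>
</p>
```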

QNetwork

1:36 pm on Oct 17, 2002 (gmt 0)

10+ Year Member



Thanks for all the answers. I went ahead and added a site map link at the bottom of the home page. The site map has every link in plain text.

Allergic

1:50 pm on Oct 17, 2002 (gmt 0)

10+ Year Member



There is no problem with *your* DHTML menu. BTW, you can always test it with the spider simulator here: [searchengineworld.com...] to see how a robot sees your page. With more sophisticated "software-style" DHTML menus, Google usually sees the first level of links but can't see the second level and up.

QNetwork

6:04 pm on Oct 17, 2002 (gmt 0)

10+ Year Member



Thanks Allergic. I just tried the spider simulator. It did not show any of the links from the menu. Anyway, I liked that tool.

Allergic

7:24 pm on Oct 17, 2002 (gmt 0)

10+ Year Member



Sorry for the mistake, QNetwork. I saw so many links that I assumed they were inside your DHTML menu. You should change your DHTML menu code, and also never put the JavaScript in the clear in your code. Use a link instead, like <script language=JavaScript src="your_dhtml_menu.js"></script>. By using this, you'll have to change only one file for a category addition, and it is better for the robots.
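In other words (the file name here is just Allergic's example), instead of embedding the full menu script in every page, each page carries only a one-line reference to a shared file:

```html
<!-- The menu-building code lives once, in your_dhtml_menu.js;   -->
<!-- every page includes it with a single tag. Adding a category -->
<!-- then means editing only that one file.                      -->
<script language="JavaScript" src="your_dhtml_menu.js"></script>
```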

QNetwork

8:34 pm on Oct 17, 2002 (gmt 0)

10+ Year Member



Good suggestion. I'll make the changes.

TobyCow

2:13 pm on Oct 18, 2002 (gmt 0)

10+ Year Member


I think you could try a somewhat simpler change: <NOSCRIPT> tags. Spiders may pick up the links listed there... I guess...
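A sketch of how that might look, with placeholder links; whether spiders actually follow <NOSCRIPT> links is the uncertain part:

```html
<script language="JavaScript" src="your_dhtml_menu.js"></script>
<noscript>
  <!-- Shown to browsers with scripting off, and visible in the page -->
  <!-- source to spiders that ignore the script above.               -->
  <a href="/electronics/index.html">Electronics</a>
  <a href="/sitemap.html">Site Map</a>
</noscript>
```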

Sasquatch

5:45 pm on Oct 18, 2002 (gmt 0)

10+ Year Member



It is also good to consider that a small percentage of people (along with all spiders) surf with JavaScript off, and your site has to REALLY interest them before they would turn JavaScript on.

What percentage of your visitors are you willing to lose because your menu is in Javascript?

jtoddv

6:34 pm on Oct 18, 2002 (gmt 0)

10+ Year Member



Your biggest issue is the dynamic pages.

QNetwork

6:45 pm on Oct 18, 2002 (gmt 0)

10+ Year Member



It is also good to consider that a small percentage of people (along with all spiders) surf with javascript off

I verified that the site works okay with even the "low" setting in IE. If you turn off active scripting, the menu does not appear. I have no clue how many average users go that far. You might be right about the spiders. That's why I added the site map.

QNetwork

6:47 pm on Oct 18, 2002 (gmt 0)

10+ Year Member



Your biggest issue is the dynamic pages.

Biggest issue in terms of what? Spidering? Googleguy mentioned in one of the threads that Google is spidering dynamic pages well. FAST does it pretty well too. Let me know if you think there is any other issue related to the dynamic pages.

Sasquatch

7:04 pm on Oct 18, 2002 (gmt 0)

10+ Year Member



But it doesn't work if you set it to "high".

Now that I think about it, you really don't have anything to worry about. The people who would be surfing with javascript off or security set to high would not want to go to your site anyway. They are the sort of people that would not be clicking on affiliate links. No offense intended.

It is a problem with some real store sites and manufacturer sites, though. They seem to be willing to throw out the 20% of users who do not use IE, or an even higher percentage when they start requiring lower security settings, or even Flash.

QNetwork

7:21 pm on Oct 18, 2002 (gmt 0)

10+ Year Member



Please accept my apologies. I meant "medium", not "low". Obviously it works with "low". You've put me in the thinking phase again. I got rid of the image-based menu so that the pages load faster. If that prevents users from viewing the page correctly, I do not think it's worth it. I agree with you that people with the HIGH security setting will rarely be frequent visitors of this site. Still, I do not want to miss them.

QNetwork

3:51 am on Oct 19, 2002 (gmt 0)

10+ Year Member



Thanks everyone for providing me with so many important suggestions. I changed my home page menu to image links. It looks the same. Load time is a little bit longer, but eventually it will make the site much more spider-friendly (which counts a lot). Users with HIGH security settings should not have any problem. I'll eventually change the other pages to this new image-based menu. Thanks again.

NeedScripts

3:59 am on Oct 19, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



<script language=JavaScript src="your_dhtml_menu.js"></script>

Instead of your_dhtml_menu.js, can I use [domain.com...]

Or is there any other way where I can use code that will work across different category levels?

cminblues

4:32 am on Oct 19, 2002 (gmt 0)

10+ Year Member



Use a link instead, like <script language=JavaScript src="your_dhtml_menu.js"></script>. By using this, you'll have to change only one file for a category addition, and it is better for the robots.

This is not exactly a link ;)
Perhaps because of this, I've never seen Googlebot [and I mean 'Googlebot' here..] fetching a .js file "pointed to" by a "src=".

cminblues

NeedScripts

8:51 pm on Oct 19, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So what would be a way where I can call the JavaScript code from the same subdirectory (one subdirectory under the root for all .js files), but use it everywhere on the web site (1100+ subdirectories)?
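One common approach, assuming the shared files sit in a hypothetical /js/ directory under the root: use a root-relative src, which resolves to the same file no matter how deep the including page is, so the identical tag can go on every page in all 1100+ subdirectories.

```html
<!-- Relative path: only finds the file when the page sits in the same -->
<!-- directory as the .js file.                                        -->
<script language="JavaScript" src="your_dhtml_menu.js"></script>

<!-- Root-relative path: resolves from the site root, so it works from -->
<!-- any subdirectory without modification.                            -->
<script language="JavaScript" src="/js/your_dhtml_menu.js"></script>
```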
 
