is my jscript menu annoying google?
Still, if you want the internal URLs of a site to be frequently spidered, and most of all to pass on backlink influence of all kinds, then currently those links do need to be "vanilla HTML" -- <a href="[the target url]">
Fix your problem by adding an HTML menu at the bottom of your pages. Search engines can spider it and follow your URLs. Also make sure you have a comprehensive sitemap installed in the root directory.
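For example, a plain "vanilla HTML" footer menu that any spider can follow might look like this (the page names below are made up for illustration):

```html
<!-- Plain HTML footer menu: ordinary anchor tags, no JS required.
     The page names are hypothetical examples. -->
<div id="footer-menu">
  <a href="/index.html">Home</a>
  <a href="/products.html">Products</a>
  <a href="/articles.html">Articles</a>
  <a href="/sitemap.html">Site map</a>
</div>
```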
Thanks for your replies. There are already plenty of other ways on my site to access the content. However, the menu is more of a search tool -- it's not just links.
It's just very strange that when the page count went up to around 6,000, every page was cached before I had the new menu in place.
Tedster >>> Not sure I understood -- do you mean that some JS links can be viewed and followed by Google?
I am not too technical; do you by any chance have a sample of JS code that could be read by Google?
Is there a problem if I "hide" content and display it when the surfer clicks on a certain button? For example, I have the following code in JS:
document.getElementById('commentformcontainer').innerHTML="Blah, blah blah, the content I want to display is here";
and I have an anchor with the id showcommentsform. When the surfer clicks it, the content appears in the div container named commentformcontainer.
Is this ok for Google?
|you mean that some JS links can be viewed and followed by Google? |
There is a phenomenon that I've heard called "URL Hunger". During the index size wars between the major search engines it sometimes went to the extreme. For example:
Googlebot will try to spider any character string it picks up that looks like a URL -- one that has certain characteristics, such as beginning with the http: protocol, and so on. This doesn't mean Google will treat that URL as a link on the page -- it's only a "real" link if it's in an anchor tag, etc, etc.
Let's assume that googlebot sees an on-page script, and somewhere within the script there is the line: location.href="http://www.example.com/funkypage.asp". Googlebot will often put that "we-hope-it's-a-URL" into its crawl queue, if it didn't already have it from some other source.
Unless that URL also shows up in an HTML anchor tag, its occurrence in the script will not influence ranking calculations the way a "regular" link does. But the "we-hope-it's-a-URL" may get crawled, and if the server resolves it, that URL may get into the Google index in some way. It may not stick if it isn't supported by other links somewhere on the web, but it often gets an audition.
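A rough sketch of that harvesting behaviour, assuming the bot simply pattern-matches URL-like strings inside script text (the function name and regex here are my own illustration, not Google's actual code):

```javascript
// Hypothetical sketch: pull "we-hope-it's-a-URL" strings out of script text,
// the way an index-hungry bot might. Illustration only, not Google's code.
function harvestUrls(scriptText) {
  // Match anything starting with http:// or https://, up to a quote,
  // whitespace character, or angle bracket.
  return scriptText.match(/https?:\/\/[^\s"'<>]+/g) || [];
}

const script = 'location.href = "http://www.example.com/funkypage.asp";';
console.log(harvestUrls(script));
// -> [ 'http://www.example.com/funkypage.asp' ]
```

The harvested string goes into a crawl queue as a candidate; as tedster says, it is not counted as a link the way an anchor tag is.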
|Is there a problem if I "hide" content and display it when surfer clicks... |
Show/hide scripts for changing a div's visibility are not a problem -- as long as that div is right there as part of the regular HTML document. The criterion isn't the script; it's the way the div content is written.
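To illustrate that distinction, here's a toy sketch, assuming the spider reads only the raw HTML source and never executes scripts (the helper function and both page strings are invented for this example):

```javascript
// Toy model: a spider "sees" the raw HTML source with scripts ignored.
// visibleSource and the two page strings below are hypothetical.
function visibleSource(html) {
  // Drop script blocks; a real spider parses properly, but the effect is similar.
  return html.replace(/<script[\s\S]*?<\/script>/gi, '');
}

// Content present in the document itself -- the show/hide case that's fine:
const hiddenDivPage =
  '<div id="commentformcontainer" style="display:none">All the comments text</div>';

// Content injected only by script -- nothing in the source for the spider to read:
const injectedPage =
  '<div id="commentformcontainer"></div>' +
  '<script>document.getElementById("commentformcontainer")' +
  '.innerHTML = "All the comments text";</script>';

console.log(visibleSource(hiddenDivPage).includes('All the comments text')); // true
console.log(visibleSource(injectedPage).includes('All the comments text'));  // false
```

So a div that is hidden and later shown is spiderable, while text that exists only inside an innerHTML assignment is not part of the document the spider reads.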
thanks for the answer.
So I basically have this in the page:
Is this ok?
I recently added AJAX scripting to my site that performs the same function: replacing the HTML within a hidden DIV. It seems to have affected my SERPs on Sept 15 as described in this thread [webmasterworld.com], though the effect may be due to other factors. My "empty" DIVs actually contain a link to a full-page version of the submenu (a partial site map), so that if AJAX fails the user can still get to the links.
<div id="submenuID"><a href="smID.html">Partial sitemap</a></div>
Has anyone else using this method suddenly developed SERP problems on Sept 15? If so, please post.
Based on your post, there seems to be an issue with dynamic content displaying on the page. I don't think it should be a problem, because it is part of the whole mechanism.
Anyone got any clue?
>> Tedster : Thanks for the info mate, I got it now :)