Forum Moderators: Robert Charlton & goodroi
We don't use JavaScript to scroll the page when the first occurrence of the search terms is below the fold, although technically that could also be done. And Google loves this site - no problems whatsoever.
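For anyone curious what that "scroll the page" option would look like: here's a minimal sketch, not the poster's actual setup. The function names and the idea of walking text nodes are my own assumptions about one way it could be done.

```javascript
// Pure helper: index of the first case-insensitive match of term in text,
// or -1 if absent. Kept separate so it's easy to test outside a browser.
function firstMatchIndex(text, term) {
  return text.toLowerCase().indexOf(term.toLowerCase());
}

// Browser-only wiring (hypothetical): walk the page's text nodes and
// scroll the element containing the first match into view.
function scrollToTerm(term) {
  var walker = document.createTreeWalker(document.body, NodeFilter.SHOW_TEXT);
  var node;
  while ((node = walker.nextNode())) {
    if (firstMatchIndex(node.nodeValue, term) !== -1) {
      node.parentElement.scrollIntoView();
      return true;
    }
  }
  return false;
}
```

You'd still need to extract the search term (e.g. from the referrer), which is the fragile part and why many sites skip this entirely.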
Wow. Can you be more specific? Is this script commercially available?
And if the wrong page is being indexed, that's another issue altogether.
Surfers are looking for precise search terms, and they are impatient - who can blame them? If a search engine just leads them to a webpage, they are left to search through that page to find exactly what they want, and in most cases, if it's not immediately visible, they'll bounce out and try somewhere else. That's not good for us, and it's not good for the search engine. IMO we need some way of presenting exactly what they are looking for, right in front of them.
Now there are probably other supportive factors involved, especially the site architecture. But certainly there's not a problem created by this script.
Would the effects of your script be visible to a search engine? Presumably G would just spider the site in the normal way, with the script as a separate entity (and presumably with the permissions not set to readable)?
Using such a script allows a lot of content to be placed on one url in a user-friendly fashion - and I preferred show/hide divs to scrolling through 4 or 5 windows' worth of content. We considered informational pop-ups or iframes, but both of those approaches made the CMS we'd committed to quite problematic to use, so we decided that show/hide divs was the way to go.
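For readers who haven't built this: the show/hide div pattern is usually just a toggle of the CSS display property. A minimal sketch, with hypothetical element ids (not taken from the poster's site):

```javascript
// Pure helper: given a CSS display value, return the toggled one.
function nextDisplay(current) {
  return current === "none" ? "block" : "none";
}

// Browser-only wiring: toggle a content section, e.g. from an onclick
// on its heading. The id "product-details" is illustrative.
function toggleSection(id) {
  var el = document.getElementById(id);
  el.style.display = nextDisplay(el.style.display || "none");
}
```

Note that the hidden content is still in the HTML source, which is why the thread's question about what a spider sees is a fair one: Googlebot fetches the markup, not the on-screen state.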
When the -950 penalty first popped up, and as I began to suspect that high co-occurrence levels might trigger that problem, I did get concerned about how Google's algo might see this site. After all, many of these products have thousands of words on one url - that might boost co-occurrence.
But instead of content length being a problem, it looks like this is an asset. It certainly was created to be an asset for the user, but you know you can never be sure with an automated algorithm. Perhaps the co-occurrence tolerance is weighted by total word count in some way.