Homersim, we don't penalize for JavaScript, but we might not be able to crawl a site as deeply. Some things to consider:
- you might want to simplify your JavaScript. Google can often extract URLs from JavaScript, but I can believe that doing utterly weird stuff might mess things up.
- you might want to remove the JavaScript. Most search engine spiders can't handle JavaScript, so for optimal crawling you might want to convert the links to static text links.
- you might want to add easier links to crawl. So you could keep the JavaScript links, but add new text links, or add a site map or something that allows spiders to reach pages without going through JavaScript.
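As a sketch of that last option, here's the difference between a link only a browser can follow and one that works for both spiders and JavaScript (the URLs and the openPage function are made up for illustration):

```html
<!-- Spider-hostile: the URL is buried inside script, so most crawlers never see it -->
<a href="javascript:openPage('products')">Products</a>

<!-- Crawler-friendly: a plain href that your script can still intercept in browsers -->
<a href="/products.html" onclick="openPage('products'); return false;">Products</a>
```

The second form degrades gracefully: spiders and users without JavaScript follow the ordinary href, while JavaScript-enabled browsers run your handler instead.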
If I had to guess, I'd say that it's a good idea to tackle the JavaScript and give the spiders some help, but that might not be the main issue. If you look at your logs, was the site up the whole time? If you see a big gap with no visitors, the site might have been down for a while.
P.S. No penalties on the site in your profile, so that's not it, either.
Personally, I doubt that searchers are looking for js files. I would like to stop them from being indexed, but the robots.txt standard doesn't seem to allow me to do so. I would like to disallow *.js, but from reading the standard it doesn't look like I can. I'm open to suggestions, but I'd like to keep my js files near where they are needed rather than in separate directories.
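For what it's worth, the original robots.txt standard only matches Disallow paths by prefix, so there's no portable way to block by extension. Googlebot, however, honors a wildcard extension: * matches any run of characters and $ anchors the end of the URL. A minimal sketch, assuming the scripts all end in .js:

```
User-agent: Googlebot
Disallow: /*.js$
```

Spiders that only implement the original standard will ignore (or misread) the wildcard rule, so this is a best-effort measure for the crawlers that support it, not a universal block.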
Kaled.