I have a site that loads and paginates its product listings with AJAX calls, so there's no way spiders are reaching the product level with a normal crawl. Is there an on-page workaround I can use to improve spiderability? I know I can use an XML sitemap, but I'm looking for other ideas.
A partial workaround is to put links to the AJAX content inside noscript tags. The drawback is that the bare AJAX fragment pages get indexed, so if a visitor lands on one directly you need a small script to restore the fragment to its intended context.
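One way to restore context is a tiny script on each fragment page that bounces direct visitors to the canonical full page. This is only a sketch: it assumes a hypothetical URL scheme where fragments live under `/fragments/` and the full pages at the same path without that prefix.

```javascript
// Assumed (hypothetical) URL scheme:
//   fragment:  /fragments/products/page/2
//   full page: /products/page/2
// Maps a bare fragment URL back to the full page that normally hosts it.
function canonicalUrlForFragment(fragmentPath) {
  const match = fragmentPath.match(/^\/fragments(\/.+)$/);
  // Non-fragment paths are returned unchanged.
  return match ? match[1] : fragmentPath;
}

// In the fragment page itself (browser context) you would then do:
// if (location.pathname.startsWith('/fragments/')) {
//   location.replace(canonicalUrlForFragment(location.pathname));
// }
```

The redirect runs only when the fragment is loaded as a standalone page; when the same HTML is fetched via AJAX, no script in it executes in that way, so normal pagination is unaffected.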
Don't mix js handlers into the HTML. The only hook should be an onready call, and even that I'd prefer to see as a single function call at the end of the HTML, so you don't bloat the actual content of the page with js code.
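A minimal sketch of that separation: all behavior is registered in one external script, and the page's only inline hook is a single call at the end of the body, e.g. `<script>App.init();</script>`. `createApp`, `onReady`, and `init` are illustrative names here, not a real library.

```javascript
// External script (e.g. app.js): behavior is queued here, never inline.
function createApp() {
  const handlers = [];
  return {
    // Register behavior: event wiring, AJAX pagination, etc.
    onReady(fn) { handlers.push(fn); },
    // The one inline call at the end of the HTML runs everything queued.
    init() { handlers.forEach(fn => fn()); }
  };
}
```

Because the markup contains no handlers, the same HTML is served to crawlers and no-js users without any dead attributes in it.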
The framework should be modular: if js is off, the server generates the full page when a link is clicked; otherwise the AJAX call retrieves just the HTML fragment needed to update the page.
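On the server side this can be one route that branches on how it was called. A common convention (used by jQuery and others) is the `X-Requested-With: XMLHttpRequest` header; the sketch below assumes that convention, and `renderProductList` plus its parameters are illustrative, not a specific framework's API.

```javascript
// One route, two renderings: the AJAX path returns only the fragment,
// the normal (no-js / crawler) path wraps it in the full page.
function renderProductList(headers, fragmentHtml, renderFullPage) {
  const isAjax =
    (headers['x-requested-with'] || '').toLowerCase() === 'xmlhttprequest';
  return isAjax ? fragmentHtml : renderFullPage(fragmentHtml);
}
```

Since both branches share the same fragment-rendering code, the content crawlers see is byte-for-byte what js users get, just with the page chrome around it.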
This is called progressive enhancement, and it ensures that your site is accessible, crawlable, and indexable.