OK, so by definition what I propose is not AJAX, because it's not asynchronous in the usual sense. I would use XMLHttpRequest once, at window.onload, to load up all the extra things that don't have any content or keyword value: the logo graphic, the polls, the featured book, the ads, the footer, extraneous external links, and so on.
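To make the idea concrete, here's a rough sketch of what I mean. The `/fluff.json` endpoint and the element ids are made up for illustration, and the injection helper is factored out only so it's easy to follow:

```javascript
// Hypothetical example: after the page loads, fetch all the non-content
// "fluff" (logo, polls, ads, footer) in one request and inject it into
// placeholder elements. The URL and ids below are invented.

// Pure helper: copy each fetched HTML fragment into its placeholder.
// It takes a setter callback so the logic doesn't depend on a real DOM.
function injectFragments(fragments, setHtml) {
  for (var id in fragments) {
    if (fragments.hasOwnProperty(id)) {
      setHtml(id, fragments[id]);
    }
  }
}

// Browser wiring (runs only where window and XMLHttpRequest exist):
if (typeof window !== "undefined" && typeof XMLHttpRequest !== "undefined") {
  window.onload = function () {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/fluff.json", true); // hypothetical endpoint
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        // Response is assumed to be { "elementId": "<html fragment>", ... }
        injectFragments(JSON.parse(xhr.responseText), function (id, html) {
          document.getElementById(id).innerHTML = html;
        });
      }
    };
    xhr.send(null);
  };
}
```

A bot that doesn't run the script sees only the raw content; a browser fills in the rest after load.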
But for the purpose of this post, I'll call what I'm doing AJAX, because it employs all the same DHTML techniques.
On one hand, I can imagine that a bot visiting my page will see only the rawest content, without all the fluff, which would dramatically increase the apparent keyword density of the page. But I wonder: does the presence of an AJAX script look suspicious? Have the bots figured out, or will they eventually, that AJAX can be used to cloak?
Imagine for instance that I have a page that, to the bots, looks like an essay about the US constitution. But using AJAX, I can rewrite the whole page into a lovely spammerific advertisement for enlargement products. Since the "real" content is being loaded asynchronously, the indexed content has no resemblance to what is actually shown to the user.
Granted, that's a silly example of cloaking. But acknowledging that the technique can be used for evil, would any bot be willing to take an AJAX-enabled page at face value?
I'm only conjecturing that the mere presence of an XMLHttpRequest would - or should - look highly suspicious.
AJAX is not new anymore, and I'm surprised I've heard nothing about this being discussed. Everyone seems worried about their AJAX content being excluded from indexing as a bad thing; much less attention is being given to non-indexed AJAX as a good thing.
Any comments?
Asynchronous changes to the document have been possible ever since the advent of DOM scripting. Approaches include loading new script by creating a script element, using iframes, and, in IE, the default download behavior.
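For example, the script-element approach can be as simple as this (the filename is hypothetical):

```javascript
// Pre-XHR technique: load extra code after the fact by appending a
// script element to the document head. "extra.js" is a made-up name.
function loadScript(src) {
  var s = document.createElement("script");
  s.type = "text/javascript";
  s.src = src;
  document.getElementsByTagName("head")[0].appendChild(s);
  return s;
}
```

Once `extra.js` runs, it can rewrite any part of the page, with no XMLHttpRequest anywhere in sight.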
In many cases, and for the purposes httpwebwitch outlines, the XMLHttpRequest object doesn't necessarily make things any easier than they already are.
Unless the SE bot actually runs the JavaScript, it cannot reliably detect the instantiation of the XMLHttpRequest object. If it tries to detect it by looking for certain strings in the script, that can be sidestepped very easily with a little string manipulation:
var e = "docum" + "ent.w" + "rite('hel" + "lo worl" + "d');";
eval(e); // runs document.write('hello world'), yet the telltale string never appears in the source
There are many better ways of doing that, of course.
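For instance, the payload string can be assembled from character codes, so no recognizable substring appears anywhere in the page source. A toy sketch, not anything a real cloaker necessarily uses:

```javascript
// Toy obfuscation: rebuild a string from its character codes so the
// literal never appears in the source at all.
function decode(codes) {
  var chars = [];
  for (var i = 0; i < codes.length; i++) {
    chars.push(String.fromCharCode(codes[i]));
  }
  return chars.join("");
}

// decode([101, 118, 97, 108]) spells "eval" -- a scanner grepping the
// page source for suspicious strings finds nothing to match.
```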
Do you think the bots might someday start executing JavaScript to see what is being done to the DOM?