This implementation requires that some JavaScript on the homepage redirect a visitor through the testing service's servers and then to PageA or PageB. We placed this code at the top of our current default homepage, with nofollow/noindex instructions to keep the spiders from following the redirects.
This past week I saw massive fluctuations in the SERPs in our sector, and once things started to recover I noticed that our homepage is no longer indexed (site:mysite.com returns 45,000 results, but no default.aspx). It appears to be the only page that has been dropped: the index still shows roughly the right number of pages, and we still appear in some SERPs, but only for subpages of our site.
Does Google not like this implementation? This is a test of visitor behavior on our site to see if one style would be better for humans than the other. It has nothing to do with gaming the engine or SEO. Would a reinclusion request be appropriate for this situation? Are there any suggestions on a change of implementation that could help?
While what you are doing is perfectly valid and legitimate, JavaScript redirects have been used for a very long time as a "poor man's" version of cloaking.
Google has gotten better at parsing JavaScript for exactly this reason: it is looking for redirects.
The sudden drop may have nothing to do with it, but I think it's more likely that the redirect got picked up by one of the spiders and action was taken.
We placed this code at the top of our current default homepage, with nofollow/noindex instructions to keep the spiders from following the redirects.
If I read this correctly (and I'm not sure I am) then it appears that G-bot is doing exactly what you're telling it to do and it has dropped your current index page.
Can you refiddle things to just include the noindex on the test page and still allow the bot to continue to index the current home page?
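Something along these lines, perhaps: take the robots meta off the homepage entirely and put the noindex only on the variant pages. A rough sketch of where the tags would go, assuming the variants live at default_a.aspx and default_b.aspx (adjust the names to your setup):

<!-- default.aspx (the real homepage): no noindex here, so it can stay in the index -->
<meta name="robots" content="index,follow">
<!-- the rotation script would follow here, unchanged -->

<!-- default_a.aspx and default_b.aspx (the test variants): keep these out of the index -->
<meta name="robots" content="noindex,nofollow">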
<meta name="Robots" content="noindex,nofollow">
<script language="JavaScript" src="http://rotating.service.com/rotate.js" type="text/JavaScript"></script>
<script language="JavaScript">
<!--
// Rotation Script
// ------------------------------------------------
// Test: Split Path Test
// Test ID:
// DO NOT MODIFY THIS CODE UNLESS INSTRUCTED BY SUPPORT
var fs = "'http://www.oursite.com/default_a.aspx'";
var qry = '<%= Request.QueryString %>'; // ASP.NET writes the current query string into the script here
if (document.images)
location.replace('http://rotating.service.com/?group=2886&failsafe=' + fs + '&' + qry);
else
location.href = 'http://rotating.service.com/?group=2886&failsafe=' + fs + '&' + qry;
//-->
</script>
<noscript>
<meta http-equiv="refresh" content="0;URL=http://www.oursite.com/default_a.aspx">
</noscript>

(URLs and titles have been changed to generic words)
Below this code is our normal homepage. I had attempted to get the robots to ignore this code and just move down to the regular page, but it appears this might not have worked as intended.
The nature of our site is such that we don't have lots of individual pages to promote through SEO; rather, we promote the homepage. Losing that page from the index has cost us, and will continue to cost us, dearly in terms of traffic.
We run on ASP + SQL. Is there another way to do this that won't cause Google to drop our homepage? Should I begin the reinclusion process now?
Googlebot doesn't care about the JavaScript; it never got past the noindex.
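If you need another way to run the test once the noindex is gone, one possible arrangement (only a sketch, and it doesn't cover whatever tracking your rotation service does for you; the cookie name and the default_b.aspx URL are just examples) is to treat the normal homepage as variant A and redirect only part of the traffic to a noindexed variant B on your own domain:

<script type="text/javascript">
// Roughly half of visitors stay on the normal homepage (variant A);
// the rest go to the alternate layout (variant B), which carries the
// noindex meta so it never enters the index. Bots that don't run
// JavaScript simply see the normal homepage.
var match = document.cookie.match(/(?:^|;\s*)splitGroup=([ab])/);
var group = match ? match[1] : (Math.random() < 0.5 ? 'a' : 'b');
document.cookie = 'splitGroup=' + group + '; path=/'; // keeps the assignment sticky for the session
if (group == 'b') {
    // same-site redirect, preserving any query string
    location.replace('http://www.oursite.com/default_b.aspx' + location.search);
}
</script>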
Back to watching
WW_Watcher
I have submitted the page to be crawled. We still have all of our backlinks to the site. Is there anything else I can do to speed up the recovery process?
You can fret, scream and holler, send e-mails, and spend all day checking datacenters; none of it will help. I would not have even bothered to re-submit the page to Google. If you have corrected your mistake, it is just a waiting game now.
Back to Watching,
WW_Watcher