Here is the problem: I am using a gateway. When a user visits my site and clicks an image, he is first sent to a gateway page, where he has to download a small app before being forwarded to the image. But Googlebot got trapped in the gateway, so every page looked the same to it (it thought all my image pages were the gateway page), and it started dropping them as duplicate content.
So I have lost thousands of pages that had my images indexed.
The page calls the gateway from an include:
<?php
include("gateway.php"); // this is what emits the gateway script tag
$BASE_DIR = ".";
$BASE_URL = ".";
require_once("lib/common.php");
page("item");
?>
And gateway.php had this on it:
<script type="text/javascript" src="http://www.somesite.com/gateway.aspx?productid=4243&amp;val=34505207"></script>
Now, if I get rid of that include and just put the JavaScript code directly on my page, will the spider be able to continue on to the image page? I have read that spiders don't follow JavaScript, so it shouldn't get stuck anymore. Am I right?
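For reference, here is roughly what I understand the gateway script to be doing on the client side. This is only a sketch of my assumption (the function name and the image URL are hypothetical; the real gateway.aspx code isn't mine), but it illustrates why a crawler that doesn't execute JavaScript would never reach the redirect:

```javascript
// Hypothetical sketch of what I assume the gateway script does:
// build the target image URL, then redirect the browser to it.
// A spider that only reads the HTML sees the <script src="..."> tag
// but never runs this code, so it never follows the redirect.
function gatewayTarget(productId, val) {
  // hypothetical target URL format; the real one comes from gateway.aspx
  return "http://www.somesite.com/image.aspx?productid=" + productId + "&val=" + val;
}

// In a real browser the script would then do something like:
//   window.location.href = gatewayTarget(4243, 34505207);
console.log(gatewayTarget(4243, 34505207));
```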
So, if I do that simple change and use this:
<script type="text/javascript" src="http://www.somesite.com/gateway.aspx?productid=4243&amp;val=34505207"></script>
<?php
$BASE_DIR = ".";
$BASE_URL = ".";
require_once("lib/common.php");
page("item");
?>
Will the bot be able to crawl the pages as it did before I put up the gateway?
Please help, I am trying to fix this horrible mess.
Thanks