| 2:39 pm on Sep 6, 2006 (gmt 0)|
You should set up a 301 redirect from the longer URL to the shorter one if both represent the same page (the same content). I would use .htaccess redirects.
Here you can find some examples
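For instance, a minimal .htaccess sketch (the domain and paths are placeholders, not taken from this thread):

```
# Redirect one old path to its canonical replacement (301 = permanent)
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Or, with mod_rewrite, strip a tracking parameter from a dynamic URL;
# the trailing "?" in the target drops the query string
RewriteEngine On
RewriteCond %{QUERY_STRING} ^ref=
RewriteRule ^page\.html$ http://www.example.com/page.html? [R=301,L]
```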
Best of luck,
| 7:38 pm on Sep 6, 2006 (gmt 0)|
I read the other posting but still don't understand how to write the redirect. Can you or anyone give an example of how to write this?
I want to redirect
| 10:03 pm on Sep 6, 2006 (gmt 0)|
Far easier is to modify the script so that it detects what URL was requested and simply adds a <meta name="robots" content="noindex"> tag on all versions that you do not want indexed.
Use the noindex tag on alternative URLs where there are parameter differences, and use the 301 redirect where non-www URLs and/or alternative domains are the problem.
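As a sketch of the noindex approach, assuming a Python CGI script and treating any query parameters as marking a duplicate view (both are assumptions, not details from the thread):

```python
import os

def robots_meta(query_string):
    """Return a noindex meta tag for parameter-bearing (duplicate) URLs,
    or an empty string for the canonical, parameter-free URL."""
    if query_string:  # any query parameters -> treat as a duplicate view
        return '<meta name="robots" content="noindex">'
    return ''

# A CGI script sees the requested URL's parameters in QUERY_STRING,
# so the tag can be emitted (or not) into the page's <head>:
tag = robots_meta(os.environ.get('QUERY_STRING', ''))
```

The same idea works in any language the script is written in; the key point is that the script, not .htaccess, decides per-URL whether to emit the tag.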
| 10:02 am on Sep 7, 2006 (gmt 0)|
I need your help or I may be an unemployed webmaster soon. :-(
The problem is:
10 sites selling widgets in 10 different cities across Europe.
The info in the templates is different from site to site, although they describe items that are very similar; the titles and descriptions are original and unique.
But the code for all the templates is the same (hyperlinks, tables, pictures, the layout in general).
The websites share the same IP on a dedicated server.
The question: is there a possibility that Google may filter or penalize me for having identical code across those websites?
Any help would be very much appreciated!
| 12:14 pm on Sep 7, 2006 (gmt 0)|
It should definitely be OK if the similarity is only in the CODE and not in the products/text content.
| 4:43 pm on Sep 7, 2006 (gmt 0)|
I have already checked with the writers of my script and they told me there is no way to modify it to block the duplicate URLs.
Would it be possible to make a robots file that only blocks URLs that contain "dir.cgi", since that is the only major difference in the duplicate URLs? If so, how would I write it?
| 4:57 pm on Sep 7, 2006 (gmt 0)|
Would this in my robots.txt work to block a specific cgi page?
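For reference, a robots.txt rule of the kind being discussed might look like this (the path is an assumption; Disallow rules match URL prefixes):

```
User-agent: *
Disallow: /cgi-bin/dirs.cgi
```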
| 4:58 pm on Sep 7, 2006 (gmt 0)|
Not 100% sure.
| 5:41 pm on Sep 7, 2006 (gmt 0)|
" I read the other posting but still don't understand how to write the redirect. Can you or anyone give a example of how to write this?
I want to redirect
It can be done but you really don't want to invoke yet another script or have a very large .htaccess file.
Where it gets done is inside of dirs2.cgi.
You can do the redirect there or manipulate the meta tags emitted by the script to include noindex,follow,nocache ... etc..
What follows is the heart of a redirector; of course, you need to provide all of the logic to construct the URL that gets put into $location.
Status: 301 Moved Permanently
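Fleshed out as a minimal Python CGI sketch (the target URL here is a placeholder; as noted, the logic for building $location is up to you):

```python
import sys

def redirect_headers(location):
    """Build the CGI response headers for a permanent redirect.
    The blank line after the headers ends the CGI header section."""
    return ("Status: 301 Moved Permanently\r\n"
            "Location: %s\r\n"
            "\r\n" % location)

# You must construct the canonical URL yourself, e.g. from the
# requested path and query string; this value is a placeholder.
location = "http://www.example.com/canonical-page.html"
sys.stdout.write(redirect_headers(location))
```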
| 5:58 pm on Sep 7, 2006 (gmt 0)|
That will result in Google not indexing any of the dirs2 stuff.
However, if Google finds a link to any /cgi-bin/dirs2 URL, it will create a URL-only listing in its index.
Please note that blocking the bot also blocks following any of the links within the blocked files.
Use with extreme caution: you can easily block things you really didn't want to, and you will lose part of your internal link structure and any external link credit (that is, if any other site linked to your dirs2 URLs) for IBLs to those dynamic pages.
[edited by: theBear at 5:59 pm (utc) on Sep. 7, 2006]
| 6:51 pm on Sep 7, 2006 (gmt 0)|
The one I want to block is "dirs.cgi". So if I use this robots.txt it will block dirs.cgi and leave dirs2.cgi alone, correct?
| 6:57 pm on Sep 7, 2006 (gmt 0)|
If you want to block the dirs.cgi, and Google doesn't stumble on the period when parsing, that should work.
I know what happens with the other one because I'm blocking several routines in some forum software that way.
Partial match blocking can have unintended fallout.
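To illustrate the partial-match fallout: Disallow rules match by prefix, so a shortened rule like this hypothetical one would block both scripts at once:

```
User-agent: *
# /cgi-bin/dirs is a prefix of both dirs.cgi and dirs2.cgi,
# so this single rule blocks both scripts
Disallow: /cgi-bin/dirs
```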
And that also means that the redirector goes inside the dirs.cgi script if you choose to go that route.
[edited by: theBear at 7:01 pm (utc) on Sep. 7, 2006]
| 6:59 pm on Sep 7, 2006 (gmt 0)|
>> I have already checked with the writers of my script and they told me there is no way to modify it to block the duplicate URLs. <<
They are wrong. There is a way. It's just that they are too lazy to actually do it. A script can be modified to do anything you want it to.
Tell them that they can do it, or that you can outsource the work, and their attitude might change a tad.
| 7:02 pm on Sep 7, 2006 (gmt 0)|
g1smd, I didn't want to say that but I agree.
| 9:00 pm on Sep 7, 2006 (gmt 0)|
I kind of thought they were just being lazy, but I didn't know for sure. They just keep telling me to change over to the static version and I won't have a duplicate content issue. I keep telling them that Google already indexed the dynamic content years ago and changing everything would have a devastating effect on my traffic. They tell me Google would pick up the new static URLs very fast and it would have little effect on my traffic, but I don't believe them.
I will have another chat with them and see if they can change it or place a noindex tag in the duplicate URLs. I paid good money for the script, and I don't think it should have had a duplicate content issue in the first place.
| 9:04 pm on Sep 7, 2006 (gmt 0)|
Google will pick up the new URLs within weeks, but the old URLs will count as yet more duplicate content. You really want to completely avoid having that happen.
You'll also lose all your backlink credit to established URLs, and you will lose any "age" credit attached to the old URLs. The old URLs will also continue to appear as Supplemental Results for a full year after you make the move.
If they really understood how vitally important the footprint you leave in a search engine's index is to your rankings, they would not be arguing with you at all.