Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Different URLs for same page in navbars.is this bad?


ichthyous

6:16 pm on Sep 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi there, I am building a new photo gallery website. I am using an app called Gallery, which has a URL rewrite module that lets you set each page to a unique URL. The problem is that the "pretty" URLs aren't used in every case. For example: each album has a thumbnail and a title, both linking to the album. The title link uses the clean URL, while the thumbnail appends "?g2_enterAlbum=1" to the end of the URL.

If an album has multiple pages there are two navbars at the bottom of the page: a page-number navbar (i.e. page 1 2 3 4...20) and a <previous next> navbar. The previous/next navbar uses the clean URLs while the page-number navbar uses the dynamic URL. I am wondering if all this will wreak havoc with the site's indexing. Thanks for any advice!

g1smd

7:41 pm on Sep 17, 2006 (gmt 0)




Yes, it's the same sort of duplicate content nightmare that most forums (vBulletin, phpBB, etc.), most carts, and most CMS packages inflict on the web. It does need fixing, by modifying the script.

Get the script to test the requested URL and for all non-canonical versions of the base URL for each page make sure that it adds a <meta name="robots" content="noindex"> tag to the page.

ichthyous

8:03 pm on Sep 17, 2006 (gmt 0)




Thanks for your reply. The app doesn't allow you to modify meta tags on a per-page basis... it just grabs the info you enter for the photo title and makes it the page's meta title, then the photo description becomes the page's meta description, etc. There is no way to add custom metas, which is a shortcoming of this app I think. Since some images fall into multiple categories I linked them and changed each page's URL to something unique. I also tried to alter the description, etc. as much as I could to avoid triggering dup penalties. However, there is as yet no way to exclude robots on a page-by-page basis.

g1smd

8:42 pm on Sep 17, 2006 (gmt 0)




Yes, it is the scripting itself that needs the modification. If it uses PHP, or similar, then the modifications are just a few lines of code to detect what was in [THE_REQUEST] and add an extra meta robots noindex tag if certain conditions are met.
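The modification described above could be sketched roughly as follows. This is an illustrative example only, not actual Gallery code: the helper function name is hypothetical, and it uses the simplest possible condition (any query string at all marks the URL as non-canonical, since the clean URLs here carry no query string).

```php
<?php
// Hypothetical helper for a page template (illustrative only):
// emit a robots noindex meta tag when the requested URL is a
// non-canonical variant, i.e. it carries a query string such as
// ?g2_enterAlbum=1. Clean "pretty" URLs have no query string.
function robots_meta_for_request(string $queryString): string
{
    if ($queryString !== '') {
        // Non-canonical URL: ask engines to keep it out of the index.
        return '<meta name="robots" content="noindex">';
    }
    // Canonical URL: no tag needed.
    return '';
}

// In the template's <head> section you would then call something like:
echo robots_meta_for_request($_SERVER['QUERY_STRING'] ?? '');
```

A real implementation would likely test for the specific parameter names the app generates rather than any query string at all.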

ichthyous

9:21 pm on Sep 17, 2006 (gmt 0)




I see... It is PHP code. I can propose this as a new feature request for the app, or see if any of the Gallery developers know how to do it. Thanks!

ichthyous

4:48 pm on Sep 18, 2006 (gmt 0)




g1smd... I have been reading the post regarding handling dupe content issues... very informative! I see now that I can also just use 301 redirects to redirect some of the dupe URLs. The problem comes in with each individual image. There are hundreds of them, and each one is linked to by two URLs:

/photo-page-name.html
/photo-page-name.html?g2_enterAlbum=0

I am wondering whether just having that extra string after the ? is enough to cause problems? If I have to add a 301 redirect for each image then the .htaccess file will be huge!

g1smd

7:40 pm on Sep 18, 2006 (gmt 0)




You would use a redirect that uses a regular expression to strip off the extra unwanted query string, so one rule covers every image page.

Note that a RewriteRule pattern never sees the query string, so you need a RewriteCond testing %{QUERY_STRING} first, and a trailing ? on the target to strip the query string from the redirect. In .htaccess it would be something like:

RewriteCond %{QUERY_STRING} ^g2_enterAlbum=
RewriteRule ^(.*)$ http://www.domain.com/$1? [R=301,L]

I am not so sure about the exact syntax, though.

That's a question for the Apache forum here at WebmasterWorld :-)

ichthyous

10:45 pm on Sep 18, 2006 (gmt 0)




Oh I see, you're saying to just write a customized rule to handle all occurrences of the added string... much smarter than adding it line by line. I don't know how to write this myself but I'll poke around the Apache forum... thanks!

g1smd

10:52 pm on Sep 18, 2006 (gmt 0)




I'm a novice with that rewrite stuff.

I get by by taking a guess and tweaking it until it works.

I do that on a development server so that real sites don't get taken down if the rules crash the server (they can, and often do, while I am testing).

.

It's heartening to know that even Danny Sullivan [daggle.com] finds this stuff difficult. However, I would not have done it like that!