
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

CMS vs. Google Supplemental results
What could be the reason?

 6:21 am on Feb 28, 2007 (gmt 0)

Even after blocking all the dynamic URLs pointing to the same file through the robots.txt file, most of the CMS sites still come up in Google's Supplemental results. The actual file has been updated with the original content in Google's cache, but the Supplemental tag still does not get removed.

Has anybody had a similar problem? Is there anything specific we have to add to the CMS code to prevent this? We have used PHP with Apache URL rewriting.
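For reference, blocking dynamic URLs in robots.txt usually looks something like the sketch below; the paths and query patterns are hypothetical and would need to match your CMS's actual URL scheme (Googlebot understands the * wildcard in Disallow rules):

```
User-agent: *
# block the raw script URL so only the rewritten URLs get crawled
Disallow: /index.php?
# block any URL carrying a session-id parameter
Disallow: /*?sid=
```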




 3:09 pm on Feb 28, 2007 (gmt 0)

Once you show Google what to index and what not to index, it can take a year for the pages tagged as Supplemental to actually drop out of the index.

There is nothing more you need to do. People can still get to, and browse, your site.


 3:27 pm on Feb 28, 2007 (gmt 0)

I have a similar problem because of URL traversal.
It happens with all relative links in pages.


 3:31 pm on Feb 28, 2007 (gmt 0)

Google chaps have indicated that inbound links are what you need, and I must say, that's what I've seen.


 6:14 pm on Feb 28, 2007 (gmt 0)

most of the cms sites comes in google supplemental results

Do you mean your cms sites specifically, or are you suggesting pretty much everyone's cms sites are supplemental? If so, that's not the case.

Have you checked to see each page has a unique meta description tag?
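A quick way to check this is to pull each page's meta description and group pages that share one. A minimal sketch (the sample pages and URLs are illustrative, not from the poster's sites):

```python
import re
from collections import defaultdict

def meta_description(html):
    # Extract <meta name="description" content="..."> (case-insensitive)
    m = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return m.group(1) if m else None

def find_duplicates(pages):
    # pages: dict mapping URL -> HTML source
    seen = defaultdict(list)
    for url, html in pages.items():
        seen[meta_description(html)].append(url)
    # keep only descriptions shared by more than one URL
    return {d: urls for d, urls in seen.items() if len(urls) > 1}

pages = {
    "/a": '<meta name="description" content="Widgets for sale">',
    "/b": '<meta name="description" content="Widgets for sale">',
    "/c": '<meta name="description" content="About our company">',
}
print(find_duplicates(pages))
```

Any group that comes back with more than one URL is a candidate for the Supplemental index.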


 1:16 pm on Mar 1, 2007 (gmt 0)


I mean to say that I have checked most of the sites that come up while searching for CMS-based websites, and found pages in the Supplemental results, including 2-3 CMS-based sites of mine. I have blocked the unnecessary dynamic URLs in the robots.txt file, and each page has unique content, but the pages are still Supplemental in the index. Can we stop the server from generating session IDs in PHP? Sometimes we see the URL as static and sometimes with a session ID, so the file gets the bad impression of being a duplicate file because of the two different names.

Any solutions?
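On the session-ID question: PHP can be told never to embed session IDs in URLs, so every visit (including Googlebot's) sees the same static URL. A minimal php.ini sketch, assuming your visitors accept cookies:

```ini
; php.ini — stop PHP from appending PHPSESSID to rewritten URLs
session.use_trans_sid = 0
; accept session IDs only from cookies, never from the URL
session.use_only_cookies = 1
```

The same directives can be set per-site with php_value/php_flag lines in .htaccess if you can't edit php.ini.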


 6:14 pm on Mar 1, 2007 (gmt 0)

Most of the problems with a CMS are usually caused by using the same title and/or meta description on multiple pages, and/or having multiple URLs for the exact same content -- duplicate content.
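One common fix for the multiple-URL problem is a 301 redirect from the raw dynamic URL to its single rewritten form, so only one address per page ever gets indexed. A mod_rewrite sketch for .htaccess (the pattern and the /page/ target are hypothetical; adapt them to your own rewrite scheme):

```apacheconf
# .htaccess — send the dynamic URL permanently to its clean equivalent
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
# the trailing "?" strips the query string from the redirect target
RewriteRule ^index\.php$ /page/%1? [R=301,L]
```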

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved