Forum Moderators: open
For a long time, my pages were showing as URL-only listings, with no title & description. Now they show the title and descriptions correctly, but with a Supplemental Result tag.
I understand you cannot rank at all once pages carry the Supplemental tag.
They are unique pages with unique content on my site. Since I use a template that includes the same meta tags for every page in a category, it is possible that Google thinks all the pages in a category are the same, and thus Supplemental.
Is there any way that I can get rid of the supplemental tags?
Since March, I have lost about 70% of my Google traffic due to the "No Title & No Description" problem.
And out of the 700+ pages indexed, 90% are supplemental.
Thanks for help in advance.
Check your pages with the Server Header checker. If they are served as 302 Found instead of 200 OK then that could be the reason for the Supplemental Results tags.
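For anyone without a header-checker tool handy, here is a minimal sketch of the same check in Python (the host and path are placeholders, not the poster's actual site):

```python
import http.client

def is_ok(status):
    # Only a 200 means the page was served directly;
    # 301/302 means the server redirected instead.
    return status == 200

def fetch_status(host, path="/"):
    # Plain GET over http.client, which does NOT follow redirects,
    # so a 302 Found is reported as-is rather than silently resolved.
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", path, headers={"User-Agent": "header-check"})
    status = conn.getresponse().status
    conn.close()
    return status

# Example: print(fetch_status("example.com", "/some/deep/page.asp"))
```

If `is_ok(fetch_status(...))` comes back False on your deep pages, that 302 could be what's triggering the Supplemental tag, per the advice above.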
thanks,
Something interesting happened lately though, googlebot has been attempting to crawl these pages every day for the past week. Each page returns a 404 error. My hope is that they will eventually get removed.
I put <meta NAME="robots" CONTENT="noarchive">
on all of my existing pages to hopefully avoid future problems. I also started going to Google every day and manually clicking on the supplemental indexed pages, hoping Google would take notice of the 404 errors and remove the pages. The next thing I did was go to the Google dead-link automated removal page.
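One hedged note on that tag: as far as I know, noarchive only stops Google from showing a cached copy; it does not keep a page out of the index. If the goal is to keep a page from being indexed at all, the tag would be:

```html
<meta name="robots" content="noindex">
```

The 404 approach described above targets removal of already-dead URLs, which is a different mechanism from either meta tag.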
[google.com...]
It might have been coincidence, but shortly after I started these tactics, googlebot started visiting these pages. The interesting thing is that Google is visiting hundreds of supplemental index pages, including many that were never entered into the automated removal tool. However, as of today, my supplemental pages are still in the index.
Since the lowest-level pages are resource description pages with internal/external links, most of them share the same content from the template: top category menus, left sub-menus, and above all the fairly large markup generated by the ASP code. Each page is around 30KB, with usually only 2-4KB of unique content. So about 90% of the code is almost the same for every page across all the categories. However, the unique content is those 2-4KB of descriptions and links.
I guess Google doesn't like seeing only a 10% difference among those hundreds of pages. That is just how the site is structured, though.
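To put a rough number on that overlap, here is a quick sketch using Python's difflib (the pages below are synthetic stand-ins; how Google actually measures duplication is unknown and surely different):

```python
from difflib import SequenceMatcher

def shared_ratio(page_a, page_b):
    # Returns a 0.0-1.0 similarity score; values near 1.0 mean the
    # two pages are mostly identical template markup.
    return SequenceMatcher(None, page_a, page_b).ratio()

# Two pages sharing a large template with small unique bodies:
# template = "HEADER NAV MENU " * 40
# shared_ratio(template + "unique alpha text",
#              template + "different beta words")  # close to 1.0
```

Running this over your own generated HTML would tell you whether the 90%-boilerplate estimate holds for the pages Google marked Supplemental.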
I've had to deal with one personally, and I understand it's something to do with the configuration of the ISAPI filter.
Incidentally, I've tried to use the Mambo CMS and had to uninstall it because the same phenomenon was occurring - the same pages showing up under different URLs when accessed from different places in the site - and that was even with search-engine-friendly URLs.
It's a technical issue related to dynamic sites, it's quite a serious issue, and there hasn't been a definitive answer to it yet.
The site is structured as four main categories. Each main category has about 6-100 sub categories. Each subcategory has about 20-50 pages. In fact there are more than 2000 pages on the site. Only 700+ pages are indexed on Google.
I would guess, though, that if your site is working like mine, the front page and all the category and subcategory pages are indexed and doing just fine, and it is mainly the lowest-level pages that are suffering.
Try looking at the low-level pages that take the smallest number of links to reach - ones in a category containing only six subcategories, with as few other pages as possible in that subcategory. I'm willing to bet that some of those pages will be indexed, or will at least be the supplemental ones. As for pages in the bigger categories, they will be the ones missing altogether.
That is what is happening with my site anyway; hope it makes some sort of sense. I'm struggling to explain things clearly!
The only recommendation I can make is to try to get more links with a bit of PageRank going to your subcategory pages.
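The "number of links to get to them" idea above is essentially click depth from the home page, which you can compute yourself with a breadth-first search. A minimal sketch over a hypothetical link graph (the page names are made up):

```python
from collections import deque

def click_depth(links, home):
    # links: dict mapping each page to the list of pages it links to.
    # Returns the minimum number of clicks from `home` to each
    # reachable page (home itself is depth 0).
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth
```

Pages that come out with a high depth (or are missing from the result entirely because nothing links to them) are the ones to check against the supplemental list.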
In my case, all of the pages have the same URLs regardless of the referring pages. They are not SE-friendly URLs, I admit (a few parameters for the page ID, plus the title with "+" signs within the URL). However, the SE-unfriendly URLs shouldn't be the issue here, because the pages are crawled and indexed OK. (They are just SUPPLEMENTAL and not ranking at all.)
Since the URLs include the page ID and title of the page, they are quite unique.
What I did to help was create a huge (100KB) site map with as many links to the deeper-level pages as possible. Since every page links to the site map from the menu, all the deeper-level pages are at most two clicks away from the home page.
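A sketch of how such a flat site map page could be generated straight from the database records (the URL pattern, parameter names, and function here are assumptions for illustration, not the poster's actual ASP code):

```python
def sitemap_links(pages):
    # pages: list of (page_id, title) tuples pulled from the database.
    # Emits one anchor per page, mimicking the "+"-joined title URLs
    # described above, so every deep page gets a direct link.
    lines = []
    for page_id, title in pages:
        slug = title.replace(" ", "+")
        lines.append(
            f'<a href="/page.asp?id={page_id}&title={slug}">{title}</a>'
        )
    return "\n".join(lines)

# Example: sitemap_links([(42, "Acme Widget 3000")])
```

Regenerating the site map whenever pages are added keeps the two-clicks-from-home property without any manual editing.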
BTW, the home page is only PR4 now; it used to be PR5 before (I think March 2004). From the recent backlink update, the site gained a few more backlinks, but I guess they are not counted in the PR calculation (yet).
And olias: yes, I can say it is similar to your problem. My deeper-level pages are naturally more targeted for the search engines, because they carry brand names and model numbers for each product resource/help page. If you type in "brand name + model number + keyword", my site comes up first. Since the no-title & no-description problem started, those deeper pages lost their rankings because, according to the spiders, they have no description (and no content). Now they show as supplemental. I am not suffering any further decrease in Google traffic beyond what happened in March 2004; for me, the same problem just changed its name from No Title Syndrome to Supplemental Syndrome.
I just looked into the 700 indexed pages carefully, and ALL THE DEEP-LEVEL PAGES PLUS SUBCATEGORY PAGES (yes, subcategory pages too) ARE SUPPLEMENTAL. So basically I have only 20-30 pages out of 700 indexed pages (from 2000+ actual pages) that count for Google. Sigh.
Since these are DB-driven pages built from the template, I cannot reduce the redundant and unnecessary code.
They used to rank fine until March...