Forum Moderators: Robert Charlton & goodroi
The remaining pages have been indexed, but we now get "In order to show you the most relevant results, we have omitted some entries very similar to the 'x' already displayed."
Why would this happen?
The pages are all 100% handwritten unique content with different titles and descriptions.
To get more pages out of the Supplemental index and into the main one, you need more trusted links. The best way to do this is to get quality links directly to the pages that are Supplemental. This will not only help them individually but also help your site as a whole.
How much unique content, compared to the code load?
Take a look at the source code of an 'average page'
Could someone go into this a bit further? Vanessa Fox said to not worry about the code in a recent interview. What is a safe ratio of code to content, and how long does it take to notice changes once you clean up the code?
For instance, if you start with a page that's 24KB and, without removing any of the content, clean up the code to bring the HTML down to around 10KB, is that a good indication that your site might be suffering due to too much unnecessary code?
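Nobody has published an official threshold, but you can at least measure the ratio yourself and track it as you clean up. A minimal sketch using only the standard library (the sample page and the ratio function are illustrative, not anything Google has confirmed it uses):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, ignoring the contents of script and style tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def content_ratio(html: str) -> float:
    """Bytes of visible text divided by total bytes of markup."""
    parser = TextExtractor()
    parser.feed(html)
    # Collapse runs of whitespace so indentation doesn't count as "content"
    text = " ".join("".join(parser.chunks).split())
    return len(text.encode()) / len(html.encode())

# A toy page: 20 bytes of copy inside 95 bytes of markup
page = "<html><head><style>body{margin:0}</style></head><body><p>Red widgets on sale.</p></body></html>"
print(f"{content_ratio(page):.2f}")  # roughly 0.21 for this toy page
```

Run it on your pages before and after a cleanup; if the ratio barely moves even though the file shrank, the bloat probably wasn't your main problem.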
But take a site template. It may have common titles, headers, footers, navigation and much else besides. Look at the page source, and there may be two or three feet of code that is common to every page.
And in amongst it, on some sites, some 50-100 words of copy, described by the webmaster as 'unique content' - and the page as a 'unique page'. Leaving identical titles and meta tags aside for the moment, it is not unreasonable of Google to see such a page as little different to its 5000 siblings.
Code bloat is a major cause of 'duplicate content issues', because it makes differences (relatively) smaller.
Tragically, I know of no way to quantify the ratio or absolute amount of genuine unique copy needed, but common sense can see Google's POV!
If you suspect the problem - or supplemental listings suggest it is a factor - then it can easily be dealt with, depending on the site:
1. Add more copy to each page.
2. Merge two small pages (e.g. red widgets + green widgets = Widgets)
3. Reduce code bloat
(i) Move CSS and JS out into separate files
(ii) Reduce navigation from whole site to 'local' and section indexes
(iii) Cut out sloganeering and repetitive promos
(iv) Use more CSS, and less presentational HTML markup
(v) Simplify layout to reduce code-heavy table formations
(vi) Look long and hard at whatever is left!
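As a rough illustration of how much weight steps (i)-(v) can recover, here's a toy comparison of the same sentence rendered the old table-and-font way versus as semantic markup with the styling moved to an external stylesheet. Both snippets are invented examples, not from any real site:

```python
# Old-style layout: a table wrapper plus inline presentation attributes
bloated = (
    '<table width="100%" border="0" cellpadding="0" cellspacing="0">'
    '<tr><td align="left" valign="top">'
    '<font face="Arial" size="2" color="#333333">'
    'Our red widgets ship worldwide and carry a two-year warranty.'
    '</font></td></tr></table>'
)

# Same copy as semantic markup; the styling lives once in an external CSS file
clean = '<p class="copy">Our red widgets ship worldwide and carry a two-year warranty.</p>'

saved = len(bloated) - len(clean)
print(f"bloated: {len(bloated)} bytes, clean: {len(clean)} bytes, saved: {saved}")
```

The copy is identical in both versions; only the wrapper changes. Multiply that per-paragraph saving across a whole template and it's easy to see how a page drops from tens of KB to a fraction of that.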
This doesn't just help SEs, it helps visitors too; it makes for a cleaner, more elegant, less crowded, more reader-friendly site!
And for those on dialup - they see the page before Christmas!
So far, I've been able to take a 47Kb page and reduce it to 14Kb! And yes, it does load much quicker.
I appreciate the additional info.