Forum Moderators: Robert Charlton & goodroi
Results 1 - 2 of 2 from mysite.name for . (0.12 seconds)
So I checked this morning and I get:
Results 1 - 4 of about 172 from mysite.name for . (0.12 seconds)
At first I was very excited, since I know Google has been crawling the site properly, but I can still only see these 4 pages even though it says 'of about 172'.
What gives?
If there is a link to click to see further "omitted results" then it is a warning about "duplicate content", often just title tags and/or meta descriptions that are the same or too similar. Make them all different.
Make sure that each piece of content on your site has only one canonical URL that is used to access it. Get all the others out of the index by using the meta robots noindex tag (preferred) on the other URL versions, or exclude them using robots.txt instead. Matt Cutts [threadwatch.org] has mentioned it several times recently too.
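For anyone unsure what those two options look like in practice, here is a minimal sketch; the tag values and paths are hypothetical examples, not taken from the poster's site. The meta robots tag goes in the head of each duplicate URL version:

```html
<!-- in the <head> of each non-canonical URL version (hypothetical page) -->
<meta name="robots" content="noindex,follow">
```

The robots.txt alternative blocks crawling instead (sketch, assuming the duplicates live under a /print/ path):

```text
User-agent: *
Disallow: /print/
```

Note that robots.txt only stops crawling; URLs that are already indexed can linger in the index, which is one reason the meta robots noindex tag is described as preferred here.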
If there is no "omitted results" link, then you caught that DC (datacenter) in mid-update. The count represents a load of supplemental results that have only recently been hidden and that will be fully thrown away within 48 hours. At that point you will see the count adjusted downwards to reflect only those results that can actually be seen. I saw this effect late last year, early this year, and again in the last few days. It has happened to sites that had a large number of year-old supplemental results (for URLs that have returned 404, or have had a 301 redirect on them for a long time) that have then been discarded by Google.
However, all my meta tags are the same; they use a common header... Could this cause issues?
Yes, this is being discussed in many threads. Google's Matt Cutts also mentioned it recently. Unique, page-specific meta descriptions should get those URLs out of the "omitted results" cluster.
The portal I use already sets the page title to the forum thread name, which is nice, so with a little bit of editing the page will now look something like the following to Google:
Title: Cool forum thread about widgets
Description: View the Cool forum thread about widgets on oursite.domain
Keywords: cool forum thread about widgets, keyword1, keyword2, keyword3 etc
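Rendered as actual HTML, that head section would look something like this (all values hypothetical, following the scheme above):

```html
<head>
  <title>Cool forum thread about widgets</title>
  <meta name="description" content="View the Cool forum thread about widgets on oursite.domain">
  <meta name="keywords" content="cool forum thread about widgets, keyword1, keyword2, keyword3">
</head>
```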
Also, by using this method I now have a good strategy in place for the numerous Joomla sites I've designed. Thanks very much for the tip.
To see the impact of having nice unique titles, descriptions and keywords on a page, you need look no further than the BBC website (I'm assuming I can say that without breaking any guidelines).
I have noticed that today, Google has gone back to showing 1 of 3 pages (the 172 I guess are now going to be included on the next update).
Will keep you posted and if anyone would like any help with PHPBB2 Plus doing this, let me know.
Google is showing the main page, with the other two as 'omitted'.
The problem here is that www.sitename.com is the same as www.sitename.com/home.html which is a simple re-write of www.sitename.com/index.php
index.php has recently been added to robots.txt but I'm unsure what I can do about the / and the /home.html
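One common fix for the / vs /home.html vs /index.php duplication, if the site runs on Apache with mod_rewrite, is to 301-redirect the duplicate URLs to the canonical one rather than relying on robots.txt. A sketch only (it assumes .htaccess and mod_rewrite are enabled, and has not been tested against this particular rewrite setup):

```apache
RewriteEngine On
# Redirect direct requests for /index.php or /home.html to the canonical /
# THE_REQUEST matches the original request line, so internal rewrites are not looped
RewriteCond %{THE_REQUEST} \s/(index\.php|home\.html)[\s?] [NC]
RewriteRule ^(index\.php|home\.html)$ / [R=301,L]
```

A 301 also passes the duplicates' link value to the canonical URL, which robots.txt exclusion does not.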
I just complained to Google using the "how can we improve" link. I did a search that should have returned thousands or even hundreds of thousands of results, but there were just four pages followed by:
"click for omitted results"
The complaint was simple. One site was producing hundreds of thousands of results for the keywords in question, while many legitimate sites with useful results were intermixed at perhaps one per page. So thousands of useful results are masked by this one huge site's pages. It's a somewhat (slightly) legitimate site that rates businesses (hmmm) and their products, so it can produce millions of pages of results that hit almost every keyword in the world. And when you do click for omitted results, this one site still dwarfs and masks all other legitimate results.
Granted, I can filter this site using "-site:", and in fact in the past I've asked Google, as a user, to be able to have a URL filter list for my searches to eliminate these types of sites from my results. But I can't think of once where I've seen a "-site:" directive in the referer strings in my sites' logs, so most Google users don't understand this syntax. Which of course means MY potential visitors may run into the same inconvenience.
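For reference, the operator in question looks like this in the search box (the domain is a made-up example, not the site being complained about):

```text
widget reviews -site:bigratingsite.example
```

The leading minus excludes every result from that domain, but as noted, few ordinary searchers know the syntax exists.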
So the result is a poor search experience. Poor quality!
Shouldn't Google cap the number of results, from some of these extremely large sites, that end up masking many other legitimate and more useful results?
[edited by: Simsi at 8:59 pm (utc) on Sep. 8, 2006]
site:domain.com shows unrelated/spam domains [webmasterworld.com]