Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Strange Google Results doing site:

         

Swanny007

11:48 pm on Jul 7, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So help me out here please ;-)

When I do a site:example.com on Google (for my site of course), it tells me it's indexed 762 pages. That's about how many static pages I have.

When I do a site:example.com inurl:* it lists 62,300 pages. That's pretty much the site including the photo galleries, forums, etc. In other words, that's my static and dynamic content.

Why doesn't site:example.com just show me everything? It seems odd to me.

Also, when I do site:example.com inurl:forums it says there are 24,200 results (sounds good). But in the results it lists two pages, and then says "In order to show you the most relevant results, we have omitted some entries very similar to the 2 already displayed. If you like, you can repeat the search with the omitted results included." Does that mean my site is part of that supplemental index stuff? (I don't really understand that yet; I need to do more reading.) FYI, I'm using phpBB for those particular forums (I've done the anonymous SID removal mod).

Thanks folks.

tedster

12:57 pm on Jul 8, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There are a lot of Google Gremlins in the site: operator results -- what you report is not necessarily a problem. But I would recommend you click on the "Omitted Results" link and study what it shows you. You may find things you can do to get those URLs listed in the first-line results. Common issues:

1. Multiple URLs for the same content (this is common in unmodified forum software)
2. Identical title tags or meta descriptions. You're better off with NO meta descriptions at all than to use a generic one for many urls, in my recent experience.
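The two checks above can be automated. Here is a minimal sketch (the parser class, function names, and page set are hypothetical, not from any real tool) that scans a set of fetched pages and flags any title or meta description shared by more than one URL:

```python
# Hypothetical sketch: flag pages that share a <title> or meta description.
from collections import defaultdict
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Collects the <title> text and meta description of one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicates(pages):
    """pages: {url: html_source}. Returns (dup_titles, dup_descriptions),
    each mapping a repeated string to the URLs that share it."""
    by_title = defaultdict(list)
    by_desc = defaultdict(list)
    for url, html in pages.items():
        p = TitleMetaParser()
        p.feed(html)
        by_title[p.title.strip()].append(url)
        by_desc[p.description.strip()].append(url)
    dupes = lambda d: {k: v for k, v in d.items() if len(v) > 1}
    return dupes(by_title), dupes(by_desc)
```

Run it over a site crawl and anything it reports is a candidate for the "omitted results" treatment described above.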

g1smd

9:19 pm on Jul 9, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> "In order to show you the most relevant results, we have omitted some entries very similar to the 2 already displayed" <<

This is a warning about duplicate content. The other pages are too similar and are hidden from the list. Make sure that every page has a unique title. Make sure that every page has a unique meta description.

Warning: if you have NO meta description, Google now takes the first dozen words of visible page content and uses them as the snippet. If every page starts with the same boilerplate (navigation text, a standard intro), this has the same effect as an identical meta description on every page. Make sure the meta description is present on every page, is different for every page, and exactly reflects the on-page content of the page it is on.
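One way to guarantee per-page descriptions is to derive each one from that page's own body text. A minimal sketch (the function name and length limit are illustrative; ~155 characters is a commonly cited snippet length, not an official figure):

```python
# Hypothetical sketch: build a meta description from a page's own opening
# text, so no two pages end up sharing the same description.
import re

def make_description(page_text, max_len=155):
    """Collapse whitespace and trim to max_len without cutting mid-word."""
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) <= max_len:
        return text
    cut = text.rfind(" ", 0, max_len)  # last word boundary before the limit
    return text[: cut if cut > 0 else max_len].rstrip() + "…"
```

This only helps if pages genuinely open with distinct content; pages that all start with the same boilerplate will still produce near-identical descriptions.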

Make sure that every page of content has only one URL that can access it. See the posts I made about Duplicate Content in popular forum software just a few months ago, for more information.
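The classic forum-software case of multiple URLs for one page is the session ID appended to every link. A minimal sketch of collapsing those to one canonical form (the parameter name "sid" matches phpBB, mentioned earlier in the thread; the helper name is hypothetical):

```python
# Hypothetical sketch: treat "viewtopic.php?t=5&sid=abc123" and
# "viewtopic.php?t=5" as the same page by stripping session parameters.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, drop_params=("sid",)):
    """Rebuild the URL without session-ID query parameters or fragments."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in drop_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))  # empty string drops the fragment
```

In practice you would enforce this with a 301 redirect from the non-canonical form to the canonical one, so Google only ever indexes one URL per page.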

[edited by: g1smd at 9:22 pm (utc) on July 9, 2006]

quixote

12:39 am on Jul 10, 2006 (gmt 0)

10+ Year Member



This is probably meaningless, but has anyone here tried a site: search without the .com or .net or .org at the end, such as this:

site:www.domain

My domain pulls up mostly supp results for site: with the .com, but I get only non-supps without it. And yes, it says "Results 1-10 of about 500 FROM www.domain"

My point being that leaving off the "dot-whatever" extension also removes all supps from the results.

Anyone had a similar experience? I'll wait while you try it out... :)

kidder

12:47 am on Jul 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yep that works on my sites also - seems to return a realistic page count.

tedster

12:59 am on Jul 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No go here - 8x on one domain I checked. And when there are separate .com, org, .net domains out there, I see urls from all of them.

daveVk

3:36 am on Jul 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



quixote - good observation.

This also happens with site:www.example.com/directory - that is, supplemental results are not shown there either.

moftary

6:13 am on Jul 10, 2006 (gmt 0)

10+ Year Member



You're better off with NO meta descriptions at all than to use a generic one for many urls, in my recent experience.

That was my very first thought about meta descriptions and omitted results, but it seems I was mistaken.

A few sites that I have been watching for a few months are very stable in Google's index, in terms of both indexed pages and ranking. A common factor is that they all use a generic meta description on all their pages.

Using the site: operator on one of them, I get result 1 of 1 of around 4 million pages - that is, just the homepage - and the rest is omitted. However, this particular site is really doing great in Google. Any search for the title of a subpage - and these are all very competitive search terms - puts that subpage in the top ten of the SERPs.

I applied that to a recent site that I published and got good results, but it's too early for me to take this theory for granted.

Cheers

*edited for grammer mistakes :)*

[edited by: moftary at 6:15 am (utc) on July 10, 2006]

g1smd

7:28 pm on Jul 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> You're better off with NO meta descriptions at all than to use a generic one for many urls, in my recent experience. <<

I don't agree. Same-meta and no-meta are both bad.

Lorel

3:39 pm on Jul 12, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This is probably meaningless, but has anyone here tried a site: search without the .com or .net or .org at the end, such as this:
site:www.domain

Yes, this is a good way to find sites that are copying your website under other TLDs.

Let's say your own domain is example.com

This search will bring up:
example.org
example.net
example.biz
example.co.uk
example.demon.co.uk

If you have the domain/business name trademarked then you can do something about it.
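Generating the list of alternate-TLD domains to check is trivial to script. A minimal sketch (the function name and TLD list are illustrative, matching the examples above):

```python
# Hypothetical sketch: build the list of alternate-TLD domains to check
# for copycats of your domain name.
def tld_variants(name, tlds=("com", "net", "org", "biz", "co.uk")):
    """Return the domain under each TLD in the list."""
    return [f"{name}.{tld}" for tld in tlds]
```

Feed each variant into a site: search (or a whois lookup) to see which ones actually exist and what content they carry.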