Google SEO News and Discussion Forum

    
Canonicals fixed?
site: command returns believable numbers
europeforvisitors



 
Msg#: 3138434 posted 10:57 pm on Oct 28, 2006 (gmt 0)

For the first time in several years, Google's site: command lists [b]no[/b] results for www.mysite.com (I default and redirect to mysite.com) and a mere few thousand pages for mysite.com instead of 20,000, 30,000, or more.

Are others noticing a similar improvement?

Does this mean Google's "canonical" problems are finally fixed?
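For anyone who wants to run the same check, compare the counts returned by the two forms of the query (mysite.com standing in for the real domain):

site:www.mysite.com
site:mysite.com

Substantial counts for both forms suggest Google is indexing the site under both hostnames - the classic canonical symptom.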

 

theBear

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3138434 posted 11:24 pm on Oct 28, 2006 (gmt 0)

It means that the site: command is estimating things better.

[edited by: theBear at 11:24 pm (utc) on Oct. 28, 2006]

g1smd

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 3138434 posted 11:40 pm on Oct 28, 2006 (gmt 0)

You normally get that effect about a year after you originally put the fixes in place.

Actually, you often see a big improvement within just a few weeks, but then a month or two later many of the now redirected URLs reappear as Supplemental Results and then hang around for a year or so.

europeforvisitors



 
Msg#: 3138434 posted 12:39 am on Oct 29, 2006 (gmt 0)

You normally get that effect about a year after you originally put the fixes in place.

It's been a year and a half for me. (I put the fixes in place around April 1, 2005, and the site: command was still returning huge numbers when I last checked a few weeks ago.)

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 3138434 posted 2:41 am on Oct 29, 2006 (gmt 0)

I also see sites now returning numbers that are reasonable, where they were always inflated 4x or worse -- and I see this even in cases where there was no canonical fix (or issue) on the part of the site owner. I think theBear got it right: the site: operator is returning better URL count estimates now. Matt Cutts said that this was in the works.

grant

10+ Year Member



 
Msg#: 3138434 posted 3:27 am on Oct 29, 2006 (gmt 0)

The site: query returns results from both Google's main and supplemental indexes. You should review the results for pages in the supplemental index and figure out why they are there (if any are; there might not be).

Maria444

10+ Year Member



 
Msg#: 3138434 posted 6:16 am on Oct 29, 2006 (gmt 0)

Any special reason why you decided to use redirects to site.com instead of www.site.com?

theBear

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3138434 posted 2:17 pm on Oct 29, 2006 (gmt 0)

IIRC europeforvisitors has always promoted his site without the www subdomain designation. That meant that practically all of his inbounds were to the non-www form.

europeforvisitors



 
Msg#: 3138434 posted 3:12 pm on Oct 29, 2006 (gmt 0)

Exactly. The domain without "www" looks cleaner and makes for better branding (at least in my opinion).

We may have needed "www" back in the early 1990s to distinguish Web sites from gophers, archies, ftp servers, etc., but nowadays it just seems like excess baggage.

jetteroheller

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member



 
Msg#: 3138434 posted 4:57 pm on Oct 29, 2006 (gmt 0)

Exactly. The domain without "www" looks cleaner and makes for better branding (at least in my opinion).
We may have needed "www" back in the early 1990s to distinguish Web sites from gophers, archies, ftp servers, etc., but nowadays it just seems like excess baggage.

Exactly my thought.

About the canonical issue: I noticed this spring that some of my domains had been indexed both ways.

On June 27th, I implemented redirects in my .htaccess.
On August 14th, I set the preferred version in Google Sitemaps.

Now only the right version is indexed everywhere.
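For anyone wanting to set up the same kind of redirect, here is a minimal sketch, assuming Apache with mod_rewrite enabled and example.com as a placeholder domain:

RewriteEngine On
# 301-redirect every www request to the same path on the non-www hostname
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

The R=301 flag is the part that matters: it marks the move as permanent, which is what lets Google consolidate the two versions onto one.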

g1smd

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 3138434 posted 5:15 pm on Oct 29, 2006 (gmt 0)

Don't be surprised if a few of those already-redirected URLs reappear as Supplemental Results in a few months' time. I have seen that happen many times.

europeforvisitors



 
Msg#: 3138434 posted 7:18 pm on Oct 29, 2006 (gmt 0)

I set the preferred version in Google Sitemaps

I also did that when the option was introduced a few months back. I don't know if that played a role in the www results going away, but maybe it did (considering that I'd done the redirect in .htaccess more than a year and a half ago).

leadegroot

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3138434 posted 1:18 am on Oct 31, 2006 (gmt 0)

I also note that the site: command results now appear to be generally in order of age (oldest first) rather than higher PR first.

No, I take that back - I just checked another site, and it has a whole bunch of pages that I added only a week ago in positions 2-50, ahead of older, higher-PR pages. Weird!

It's nice that the numbers are better, finally!

MattCutts

5+ Year Member



 
Msg#: 3138434 posted 5:46 am on Oct 31, 2006 (gmt 0)

Yup yup. I won't claim that the estimates are perfect, but they're definitely more accurate than they were at the beginning of the summer.

dmje

5+ Year Member



 
Msg#: 3138434 posted 6:26 am on Oct 31, 2006 (gmt 0)

Speaking of the site: command, am I correct in thinking that if you have over 1,000 pages in Google, the site: command will not show beyond 1,000?

Reason for asking: after reading this thread, I decided to check my site with the site: command. I got to 948 of 2,170 before the omitted-results prompt showed, and after I clicked to show the omitted results, it only went to 1,000 of 2,170.

If this is correct, is there no way to see the remaining results? I may have been doing it wrong, but I always used the site: command to help find duplicate content, if any, by seeing how many results it pulled before the omitted-results link appeared. That is useless, though, if it will not show beyond 1,000 pages.

What is the best way to check for duplicate content, or is there one?

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 3138434 posted 7:05 am on Oct 31, 2006 (gmt 0)

Yes, 1,000 URLs is the limit for any single search result. But you can also do a site: search on a longer file path, such as domain + directory:

site:www.example.com/directoryname/

You can even get more complicated by adding keywords to your search, or using the inurl: operator along with site: and so on, if you want to zero in on specific areas of your domain.
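For example, either of these narrows the results enough to stay under the 1,000-URL cap (directoryname and widget are stand-ins):

site:www.example.com/directoryname/ widget
site:www.example.com inurl:directoryname

Working section by section, you can walk a large site that way even though no single query will list more than 1,000 URLs.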

OutdoorWebcams

5+ Year Member



 
Msg#: 3138434 posted 8:44 am on Oct 31, 2006 (gmt 0)

As Matt pointed out, the estimates are now much more accurate than they were a couple of weeks (months?) ago.

Before this 'fix', I had never seen the site: command return any numbers between 1,000 and 9,000.

When one of my sites had about 1,000 pages, there was always a jump from just below 1,000 to just above 9,000 and back. Right now I get something like 2,000 pages, which is much more accurate.

But this can't be due to a canonical fix, as I never had any canonical issues.

BTW: I use www.domain.com, because threeletters.domain.threeletters looks more 'symmetrical' - nicer to my eyes... ;)
