


Part 3 Update Jagger

     
4:10 pm on Nov 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 19, 2002
posts:1945
votes: 0


Continued from
[webmasterworld.com...]


if it rains they will need a replay!
8:20 pm on Nov 6, 2005 (gmt 0)

New User

10+ Year Member

joined:Apr 12, 2005
posts:38
votes: 0


From MC's comment (per Steveb), I would guess the future SERPs will lean more towards 66.102.7.104. Perhaps not 100%, but definitely not 50%.

Dayo_UK

8:22 pm on Nov 6, 2005 (gmt 0)

Inactive Member
Account Expired

 
 


Well wow - in that case they have managed to fix one canonical problem in 10000000000000.

Sorry - annoyed. There must be more to come.

8:22 pm on Nov 6, 2005 (gmt 0)

Full Member

10+ Year Member

joined:July 24, 2005
posts:322
votes: 0


ww-watcher
A site that tells you what colors of paint look good with a certain flowery wallpaper, and what window dressing & flooring go well together, and then tells you how to remove the old and install the new, and then has a link to buy these products is not informational?

Yes...it is informational

I think you might be misunderstanding what I meant.

Ecommerce = Target stop & shop type site.
Informational = HGTV (they don't physically sell a tangible product...they just tell and refer). But they have advertisers to support their content.

8:28 pm on Nov 6, 2005 (gmt 0)

Full Member

10+ Year Member

joined:June 17, 2004
posts:347
votes: 0


I see signs that they are working on that "second order change" now..:))

I am watching with the mcdar tool.

Still not at the "9" yet though.

8:40 pm on Nov 6, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 6, 2005
posts:58
votes: 0


Eazygoin

You're right on the mark. I love Adwords because once you get it figured out, you can dramatically lower your cost per click while attracting more traffic, and that traffic converts much better than organic listings. And I don't worry about traffic from organic listings. If we get it, then it's gravy. I don't think Google owes me a thing.

But from Google's standpoint, advertisers like me pay the bills, and you are right: we wouldn't even be having this discussion if it weren't for the advertisers. So if the users who are looking to spend money - corporate purchasing agents for example - get sick of plowing through the weeds, they won't hesitate to bolt. And my advertising spend will follow the buyers. Period.

Things may change, and they usually do, but from the way it looks right now, Yahoo and MSN have a much better handle on that concept.

Jimmers

8:40 pm on Nov 6, 2005 (gmt 0)

Inactive Member
Account Expired

 
 


First post after using these forums for years because of the sheer amount of information covering many areas of web design and optimization. But with this update, as with many others, it all appears the same.

The very first piece of advice I saw time and again was, first and foremost, building your site for your visitors. This was stated by many of the webmasters who are scurrying to make changes with each update. I understand everyone wants to take advantage of the traffic the big G has at its discretion to provide, but doesn't anyone else believe that a large majority of webmasters have lost sight of 'building for the visitors'?

People are rushing to change their tags and descriptions, create 301 redirects, check and count the links on their pages, and hold back their updates simply because they may contain too many pages at one time for G and will trip a penalty.

I respect what G has created for the most part and yes, my sites have been affected in several updates, but it's come to feel like someone is dictating how I should build my sites if I want any traffic from them.

I simply decided to build for humans after realizing how many hours were spent trying to chase down problems that may not exist to begin with. The traffic always seems to come back sooner or later and in the meantime the actual visitors can enjoy what has been produced.

8:41 pm on Nov 6, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 9, 2005
posts:1094
votes: 2


From MC's comment (per Steveb), I would guess the future SERPs will lean more towards 66.102.7.104. Perhaps not 100%, but definitely not 50%.

I hope they do. These results are really clean and more comprehensive.

8:44 pm on Nov 6, 2005 (gmt 0)

New User

10+ Year Member

joined:Dec 11, 2003
posts:29
votes: 0


MC's answer to someone whose site dropped to nowhere overnight:

"Kelly Jones, that probably means that you donít have any manual penalty. But the algorithms can still change and that can affect the ranking of your site. A reinclusion request doesnít do much in such a case, because itís the scoring that is causing the site to rank differently, not any sort of manual penalty."

Who programmed an algorithm that suddenly drops a normal clean site several hundred positions down?

Everyone here with good rankings should remember that all it takes is one (minor) algo change, and the next morning you check your stats and you're nowhere. Get more income sources and build more websites on different domains before it's too late.

8:52 pm on Nov 6, 2005 (gmt 0)

Senior Member from FR 

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Feb 15, 2004
posts:7139
votes: 412


King of all sales

First off welcome to WebmasterWorld

Second ...thank whatever gods one believes in for you bringing some intelligent input ..there is some but post moderation would drop these threads to less than 50 posts ..bleating on bridges doesn't even begin to describe most of the posts

Thirdly ..a good friend sent me a link the other day when we were discussing this thread off WebmasterWorld which was in spite of the best efforts of some to elevate it from "how you doin ..i'm doin"...not succeeding in changing the tone ( so mostly unmitigated rubbish and zero analysis ..presume the posters would be the same who would like avatars on here )..

He sent me this ...seems to be about what your son [mindset.research.yahoo.com...] was thinking of ..but better ..

Most of the rest of you can get back to your pyjama party and eat canonical cake ..like "G"'s PR dept wants you to ..

Oh yeah and last off ..phase 3 started last Saturday 31.10.2005..

while you were all busy looking the other way ..like the guys at the plex wanted you to ..:)

[edited by: Leosghost at 9:06 pm (utc) on Nov. 6, 2005]

8:56 pm on Nov 6, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


Dayo: Just what do you mean by canonical page(s)?

.

For me, that has always meant Google trying to figure out whether to list the whole site as www or as non-www, and in the absence of any redirects to force the issue, they would score things (incoming links, internal absolute links, etc) to try to work out which was best.

A few years ago, they were able to combine the backlink results and the PR results for both www and non-www and showed the same result in the SERPs for both queries. It took a few months for the combination to happen on a new site. GoogleGuy even posted (several years ago) that this process only ran a few times per year. In the meantime, if you added the correct 301 redirect to your site, you didn't have to worry about such things; it just worked.

Then, a while back, things changed again so that the www/non-www scoring was being done on a page-by-page basis, with simple "duplicate content filtering" removing the "other URL" for the same page from the SERPs (or at least reducing it to a URL-only listing). What now happened was that some parts of the site were listed as www and others were listed as non-www instead.

This started to make a mess of things. One problem was that PR might NOT be passed around the site to full advantage, but the other problem was much more insidious.

After having one URL filtered out, if the page was ever modified again, the "other" URL for the page would then be "UN-duplicated" and would re-appear as a supplemental result in the SERPs, with a cache from years ago, and the page would start to rank for content that may no longer exist on the real site.

Google had long ago stopped running their "equalise www and non-www" process over their database, and any sites without a 301 redirect in place now started contributing huge amounts of ancient pages to the supplemental index, and massive amounts of "almost duplicate content" to the SERPs.

Even with the redirect put in place at a later date, data in the supplemental index still wasn't touched by that, and continues to show up in the SERPs. With this update, Google is ranking those supplemental pages lower, but Google has never cleaned the supplemental index of cached data that no longer represents real life. That was something that I expected to see from this "update".

To me "canonical pages" means that I need to set up a redirect to tell Google that only the www version of the site needs to be indexed (or the non-www if that is preferred). Having done that, I expect only that version to appear in the index, and there to be no public data for the other version. (If I don't set a redirect to say which one I want, then Google will decide for me.)

For sites where they have already indexed stuff from the "other version" before the redirect was added, I expect them to run a process over their supplemental database, now and again, that looks at the status of every URL that is indexed there and then takes the following actions (roughly sketched in code below):

- supplemental URL returns 301 and a new URL that leads to a page that has content with status 200: Dump all cache, title, and snippet data from the supplemental index for the "old" URL. You have the new URL for the page - go index it.

- supplemental URL returns 404 and has done so for many months: dump the cache, title, and snippet, 6 months after the page went 404. Page has gone. Dump the data.

- supplemental result returns real content and status 200: Update the cache and compare it to other URLs to see if it is a duplicate; if it is, dump the cache, title, and snippet -- if no duplicate is found, leave it in the supplemental index.

Finally, run a process over the supplemental title and snippet database, and over the search queries for which these are returned (yes, that IS a separate database to wherever the supplemental cache data is kept {I can show you many examples of that as proof}), to remove out-of-date titles and descriptions from the supplemental index for pages that in normal search results are shown under the same URL but with a totally different title and snippet. Remove the "ancient history" title and snippet from the supplemental results, and stop the URL being returned as a match for content that hasn't been on the page for several years.
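
Purely to illustrate the three status-code rules listed above (the record layout, the fetch and duplicate-check helpers, and the six-month threshold are my own assumptions, not anything Google has published), the decision logic would be something like:

from collections import namedtuple
from datetime import datetime, timedelta

# Assumed record for one supplemental-index entry: its URL and when it was
# first seen returning 404 (None if it never has).
SupplementalEntry = namedtuple("SupplementalEntry", "url first_seen_404")

STALE_404_AGE = timedelta(days=180)  # "6 months after the page went 404"

def handle_supplemental_entry(entry, fetch, is_duplicate, now=None):
    # fetch(url) is assumed to follow redirects and return
    # (final_status, final_url, content); is_duplicate(content, url) is
    # assumed to compare the content against other indexed URLs.
    now = now or datetime.utcnow()
    status, final_url, content = fetch(entry.url)

    # 301 leading to a live page: dump the old cache, title, and snippet,
    # and index the page under its new URL instead.
    if status == 200 and final_url != entry.url:
        return "drop-old-data-and-index-new-url"

    # 404, and it has been 404 for many months: the page has gone.
    if status == 404:
        if entry.first_seen_404 and now - entry.first_seen_404 >= STALE_404_AGE:
            return "drop"
        return "keep"  # recently gone; give it a little longer

    # Real content at the same URL: refresh the cache, then drop it if it
    # duplicates a URL that is already indexed elsewhere.
    if status == 200:
        return "drop" if is_duplicate(content, entry.url) else "keep-and-update-cache"

    return "keep"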

The supplemental index has gradually filled up with junk over the past few years. It really needs a good clean up.

Another example of something mind-bogglingly stupid is when a page goes 404 and still appears in the index; you can use Google Remove to seemingly remove it, but in fact all that happens is that you hide it for 3 or 6 months. After that time, even if the page is still 404, it automatically reappears in the index.

Why?

If the page is 404 and the webmaster asked for it to be removed, then it should stay removed until such time that it ever comes back with a 200 status.

The same is true for pages where the domain no longer exists. You can use Google Remove to get rid of it, but again after 90 or 180 days the page re-appears in the supplemental index even though it still doesn't exist. Why?
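
The behaviour being asked for is trivial to express; as a sketch, with made-up names:

def should_stay_removed(removal_requested, current_status):
    # A URL whose owner asked for removal stays out of the index until it
    # actually serves real content (200) again, not for a fixed 90 or 180 days.
    return removal_requested and current_status != 200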

The supplemental index is stuffed full of millions of pages of such junk. I don't need to see an algo shuffle as an update; I would like to see the actual data in the database cleaned up: spider every URL on the web and act on the status code that is returned. Currently this never happens with supplemental results.

These are the sorts of issues that I thought "fixes to canonicalisation and supplemental problems" would address. For those issues, nothing has changed. Nothing at all.

The searches I monitor haven't changed one bit in the last 4 months. There are two very different sets of supplemental results out there: old and very ancient. In fact Google added a load of duff supplemental data back into the SERPs in August - so instantly UN-fixing the canonical problems that the many 301 redirects added back in March had finally fixed in June.

Is there any hope that Google is going to re-spider all the URLs and actually update things rather than piling yet more ancient history into the supplemental index?

.

Some people seem to be using "canonical page" to mean Google identifying which page is the homepage of the site. That isn't what I see the term referring to at all.

What I see it referring to is the case where the same content can be accessed by multiple URLs:

- directly at: www.domain.com/somepage.html
- and also via: domain.com/somepage.html
- via a 301 redirect from otherdomain.com/somepage.html
- and by a 302 redirect from competitor.com/redirect.php

and that Google picks the "correct version" to index and show in the SERPs and then dumps or hides the rest.
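
As a rough sketch of that selection step, assuming the duplicates have already been grouped together and using made-up tie-breaking rules (Google's real scoring is not public):

from urllib.parse import urlparse

def pick_canonical(urls, preferred_host="www.domain.com"):
    # Given URLs known to serve identical content, keep one and hide the rest.
    def score(url):
        parsed = urlparse(url)
        return (
            parsed.hostname == preferred_host,  # prefer the chosen hostname
            -len(url),                          # then prefer the shorter URL
        )
    canonical = max(urls, key=score)
    hidden = [u for u in urls if u != canonical]
    return canonical, hidden

# The four access routes listed above:
print(pick_canonical([
    "http://www.domain.com/somepage.html",
    "http://domain.com/somepage.html",
    "http://otherdomain.com/somepage.html",
    "http://competitor.com/redirect.php",
]))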

[edited by: g1smd at 9:11 pm (utc) on Nov. 6, 2005]
