Forum Moderators: open
----
I'm starting this thread because another member suggested it would be a good idea: the main Google update thread is cluttered with posts like "OMG, I've been dropped in the new index!" and "Yippee, I'm now #1 on a key SERP". This thread is ONLY for serious, generic discussion of changes that you are observing with the new algo in this update. Things like "Looks to me like PR is less important this month, and anchor text of inbound links counts more.", etc. How your site is doing has no relevance here unless you can explain why you think so in terms of a general algo update.
I found interesting what GoogleGuy said here:
[webmasterworld.com ]
GoogleGuy msg #298:
albert, what you said, except I wouldn't be surprised to see SJ show up at other data centers first, and then to start applying the newer data/filters after that.
My message was #292.
For one month only (for reasons not important here) my site was dropped from the index: no backlinks showing.
Prior to that it had plenty; on www now it has plenty. On www2 and www3 it has none. This is not a question of other sites being downgraded - we have links in from more than one PR7 site - they can't all have been downgraded to <4!
However, I am getting some very bizarre entries on www2/www3/-sj for domains that were 301'd at about the same time.
No explanation I am afraid (and no panic either) - just extra facts for analysers to digest.
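For anyone who wants to double-check what a 301'd domain is actually serving, here's a minimal sketch using Python's standard library (the hostnames are made-up placeholders, not the actual sites discussed above):

```python
# Minimal sketch: ask a moved domain what it actually returns,
# without following the redirect. Hostnames below are hypothetical.
import http.client

def check_redirect(host, path="/"):
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

# A permanent move should come back as 301 plus a Location header
# pointing at the new domain.
print(check_redirect("old-example.com"))  # e.g. (301, 'http://new-example.com/')
```

If a "moved" domain is really serving a 302 or a 200, Google may treat it as a different document entirely, which could explain some of the odd entries.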
On -sj there are also no backlinks, and I can't find the site at all.
Not whining or whinging at all (it's very early days) but I suspect that each of these is showing a slightly different "iteration" of the update, which is clearly a different kind of update to those we have become used to.
Seems to me that -fi is possibly a step further on from -sj.
Of course it could all change again next time you look at either!
Some of the backlinks are coming from sites that haven't been online for at least a month, and those offline sites were not included in last month's update as backlinks.
So this index of sites is 2 months old.
HOWEVER...
Some of the new pages I've added to my site for the last crawl are showing up in the Google update. So... I'm thinking Google spidered new content in the crawl, but they haven't applied the new PR/backward-link calculations yet.
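To make the "spidered but not recalculated" idea concrete: PageRank-style scoring is an iterative computation over the whole link graph, so new pages can sit in the index before their link weight has been recomputed. Here's a toy illustration (a hypothetical three-page graph, not Google's actual code):

```python
# Toy PageRank-style power iteration over a made-up link graph.
# The point: scores only change after this whole pass runs, which is
# a separate (and heavier) step than simply crawling new pages.
links = {          # page -> pages it links to (hypothetical)
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
pages = list(links)
d = 0.85           # damping factor from the original PageRank paper
pr = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):               # iterate until the scores settle
    new = {p: (1 - d) / len(pages) for p in pages}
    for p, outs in links.items():
        for q in outs:
            new[q] += d * pr[p] / len(outs)
    pr = new

print(pr)
```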
zeus
It is satisfying to know, though, that our closest competitor lost over 1,000 backlinks, which put them below us.
Welcome to WW, no need to be sorry, but don't forget to try the search function :] (I think I've seen you elsewhere...)
Some of my observations regarding the SJ data are that Zeal links are showing up in the "new-old" database. Also, my Yahoo Group is showing up in the SERPs whereas previously it wasn't there. There are other odd pages included also.
The number of allinurl pages for two of my sites has nearly doubled, after many months of consistently going up by 1,000-10,000 per dance. This time tens of thousands of pages were added. I can't account for when they were crawled, as there are more than double the number of allinurl results as pages crawled in the past 30 days.
In my space I'm not seeing a drastic change in the serps for important keywords although I don't doubt the spammy results people are reporting. I just haven't seen it for myself. It looks like a normal update from that point of view.
Is Dominic based in part on this new algo?
The site has been a PR3 for some time, hence "link:" searches always returned nothing. On -sj, "link:" now shows 5 links, so perhaps the site will become a PR4 after the update is complete :). My toolbar is pointed at 216.239.33.100 and is still showing PR3.
My -sj backlinks include my DMOZ listing (which is not new), though the DMOZ listing just shows the URL (no title, snippets, or cache). I know this happens when a page has not been fetched by Googlebot -- but if that is the case, how does Google know that it links to my site? This may provide some support for some of the theories regarding new/old index, new/old backlinks, etc.
As far as rankings (on -sj) for my main single word query, the site has fallen 3 places (from 38 to 41), so little change there (so far).
Here is a funny test (which proves that the link: command and/or Google's handling of redirects was modified, and which also proves that I don't fully understand how it worked before).
Among the three URLs:
[fr.yahoo.com...]
[yahoo.fr...]
[yahoo.fr...]
the first one returns a document (HTTP code 200) while the other two return a temporary redirect to the first (HTTP code 302).
(I suppose I am allowed to post such general URLs? I swear I am not Yahoo's webmaster trying to do some self-promotion -))
On old Google, link:www.yahoo.fr gives the same answer as link:fr.yahoo.com, while link:yahoo.fr only returns the few pages that explicitly contain this URL.
On www-sj and www-fi, link: queries on the three URLs give three different results.
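For reference, here's a minimal sketch of the same status-code check in Python (stdlib only; the three hosts come from the link: queries above, and the path is assumed to be just "/"):

```python
# Reproduce the test above: expect 200 for fr.yahoo.com and 302
# (temporary redirect) for the other two, with a Location header
# pointing back at the first. Paths are assumed to be "/".
import http.client

for host in ("fr.yahoo.com", "www.yahoo.fr", "yahoo.fr"):
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", "/")
    resp = conn.getresponse()
    print(host, resp.status, resp.getheader("Location"))
    conn.close()
```

A 302 only says "temporarily somewhere else", so it's plausible that old Google folded www.yahoo.fr into fr.yahoo.com for link: purposes, while the new data centers keep all three separate.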
So, for the sites I looked at, it is showing an old link structure.
Now, why would some sites show an old link structure while others would not?
This gets stranger by the minute.
I don't think we can properly analyze this update because I don't think it's the real deal. Those crafty guys at Google are playing with what they want to use for the real update before they apply it to their freshest material.
I think this is why GG remarked that current crawl and data would be filtered in after the fact.
I'm going to go about working on my site and wait and see what happens. Life is just too short to get all upset over a Google dance! (Besides, there is nothing I can do about it anyway until the dust settles. Then, I'll adjust and move forward just like everybody else here :)
To me this suggests fi is running some kind of filter/penalty that's missing from sj - but perhaps a keyword-specific penalty? On a second look it almost seems like it's a home page-only penalty.
For some sites this would be good, 'cause they dropped a bit in the last update *g*, but for most it's not: they are down now on the "updated server".
Taking all this into account, I agree: THIS is not the Google dance, it's just funny Google techs playing with our nerves *lol*
Google results are akin to a tax audit. If the IRS gave away the secret sauce, audits wouldn't be much good to the government; likewise, Google results would not be relevant to surfers if Google gave away the sauce. But having Google representatives available in certain places to help out at times is a big help. Comparing Google to the IRS... sorry....
Google is really good. We believe there is a conscience operating there that factors in that many sites listed in Google results simply would not exist to the surfing world without those results. Losing that would spell doom for many newer, small businesses like ours that have a large online base to advertise to and are on very limited budgets.
The truth is that Google listing results are still by far the most economical way to bring the online audience to your website. Google also created a niche for those so inclined (crazy enough, haha) to earn a living helping other people enhance listing results, which is also a good thing.
We would just end off by saying, to those of you doing optimization (or the clients who hire you as optimization pros): save the profits that the economic benefit of Google results can bring.... Man does not live and die by Google results alone, or man will live and die as the Google index sees fit. Having a permanent presence on the net is very important, but traditional paid online and offline advertising has its place as soon as you can afford to invest in it, and it is a necessity for any business or idea serious about getting exposure.... We are saving our dollars to reinvest the benefit Google results bring into quality paid advertising campaigns to help us grow....
No, we are not an advertising company.... haha.
Thanks again to all!
I have one site that gets hit by freshbot constantly. It hits hard and goes deep. SJ shows all the previously indexed pages as well as new pages.
Another site hardly ever gets hit by freshbot. When it does, it only hits the main page (with one exception I'll get to in a second). This site only shows results from before I did an overhaul in December. None of these pages even exist anymore. Now to the freshbot exception. Starting in February, freshbot started trying to access these phantom pages. This continued until late March. It didn't hit new pages, just these old pages that I had removed from my server after the January update. They had not been linked to since the overhaul. Then it started up again just last week. The results are showing up without a cache, so they are probably from freshbot crawling 404s.
One more comment. I recently redesigned a site for someone else. It went live just before the April crawl. They didn’t get any traffic so I went ahead and deleted the old pages. Now, in SJ, I see both the new pages and the deleted pages. But this time the deleted pages have a cache. They still lead to a “Page Cannot be Found”.
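One thing worth checking with deleted pages is what status code the server actually sends back. Here's a quick sketch (the URLs are hypothetical placeholders, not the actual site):

```python
# Check what removed pages really return. If the "Page Cannot be
# Found" page is served with a 200 instead of a real 404, a crawler
# has no reliable signal that the page is gone.
# URLs below are hypothetical placeholders.
import http.client
from urllib.parse import urlsplit

def status_of(url):
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

for url in ("http://example.com/deleted-page.html",
            "http://example.com/current-page.html"):
    print(url, status_of(url))
```

If those old pages are coming back 200 with a friendly error page, that could explain why the deleted pages still show up, cache and all.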
Make of this what you will. But it kind of looks to me like Google is searching for a way to incorporate fresh pages into the index better than they have before.