Forum Moderators: open
The biggest thing is they move the toilet midstream without a hint they are going to do it... (change the rules)
Google's a joke..
tired of their games..
off to support ANY other search engine.. enough of this change-the-rules-every-month nonsense.. goodbye Google.. good riddance..
Actually, I won't rant too much, nor blame Google for anything specific, just tell you what happened from my perspective:
My PR 6-7 site is completely gone from the index. Grey toolbar. This caused my Yahoo listings to drop from #1 to the bottom (2,000 - 3,000th) of Yahoo results (not google.yahoo) on the competitive keywords (since they use Google's PR to rank them). Financial loss here is devastating.
My PR 5 site remained unchanged but I dropped from #1 position on my top keyword to about the 10th page of results. Visitors from google.com have all but dried up except for those random 4-10 word queries.
A fairly new site, which is listed in Zeal and Yahoo and was being updated in Google on a daily basis for the past two months, is gone from Google. PR is grey. Yahoo listing dropped to over 3,000.
My PR 4 site remained a PR4 but my listings for the top keywords dropped so far down I can't find them. Traffic has almost completely ceased.
Another new site that was indexed in Google results mid month showed up in the results on www2 and www3 and was then given PR0. The only link to it is from Zeal/MSN and it has several affiliate links on the homepage so I wasn't surprised by this.
I guess what frustrates me is that I built and designed these sites to be most relevant to visitors, and although I added a little keyword density to help optimize them on Google, I was very careful not to crosslink them or add any other links or content that I thought could penalize me. I never added a single banner ad, popup or exit ad, and I only used affiliate links on some of the deeper pages.
It really hurts because I have spent about 4-5 hours per night for the past six months working on these sites and the content on them. Then suddenly, overnight, they are worth about 25% of what they were worth a week ago. I would like to try to increase my rankings again, but I have NO IDEA what the hell just happened! I am very frustrated!
Affiliate links are no problem if Google likes the page otherwise. I have pages ranked #1 for their target terms that are chock full of affiliate links.
I suspect the first part of that sentence is more of a problem -- "the only link to it ..." That's where you can take some practical action.
Do some research to find other sites who might find your content useful, and invite them to link to you. Reciprocate if you have to. Then find some more, and some more ... The more links in, the better, because the traffic they send will provide a stable base amidst the SE ups and downs.
It's okay to have 80% Google traffic if the other 20% is enough to survive on if you had to.
I presume that's a joke. Goodness knows what topic area you are searching in.
Some great points in this thread though, especially with respect to responsibility.
Sorry Google, but morality HAS to be an issue amongst decent folk. Hard nosed capitalism, where the strong don't give a hoot about the weak, is pretty ugly.
That is basically what seems to have occurred with some of the guys on here. They are your 'collateral damage' - the innocent bystanders.
Given the numbers that seem to be involved... have you REALLY done the homework to keep this to an absolute minimum? Have you REALLY given due consideration to the small but high quality sites that some of them produce? It certainly doesn't look like it... but only you know the answers to these questions. Only you can decide how much of a social conscience to have, if any.
Whether Google like it or not, whether they want it or not, people's lives do depend upon them. Google is now the driving force behind a large proportion of web development. Whole sites are built from the ground up to obey Google's fuzzy rules. Google's linking dependence actually dictates how sites are designed and structured, which sites are linked to, how they are linked to etc. For me, Google has been the death of Flash based sites, for example, which is not a bad thing IMHO, but still significant. With this power comes responsibility, and Google aren't taking it seriously.
What is more annoying is that the whole matter can be so easily fixed. Just state the rules clearly, openly, in detail and with examples and then nobody can complain if they are dropped (unless they are caught up randomly). Google should also provide an established appeal route for sites that have been dropped, and perhaps a prewarning list of URLs of sites that are likely to be dropped in the next index if they don't conform to the rules.
Now to a different issue:
From a spam reducing point of view, I don't think that Google's current tactics work either. Because of the inherent instability of the index and the fact that webmasters can't trust Google to keep their sites present, I wonder just how many webmasters re-use their content across several domains. I do provide SEO services and always advise clients to build at least two if not three sites, obviously obeying all the rules of not simply duplicating. To minimise the unpleasantness of this I will always make sure each site is differently focused and providing at least some unique content. However, you get the picture.
And why do I do this? Because Google is so unstable. The only way to make sure that a company is not killed by a random Google blitz is to make sure that the company is not dependent upon a single site. I am just following Googleguy's advice of "diversifying". However, the end result is that Google has three lots of the same content, albeit in different forms, thus reducing the quality of Google's index.
Would I do this if I knew sites weren't going to drop out of Google randomly? No, of course not. It is a lot of effort, but the only way to play it safe. I will repeat: Google forces webmasters into making duplicate sites. Discuss ;)
What all the whining SEOs don't understand, though, is that THIS is what you are paid for... to have the knowledge of what the ongoing changing rules are. Yes, the rules have changed again... we should hope for this if Google is to maintain the relevancy of their results.
I agree that this was a bad month, but I think that some issues that needed to be taken care of were taken care of. I think some of the downplayed variables will be tweaked back up to the level they need to be at to obtain the best results.
I also think that as much as we would like this to be a reliable occupation, it is a shifty game, and if you play it you have to be willing to deal with the consequences whatever they may be.
Even though I don't follow the advice myself...I think we all know that the safest optimization.....is no optimization.
Just made a mental note never to cross swords with you in a court of law (or was that philosophy)
;-)
Napoleon
I also disagree that "the safest optimization.....is no optimization". Half of the work that I do for other people is making sure that sites meet all of the Google and Yahoo rules that I am aware of. So many sites still have hidden comment tags, hidden text, massive meta tags, browser incompatibility etc. and so I clean them up to make them acceptable. I do regard this as optimisation work since it is a prerequisite for not being banned! ;)
WRT it being a shifty business... well, there really is no need for it to be. Google can be as stable as they want it to be. This is the new world industry and we need to encourage it in the right direction. I don't believe that Google is helping by proving unreliable and unpredictable. They are part of the industry, and a large defining part of it. If Google is seen as unreliable by webmasters, then so is the web business, which in turn leads to job insecurity and slowed growth. What we will be left with is the group of people willing to take the most risks, but not those who require a stable income.
[edited by: Bobby_Davro at 9:18 pm (utc) on Sep. 30, 2002]
How much money did you give to Google to get those first page listings? NOTHING!
You made all that money by winning lotto Google for the given month.
If Google changes its algorithm, they have to answer to nobody but themselves.
You know why I can say all this? Well, I beat the odds and won lotto Google again this month :)
And that folks is my last rant on Google (well until the next update at least).
Here are my ideas:
1) The new rule out of this update is that exact phrase anchor text in external links is what matters. And I would suggest that it is the accumulated PR of this subset of links that trumps. In other words 3 PR6 backlinks with the anchor text in them should trump 8 PR4 backlinks with the anchor text in them because of the logarithmic scale of PR (but I am hypothesizing now on this extension of the theory).
I built on martinibuster's theory for this, starting at message #12 here: [webmasterworld.com...]
>>> martinibuster, I agree with your theory.
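The accumulated-PR hypothesis in point 1 can be sketched in a few lines. The log base and the weights here are my own assumptions for illustration, not anything stated in the thread or known about Google's actual scale:

```python
# Sketch of the "accumulated anchor-text PR" hypothesis above.
# Assumption: the toolbar PR scale is roughly logarithmic, so a PR n
# backlink carries on the order of BASE**n raw weight.
BASE = 5  # hypothetical log base for the toolbar scale

def accumulated_weight(toolbar_prs):
    """Sum the estimated raw weight of the backlinks (given as toolbar
    PR values) whose anchor text contains the exact target phrase."""
    return sum(BASE ** pr for pr in toolbar_prs)

three_pr6 = accumulated_weight([6, 6, 6])  # 3 PR6 anchor-text backlinks
eight_pr4 = accumulated_weight([4] * 8)    # 8 PR4 anchor-text backlinks
print(three_pr6 > eight_pr4)  # True: 46875 vs 5000 under this assumption
```

Under any log base greater than about 1.4, the three PR6 links outweigh the eight PR4 links, which is why the logarithmic scale does most of the work in this hypothesis.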
2) Exact phrase matching in the title or on the page used to be very important. Now it seems to be much less so, and is one of the main factors in why this update appears more spammy to some of us.
>>> Site 1 above doesn't have the exact phrase on the page. It does have the phrase concatenated together however (i.e. "keyword1keyword2" instead of "keyword1 keyword2", which is the exact phrase).
3) Google may be filtering out or discounting the weight of internal links.
My page used to hold a position of #11 for the phrase I analyzed the top 3. Now I am at #60. This page has only two external links showing up in www2 with the phrase in the anchor text. All other links to it (50 or so pages) are internal links with the anchor text in them. This page used to be in the top 5-8 up until last month.
Now here's where it gets weird with the internal links theory. I have two other pages that are top 5 for their keywords, both in last month's update and in this month's update. Why didn't these pages sink? One of these two is competing with 2.5M results; the phrase above is competing with 2M results. What is different about these pages is that there are no external links to them. All links are from internal pages with the link text included. So I think that maybe internal links are discounted unless internal links are all Google has to work with for that page.
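That conditional discount could look something like the sketch below. The function name, the 0.1 discount factor, and the scoring scheme are purely illustrative assumptions of mine, not anything confirmed about the algo:

```python
def anchor_text_score(links, internal_discount=0.1):
    """links: list of (weight, is_internal) tuples for links carrying the
    target phrase in their anchor text.
    Theory: internal links are heavily discounted, UNLESS they are all
    the page has -- in which case they get full credit."""
    has_external = any(not is_internal for _, is_internal in links)
    score = 0.0
    for weight, is_internal in links:
        if is_internal and has_external:
            score += weight * internal_discount  # discounted internal link
        else:
            score += weight  # external link, or internal-only page
    return score

# Page A: 2 external + 50 internal links -> internals nearly ignored
# Page B: 50 internal links only         -> internals count in full
page_a = anchor_text_score([(1.0, False)] * 2 + [(1.0, True)] * 50)
page_b = anchor_text_score([(1.0, True)] * 50)
print(page_a, page_b)  # 7.0 50.0
```

Note how this reproduces the puzzle described above: the mostly-internal page with a couple of external links sinks (its fifty internal links collapse to a fraction of their old value), while the internal-only pages keep their full score and hold position.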
4) Huge PR differences can trump.
Site 2 of the search above I didn't mention. It's not a relevant result; it's a big software company with a PR of 8, and the two keywords appear separately on the page. The keyword phrase does not appear in the backlinks, I think. I didn't check for this, but it wouldn't make a whole lot of sense for this particular keyword phrase to be in this software company's backlinks; most of their backlinks are logo-image links.
I'd love to hear your input, challenges, or different theories. I don't care if I am right or wrong. I just want to crack the algo.
After just watching (or should I say wading) through the update pages, and listening to different reactions and results..
I am wondering if there are several sets of new criteria. Is it possible that certain sites (i.e. the SERP giants) have one set of standards and criteria, then, as mentioned in the above thread (which has affected most of us here), the medium-to-large sites have a different set of standards, and the smaller sites yet another criteria/standard of their own? I realize this sounds absurd, but think about it. The SERPs are going to hold their weight for certain criteria depending on the magnitude of the site. It could be a filter or algo that examines a site (theme, PR, pop, etc.), and the site is given a set of standards to be judged on based on which "category" the site falls into. For the medium to large sites, it may not have as much to do with the quantity of inbounds, but perhaps the desc., anchor, or quality..
An example I can give has been discussed before.. Yahoo and ODP backlinks are showing for some sites, but not others.. Some have been dropped entirely, others added. There really is NO standard on this update that applies across the board. It depends on the site. I have seen a pattern of similar site results either in the forums or while doing my curiosity checking.. Certain sites all seem to fall into the same category or standard of this mystery, while others have been relatively unaffected.
My two cents.
Everyone has looked at the commonly known variables, but even if your theory doesn't play out, I think the answer we are all searching for lies in a new (and probably somewhat simple) variable that has as of yet slipped under our radars.
Inconsistency... yes, I hear you. I am looking/hoping for one or two factors that would apply across the board, but that may not be the case. When I read the update thread, I wonder how many people have really analyzed any SERPs closely. It gets real interesting once you do.
argusdesigns,
Your theory isn't absurd. Maybe there are multiple criteria. It would make some sense to try to make the results more relevant to the market that was searching for it.
bobmark,
Have you compared the number of external links you have with your phrase to the number of external links with your phrase that the top sites have?
exact phrase anchor text in external links is what matters
Which continues to prove itself out with each update.
Exact phrase matching in the title or on the page used to be very important. Now it seems to be much less so
I suggest it’s too early to tell; spammy results often settle down by the time Google calms down, so we may want to wait and see with this one. I at least haven’t had time to do more than gather in the data.
Google may be filtering out or discounting the weight of internal links
I’m not seeing that from the sites I’m looking at. What I see so far is that the internal linking is being grouped together better in the backward link checks I’ve been running. I’ll keep watching for any sign of internal links being discounted, but right now I’m not seeing it. Like I said, though, I’m just gathering in the data and haven’t had a chance to analyze anything yet. I like to give things a chance to settle down.
So I think that maybe internal links are discounted unless internal links are all Google has to work with for that page.
That’s an interesting theory. Thanks for the heads up, I’ll keep an eye out on that as I work my way through my data.
I am wondering if there are several sets of new criteria. Is it possible that certain sites (i.e. the SERP giants) have one set of standards and criteria, then, as mentioned in the above thread (which has affected most of us here), the medium-to-large sites have a different set of standards, and the smaller sites yet another criteria/standard of their own? I realize this sounds absurd, but think about it. The SERPs are going to hold their weight for certain criteria depending on the magnitude of the site. It could be a filter or algo that examines a site (theme, PR, pop, etc.), and the site is given a set of standards to be judged on based on which "category" the site falls into. For the medium to large sites, it may not have as much to do with the quantity of inbounds, but perhaps the desc., anchor, or quality..
It seems to me that just a few minor tweaks to the algo can have the same effect. For example, the Google bombing with "go to hell" could be handled fairly well by changing the term
link_text
in the algo to
(link_text * (linking_site_PR / 6))
In addition to the normal place for the linking site PR to be considered, they decide to modify the link text weight specifically by that, discounting any sites < PR6 and promoting link text from sites > PR6.
Now if they change those PR numbers to the voting values of the different PR levels, it would make the link text of a PR4 site worth effectively 1, and the link text from a PR8 site would suddenly be worth almost as much as most of the other factors combined.
BTW, I am not saying that that is what they did.
Little tweaks can do a lot.
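As a toy illustration of how small that tweak is in code: the two functions below are hypothetical sketches of mine, one for the linear PR scaling written above and one for the exponential "voting value" variant. None of this is Google's actual term:

```python
def linear_scaled(link_text_weight, linking_site_pr, pivot=6):
    """The tweak above: anchor text from sites below the pivot PR is
    discounted, anchor text from sites above it is promoted."""
    return link_text_weight * (linking_site_pr / pivot)

def voting_scaled(link_text_weight, linking_site_pr, pivot=6, base=5):
    """Variant using exponential 'voting values' per PR level instead of
    the raw toolbar number, as suggested above (base is an assumption)."""
    return link_text_weight * (base ** linking_site_pr) / (base ** pivot)

print(linear_scaled(10, 4))  # ~6.67: PR4 anchor text mildly discounted
print(voting_scaled(10, 4))  # 0.4: PR4 anchor text nearly wiped out
print(voting_scaled(10, 8))  # 250.0: PR8 anchor text dominates
```

One line changed in the ranking function, and the relative worth of a PR4 link's anchor text drops from about two thirds to a few percent. That is the "little tweaks can do a lot" point in miniature.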
As it is, PageRank fits into chaos theory quite well. Time to go dig out a couple of my old books.
I like your theory. I have been thinking that this may be part of the new algo also. It would be really easy to change the weighting of the PR scale. I also agree with what you wrote in another thread about LYAO on people looking for one factor. I am sure there are multiple factors that Google has tweaked.
Combine some basic logic, fractal geometry and chaos theory and voila! You now have the key to the algo. Well, that is, if you can relate the infinite possibilities.
Infinite possibilities are not a problem if you do not want an exact answer. If all you are interested in is probabilities (I feel lucky), then it plays out well. You are just working to narrow the possibilities and sometimes you are wrong.
I imagine that after just about every update there are a lot of people down at the Googleplex wondering "how the hell did that happen", and they even have access to the algo.
Infinite possibilities are not a problem if you do not want an exact answer. If all you are interested in is probabilities (I feel lucky), then it plays out well. You are just working to narrow the possibilities and sometimes you are wrong.
That is exactly what we do as SEOs... great analogy on the "feeling lucky". And you're right, people at the Googleplex are probably wondering that, and they probably have to make a call to their MIT class-buddies at NASA to get the big picture.
Why The Search Engine Is Not Always Right [forbes.com]
Google can't afford bad press implying that they have bad results or that they are spammable. Problem is, their solution has caused a whole bunch of new irrelevant results. I expect some of this to roll back in coming months.