Forum Moderators: Robert Charlton & goodroi
One thing is for sure: reading all these stories in this thread about so many sites either suddenly penalized or dropped in ranking, across such a wide range, this can't be explained by everflux.
Matt himself defined everflux as:
The term “everflux” is often used to describe the constant state of low-level changes as we crawl the web and rankings consequently change to a minor degree.
Now.. let me ask those of you whose sites have been affected during the last few weeks: do you consider what happened to your site a change of "minor degree"?
There is something going on which neither GG nor Matt is telling us, or is able to tell us, unfortunately.
Of course I don't expect GG and Matt to tell us everything, but continuing to deny the real reason for what's going on (and to keep talking about reporting spam and everything else) would only discredit these two kind gentlemen... said with all due respect.
Dear GoogleGuy and Matt
If you are so sure of your argument that we are still witnessing the usual "everflux" and that no "update" has taken place, please come here and let's discuss it in the open. And perhaps I might invite Danny Sullivan to contribute to the discussion too?
Make sure you have a 301 redirect from non-www to www and make sure that you are not promoting both "/index.html" and "/" in your internal links, as that would make up to 4 "copies" of your index page visible to the spiders.
I would promote only "/" on "www" for best results.
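As a minimal sketch of the non-www to www 301 described above (assuming Apache with mod_rewrite enabled, and using the hypothetical domain example.com - substitute your own), the .htaccess might look like this:

```apache
# Hypothetical domain; requires mod_rewrite.
RewriteEngine On

# Permanently (301) redirect the non-www host to the www host.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Permanently redirect /index.html to / so only one URL serves the index page.
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```

This is only an illustration of the advice in the post above, not a drop-in config; test it on your own server, since existing rewrite rules can interact with these.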
that would make up to 4 "copies" of your index page visible to the spiders
True, and other spiders have no problem understanding that it's all just one page. That GG chooses not to comment on the issue must tell us something. Personally, I wouldn't expect an early solution. In fact, I don't know if it's worth a rethink of the standard advice doled out here about doing a 301 between the www and non-www versions of your site. If, as experience suggests, the supplemental index keeps storing stuff from a year ago, all your dup pages will still exist in supplemental and Google will still occasionally attempt to access those pages for a long time to come.
So much for smarter spidering.
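To illustrate the "4 copies" problem mentioned above, here is a small Python sketch (the function name, the preference for the www host, and the example.com domain are my own illustration, not from any poster) showing how the four URL variants of one index page collapse to a single canonical form:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Collapse www/non-www hosts and /index.html vs "/" to one canonical URL."""
    scheme, host, path, query, frag = urlsplit(url)
    if not host.startswith("www."):
        host = "www." + host          # prefer the www host (an arbitrary choice here)
    if path in ("", "/index.html"):
        path = "/"                    # prefer bare "/" over /index.html
    return urlunsplit((scheme, host, path, query, frag))

# The four "copies" of one index page all map to a single URL:
variants = [
    "http://example.com/",
    "http://example.com/index.html",
    "http://www.example.com/",
    "http://www.example.com/index.html",
]
assert len({canonicalize(u) for u in variants}) == 1
```

A spider that does this normalization sees one page; one that doesn't can record four distinct URLs with identical content - which is exactly the duplicate-content exposure the 301 advice is meant to remove.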
Doing a 301 to get over the www/non-www dup content issue may itself cause other problems.
<tinfoil hat on> First, it tells Google you've been reading WW and/or other SEO forums </tinfoil hat> :). Seriously, though, a drop in the SERPs followed by an .htaccess with a 301 does suggest some SEO activity... perhaps a first indicator of such activity and, as we know, any SEO is anathema.
Secondly, you'll still have dup pages in supplemental, possibly forever. The full future effects of that can't immediately be discerned, but some of them could be detrimental to your interests. Is it better to live with those URLs in supplemental but ensure the current site is free of what Google might see as dups? Or to take no action now and live with the dups both in supplemental and in new spidering? Who knows how Google will resolve it (if and when they resolve it). If some things are broken, the long-awaited "solution" could be broken too. It could fix only those sites for which there is a certain relationship between the URLs Google has in the current vs. the supplemental index. Unlikely, but, hey, Google isn't perfect. There is the possibility that those who implement the 301 get hurt while those who don't... benefit.
Third, the official advice from the plex is to do a 301 to prevent their bot seeing your content as dup. Just because certain advice emanated from the plex doesn't guarantee it's your best course of action, of course; you need to make your own decisions as to what's right for you. In this case, however, it does appear that while Yahoo/MSN etc. don't need the 301, implementing it doesn't hurt your rankings in those engines. So, fine, you'll do the 301. You'll do other stuff they ask, like adding rel=nofollow to prevent any backlash and loss of tPR/algo PR; you'll put links on any ISBNs so Google doesn't autolink them to a merchant of their choice; you'll remove the js links, even if they are not spammy, because Google doesn't like js now. It's all getting a bit much. Sure, they are under sustained attack - it can only be described as such - from some SEOs but, somehow, I expected more from Google; I expected them to recognise when something is broken, tackle it head on, and do the repairs themselves. Maybe that's the problem: we still see them as a keen, cuddly, approachable, friendly start-up.
Seems to me that statements by Google folks and observations here are consistent with the following:
There are lots of *underlying* changes going on with backlinks, PR, and site relationships. These are affecting sites more than usual because it appears a large number of pages are getting re-processed.
People posting here are a small subset of a huge population and tend to report changes more than stability.
There have NOT (yet) been significant changes in the ranking algo, which is what Google would call "an update". As MC noted in his blog the term "update" is somewhat subjective.
I'm guessing a true "update" is coming soon.
Just wondering. Will google's next update include additional update of backlink counts and also PageRank on their toolbar?
How does this all work? Does a major update include all this at once?
Or is it broken down over time? Also when exactly was the last major update?
I am thinking right after the 4th of July. Am I correct in assuming this?
Thanks!
Trisha, maybe it will help to see if you have:
- Canonical URL issues
- Duplicate content (maybe someone has stolen your content, and that is creating the problem)
If the site is new, I don't think I have to remind you of Sandbox problems :)
I set up the www/non-www redirect months ago, and got rid of any duplicate content (yes, I did have some, but I have original content too). I even got a Site Map set up. These are not new sites (although I have some of those too).
I'm just getting impatient for Google to sort out which pages are there and which aren't - and to not punish a whole site just because it doesn't like some of the pages on the domain (at least that is what I suspect part of the problem is).
Yes, I read that too. My question, though, is this: if you've done a lot of on-site work after the PR and backlink update two weeks ago, and there is a major update coming soon - let's say in a week or two - would Google make adjustments to your specific site, realizing, say, that you optimized well, added lots of fresh content, and finally did a lot of reciprocal link exchanges with relevant sites?
What would happen in a case like this? Seeing that the update has not officially happened yet, would it get picked up in the next update? Or will it take several updates to catch up?
Anyone welcome to jump in.
Thanks!
>>reseller -
Seems to me that statements by Google folks and observations here are consistent with the following:
There are lots of *underlying* changes going on with backlinks, PR, and site relationships. These are affecting sites more than usual because it appears a large number of pages are getting re-processed.<<
There are several factors all mixed together within your argument ;-)
-backlink and PR update:
This can't explain sites being penalized or dropped into oblivion since around 20 July 2005.
-large number of pages are getting re-processed..
Here is food for thought.
Do you, GG, or Matt really consider processing a large number of new pages to be "everflux"?
-There have NOT (yet) been significant changes in the ranking algo, which is what Google would call "an update". As MC noted in his blog the term "update" is somewhat subjective.
No matter what you ask GG regarding the current changes, his reply would only be based on ONE argument... "everflux", in a ONE-WAY-COMMUNICATION manner.
As long as the situation remains as such, any intelligent discussion would be out of the question, unfortunately.
[edited by: reseller at 7:35 pm (utc) on Sep. 14, 2005]
If hard work was put in after the last visible PR and backlink update, and there will not be another visible update for, let's say, two months, will all the work put in "between" the last and future PR and backlink updates be reflected in the current SERPs, even though it will not be visible again until the next update?
When I say hard work I am referring to quality content, onsite optimization and relevant link building.
Do the Google SERPs reflect current updates to sites with respect to quality content, on-site work, and relevant link building, even though it's not visible?
How does it work?
Can anyone answer this one:
Why do some sites flux while others (usually the top 3/4) remain carved in stone?
My site is on the second page and fluctuates from #11 to #15 and back every few days or so. This appears to me to be a bad sign; it's very disconcerting - does anyone know?
If hard work was put in after the last visible PR and backlink update, and there will not be another visible update for, let's say, two months, will all the work put in "between" the last and future PR and backlink updates be reflected in the current SERPs, even though it will not be visible again until the next update?
Let's see...
First off, GG admits that only a portion of the actual backlinks that Google finds appear using the links command.
GG also tells us that PR and backlinks are constantly updated in the background. They only become visible once every so often. So after Google finds a link, you get credit for it. Some feel the exact weighting of the link may be tempered for a short while.
GG also tells us that PR and backlinks are constantly being used to develop new SERPs - this accounts for everflux.
That being said, I believe that during a true algo update we see the introduction of many newer pages ranking for terms for which they did not previously rank. My theory is this.
When my site came out of the sandbox, it started ranking for about half of the topics on the site. Some topics, while covered in detail and relating to the overall topic of the site, do not rank at all. For example, I ranked well on articles relating to stocks, but not bonds. With the next algo update, I hope that Google recognizes the site has great content on bonds in addition to stocks.
I am seeing the heaviest gbot activity I have seen in a long time... a real long time... over the past 3-5 days. Anyone else?
For the last two months, lots of spidering. Through the 13th, the bot took in 2,100 pages - that's enough to spider the entire site twice. A "normal" month is something around 3,500 pages; in August it was 7,800.
I have held this range for 6 months. Over the last couple of months I have doubled the number of related backlinks for this term, yet my ranking is actually at the lower end of the range - currently #14. None of my competitors seem to have done anything to improve their SEO.
I have done this in a few of the other related areas that I deal in and have not noticed any significant results there either.
Of course in the other search engines in the last couple of months I have moved to the top spot.
So I believe that backlinks have not yet been given full weight.
BTW, doubling the backlinks in one area is not that significant to the overall site, because the site is a massive content site. The SEO I am working on is just a small percentage of the overall links, so there's no chance of sandboxing.
Guys,
I really don't want to say, and I don't want to expect, that there is an algo update, but the referrals to my site coming from Google are really increasing, and my website's SERP position in some data centers is really higher than before.
My referrals from Google have increased also. Traffic went down about 3% in the few weeks before school started and has now jumped up to a new high of 22%. I thought it was just school starting, as I get a lot of traffic from college students - my site has lots of tutorials and other info - but this is a bigger jump than I've seen before. SERPs haven't changed much except on those pages I've been tweaking, and the search terms people are using haven't changed much either. The only change I see is just more traffic, but maybe I'm missing something.
Just because we do not see the visible changes, such as PageRank and backlink counts, does not mean the site's not getting credit.
Case in point: SERPs change daily. The cause and effect comes from bots picking up your and other sites' changes while spidering and applying them to the search results based on the algo.
So why the big concern for the Dance? I don't get it. Its an ongoing thing, it never stops!
So why the big concern for the Dance? I don't get it. Its an ongoing thing, it never stops!
In many ways, especially when it comes to the mental health of webmasters, you are right. But when you're going down, you might actually find out why by comparing notes with other unfortunates. And when you're going up, and choose to share, you will help others by explaining what you did right.
I've assumed that everflux was Google's A/B testing of their tweaks and result sets, but I could be assuming wrong. Everything I see discussed here suggests that everflux is just datacenters settling their data, AKA the Google dance.
I think A/B testing makes more sense, since G has admitted that they do this. What's the consensus?