Many of us have tried to make sense of what Matt Cutts and GoogleGuy call a Data Refresh (DR). We have come up with different theories, and your guess could be as good as mine. But we still don't have an exact definition or know exactly what those DRs are.
Until now, we have witnessed a few DRs within the last few months. However, GoogleGuy mentioned on July 27, 2006 [webmasterworld.com] that:
"There was a data refresh on June 27th that lots of people ask about, but there was also a data refresh in the last 1-2 days that refreshes the same data. Going forward, I'd expect that the cycle time would go down even more, possibly down to once a week for that particular algorithm."
Now, we have seen the effects of those recent DRs, and you might also have had time to read the many war stories in different related threads. One or more of the following could happen as a consequence of a DR:
- Loss or gain in rankings for your competitive keywords/keyphrases
- Loss of pages from your site
- Your entire site might be subjected to off-on index cycles
Now, imagine that the above will occur once a week. Wouldn't you expect that to affect your site, and accordingly your business, to a great extent?
Do you already have your emergency plan in place? If not, you'd better prepare one, IMO.
Thanks in advance for your contribution to the thread.
I know many would disagree with me on this, but I would not have a problem with that at all. Why? Because 2 weeks of good Google SERPs is better for me than having 1 month of solid SERP rankings and then 2 months of none. So in my case, I would benefit from such a thing. In fact, it could give me an extra 2 weeks of good SERPs.
This of course would have to be a consistent thing or it would not work.
As far as an emergency plan goes, increased PPC seems to work for me, but I would much prefer organic to paid.
I myself am monitoring over 20 keywords and I am watching to see if the number of indexed pages increases each time a data refresh comes along.
My theory is simple and I will use this as an example:
Google had 8 billion pages indexed; a data refresh comes along, and now it's 10 billion pages. In this example that would mean a 25% increase in pages added to Google, so of course the SERPs will change.
I am seeing a lot of new sites in the serps....
So, even if they do not change the algo, if a competitive keyword in the index grows by 20% (which is possible), you might see wide swings in the everflux.
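The arithmetic behind the example above can be sketched in a few lines. The 8-billion and 10-billion figures are the illustrative numbers from the post, not real index counts:

```python
# Hypothetical sketch: how a jump in indexed pages translates into a
# percentage increase, using the made-up numbers from the example above.

def index_growth(before: int, after: int) -> float:
    """Return the percentage growth of the index between two refreshes."""
    return (after - before) / before * 100

# 8 billion pages before the refresh, 10 billion after -> a 25% increase
growth = index_growth(8_000_000_000, 10_000_000_000)
print(f"{growth:.0f}% more pages in the index")  # prints "25% more pages in the index"
```

The same function shows why a 20% jump in pages for one keyword could shake up its results even with an unchanged algo.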
But yes, I have suffered through a Google dip for 6 months before, until everything came back to how it was: revenues, traffic, everything. Compared to that, a weekly or daily refresh is manna. 6 months of no traffic can drive people to despair!
Say there are 1,000 pages indexed in Google today for the keyword "blue widgets" and you are number 1.
A data refresh comes along; the algo does not change, but there are now 200 additional pages (1,200 total) for the keyword "blue widgets".
Now, if all two hundred of those pages score better than your number one page, your SERPs change and you are number 201.
That's an extreme example, but imagine how it would work if there were millions of results for a competitive keyword, and how new or old data could affect you.
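The "blue widgets" example boils down to simple displacement: if every newly indexed page outscores yours, you drop by exactly that many places. A minimal sketch, with all numbers taken from the example:

```python
# Toy model of the displacement in the "blue widgets" example: assumes
# (purely for illustration) that every one of the new pages outscores yours.

def new_rank(old_rank: int, better_new_pages: int) -> int:
    """Rank after a refresh, if `better_new_pages` pages all outscore you."""
    return old_rank + better_new_pages

# 200 new pages enter the index and all score better than your #1 page
print(new_rank(1, 200))  # prints 201
```

In practice only some fraction of new pages would outscore an established one, so the real movement would land somewhere between 1 and that worst case.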
I have been tracking more than 100 keywords over a one-year time frame, and in my stats for those keywords there's a drastic change every 3 months, then another drastic one 30 days later.
I also agree with wandering mind; 6 months out, then back, is crazy.
I would pray for a weekly data refresh in that case.
If that's the case, that is why we see wide swings every three months. The net changes a lot in 3 months; maybe once they get the cycle down to weekly, the data will level out a bit and we will see more subtle changes.
On the flip side, Google also gets rid of pages (404s), and probably duplicate pages as well.
For example, ended eBay auctions, duplicate content on Amazon, etc.
That depends a bit on crawl priority and PageRank.
For example, Fox News writes a story today and has it on their home page (a site with good PageRank). Google indexes it in a day or two.
You put a repair part on your website and link to it from a level-4 page with a PR of 1; it will take a lot longer.
I am sure there is a score somewhere that determines the crawl priority and when the page will show up in the index.
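Nobody outside Google knows what that score is; the formula below is purely an assumption, a toy way to express the intuition that high PageRank plus a shallow link depth gets crawled sooner:

```python
# Toy crawl-priority score: made-up formula, NOT Google's actual scheduling.
# It only encodes the intuition above: higher PageRank and shallower link
# depth -> higher priority -> crawled and indexed sooner.

def crawl_priority(pagerank: int, depth: int) -> float:
    """Hypothetical score: PageRank discounted by link depth."""
    return pagerank / (1 + depth)

# Fox News home-page story: high PR, depth 0 -> high priority
print(crawl_priority(9, 0))  # prints 9.0
# Repair part linked from a level-4, PR-1 page -> much lower priority
print(crawl_priority(1, 4))  # prints 0.2
```

Any monotonic combination of the two inputs would tell the same story; the specific division is arbitrary.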
I mean, as long as I have good rankings I don't want any changes, ever. But as soon as something bad happens I'll be happier this way. It's a lot easier to hold your breath for a week than a month.
Now, if all two hundred of those pages score better than your number one page, your SERPs change and you are number 201...
And how would these new pages "score" better, with no history, click-throughs, or usability data? If they show up with more links than an established site that has a definitive history and serves that sector well, then G would likely penalize rather than rank them. So you may not see your site move 200 places, but you'll get bumped around a bit, for sure.
Try as you might, it is not possible to make a business plan when visitors to a site swing between such extremes.
One can ignore Google traffic and plan according to the lows. But then, whether I wish for it or not, it swings back and multiplies by around 400%, at which point I am very happy, but it throws my tax plan into a huge mess. Say, when Google traffic is low, I can hire two journalists for my news site. Then it shoots up, and to keep my visitors happy, I need 10 of them. 20,000 people on my main news website is a big responsibility; they demand features, better quality, more daily updates. If I hire those 10 journalists, there is no guarantee that things will remain so good after the next data refresh!
Yes, we need to take data refreshes into account in our planning. But are data refreshes themselves going to be this dramatic always?
That was an example, but keep in mind that those 200 new pages getting put into the index could have been on the web for a while but never indexed by Google. So there could be data there.
I think it will level out a bit as they get into a smaller cycle.
So probably this algorithm is about re-evaluating the quality of domains, or about detecting spammy domains; that would explain all pages of a site losing or recovering positions.
I suppose new pages added to a domain will rank between cycles, as has happened until now.
Hmm, sounds surprisingly reminiscent of the old Google Dances, yes?
Didn't the monthly google dances end when the first "updates" started?
To me this again confirms the theory that the "updates" were in fact stop-gap measures for storage issues.
Does this mean we will no longer have "updates" like we used to?
This is my theory and I'm sticking to it. ;)
Hopefully in the long run, we will no longer see sites go missing for 6 months because the newest update tries to patch up the reason quality sites got lost in a previous update.
We might even see an end to many of the filters that we lovingly call the "sandbox".
If so, I will gladly proclaim G the greatest SE once again. Only time will tell tho' ;)