|The Consequences of a Weekly Data Refresh on Your Business|
A data refresh could subject your business to an undesired off-on cycle!
Many of us have tried to make sense of what Matt Cutts and GoogleGuy call a Data Refresh (DR). We have come up with different explanations, and your guess could be as good as mine. But we still don't have an exact definition or know exactly what those DRs are.
So far, we have witnessed a few DRs within the last few months. However, GoogleGuy mentioned on July 27, 2006 [webmasterworld.com] that:
"There was a data refresh on June 27th that lots of people ask about, but there was also a data refresh in the last 1-2 days that refreshes the same data. Going forward, I'd expect that the cycle time would go down even more, possibly down to once a week for that particular algorithm."
Now, we have seen the effects of those recent DRs, and you may also have had time to read the many war stories in the related threads. One or more of the following could happen as a consequence of a DR:
- Loss or gain in the rankings of your competitive keywords/keyphrases
- Loss of pages from your site
- Your entire site might be subjected to off-on index cycles
Now, imagine that the above will occur once a week. Wouldn't you expect that to affect your site, and accordingly your business, to a great extent?
Do you already have an emergency plan in place? If not, you had better prepare one, IMO.
Thanks in advance for your contribution to the thread.
"Now, imagine that above will occur once a week. Wouldn't you expect that to affect your site and accordingly your business to great extent?"
I know many would disagree with me on this, but I would not have a problem with that at all. Why? Because 2 weeks of good Google SERPs is better for me than 1 month of solid SERP rankings followed by 2 months of none. So in my case, I would benefit from such a thing. In fact, it could give me an extra 2 weeks of good SERPs.
This of course would have to be a consistent thing or it would not work.
As far as an emergency plan goes, increased PPC seems to work for me, but I would much prefer organic to paid.
I think data refreshes are actually new data. I suspect that with Big Daddy and crawlability improving at Google, they are indexing more pages. Every time they dump a batch of new pages into the index, the SERPs will change.
I myself am monitoring over 20 keywords and I am watching to see if the number of indexed pages increases each time a data refresh comes along.
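That kind of monitoring amounts to recording a count per keyword after each refresh and diffing the snapshots. A minimal sketch, where every keyword and count is a made-up placeholder and the numbers would in practice be recorded by hand from the result totals:

```python
# Hypothetical snapshots of "pages indexed" per keyword, one per data refresh.
snapshots = {
    "2006-06-27": {"blue widgets": 1000, "red gadgets": 5400},
    "2006-07-27": {"blue widgets": 1200, "red gadgets": 5100},
}

def index_changes(before, after):
    """Return the indexed-page count delta for each keyword in both snapshots."""
    return {kw: after[kw] - before[kw] for kw in before if kw in after}

changes = index_changes(snapshots["2006-06-27"], snapshots["2006-07-27"])
print(changes)  # {'blue widgets': 200, 'red gadgets': -300}
```

A positive delta would support the theory that a refresh dumps new pages into the index; a negative one would suggest pages being dropped.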
My theory is simple and I will use this as an example:
Google had 8 billion pages indexed; a data refresh comes along, and now it's 10 billion pages. In this example, that would mean a 25% increase in pages added to Google, so of course the SERPs will change.
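The arithmetic in that example is just relative growth, which can be sketched as:

```python
def index_growth_pct(old_count, new_count):
    """Percentage increase in indexed pages after a data refresh."""
    return (new_count - old_count) / old_count * 100

# 8 billion -> 10 billion pages, as in the example above
print(index_growth_pct(8_000_000_000, 10_000_000_000))  # 25.0
```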
I am seeing a lot of new sites in the serps....
So, even if they do not change the algo, if a competitive keyword in the index grows by 20% (which is possible), you might see wide swings in the everflux.
Data refreshes could be new data, but they could be old data as well. On June 27th, my disappeared supplemental pages reappeared in full force and my rankings vanished. On the July 27th refresh, the rankings came back but the supplementals remain. Next month, it could all go down the drain or improve.
But yes, I have suffered through a Google dip for 6 months before, until everything came back to how it was: revenues, traffic, everything. Compared to that, a weekly or daily refresh is manna. 6 months of no traffic can drive people to suicide!
That is a very good point, wandering mind. I think that is why they are terming them "data" refreshes: it could mean new or old data.
Use this example as well:
Pages for: "Blue widgets" in google today are 1000 pages indexed for that keyword and you are number 1.
Data refresh comes along, the algo does not change but there are now 200 additional pages (1200 total) for the keyword "blue widgets"
Now, if all two hundred of those pages score better than your number one page, now your serps change and you are 201.
That's an extreme example, but imagine how it would work if there were millions of results for a competitive keyword and how new or old data could affect you.
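The displacement in the "blue widgets" example follows from how 1-based rank works: your position is one plus the number of pages scoring above you. A hypothetical sketch (the scores below are invented for illustration):

```python
def rank_of(my_score, other_scores):
    """1-based rank: one plus the number of competing pages scoring higher."""
    return 1 + sum(1 for s in other_scores if s > my_score)

my_score = 0.90
old_index = [0.5] * 999             # 999 existing competitors, all scoring lower
print(rank_of(my_score, old_index))              # 1

new_pages = [0.95] * 200            # refresh adds 200 pages that all outscore you
print(rank_of(my_score, old_index + new_pages))  # 201
```

Of course, in a realistic refresh only some of the new pages would outscore an established page, so the actual shift would land somewhere between 0 and the full 200 places.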
There is still much to learn about how this latest GG/Matt teaser will work. But I also feel more comfortable with more frequent refreshes. My nightmare involves my site getting torpedoed by a major update with little chance of improvement until the next major update, who knows how long away. If I lose pages, backlinks, and/or position during a weekly change, I would at least think I'd have a shot at fixing it by the next week.
"I myself am monitoring over 20 keywords and I am watching to see if the number of indexed pages increases each time a data refresh comes along."
I have been tracking more than 100 keywords over a one-year time frame, and in my stats for those keywords it's drastic every 3 months, then 30 days later there is another drastic one.
I also agree with wandering mind: 6 months out, then back, is crazy.
I would pray for a weekly data refresh in that case.
Then if that's the case, that is why every three months we see wide swings. The net changes a lot in 3 months; maybe once they get it down to weekly, the data will level out a bit and we will see more subtle changes.
On the flip side, Google also gets rid of pages (404s), and probably duplicate pages as well.
Example, ended ebay auctions, duplicate content on amazon, etc....
Agreed. I have been watching this, and that seems to be the case (at least for me when they were doing it). However, I would rather take a weekly drastic data refresh so it would be a little more balanced in the long term, if that makes sense.
New pages are added to the index on a daily basis and rank immediately. Why would Google need a monthly or weekly data refresh to add new pages?
"The net changes a lot in 3 months; maybe once they get it down to weekly, the data will level out a bit and we will see more subtle changes."
That depends a bit on crawl priority and PageRank.
Example: Fox News writes a story today and has it on their home page (a good PageRank site). Google indexes it in a day or two.
You put a repair part on your website and link to it from a level-4 page with a PR of 1, and it will take a lot longer.
I am sure there is a point somewhere that determines the crawl priority and when the page will show up in the index.
I really don't see why weekly refreshes are inherently riskier than any other time period.
I mean, as long as I have good rankings I don't want any changes, ever. But as soon as something bad happens I'll be happier this way. It's a lot easier to hold your breath for a week than a month.
|Now, if all two hundred of those pages score better than your number-one page, your SERPs change and you are number 201... |
And how would these new pages "score" better... with no history, click-throughs, or usability data? If they show up with more links than an established site that has a definitive history and serves that sector well... then G would likely penalize rather than rank... so you may not see your site move 200 places... but get bumped around a bit... for sure...
The objective behind data refreshes definitely would not be to bring quality sites down. But that happens. So what IS the objective that data refreshes are trying to accomplish, and how do they happen? Are they just including newly indexed pages? Are they including old pages selectively, using certain criteria? Exactly how do data refreshes differ from our famous old updates, the Google dance? As far as we know, a data refresh is not an algo refresh/update. So how do they differ from the Google dances of the past?
Try as you might, it is not possible to make a business plan when visitors to a site swing between such extremes.
One can ignore Google traffic and plan according to the lows of the traffic. But then, whether I wish for it or not, it swings back and multiplies by around 400%, at which point I am very happy, but it throws my tax plan into a huge mess. Say, when Google traffic is low, I can hire two journalists for my news site. Then it shoots up, and to keep my visitors happy, I need 10 of them. 20,000 people on my main news website is a big responsibility; they demand features, better quality, more daily updates. If I hire those 10 journalists, there is no guarantee that things will remain so good after the next data refresh!
Yes, we need to take data refreshes into account in our planning. But are data refreshes themselves going to be this dramatic always?
That was an example, but keep in mind that those 200 new pages getting put into the index could have been on the web for a while but never indexed by Google. So there could be history data attached to them.
I think it will level a bit as they get in a smaller cycle.
Ever wonder if they have been using 2 slightly different data sets since Big Daddy? The changes have always been drastic for multiple sites, but little by little they may be working those 2 slightly different sets of data together with each new run of that particular algorithm.
Note that this one-month cycle, in the future a one-week cycle, is just "for that particular algorithm".
So this algorithm is probably about re-evaluating the quality of domains, or about detecting spammy domains; that would explain all the pages of a site losing or recovering positions.
I suppose new pages added to a domain will still rank between cycles, as has happened until now.
I think there are many different data sets
Maybe even weekend sets
Sets for companies like Comcast, etc.
Hmm, sounds surprisingly reminiscent of the old Google Dances, yes?
Didn't the monthly google dances end when the first "updates" started?
To me this again confirms the theory that the "updates" were in fact stop-gap measures for storage issues.
Does this mean we will no longer have "updates" like we used to?
This is my theory and I'm sticking to it. ;)
Hopefully in the long run, we will no longer see sites go missing for 6 months because the newest update tries to patch up the reason quality sites got lost in a previous update.
We might even see an end to many of the filters that we lovingly call the "sandbox".
If so, I will gladly proclaim G the greatest SE once again. Only time will tell tho' ;)
I also think that at least one part of data refreshes is the reinclusion of penalized/banned pages/sites, although there may be a lot more to it as well. That would fit nicely into the 3 month/6 month/9 month cycle that most penalties/bans are set for.