Forum Moderators: Robert Charlton & goodroi
A set of holiday season penalties and filters just rolled out.
[edited by: tedster at 3:21 am (utc) on Nov. 1, 2008]
I don't disagree with the Holiday Shopping Algo idea, but how could home page/sitelinks penalties be related to better holiday SERPs? It would seem to make more sense to set up a holiday shopping algo based on commerce. Does anyone know from previous years what Google was attempting to do, or did do, for its Holiday Shopping Algo that made sense?
Why would Google want to penalize just the home page (and not the entire site)?
Darnit p/g! I said i didn't want to go on any rants! :P
My working theory (which we never saw fully realized from last year's #6) is that it isn't a penalty at all.
And even perhaps they ARE testing click-thrus (just not the way everyone usually assumes)
lol i'm not saying more until i get more concrete proof.
I wouldn't want to ruin my Prediction's Score(tm) on this site ;)
no sitewide or blogpost links were built for it in the last 2 weeks. social bookmarks are completely devalued for internal pages as well.
[edited by: Strider at 12:17 am (utc) on Nov. 1, 2008]
It would seem to make more sense to set up a holiday shopping algo based on commerce.
In major PPC campaigns, businesses often target different kinds of query terms to different parts of the shopping cycle. For example, a generic product type used as a search term is most likely to appear early in the cycle - not someone who is ready to buy, but rather someone who needs top-level information.
If the search phrase includes a manufacturer's brand name, then we're talking about comparison shopping quite often. But a specific product, maybe including a model number, is quite far along the cycle - and if the word "buy" is in the phrase, then the intent is particularly clear.
I've been expecting this kind of logic to show up in the regular SERPs - and to a degree it already has. Generic terms get a lot of informational results. Maybe this current version of the SERPs is another attempt in the same direction.
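For what it's worth, the shopping-cycle logic described above can be sketched as a toy classifier. This is purely illustrative - the categories, trigger words, and brand list are my own assumptions, not anything Google has published:

```python
# Toy sketch of the shopping-cycle heuristic described above.
# All trigger words and the brand list are illustrative assumptions.

def classify_shopping_query(query: str) -> str:
    """Guess where a search phrase falls in the shopping cycle."""
    tokens = query.lower().split()
    # Explicit purchase words signal clear buying intent.
    if any(w in tokens for w in ("buy", "purchase", "order")):
        return "ready-to-buy"
    # A model number (digits mixed into a token) suggests late-cycle research.
    if any(any(c.isdigit() for c in tok) for tok in tokens):
        return "late-cycle"
    # A known brand name often means comparison shopping.
    brands = {"acme", "widgetco"}  # placeholder brand list
    if any(tok in brands for tok in tokens):
        return "comparison"
    # Otherwise treat it as a generic, early-cycle informational query.
    return "early-cycle"

print(classify_shopping_query("buy acme widget 3000"))  # ready-to-buy
print(classify_shopping_query("acme widget reviews"))   # comparison
print(classify_shopping_query("best widgets"))          # early-cycle
```

A real system would obviously use query logs and machine-learned signals rather than keyword lists, but the buckets map onto the cycle stages described above.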
But why that goal would involve dropping site home pages still eludes me. Sounds like an unintended side effect.
Also, what could have happened that the PR fell from 3 down to 2? Can someone help me out here with some kind of explanation? Is it worth going to Google, writing to them, and asking what happened and why?
"If that different set of results has a much lower count, you might be seeing a SERP with no supplemental results in it..."
Strider/G1smd.. hi. good explanations, both of them. pity i didn't count the number of results though when i had the chance. doh!
I do believe this is more than me getting a different set of results personally. I saw another relevant page of mine ranked highly which never has been previously! I've checked through several proxies here in the UK and everything is consistent, except for that single time i saw different results about an hour ago. i haven't been able to replicate this yet.
Yeah, I'm thinking roll-back of some kind, but perhaps not the usual, is part of the problem. My sites' caches seem pretty up to date but some of the SERPs look old.
> One of my internal pages that ranked at number four has just disappeared entirely!
That happened to me, too. I've had home page and internal page drops from this new Google test/bug.
On another note, I have one site that kept its sitelinks whereas another lost them. No idea why unless it's a test, i.e., Google's testing is only affecting sample sites.
p/g
[edited by: potentialgeek at 1:46 am (utc) on Nov. 1, 2008]
64.233.161.83 has very strange results....though we can't review just a single IP anymore since Google removed that.
Wait till the dust settles............
[edited by: MLHmptn at 2:21 am (utc) on Nov. 1, 2008]
> I've been expecting this kind of logic to show up in the regular SERPs - and to a degree it already has. Generic terms get a lot of informational results. Maybe this current version of the SERPs is another attempt in the same direction.
If I'm following you correctly, Google is trying to give holiday shoppers the best online shopping experience. It does this based on the intention of the user, i.e., it separates results between buyers and reviewers.
So during November and December, review/comparison sites could lose rankings and shops could get better rankings for those whom Google thinks are ready to buy?
What about sites that are in a sector of an industry which doesn't have both reviews and stores? Nobody is selling anything. Is Google yet able to distinguish between the commercial and non-commercial sites as well as commercial and non-commercial sectors? My sector provides a service and doesn't have online stores; it's advertising-based.
p/g
P.S. Has Matt Cutts or a Google rep responded to this development? I seem to recall he was late on the last similar one (#6 penalty), and then said: "I'm not aware of anything that would exhibit that sort of behavior." [seroundtable.com...]
Didn't they take some time before they figured out that bug? My concern is this one could also not be understood quickly and therefore last longer than it should before they fix it.
[edited by: potentialgeek at 2:39 am (utc) on Nov. 1, 2008]
[edited by: Strider at 2:37 am (utc) on Nov. 1, 2008]
209.85.207.nnn is from the "YA" "datacentre" -- wherever that is.
Oddly enough, those IPs, which had been stable for months, have been replaced by 74.125.95.nnn
[edited by: Atomic at 2:41 am (utc) on Nov. 1, 2008]
I can't yet grok how this goal would entail integrating two different data sets - but I certainly don't have a detailed picture of how Google moves data around, either.
Another negative to my theory is that this should be their goal year-round, not just at holiday time.
So during November and December, review/comparison sites could lose rankings and shops could get better rankings for those whom Google thinks are ready to buy?
Well, it might not be that clear cut. Review/comparison sites might get a BIGGER presence on some SERPs if the data shows Google that those query terms are not historically searched on by buyers. And the SERPs might evolve more toward buying as we dive into December.
Again, this is theoretical for me right now, and I hesitated to bring it up because I don't want to start more mythology. But now I think it's worth sharing the idea, just to see if it strikes a chord with anyone looking at other areas than the ones I watch. Just brainstorming, not hard information, in other words.
[edited by: tedster at 3:18 am (utc) on Nov. 1, 2008]
... can't yet grok how this goal would entail integrating two different data sets ...
For hardcore DC watchers - Watch closely
- what they have now
- what they have when it LOOKS like it's 2 datasets
- what the final results are.
Oops, i already said too much. =P
(If you're not up and watching early on the morning when it all shifts over in about 30-60 minutes,
you're gonna miss the juicy gold nuggets of information)
> P.S. Has Matt Cutts or a Google rep responded to this development? I seem to recall he was late on the last similar one (#6 penalty), and then said: "I'm not aware of anything that would exhibit that sort of behavior." [seroundtable.com...] Didn't they take some time before they figured out that bug? My concern is this one could also not be understood quickly and therefore last longer than it should before they fix it.
uh oh, now you've done it p/g. Set off my rant-rage. ;)
It took NO time to figure out they had messed up.
I (and i think someone else) had ALREADY figured out they had messed up their test and simply
FORGOTTEN about it.
(along the lines of my same prediction that THIS TEST should be done by Jan 2nd when they got back to work)
When Tedster DIRECTLY asked about it, MC didn't know.
BUT he asked the engineer soon after and SIMPLY DECIDED TO NOT TELL WEBMASTERS FOR OVER A MONTH.
Why did they do this?!
Read my consistent rants over the past 3 years for the answer.
That's Goog for ya.
Always organized. Always looking out for your best interest. -.-