I did the rounds to check on the state of various data updates. I'd estimate that the "0.5" (not algorithmic changes, but rather responses to various spam/porn complaints + processing reinclusion requests) should go out this weekend sometime or possibly Monday. There should be a binary push this week to improve a corner case of CJK-related search, and that new binary should have the hooks to turn on the third set of data. Regarding finishing up the second piece of data, there are still two data centers with older data. Those data centers will probably be switched over by Monday. By Monday, 2.5 of the 3.5 things will probably be on.
Also if I search on my URL, the description comes from the google directory, not from my website.
Anyone know what is going on?
>Things tend to happen when I want to go to bed.<
Me too. So I just went out shopping and purchased 200 g of instant coffee, plus 225 g of Cappuccino, just in case ;-)
However, it seems this Bourbon boy hasn't had much effect on my site ranking or number of referrals. The total average ranking of my pages is just the same.
Btw, has any of your sites returned home yet?
also if anyone could shed light on the following please.
I use a cms that configures the internal link to my home page not in the standard way i.e. not "index", "default", "home" etc - in my case www.mysite.com/content/en/indexpage.html.
This way I presume I might trigger duplicate content penalty for having identical content for what is in fact 2 different urls - www.mysite.com and www.mysite.com/content/en/indexpage.html
I hear voices in the forum that inconsistent linking throughout the site, like using www and non-www absolute links to the same pages within the site has caused a lot of trouble and serious drops at the Bourbon update.
Anyone here experiencing the same?
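If the CMS itself can't be changed, one common fix is a server-side 301 from the duplicate path to the root, so only one URL ever serves the home-page content. A minimal sketch, assuming Apache with mod_rewrite in an .htaccess file (the path is the one from the post above; adjust it to whatever your CMS actually emits):

```apache
# Hypothetical .htaccess sketch: permanently redirect the CMS's
# long home-page URL back to the root, leaving one canonical URL.
RewriteEngine On
RewriteRule ^content/en/indexpage\.html$ http://www.mysite.com/ [R=301,L]
```

The 301 (rather than 302) status is the part that matters: it tells Googlebot the duplicate address is permanently gone, so any link credit should flow to the root URL.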
Haven't got a clue regarding which DC is most up to date.
With reference to the duplicate content question.
These are my thoughts (people will disagree):
If you have exact duplicate content on the same (sub-)domain then I don't think you will have a problem - one will just get filtered out of the SERPs.
When you have exact duplicate content on more than one sub-domain or domain then that is slightly different and problems may arise. Which is where the non-www and the www problem may arise.
Therefore - personally I am not too worried about links or Googlebot finding www.domain.com, www.domain.com/index.htm and www.domain.com/index.html on my site. But I'm much more worried about G finding domain.com/index.html versus www.domain.com/index.html.
Just my current thoughts (may get changed in time though)
Just a thought... there are some "city-name-widgets" MONEY KWs with a strong filter at the top 10 for some exclusive guys-and-dolls sites. Funny, though: those guys-and-dolls sites are bulletproof in every update... (and I'm talking about sites with absolutely NO CONTENT AT ALL, just bookings - travel and hotel sites, by the way).
I don't know what a "guys and dolls" site is, but I've got a handful of hotel pages with affiliate links for [cityname_1 category widgets] that have disappeared completely from the index in the Bourbon update even though their counterparts for [cityname_2 category widgets] are ranked #2.
A similar page for [cityname_3 category widgets] is mentioned in the pirated eval.google.com document (see [webmasterworld.com...]) as an example of a "value-added" [widgets] page with affiliate links. That page is currently #1 for the [cityname_3 category widgets] search phrase, as are my other pages for various [category widgets] in [cityname_3].
This suggests that any filter for affiliate pages in the hotel category is being applied both selectively and inconsistently. And what's ironic is that the missing [cityname_1] hotel pages are accompanied by far more "value-added" city-related editorial content than are the comparable pages for [cityname_2] and [cityname_3].
Check this: [webmasterworld.com...] - apparently the pages are rated individually by Google raters. Your site was mentioned as a good example, but it also said to check each page. Maybe this is it.
I would have thought that is much more likely.
I personally think that people are reading too much into eval.google. It's just a user test/survey type thing IMO (but that's a conversation for another thread).
And one more thing! Four of my sites are now suddenly in the top 10 for that keyword instead of just two! One dropped from #1 to #6 though, which is not good. One is not related, three are related to the KW. Weird...
Ummm, FYI, 'everflux' is what goes on normally between the updates.
> fat lady
Reseller, ya had to go and rub that in, didn't ya? ;-)
> When my order forms start ranking well I usually know Google has swung those knobs a little too far.
outland88, yes, the more we look, the more examples we find of top results that probably won't last, like your forms reference.
Still seems at this point that there will be more substantial changes to come. I don't expect that some of my old dusty sites will continue to outrank previously much higher ranking sites of higher quality, but that's what is happening right now. Good for me if it stays this way, but not good for the SERPs really.
So as noted before, seems likely that more filters and hurdles are still to be added back.
IF that is wrong (and as a reminder, almost everything said in here is opinion that might be wrong), then a bunch of sites/pages will need to drop out of top spots over the coming weeks, because there are currently too many high-ranking pages that won't generate the click patterns that G looks for from the leaders in the SERPs.
Categories of results I see that will need to go away soon (for various reasons) include:
Also, there are pages not doing well right now that might be expected to reappear over time (aside from the www/non-www and redirect issues in need of fixing):
All conjecture and preliminary observation, but that's the only fun part of these update threads. Otherwise, we're left with nothing but rants and minutiae.
To add to his list of pages that may (or probably will) not remain:
I will be amazed if the supplementals dated November and backwards last year hold these positions when all is done. (They will either get recrawled or dropped - I would have thought)
I am really waiting for some people to post that their re-inclusion requests have been successful.
He quite clearly says changing is a good idea. Having the 301 is a positive step, but it's silly to think it's some massive coincidence that all these problem sites have canonical issues in common.
Steve, can you please explain this "canonical issue"? I'd like to know if I have any of these.
With reference to re-inclusion requests:-
GG answers to a Q from JudgeJeffries in the Q&A thread.
JudgeJeffries, I see reinclusions go through, and there's a fair number of them. Once it reaches a person to investigate, I'd budget four weeks up to six weeks for it to show up.
This is the best place to send a reinclusion request (I believe).
Although if you have already sent one then perhaps it is a case of patience.
The reinclusion requests I am talking about stem from GG post (the start of this thread - part 4)
I would imagine that it would be dealing with older requests than ones processed since May 21st.
Canonical URLs (I hate doing a definition of this - but if you do a search of Webmasterworld then you can see it has been discussed in depth)
But the idea of you 301 redirecting from your non-www to the www is to fix that problem (If you have it)
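For anyone unsure what that looks like in practice, here is a minimal sketch of the non-www to www 301, assuming Apache with mod_rewrite (example.com is a placeholder; substitute your own domain):

```apache
# Sketch: 301 every request on the bare hostname over to the
# www hostname, so Googlebot only ever sees one canonical host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The RewriteCond fires only when the Host header is the bare domain, so requests that already use www pass through untouched; everything else gets a permanent redirect to the same path on the www host.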
I renew my domain yearly.
I mentioned May 21st because all was well on Friday May 20th, and it was about 3 AM on the 21st when I realized I was gone from the G index.
Ok, then I guess I don't have the canonical URL issue anymore since I made mine a 301 (and still awaiting input from Japanese as to why the 301's are now bad).
We renew yearly. We have always had good rankings until Bourbon (but I can already see things settling - 0.5 perhaps). However, we've been at that domain for over 5 years. Surely Google Almighty sees the connection and wouldn't penalize a site for its business practice (renewing yearly) when it's been around for a while.
My point is, that I doubt it has anything to do with the recent tanking of several sites. There may be a slight penalty in there, but I doubt it's related to the Bourbon Massacre.
Just my observations. Like any car wreck, every witness sees it differently.
I've identified three, maybe four, possible causes for our G referral collapse:
1) www vs. non-www issue. This has since been fixed.
2) Existence of legacy cobrands with duplicate content. This has since been fixed.
3) The recent addition to the site of a large block of "thin affiliate" pages - while these pages only account for about 2% of our total pages, the number is still significant. These pages are nothing more than Title, db driven stock descriptions, address, and affiliate link. A penalty related to this might also be called the "too much too fast" penalty referenced elsewhere - too many new pages in too short a period of time.
I am starting to lean strongly towards thinking it's number 3 that has dinged us - not only resulting in poor rankings for those new pages (rightfully), but also affecting the rest of the site. Couple of reasons that I think this:
- we have not received a note from G saying that we have not been penalized - we have received a form letter referring us to the Webmaster guidelines. And the guidelines say specifically: "Avoid .... 'cookie cutter' approaches such as affiliate programs with little or no original content."
- When I search for SiteName (without .com), the www version correctly shows as the first result. It doesn't look to me like G is confused between www and non-www. It just looks like G has degraded the www home page's importance (a search for the unique, non-competitive home page title puts us on the second page of results).
- The dead cobrands have never gotten us in trouble before. They are ancient (five years old), not fresh, and not driving any traffic. I think it would be clear to an automated system like G's that these are not spam pages.
I plan to robots.txt these new affiliate pages to hide them from the search engines, but to leave them for our user base.
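For reference, the robots.txt entry would look something like the sketch below, assuming the affiliate pages all live under a single directory (the /affiliates/ path here is hypothetical; substitute whatever directory your site actually uses):

```txt
# Hypothetical robots.txt entry: keep all crawlers out of the
# thin-affiliate section while leaving it live for site users.
User-agent: *
Disallow: /affiliates/
```

Keep in mind robots.txt only blocks crawling; a disallowed URL can still show up in the index as a bare URL-only listing if other pages link to it, so a noindex meta tag on the pages themselves is the surer way to keep them out entirely.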
Anyone have experience with this sort of issue and care to offer an alternative solution?
I don't think it was that cut and dried. It's more like "one of the factors", but honestly, now that everyone knows, I doubt they'll even use it at all. If you have a site that can make you thousands a month, spending less than $100 for a 10-year renewal is nothing. Now only non-SEOs will renew one year at a time. :)
Questions - Do you appear for your old "high rank" searches if "list omitted results" is clicked?
Do you find a large number of sites that reference YOUR material but appear ABOVE you in results?
Our G refs collapsed after Allegra; we fixed www vs. non-www and fixed some dupe content items. We also have what might be called "thin affiliate" pages for hotel listings and have left them up, though I think you may be on to something.
We have THREE notes from Google saying that we have no penalty on our site. I no longer believe them since I can search for a totally obvious result like "oursite.com in ourcity, ourstate" and appear ONLY by clicking "list omitted results", plus our referrals are at 1% of former level while Yahoo referrals increase.