Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Dealing with the consequences of Bourbon Update

Which changes has Bourbon brought about & How to deal with them?


reseller

3:41 pm on Jun 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Assuming that the greatest part of the latest Google update (Bourbon) is completed, it's rather important to do some damage assessment, study the changes brought about by Bourbon, and suggest ways to deal with them.

We need to keep this thread focused on the following:

- Changes in your own site's ranking in the SERPs (lost and gained positions, or disappearance of the site).

- Changes you have noticed in the new SERPs (both google.com and your local Google site), especially with regard to the nature of the top 10 or 20 ranking sites.

- Stability of the SERPs, i.e. do you get the same SERPs when you run the same query within the same day or on 2-3 successive days (both google.com and your local Google site)?

- Effective, ethical measures to deal with the above-mentioned changes.

Thanks.

outland88

5:56 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The new redesign may crash and burn, Reseller. I keep a backup in case I get hit too hard. I do it for customers and freshness, not for Google. No, I don't shift domains.

66sore

5:57 am on Jun 14, 2005 (gmt 0)

10+ Year Member



Sorry to 'break the rhythm' of this thread, but I am dealing with the consequences of Bourbon by deciding to tweak my text for Yahoo.

One of my sites found its way to page 1 on Yahoo for a 100,000 result "five word pop culture phrase" resulting in significant traffic to my site.

This same "five word phrase in quotes" does not even show up in the first 200 results on Google.

In several respects, Google is broken, and I will let the world know - and start looking to new "alternatives" like Yahoo (& MSN).

reseller

6:06 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



66sore

>Sorry to 'break the rhythm' of this thread, but I am dealing with the consequences of Bourbon by deciding to tweak my text for Yahoo.<

Good point!

Maybe we can add a new point to our list of how to deal with the consequences of Bourbon:

-Optimize your site for other search engines (like Yahoo, MSN ..)

reseller

6:18 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Updating the list of suggestions this thread has received for dealing with the consequences of Bourbon:

- Do nothing, as more changes are probably on the way

- Make subtle page changes and monitor SERP movements

- Do a 301 redirect for yoursite.com vs. www.yoursite.com (the canonical URL problem)

- Remove 302 redirects

- Remove duplicates

- Optimize your site for other search engines (like Yahoo, MSN ..)

- Transfer your affected site to a spare/emergency domain

Any other suggestions?
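Two of the items above - the canonical 301 and removing 302s - can be checked from the outside by looking at what status code a server actually returns. A minimal sketch (the hostnames are placeholders, not recommendations):

```python
# Sketch: check whether a host answers with a 301 (permanent) or 302
# (temporary) redirect, e.g. to verify a non-www -> www canonical fix.
# Hostnames below are placeholders only.
import http.client

def redirect_kind(status):
    """Classify an HTTP status code for redirect auditing."""
    if status == 301:
        return "permanent (good for canonical fixes)"
    if status in (302, 303, 307):
        return "temporary (search engines may keep the old URL)"
    return "not a redirect"

def check(host, path="/"):
    """Fetch headers only and report the status and Location target."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

if __name__ == "__main__":
    for host in ("example.com", "www.example.com"):
        status, location = check(host)
        print(host, status, redirect_kind(status), location)
```

Once the canonical fix is in place, one of the two hostnames should answer with a 301 whose Location points at the other.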

stu2

6:51 am on Jun 14, 2005 (gmt 0)

10+ Year Member



fearlessrick

Yup, that was what I was wanting... the 26 steps. I'd seen it mentioned a few times but couldn't find it.

Dayo_UK

7:45 am on Jun 14, 2005 (gmt 0)



Morning All.

Re: 64.233.183.99 & 64.233.183.104

These are the DCs that GG said to watch for new SERPs - however, IMO they look like they have followed the other DCs this morning rather than leading the way.

reseller

8:04 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Dayo_UK

>Re: 64.233.183.99 & 64.233.183.104

These are the DCs that GG said to watch for new SERPs - however, IMO they look like they have followed the other DCs this morning rather than leading the way.<

And should I understand that Dayo_UK is still as happy with the SERPs today as he was yesterday ;-)

Dayo_UK

8:06 am on Jun 14, 2005 (gmt 0)



Bit early to tell - really looking at logs rather than serps.

Have not heard from GG in a while in the forums :( - hope he is ok and not put off by all the negativity.

reseller

8:13 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Dayo_UK

>Have not heard from GG in a while in the forums :( - hope he is ok and not put off by all the negativity.<

GG is used to the "negativity", and we are used to decoding GG's lines ;-)

I guess the majority on these forums highly appreciate GG's feedback.

Btw, word on the street is that GG will be at the PubConference. Look for a "Larry" there ;-)

Dayo_UK

8:22 am on Jun 14, 2005 (gmt 0)



>>>>I guess the majority on these forums highly appreciate GG's feedback.

Yep.

>>>Look for a "Larry" there

Larry? Lol - don't think so.

Johan007

8:34 am on Jun 14, 2005 (gmt 0)

10+ Year Member Top Contributors Of The Month



What many of us will never know, when our sites recover, is whether it was a bug or whether our actions actually made the difference. I think our actions did. The one positive thing to come out of it is that many people have learnt a lot of SEO - I think I have learnt so much I can add SEO to my web design CV.

I had fully functional shops (running off the same product database as other sites) and will most likely be running them off a new URL with Google excluded - or do people here recommend putting them on a subdomain with Google excluded? Maybe I should just kill it altogether :(

I think many of us have questions for GoogleGuy (it's a shame so many people have posted naff questions in the thread)

Dayo_UK

9:18 am on Jun 14, 2005 (gmt 0)



IMO it looks like Google are sorting out Canonical URLs. I would suggest that [64.233.161.105...] is one of the more advanced dcs for sorting out this issue.

Once that issue is sorted out hopefully traffic will return for some of us. :) Some of us need a really good crawl though.

If so - well done Google (esp GG for taking feedback to where it matters)

Natko

9:22 am on Jun 14, 2005 (gmt 0)

10+ Year Member



For 'yacht croatia', 6 of the top 10 are spam sites (hidden text, mostly) that were nowhere before Bourbon; one of them has been 'under construction' for months.
It has nothing to do with 302s, aggressive link acquiring or whatever else has been mentioned here.
I think G has become a SE of a new generation: a spam engine.

Pico_Train

10:59 am on Jun 14, 2005 (gmt 0)

10+ Year Member



What some of us really need is for the 302 and cached page issue to be corrected.

We also need people to stop using 302s.

Then we need people to stop copying and pasting content.

Then we need a good crawl and things to be sorted out.

Then, maybe then, I might make the money I did 3 months ago.

Right now, it looks like I've lost 20000 a year and my retirement fund.

Absolutely fantastic.

Johan007

11:02 am on Jun 14, 2005 (gmt 0)

10+ Year Member Top Contributors Of The Month



IMO it looks like Google are sorting out Canonical URLs. I would suggest that [64.233.161.105...] is one of the more advanced dcs for sorting out this issue.

I think not! [64.233.161.105...] is terrible for me - even though it's very US .com focused, it still has my site penalised.

Borek

11:16 am on Jun 14, 2005 (gmt 0)

10+ Year Member



We also need people to stop using 302s.

At least in some cases it is done in good faith - PHP's header("Location: new_url") sends a 302 by default. No idea why not a 301, but if you are new to the game you are likely to make this mistake, as I did :(
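Borek's point generalises beyond PHP: many environments default a bare redirect to 302 unless you state the status explicitly (in PHP you can force it with header("Location: new_url", true, 301)). As a language-neutral sketch of the same idea, here is a tiny handler that sends the 301 deliberately - the port and target domain are placeholders:

```python
# Sketch: always state the redirect status explicitly instead of relying
# on a framework default. 301 = moved permanently, 302 = found (temporary).
# The target host and port below are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "http://www.example.com"  # placeholder canonical host

class CanonicalRedirect(BaseHTTPRequestHandler):
    def do_GET(self):
        # Send an explicit 301 so crawlers transfer the old URL's standing
        # to the canonical one, rather than an accidental 302.
        self.send_response(301)
        self.send_header("Location", TARGET + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), CanonicalRedirect).serve_forever()
```

The design point is simply that the status code is written out, not inherited from whatever the platform happens to emit.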

sailorjwd

11:21 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I had done a really stupid thing back in Feb.

I renamed several pages that were being hit hard by scrappy scraper sites and had gone URL-only. This was an attempt to break them loose from the scrapers.

I didn't rewrite the content of the pages, so that is likely why Google thinks the scraper sites' content is 'original', since their pages were created earlier.

So now I'm adding/rewriting content on the pages - even though all main pages have been fully indexed within last 10 days.

I also noticed one page in particular... This page only had 3 paragraphs of text (~250 words), several bullet lists, and several screenshot images.

A scraper site had copied 1.5 paragraphs and used them on 3 of their pages. This is the only 'main page' that is not indexed, and it is obvious that G thinks the scraper site has the original content.

So I've had to rewrite the content and add some. The moral of this story is that long pages (>700 words) likely won't be affected by a scraper taking a large paragraph, but smaller pages will. I guess they have a % duplicate test.
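sailorjwd's percentage idea is speculation, but it is easy to see why page length would matter under any overlap-style test. A toy sketch - the 5-word shingle size and any threshold you would apply to the percentage are my assumptions, not anything Google has published:

```python
# Toy sketch of a % duplicate test using 5-word shingles. The shingle
# size and any threshold are illustrative assumptions only.
def shingles(text, size=5):
    """All runs of `size` consecutive words, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def overlap_pct(page, other):
    """Fraction of `page`'s shingles that also appear in `other`."""
    own = shingles(page)
    if not own:
        return 0.0
    return len(own & shingles(other)) / len(own)

copied = "one two three four five six seven eight nine ten"
short_page = copied + " plus a little extra text here"
long_page = copied + " " + " ".join("word%d" % i for i in range(200))

# The copied text is most of the short page but a sliver of the long one.
print("short page overlap: %.0f%%" % (100 * overlap_pct(short_page, copied)))
print("long page overlap: %.0f%%" % (100 * overlap_pct(long_page, copied)))
```

The same copied paragraph accounts for roughly half of the short page's shingles but only a few percent of the long page's, which matches the observation that only the short page dropped out.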

sailorjwd

11:38 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Pico,

Ditto here, but add a zero to that number :(

Dayo_UK

11:51 am on Jun 14, 2005 (gmt 0)



[64.233.161.105...] - I agree the SERPs aren't there, but the way G is handling the canonical URL issue seems a bit better there (at least from what I am seeing)

bumpski

11:56 am on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sailorjwd

But did the scraper site also copy your page's Title or Description and make it part of the content? This is very common for scraper pages. If so, Google can eventually figure out that your page is the original. Unless the scraper copies your Title and Description and makes an identical page, your page should win out in the long run if a "sort" is being done.
That's why I believe Google is, and has been, doing a long-winded bubble sort of web pages, actually checking the uniqueness of Titles and Descriptions. It's going to take a long time; they will keep kicking sites out of the index and then crawling them back in when they think they have a true original Title and Description. The scraper sites help by providing a link (or at least what looks like a link) to the source of the original Title and Description.
I know this idea is far-fetched, but more and more the symptoms and evidence bear it out. My primary site was down to 20% indexed pages; now it's back up to 60%. Before the update it was 100% indexed for 2 years.
Given all this, I realised there is one important thing to do!

1. Make sure your site doesn't look in any way like a scraper. When thinking about this I realised I had a lot of index pages linking to my articles. There was a paragraph describing each article - and what did I put at the end?

Read more .... LINK

Looks like a copied snippet, and I said, that's not a good idea! Even though my content is all original, in some cases my index pages looked a little like scraper pages. So needless to say, a lot of "....." strings are gone from my pages.

2. I'd also think that at this time a sitemap.txt or sitemap.xml would help Google sort out which is truly the correct URL to use to fetch your pages. You're telling Google "www.google.com", not "google.com", when you ultimately submit the link to your sitemap.txt or .xml. This can help Google get rid of those redundant copies of your own pages (although I don't think they've started yet..... (oops, scraper-like)).
SEE Google:
[google.com...]
then sign up if you haven't, and submit a sitemap.

To sum up, and to add to the thread's to-do list:
1. Be patient - Google is sorting out originals by Title and Description?
2. Don't look like a scraper.
3. Help Google out with a sitemap.
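To make the sitemap point concrete, here is a minimal sitemap.xml in the Google Sitemaps 0.84 format current at the time - the domain and dates are placeholders, and the key detail is that every <loc> consistently uses the www form you want treated as canonical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Placeholder domain; every loc uses the www form consistently -->
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-14</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/widgets.html</loc>
    <lastmod>2005-06-01</lastmod>
  </url>
</urlset>
```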

bumpski

12:21 pm on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Too late to edit, so:

Just to follow up: if my crazy theory were correct, here are some ..... don't-do's:
1. Don't change your page's file name or path.
2. Don't change your titles (for now).
3. Don't change your descriptions (for now).
4. Only change the content on existing pages a little.
These are the only things, at least today, that make your site unique and are easy to sort.

walkman

12:49 pm on Jun 14, 2005 (gmt 0)



>> And should I understand that Dayo_UK is still happy for the serps today as he was yesterday ;-)

Count me in the very happy camp too. I had no dupe or canonical issues, though; my domain was just "sandboxed". Now I'm doing very well on all DCs.

My suggestion is to wait till GoogleGuy announces that the fat lady has sung. Anything you do while in panic mode will come back to bite you.

Clint

12:56 pm on Jun 14, 2005 (gmt 0)



if that taft tool is accurate...thank google for filters. my site goes from top of the serps to oblivion with the filters off.

OldPro, I don't think it is. I get the exact same results with OR WITHOUT the filter! Figure that one out.

Clint

12:57 pm on Jun 14, 2005 (gmt 0)



So, you guys that are seeing positive things happen, what did you do?

Dayo_UK

1:07 pm on Jun 14, 2005 (gmt 0)



I wouldn't call myself very happy - just seeing encouraging signs.

Clint - it could be my 301 kicking in or it could be Google sorting things out.

Clint

1:30 pm on Jun 14, 2005 (gmt 0)



Can someone please explain what exactly allinurl: does or means? I just tried it on my domain, and without the www G shows about 300 results, yet displays only 9 on the page, with that "In order to show you the most relevant results, we have omitted some entries very similar to the 9 already displayed. If you like, you can repeat the search with the omitted results included" message below them.

I tried it with the www and got over 200 results, with again only one page of 5 hits showing this time.

Check out some of the results G shows for the NON www search:

The first two hits are on my domain, two of my pages.

The 3rd hit looked like a 302, but it's not...
[TheWazzooDomain...]

The 4th hit is a payment link from some long ago defunct payment processing service! It's not even on my domain!

The next hits are as follows:

MyDomain.com=20/
Similar pages

MyDomain.com%5B7/
Similar pages

MyDomain.com%5B1/
Similar pages

www.MyDomain.com%09/
Similar pages

ALL four of these links are BAD LINKS - clicking them gives you "the page cannot be displayed". How the heck did those get in there, and what does that mean?

Then the 9th and final hit is a 302 (hijack?)
"HTTP/1.1 302 Found". URL is like:
[SomeOtherSubdomain.OtherMainDomain.com...] . Note the ?u= and the %2F in that URL, which DO NOT exist in the real URL. Also, that full link works, it goes to the correct webpage on my site.

Doing the command with the www shows the exact same results for the 1st, 2nd and 3rd spots; the 4th is another of the www.MyDomain.com%09/ links, and the 5th is the same as the last result (9th) above.

Does this tell anything of use to anyone?
Thanks.

fearlessrick

1:32 pm on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Don't know if this is relevant or not, but yesterday was my best AdSense earnings day of the month, and four of the last five days (including yesterday) have been the top earning days of the month.

I question this, as AdSense earnings on my site routinely follow the same pattern: 1st 10 days - not so good; 2nd 10 days - usually the best of the month; last 10 days - back to not so good.

So, this could be just the pattern repeating. Also, I've changed some of the pages (actually redid nine main pages and changed their names, titles and KWs) and began using dynamic content instead of straight HTML for news articles.

Next steps are publicity and marketing the site via anything but Google. My site is still way down in the SERPs for many queries. Unrelated sites appear well before me, and the 1st page of results is generally less than fully relevant, though I am seeing fewer scrapers.

My tactic is now to ignore Google almost completely. If traffic begins to come from there, fine. If not, fine also because I am using other traffic-generating methods.

reseller

2:25 pm on Jun 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



walkman

>My suggestion is to wait till GoogleGuy announces that the fat lady has sung.<

You forgot something! Most correctly:

My suggestion is to wait till GoogleGuy announces that the fat lady (Copyright 2005 by Caveman, All rights reserved) has sung. ;-)

Clint

2:26 pm on Jun 14, 2005 (gmt 0)



Clint - it could be my 301 kicking in or it could be Google sorting things out.

Well, that's good for you. They haven't sorted out JACK for me. :( :( :( I have still been erased from existence. I'll give them till next weekend, and if I don't have my life back by then, I WILL raise a stink and ruckus the likes of which G and the national media have never seen - and I will NOT be alone. It will be time for someone to put a collective, representative face on the countless thousands of victims of this atrocity. I will have no other choice, and nothing else to lose. Hopefully it will never come to that.

oldpro

2:42 pm on Jun 14, 2005 (gmt 0)

10+ Year Member



clint,

OldPro, I don't think it is. I get the exact same results with OR WITHOUT the filter! Figure that one out.

I get totally different results with the filters off for my keyphrase. Right now I am analysing why this is so... just in case the tool is accurate. So far I have identified 3 or 4 possibilities, but I don't want to post them until I am somewhat sure my assumptions are correct.

My sector is unique in that it was one of the first "commercial" sectors on the WWW - lots of old sites, including mine. The one possibility I am confident of is that there seems to be some sort of "grandfathering/legacy" type filter.

On another note, the aftermath of Bourbon seems to have strong undertones of the Hilltop algo. This could explain the fallback on DMOZ, the Yahoo directory and the late-1990s AltaVista crawl of "expert sites". It could also explain why sites with crosslinking schemes and canonical issues were hit the hardest - and why the PageRank algo seems to have little influence on the SERPs.

If they are giving Hilltop more influence in their overall algo, this could explain the new Google evaluation lab project - developing yet another set or sub-set of "expert sites".

I have no idea if the tool is even halfway accurate, but the results of using it were astoundingly different for my sector. Assuming it is accurate, it could be that different filters are being applied to different keyword clusters, since you saw no change.

We are all shooting in the dark at a moving target in trying to work out how to deal with Bourbon. The few things we have identified as possible negatives are www vs. non-www situations, 302s, crosslinking schemes and scraper-generated duplicate content. At least those things we can adjust. On-page factors are of little importance at this point.

This 1225 message thread spans 41 pages.