Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Dealing with the consequences of Bourbon Update

Which changes has Bourbon brought about & How to deal with them?

         

reseller

3:41 pm on Jun 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Assuming that the greatest part of the latest Google update (Bourbon) is completed, it's rather important to do some damage assessment, study the changes brought about by Bourbon, and suggest ways to deal with them.

We need to keep this thread focused on the following:

- Changes in your own site's ranking in the SERPs (lost & gained positions, or disappearance of the site).

- Changes you have noticed in the new SERPs (both google.com and your local Google site), especially with regard to the nature of the top 10 or 20 ranking sites.

- Stability of the SERPs, i.e. do you get the same SERPs when you run the same query within the same day or over 2-3 successive days (both google.com and your local Google site)?

- Effective, ethical measures to deal with the above-mentioned changes.

Thanks.

johnhh

1:49 pm on Jun 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



fearlessrick
I was going to be brave and ask the same. From our perspective: no - still going backwards as well.

I have had an answer from Google Sitemaps regarding commercial sites (see their terms of service) - they just asked for a list of sites. I waited several weeks for that reply.

Clint

1:50 pm on Jun 23, 2005 (gmt 0)



John, are you saying they have a different TOS for commercial sites, i.e. sites that sell products?

johnhh

2:17 pm on Jun 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Clint:

When I looked a few weeks ago (03 Jun 2005) there was a Terms of Service link which said "not for commercial use". That link has been removed now (I just looked), so I posted on Google Groups for clarification.

Is this off topic? I guess not, as it is one of the possible steps to take to "deal with.."

As a company we have to check these things out. I have just sent them the URLs I am interested in submitting a sitemap for.

sailorjwd

2:28 pm on Jun 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sitemaps appeared to help me deal with Bourbon.

Are we telling each other that a site with adsense is a commercial site and therefore I shouldn't use the site maps feature?

Borek

2:32 pm on Jun 23, 2005 (gmt 0)

10+ Year Member



For good or for bad, G is still tweaking the results.

I have several keywords and keyphrases, but I am mostly tracking one - let's call it KP1. It is not something my main page is optimized for, but rather one of the subpages; however, KP1 is present on the main page as well.

My whole site completely dropped for KP1. Then - about a week ago - my main page reappeared for KP1. Just now - for the first time in a few weeks - my page optimized for KP1 shows in the search results. The SERP is much worse than it was before, and only on 4 DCs, but obviously they are still trying to optimize the results.

reseller

2:35 pm on Jun 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



sailorjwd

>Sitemaps appeared to help me deal with Bourbon.<

Glad to hear that!

And as johnhh correctly mentioned, it is one of the suggested steps in our Google-Updates Survival Kit.

- Create and submit a Google Sitemap (You want Google to crawl more of your web pages)
[google.com...]
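For anyone who hasn't tried it yet: a Sitemap is just a small XML file listing the URLs you want crawled. A minimal sketch, as I understand the current format (the namespace is the 0.84 schema Google documents; example.com and the dates are placeholders, and everything except <loc> is optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal Google Sitemap: one <url> entry per page you want crawled -->
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-23</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Upload it to the site root and submit its URL through your Sitemaps account.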

Folks!

What are you waiting for? ;-)

johnhh

2:46 pm on Jun 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"Are we telling each other that a site with adsense is a commercial site and therefore I shouldn't use the site maps feature?"

I don't know; that's why I asked for (and am waiting for) clarification for commercial sites - we don't use AdSense. I tread carefully.

gnehid29

3:01 pm on Jun 23, 2005 (gmt 0)



Just for your info: deep crawling is under way, including JavaScript files (?)

reseller

3:08 pm on Jun 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



sailorjwd

>Are we telling each other that a site with adsense is a commercial site and therefore I shouldn't use the site maps feature?<

IMO, AdSense spots shouldn't be considered a factor in deciding whether a site is commercial or not.

My definition: a commercial site is a site where a product or service is being bought, sold or bartered.

For example, a site which covers free commercial resources and has AdSense spots on its pages shouldn't be considered a commercial site.

Clint

4:55 pm on Jun 23, 2005 (gmt 0)




Clint:
When I looked a few weeks ago (03 Jun 2005) there was a Terms of Service link which said "not for commercial use". That link has been removed now (I just looked), so I posted on Google Groups for clarification.

Is this off topic? I guess not, as it is one of the possible steps to take to "deal with.."

As a company we have to check these things out. I have just sent them the URLs I am interested in submitting a sitemap for.

Thanks John. I know many of those using G Site Maps on the Site Map thread have commercial sites.

Borek

9:42 pm on Jun 23, 2005 (gmt 0)

10+ Year Member



Clint:
I know many of those using G Site Maps on the Site Map thread have commercial sites.

SiteMap FAQ:

10. Can I use Google Sitemaps to submit URLs for my commercial websites?

Yes, the Google Sitemaps program would like to know about all the URLs you would like included in the Google index, including your commercial websites.

lorenzinho2

10:06 pm on Jun 23, 2005 (gmt 0)

10+ Year Member



Matt Cutts said today that the inclusion of "new signals of quality" drove the Bourbon update.

He did not elaborate as to what those new signals of quality were.

g1smd

11:17 pm on Jun 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That's probably some of the stuff in the recent patent being factored in...

MikeNoLastName

12:25 am on Jun 24, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think SOMEONE got their SIGNALS crossed!

Still dropping...

stu2

1:34 am on Jun 24, 2005 (gmt 0)

10+ Year Member



"So what do people say constitutes copying, sufficient for a dupe penalty? Just the title alone? A single unique sentence like in the Google Descriptions? A paragraph? Using a common template on your own site which includes a partial title, common menu bar and maybe a common footer? Entire page copying including metatags? Your entire page (called from your site) framed by another person's page?
If we know what to look for it's a lot easier to find causes."

Just my two penn'orth on this duplicate content discussion. I use a lot of PHP includes which call common headers, footers, and menus. I use page titles with very similar (almost identical) wording, something like "This is mysite's main title - subsection". Only the subsection changes in the page titles, and it is expanded upon in the page description. My site has about 150 pages, and my largest subsection group would be about 50 pages. I have no PageRank and maybe only about 4 or 5 incoming links to the entire website. To my pleasant surprise, Google's SERPs for my targeted keywords rank me #1 (in an uncompetitive segment). So it would be my contention that you can duplicate quite a bit of "fixed" information between pages within a site before triggering Google's duplicate content filter.

steveb

1:53 am on Jun 24, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Somebody turn down Matt's opium drip.

It's simply impossible that new signals of quality were introduced, beyond a minute degree. An introduction of more quality signals certainly drove nothing. Quality signals were obviously loosened, a lot, this time around. Anything new was dwarfed by the overall relaxation of quality signals.

Google employees need to take stock of themselves if they truly believe recognition of quality signals has improved rather than degraded very significantly.

stu2

1:54 am on Jun 24, 2005 (gmt 0)

10+ Year Member



reseller

Dealing with the consequences of Bourbon Update
"Google-Updates Survival Kit"

Might I suggest you add GG's email address to this list (the jun05feedback thingy).

stu2

2:29 am on Jun 24, 2005 (gmt 0)

10+ Year Member



patchacoutek

I would like to know how many people had all their pages become supplemental results or URL-only, and then came back recently. I'm slowly losing hope. If all my pages became supplemental or URL-only, does that mean I have been banned or got a bad penalty?

Please Help!

Did your pages come back when they were all supplemental?

Yep. Use Google Sitemaps. Worked great for getting everything re-indexed.

oldpro

2:57 am on Jun 24, 2005 (gmt 0)

10+ Year Member



It would be nice if GoogleGuy would define "new signals of quality". I bet these "new signals" are more subjective than objective in nature. It certainly appears that way.

stu2

3:42 am on Jun 24, 2005 (gmt 0)

10+ Year Member



Clint

"Has anyone's site that WAS trashed in this update, got their ranks BACK, WITHOUT doing a 301 redirect from non-www to www?"

My site was trashed before this update. I believe using Google Sitemaps contributed considerably to resolving those issues. I believe the 301 redirect I did afterwards had nothing to do with my subsequent re-ranking (but that's just me). I believe (although it's a bit of an apples-to-oranges comparison) that I have been a net gainer with this Bourbon update (although even that might just be a coincidence, occurring at the same time the update was going on). I only run a hobby site, so I don't rely on it for any income, and didn't fret too much that Google sandboxed me (for no apparent reason that I could tell) for between 6-12 months. I was getting hits through Yahoo and from word-of-mouth/a local magazine. I believe I had no 302 hijacks, although I had seen evidence of one scraper site linking to me. The scraper doesn't appear in my searches any more. I don't know why that is (and I forgot the URL to check it).
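Since the non-www to www 301 keeps coming up in this thread: on an Apache server this is usually done with a couple of mod_rewrite lines in .htaccess. A sketch, assuming mod_rewrite is enabled and example.com stands in for your own domain:

```apache
RewriteEngine On
# If the request arrived on the bare domain (no www)...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...send a permanent (301) redirect to the www host, keeping the path
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The [R=301] flag is what makes it a permanent redirect rather than the default temporary (302) one.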

theBear

4:17 am on Jun 24, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



MikeNoLastName,

Would that be the site whose Google cache entry shows up, by any chance?

Ivan_Bajlo

5:28 am on Jun 24, 2005 (gmt 0)

10+ Year Member



Huh, I don't have time to read the last 400 posts, so I'll just post a short note on my site's Bourbon aftermath.

Made it, ma! Top o' the world! ;)

Yes, I'm back in the No.1 place! And I'm finally passing that magical 1000 visitors a day again, although I still have to get back to the 1500 I passed at the beginning of the year. That will be much more difficult now, since I have noticed a lack of traffic from the regional Google domains, which used to give a nice boost.

What changes did I make to my site? Almost zero. I manually removed from Google some ancient URLs (about two years old!) where my website used to be (and which had META refresh redirects!), and I sent several e-mails to Google inquiring whether I was suffering from a canonical page or duplicate content filter.

Not sure which one of these did the trick - did they take pity on me, was I getting annoying, or did everything sort itself out? :o

dgdclynx

7:12 am on Jun 24, 2005 (gmt 0)

10+ Year Member



I have sent an email to help@google.com explaining how my literary website has been penalised under the 'original content' rule. Could somebody please verify that help@google.com was the correct place to send it? I had previously received an automated response from this address when reporting to Google that I had problems.

Johan007

8:24 am on Jun 24, 2005 (gmt 0)

10+ Year Member Top Contributors Of The Month



dgdclynx, if you used the keywords "bourbon" or "GoogleGuy" in the text, you should be OK. If not, you could have used the Google Groups email for GG directly.

...my own site is still penalised. I had some odd JavaScript redirects on old pages that I deleted ages ago!

simmen

8:47 am on Jun 24, 2005 (gmt 0)

10+ Year Member



>I have sent an email to help@google.com explaining how my literary website has been penalised under the 'original content' rule.<

My site has been placed in the sandbox; all keywords are above position 100 ...

I wonder if there is a blacklist (I read about one somewhere). And how did you know you were penalised under the "original content" rule? I mean, where did you get that info?

Thanks in advance, Simmen

dgdclynx

9:00 am on Jun 24, 2005 (gmt 0)

10+ Year Member



I was told that G may have brought in a requirement for original content of around 12%. I don't think I am in the sandbox, because I have been with my current ISP for over two years. The other possibility was having more than 100 outbound links on a page, but I think 'original content' is more likely, as I see it discussed in another thread. My only uncertainty re 'original content' is whether it applies 'within a site' or 'between sites'. If 'between sites', that would be nasty for everybody. There is another thread on this.

simmen

9:10 am on Jun 24, 2005 (gmt 0)

10+ Year Member



Okay, then you're guessing like everyone else, and there isn't a place where you can be certain about why a site is penalised? Only sending a mail to Google... (and waiting for them to respond...)

I think I'm sandboxed or Bourbonned :-) I wish there was more clarity about the reasons Google has for these things.

It was recently reported on the internet in my country (the Netherlands) that Google works with people who make rulings about the SERPs on a certain basis...

Maybe you know more about these things... (probably...)

patchacoutek

11:21 am on Jun 24, 2005 (gmt 0)



I used Google Sitemaps, and yes, my pages are being re-indexed quite fast actually. Each day I see more and more pages come back as full listings without the supplemental tag...

The only thing is that no page is appearing in the SERPs for its keywords. So maybe the PageRank and the links are not fully calculated, because not all the pages are in yet. But still, I'm happy to see some changes, even though the traffic is not coming back yet. One full month at 70% of revenues, but still paying the rent with it... phew...

sailorjwd: did your pages start to come back in Google after the whole site had been reindexed?

Thanks

Alex

kgun

12:27 pm on Jun 24, 2005 (gmt 0)



Bad news: I am back again.
Good news: I'll leave in a few minutes, after a second post about meta tags.

gnehid29 3:01 pm on June 23, 2005

“Just for your info: deep crawling is under way, including JavaScript files (?)”

Tell me more, I am a bit confused.

lorenzinho2 10:06 pm on June 23, 2005

“Matt Cutts said today that the inclusion of "new signals of quality" drove the Bourbon update.
He did not elaborate as to what those new signals of quality were.”

Mafia sites? :-) :-)

Reseller 10:21 pm on June 23, 2005
“That sounds like GoogleGuy talks. Every word should be decoded.
Now. Are you sure that Matt Cutts isn´t Mr. GoogleGuy himself ;-)”

Are you sure of anything on the web? I may be you. Someone may make a “server page to intrude on the web server and overwrite what you write” in your name.

stu2 1:34 am on June 24, 2005
“So, it would be my contention that you can duplicate quite a bit of "fixed" information between pages before triggering google's duplicate content filter within a site”.

What about syndicated minisites within a site? They may have a lot of content identical to the supplier's. It is business, and shouldn't be considered copying, should it? But the minisite should of course get a lower rank than the mother site.

KBleivik
Make it simple, as simple as possible, but no simpler.

kgun

12:41 pm on Jun 24, 2005 (gmt 0)



Kgun 2:51 pm on June 22, 2005

Clint:
Here is my code on the index page:
<META NAME="googlebot" CONTENT="index,follow,archive">
<META NAME="robots" CONTENT="index,follow">
Better, less coding:
<META NAME="robots" CONTENT="index,follow">
<META NAME="googlebot" CONTENT="archive">
Is there a difference? One never knows when it comes to coding.

Nickied 3:19 pm on June 22, 2005

Don't know about a difference, but it's all unnecessary. The only use I make of these metas is noindex, nofollow, etc.
(edit)
What you have above is the default, which is why I said it is unnecessary. Regarding noindex, etc., robots.txt is probably a better bet. See: the Robots Exclusion Protocol.

Clint 3:36 pm on June 22, 2005

kgun, all I meant was that if you were misspelling "archive" in any of your tags, the tags would not have your desired effect.
G caches (archives) all pages by default, right? So that 'archive' tag is redundant (if I understand that "archive" means to cache the page). If you DON'T want G showing the "cached" link next to your hits, then you DO want the 'noarchive' tag there.
I would always put the Googlebot tag first before the generic robots line if the G tag has a different negative precedent than that of the generic robots tag. In other words, if you needed for example:

<META NAME="robots" CONTENT="index,follow">
<META NAME="googlebot" CONTENT="noindex, nofollow">
That would probably not be a good idea, since the G bot would see the generic "index, follow" tag FIRST and obey it. So you'd want the G-bot line first, so *it* would not index or follow the page. The order may not even make a difference if the bots parse the FULL <head> section before doing anything.


Kgun 3:38 pm on June 22, 2005
But I switch between, e.g., noindex and index.
Do you know why? I do not, but I have an idea.
Nickied 4:26 pm on June 22, 2005

Clint:
I would always put the Googlebot tag first before the generic robots line if the G tag has a different negative precedent than that of the generic robots tag. In other words, if you needed for example:
<META NAME="robots" CONTENT="index,follow">
<META NAME="googlebot" CONTENT="noindex, nofollow">

Again, the first line, i.e. index and follow, is the default and therefore unnecessary. The second line is correct if he wants to exclude Googlebot, as described here: [google.com...]

******************************************************************************************

Questions:
1. Do you see any reason for using different code on different pages?
2. robots.txt is mostly placed in the root directory. What about a spider entering a subpage (a page in a subdirectory)?
3. What about code on a syndicated mini site?
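On question 2: as I understand the Robots Exclusion Protocol, spiders only ever request robots.txt from the root of the host and never look for one inside subdirectories, so a single root file covers every subpage. A minimal sketch (the /private/ path is just an illustration):

```
# Must live at http://www.example.com/robots.txt - a robots.txt
# placed inside a subdirectory is ignored by well-behaved spiders.
User-agent: *
Disallow: /private/
```

Note that the rules are path prefixes, so "Disallow: /private/" blocks everything beneath that directory as well.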

KBleivik
Make it simple, as simple as possible, but no simpler.

Good news, I leave.
Bad news, I may come back.

This 1225-message thread spans 41 pages.