Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Dealing with the consequences of Bourbon Update

Which changes has Bourbon brought about & How to deal with them?

         

reseller

3:41 pm on Jun 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Assuming that the greater part of the latest Google update (Bourbon) is complete, it's rather important to do some damage assessment, study the changes brought about by Bourbon, and suggest ways to deal with them.

We need to keep this thread focused on the following:

- Changes in your own site's ranking in the SERPs (lost & gained positions, or disappearance of the site).

- Changes you have noticed in the new SERPs (both google.com and your local Google site), especially regarding the nature of the top 10 or 20 ranking sites.

- Stability of the SERPs, i.e. do you get the same SERPs when you run the same query within the same day or on 2-3 successive days (both google.com and your local Google site)?

- Effective, ethical measures to deal with the above-mentioned changes.

Thanks.

oldpro

4:36 am on Jun 19, 2005 (gmt 0)

10+ Year Member



steveb,

You need one before you should be allowed to wildly blame other people for problems you created, when the basic solution has been posted many times, including by Google Guy (also several times). Stop complaining and get off your butt and solve the problems you created.

I respect your opinion, and you are correct to a certain extent. However, what used to be the great attribute of the WWW is now a thing of the past. Once upon a time, the WWW provided an average person with an elementary understanding of HTML the opportunity to express one's creativity or earn a living.

A little sympathy is in order here. Google is in a position to remake the WWW in its own vision, and by virtue of this Google is changing the rules. Duplicate content, canonical, and validation issues never used to be a problem; I am framing this statement within the previous 10-to-12-year timeframe. Now they are.

So... there is some ambiguity as to who or what is to blame for these problems. This is the classic question of which came first: the chicken or the egg?

I agree with you to the extent that those who are complaining should simply adjust and accept the fact that Google is in control of the internet. We have to march to the beat of its drum. It is a sad loss that the average non-webmaster has been left behind in all this.

willie50

6:40 am on Jun 19, 2005 (gmt 0)

10+ Year Member



Hi, I have a question:

If your site disappears completely from Google, that is bad, but what happens when your PageRank goes to 0 in the Google toolbar?

What does it mean?

walkman

6:48 am on Jun 19, 2005 (gmt 0)



>> If your site disappears completely from Google, that is bad, but what happens when your PageRank goes to 0 in the Google toolbar?

Unless it's a temporary glitch, it can indicate a manual penalty.

reseller

7:10 am on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Dealing with the consequences of Bourbon Update
"Google-Updates Survival Kit"

Hi Folks!

Updating, including GoogleGuy's important remarks regarding removing 302 redirects (msg #:786 of this thread).

- Do a 301 redirect to resolve yoursite.com vs. www.yoursite.com (the canonical URL problem)

- Removing 302 redirects
Please do not remove your own site using Google's url removal tool. All it will really do is remove your own site for 180 days.
And don't think that allinurl:yourdomain.com returning a result like someotherdomain.com/redirect?url=www.yourdomain.com could be a hijacking. That's a common misperception. All that "allinurl:yourdomain.com" does is look for documents with "yourdomain com" anywhere in the URL that Google saw. It's not a hijacking if you see results from other sites with allinurl. The only time you need to worry is if you do site:yourdomain.com and then you see results from someotherdomain.com.

- Removing duplicates

- Make subtle page changes and monitor SERP changes

- Create and submit a Google Sitemap (You want Google to crawl more of your web pages)
[google.com...]

- Optimize your site for other search engines (like Yahoo, MSN ..)
Keep working to increase non-Google sources of visitors.

- Transfer your affected site to a spare/emergency site
An emergency site is an additional site with 1-2 pages of real content related to your affected site. You create the emergency site in good time, submit it to the majors (also maybe local directories) and leave it to age for at least 6 months before moving the content of your affected site to it.
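For the Google Sitemap item above, here is a minimal sketch of the XML format, assuming the 0.84 schema Google documented at launch; the URL and date are placeholders, not taken from this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <!-- one <url> entry per page you want crawled -->
    <loc>http://www.yourdomain.com/</loc>
    <lastmod>2005-06-19</lastmod>
  </url>
</urlset>
```

List the canonical (www or non-www) form of each URL, matching whichever root you chose for your 301 redirects.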

Resources:

Google Update Bourbon Part 4
[webmasterworld.com...]

Dropped from Google - a checklist to find out why.
[webmasterworld.com...]

Further Google 302 Redirect Problems
[webmasterworld.com...]

301 for non-www. to www. not working, plus custom error stops working
[webmasterworld.com...]

Google Sitemaps
[webmasterworld.com...]

Successful Site in 12 Months with Google Alone (Brett Tabke)
[webmasterworld.com...]

Sandbox Question and SEO for Google
[webmasterworld.com...]

GoogleGuy's posts (Some posts and advice on Bourbon and other topics)
[webmasterworld.com...]

eval.google.com - Google's Secret Evaluation Lab..
[webmasterworld.com...]

Your comments and suggestions would be highly appreciated.

Thanks!

steveb

7:14 am on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Once upon a time, the WWW provided an average person with an elementary understanding of HTML the opportunity to express one's creativity or earn a living."

It is sad in a significant way, but don't mourn, organize... your sites. Adapt or die.

reseller

7:46 am on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Folks

Because of its importance and the valuable information it includes, which I find very relevant to the discussion within this thread, I'm posting here GoogleGuy's message #7, which was posted on GoogleGuy's own thread

GoogleGuy's posts
[webmasterworld.com...]

--------------------------------------
GoogleGuy
Senior Member

msg #:7 8:26 am on June 2, 2005 (utc 0)

Sometimes a tangent isn't such a bad thing though. For example, partway into the Bourbon discussion, wattsnew asked if there was a technical guide on how to handle www vs. non-www, relative vs. absolute linking, and links to different pages such as / vs. index.html vs. default.asp. My rule of thumb is to pick a root page and be as consistent as possible. I lean toward choosing [yourdomain.com...] but that's just me; [yourdomain.com...] would work as well. Then I recommend that you make things as simple as possible for spiders. I recommend absolute links instead of relative links, because there's less chance for a spider (not just Google, but any spider) to get confused. In the same fashion, I would try to be consistent on your internal linking. Once you've picked a root page and decided on www vs. non-www, make sure that all your links follow the same convention and point to the root page that you picked. Also, I would use a 301 redirect or rewrite so that your root page doesn't appear twice. For example, if you select [yourdomain.com...] as your root page, then if a spider tries to fetch [yourdomain.com...] (without the www), your web server should do a permanent (301) redirect to your root page at [yourdomain.com...]

So the high-order bits to bear in mind are
- make it as easy as possible for search engines and spiders; save calculation by giving absolute instead of relative links.
- be consistent. Make a decision on www vs. non-www and follow the same convention consistently for all the links on your site. Use permanent redirects to keep spiders fetching the correct page.

Those rules of thumb will serve you well no matter what with every search engine, not just with Google. Of course, the vast majority of the time a search engine will handle a situation correctly, but anything that you can do to reduce the chance of a problem is a good idea. If you don't see any problems with your existing site, I wouldn't bother going back and changing or rewriting links. But it's something good to bear in mind when making new sites, for example.
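On Apache, GoogleGuy's www vs. non-www advice typically comes down to a rewrite rule along these lines. This is only a sketch: it assumes mod_rewrite is available and that www.yourdomain.com (a placeholder) is the root you picked.

```apache
# Sketch: permanently (301) redirect non-www requests to the www root
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
```

The same effect can be had with a server-side redirect in any hosting setup; the key is that the response code is 301 (permanent), not 302.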

annej

8:14 am on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Interesting what looking for duplicate copies of your webpages brings. I just found a site that sells papers to students. It seems they are selling at least one of my articles. <sigh>

kgun

9:51 am on Jun 19, 2005 (gmt 0)



Some experiences.

1. I have been on the net since its beginning and have worked for the Central Bank of Norway and a large Swedish company. I have seen the increase in professionalism. In the beginning, the Internet and the search engine Yahoo were a great experience. There is no lack of information on the internet; the problem is finding the right information at the right time.

2. Competition on the web is increasing, and it is getting worse and worse for a startup company. You need traffic to participate in programs like AdSense. It is like a vicious circle: you need to be large to grow and prosper. The internet is the greatest network of people in the world. It connects people from Inner Mongolia to Greenland, from Australia to Norway, from Chile to Hong Kong. Some say that the internet is anarchy in practice. There is a tendency for money to become more and more important.

3. Will the internet evolve into a great communication, information and data-logistics medium controlled by a few large players? What shall the moms and pops and small companies do to meet this challenge? In Europe, we have a tradition of strong labour unions that were a result of the global economic crisis of the 1930s. I am in nearly daily contact with people from the USA and other countries in the world. The USA is for free enterprise, and it is the country of the self-made man. The USA has been the motor of the world economy since World War II. Most small companies pop up in the USA, and American companies have given me much. I have had very good experiences with a large company like Microsoft; in my view, it understands the needs of the customer, even if it can be hard to find the right person.

4. The only way for small companies and startup companies to survive in these surroundings is, in my opinion, to organize. The members pay USD 100 (10) a year to participate in this organization of small companies on the W3. The organization represents them in conflicts, and the organization has the power to boycott great players. It is a form of insurance. The small fishes shall not feed the big sharks, but play with them to mutual benefit.

5. These were only some thoughts I had and wanted to share with you. Personally, I think I will survive, but I have experienced the frustration and struggle some people have had here. It has been a great experience. Written in a hurry before I go out in the sun in Norway to get some food. I think writers more competent than me should edit these lines.

6. So, small internet companies, moms and pops running your small home shops, in all countries: organize into an organization, W3SmallCompanies.org. (I have not checked that name on Whois or Betterwhois. If it exists, it is not reserved by me.)

Kjell Gunnar Bleivik
KGB as one wrote in this post.

Sorry for swearing in the church.

Clint

12:09 pm on Jun 19, 2005 (gmt 0)



I have no way to find hidden copies of my articles but I can find open ones.
Could someone please tell me if doing a
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
would solve the problem. At least then visitors to my site could still find the articles. Otherwise I'll have to remove them from the site.

Anne, if you're doing this for Google ONLY, then don't use that tag; it will affect ALL SEs. Use:
<meta name="googlebot" content="noindex,nofollow">

Clint

12:25 pm on Jun 19, 2005 (gmt 0)



People are still crying about their sites being gone? Man this thread needs to die.. Move on people.. For every loser there are winners.. I was a loser and a winner in this update. This is how all updates work for me. And I ALWAYS win more than I lose. ALWAYS!.
I diversify.. I build more than one site per niche.. tweak each one differently. If I get more than one in the top 3 then that's gravy. But that is my advice to you all: diversify. And stop acting like you are on Google welfare. They don't owe you anything. They don't have any "OBLIGATIONS" to do anything. They are a public company whose only obligation is to their shareholders. Not to you guys. And this is coming from a die-hard Google hater...

Someone please put this thread out of its misery.. I can see this thread going on and then three months later, someone posts, I'm noticing update "Charlie"....

Your POST "needs to die". If you're not getting any useful information from this thread, then why the hell are you here? Obviously to taunt and provoke others. There are still many on this thread who have not yet got back their G SERPs, and you should show some respect for them!

[edited by: Clint at 12:31 pm (utc) on June 19, 2005]

Clint

12:28 pm on Jun 19, 2005 (gmt 0)




Though I prefer using Google to look for stolen snips of our stuff.

Joe (duck), how do you do this? Do you just take some text from any of your pages and paste it in quotes at Google? Or is there some special Google "command" that does it?

Clint

1:27 pm on Jun 19, 2005 (gmt 0)



P.S. J., thanks for the spam report. I love to hear about any spammy sites that we're missing

GoogleGuy, would you be kind enough to explain exactly what G considers a "spammy site"? Do you mean a site or page with nothing but links; or a site or page with repeated keywords all over it (and if so, where: meta tags or body, and how many is too many); or what's in alt image tags; or the "title" tag for links, etc.? I'm sure many would like to know this as per G's definition.

Before anyone replies with things such as "just use your head", or "well, I think it's....", etc., I reiterate: as per Google's technical definition of such. ;)
Thanks.

oldpro

1:27 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



steveb,

It is sad in a significant way, but don't mourn, organize... your sites. Adapt or die.

My site has been online since 1994 and has never been affected by Google's algo gyrations. "Adapt or die"...very much the truth, folks. Adapting ahead of the curve has always helped me. Practice Murphy's law: "anything that can go wrong, will go wrong". Nip potential problems in the bud.

A few other tidbits of advice for what it's worth...

1. Keep your sites, recips, and inbounds on-theme.
2. Understand LSI.
3. Develop an outline of the structure of your site and keep it logical. Don't make hasty additions or changes.
4. Use a single alias and absolute URLs in your linking structure no matter what GG says.
5. Shoot for top rankings on MSN and Yahoo and be satisfied with a respectable showing on Google.
6. Validate your code each time you make a change (no matter how minor) before FTPing.

Clint

1:30 pm on Jun 19, 2005 (gmt 0)




Posted by GoogleGuy:
Also, I saw at least one person who wrote and still thinks that allinurl:yourdomain.com returning a result like someotherdomain.com/redirect?url=www.yourdomain.com could be a hijacking. That's a common misperception. All that "allinurl:yourdomain.com" does is look for documents with "yourdomain com" anywhere in the url that we saw. It's not a hijacking if you see results from other sites with allinurl. The only time you need to worry is if you do site:yourdomain.com and then you see results from someotherdomain.com.

So, GG, is it then safe to say that if you do a site:yourdomain.com and all of the results you see are your own pages, that you are not getting hijacked?
Thanks.

patchacoutek

1:39 pm on Jun 19, 2005 (gmt 0)



I know I was in the first batch of people affected by this update, since I lost all positions for all pages (more than 3000 KWs) in a single 24-hour period. The business is still viable, since we have quite good positions on other search engines, but we're rapidly rethinking the business so we won't be dependent again...

So this is the adapt part... For the rest, I'm still surprised each day to see that Google has not recrawled my site and all pages are still supplemental results.

And no, I didn't publish similar pages on purpose and then come here to cry, so #*$! off...

Do you see enormous crawler activity, or should I forget about this site for a while?

Alex

oldpro

1:43 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



No...

I think he was saying that listings showing up in allinurl for your site that are not yours do not mean these are hijackers. It's just another site that is using your URL in the title or in the link to your site.

I think what he means is to worry if none of your site's listings show up; then Google does not have you in its index.

There have been several instances of 302 hijacks where it did not show in the allinurl... I think helleborine was one of them.

Clint

1:44 pm on Jun 19, 2005 (gmt 0)



Alex, I posted something yesterday about G bot activity:
----------
>>>Anyone else noticing that *images.64.233.179.104/ and other froogle, groups, news and local are returning a 404 response? <<<

Yeah, I see that today as well from images.google.com. They are mostly old images that I removed some time back, but some of them are VALID CURRENT images! Seems to be a problem with their bot. This looks like yet another G issue; this is going to remove our valid images from their index. I also see it trying to access images that never existed! Like www.MyDomain.com/jpg , what's up with that? G is not correctly linking to the images!

The bot is also doing it with PDF files I removed some time back. I also see a lot of "302" codes from images.google.in, .de, .be, .se, and other overseas G image servers.
-----------
In addition to that, I saw more G bot activity than I have over the past couple of weeks. But today, NOTHING! Not a SINGLE ENTRY for the googlebot!

kgun

1:53 pm on Jun 19, 2005 (gmt 0)



Google, you have the greatest resources:

1. Is it possible to make a search engine that is difficult (ideally impossible) to manipulate?

2. A search engine where the designer of the page does not need to have a PhD degree in information science?

3. A search engine that makes the design process easier for webmasters in the future?

4. In short, a search engine that filters out spam, hijacking and other sorts of garbage?

5. A search engine that gives fairly stable and not too unpredictable results. Credible search.

6. (Is it possible to get the Google logo on my toolbar, so that it fits my colours and has the following options:
1. Google Site Search.
2. Google Safe Web Search.
3. Google Web Search.
4. Google Financial Web Search.)?

KBleivik

Since I have now come too close to Google, I am leaving for a break. That is the good news. The bad news is that I (may) come back?

[edited by: kgun at 1:58 pm (utc) on June 19, 2005]

sailorjwd

1:57 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



About GG saying it is no problem when another site shows up in your inurl: command..

Seems to me it could be a problem. Is Google treating that link as though it were a page among the 8 billion pages? Or is it truly just a link with no content?

If there is content associated with the link (and by the way, they are all 302s in my case), then you know whose content it is: MY SITE'S CONTENT.

However, as long as those links don't come up in any search results for exact-sentence matches from my site, then I guess it is OK.

In that case I take back the 50 emails I sent to G support complaining about links indexed as pages.

Another topic.. GG says that entries from other sites are only a potential problem when they show up in the site: command, but didn't they prevent such links from showing a while back? So how the heck are we to know if they are really there or not?

fearlessrick

2:13 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



SteveB said,
It is sad in a significant way, but don't mourn, organize... your sites. Adapt or die.

So how come, when I ask you for assistance or an explanation, you ignore me? Maybe you think the internet should be only for those who are able to adapt, and those who cannot should die.

Nice sentiment, that.

max_mm

2:31 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



reseller wrote:
Please do not remove your own site using Google's url removal tool.

Does this mean that anyone can remove anyone's domain from the index using the Google removal tool?

I sure hope it doesn't. Are there any safeguards ensuring this cannot happen?

Comments please.

Clint

2:37 pm on Jun 19, 2005 (gmt 0)



Oldpro:

No...
I think he was saying that listings showing up in allinurl for your site that are not yours do not mean these are hijackers. It's just another site that is using your URL in the title or in the link to your site.

I think what he means is to worry if none of your site's listings show up; then Google does not have you in its index.

There have been several instances of 302 hijacks where it did not show in the allinurl... I think helleborine was one of them.

Thanks, I understood the first two paragraphs. I'm taking GG's quote "The only time you need to worry is if you do site:yourdomain.com and then you see results from someotherdomain.com" verbatim (the ONLY TIME you need to worry...), which would mean it's safe to say that if you do a site:yourdomain.com and all of the results you see are your own pages, you are not getting hijacked. That's why I need clarification on that. So if you're saying that Chantal's site, which was 302'd, was indeed adversely affected by that 302, and it didn't show in the site: command, then I guess that's a resounding "no".

This raises the question again: how can we check for hijacks? Has anyone found a tool of some kind to check for them?

Clint

2:40 pm on Jun 19, 2005 (gmt 0)



Someone, I forget whom, asked about G's cached pages and how to stop them. It can be done via a tag in the <head>.
[google.com...]
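The tag in question is presumably the noarchive robots directive; a sketch of how it sits in a page's <head> (use the googlebot name instead of robots if you want to target Google only):

```html
<head>
  <!-- keep all engines from showing a cached copy of this page -->
  <meta name="robots" content="noarchive">
  <!-- or, to affect Google's cache only -->
  <meta name="googlebot" content="noarchive">
</head>
```

This only suppresses the "Cached" link; the page itself stays indexed and rankable.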

dgdclynx

2:47 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



My site was designed in 1995/7, before Google or its guidelines existed. Now under Bourbon it has been penalised about 100 places, presumably for not conforming. It looks like the days of the amateur are over and only professional sites will survive. It is quite impossible to redesign my site, so it is sensible to call it quits. Everything is fine on MSN and Yahoo, but they don't provide the hits.

Clint

2:49 pm on Jun 19, 2005 (gmt 0)



>>>Does this mean that anyone can remove anyone’s domain from the index using the Google removal tool?

I sure hope it doesn't. Are there any safeguards ensuring this can not happen? <<<

I believe they send you an email to which you have to reply. So, hopefully the answer is "no".

MHes

2:50 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>So how come when I ask you for assistance or an explanation, you ignore me?

reseller

3:15 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



max_mm

>reseller wrote:
Please do not remove your own site using Google's url removal tool.
Doe's this mean that anyone can remove anyone’s domain from the index using the Google removal tool?<

NO. Of course not. Only you, or a person who has access to edit your robots.txt (remove pages, subdirectories or images using a robots.txt file) or to edit your pages (remove a single page using meta tags), can do that.

Remove Content from Google's Index
[google.com...]
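To illustrate the two mechanisms reseller names, here is a sketch with a hypothetical path:

```text
# robots.txt at the site root: blocks Googlebot from a subdirectory
# ("/private/" is a made-up example path)
User-agent: Googlebot
Disallow: /private/
```

The per-page alternative is a <meta name="robots" content="noindex"> tag in that page's <head>. Either way, only someone who can edit those files on your server can trigger a removal, which is the safeguard in question.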

[edited by: reseller at 3:18 pm (utc) on June 19, 2005]

Clint

3:17 pm on Jun 19, 2005 (gmt 0)




I think if a lot more people ran their sites through validators (several of them, because not all pick up all the errors, not even W3C's), link checkers, blacklist checkers, and browser-compatibility programs, checked Google for duplicate-content penalties, and went out and found some quality links passing PR, this thread would expire in a few days, because a lot of the angry people on this thread would be eating humble pie.

FTR, nope. ;) You don't think people have done that? (Referring to the bold text above.) I posted a few weeks ago that I put sites appearing at the top of the G SERPs through validators and they were LOADED with errors, some with 100+ per page. Also, sites with few or no errors were removed from the G index.

As for the "quality links", if you're referring to sites to which you link, there's also nothing there. Link-exchange pages and the like were atop the G SERPs, and I checked the type of sites to which they were linking. None more than a PR3, most a PR0.

As for what's italicized: depending on what exactly you mean by dupe content (www vs. non-www; content theft or copying; or dupe pages created by the webmaster), you 'could' have something there, but I don't think so, unless G is penalizing only certain sites that do it. Now, I'm not sure how to check for duplicate-content penalties (someone please explain how), but when I was checking some of my monitored phrases in G after I was removed, I saw pages appearing in the 1st, 2nd, and 3rd spots, near the top, that not only copied content from the original sites (that's one type of dupe content), but also sites that had the same pages with different meta tags (another type?), and sites that did not do the 301-redirect thing to their www pages (another type?). So again, if that was or is an issue, it's only affecting certain sites for some reason; and if anyone has been able to definitively determine that one or more of these dupe-content issues negatively affected them or anyone else (has anyone?), then of course it's certainly something worth fixing.
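On the question of how to check for duplicate content: no tool is named in this thread, but one rough, engine-independent way to see whether two pages are near-duplicates is word-shingle overlap. A minimal sketch follows; the shingle size and the similarity measure are arbitrary choices of mine, not anything Google has published.

```python
def shingles(text, k=5):
    """Break text into overlapping k-word shingles (tuples of words)."""
    words = text.lower().split()
    # If the text is shorter than k words, keep it as a single shingle.
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity of two texts' shingle sets: 0.0 (disjoint) to 1.0 (identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)
```

Comparing a page against a copy that changed only a few trailing words yields a score strictly between 0 and 1; identical text scores 1.0. Where to draw the "duplicate" line is a judgment call.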

Clint

3:20 pm on Jun 19, 2005 (gmt 0)



Reseller, I think what he was referring to is the area at G where you actually submit a URL for removal... if they still have that area. I think I remember seeing it once, around the time all this started to happen. I checked it to see if someone could have submitted my URL for removal, and I seem to remember you had to log in, and they had to send you an email at the domain's address, to which you had to reply in order to go any further.

On the page you gave, if you click the "urgent" link, you're brought to the page [services.google.com:8882...] where you have to log in. But someone is going to have to verify whether, as I thought I saw, the email address has to be on the domain of the URL requesting removal.

Clint

3:37 pm on Jun 19, 2005 (gmt 0)



Can anyone finally explain this? I've seen numerous posts here regarding one's SERP's in G not having anything under the URL, it's just a clickable URL, but no answers to them. When I do a site:MyDomain.com, this is the typical standard G format for most hits:

Clickable main hyperlink text here...
Plain text description here...
www.MyDomain.com/whatever - 28k Cached - Similar pages

But some of the pages shown, are shown only as this:

www.MyDomain.com/whatever
Similar pages

So, why is this? Does this signify anything? DOES it need to be fixed, and if so, how can it be fixed? There's no reason this should happen. I compared the webpages showing only the URL for commonalities, and there are none.

Then there is the issue of the "trailing slash". I've seen no one explain its significance. When I do a site:MyDomain.com I get DIFFERENT RESULTS compared to site:MyDomain.com/ which doesn't sound like a good thing. I'm also still getting different results with and without the www, so that makes for 4 different results: no www and no slash, no www and with slash, www and no slash, www and with the slash.

Also, I'm still seeing fewer and fewer results everyday for any syntax of site:MyDomain.com.

Answers please? ;)
Thanks.
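For what it's worth, the four variants Clint lists are four distinct URLs from a crawler's point of view. This illustrative sketch (the function name and the www preference are my own hypothetical choices) shows the kind of canonicalization a server-side 301 rule performs so that all variants collapse to one form:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, prefer_www=True):
    """Collapse www/non-www and missing-path variants onto one canonical
    form, the way a permanent (301) redirect rule would."""
    scheme, host, path, query, frag = urlsplit(url)
    host = host.lower()
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    if not path:
        path = "/"          # bare host gets the root page's trailing slash
    return urlunsplit((scheme, host, path, query, frag))
```

With a rule like this in place (or its .htaccess equivalent), site: queries for the different variants should converge on the same set of pages over time.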

This 1225 message thread spans 41 pages.