Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Dealing with the consequences of Bourbon Update

What changes has Bourbon brought about, and how do we deal with them?

         

reseller

3:41 pm on Jun 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Assuming that the greater part of the latest Google update (Bourbon) is complete, it's rather important to do some damage assessment, study the changes brought about by Bourbon, and suggest ways to deal with them.

We need to keep this thread focused on the following:

- Changes in your own site's ranking in the SERPs (lost and gained positions, or disappearance of the site).

- Changes you have noticed in the new SERPs (both google.com and your local Google site), especially with regard to the nature of the top 10 or 20 ranking sites.

- Stability of the SERPs, i.e., do you get the same SERPs when you run the same query within the same day or on 2-3 successive days (both google.com and your local Google site)?

- Effective, ethical measures to deal with the above-mentioned changes.

Thanks.

Clint

3:43 pm on Jun 19, 2005 (gmt 0)



Mike, I've seen no one reply to this below, and I think it's a very good, and important question. (On the "comparator utility", I think "copyscape" will do that, but unfortunately you of course have to PAY to see more than 1st page results, so a free tool would be great).

So what do people say constitutes copying, sufficient for a dupe penalty?
Just the title alone? A single unique sentence like in the Google Descriptions? A paragraph? Using a common template on your own site which includes a partial title, common menu bar and maybe a common footer? Entire page copying including metatags? Your entire page (called from your site) framed by another person's page?
If we know what to look for it's a lot easier to find causes.
It'd be cool if someone could write a comparator utility to see how "alike" another page is to yours.

So....GG, can you answer this please?
Thank you.
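On the comparator idea: a very rough similarity check can be built from Python's standard difflib; the sample strings below are placeholders for the visible text of two pages, and the score is only a crude proxy for what a dupe filter might look at.

```python
# Rough page-similarity check using Python's standard difflib.
# The sample strings stand in for the visible text of two pages;
# in practice you'd fetch and strip the markup first.
import difflib

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0.0-1.0 ratio of how alike two blocks of text are."""
    return difflib.SequenceMatcher(None, text_a, text_b).ratio()

page_a = "Blue widgets for sale. Free shipping on all blue widgets."
page_b = "Blue widgets for sale! Free shipping on every blue widget."
print(round(similarity(page_a, page_b), 2))
```

Identical pages score 1.0; near-copies score close to it. Whether any particular score would trip a dupe penalty is, of course, unknown.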

reseller

3:43 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Clint

>Reseller, I think what he was referring to is the area at G where you actually submit a URL for removal.....if they still have that area.<

Even within the URL removal console, it's only a person with access to edit robots.txt or the pages themselves who can submit a removal request.

The email address is used to create an account which gives access to the URL removal console. But you can't remove pages unless you have access to edit the robots.txt or to edit the robots meta tags on pages.

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

<META NAME="GOOGLEBOT" CONTENT="NOINDEX, NOFOLLOW">
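And the robots.txt route, for anyone who can edit that file instead of the page metatags; the path here is just an example:

```text
User-agent: Googlebot
Disallow: /old-page.html
```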

max_mm

3:48 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



Clint

I've seen this happening (pages indexed with url only) on one of my sites which got very badly hit.

Traffic returned recently (well, most of it anyway) and the URLs have full titles and descriptions now. I think it simply means that the pages are being re-crawled and re-indexed. But with Google lately going haywire every second Monday, I am not 100% sure.

Clint

4:18 pm on Jun 19, 2005 (gmt 0)



Reseller, thanks. I then don't know what the point is of their "urgent URL removal" area if it still involves a metatag or robots.txt file.

Max, thanks.

sailorjwd

4:40 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Clint,

The url removal tool is good for removing links that are 302ing to you (in case these make you nervous).

The method is to add the NOINDEX tag to the page and then do the url-removal by pasting the 302 link into the url-removal box.

Then, quickly remove the NOINDEX tag - hopefully the whole process takes no more than 1 min thus reducing the chance that googlebot will catch your page with the noindex in it.

The only problem I have found is that some of the 302 URLs are too long for the URL removal tool to accept.
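Since the window between removing and restoring the NOINDEX tag is short, it may be worth a quick sanity check that the tag is actually being served before submitting the URL. A rough Python sketch: deliberately crude string matching, not a real HTML parser, and the sample markup is just for illustration.

```python
# Quick check that a page's HTML currently carries a noindex robots
# meta tag, before submitting it to the URL removal console.
# Deliberately crude regex matching; a real check would parse the HTML.
import re

def has_noindex(html: str) -> bool:
    """True if the markup contains a robots/googlebot noindex meta tag."""
    pattern = re.compile(
        r'<meta\s+name=["\'](?:robots|googlebot)["\']\s+'
        r'content=["\'][^"\']*noindex[^"\']*["\']',
        re.IGNORECASE,
    )
    return bool(pattern.search(html))

sample = '<html><head><META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"></head></html>'
print(has_noindex(sample))
```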

And a side note on this subject: if the URL removal tool is allowing you to remove a URL on someone else's site, then what does that tell you? It tells me that G thinks your content is part of that 302er's site...

somebody tell me I'm full of it on this point.

Borek

4:48 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



Even within the URL removal console, it's only a person with access to edit robots.txt or the pages themselves who can submit a removal request.

There is more to it: you can enter just a URL, but then G checks whether the URL exists - and only non-existing pages can be deleted this way.

So if your page is 301-redirected to a new page, which in turn returns 200, G responds with "only non-existing pages can be removed". Period. You have to wait for your 301-ed page to disappear from the index. As the pages are reappearing now - after 6 months - you can wait a long time.

I believe G should create a better tool for deleting pages - a tool that allows removing 301-ed pages, non-www pages, and so on. Or at least G should give clear recommendations on how to remove pages from their index. Should it be a 404? Or a 410? And how is a 301 treated? If such recommendations had existed, probably at least half of the problems some of us have now would never have shown up.
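For what it's worth, the Apache side of those cases can be sketched like this (paths and domain are placeholders, and how G weighted 404 vs 410 wasn't documented at the time, so treat the choice as an open question):

```apache
# Permanently moved: points crawlers at the new canonical location
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Deliberately removed: 410 "Gone" is the explicit removal signal
Redirect gone /deleted-page.html
```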

oldpro

5:19 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



clint,

This pops the question again, how can we check for hijacks? Has anyone found a tool of some kind to check for them?

I don't think there is. Done some searching and nada. Thought of this a few weeks ago. I don't have the expertise, but if somebody could come up with such a tool it should be a money maker. I have other resources to make it happen, but not the technical knowledge.

As far as I know the easiest way to track down hijacks is to search phrases of unique content of your site.
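That manual phrase search can be semi-automated a little: pick the longest sentences from your own page copy, since those are likeliest to be unique, and paste each into Google as an exact-phrase ("...") search. A rough Python sketch; the sample text is a stand-in for real page content, and the searching itself is still manual.

```python
# Pick out distinctive phrases from your own page text to paste into
# Google as exact-phrase searches when hunting for hijacked or
# scraped copies. The sample text stands in for real page copy.
def candidate_phrases(text: str, count: int = 3) -> list[str]:
    """Return the longest sentences, which are likeliest to be unique."""
    sentences = [s.strip() for s in text.replace("!", ".").split(".") if s.strip()]
    return sorted(sentences, key=len, reverse=True)[:count]

page_text = (
    "We stock blue widgets. Our hand-finished cobalt widget range ships "
    "worldwide within two business days. Contact us for bulk pricing."
)
for phrase in candidate_phrases(page_text):
    print(f'"{phrase}"')
```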

sailorjwd

5:28 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Let me post this question again:

How is it I am able to remove a link on someone else's site, unless G thinks either their site is part of mine or my site is part of theirs? Here's the URL removal tool log, URL removed on 6/13:

2005-06-13 15:31:02 GMT :
removal of ostg.example.com/click_out.php/1=1/masterid=1123159/
url=http%253A%252F%252Fwww.google.com%252Furl%253Fsa%253Dl%2526q%253Dhttp%253A%252F%252F
www.MYDOMAIN.tld%252FMYPAGE%252F%2526ai%253DBbSjZeFacQrXrDM7QYcvjqKMNtuWHCt7lmaQB6OzEpAP
QhgMQAxgDIIH0iQIoAzAASIk5qgEIYWZjdGVzdDPIAQHoAQE%2526num%253D3/c=Book:+blue+widgets/
page=searchgetprod.php/channel=afctest3
complete

Is it only me that sees a potential problem with this situation, re: dup content issues?

[edited by: ciml at 5:24 pm (utc) on June 24, 2005]
[edit reason] Fixed stretching. [/edit]

steveb

7:05 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"So how come when I ask you for assistance or an explanation, you ignore me?"

What? I posted exactly what you need to do, and have done it so many times it will probably get a dupe penalty for webmasterworld.

It's rude and peculiar to ask for help, get it, then complain.

Johan007

7:28 pm on Jun 19, 2005 (gmt 0)

10+ Year Member Top Contributors Of The Month



P.S. J., thanks for the spam report. I love to hear about any spammy sites that we're missing


GoogleGuy, I too would like to know what Google considers spam. It's critical because there are too many legitimate methods, such as in my case of the affiliate shop.... and who is J?!

europeforvisitors

7:53 pm on Jun 19, 2005 (gmt 0)



I too would like to know what Google considers spam.

1) We can assume that anything that stays within the Google Webmaster Guidelines isn't considered to be spam.

2) As to what does constitute spam, the definition obviously changes as the spammers' techniques evolve. (For example, AdSense scraper sites didn't even exist until recently.)

3) Spam isn't the only reason why pages might be whacked or downranked in the Google index. Duplicate content is another. For example, an affiliate or e-commerce site might have thousands of boilerplate catalog pages for legitimate reasons, but Google may feel that, since those boilerplate pages aren't adding value for the reader, they don't deserve to be ranked high for their keyphrases. GoogleGuy mentioned the need for "value add" a week or two ago, and I believe he also referred to the need for "diverse" search results (presumably so that users don't have to dig through boilerplate clutter to find a range of information on a topic).

Johan007

8:45 pm on Jun 19, 2005 (gmt 0)

10+ Year Member Top Contributors Of The Month



Europeforvisitors, that's exactly what's happened to me - unfortunately. I have deleted all the "spam"; now what? How long do I have to wait? Not up to six months, I hope, or I will be forced to move the good site over to a new domain! I sent an email via GG's Googlegroups address.

kgun

9:09 pm on Jun 19, 2005 (gmt 0)



Hi everybody, I am back. I laid in my bed and was thinking.

For economists, the assumptions you make are very important.

Assumptions:

1. Google does its utmost to continue as the best search engine in the world, delivering the best SERPs. How many use (live on) these results?

2. Surfers want an up-to-date, dynamic index and archive of web pages.

Result:
How many computers and superfast connections do you think you need to synchronize, replicate, and update thousands of databases, terabytes in size, spread around the world?

Who else can manage that job? Who else can do that job better?

Nevertheless. But why, after 400 pages of discussion, do people start to talk about robots.txt and bots, (NO)FOLLOW, (NO)INDEX, and GoogleBOT (NO)ARCHIVE?

Don't blame Google for that, and you should be able to understand this link (please don't remove it, since it is at least as important as COPYSCAPE):

[loriswebs.com...]

KBleivik
It is up to you to organize. Back to bed for me after a cup of warm coffee from Brazil. What would the day have been without that coffee? What would the day have been with only meta SERPs?

joeduck

9:20 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



Excellent observations by EFV as always.

I would add though that following the Google Guidelines sometimes approaches art more than science. "Adding value" and making a site with users in mind rather than search engines does not mean the same thing to everybody.

Johan - are you saying you put up spam on purpose?

HansDekker

9:25 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



Well, I lost some and I won some in the Bourbon update. I still have the feeling it's far from over. Results in my niche are still very poor: spam sites in top rankings and good-quality content sites at position 100 and below.

I lost money this month, but I still do well in Yahoo and MSN. If there is one thing I learned, it's not to put any trust in Google.

I have to find new ways for my business model to succeed; relying on a spoiled, overpowerful moloch like Google is a surefire way to bankruptcy. If not with Bourbon, it will be with the next update or the one after.

Hans

joeduck

9:30 pm on Jun 19, 2005 (gmt 0)

10+ Year Member



laid in my bed

Kgun! You lucky Norwegians have all the fun while the rest of us just *lay* in our beds and worry about Google.

MikeNoLastName

9:43 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>"So if your page is 301-redirected to a new page, which in turn returns 200, G responds with 'only non-existing pages can be removed'. Period. You have to wait for your 301-ed page to disappear from the index. As the pages are reappearing now - after 6 months - you can wait a long time."

If you are having problems with an old redirected page showing up in the index, here is what I did last week. I can't guarantee in my case it wasn't a coincidence, as the Bourbon update was still going on, but I CAN confirm all of them are now gone, and it generally happened within 48 hours of performing this operation. I WOULD NOT ATTEMPT THIS ON AN INDEX.HTM, INDEX.HTML, INDEX.ASP, OR SIMILAR PAGE, OR ANY PAGE REDIRECTED TO OR FROM A ROOT/HOME PAGE! Doing so anyway is of course at your own risk.

1. Edit your .htaccess to temporarily remove the 301 redirect. Also remove the old redirected page if you have not already done so, and make sure there is no page returned when you access the URL. Make sure it returns a 404. If the page is critical and you can't risk having it unredirected even for 1 minute, then temporarily replace the old URL with a copy of the redirected page carrying a NOINDEX, NOFOLLOW metatag for the purposes of this procedure.
2. Login to the URL console and submit the OLD (404 erroring) URL to be removed using the single page 404 method.
3. Optional: watch your realtime log on that domain for the Googlebot to drop by and try to get the 404'd page. It should not take more than 60 seconds.
4. Your URL console should now say pending.
5. Replace the 301 redirect in .htaccess immediately.
6. Within 24 hours you'll get an error message in the URL console saying that the page cannot be deleted, but you've already accomplished getting GBot to recrawl it. Within 2-3 days it SHOULD be gone from the index.

Again, use at your own risk!
Good luck
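In .htaccess terms, steps 1 and 5 above mostly amount to commenting the redirect out and back in. A sketch, with placeholder paths and domain (and assuming mod_alias):

```apache
# Step 1: temporarily disable the redirect so /old-page.html returns 404
# Redirect 301 /old-page.html http://www.example.com/new-page.html

# Step 5: re-enable it once the removal request is pending
Redirect 301 /old-page.html http://www.example.com/new-page.html
```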

kgun

11:18 pm on Jun 19, 2005 (gmt 0)



So Google should make a little manual that collects all information about:

1. Metatags.
2. Robots.txt.
3. Redirects, hijacking etc.
4. Absolute vs. relative URLs.
5. Submitting the sitemap.
6. Consistency.
7. Anything else.

in one place. What about a little file

google.txt

that automates the process? Why have to log on to Google.com? (I see only two reasons: the possibility to encrypt code to prevent hijacking and manipulation, and the ability to fix things before the spider crawls the site.)

Give an outline of good and bad practices, with some examples.

KBleivik
Make it simple, as simple as possible, but no simpler.

sailorjwd

11:31 pm on Jun 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



kgun,

that last cup of coffee is keeping you up.

patchacoutek

12:05 am on Jun 20, 2005 (gmt 0)



Hi,

I would like to know how many people had all their pages become supplemental results or URL-only, and then came back recently. I'm slowly losing hope. I think that maybe, if all my pages became supplemental or URL-only, I have been banned or got a bad penalty?

Please Help!

Did your pages come back when they were all supplemental?

Thanks
Alex

kgun

12:10 am on Jun 20, 2005 (gmt 0)



Tried to sleep. Some more thoughts.

On the file
GoogleBOT.txt, alternatively on robots.txt

1. Default options.
2. Other options.
- How to treat absolute and relative URLs
- How to treat different redirects
- How to treat error situations
- How to treat www and non-www
3. Procedures for various types of servers. No need to log into the server panel to handle different servers.
4. Possibility to encrypt code!
5. Possibility to encrypt archived pages.

Version 1.0 of manual. Evolving manuals, version 1.1, 1.2 .... 2.0 .....

And, professor GoogleBOT, you have to be backward compatible with GoogleBOT.txt.

KBleivik
Excellent coffee from Brazil. Impossible to sleep.

sailorjwd

12:35 am on Jun 20, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Patch,

For my site supplemental pages have usually meant one of 4 things:

1) too little content
2) dup content on your site
a) true dup content
b) page showing up twice - non www and www version
3) dup content on another site
4) possible 302 redirect issue (hijacking) (also a flavor of dup content)
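For item 2b, a common fix at the time was a site-wide 301 from the non-www hostname to www (or the other way round) in .htaccess. A typical mod_rewrite sketch, with example.com as a placeholder:

```apache
# Redirect non-www requests to the www hostname (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```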

sailorjwd

12:35 am on Jun 20, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Patch,

Pages can come back once some or all of the points are identified and rectified.

theBear

12:37 am on Jun 20, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



MikeNoLastName,

Have you figured out why a Google cache entry for your site is showing in the allinurl for your site?

Have you contacted Google?

I noticed it is still there.

kgun

1:04 am on Jun 20, 2005 (gmt 0)



Please encrypt your comments, send sticky mails or log into the Google panel.

KLastName with sticky mail off. Good night.
Q.E.D.

Tech_32

1:26 am on Jun 20, 2005 (gmt 0)



Hey There All

Just joined

So this is the place everybody keeps telling me about?

Looks like loads of fun.

Comments if I may?

No offense to anyone by any means, but I've been sort of reading around in here for a while, and I'm laughingly reminded of a bunch of old women sitting around a knitting circle by some of the posts in here. LOL

Really though, some good and useful posts

Not having to deal much with the bug we've all come to love as Google.....all of my client sites have regained their original supreme stature and are just swimming right along

Just another busy day here in paradise

Have fun peeps

Atticus

1:36 am on Jun 20, 2005 (gmt 0)



Tech_32,

Welcome to WebmasterWorld.

Some of the old women are actually knitting shrouds for their competition. And look, that one has a machete under her shawl. Yikes!

theBear

1:41 am on Jun 20, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Tech_32, be careful on here, there is a lot of wildlife in the woods.

kgun, never use encryption, to really confuse the target a little cleartext can do the trick ;).

Tech_32

1:42 am on Jun 20, 2005 (gmt 0)



Too Kewl

:)

Tech_32

1:56 am on Jun 20, 2005 (gmt 0)



Says I'm not s'posed to put any outside earl's (urls) and to stay on topic....fair 'nuff

I worked at an online paid directory (name withheld for very obvious reasons) back in the nineties and can really enjoy tearing off my own chunk of red meat every now and again......there aint no kind of life other than the wild kind anyway...LOL

On topic?

Well, I'd just like to say that this business of Google hijacking pages is the biggest bunch of boge I've ever heard. Please...... I haven't laughed as much lately as I did when I read that.

Not only is this place fun, it's also filled with intrigue and mystery.....and of course, comedians. LOL

Seriously though..... Atticus, theBear, thanx loads for the welcome......

I do love my work.... Keeps me out of Vegas, because this is just the business we're (or rather I'm) in.

I've lost client accounts because of the manic-depressive nature of the indexes, but I try to just not get too worked up, or down about it.....It's a darn tough way to make a decent living for most and a horrible disappointment for others even.....But guys.....It's the thrill of the chase....Just remember that

[edited by: Tech_32 at 2:09 am (utc) on June 20, 2005]

This 1225-message thread spans 41 pages.