Forum Moderators: Robert Charlton & goodroi


Update Saga. Part 5


Brett_Tabke

8:26 pm on Nov 9, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



What say you?

Over and done with?

All done all through?

aeiouy

7:41 pm on Nov 10, 2005 (gmt 0)

10+ Year Member



That's the point the "smart businessmen" are trying to make. Perhaps if people started using the forums as an analytical tool and a place to exchange ideas, as opposed to a support group for sharing the "misery" Google has inflicted, we would all be better off.

The problem is that a significant percentage of webmasters have no real business experience and don't understand the demands and pitfalls of running a business. Too many people were told to tend to the Golden Goose, and then when the Goose dies, they want to blame God for taking it away from them.

I think it is a good wake-up call for lots of people, as it probably is every time a change is implemented. If an algo change causes you to creep out on the window ledge, then you likely have no one to blame but yourself.

I continue to try and figure out what people want. I know of quite a few good and above-board webmasters who have benefited from this change. Yet I see some people here talking about how it is the worst thing ever. It is likely neither the best nor worst thing. It is folly to think Google could make millions of webmasters happy with their place in the rankings. In fact it is folly to think they could make even a significant percentage of them happy.

In some cases I saw improvements from these changes. In others I saw downgrades. I will use all the information I have and make appropriate decisions and adjustments moving forward. No amount of hand-wringing or foot-stomping is going to change the final outcome of the Jagger update. So I would suggest some of you direct that energy in an actually productive direction. Feeling sorry for yourself or getting mad at Google accomplishes nothing for your bottom line.

WebFusion

7:43 pm on Nov 10, 2005 (gmt 0)

10+ Year Member



If what you are doing isn't working, change your approach. Goes along the lines of "never fall in love with your business - it is just a tool - use it until it no longer serves its purpose, then find a new/better one that will".

I think that's more relevant than ever now. While I still think one of the best approaches in the long term is Brett's "26 steps", in the short term there are a TON of things people can do to increase traffic to legit sites (note: by legit, I mean sites that actually contain unique content/products, as opposed to datafeed-generated affiliate sites and/or scraper sites). Heck, one press release alone about a product that we've carried since day one on our site brought in over 10,000 visitors in 24 hours and generated 133 sales. I should also note that press release cost us all of $75 to have a freelancer write for us (yet brought in over $4k in profit).
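For anyone who wants to sanity-check that anecdote, the rough arithmetic looks like this (figures are the round numbers from the post above; the implied profit per sale is just what they work out to, not a tracked stat):

    # Back-of-the-envelope numbers from the press-release anecdote above.
    visitors = 10_000   # visitors in 24 hours
    sales = 133         # orders generated
    cost = 75.0         # freelancer fee for the release, in dollars
    profit = 4_000.0    # reported profit, in dollars

    conversion_rate = sales / visitors   # 0.0133 -> 1.33%
    profit_per_sale = profit / sales     # ~$30 implied per order
    roi = profit / cost                  # ~53x return on the $75 spend

    print(f"Conversion: {conversion_rate:.2%}, per sale: ${profit_per_sale:.2f}, ROI: {roi:.0f}x")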

How many webmasters have gone the route of actually offering unique content to other quality sites in their niche simply for the courtesy of a link back? We actually have a contract with a few freelancers to create 5-10 new articles a week for OTHER sites in our genre, simply to build one-way links back to ours. It's a win-win, and generates both targeted traffic AND slowly builds our free traffic (which again, is just a bonus).

There are more ways to build traffic than the most obvious. Think outside the box. While this Christmas may be a bust, by this time next year, you could be in the position to give google the finger in their stocking ;-)

aeiouy

7:45 pm on Nov 10, 2005 (gmt 0)

10+ Year Member




I get Google Alerts and find many "scraper" sites that are linking to me. Why doesn't Google come up with a way for us to BLOCK these links from being associated with our sites...say if they link to you without permission.

Google would do that if they could. The problem is determining who "owns" the content. What if the scraper site tells Google to have you stop using "their" content before you make your claim? Then you are left out in the cold. That is the crux of the issue. Defining ownership is simply a very difficult task, and not one that any algo, at this point, is very good at figuring out.

reseller

7:50 pm on Nov 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



LegalAlien

>>reseller,
I never intended to offend anyone. I just wanted to express some serious frustration at your incredibly happy approach throughout this update -- up at 5:30 with a bright smile...<<

I understand your frustration. That's human.

Ok. Here are some webmaster and personal things mixed together ;-)

As I mentioned in my previous post, I took a hit on 2-3 Feb 2005 (Allegra) where I lost 75% of my Google referrals. Then I took a second hit on 22nd July 2005, which left me with around 5-10% of my pre-Allegra Google referrals.

Then something happened around 19-22 September 2005, when I could see a little traffic returning, but not much.

So when the Jagger update started, I really had not much to lose and everything to gain. Hence the smile :-)

>>-- what brand of coffee do you buy? I've got to get me some! <<

Nescafe Instant Coffee Gold Blend, usually followed by a cup of a Danish brand cappuccino :-)

Hanu

7:56 pm on Nov 10, 2005 (gmt 0)

10+ Year Member



Defining ownership is simply a very difficult task, and not one that any algo, at this point, is very good at figuring out.

What about first come, first served? That's gonna catch 99% of the scraping cases.
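For what it's worth, first-crawled-wins is trivial to sketch, and the sketch also shows the hole aeiouy is pointing at: whoever gets crawled first "owns" the text, even when that's the scraper. A toy version (hypothetical names, nothing like a real crawler):

    import hashlib
    import time

    # Maps a fingerprint of the page text to the first (timestamp, url) seen with it.
    first_seen = {}

    def fingerprint(text):
        # Normalize case and whitespace so trivial edits don't dodge the match.
        normalized = " ".join(text.lower().split())
        return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

    def classify(url, text, crawl_time):
        """'original' for the first URL crawled with this text, 'duplicate' after."""
        fp = fingerprint(text)
        if fp not in first_seen:
            first_seen[fp] = (crawl_time, url)
            return "original"
        return "duplicate"  # first come, first served: later copies lose

    # The catch: if the scraper gets crawled before the author, the scraper "wins".
    print(classify("http://scraper.example/copy", "my unique article", time.time()))
    print(classify("http://author.example/page", "My  Unique   Article", time.time()))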

arubicus

8:06 pm on Nov 10, 2005 (gmt 0)

10+ Year Member



"How many webmasters have gone the route of actually offering unique content to other quality sites in their niche simply for the courtesy of a link back. We actually have a contract with a few freelancers to create 5-10 new articles a week for OTHER sites in our genre, simply to build one-way links back to ours. It's a win-win, and generates both targeted traffic AND slowly builds our free traffic (which again, is just a bonus)."

Actually it's a win-win-win situation: for you, for the site you provide content to, and for the visitor. It also helps build the all-important BRAND and helps move a site into authority status. On the plus side, you also develop relationships with other site owners, big and small, throughout your industry. Many will start coming back for more info and articles and begin to look to YOU as the leader of that industry.

Markoi

8:12 pm on Nov 10, 2005 (gmt 0)

10+ Year Member



From Matt Cutts' blog:

Q: Do you still want spam reports if I see things like hidden text, hidden links, etc.?
A: Absolutely. We're working through the reports that we've received, but I'd love to have more. Just to repeat, you can do a spam report at [google.com...] with the keyword "jagger3" and we'll check it out.

G can't filter it out, so we have to do it? Strange!
I reported a few, and they are still in the SERPs.

Take a look at this.

[google.com...]

How much spam do you need to filter it out?

2by4

8:18 pm on Nov 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"How much spam do you need to filter it out?"

Enough to adequately train the filters is my guess. How much that is depends on how creative SEOs are. I've seen radically different ways to generate hidden text, for example, each more creative than the last, and each harder to detect than the last.
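To make "training the filters" concrete, here's one naive heuristic such a filter might start from (a toy sketch, not how Google actually does it): flag inline styles that hide an element outright or paint the text the same color as the background.

    from html.parser import HTMLParser

    # Toy hidden-text detector: catches two classic tricks in inline styles.
    # Real filters also handle external CSS, near-matching colors, off-screen
    # positioning, 1px fonts, and whatever gets invented next.
    class HiddenTextDetector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.hits = []

        def handle_starttag(self, tag, attrs):
            style = dict(attrs).get("style", "").lower().replace(" ", "")
            props = dict(p.split(":", 1) for p in style.split(";") if ":" in p)
            if props.get("display") == "none" or props.get("visibility") == "hidden":
                self.hits.append((tag, "hidden element"))
            elif "color" in props and props.get("background-color") == props["color"]:
                self.hits.append((tag, "text same color as background"))

    detector = HiddenTextDetector()
    detector.feed('<div style="color:#fff; background-color:#fff">free widgets</div>'
                  '<span style="display:none">more keyword stuffing</span>')
    print(detector.hits)  # [('div', 'text same color as background'), ('span', 'hidden element')]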

flyboy

8:33 pm on Nov 10, 2005 (gmt 0)

10+ Year Member



I'm seeing some interesting things going on with Jagger3; my chronological problems seem to have been fixed. Googlebot/2.1 is updating pages that were URL-only or supplemental, new pages are being added, and old 404 pages are disappearing. I have supplemental results going back to April 2004.

MozillaBot has probably been on every page twice last week and once this week, not doing or changing anything. Googlebot hits only 3 or 4 pages a day out of about 600, but I see the updates within 48 hours. The home page gets a fresh tag every 48 hours.

The biggest obstacle I still see is self-inflicted duplicate content. In January I introduced a logic bug into my site: when paging through products in a category, I changed the case of the URL. Now my category page is URL-only, and my bad-case Category page is supplemental.

I had the bad-case URL blocked by robots.txt for several months, returning a 404; only MozillaBot came to the page. Just last week I removed it from robots.txt and am now returning a 410, hoping Googlebot will be by soon.
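(For anyone fixing the same kind of case bug: instead of letting the wrong-case URL exist and then killing it with a 410, you can 301-redirect any mixed-case path to its lowercase canonical form, so the duplicate never gets indexed at all. A minimal sketch as Python WSGI middleware; the app and paths are hypothetical, and it assumes your URLs are meant to be all lowercase:)

    # 301-redirect mixed-case paths to the lowercase canonical URL, so
    # /Category/widgets and /category/widgets can't both get indexed.
    def canonical_case_middleware(app):
        def middleware(environ, start_response):
            path = environ.get("PATH_INFO", "/")
            if path != path.lower():
                start_response("301 Moved Permanently", [("Location", path.lower())])
                return [b""]
            return app(environ, start_response)
        return middleware

    def demo_app(environ, start_response):
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"category page"]

    application = canonical_case_middleware(demo_app)

    # Quick check without a server:
    if __name__ == "__main__":
        from wsgiref.util import setup_testing_defaults
        env = {}
        setup_testing_defaults(env)
        env["PATH_INFO"] = "/Category/widgets"
        seen = []
        application(env, lambda status, headers: seen.append((status, headers)))
        print(seen)  # [('301 Moved Permanently', [('Location', '/category/widgets')])]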

These pages began performing badly in MSN and Yahoo, but came back to life within weeks.

Am I correct in assuming G has a within-site duplicate content penalty? Has anybody seen this self-correct, or should I go beg GG and friends for forgiveness?

LegalAlien

8:35 pm on Nov 10, 2005 (gmt 0)

10+ Year Member



2by4,

>> Google isn't a public library, which does have this type of responsibility. It's a publicly held media corporation, whose principal responsibility is to its shareholders. <<

My comments were relating to this update and not Google in general.

Google stores other people's data and sorts it into an organized index, thereby allowing users to find that data. It does not own that data, it makes no changes to that data, it is not liable for that data, and it doesn't write any data of its own (referring to the search side of Google). It simply indexes and presents it. That sounds a lot closer to a library than to the other media examples you gave.

Whether publicly or state owned, does a library have the right to remove a large portion of its most popular books just because it can't index them properly? Perhaps it does, but it also has a responsibility to its users, authors and publishers to ensure those books are back on the shelves within a reasonable amount of time, no?
