
Google SEO News and Discussion Forum

Google's 950 Penalty - Part 9
annej




msg:3336309
 9:13 pm on May 10, 2007 (gmt 0)

< Continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

That's because we are shooting in the dark

We really aren't shooting in the dark. We are shooting at dusk. We can see a fuzzy image of what is out there. Sometimes when we shoot we hit the target, and other times we miss no matter how hard we try.

But we waste our time when we start shooting at theories like

- Google is doing this so more people will pay for AdWords

- Google only hits commercial sites

- If you have Google Analytics you will be sorry

- Only sites doing something illegal are hit by -950

- It's because you have AdSense on your site

- Scraper sites are doing this to us

It goes on and on.

Is it because the phrase-based theories are not an easy answer? It does take a lot of work to figure out why you might have been 950ed, and sometimes you just can't find the answer. But I still believe that most 950ed pages have been caught in an imperfect phrase-based filter.

[edited by: tedster at 9:14 pm (utc) on Feb. 27, 2008]

 

Marcia




msg:3343622
 1:49 am on May 19, 2007 (gmt 0)

So mattg3, how about telling us how the ADF** works? What factors are they picking up algorithmically to move sites out of the way?

**ADF = Adwords Diversion Filter

mattg3




msg:3343813
 11:11 am on May 19, 2007 (gmt 0)

I was stating a likelihood that AdWords might be in the system, based on the fact that the ethics are a bit wobbly at Google somewhere else. :)

Writing webmaster guidelines that go on about not having duplicate content, and then hardlinking to a site that does nothing else, seems to be obvious proof that these rules are flexible.

Google acts like a normal business, that's all I am saying. Maybe I am glad it's Google and not Yahoo, which is even more closed. I just doubt the claims that there is a holy war on against bad content and for content devaluation. It's pretty obvious this is just part of the bigger picture.

The bigger picture is that Google likes to increase its profit. Doh.. :)

Nothing wrong with that, I just don't believe the whole holiness argument.

So whether AdWords is in the SERPs algo is an open question. We have no direct proof either way.

Maybe the decision on Google's side is that they don't actually do that, because if it came out it would be disastrous for them. Yet that would be a business decision, not an ethical one.

And could people please not PM me about solutions to their 950; I have no more idea about the 950 than anyone else on the thread.

Marcia




msg:3343828
 11:36 am on May 19, 2007 (gmt 0)

I was stating a likelihood that AdWords might be in the system, based on the fact that the ethics are a bit wobbly at Google somewhere else.

No, you were stating a whole lot more than just that simple likelihood; you have repeatedly posted throughout these threads alleging a whole lot more than that. This direct quote precisely illustrates the allegations you've been making:

Well I guess there is a seasonal dial, one for getting rid of the AdWords ads and so on. Some might be automatic. If the bounce rate [and whatever they use to determine user satisfaction] on page 1 and 2 rises, reshuffle.

Every update has to at least match the income of the previous setting or improve on it. So maybe they go for a medium bounce rate so that people click on the ads. Given the manpower and the finances, you can, I guess, build a nice system that searches for a maximum and checks whether you are only at a local maximum and so on (a maximum can be local, so occasionally you want to try totally new values to check for that). Link that to news and adapt accordingly which data set and algo variation will be used.


So tell us how this seasonal dial is working, and how it's known what every update is supposed to do. That kind of knowledge is miles ahead of the rest of us, I must admit.

And as for the other statements I've bolded, beyond those specific points, please share some substantiation with us as to what you're basing the allegations on that keep getting posted repeatedly. Particularly the part, stated as FACT, about what's required with every update.

It's time to stop holding back. Please do share the basis for your facts, we'd love to hear it.

mattg3




msg:3343876
 12:46 pm on May 19, 2007 (gmt 0)

I always state guesses, which I have said repeatedly.

Allegations? Is this a DISCUSSION forum or a criminal court? Loosen up. :)

But I have shown that Google does not stick to its own quality guidelines. If you choose to ignore that, it's your own decision.

Is it not a fact that Google continues to increase profits? So have a think about how they could do that. Maybe it's in a different way. Unlike you, I am NOT that damn sure about what I state.

Why don't you prove that they don't use AdWords in the SERPs? It's just a matter of opinion; there are very few hard facts. Maybe if you reread what I say without those particular glasses on, you'll get a different impression.

But if you insist on putting me in one camp, that's also your decision. The simple hard truth is we don't know. :)

mattg3




msg:3343907
 1:33 pm on May 19, 2007 (gmt 0)

So tell us how this seasonal dial is working, and how it's known what every update is supposed to do.

I can tell you my guess, i.e. that more shopping sites come up at Christmas time, as this is what people do. That way you improve the user experience and increase profits for shopkeepers and AdWords. Should make everyone happy.

And why the hell not? Economics are part of things. People need to pay for mortgages, meals, kids in school, whatever.

crobb305




msg:3343946
 2:52 pm on May 19, 2007 (gmt 0)

I always state guesses, which I have said repeatedly.

mattg3, Matt, I haven't read all of your posts, but you may have some good ideas. So keep posting! That is why we are here :)

JoeSinkwitz




msg:3344137
 7:18 pm on May 19, 2007 (gmt 0)

Had another site go to the end of the SERPs this morning after cracking the top 10 two days ago; I wish I had been tracking overall backlinks and scraper-link percentages on that domain prior to the event, but for this particular case, really really junky inbounds in a quick period of time may have been the culprit (meaning a bunch of new scraper links tagged me like a teenager with a fresh can of spraypaint).

A follow-up on another test for a different domain: severely de-optimizing a site didn't bring it back after a re-cache. We left the desired phrase in the title and mentioned it twice on a page with 300+ words, removing most synonyms too, with the intent being to describe the service as vaguely as possible without completely ruining the user experience.

There are a lot of other tests going on with a lot of other domains; if anything major happens in terms of changing on-page or off-page factors, I'll pipe up again.

I still very much believe that the mess is related to the recent phrase-based patents, but I fear more and more that the quality of the top 10 results is roughly equal before and after the re-ranking (when looking at a large number of different queries), meaning that the collateral damage effect might be around a lot longer than any of us care to see.

Cygnus

[edited by: JoeSinkwitz at 7:18 pm (utc) on May 19, 2007]

ALbino




msg:3344196
 8:57 pm on May 19, 2007 (gmt 0)

Instead of worrying about AdWords, let's worry about -950, all right? :)

annej




msg:3344244
 10:30 pm on May 19, 2007 (gmt 0)

Although the message from MC implied we needed to de-optimize, I think the de-optimization has to be focused on the phrases that are causing the problem.

It seems like one could de-optimize the wrong things and just make things worse. They are expecting a document to have a certain number of related phrases in relation to how many times the flagged phrase is seen. It seems there are problems if you go over or under.

This is why it's nearly impossible to tweak your way out of -950 on a large scale, which is exactly what Google is hoping. The typical spam site operates on a large scale.

The problem is that regular pages can be caught in the filter through natural writing.

I found that I was able to get most of my 950ed pages back, but a few I couldn't. The de-optimization I did was related to what I thought might be the problem phrases, not overall changes to my site.
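To make the "related phrases in relation to the flagged phrase" idea concrete, here is a minimal sketch in Python. It is purely speculative: the phrase lists, the ratio band, and the thresholds are invented for illustration, since nobody outside Google knows how the phrase-based filter is actually tuned.

# Hypothetical sketch of the "related phrase ratio" idea from the
# phrase-based patent discussion. All thresholds and phrases are invented;
# this is not Google's algorithm.

def phrase_count(text, phrase):
    """Count case-insensitive occurrences of a phrase in the text."""
    return text.lower().count(phrase.lower())

def related_phrase_ratio(text, flagged_phrase, related_phrases):
    """Return (flagged count, related count, related-to-flagged ratio)."""
    flagged = phrase_count(text, flagged_phrase)
    related = sum(phrase_count(text, p) for p in related_phrases)
    ratio = related / flagged if flagged else float("inf")
    return flagged, related, ratio

def outside_expected_band(text, flagged_phrase, related_phrases,
                          min_ratio=0.5, max_ratio=5.0):
    """True if the page has too few OR too many related phrases relative
    to the flagged phrase, the "over or under" problem described above.
    The 0.5 and 5.0 bounds are made-up numbers."""
    flagged, related, ratio = related_phrase_ratio(
        text, flagged_phrase, related_phrases)
    if flagged == 0:
        return False  # the phrase isn't targeted at all
    return not (min_ratio <= ratio <= max_ratio)

# Example with invented phrases for a hypothetical history page:
page = ("The widget war began in 1900. The widget war changed widget "
        "making forever. Widget war, widget war, widget war.")
print(outside_expected_band(page, "widget war",
                            ["widget making", "treaty", "1900 campaign"]))

In this invented example the page repeats the flagged phrase five times but contains only one related phrase, so it falls below the assumed band; adding related phrases, rather than only cutting the flagged one, is what would bring the ratio back inside.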

TaLu




msg:3344253
 10:52 pm on May 19, 2007 (gmt 0)

Experimenting here:

One site was in the 950 hole. I deleted all the content (all of it) and put up only an empty index.html with the title "No title", no description, no keywords, and nothing in the body except the text "Name site is closed", where "name" is also the penalized keyword. Four days later the site came back to position 2 for the 950ed keyword. Then I changed the body text to just "Site closed"; it only held position 2 for one day, and now the site isn't in ANY position. I think that's because the keyword no longer appears anywhere except in the domain name.

Today I changed the site again and put the keyword back in the title and body, but the site still doesn't rank anywhere. I will update this thread.

Regards.

mattg3




msg:3344263
 11:09 pm on May 19, 2007 (gmt 0)

but for this particular case, really really junky inbounds in a quick period of time may have been the culprit (meaning a bunch of new scraper links tagged me like a teenager with a fresh can of spraypaint).

Yep, like my domain spammer network. Solicited or unsolicited inbounds may really be a part of this. Unsolicited links being part of it would be the worrying thing.

mattg3




msg:3344266
 11:16 pm on May 19, 2007 (gmt 0)

Instead of worrying about AdWords, let's worry about -950, all right? :)

On the day I got 950ed I saw a 100% increase (i.e. around 30,000 PI a day) in site-targeted ads. Personally I would see this as a sign of quality, but hey, in G's world the funniest things happen.

We can at least probably guess that targeted ads don't contribute to SERP rankings, OR they tagged me with some off-topic theme, or, since the advertiser is the biggest science publisher in Germany, they did something else. We sold them our domain for 3 years, so they are interested in us.

ALbino




msg:3344296
 1:03 am on May 20, 2007 (gmt 0)

Right, but for the rest of us I seriously doubt Google -950'd our sites to increase AdWords revenue. Especially since, at least in my case, we don't run AdWords/AdSense/AdWhatever. We have no advertisements on our site. This isn't the issue, and it's off topic. Start a new thread about AdWords manipulations if you feel compelled to talk about it.

mattg3




msg:3344314
 1:48 am on May 20, 2007 (gmt 0)

950 seems to be a general penalty for a myriad of things, one of them being bad inbound links.

I don't believe AdWords would cause a 950. I just mentioned it because it coincided with something I would see as positive for a site: getting more site-targeted ads, meaning advertisers want to see themselves on it. If advertisers want to be on a site, one wouldn't expect a downranking, unless it would cost Google money because they are not bidding. Even if Google were as cynical as that, which I do not think is the case, a 950 would be an exaggerated measure. I think AdWords and 950 are unrelated, unless they link themes somehow. But most of my ads fit perfectly and are very well targeted.

I would really say that MC's answer to annej is the best hint: it's an SEO penalty.

You either have bad inbounds or bad outbounds, or you have fiddled too much with your server.

Since the guy that jumped back has duplicate external content and duplicate internal content en masse, it doesn't seem to be duplicate content related; he also uses 25 keywords and has a botched-up title that goes for a definition. From that I would guess it's a too-much-fiddling SEO penalty as opposed to SEO per se.

If you have done no SEO, you are probably caught up in unsolicited bad inbounds.

GUESS! based on my observations.

I wonder why MC states "more likely reading SEO forums". Just trying to cloud the clear sentence: "you did SEO"?

ALbino




msg:3344408
 4:33 am on May 20, 2007 (gmt 0)

Did we SEO? Not in the traditional sense. There are no H1 tags, for example, nor did we intentionally try to repeat any keywords that aren't organically there. The structure is very specific, with definitive navigation at the top and bottom and no boilerplate text outside of the navigation links. The only way we really did any SEO at all was to make sure the site is very spider-friendly with a dial-down style of browsing.

None of the affected pages have ads... in fact, none have OBLs at all. It's all unique content, written by us and for our site. There are a couple of minor scraper sites, but certainly not enough to affect us I would think, and actually, many of the affected pages have no scrapers at all that I can discern via G or Y.

Lack of deep IBLs to those pages? Probably true for quite a few of them, but certainly not all of them. Maybe no "quality" ones, though. Even so, getting quality IBLs may help alleviate the situation, but I don't believe this is the cause of being in it in the first place. That's just a hunch though.

mattg3




msg:3344494
 9:11 am on May 20, 2007 (gmt 0)

So it's pure chance? Not saying it isn't, which would be worse.

Did you work on this site recently? Change ISPs, set up round-robin DNS, register any other sites that could have done something?

G once mixed up two of my servers (one empty, one full), so their detection isn't perfect.

ALbino




msg:3344691
 3:54 pm on May 20, 2007 (gmt 0)

The site hasn't had a major redesign in at least 6 years. The DNS and ISP have remained static for probably close to that long. We did add an additional way to dial down from the first tier around last July, but we didn't go -950 until February, so I don't think that's related. That was the last change we had made, outside of adding new content, before going -950. Like I said, we don't SEO, we never "tweak" anything. Our results have been good for years and years -- so why mess with it? For some of the -950 pages we'd been listed #1 for years until now. It came out of nowhere, and we changed nothing.

Just for fun, I picked one of the -950 pages right now and plugged it into the Web Archive, which has it listed as starting Feb 22, 2003, and the cached page is almost identical to the one that's currently live (outside of some minor stylistic changes). So we ranked #1 for that two-word phrase for almost 4 years, and now we're -950? Maybe lack of freshness is our problem and we should switch ISPs, mess with the DNS, and do a massive site overhaul? :)

It's a mystery for sure.

annej




msg:3344812
 7:20 pm on May 20, 2007 (gmt 0)

ALbino, I don't think it is a mystery. A lot of content sites and/or pages that have been around for years have been 950ed.

I believe that such sites have been caught in a phrase based filter that Google designed to reduce spam pages. No algo is perfect and some content pages just got caught in the net. It's not that we did anything different. Google Search is what changed.

Knowing that, we have to hope that Google is constantly adjusting this filter and that eventually it will more accurately target just spam sites. Meanwhile you can look back through the 950 threads to see what some of us tried and what seemed to work.

The phrase-based spam patent is a pain to read, but you might get some insight from looking at it.
Several patents including this one are listed at
[webmasterworld.com...]

ALbino




msg:3344855
 8:48 pm on May 20, 2007 (gmt 0)

I've actually been following this thread all along, but didn't start posting until Part 3. I was just recapping my specific "sob story" for matt.

Marcia




msg:3344914
 11:24 pm on May 20, 2007 (gmt 0)

Incidentally, I think *part* of what's happening is not a penalty coded into the algorithms (plural) but something more like a query-time action or a filter. When a site can pop out in a few days' time, IMHO it isn't reflecting an algo change in the usual sense of the word.

Added:

In other words, query time reranking or filtering.

For some of it, that is - not all of it. There are some sites that appear to be penalized for obviously well-deserved reasons, but not all. It's not those, it's the others I'm referring to, the ones that are page/phrase specific and can pop in and pop out.

[edited by: Marcia at 11:29 pm (utc) on May 20, 2007]

tedster




msg:3345006
 2:09 am on May 21, 2007 (gmt 0)

It seems like one could de-optimize the wrong things and just make things worse. They are expecting a document to have a certain number of related phrases in relation to how many times the flagged phrase is seen. It seems there are problems if you go over or under.

I think some precision about what the word "de-optimize" means might serve us well here. Many people "optimize" by focusing their on-page factors and link anchor text on their search phrases, right? Some get more precisely detailed and look at the old-time parameters such as keyword weight, prominence, repetition etc. But who among us has developed a habit of asking "do I have enough related vocabulary on this page?"

I'm beginning to suspect that one common reason for trouble may be either no, or too few, co-occurring phrases. This would happen because of our "old-school SEO", text-match mindset. Almost without thinking my habit has been to find more opportunities in the content to get my target keywords in place more frequently. That knee-jerk tendency of mine also tends to eliminate many semantically related phrases - things that might naturally "co-occur" on the topic.

I recently had some success in moving some URLs from page four to page one in a single jump, simply by diversifying the language on the page. No, I wasn't working with a -950 issue in either case, but the approach I used and its startling effectiveness seemed very suggestive.

So one answer may be adding, and not subtracting -- and that's my point here.
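As a rough way to ask tedster's question ("do I have enough related vocabulary on this page?"), here is a small self-audit sketch in the same speculative spirit. The related-term list is something you would assemble yourself (from Wikipedia, competitor pages, or your own knowledge of the topic), and the 60% coverage target is an arbitrary assumption, not a known Google value.

# Rough self-audit for the "related vocabulary" question above.
# The term list and the 60% coverage target are assumptions for illustration.

import re

def vocabulary_coverage(page_text, related_terms):
    """Return the fraction of related terms appearing at least once."""
    text = page_text.lower()
    present = [t for t in related_terms if t.lower() in text]
    return len(present) / len(related_terms), present

def audit_page(page_text, target_phrase, related_terms, min_coverage=0.6):
    """Report target-phrase repetition versus related-vocabulary coverage."""
    repeats = len(re.findall(re.escape(target_phrase.lower()),
                             page_text.lower()))
    coverage, present = vocabulary_coverage(page_text, related_terms)
    print(f"'{target_phrase}' appears {repeats} time(s)")
    print(f"related vocabulary coverage: {coverage:.0%} "
          f"({', '.join(present) or 'none'})")
    if coverage < min_coverage:
        print("-> consider ADDING related phrases rather than cutting the target")

# Invented example for a hypothetical page about a war:
audit_page(
    "The northern campaign of the widget war began after the treaty failed.",
    "widget war",
    ["treaty", "campaign", "armistice", "casualties", "battlefield"],
)

Run against a page before and after a rewrite, the numbers give a crude sense of whether an edit added vocabulary or merely diluted the target phrase; it says nothing about what Google actually measures.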

TaLu




msg:3345015
 2:25 am on May 21, 2007 (gmt 0)

ALbino, one aspect I have seen in -950 pages is how static the pages are. If a page hasn't changed for years, maybe the algo thinks the page is not important for the results because it is out of date.

I think update frequency is now an important factor in getting well ranked.

Anyone think the same?

annej




msg:3345021
 2:37 am on May 21, 2007 (gmt 0)

"do I have enough related vocabulary on this page?"

Yes, yes, yes!

I was just rereading the spam patent and had the same idea. I'd been concentrating on phrases that were already there and I should have been looking more at what related phrases were not there.

That would fit my phrase that was the name of a war. To decrease the use of that phrase I substituted two other phrases that would let people know what war I was writing about. I even used these related phrases in some internal links.

Maybe it wasn't the phrases I got rid of that got the pages back. Maybe it was the phrases I added.

annej




msg:3345023
 2:40 am on May 21, 2007 (gmt 0)

The problem with the theory that pages that haven't been updated for a long time are more likely to get 950ed is that I have tons of old pages that are still in the top few search results. I'm guessing I'm not alone in this.

ALbino




msg:3345041
 3:14 am on May 21, 2007 (gmt 0)

I seriously doubt the age has anything to do with -950 specifically. We have many pages older than the one I cited above that haven't changed either and aren't affected by the -950 filter.

ALbino




msg:3345044
 3:18 am on May 21, 2007 (gmt 0)

So I guess what annej and tedster are almost saying is that the way to get out of the filter is to try and "look" like other pages? So if you have a movie site you should spam it with IMDb cast lists, or if you have a book site you should spam it with Amazon book reviews? Or with a war site you should just fill it full of Wikipedia info? That sadly actually sounds like it might work, but would be just awful for the users.

[edited by: ALbino at 3:23 am (utc) on May 21, 2007]

tedster




msg:3345049
 3:32 am on May 21, 2007 (gmt 0)

I'm mostly saying to write more naturally and expansively -- and to take off those text-match keyword blinders. Seeing how Wikipedia discusses a topic is a decent place to start to break down your blind spots, but I would definitely not say to just grab that content, or even snippets. That's not the point at all. Just think more like a user --- what they would expect to see, what's natural and comfortable to the vocabulary of your topic -- and don't avoid those word choices.

I want to be clear that this is still theoretical in regard to the -950 phenomenon. The approach has helped me rank some rather depressed pages, but they weren't -950 pages. In fact, I just made the "add, don't subtract" connection a few minutes before I posted about it.

Marcia




msg:3345056
 3:51 am on May 21, 2007 (gmt 0)

Getting out of the thing, or avoiding the thing, is kind of closely related to what Matt has been advising for ages - to make sites for users, not search engines - which is pretty darn close to what the Eisenbergs teach about identifying the persona for a site for conversions. It isn't about SEO and keywords, it's about marketing and knowing how to communicate with users.

I call it "white hat content spamming" but it isn't really any kind of a hat. It does, however, take not page-stuffing a site with redundancies, to keep the most important pages out of the supplemental index (which is another topic we should be looking into in far more depth).

Sure there can be pages that slip through the cracks, but if a site does rank for relevant pages and users like it, they will find the other pages on the topic that are up on the site.

[edited by: Marcia at 3:54 am (utc) on May 21, 2007]

JoeSinkwitz




msg:3345076
 4:10 am on May 21, 2007 (gmt 0)

Tedster,

My severe de-optimization efforts probably could have been made clearer. Prior to modifying the text, it contained what one might consider a sufficient amount of synonyms and appropriate-usage vocabulary, judged by the similarity of keyword usage percentages against the other non-EOSed sites. After the de-optimization efforts, the desired phrase was all but stripped from the text, using vaguer descriptions that wouldn't have alternative meanings. It didn't budge after the re-cache, so that site is being reverted in order to test another element.

For fun I loaded the text of a different EOSed site I control into a blogspot, pumped it full of links, and saw that it didn't get EOSed in the re-ranking once it popped. So, it isn't entirely phrase-based... other elements are at play that we haven't figured out yet. Aside from scraping and cloaking to regain the ranking of a site, there must be a legitimate way that naturally written content can come back.

Biggus_D




msg:3345081
 4:25 am on May 21, 2007 (gmt 0)

there must be a legitimate way that naturally written content can come back.

I would like to see that. Almost 4 months later (and adding content every day) our stats not only fail to improve: they are getting worse.

mattg3




msg:3345193
 8:08 am on May 21, 2007 (gmt 0)

So all pages have to be like Wikipedia, but not too much like Wikipedia unless you are one particular company that gets hardlinked?
