This 186 message thread spans 7 pages.
Google's 950 Penalty - Part 5
< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >
"That's exactly the sort of sites I'm referring to"
Unfortunately, some commenters on this issue apparently can't be bothered to actually, horrors, look at the SERPs. Authority has a specific meaning with Google, and it's plain that authority sites are what are commonly, mistakenly, hit by this penalty. I don't think this is a good summary of the effect, but one simplistic way to look at it would be to say that authority sites with a volume of quality in-links are sometimes being confused with spam sites with a volume of rotten-quality in-links.
One of the most interesting phenomena is how an authority site can be #1 for red topic, blue topic, green topic and purple topic, but be 950 for orange topic, even though the linking, page structure and keyword usage is basically the same for all of them. Clearly a ranking mistake is being made (either the 950 result, or all those #1's).
[edited by: tedster at 9:17 pm (utc) on Feb. 27, 2008]
qwerty - As nuevojefe says, "To understand the filter you have to understand what they are trying to combat". Problem phrases are likely to be those that spammers tend to abuse. If a phrase is not searched for often, there is little incentive for spammers to flood the SERPs, or for Google to apply these filters.
Don't know how it holds for others, but from what I've seen, problem phrases are searched for often enough that they show up in Google Trends.
You can't be sure which phrases are the problem but you can make an educated guess by looking at the scraper sites that link to the affected page.
I look up each of my pages on Yahoo Site Search. You can block out your internal links and scan through the links that are left.
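The sorting step described above is easy to script once you have exported a backlink list from whatever tool you use. Here is a minimal sketch; the `external_links` helper name and the sample URLs are illustrative, and treating every subdomain of your own domain as internal is a simplifying assumption:

```python
from urllib.parse import urlparse

def external_links(backlinks, own_domain):
    """Filter a list of backlink URLs down to external ones only.

    `backlinks` is assumed to be a list of URLs exported from a
    backlink tool; `own_domain` is your site's bare hostname.
    """
    external = []
    for url in backlinks:
        host = urlparse(url).netloc.lower()
        # Treat the bare domain and any of its subdomains as internal.
        if host == own_domain or host.endswith("." + own_domain):
            continue
        external.append(url)
    return external

links = [
    "http://www.example.com/page1.html",     # internal
    "http://scraper-site.test/copied-page",  # external: worth a look
    "http://blog.example.com/post",          # internal subdomain
]
print(external_links(links, "example.com"))
# → ['http://scraper-site.test/copied-page']
```

What remains after filtering is the list to scan by hand for scraper sites.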
What's somewhat interesting is that I am still receiving traffic for those keywords, and my traffic has not dipped much at all. A friend who is searching says that on his computer my site ranks top 3 in Google for its phrase, while here it sits at position 870.
|What's somewhat interesting is that I am still receiving traffic for those keywords... |
Yeah, in some cases the affected page still ranks for the phrase in question in some DCs. I'm not sure how long that lasts or if it's common.
As MC (no relation to JC) has mentioned previously, their normal penalties can be somewhat based on scheduled re-reviews: if you do X, you get penalized for Y days and then re-reviewed (be it manually or automated); if still worthy of penalization, you may stay penalized for Y+n days, and so on.
So, one must wonder if a similar system is being used here as many people have been filtered, released, re-filtered, etc.
If that is the case it could mean significant problems for some people who just "wait it out". But, based on one case for us, it is also 100% possible to come back without doing anything, and stay back (since early January).
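The penalize/re-review cycle described above can be sketched as a toy simulation. The `review_schedule` name, the day counts, and the fixed extension are all invented assumptions for illustration, not anything Google has confirmed:

```python
def review_schedule(base_days, extension_days, reviews):
    """Simulate the hypothetical cycle: a page is penalized for
    `base_days`, then re-reviewed; each failed review extends the
    penalty by `extension_days`. `reviews` is a list of booleans,
    True meaning the page still looks penalty-worthy at that review.
    Returns total days penalized before release (or after the last
    review, if every review fails)."""
    total = base_days
    for still_spammy in reviews:
        if not still_spammy:
            return total          # passed review: released now
        total += extension_days   # failed review: penalty extended
    return total

# Page fails its first re-review, passes the second: 30 + 15 days.
print(review_schedule(30, 15, [True, False]))  # → 45
# Page passes immediately: just the base period.
print(review_schedule(30, 15, [False]))        # → 30
```

Under this model, "waiting it out" works only if the page passes a review unchanged, which matches the report above of one site coming back without doing anything.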
So, some possibilities include:
1. Scheduled re-evaluation of data - Meaning they look at pages' current criteria and re-rank based on their latest iteration of the algo.
2. Filter tuning and user data - Meaning they simply are adjusting the filters from time to time and evaluating user data for each site and in relation to sets of SERPs.
One thing I can add in our case is that getting more links, of marginal quality anyway, does not help. Nor did changing internal navigation to omit the search phrase in question. (Those links, by the way, work great under any other circumstances.)
Something else to consider is that the algorithm may be more focused on looking for phrases that an automated content generation system would consider related based on the usual methods of sourcing and aggregating data to use in the automated creation of content.
That means it's not necessarily looking for some magical number of related phrases that make the page worthless... if they are all truly related phrases in relation to the context of the page. What it means is that they are looking for some combination of related phrases in-context along with a number of related phrases out of context and penalizing for surpassing a certain threshold.
Basically, to sum it up in one sentence, it's the difference between the co-occurrence of phrases in a machine-generated corpus (like the top 1000 results for "widgets") vs the co-occurrence of phrases if a human handpicked 100 documents about widgets in one particular context.
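The threshold idea above can be illustrated with a toy scorer. Everything here, the phrase lists, the 0.5 ratio, and the `spam_score` name, is a made-up assumption to show the in-context vs out-of-context distinction, not Google's actual test:

```python
def spam_score(page_text, in_context_phrases, out_of_context_phrases):
    """Count related phrases that fit the page's context versus ones
    an auto-generated page would also pull in, and flag the page when
    out-of-context phrases dominate past a threshold (0.5 here,
    chosen arbitrarily for the example)."""
    text = page_text.lower()
    in_hits = sum(p in text for p in in_context_phrases)
    out_hits = sum(p in text for p in out_of_context_phrases)
    total = in_hits + out_hits
    if total == 0:
        return False          # no related phrases at all: nothing to judge
    return out_hits / total > 0.5

page = ("blue widgets for sale, widget reviews, widget coupons, "
        "widget ringtones, widget loans, widget insurance")
print(spam_score(page,
                 ["widget reviews", "widgets for sale"],
                 ["widget ringtones", "widget loans", "widget insurance"]))
# → True (3 of 5 matched phrases are out of context)
```

A hand-written page about widget shopping would match the in-context list heavily and rarely trip the out-of-context list, so it would stay under the threshold.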
*note* - Before I submitted, I just did a sanity check, by searching for one generic keyword and the page we've had penalized is ranking just as it was a few days ago (#55 out of 51,500,000 results). But when another keyword is added, forming a phrase which one can easily determine context from, the page is banished to 950+ realm.
*edited for clarity:
The #55 position keyword is not one we are targeting directly, just a result of links and content for the primary keywords.
And the page in question had maintained top 1-8 rankings for about six two-word phrases which convey context, for the previous 5 months.
For us the issue only affects .com results, google.ca shows all of our rankings intact as they were previously with no penalties whatsoever. Is it possible I have diagnosed this incorrectly or do others notice the same thing?
So just to be clear, you believe this penalty is only about phrase detection and not internal linking to that page using keywords?
The phrase based indexing patents make special note of phrases occuring in anchor text - as well as within other html mark-up and even special grammatical use, such as appearing inside quotation marks. It's all at work, in my opinion, so each individual case may respond differently.
Google's objective is to isolate what it calls 'honeypot' pages, which appear to have been written purely to attract as many keyword phrases as possible. Pages are defined both by the links (especially internal navigation) AND the words on the page. If these two factors, when analysed offline, trigger suspicion, then the page will be treated with suspicion at run time according to specific searches. Google accepts that a suspicious page could still be relevant; after all, a good page may cover many phrases and still be helpful to a user. Therefore there are redeeming factors it will look for at run time.

The best way to understand the process is to imagine that Google puts all suspicious pages for a search to one side, then turns up the normal run-time ranking algos for that set of pages according to the search phrase. If it sees no off-page 'recommendation' under the stricter normal ranking algos, the page remains set aside and ends up 950+. Hence affected pages can rank top for some terms but bottom for others, or did rank top but now rank bottom... or, according to how high they turn up the algo for suspicious pages, you can survive for a few days (weeks?) then get hit again when they tighten the 'suspicious' algo requirements.
As all the run-time ranking algos are being applied to suspicious pages, but in a much stricter way, any number of changes could help signal to Google that your page is in fact relevant for a specific search phrase and not a spam or honeypot page. The best solution is not to be flagged as suspicious in the first place, and this happens offline, with NO search phrase in mind. However, some sectors dictate that avoiding the frequent use of multiple phrases on a page is almost impossible, so in this circumstance you have to use off-page signals that your page is both of sufficient quality and relevant... for the search phrase used. Thus a hit page can still rank (if deemed suspicious for that search phrase), but getting the off-page signals to be effective is very difficult and complicated.
Where I believe Google has got it wrong is that when a page is set aside at run time, the links off that page have no 'local rank' value for other pages. Hence steveb's comment that pages in the same folder can all get hit because a top-level page was hit. That strikes me as a bit harsh, but having said that, I can see how this will weed out clever spammers.
There is no magic solution applicable to all sites that any member can announce here, although the solution is within your grasp. If you are being hit and cannot avoid it by rewriting your pages to avoid being perceived as potential spam, you need to increase the off-page signals that your site is both relevant and high quality. This involves links, and there is a fine line between getting it right and having no effect at all... or increasing the chance of being hit further. SEO is site-specific and, to be honest, you have to spend a lot of time and energy working it out for yourself and identifying what your exact problem is.
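MHes's two-pass model above (suspicious pages set aside at run time and pushed to the end of the results unless a stricter off-page check passes) can be sketched like this. The `rerank` function, the score values, and the 0.5 threshold are one reader's invented illustration of the idea, not Google's code:

```python
def rerank(results, suspicious, strict_score, threshold):
    """Two-pass re-ranking sketch: pages flagged offline as suspicious
    must pass a stricter off-page score at run time, or they are set
    aside and appended after every normal result (the 950+ zone)."""
    kept, demoted = [], []
    for page in results:
        if page in suspicious and strict_score[page] < threshold:
            demoted.append(page)   # set aside: end of the SERP
        else:
            kept.append(page)      # normal pages keep their order
    return kept + demoted

serp = ["a.com", "b.com", "c.com", "d.com"]
flagged = {"b.com", "d.com"}                 # flagged offline
scores = {"b.com": 0.2, "d.com": 0.9}        # hypothetical off-page scores
print(rerank(serp, flagged, scores, 0.5))
# → ['a.com', 'c.com', 'd.com', 'b.com']
```

Note how d.com survives the flag because its off-page score clears the stricter bar, while b.com drops to the very end; this matches the observation that a flagged page can still rank top for one phrase and 950 for another, depending on which check it passes.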
...950 and falling..interesting thread to say the least. And yes, I'm a newbie here so be easy on me if I'm out of place, or in the wrong place. Now my story for what it's worth.
We've had several good content niche sites long before Google was even a goo (10 years for some of them), so naturally when Google came along they were ranked high in its SERPs, as they were in Yahoo. Traffic, repeat traffic and of course AdSense income have been good. That was until the Googlebomb hit us 8 weeks ago. In an instant, over half of all our indexed pages (100's) in G were gone. Result, need I say it: revenue tanked. Many of the pages came back within a week or two, but just when we thought things were getting back to... Boooom! Gone again. And the killer is that there is not a damn thing we can do about it, at least from what we can surmise by the myriad of solutions offered here in this thread. Throw the dice and pick one.
To note, our bombed sites are well designed and structured, no keyword spamming, fresh content, plenty of quality outbound/inbound links from major media, useful info for visitors, ETC. ETC. ETC.
[edited by: tedster at 6:36 pm (utc) on Mar. 6, 2007]
|For us the issue only affects .com results, google.ca shows all of our rankings intact as they were previously with no penalties whatsoever. Is it possible I have diagnosed this incorrectly or do others notice the same thing? |
Yes I see the same thing but for us it's not google.ca it's google.ru and a few others.
|Google's objective is to isolate what it calls 'honeypot' pages ............. you have to spend a lot of time and energy working it out for yourself |
This message #:3272371 by MHes is the best concise, understandable description I've seen.
Sure old sites, hobby sites, academic sites, and many more legit sites have been hit. It's not fair but it doesn't help to curse Google. It is the way it is. You can either wait it out or try some things to see if they will improve your rankings. The old-timers here are not keeping a secret on how to get out of this penalty. It just takes a lot of boring study and tedious work and maybe you will be able to sort out what is wrong and correct the problem.
My site has gotten all its rankings back. One of the 'honeypot' pages which was removed from the serps for its search term is now back. I made changes to the page to remove the overuse of the keyword but the cached version is still showing the honeypot page from before. We acquired some very high-quality backlinks recently which may have caused the page and the site in general to 'bypass' the filters. Some people might call this a 'rollback' but unless everyone sees their rankings come back then it must be the new backlinks that have made the difference and the filters are still lying in wait in case we falter.
Correct me if I'm wrong about my understanding of this please or if you see any misunderstanding in what I've written - thnx.
In addition to this particular filter, I see some odd rankings on the 72-series datacenters. On the datacenters where our pages are gone, either 950 or gone entirely, there are up to three or four listings of the same websites at different positions sprinkled throughout the rankings from 1 to 950. Some of them appear, for example, at around 500, 680, 780 and 950.
One of the pages that made it into the top10 of my main key-phrase recently, has one line of content:
An introduction to widgets – what it is, how it developed, and how to create it.
The rest of the page is advertising and.....internal links. So much for the theory that internal links trigger a phrase-based filter.
I know, phrase-based proponents are just doing their best to find answers, like the rest of us, but imo "phrase-based re-indexing" is still a hollow phrase.
I would be real curious how it ranks for allinanchor. Run the command allinanchor:key word and see where the site ranks. Is it rich in that department or weak? It has seemed to us that if you have a strong, diverse set of external back links, you are immune from this thing.
But who really knows; one of the challenges with this thing is that they are definitely adjusting it, making it hard to gauge whether any changes you made helped or not. I continue to see sites I am absolutely amazed are sitting 2 or 3 spots from dead last.
|Run the command allinanchor:key word and see where the site ranks. Is it rich in that department or weak? It has seemed to us that if you have a strong, diverse set of external back links, you are immune from this thing. |
I'm afraid this isn't accurate, randle, from our standpoint. Our site is at 950 for 3 keyword phrases on the index page, and ranks top 10-20 in allinanchor for those exact phrases...
In allinanchor it ranks 10, in regular SERPs 8. This site is one of the main encyclopedias, though. I've never seen so many ads on one page.
My site has thousands of back links, including links from sites that rank top on my main key-phrase. One site is linking one-way from his homepage to my site, but is currently ranking better than I am.
The former #1 is also affected. It has hundreds of one way links, many of them .edu, while it only has 5 outbound links.
Links don't seem to matter much, as far as the ranking issue is concerned, in my search category.
No matter which way you turn it, nobody has been able to come up with an explanation. On Google Groups, Googlers are very keen on "fixing errors". If you report what they see as an error, they'll fix it the same day.
I posted this a few days back:
"If you search for 'keyword1 keyword2' (one of our key phrases) we are nowhere on some data centres.
If you search for 'keyword2 keyword1' then we rank where we used to rank for 'keyword1 keyword2'. "
We are fine again, and did nothing. It seemed to clear after Google re-spidered our home page.
We're back at #1 after a month.
What we've done (not necessarily effective, as we never get an answer from Google):
- Removed latest widgets, most read widgets and a long menu (blog style) with all kind of widget sections.
- Written several times to Google complaining
- Posted on Google webmaster discussion group
Let's hope this time lasts.
[edited by: Biggus_D at 3:20 pm (utc) on Mar. 7, 2007]
We are also back after a month plus (gone since Feb 1st)
I removed a menu that Google somehow started showing as the snippet in search (?)
Saying a prayer for longevity this time, as well!
I am also back. After being back once already for just 3 days, I too am hoping it lasts this time. I have rewritten my menu system.
I am sorry to say this but was kind of hoping nobody else was showing any improvement, as I feel this is a sign of a google change rather than anything I have done over the last week or so.
Something very interesting - I just checked webmaster tools and it's showing updated link data.
On several of the pages that seem to be affected by this penalty, it's showing TWO entries for the same page with slightly different backlink data.
Could this be something to do with the penalty?
For example - One url is shown twice, one with 69 backlinks, another with 67.
Another url that gets affected is shown twice, once with 12 links and another with 10.
Anyone else seeing this?
Yesterday, I suddenly dropped for almost all my keywords - page 1 to higher than I can count... This has never happened to me before, can't make sense of it...
Just checked the links in webmaster tools and I see the same thing - same urls listed twice.
It's an update of some sort; hang on, there is definitely something brewing.
Lots of strange things I can see, so hold tight.
Three or even four listings of our website between position 1 and 950 in Google for one search term.
No loss in traffic and perfectly fine ranks on about 90% of DC's and 950 on the rest.
Something is definitely going on here!
Just to be clear, are we seeing this on all types of sites?
My site is on a shared server. It would be telling if it turned out that no one with a dedicated server or VPS were following this thread.
BTW, for every post, there are probably a thousand views here.
On my sites it looks more like a rollback to last year. Google is inconsistent. Whatever comes will stay only shortly. Google is now frankly incompetent.
Wikipedia on top then loads of bs, sites with mega adsense adblocks. I mean MEGA. User experience ... hahaha
I have a dedicated server and some of my sites on it are in 950 land. Three sites in particular, interlinked (reasonably, in my eyes), and on different aspects of the same theme (human sexuality and sexual health), have been demoted.
Ironically these have all been at number one spot in Google at various times, some for years. The first one was demoted two years ago, the second last April, and the third a few weeks ago.
It seems to me that the clustering analysis effect described earlier in this thread is not only applied to pages on the same domain but also to groups of sites on the same server.
I'd therefore say the occurrence of the predictive phrases over a group of sites has helped to condemn them to the end of the search results, so I'm doing some remedial work right now and will report back if it helps. I mean: new servers, removing interlinking, changing word and phrase density, and altering navigation text and structure.
I also note that all these sites have been demoted in Google image search. I saw the image search thread (very muted, it seems to me, in view of rather large changes) mention a removal of adult images, but I wonder if in fact the explanation lies rather more with a site suffering the 950 penalty?
PS As far as Google's search quality is concerned, like many other affected owners I'd say that in my view the quality of the search results has been lowered, because my sites are not there. They are good sites, full of content (and, naturally, monetized to the full). As Christine Keeler so memorably observed: "Well, he would say that, wouldn't he?" The truth is, once I detach myself from my personal involvement and do a selection of test searches, I really don't notice a diminution of quality. So I don't think this algo is going to be rolled back or reverted any time soon.
Second, and more important, to flog a proverbial dead horse, even if the quality had gone down (however you define that) there is a, shall we say, limited incentive for Google to improve it, since (1) they have no real competition and (2) I'd assume an increase in search quality reduces adwords revenue.
Oh - I just wanted to add thanks to MHES for his input. Most helpful.
|It seems to me that the clustering analysis effect described earlier in this thread is not only applied to pages on the same domain but also to groups of sites on the same server. |
This is sort of off topic but I'd like to see if others think I'm on the track of something or am I just imagining this.
It could be that or it is based on what Google sees as sites that are related to each other not in topic but in possibly coming from the same person or group of people. I don't know how to describe this exactly. But I think there are patterns that Google considers that makes it likely sites are connected in some way no matter what server they are on.
I have three websites and they now all rank at the bottom of the last page of results for common keywords. They used to all rank in the top five results. The first website dropped in June last year, the second was in November last year, and the third was this January. Now I barely get any traffic from Google and see no signs of a recovery.
I am at a complete loss.