|Google's 950 Penalty - Part 12|
< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >
It makes no sense for me to worry about TBPR when I was badly hit by a -950 penalty (look! my PR rose on almost all pages ... but as I said, who cares?).
Please all -950ers come here and join this thread to group possible causes.
Here are mine:
1) E-mail to Adsense team about an account creation with domain name
2) Too many adsense boxes
3) Mildly over-optimized pages
4) Too similar titles
5) Some directory links (as almost all my competitors have, though)
I'll add that no big changes were made in recent months!
Join -950ers power :-)
[edited by: tedster at 9:08 pm (utc) on Feb. 27, 2008]
Ted, excellent post as usual. I often think the most successful people in our field take the time to read, review and hypothesize on just these types of patents.
And really what you are alluding to speaks of the types of properties of any given document and the WAY in which Google might see your document.
freelistfool, thanks for confirming it. Until now, pure text has been weighted less than anchor text; this has changed now. Text is weighted more.
What I don't understand is why they hit 95% of local yellow pages sites. Those sites have a naturally high co-occurrence of phrases due to the nature of the company names.
|Until now pure text has been weighted less than anchor text, this has changed now. Text is weighted more. |
Seopti, how have you come to this conclusion?
For most of the sectors I watch, linking has always been, and still is, winning the battles. All that has changed, it seems to me, is the re-ranking based on co-occurrence and spam detection of documents in Google, which is causing massive backward movement for some websites.
|Until now pure text has been weighted less than anchor text; this has changed now. Text is weighted more. |
I suspect, despite Google's alleged love of short webpages, it recently adjusted the algo to shift the line in the sand wrt what it considers "too thin."
Some of my thin (underdeveloped) webpages don't do so well now. Other sites with thicker page content are doing better than usual.
"Content is king."
Google seems to be saying, 'If you have something to say, say it. If not, don't bother.'
I used to think it overrespected anchor text, so any shift of emphasis to text is a move in the right direction. But it still has a problem differentiating between anchor text and text. Search results should give the best final page result and Google should not get confused by navigation menu text.
Google seems to be saying, 'If you have something to say, say it. If not, don't bother.'
I'm starting to agree with this, too.
I'm piddling online and the vast majority of pages that have "no pagerank information available" (on domains with PR of course) rather than "0 pr" seem to be ones that are dominated by links in the content area (subcategory pages that link to the content pages). The content pages themselves show PR but their parent pages do not. Google might be bypassing the middle pages with little/no content.
I agree with the "thin content" or "navigation pages" theory. The page on my site that has been affected is a main menu page so it has a little blurb at the top describing what is available on the inner pages then a bunch of category links to get to the next level. It has disappeared for any competitive keyword.
The next level pages are still doing fine (still in the top 10 for the most important keywords).
I'll add some content to the main menu page and see if it has an effect.
I suggest folks under the 950 penalty choose random commercial keywords related and unrelated to your sector, then look who is at the back of the line, 950'd. Check their pages using the Google Highlight Tool from your ToolBar, and see if you can figure out a pattern.
I see one four-letter comm word... 950'd, for example... has a navigation menu with 20 different "unique" internal links, but they all have that one targeted keyword in each of those unique nav links, lol. This is on a site by a former SEO expert.
I think the 950 is all about links--artificial and legitimate--including attempts to use keyword-stuffed internal links as a substitute for natural, powerful backlinks.
Keyword repetition doesn't get more link juice; it dilutes it. Consecutive keyword repetition is the worst. Three times consecutively even in outbound links can get you 950d.
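The "consecutive repetition" idea above can be sketched as a simple audit check. This is purely illustrative: the function name, the run-length heuristic, and the sample anchors are my own inventions, not Google's actual algorithm.

```python
def consecutive_repeats(anchor_texts, keyword):
    """Return the longest run of consecutive anchor texts that all
    contain the target keyword. A rough self-audit heuristic only;
    how (or whether) Google measures this is unknown."""
    longest = run = 0
    for text in anchor_texts:
        if keyword.lower() in text.lower():
            run += 1
            longest = max(longest, run)
        else:
            run = 0  # a non-matching anchor breaks the streak
    return longest

# Hypothetical nav menu: three keyword-bearing links in a row,
# which the posts above suggest is risky territory.
nav = ["red widgets", "red widget sale", "red widget store", "about us"]
print(consecutive_repeats(nav, "red widget"))  # longest run of 3
```

Running a check like this over a sitemap or footer before publishing would at least flag the "three times consecutively" pattern the poster describes.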
Most of the phrases I've not been 950'd for have good backlinks. Others say they got out from the 950 penalty with good backlinks.
The New Google Math
Bad internal links -950
Good back links +950
"the vast majority of pages that have "no pagerank information available" (on domains with PR of course) rather than "0 pr" seem to be ones that are dominated by links in the content area"
Yes, but that has nothing to do with -950. It's happening all over the place, to sitemap pages for example.
In general Google seems to now not like pages with lots of links and photo pages without a lot of text. Both lead a person to think "more text on pages is a necessary thing now"... but still these are general issues, not 950 things.
I think I figured out what tripped the 950 on one of my sites.
Trying to be helpful to visitors (the road to 950 hell is paved with good intentions), I created several webpages which look like search results for specific keywords. I do have Google site search, but the results aren't always as good as the ones I edit, or in the best order, because of weaknesses in Google's search algo. (These weaknesses include its inability to separate navigation links from page text.)
These category index pages don't have any page content besides the link text (anchor text). So it's title (H1), subtitle (H2), then all the links.
Cross-referencing does help visitors, but Google now reads duplicate links as duplicate content (SPAM) if you put all the links it already found on other pages onto one page.
The target pages from the index pages were all linked back to the index pages.
The most recent webpages like this were 950'd almost as soon as they went live.
The thing that's slightly different with this one 950'd page is the actual 950'd keyword is only in the page title, not repeated in multiple links on the page, like my other 950d pages. The only place it's "repeated" is in a navigation link. So it's one page with one keyword plus 50 pages with a link that has the keyword to that page.
So I wonder if Google, which has considered too many backlinks too fast a form of spam, now also considers too many internal links (to the same page) too fast to be spam.
Anyone planning to organize related content on their sites may want to reconsider. Meanwhile, I have to find an alternative site search to Google!
|now also considers too many internal links (to the same page) too fast to be spam. |
Not only that, but possibly in combination with other 'failure checks': the number of identical links pointing to the same page from one page, or a number of 'overly SEO'd variations' pointing to the same page from one page, might trigger it.
|Cross-referencing does help visitors, but Google now reads duplicate links as duplicate content (SPAM) if you put all the links it already found on other pages onto one page. |
I think you are correct.
The site we are having problems with does not suffer any loss of rankings where pages do not have links on every page.
However I believe it has something to do with the percentage of links over unique text-based content.
|I believe it has something to do with the percentage of links over unique text-based content. |
You could be right. I suspect the algo was recently revised, turning the dial to require more text on a site. (My thin, underdeveloped sites were hit.) Which I suppose is reasonable. Sites with more links than text typically don't have much value, so Google would be justified in lowering their rankings.
It could be a percentage, like you suggested, or even a minimum amount of text. Perhaps both. Google's algo could say, 'You need at least X characters/words, and after that no more than Y percent links/text.' We know about Keyword Density. Think of it as Link Density.
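A "Link Density" number like the one described above is easy to estimate yourself. The sketch below counts what fraction of a page's visible text sits inside anchor tags, using Python's standard-library HTML parser. The class name, the metric, and any threshold you'd compare it against are assumptions for illustration; nobody outside Google knows what ratio, if any, it actually uses.

```python
from html.parser import HTMLParser

class LinkDensityParser(HTMLParser):
    """Tally visible text characters, splitting them into
    'inside an <a> tag' vs. everything else."""
    def __init__(self):
        super().__init__()
        self.in_link = 0       # nesting depth of open <a> tags
        self.link_chars = 0
        self.total_chars = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link -= 1

    def handle_data(self, data):
        n = len(data.strip())
        self.total_chars += n
        if self.in_link:
            self.link_chars += n

def link_density(html):
    """Fraction of visible text that is anchor text (0.0-1.0)."""
    p = LinkDensityParser()
    p.feed(html)
    return p.link_chars / p.total_chars if p.total_chars else 0.0

# A hypothetical thin category page: one short blurb, then links.
page = '<p>Short blurb.</p><a href="/a">red widgets</a><a href="/b">blue widgets</a>'
print(link_density(page))  # 23 link chars out of 35 total, roughly 0.66
```

A category page dominated by links will score far higher than an article page, which is exactly the distinction the posts in this thread speculate Google started penalizing.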
I'm also noticing many .cn sites 950'd. Very long multi-hyphenated domains, and urls, too, sit at the back of the line now in obscurity. Keyword-stuffing URLs, of course, are a very old SEO scheme. Link-stuffing is the spam target in Google's crosshairs now.
I just noticed a major international corporation has been 950'd for one of its top, industry-leading products. The 950'd page(s) have little content and mostly links, including repeated (successive) primary keywords in those links.
The interlinked pages have only a one paragraph product summary. (The directory page is PR4, but that's irrelevant.)
I just added a new directory to one site today with about 2-5 times more text than usual and used unique intra-directory nav links with almost no keyword repetition (one repeat in 20 links). We'll see what happens.
I'll soon have to change all my footers, to strip them of repeated keywords. I just hope Google doesn't freak out over huge structural changes, or 1,500+ new links to old/new site directories.
It's easy to imagine Google considering so many new links to one page so quickly "unnatural" or a "spam attempt."
Anyway, even after those changes, I doubt I'd be comfortable making a reinclusion req before my thin pages have more content.
Does Google consider more than one link to the home page per page spam? I like to have a Home link in the header and footer. This site has both, of course, but what about 950'd sites?
Also, would it be unreasonable for the 950 algo to "think" x number of header/footer links is spam (link repetition)? How many is too many?
I suspect I first got 950d after I used a repeated keyword in a footer link. I revised footers on 1,500 pages (sitewide). Not too long after that I was 950d. Probably tripped the 950!?
I also wonder if it's harder to get out of the 950 than just removing one offending thing.
Has anyone brought a site back from -950 in the last couple of weeks?
I find it extremely hard. It seems to me they do not release -950 sites at this time of the year.
One of my websites was 950-ed a few months ago. Month after month, I have made small changes, step by step, to see what could have triggered the 950 penalty:
- reworked navigation (reduced the number of sitewide internal links, grouped them by categories)
- removed all external links
- removed dashes in all filenames (e.g. changed red-square-widgets-canada.html to redcanada.html)
- removed all Amazon links
- removed pages with little content
... all to no avail. A week ago, that site was still around position 950 for all relevant searches.
Then I thought of paid links. Have I ever sold links on that site? - Well, yes, a very long time ago (2-3 years ago), and they were so relevant in their context that they were in fact a real asset to my website. Anyway, these paid links have never been sitewide, and they had been removed over 6 months ago. So they don't seem to be the cause of my site's penalty.
Lastly, 4 days ago, I had a good look at my html. And here is what I found at the bottom of each page:
<!-- bottom sponsor link -->
Copyright <a href="http://www.my_site.com/">www.my_site.com</a>
<a href="http://www.my_site.com/">www.my_site.com</a> is a registered trademark
The comment "bottom sponsor link" was placed there, at the bottom of each page, because about 3 years ago I was thinking of placing sitewide footer links. However, I never implemented that idea, and the only links there had always been links to my own website.
So, 4 days ago, I removed that comment, and I removed the footer links to my site that followed.
And today, my site is out of 950 penalty.
So it looks like I had been punished because I had simply thought of paid links!
[edited by: tedster at 10:02 pm (utc) on Nov. 24, 2007]
[edit reason] moved from another location [/edit]
|I agree with the "thin content" or "navigation pages" theory...I'll add some content to the main menu page and see if it has an effect. |
About a week after I added the content and mixed up the keywords (e.g. instead of "red widgets" I put "widgets in colors of red..."), about half the keywords on the page came back.
Today everything came back, but I think it was due to a new index or algorithm change, because another site where I made no changes at all, and which has been bouncing back and forth for months, came back too.
Long story even longer, adding content and mixing up the keywords helped a bit but didn't solve my problem. I'm still searching and trying to find the fixes that will keep me out of the -950 for good.
site that had gone 950 of mine came back last week
Just changed the sitemap, so there's less repetition of widgets + location/colour etc.
I wonder why Google gets so upset over link keyword repetition versus, say, title keyword repetition. There are so many sites that repeat a keyword as many as three times in the <title> tag, and, um, I don't think it's there to make the user experience so much better.
Google itself says the title tag is very important, so you'd expect webmasters to over-optimize/keyword-stuff it, and then Google to target them for doing that.
I'm guessing that's a natural sequel spam target to the link keyword spam campaign (950), so I took the opportunity from being 950'd to revise certain pages and extract repeated keywords.
I don't like to change title tags, but decided under the circumstances to do it. If nothing else, it looks better in bookmarks.
The silver lining of the 950 for me is cleaning up a bunch of things I didn't like but didn't want to touch under the belief system, "If it ain't broke, don't fix it."
|I wonder why Google gets so upset over link keyword repetition |
Well, Google's algo is very much link driven, more so than the other search engines it seems. So this area of link text is quite sensitive when it comes to "over-optimization". In addition, heavily repeated keywords in internal links can appear on many or even all your site's pages. That's a lot bigger factor than one page's title element.
|So it looks like I had been punished because I had simply thought of paid links! |
Mark, I'm not sure you can make that conclusion. First, as you can see here, something just shifted at Google and other urls also saw the -950 lifted. But more than that, Google seems to be handling purchased links in other ways right now rather than the -950 reranking. Not saying it "can't be" or "won't be", but so far I'm not seeing it - even for run-of-site links.
At any rate, whatever caused the change for you, that's good news you're sharing. Enjoy and make the best of it!
I'm now completely re-developing / re-structuring the troubled website as changing the existing links for Google will not be good for users.
I am sectioning the website and reducing the number of repeated links.
In a couple of months' time I'll be able to post the results!
why will it take a couple of months Pete?
My experience with this penalty is that within 2 weeks of fixing the problem, the penalty is removed.
It depends how many URLs you have. Just imagine a site with 100k URLs, designed for the long tail, that gets hit with -950.
It will take at least 1 month to get the site back. So you have one try every month to figure out this damn -950 hell.
I see some caches for non-supplemental pages (not 950ed either) that were last crawled in July, so even getting Google to see new content on an existing URL could take months, let alone have it reevaluate a penalty.
The last site I managed that got hit had 340,000 pages indexed, and triple that not yet indexed.
The only thing I fixed was the sitemap and navigation, to remove repetition of keywords in anchors, e.g.
new york hotels
Filed a reinclusion request; the site recovered after 5 days.
|why will it take a couple of months Pete? |
Development time unfortunately :(
nippi, thanks for the info. Maybe it makes sense to file a reconsideration request even for algorithmic penalties like -950; it can't hurt. I will give it a try...
|I suspect the algo was recently revised, turning the dial to require more text on a site. (My thin, underdeveloped sites were hit.) Which I suppose is reasonable. Sites with more links than text typically don't have much value, so Google would be justified in lowering their rankings. |
Potentialgeek, I agree with your theory. It would be interesting to know if anyone else posting in this thread falls into this category... I'm pretty sure I do.
My site is still experiencing a penalty, as of the 28th Oct my traffic went down almost 99%. It hurts. Bad.
I submitted a re-inclusion request yesterday, do I have any chance of hearing anything back from Google? Why are so many here talking about making changes then waiting weeks to see the outcome?
[edited by: tedster at 5:10 pm (utc) on Nov. 28, 2007]
|I submitted a re-inclusion request yesterday, do I have any chance of hearing anything back from Google? |
Google will never write you back. Either they'll remove the "penalty" or they won't. I wouldn't sit around waiting for a letter back from them. Years ago, google would write you back with a generic form letter, but that went away a long time ago.
|Why are so many here talking about making changes then waiting weeks to see the outcome? |
That's all you can do, really. Change this. Wait. Change that. Wait. And hope the changes you make have appeased the google gods.
So, 4 days ago, I removed that comment, and I removed the footer links to my site that followed.
I removed all comment tags from my site on monday. I've had comments like this on my site forever:
so I went through and removed them. Not sure how a comment would matter; but who knows anymore.
I don't think the penalty is formed from a % link to text ratio. I've had pages that are like 90% text, 10% links thrown into oblivion. While other, thinner, pages still rank top 3.
So if I make a change, should I be waiting a few days or weeks? If I am un950'd, do I shoot back to where I was or is it gradual?
So many questions, I wish Google would provide a little more info.
I believe it's immediate. If the penalty is lifted, the rankings should immediately (within a day or so) readjust back to normal.
Did anyone ever think to consider that the 950 penalty is or is related to being sandboxed?
I bring this up because I have done a lot of research on this, and it seems that the "950 penalty" symptoms are very similar to the "sandbox" symptoms. The only major difference is that the sandbox is often discussed when a website is relatively new, but who says established sites can't be sandboxed anyway? Excuse me if I am beating a dead horse here and this has already been discussed, I'm new here but not new to the SEO forums world.