Forum Moderators: Robert Charlton & goodroi
It's total nonsense for me to worry about TBPR when I was badly hit by a -950 penalty (look! my PR rose on almost all pages ... but as I said, who cares?).
Please all -950ers come here and join this thread to group possible causes.
Here are mine:
1) E-mail to Adsense team about an account creation with domain name
2) Too many adsense boxes
3) Mildly over-optimized pages
4) Too similar titles
5) Some directory links (as almost all my competitors have, though)
I'll add that in recent months no big changes were made!
Join -950ers power :-)
[edited by: tedster at 9:08 pm (utc) on Feb. 27, 2008]
And really what you are alluding to speaks of the types of properties of any given document and the WAY in which Google might see your document.
Until now pure text has been weighted less than anchor text; this has changed now. Text is weighted more.
Seopti, how have you come to this conclusion?
For most of the sectors I watch, linking has always been, and still is, winning the battles. All that has changed, it seems to me, is the re-ranking based on co-occurrence and spam detection of documents in Google, which is causing massive movements backward for some websites.
Until now pure text has been weighted less than anchor text; this has changed now. Text is weighted more.
I suspect, despite Google's alleged love of short webpages, it recently adjusted the algo to shift the line in the sand wrt what it considers "too thin."
Some of my thin (underdeveloped) webpages don't do so well now. Other sites with thicker page content are doing better than usual.
"Content is king."
Google seems to be saying, 'If you have something to say, say it. If not, don't bother.'
I used to think it over-weighted anchor text, so any shift of emphasis to text is a move in the right direction. But it still has a problem differentiating between anchor text and body text. Search results should surface the best final page, and Google should not get confused by navigation menu text.
p/g
Google seems to be saying, 'If you have something to say, say it. If not, don't bother.'
I'm piddling online and the vast majority of pages that have "no pagerank information available" (on domains with PR of course) rather than "0 pr" seem to be ones that are dominated by links in the content area (subcategory pages that link to the content pages). The content pages themselves show PR but their parent pages do not. Google might be bypassing the middle pages with little/no content.
The next level pages are still doing fine (still in the top 10 for the most important keywords).
I'll add some content to the main menu page and see if it has an effect.
I see one four-letter comm word... 950'd, for example... has a navigation menu with 20 different "unique" internal links, but they all have that one targeted keyword in each of those unique nav links, lol. This is on a site by a former SEO expert.
I think the 950 is all about links--artificial and legitimate--including attempts to use keyword-stuffed internal links as a substitute for natural, powerful backlinks.
Keyword repetition doesn't get more link juice; it dilutes it. Consecutive keyword repetition is the worst. Three times consecutively even in outbound links can get you 950d.
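The "three consecutive repeats" idea above could be expressed as a small check. To be clear, this is purely illustrative: the function name, the default limit of 3, and the whole approach are assumptions based on the poster's guess, not any confirmed Google rule.

```python
def has_consecutive_repeats(anchor_texts, keyword, limit=3):
    """Flag a page's link anchor texts (in page order) if the same
    keyword appears in `limit` or more consecutive anchors.

    Hypothetical heuristic only -- the limit of 3 comes from the
    poster's speculation, not from Google."""
    run = 0
    for text in anchor_texts:
        if keyword.lower() in text.lower():
            run += 1
            if run >= limit:
                return True
        else:
            run = 0  # a non-matching anchor breaks the streak
    return False
```

For example, three anchors in a row all containing "red widgets" would trip this check, while the same three anchors with an unrelated link between them would not.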
Most of the phrases I've not been 950'd for have good backlinks. Others say they got out from the 950 penalty with good backlinks.
The New Google Math
Bad internal links -950
Good back links +950
p/g
Yes, but that has nothing to do with -950. It's happening all over the place, to sitemap pages for example.
In general, Google now seems not to like pages with lots of links, or photo pages without much text. Both lead a person to think "more text on pages is a necessary thing now"... but still, these are general issues, not 950 things.
Trying to be helpful to visitors (the road to 950 hell is paved with good intentions), I created several webpages that look like search results for specific keywords. I do have Google site search, but the results aren't always as good as the ones I edit, or in the best order, because of weaknesses in Google's search algo. (These weaknesses include its inability to separate navigation links from page text.)
These category index pages don't have any page content besides the link text (anchor text). So it's title (H1), subtitle (H2), then all the links.
Crossreferencing does help visitors but Google now reads duplicate links as duplicate content (SPAM) if you put all the links it already found on other pages into one page.
The target pages from the index pages were all linked back to the index pages.
The most recent webpages like this were 950'd almost as soon as they went live.
The thing that's slightly different with this one 950'd page is the actual 950'd keyword is only in the page title, not repeated in multiple links on the page, like my other 950d pages. The only place it's "repeated" is in a navigation link. So it's one page with one keyword plus 50 pages with a link that has the keyword to that page.
So I wonder if Google, which has considered too many backlinks too fast to be a form of spam, now also considers too many internal links (to the same page) too fast to be spam.
Anyone planning to organize related content on their sites may want to reconsider. Meanwhile, I have to find an alternative site search to Google!
p/g
now also considers too many internal links (to the same page) too fast to be spam.
Not only that, but possibly in combination with other 'failure checks', the number of identical links pointing to the same page from one page, or a number of 'overly SEO'd variations' pointing to the same page from one page, might trigger it.
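That "identical links to the same page from one page" idea can be sketched as a simple count of repeated link targets. Again, this is a made-up illustration: the threshold of 2 is an arbitrary placeholder, and nothing here is Google's actual check.

```python
import re
from collections import Counter

def duplicate_link_targets(html, threshold=2):
    """Count how many times each URL is linked from a single page,
    returning the targets linked `threshold` or more times -- the
    kind of repetition the posts above speculate might look spammy.

    Illustrative only; the threshold is a placeholder."""
    # Pull out href values from anchor tags (double-quoted hrefs)
    hrefs = re.findall(r'<a\b[^>]*href="([^"]+)"', html, flags=re.I)
    counts = Counter(hrefs)
    return {url: n for url, n in counts.items() if n >= threshold}
```

A page with a "Home" link in both the header and the footer, for instance, would show `/` (or the homepage URL) with a count of 2, which is why the later question about multiple home links per page is interesting.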
Crossreferencing does help visitors but Google now reads duplicate links as duplicate content (SPAM) if you put all the links it already found on other pages into one page.
I think you are correct.
The site we are having problems with does not suffer any loss of rankings where pages do not have links on every page.
However I believe it has something to do with the percentage of links over unique text-based content.
I believe it has something to do with the percentage of links over unique text-based content.
You could be right. I suspect the algo was recently revised, turning the dial to require more text on a site. (My thin, underdeveloped sites were hit.) Which I suppose is reasonable. Sites with more links than text often don't have much value, so Google would be justified in lowering their rankings.
It could be a percentage, like you suggested, or even a minimum amount of text. Perhaps both. Google's algo could say, 'You need at least X characters/words, and after that, no more than Y percent links to text.' We know about Keyword Density. Think of it as Link Density.
I'm also noticing many .cn sites 950'd. Very long multi-hyphenated domains, and urls, too, sit at the back of the line now in obscurity. Keyword-stuffing URLs, of course, are a very old SEO scheme. Link-stuffing is the spam target in Google's crosshairs now.
I just noticed a major international corporation has been 950'd for one of its top, industry-leading products. The 950'd page(s) have little content and mostly links, including repeated (successive) primary keywords in those links.
The interlinked pages have only a one paragraph product summary. (The directory page is PR4, but that's irrelevant.)
I just added a new directory to one site today with about 2-5 times more text than usual and used unique intra-directory nav links with almost no keyword repetition (one repeat in 20 links). We'll see what happens.
I'll soon have to change all my footers, to strip them of repeated keywords. I just hope Google doesn't freak out over huge structural changes, or 1,500+ new links to old/new site directories.
It's easy to imagine Google considering so many new links to one page so quickly "unnatural" or a "spam attempt."
Anyway, even after those changes, I doubt I'd be comfortable making a reinclusion req before my thin pages have more content.
Does Google consider more than one link to the home page per page spam? I like to have a Home link in the header and footer. This site has both, of course, but what about 950'd sites?
Also, would it be unreasonable for the 950 algo to "think" x number of header/footer links is spam (link repetition)? How many is too many?
I suspect I first got 950d after I used a repeated keyword in a footer link. I revised footers on 1,500 pages (sitewide). Not too long after that I was 950d. Probably tripped the 950!?
I also wonder if it's harder to get out of the 950 than just removing one offending thing.
p/g
- reworked navigation (reduced the number of sitewide internal links, grouped them by categories)
- removed all external links
- removed dashes in all filenames (e.g. changed red-square-widgets-canada.html to redcanada.html)
- removed all Amazon links
- removed pages with little content
... all that to no avail. As of 1 week ago, that site was still near position 950 for all relevant searches.
Then I thought of paid links. Had I ever sold links on that site? Well, yes, a very long time ago (2-3 years ago), and they were so relevant in their context that they were in fact a real asset to my website. Anyway, those paid links were never sitewide, and they had been removed over 6 months ago. So they don't seem to be the cause of my site's penalty.
Lastly, 4 days ago, I had a good look at my html. And here is what I found at the bottom of each page:
<!-- bottom sponsor link -->
Copyright <a href="http://www.my_site.com/">www.my_site.com</a>
<a href="http://www.my_site.com/">www.my_site.com</a> is a registered trademark
The comment "bottom sponsor link" was placed there, at the bottom of each page, because about 3 years ago I was thinking of placing sitewide footer links. However, I never implemented that idea, and the only links there had always been links to my own website.
So, 4 days ago, I removed that comment, and I removed the footer links to my site that followed.
And today, my site is out of 950 penalty.
So it looks like I had been punished because I had simply thought of paid links!
[edited by: tedster at 10:02 pm (utc) on Nov. 24, 2007]
[edit reason] moved from another location [/edit]
I agree with the "thin content" or "navigation pages" theory...I'll add some content to the main menu page and see if it has an effect.
About a week after I added the content and mixed up the keywords (e.g. instead of "red widgets" I put "widgets in colors of red..."), about half the keywords on the page came back.
Today everything came back, but I think it was due to a new index or algorithm change, because another site of mine, where I made no changes at all and which had been bouncing back and forth for months, also came back.
Long story even longer, adding content and mixing up the keywords helped a bit but didn't solve my problem. I'm still searching and trying to find the fixes that will keep me out of the -950 for good.
Google itself says the title tag is very important, so you'd think webmasters would over-optimize/keyword-stuff it; then Google would target them for doing that.
I'm guessing that's the natural next spam target after the link keyword spam campaign (950), so I took the opportunity of being 950'd to revise certain pages and remove repeated keywords.
I don't like to change title tags, but decided under the circumstances to do it. If nothing else, it looks better in bookmarks.
The silver lining of the 950 for me is cleaning up a bunch of things I didn't like but didn't want to touch under the belief system, "If it ain't broke, don't fix it."
p/g
I wonder why Google gets so upset over link keyword repetition
Well, Google's algo is very much link driven, more so than the other search engines it seems. So this area of link text is quite sensitive when it comes to "over-optimization". In addition, heavily repeated keywords in internal links can appear on many or even all your site's pages. That's a lot bigger factor than one page's title element.
So it looks like I had been punished because I had simply thought of paid links!
Mark, I'm not sure you can make that conclusion. First, as you can see here, something just shifted at Google and other urls also saw the -950 lifted. But more than that, Google seems to be handling purchased links in other ways right now rather than the -950 reranking. Not saying it "can't be" or "won't be", but so far I'm not seeing it - even for run-of-site links.
At any rate, whatever caused the change for you, that's good news you're sharing. Enjoy and make the best of it!
I suspect the algo was recently revised, turning the dial to require more text on a site. (My thin, underdeveloped sites were hit.) Which I suppose is reasonable. Sites with more links than text often don't have much value, so Google would be justified in lowering their rankings.
Potentialgeek, I agree with your theory. It would be interesting to know if anyone else posting in this thread falls into this category... I'm pretty sure I do.
My site is still experiencing a penalty; as of Oct 28th my traffic went down almost 99%. It hurts. Bad.
I submitted a re-inclusion request yesterday, do I have any chance of hearing anything back from Google? Why are so many here talking about making changes then waiting weeks to see the outcome?
[edited by: tedster at 5:10 pm (utc) on Nov. 28, 2007]
I submitted a re-inclusion request yesterday, do I have any chance of hearing anything back from Google?
Why are so many here talking about making changes then waiting weeks to see the outcome?
So, 4 days ago, I removed that comment, and I removed the footer links to my site that followed.
I don't think the penalty comes from a link-to-text percentage ratio. I've had pages that are like 90% text, 10% links thrown into oblivion, while other, thinner pages still rank top 3.