Google SEO News and Discussion Forum

Google's 950 Penalty - Part 9
annej

Msg#: 3336552 posted 9:13 pm on May 10, 2007 (gmt 0)

< Continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

That's because we are shooting in the dark

We really aren't shooting in the dark. We are shooting at dusk. We can see a fuzzy image of what is out there. Sometimes when we shoot we hit the target and other times we can't no matter how hard we try.

But we waste our time when we start shooting at theories like

- Google is doing this so more people will pay for AdWords

- Google only hits commercial sites

- If you have Google Analytics you will be sorry

- Only sites doing something illegal are hit by -950

- It's because you have AdSense on your site

- Scraper sites are doing this to us

It goes on and on.

Is it because the phrase-based theories are not an easy answer? It does take a lot of work to figure out why you might have been 950ed, and sometimes you just can't find the answer. But I still believe that most 950ed pages have been caught in an imperfect phrase-based filter.

[edited by: tedster at 9:14 pm (utc) on Feb. 27, 2008]

 

annej

Msg#: 3336552 posted 7:41 pm on May 12, 2007 (gmt 0)

I was reading the speculation about how scraper sites might HELP cause this penalty.

I'm still not totally convinced that any inbound links can affect this filter (it's more of a filter than a direct penalty).

But if these scraper and other links do affect this, it's because of the phrases used. For example, if a certain phrase (or word; in the patent's definitions a "phrase" can be an individual word) is used both in the anchor text of the inbound links and on the page in question, the inbound link could possibly shift the balance to where the page goes to -950. So many factors are at play here that it's hard to know for sure.

As I mentioned a few messages up, it appears that it takes a combination of the questionable phrase (phrases Google included in its phrase-based spam filter) and other factors to 950 a page.
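
To picture that "shifting the balance" idea, here is a toy sketch in Python, purely speculative and not from any Google source: a page only tips into -950 territory when a flagged phrase shows up both on the page and in inbound anchor text, and the combined score crosses an assumed threshold. The phrase list, weights and threshold are all invented for illustration.

# Toy model of the "combination of factors" idea: a flagged phrase alone
# doesn't trip the filter, but on-page use plus matching inbound anchor
# text can push the combined score past a threshold. Everything here
# (phrase list, weights, threshold) is an invented assumption.

FLAGGED_PHRASES = {"cheap widgets", "widget casino"}   # hypothetical "watched" phrases
ON_PAGE_WEIGHT = 1.0
ANCHOR_WEIGHT = 1.5        # anchor text assumed to count for more
TRIP_THRESHOLD = 2.0       # assumed tipping point

def spam_score(page_text, inbound_anchors):
    text = page_text.lower()
    score = 0.0
    for phrase in FLAGGED_PHRASES:
        if phrase in text:
            score += ON_PAGE_WEIGHT
        if any(phrase in a.lower() for a in inbound_anchors):
            score += ANCHOR_WEIGHT
    return score

def is_950ed(page_text, inbound_anchors):
    return spam_score(page_text, inbound_anchors) >= TRIP_THRESHOLD

# On-page use alone stays under the threshold...
print(is_950ed("A history of cheap widgets in the midwest.", ["widget history"]))      # False
# ...but the same phrase echoed in scraper anchor text tips the balance.
print(is_950ed("A history of cheap widgets in the midwest.", ["cheap widgets here"]))  # True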

Robert Charlton

Msg#: 3336552 posted 8:16 pm on May 12, 2007 (gmt 0)

For reference... the Aug 23, 2006 "Datacenter comments" video in which Matt Cutts discusses over-optimization. He referred annej to it on May 10, 2007....

[video.google.com...]

When Matt says "not quite quite as much," repeating "quite," it feels like he is talking about something subtle here... I don't think he's being ironic. What he says, as best as I could get it down, was...

...think about ways that you could back your site off... and how you could sort of not be optimizing not quite quite as much on your site.

Thinking about annej's reaction to this...

I did watch the video and it was interesting but I'm still having trouble seeing the connection to the -950 thing. Most of the pages I lost had been up and ranking well for years. Are they saying there is suddenly over optimization on old pages?

If they're adding new considerations, possibly yes, because the ranking criteria would be different. Matt's comments didn't suggest that, but in the context of the constant change he was discussing, it might be implicit that new considerations would come into play. It's hard to say whether or not he would then call these a "new algo."

annej - In the Phrase Based Multiple Indexing and Keyword Co-Occurrence [webmasterworld.com] thread, you and I had a brief exchange about the application of algo elements to niche areas with small sample sizes. I theorized that Google could be constantly re-evaluating threshold criteria as new pages are added, and that certain algo layers might kick in when niche areas get sufficiently large. I believe such an indication may be included in the Hilltop patent, but I don't remember it well.

Also, keep in mind that, as the niche areas grow larger, pages which rank today may not rank tomorrow unless sites are improved and perhaps more links gotten. In an expanding universe, what stays the same is, in effect, getting smaller.

annej

Msg#: 3336552 posted 10:24 pm on May 12, 2007 (gmt 0)

My frustration with MC's response is that they are still not admitting that some sites other than the intended spam sites are getting caught in this filter. But I guess I just have to accept that we will never hear a peep about that from Google. That's the change from the earlier Google.

But you are right, there is still something in MC's message to me and in the video that may well be of help. He seems to be saying that some of us are optimizing just a little too much. Nothing blatant, just a little thing we have done may be causing our problems.

I'm going one step beyond this. I think the only time this "little too much" is a problem is when by chance we have used a word/phrase that they have marked as related to spam sites, as in the patent.

Sometimes it's really hard to see what they might be considering borderline over-optimization. For example, earlier a section on my site on widgeting patterns was 950ed. I had all the widgeting pattern pages linked to each other in the navigation. My thinking was that if a visitor was interested in one widgeting pattern she would likely be interested in another. In hopes it would help get my 950ed pages back I took that part of the navigation out, leaving just the link to the contents page for widgeting patterns. That seemed to have solved the problem, as the pages did come back.

I still don't know how my navigation was over-optimizing, unless it was just that the navigation had too many links in it. But it may be that de-optimizing in small ways like this may help get 950ed pages back.

steveb

Msg#: 3336552 posted 11:48 pm on May 12, 2007 (gmt 0)

A basic problem, of course, is that by no actual definition of the non-concept of "over-optimization" would anyone suggest that having the words "Impala", "Cavalier" and "Chevelle" on a "History of Chevrolet" page is over-optimization, but that is exactly what sometimes gets penalized.

In-depth, literate, valuable pages seemingly get snared by something looking for random keyword text. I don't even mean this as a specific example, but it's very clear many pages being penalized would never be called over-optimized by a human looking at them.

annej

Msg#: 3336552 posted 1:43 am on May 13, 2007 (gmt 0)

but it's very clear many pages being penalized would never be called over-optimized by a human looking at them.

You do have to use your imagination but if the page is buried down in 950 land what do you have to lose trying?

castar

Msg#: 3336552 posted 2:16 am on May 13, 2007 (gmt 0)

Good observations. Question: When making these changes to try to get out of -950, do you think Google will penalize you for making too many changes too often? For example, I've been taking off keyword phrases that I think might be causing this, waiting until google spiders the page (usually a few days). Then if I'm still 950'd, I switch things back and then try something else. My fear is that I might be penalized for making changes to that page too frequently. Your thoughts?

Biggus_D

Msg#: 3336552 posted 2:54 am on May 13, 2007 (gmt 0)

Overoptimized? No way.

Sounds like Google wants us to think that this is our fault when it's not. Sounds like a reverse "psychological projection".

Besides, we do not optimize anything (even our URLs are plain ugly). We just write about widgets.

So, what do we have to do? Do we have to write worse content just for Google?

And last but not least, how does that benefit the user?

[edited by: Biggus_D at 2:56 am (utc) on May 13, 2007]

trakkerguy

Msg#: 3336552 posted 3:56 am on May 13, 2007 (gmt 0)

Castar - YES, Google will not like you tinkering with a page too much. You do lose trust. So any benefit from small changes may be offset by the negative of too much change.

dibbern2

Msg#: 3336552 posted 4:39 am on May 13, 2007 (gmt 0)

Besides, we do not optimize anything (even our URLs are plain ugly). We just write about widgets.

Okay, think about this for a minute. Optimizing doesn't always mean a bag of fancy tricks. Read Annej's comments carefully, and you'll see her "over optimizing" was a pretty plain navigation scheme that just happened to fit what Google might have been looking to penalize as overuse of keywords. The point is, your plain pages could fall into the same trap.

I am experiencing something very similar to what Anne describes, and have a little evidence to support her thesis, although it's always speculation; we never truly know.

YES, Google will not like you tinkering with a page too much. You do lose trust.

This may be so, but as was already mentioned: what do you have to lose when you're sitting at #953?

castar

Msg#: 3336552 posted 4:42 am on May 13, 2007 (gmt 0)

Thanks, trakker. So, let's see if I have this right. I'm 950'd because

I've over-optimized my page from years back, according to SEO white-hat standards at that time;

Scrapers and spammers have been copying parts of my page and putting up many links with my page title in their link text (without approval), which deflates my trust rank;

And, in my attempting to bring the page up to Google standards, I lose more trust in the process?

What's wrong with this picture? I don't mean to rant... it's just so frustrating not knowing how to correct something that G says I've done wrong.

Marcia

Msg#: 3336552 posted 5:11 am on May 13, 2007 (gmt 0)

castar, are you 950'd for all of your search terms or just some?

And if it's only for some and not for all, is it for whole phrases, or phrases that are OK with just a couple of words but 950'd when an additional word is added on to make a longer phrase?

loner

Msg#: 3336552 posted 6:09 am on May 13, 2007 (gmt 0)

I had every one of my 2,000 pages go -950 over a 4-month period on a regional website. I removed the links in the footer to the 20+ communities in the region, signed G's admission to spamming and told them I was really, really sorry about that. I had those same links there since 97. Anyhow, within a couple of weeks things were back to normal, in fact a little bit better, inasmuch as visitors were hitting pages directly related to the community they were searching for. We all win.

tedster

Msg#: 3336552 posted 6:29 am on May 13, 2007 (gmt 0)

Some recent musings I've had ---

Interesting you should mention repeated footer links. I've seen references in many information retrieval papers to the challenge of repeated sections of any kind of text, when that text is not directly relevant to the topic of the individual document. If some kind of semantic analysis is being used, then the co-occurrence factors can get quite confused by such sections on the page, and it becomes an indexing problem.

Remember not too long ago when Adam Lasnik posted about being wary of extended boilerplate? If mere text boilerplate is an issue, how much more problematic would be "boilerplate" words in anchor text. A number of the -950 urls I've looked at do fall into the "large footer link block" area. It's also a place where some link sellers hide links that are sold just for PR purposes.

Another factor that I've been noticing on a number of -950 pages is the lack of a consistent menu block across the site. If the main navigational links are set into a clearly marked area -- some container such as a table cell, div or list -- then the overall algo can pick out the navigation and not use those blocks in the pure phrase assessment of the individual document.

But without a clearly delineated menu block, every link and its anchor text enters into the phrase co-occurrence matrix.
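
To make the menu-block point concrete, here is a minimal Python sketch of the kind of phrase co-occurrence counting being speculated about: if the navigation sits in a clearly marked container it can be stripped before counting, otherwise every anchor text lands in the matrix. The markup, tokenizer and window size are illustrative assumptions, not a description of what Google actually does.

# Illustrative only: strip a clearly delineated navigation block before
# building a simple word co-occurrence count, so boilerplate anchor text
# doesn't pollute the page's phrase statistics.
import re
from collections import Counter

NAV_BLOCK = re.compile(r'<div class="navigation">.*?</div>', re.S)
TAGS = re.compile(r"<[^>]+>")

def tokens(html, drop_nav=True):
    if drop_nav:
        html = NAV_BLOCK.sub(" ", html)   # remove the marked menu block
    text = TAGS.sub(" ", html)            # strip remaining tags, keep anchor text
    return re.findall(r"[a-z']+", text.lower())

def cooccurrence(words, window=8):
    """Count unordered word pairs appearing within a small sliding window."""
    pairs = Counter()
    for i, w in enumerate(words):
        for other in words[i + 1:i + window]:
            if w != other:
                pairs[tuple(sorted((w, other)))] += 1
    return pairs

page = """
<div class="navigation"><a>widget patterns</a> <a>widget history</a> <a>widget kits</a></div>
<p>Knitting a widget pattern starts with choosing the right yarn and needles.</p>
"""

with_nav = cooccurrence(tokens(page, drop_nav=False))
without_nav = cooccurrence(tokens(page, drop_nav=True))
# The menu's repeated anchor text inflates pair counts like (patterns, widget).
print(with_nav[("patterns", "widget")], without_nav[("patterns", "widget")])   # e.g. 4 0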

[edited by: tedster at 7:01 am (utc) on May 13, 2007]

Marcia

Msg#: 3336552 posted 6:55 am on May 13, 2007 (gmt 0)

It's also a place where some link sellers hide links that are sold just for PR purposes.

I just saw one this week where the link to one "other" site was right in with the site's navigation - identical niche product, keyword phrase fits right in but just goes to a different site. The site linking out is #1 for the primary plum keyword. Very, very camouflaged.

dibbern2

Msg#: 3336552 posted 6:58 am on May 13, 2007 (gmt 0)

It is sooooo tempting to grab at a possibility such as what tedster muses about, and bend it, hammer it, until it fits my particular situation, and then think "Eureka! That's the answer to getting out of this hole!" when I really just don't know. That's one of the problems of shooting at dusk, I suppose.

I'm looking at a site comprised of 6 large directories. 4 of them have escaped any penalties, rank fine. 2 of them are 950'ed. What's different about the 2 victims? Keyword rich internal crosslinks at the bottom of every page. The others? Hardly a single crosslink in the navigation, just a return to the index page.

The crosslink scheme in the 950'ed directories made sense and was good for the user, but that's another discussion.

Marcia

Msg#: 3336552 posted 7:01 am on May 13, 2007 (gmt 0)

I completely tore up an old site that hadn't been touched in a couple of years, preparatory to maybe doing a re-model (it's a burden and a bother but I'm stuck with it). Totally changed the navigation for all the sections and pulled out the internal footer links.

It didn't go -950, it just went PR0. Two older sites, same thing, as a matter of fact.

BTW, those sites have been so mercilessly scraped and hammered by MFA garbage it's pitiful.

[edited by: Marcia at 7:06 am (utc) on May 13, 2007]

mattg3

Msg#: 3336552 posted 9:01 am on May 13, 2007 (gmt 0)

The spam network links to my homepage via their 3 DMOZ clone pages; it seems as if the whole site has been taken down.

I had a dynamic menu, consisting of links in the content so that users could easily choose where to move on.

I minimised my boilerplate repetition (the copyright blurb and so on) back in February, I think. Outgoing links: 2 (no casinos, hotels, poker, or anything dodgy; all solid sites, all links noindex/nofollow).

Title: one word, the topic the page is about
Keywords: one word, the topic the page is about
Description: used to be the words from the dynamic menu, now the beginning of each page's content, which should be helpful. I had to change a lot here, as Google was pulling descriptions from last year.

But I can't stress this enough:

The guy that bounced back has half or more of each page as boilerplate repetition with ONE word replaced, an extremely long title tag, 25 keywords in the keywords tag, some duplicated external content, and some weird page-topic meta tag. It's essentially a shortened scraper site with loads of noise text, I guess to cloud the scraped origin. He also has subdomain spam, i.e. keyword.example.com and so on.

What is noticeable, though, is that he has only 4 menu links. The site is German, but his impressum says he is in Latvia. I am German, with German registration, German server hosting and so on. His domain name is English, and the whois info tells me it was registered in 2005 (ours in 1996); there is a German email address in the registration, but the rest is all Latvian.

Of all the things he does wrong, only the menu stands out: very short. We have a portal with a directory, jobs, videos, pictures, an event calendar, a wiki, and an external forum on its own domain (which was not taken out). We are also linked mostly on the homepage (our professional body's association and their portals, on-topic forum sites, on-topic blogs, everything nicely on topic, apart from the spam network, which pulled an inbound link from a DMOZ clone of a pre-2000 DMOZ clone where the info is wrong: they name us like our unaffected English site but link to the German server), but also with many subpage links. His site is PR4; we have been PR6 for years now.

From what I see, only the menu and the IBLs stand out.

Since the English site is not affected by the 950 and that spam master jumped back, I think in my case it might be the IBLs or the menu. I pulled the dynamic menu, reluctantly.

As I said somewhere else, we had a 30% return rate and six pages per visitor on samples of up to 85,000 a day. Now it's 10,000, mostly bookmarked or what analytics calls direct. The forum site has more; the English pages have less, so that can't be it either. I found scraped content from us; we constantly get scrapers, but I can't chase them 24/7 and lock their IPs out in the firewall. Many, many of our images are hotlinked, pulling around 4 GB. Since we have the bandwidth, I let them do that. I have 100 unique videos on video.google with inbound links to my homepage.

I also do content, so I am really miffed that I have to chase up technical stuff again, grr.

Anyway, take your pick as to what the hell the 950 is about now. The biggest sitewide spam signal that a stupid AI robot (as in, AI isn't very clever) could see is the unsolicited IBLs, which claim SEO on their pages, imo.

I am still at 5-10,000 single-country visitors a day on that site, mostly direct now. At least it's comforting that there is a Google-less baseline that is not great but survivable.

mattg3

Msg#: 3336552 posted 9:21 am on May 13, 2007 (gmt 0)

I'm looking at a site comprised of 6 large directories. 4 of them have escaped any penalties, rank fine. 2 of them are 950'ed. What's different about the 2 victims? Keyword rich internal crosslinks at the bottom of every page. The others? Hardly a single crosslink in the navigation, just a return to the index page.

The crosslink scheme in the 950'ed directories made sense and was good for the user, but that's another discussion.

We had keyword rich crosslinking in the menu and the content.

What a stupid robot word list, or whatever it is, might not get is the background you need to explain in order to understand the site's topic. E.g. the George Bush / White House example given by Google could have oil, Washington, USA, the tea war, the UK, architecture, a definition of other shapes as opposed to oval, the secret service, weaponry, whatever on that page.

If the January update had that "should appear on the page" filter in it (we did well then), where a page about George Bush should have White House on it, and this change now has a "should not include" word list attached, I assume it hits very detailed pages like ours, or maybe annej's sites. I assume that in history the crosslinking that makes historical sense is undetectable for a mere stupid AI-based word list. (I am unaware of the last Nobel Prize the Googlebot got.) So the more background info you give, the worse you might drop, as the AI word list isn't a historian but a simple moronic reflection of associated words. That could hit hard if it's not offset with .edu trustrank. Given that Google doesn't distinguish pool (as in poker) from a swimming pool, this word list must be extremely unsophisticated.

But that's pure guessing of course.

mattg3

Msg#: 3336552 posted 9:30 am on May 13, 2007 (gmt 0)

A number of the -950 urls I've looked at do fall into the "large footer link block" area. It's also a place where some link sellers hide links that are sold just for PR purposes.

I have 4 links to the same site, all nofollow/noindex. I also moved the text into pictures in February, so I escape the boilerplate issue and am still able to say what I want to say to the human user, i.e. medical and legal disclaimers. German law is quite funny, so I'd better not risk taking these off the page just because Google needs to fit the world into their IQ-1 algorithms; a judge has no understanding of Google's needs.

I notice, though, that the spammer that jumped out of the 950 has boilerplate repetition without links.

mattg3

Msg#: 3336552 posted 9:49 am on May 13, 2007 (gmt 0)

I'm going one step beyond this. I think the only time this "little too much" is a problem is when by chance we have used a word/phrase that they have marked as related to spam sites, as in the patent.

But did you read the example someone gave, where his swimming pool lights ad was flagged because it contained the word pool, which could also appear in gambling contexts? Google obviously uses simple one-word lists.

But I would take MC at his word; if he links to a hint about SEO, I would believe it. We just don't know whether that over-optimising could be offsite, and then you are screwed if your internal PageRank doesn't offset massive scraping or unsolicited garbage IBLs.

If it really is offsite, then you might be innocently associated.

I keep going on about this because my English site, which uses the same software, same menus, and equivalent content, is basically unaffected. It has fewer IBLs, is PR5, is also ten years old with IBLs since then, but has no spam network. Given its lower PR standing I would have expected it to drop first, as it is also realistically not as well known as the German one.

Marcia

Msg#: 3336552 posted 9:58 am on May 13, 2007 (gmt 0)

because it contained the word pool, which could also appear in gambling contexts. Google obviously uses simple one-word lists.

I believe that one word used several times that's out of context for what would be normal and usual for the topic of the site and wasn't compatible with the rest of the text on the site could raise an eyebrow.

That's why I was asking a few posts ago if the affected phrases were stand-alone or whether 2-3 word phrases were doing OK but showed up with a problem when another word was added on and made a longer phrase.

mattg3

Msg#: 3336552 posted 10:10 am on May 13, 2007 (gmt 0)

As far as I can see, long tails do better, but that might just be because there are fewer options. I definitely have a wipeout on one-word queries. The example I mention with the spammer is a very good indicator of that site's performance.

steveb

Msg#: 3336552 posted 10:36 am on May 13, 2007 (gmt 0)

"one word used several times that's out of context for what would be normal and usual for the topic of the site and wasn't compatible with the rest of the text on the site could raise an eyebrow."

That would seem logical, but it seems to be the opposite of any over-optimization idea. Having "tractor" three times on a page devoted to needlepoint is obviously not what anybody would consider "over optimization". The problem seems to be related words, like using yarn three times. That could be (plausibly if stupidly) viewed as "over optimization".

mattg3

Msg#: 3336552 posted 10:44 am on May 13, 2007 (gmt 0)

Good observations. Question: When making these changes to try to get out of -950, do you think Google will penalize you for making too many changes too often? For example, I've been taking off keyword phrases that I think might be causing this, waiting until google spiders the page (usually a few days). Then if I'm still 950'd, I switch things back and then try something else. My fear is that I might be penalized for making changes to that page too frequently. Your thoughts?

Sadly, I was doing the same. I occasionally jumped back and forth and changed the keywords and description, though not on the English site.

Before we jumped to high traffic I was so frustrated that I didn't make many changes, even to the content. Over the last two years it always seemed as if work (even non-optimising work, just plain content) would result in a traffic drop, and if you gave up it suddenly started to rise. It started to annoy me so much that I then started to play with the keywords and the description header, since Google was displaying some moronic junk from some old data pull. I even had a description from another site of mine popping up.

I think I am happy with how it is now: the keywords are just the topic name and the description is the start of the content.

Maybe it's an anti-fiddling penalty. Maybe that's understandable for the keywords, but not content-wise. The problem is that if the traffic drops you start fiddling... out of panic. But the drops also seem to be affected by content changes. I am talking here about adding 50 pages max over 2 months, in collaboration with others. Not 20,000 or so.

Given that the spammer has an extremely elaborate 25-word keywords tag and an overly long title tag, it might well be that you need to leave them untouched now.

So that's something I am guilty of. On each roller coaster ride on Google's side I changed the keyword tags, as I really didn't get why they sent me high traffic, then a low week, then high only when we were on holidays (might be a clue: no work ;) ), then high motivation, loads of work, sinking again. Frustration, no work, up again.

Since last year I have definitely suspected that more work leads to unstable rankings. When I was in Germany and my business partner was in Japan we had a sustained two-month high-traffic period. Then we came back all refreshed, loads of new content, fiddling, and now a sustained drop.

Well, my partner and I have decided to ignore Google now. You'd only go mad otherwise, and I am off now to take new pictures... :)

Petra Kaiser

Msg#: 3336552 posted 11:26 am on May 13, 2007 (gmt 0)

Google trends gives some interesting results on testing related items. The example above (input: Impala, Cavalier, Chevelle) shows identical curves for search volume in 2006. For our lost key phrase the curves are identical too. If I add a word to the Trends query, one which gives us a page #1 result, that particular word shows a different curve. At least Trends may be helpful in finding out what seems to be related.

crobb305

Msg#: 3336552 posted 2:48 pm on May 13, 2007 (gmt 0)

I'm still not totally convinced that any inbound links can affect this filter (it's more of a filter than a direct penalty).

This phenomenon (penalty, filter, etc) is happening for a single phrase on one of my pages. This phrase appears only once on the page and once in the title. So I don't know how it has been "overoptimized" unless IBLs are causing it.

trakkerguy

Msg#: 3336552 posted 3:28 pm on May 13, 2007 (gmt 0)

Castar - I'm not saying you shouldn't make changes. You may have to. It's just better to make them less often. As Mattg3 says, you often drop down soon after making changes, then rise back up over a few weeks' time. There is definitely a "too much fiddling" mini-penalty.

On a site that has been static for a while, and has good trust, you can usually make a change without problem. But too many cause a drop, and the more often you make changes, the more sensitive G is to any further changes you make. You lose trust.

You can even send a recovering site back to -950 by making changes too often, and have to wait weeks for it to come out.

annej

Msg#: 3336552 posted 5:32 pm on May 13, 2007 (gmt 0)

Lots of new info here. I feel like we are really on track in this thread now.

I had those same links there since 97.

Another example of how pages that have been up for years have suddenly become over optimized.

A number of the -950 urls I've looked at do fall into the "large footer link block" area.

My long list of links to widget patterns was like this, BUT the links were in a side column, contained in a div. Has anyone else had problems with navigation contained in a div or table?

What's different about the 2 victims?

This is where I think the phrase-based thing comes in. The 950ed ones had a suspect combination of phrases and the others didn't. Once a "watched" phrase is in there you are held to a higher standard. (Just a theory, but something to think about.)

have been so mercilessly scraped and hammered by MFA garbage

My above theory fits this as well. Why are some scraped pages, sections or whole sites still doing fine while others are not? Either it's because the filter has nothing to do with inbound scraper link text, or because a phrase has set off the red flag so now inbounds can cause damage.

I have 4 links to the same site all nofollow noindex.

It may be that even though you have nofollow, Google notes it is a link, and it seems like anchor text counts for more in this filter.

Maybe we all need to go back to those cute little graphic link buttons from the mid 90s. ;)

Having "tractor" three times on a page devoted to needlepoint is obviously not what anybody would consider "over optimization". The problem seems to be related words, like using yarn three times. That could be (plausibly if stupidly) viewed as "over optimization".

Exactly! The patent refers to "at least one phrase significantly exceeds the expected number of related phrases".
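
Read literally, that suggests a check along these lines, sketched here in Python with invented numbers: for a query phrase, count how many of its statistically related phrases actually occur on the page and compare that against the count an ordinary page would be expected to have, flagging the page when the excess is too large. The related-phrase table, expected counts and tolerance are all assumptions for illustration, not figures from the patent.

# A guess at what "significantly exceeds the expected number of related
# phrases" could mean in practice. The related-phrase list, expected
# count and tolerance below are invented for illustration only.
import re

RELATED = {
    "needlepoint": ["yarn", "canvas", "stitch", "thread", "pattern", "needle"],
}
EXPECTED_HITS = {"needlepoint": 3}   # assumed typical count for an honest page
TOLERANCE = 2                        # assumed allowed excess before flagging

def related_phrase_hits(query, page_text):
    text = page_text.lower()
    return sum(1 for p in RELATED.get(query, [])
               if re.search(r"\b" + re.escape(p) + r"\b", text))

def looks_overstuffed(query, page_text):
    return related_phrase_hits(query, page_text) > EXPECTED_HITS.get(query, 0) + TOLERANCE

normal = "A short note on needlepoint: pick a pattern and some yarn."
stuffed = ("Needlepoint yarn, needlepoint canvas, stitch guides, thread charts, "
           "pattern libraries and needle reviews, all about needlepoint.")
print(looks_overstuffed("needlepoint", normal))    # False: only 2 related phrases present
print(looks_overstuffed("needlepoint", stuffed))   # True: all 6 related phrases present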

Google trends gives some interesting results on testing related items

All these tools can be a great help when you combine what they show with your own intuition and knowledge about your topics.

So I don't know how it has been "overoptimized" unless IBLs are causing it.

That's the kind of information I'd hoped my comment would bring. It's seeming more and more likely IBLs could be in the mix.

europeforvisitors



 
Msg#: 3336552 posted 6:07 pm on May 13, 2007 (gmt 0)

Another example of how pages that have been up for years have suddenly become over optimized.

Wouldn't you expect Google's algorithms and filters to have changed since the company launched in 1998? It's possible that those changes simply reached the tipping point as far as your site is concerned.

I'm not saying that you deserve a -950 penalty or any other kind of penalty. (After all, I haven't seen your site.) I am saying that evolution is inevitable, and change doesn't always work to everyone's advantage.

LineOfSight

Msg#: 3336552 posted 6:57 pm on May 13, 2007 (gmt 0)

You can even send a recovering site back to -950 by making changes too often, and have to wait weeks for it to come out.

My main phrase (for the homepage) went from circa #25 to -970'ish. I made a round of changes (a big de-optimisation exercise), then a couple of days later made a few more changes. The site came back -350, -280, -110 and -60'ish. Thought all was good and then got -350'd again. It hasn't been cached since the 8th, so I suspect the second round of changes caused the drop again. Leaving it alone now, no more tinkering ;)

Has anyone else had problems with navigation contained in a div or table?

Not sure about this - all navigation is in a <div> which is clearly marked 'navigation'. But the navigation is essentially a list of 12 main keywords that point to on-topic, de-optimised pages, and it does appear on every page. If that's causing me a problem I really don't think there's a great deal I can do.

Google trends gives some interesting results on testing related items

I'm not sure how to interpret this information - if I type my main key phrase, the last 2 links on the right-hand side are to press release sites / the releases that I issued at the beginning of the year. Is that good, bad or indifferent?

[edited by: LineOfSight at 6:58 pm (utc) on May 13, 2007]

Marcia

Msg#: 3336552 posted 4:54 am on May 14, 2007 (gmt 0)

After a lot of thought, I don't think I'd be willing to make too many changes all at once. It might be just one very small, very correctible thing that's triggering a thumbs down on a page, and if multiple changes are made all at once there's no way to analyze the results or consequences.

No, there is no clear indication of exactly what's happening, but there can be no doubt that there's more than just one thing happening. It's very unusual that ONE thing all by itself can have such profound effects on such a broad level, affecting so many different types of sites of all different sizes across so many verticals.

In particular, I wouldn't take a chance on monkeying around with outbound links unless it's removing the ones that go to bad neighborhoods or very inappropriate sites.

Annej,
Try putting a no follow tag on your recip links page and wait a week to see if it has an effect.

90% of the sites that are in the -950 penalty either were involved in link exchanges or have recip links pages. That is the common factor.


That takes the grand prize as being the most irresponsible advice that's been given since this whole 950+ thing started. Doing that could have VERY serious negative consequences down the road, not for any single page or phrase but for entire sites.

If a site has been penalized (site, not page) for being link spammers, it's an entirely different issue - and it's a small minority of sites in the final results pages, and they should have been caught long ago. Maybe they were, but nobody else noticed them MIA for what they were ranking for.

I challenge anyone to show us Walmart's reciprocal links page. Or Target's. Or JCPenney's. Or Amazon's links page. Or the Google Directory's links page. Or Bizrate, or Priceline, or Ebay, or the dozens of independent sites that have ONLY one-way inbound links and do not link out at all, have tons of other pages ranking well, including many top-ten or #1 rankings, and yet have only SOME of their pages in the 400s and 900s - redundant pages that should be clustered out of top rankings.

And the top ten results of a HUGE number of searches across multiple verticals most certainly DO have reciprocal links - many of them. And bought links too, but that's another issue. ;)

I would seriously suggest reading up on what those papers have to say about the primary and supplemental index and how they're being partitioned. There might just be some little grain of truth in there about what might actually be happening, since the infrastructure of the secondary index did, in fact, undergo a change, and so did the procedures for indexing.

Check into things further before cheating legitimate, on-topic link partners and removing quality outbound links, and risking losing the quality recips back that were helping the site rank well once those partners find out they've been cheated. Then dig into your pockets and go find your friendly neighborhood link broker.
