Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Can some sites become so entrenched that on page factors hurt little?


limoshawn

7:17 pm on Feb 16, 2011 (gmt 0)

10+ Year Member



I'm looking at the big W-pedia, with pages in a constant state of flux: vandals, updates. I'm not talking about the addition of UGC like comments; I'm talking about the core content of a page changing a lot, sometimes in a manner that should affect SEO adversely, changes that would hurt a normal site, yet the big W-pedia seems immune. Are their offsite factors so strong that Google really does not care about their on-page factors?

wheel

8:14 pm on Feb 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



OK, you must have hit me on a bad day. I don't have an answer to your question, but I've got a related one.

Is there a way one could poison a Wikipedia page through on-page content? Could one log in and maybe overload the keyword density, leaving the content looking natural and informative but hurting the rankings?

In terms of offpage factors, I wonder what would happen if one bought 500 poison links and pointed them all at one page.
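For reference, "keyword density" is just the share of words on a page that match a target term. A minimal sketch (real SEO tools differ in how they tokenize and count phrases; this simple version is purely illustrative):

```python
import re

def keyword_density(text, keyword):
    """Fraction of a page's words accounted for by occurrences of a phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    # Density = words consumed by the phrase relative to total word count.
    return hits * len(kw) / len(words)

density = keyword_density("blue widgets are great widgets", "widgets")
# 2 hits out of 5 words -> 0.4
```

"Overloading" density in the sense wheel describes would mean pushing this ratio far above what natural writing produces while keeping the prose readable.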

limoshawn

10:50 pm on Feb 16, 2011 (gmt 0)

10+ Year Member



Not that I want to hurt them, but I don't think it can be done on-page. I have seen pages vandalized that remained that way long enough to get cached at least once and still stayed atop the SERPs. As for external factors, it seems that a page does not need many external links to garner the top spot; in fact, it seems to take far fewer inbound links for a W-pedia page to outrank a traditional site, so I'm not sure poison links would affect it negatively. I think the main factor might be the internal linking, which would be difficult to break.

tedster

11:29 pm on Feb 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google has talked about using domain-wide signals in their rankings, and changing how those signals are scored from time to time.

Google hasn't shared the secret sauce on domain-wide or site-wide signals, but I do think they can make some of the on-page "fine points" that SEOs traditionally talk about a lot less important.

freejung

11:36 pm on Feb 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I sometimes feel like some of my older pages have been "grandfathered in" on some of the more recent algo changes, that is, that my old well-established pages can get away with having problems that my newer pages can't get away with. That's just a general impression.

wheel

12:24 am on Feb 17, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The wiki page at the top of the SERPs in my niche is a PR5. That's a lot of oomph for an inner page.

I'd still be curious what'd happen with a whack of crap links to the page. Goodness knows the wiki article is terrible at best, misleading at worst. Heavy site, but at least in this case the content is simply wrong.

limoshawn

12:46 am on Feb 17, 2011 (gmt 0)

10+ Year Member



I'm looking at something similar: a page with a PR5 that averages less than 200 words total, no images. The main (most popular) keyword is not on the page or in any meta tags. A search for the keyword in quotation marks shows the W-pedia page in the #2 spot; a regular search (no quotes) shows them at #1. A Yahoo link search shows just under 200 external links pointing to the page, and a sampling of those links shows mostly links from garbage sites.

TheMadScientist

1:32 am on Feb 17, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The problem with 'taking down' sites like Wikipedia (and some others mentioned in recent threads) through 'bad links' or TOS violations is that Google serves its users, and users like and expect Wikipedia, so there's really not much chance of taking them out of the SERPs, IMO.

I would say the best method to replace them is to build a great topical site, get links to it, and outrank them.

I guess my direct answer to the original question is:

Yes, definitely, IMO. They become 'expected' over time, and removing an 'expected' result would damage Google more than 'letting things slide' or replacing them with a mom-and-pop site.

limoshawn

2:03 am on Feb 17, 2011 (gmt 0)

10+ Year Member



They become 'expected' over time

So does 'expected' = manual placement on Google's part? Forget beating or 'taking down'; I'm trying to figure out how a site gets to a level like that of W-pedia in a world that Google says is purely algorithmic. I'm looking at a W-pedia page that does not have the keyword phrase "blue green widget" anywhere on the page, in the content, meta tags, or code, yet an advanced search on Google for the exact phrase "blue green widget" returns the W-pedia page in the top two results, usually #1.

TheMadScientist

2:48 am on Feb 17, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Manual Placement? No.
In the SERPs with +1 weighting, possibly manually assigned? Yes, imo.

I would guess 'expected' could be determined algorithmically too based on click-thru percentage, inbound links, % of clicks relative to % of clicks on other results, etc. My guess is it's not a manual ranking (they really don't do that, except to play with Bing), but imo manual review with '+1 it's a brand' weighting is possible (likely?).

I'm sure there is more that goes into it than I've outlined, but I'm pretty sure I could determine 'expected' or 'flag for review as a possible brand name', so there's not much question in my mind that they can too.
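A toy sketch of how such an 'expectedness' signal might be scored algorithmically from the features mentioned above (click share and inbound links). The feature names and weights here are invented for illustration; nothing about Google's actual signals is known or implied:

```python
def expectedness_score(clicks_on_result, total_clicks,
                       inbound_links, avg_inbound_links):
    """Combine a result's click share and link share into a 0..1 score.

    clicks_on_result  - clicks this result received for the query
    total_clicks      - all clicks on the results page for the query
    avg_inbound_links - average inbound links of competing results
    """
    click_share = clicks_on_result / total_clicks if total_clicks else 0.0
    denom = inbound_links + avg_inbound_links
    link_share = inbound_links / denom if denom else 0.0
    # Weight user click behavior more heavily than raw link counts
    # (an arbitrary choice for this sketch).
    return 0.7 * click_share + 0.3 * link_share

# A result that captures most of a query's clicks scores near 1.0 and
# might plausibly be flagged for review as a 'brand' result.
score = expectedness_score(clicks_on_result=800, total_clicks=1000,
                           inbound_links=200, avg_inbound_links=50)
```

The point of the sketch is only that 'expectedness' is computable from observable behavior, as the post argues; no manual placement is required to identify such results.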

I think 'brand naming' could be completely manual too, but either way I've heard they are weighting 'brand names' differently than others, and I would guess that carries over to sites like Wikipedia even though their 'product' is free information.

How could you do the same thing?
Build a 'brandable' website for the long term.