Forum Moderators: open
on most of your points, right on!
on one, i must more than quibble: "There is no such thing as an informative, user-friendly, easily navigate-able, well built site that's got 100k pages. Just doesn’t happen, apart from the occasional forum and I mean *occasional* - even then most forums are full of dribble as well."
that's just not true. amazon.com is one (in fact, they have more than 3 million pages indexed in google). nytimes.com is another. webmasterworld is another. i could name hundreds of other such sites.
the fact is, it is very very very possible to create that many pages, but not unless you approach it from a web user's standpoint and not from an SEO's standpoint.
The 30 pages that were all of a sudden not ranking in SERPs are back. We never lost any other positions really... weird!
I don't like huge auto generated sites either but it would seem a shame (even silly) to penalise other websites simply because they are large.
Guess it's not a generic answer to the question of this thread but there is something shining through...
'primarily dynamically, automatically controlled sites with less of a human element.'
I have noticed a small change. In my area there are a few big websites that would rank high with no content; they would just auto-generate the keywords... as far as I can tell a few of them are gone... :-D
Anyone else seen something like that?
As diamondgrl points out, there are loads of excellent, large sites. Large is not a problem.
But one gets the impression that some of the large sites mentioned in here are, well, not C*N*N or NYT*mes. ;-)
"Large is not a problem."
Sure is. The issue here isn't quality of "large". That is irrelevant to the phenomenon.
The issue is that the authority a large niche domain grants itself has dwindled a lot.
What we have a lot more of ranking well is:
1) trivial pages from general large sites (like Amazon or CNN)
2) below average mini-domains
What is suffering are large-ish niche domains, both authoritative and non-authoritative.
"Large is not a problem" ... Sure is.
I'll try again: Large is not a problem. That is a simple statement. It means that *simply* being large is not a problem. IF being large were a problem, then all large sites would be suffering right now. They are not, clearly. So, large is not a problem.
Are some kinds of large sites suffering? Yes. Is it because they are large? No. Size is only tangentially related to what is going on (though as we all know, this is not always true).
If a webmaster in recent months decided that being large was a good idea, and that webmaster decided to get big fast by adding a bunch of auto-generated fluff, then that webmaster may be feeling quite let down right now. But the issue is not that the webmaster fielded a large site; it's that a large number of pages containing little or no unique content are having more and more trouble getting any decent placement in G. And that’s as it should be.
The issue is that the authority a large niche domain grants itself has dwindled a lot.
Caveuncle was a very successful hunter/gatherer. :)
No, and I'm not sure why you make it sound like a runny nose has to lead to cancer. If internal link value is devalued, that doesn't mean "suffering".
The phenomenon has zero to do with the quality of junk generated pages. And it is not some godawful death sentence. The phenomenon has to do with ranking of pages on large sites that only have internal domain linking versus minisites whose index pages have offdomain linking (from their own mini-webs or otherwise).
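To make the internal-versus-offdomain distinction concrete, here's a toy power-iteration sketch of a PageRank-style score (the graph, the numbers, and the damping factor are all invented for illustration; this is not Google's actual formula):

```python
# Toy PageRank-style power iteration (hypothetical illustration only).
# Pages 0-3 form a "large site" that links only internally; page 4 is a
# minisite index page whose only endorsement is an off-domain link from
# page 5.

def pagerank(links, n, d=0.85, iters=50):
    """links maps each page to the list of pages it links to."""
    pr = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - d) / n] * n
        for src, outs in links.items():
            if outs:
                share = d * pr[src] / len(outs)
                for dst in outs:
                    new[dst] += share
        pr = new
    return pr

links = {
    0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2],  # internal only
    4: [],   # minisite index page (no outlinks in this toy)
    5: [4],  # off-domain page linking to the minisite
}
scores = pagerank(links, 6)
# The internal pages recycle value among themselves; the minisite index
# gets its lift purely from the off-domain link. If the internal
# contribution were dampened, only the off-domain endorsement would remain.
```

A real crawl graph has billions of nodes, obviously; the point is only that the two linking patterns feed a score through different channels, so a tweak can turn one channel down without touching the other.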
True, many large sites benefitted in the past from what G just devalued. But what G just did has very little to do with the size of a site, per se.
People who have large sites with quality content are not being hurt, necessarily. We have some large authoritative sites, and some large sites that are not authoritative, and all are doing fine. And that's true of some of our competitors too.
People should, as always, focus not on size, but on site structure and linking strategies.
Rate of growth has certainly nothing to do with anything, so we disagree.
Trying to simplify everything to fit into one predetermined mindset isn't a good way to go. Google makes lots of changes, and this recent one is even dissipating already.
People with these junk sites, or people with new domains or lots of new pages, are affected by a much broader phenomenon that people are calling a sandbox. That particular phenomenon is entirely different from the one affecting internal linking of larger domains that have been around for a long time and haven't added tons of pages recently.
If you get visitors from thousands of terms a day, it is pretty easy to see that a tweak came along that favored minisites and didn't favor authoritative domains that offer better, deeper content. The main page of a five-page hobby site shouldn't have any advantages over the main directory page of a 50-page section on a 1,000-page authoritative domain.
The bottom line, though, is that building larger authoritative domains is probably still a better idea from an SEO perspective, despite the August diminishment of the value of internal linking on such domains.
My observations so far still suggest that the answer is: No.
What we see:
-- Some large sites are being hit, especially on subpages. But this is also true of some small sites we see.
-- The sorts of changes we see in the SERPs do not seem consistent across all categories we follow (sort of like when there was post-Florida speculation about keywords being targeted).
-- The hardest hit categories tend to share common navigation characteristics across sites.
We have numerous large sites fed by thousands of keywords daily, in categories where we compete with mini-nets. If there were any sort of universal tweak that favors mini sites/mini nets and de-emphasizes authoritative domains, in all likelihood we'd have seen it. We don't.
All of which suggests to me that this is not anything to do with large sites per se.
I can't get too specific, but it seems to me that these tweaks have more to do with link structure (and possibly link text).
Large sites have many pages that might potentially relate to G's perception of the importance of any given page, so certain kinds of tweaks affect large sites more than small ones...as this tweak appears to do. But it need not, depending upon how larger sites are structured.
This is the first time I've ever seen something that made me re-look at Brett's fundamental rules of site building, and wonder what G is thinking. I don't think this particular little phase of knob turning will last long. It flies in the face of developing sites that make sense for consumers.
Obviously, Google's biggest battle is currently against auto-generated sites with no useful purpose. Since these sites tend to have little outside linking to their thousands of pages, they largely rely on internal anchor text linking and PR to score well.
The "collateral damage" here is that some pretty darn good "largish" sites have slipped in ranking for many of their terms because they also relied on their PR and internal anchor. Mini-Sites have often fared better as they are more likely to focus on outside links to all of their pages, or were simply built around 1-2 keyphrases. As with every algo change, there is good and bad that comes out of it. (except for the "sandbox" which has only served to keep stale spam in the index)
Advice = Work on getting more outside backlinks to your most important pages.
Clean up your linking and URLs and you'll be fine.
Rate of growth has certainly nothing to do with anything, so we disagree.
steveb, I'd question the above statement in one sense if I may.
A moving average taken over a fixed period is a common concept in finance.
The idea of applying this concept to the valuation of links, whether internal or external, paid or free, reciprocal or one way, might be a way of looking at some of the recent changes and theories that have sprouted about them of late.
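As a minimal sketch of that idea (the window length and link counts are invented, and whether G does anything like this is pure speculation), a trailing moving average damps a one-month link burst down to the same per-month value as steady acquisition:

```python
from collections import deque

def smoothed_link_value(monthly_new_links, window=6):
    """Average new-link counts over a trailing window of months
    (hypothetical smoothing, not a known Google formula)."""
    recent = deque(maxlen=window)
    smoothed = []
    for count in monthly_new_links:
        recent.append(count)
        smoothed.append(sum(recent) / len(recent))
    return smoothed

steady = smoothed_link_value([10] * 6)            # 10 new links every month
burst = smoothed_link_value([0, 0, 0, 0, 0, 60])  # same total, all at once
# The 60-link spike never counts for more than 10/month under the average.
```

Under that kind of valuation, rate of growth would matter: the same total number of links is worth less at any moment if it arrived in a burst.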
Can you explain or direct me to a link which will show the difference between the 2 types of linking schemes you refer to. Specifically "site wide,blanket linking" and the more "classical approach"?
Are these referring to
a. a site where every page cross links
and b. a site where the links are more hierarchical?
Many thanks
Yes, that's generally what I was referring to. As a caveat, be sure to always do your own investigation. No matter how heartfelt one's opinions/conclusions are in here, they may be wrong.
Option 'a' in your post is self explanatory. Here is Brett's post on Theme Pyramids [searchengineworld.com]
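For a rough feel of how different the two structures are, here's a toy count of the directed internal links each scheme produces (the page counts are made up for illustration):

```python
def blanket_links(n_pages):
    """Directed links when every page cross-links to every other page."""
    return n_pages * (n_pages - 1)

def pyramid_links(levels, fanout):
    """Directed links in a pyramid where each page links only to its
    parent and its children (each tree edge gives two directed links)."""
    nodes_below_root = sum(fanout ** k for k in range(1, levels))
    return 2 * nodes_below_root

blanket = blanket_links(111)    # 111 fully cross-linked pages -> 12,210 links
pyramid = pyramid_links(3, 10)  # root + 10 sections + 100 leaves -> 220 links
```

Same 111 pages, nearly two orders of magnitude fewer links in the pyramid, which is roughly why the two schemes spread anchor text and PR so differently.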