The End of Large Sites?

After August Shake-Up: Considering Smaller Sites?


adfree

3:38 pm on Aug 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What's your take: percentage-wise, it seems as if smaller, more static, more focused sites might be the name of the game right now, much more so than large, primarily dynamic, automatically controlled sites with less of a human element.

Does that make sense?

diamondgrl

2:05 pm on Aug 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



subway,

on most of your points, right on!

on one, i must more than quibble: "There is no such thing as an informative, user-friendly, easily navigate-able, well built site that's got 100k pages. Just doesn’t happen, apart from the occasional forum and I mean *occasional* - even then most forums are full of dribble as well."

that's just not true. amazon.com is one (in fact, they have more than 3 million pages indexed in google). nytimes.com is another. webmasterworld is another. i could name hundreds of other such sites.

the fact is, it is very very very possible to create that many pages, but not unless you approach it from a web user's standpoint and not from an seo's standpoint.

petehall

3:31 pm on Aug 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Our site is only about 300 pages max!

The 30 pages that were all of a sudden not ranking in SERPs are back. We never lost any other positions really... weird!

I don't like huge auto generated sites either but it would seem a shame (even silly) to penalise other websites simply because they are large.

ogletree

3:45 pm on Aug 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The only way to do what you say is for someone to physically look at each site.

adfree

11:21 pm on Sep 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Coming back from a three-week vacation, I had to learn that my (August 5 pounded) 380,000-page site tanked to almost zilch, while my new site with just 25 pages now generates five times as much as that large site...

Guess it's not a generic answer to the question of this thread, but there is something shining through...

rj87uk

12:33 am on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



On the idea that Google has dropped

'primarily dynamic, automatically controlled sites with less of a human element':

I have noticed a small change. In my area there were a few big websites that would rank high with no content; they would just auto-generate the keywords. As far as I can tell, a few of them are gone... :-D

Anyone else seen something like that?

caveman

7:04 am on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, people need to distinguish between "large sites" and "large auto-generated sites that have little value in G's view".

As diamondgrl points out, there are loads of excellent, large sites. Large is not a problem.

But one gets the impression that some of the large sites mentioned in here are, well, not C*N*N or NYT*mes. ;-)

george123

7:22 am on Sep 5, 2004 (gmt 0)

10+ Year Member



steveb, on downgrading these mini-webs: I think you are wrong here. I have noticed a site with only 3 pages at #1 for a very competitive term, but they seem to be a website factory, cross-linking over 200 domains (all their own, and all 3-4 pages each). They rank in the top 10 for the keywords they want on each domain. It's the densest cross-linking I've ever seen. As for G's and Y's guidelines about cross-linking, let me LOL: they sit at the same top positions in both SEs.

steveb

8:25 am on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



george123, you ought to go back and re-read the post.

"Large is not a problem."

Sure is. The issue here isn't quality of "large". That is irrelevant to the phenomenon.

The issue here is that the authority a large niche domain grants itself has dwindled a lot.

What we have a lot more of ranking well is:
1) trivial pages from general large sites (like Amazon or CNN)
2) below average mini-domains

What is suffering are large-ish niche domains, both authoritative and non-authoritative.

george123

9:48 am on Sep 5, 2004 (gmt 0)

10+ Year Member



steveb, agreed.

caveman

7:20 pm on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Large is not a problem" ... Sure is.

steveb I’m surprised. ;-)

I'll try again: Large is not a problem. That is a simple statement. It means that *simply* being large is not a problem. IF being large were a problem, then all large sites would be suffering right now. They are not, clearly. So, large is not a problem.

Are some kinds of large sites suffering? Yes. Is it because they are large? No. Size is only tangentially related to what is going on (though as we all know, this is not always true).

If a webmaster in recent months decided that being large was a good idea, and that webmaster decided to get big fast by adding a bunch of auto-generated fluff, then that webmaster may be feeling quite let down right now. But the issue is not that the webmaster fielded a large site; it's that a large number of pages containing little or no unique content are having more and more trouble getting any decent placement in G. And that’s as it should be.

The issue here is that the authority a large niche domain grants itself has dwindled a lot.

Ugh. I hate disagreeing again, but I must. The majority of the sites that we operate fall squarely into the category of “large niche domain”...and they are doing just fine. It could be that they feature unique and (hopefully) useful content. Or it could just be that we’re lucky. And as caveuncle always said: It’s better to be lucky than smart.

Caveuncle was a very successful hunter/gatherer. :)

steveb

9:57 pm on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"then all large sites would be suffering right now"

No, and I'm not sure why you make it sound like a runny nose has to lead to cancer. If internal link value is devalued, that doesn't mean "suffering".

The phenomenon has zero to do with the quality of junk generated pages. And it is not some godawful death sentence. The phenomenon has to do with ranking of pages on large sites that only have internal domain linking versus minisites whose index pages have offdomain linking (from their own mini-webs or otherwise).

caveman

12:14 am on Sep 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The point is that unless one is in the game of trying to get big, fast, large has little to do with it. Unfortunately, it appears that some are being fooled into thinking that *just being large* is suddenly a problem. I'd hate to see people start breaking sites apart for the wrong reasons.

True, many large sites benefitted in the past from what G just devalued. But what G just did has very little to do with the size of a site, per se.

People who have large sites with quality content are not being hurt, necessarily. We have some large authoritative sites, and some large sites that are not authoritative, and all are doing fine. And that's true of some of our competitors too.

People should, as always, focus not on size, but on site structure and linking strategies.

steveb

12:32 am on Sep 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"unless one is in the game of trying to get big, fast"

Rate of growth certainly has nothing to do with anything, so we disagree.

Trying to simplify everything to fit one predetermined mindset isn't a good way to go. Google makes lots of changes, and this recent one is already dissipating.

People with these junk sites, or people with new domains or lots of new pages, are affected by a much broader phenomenon that people are calling a sandbox. That particular phenomenon is entirely different from the one affecting internal linking on larger domains that have been around for a long time and haven't added tons of pages recently.

If you get visitors from thousands of terms a day, it is pretty easy to see that a tweak came along that favored minisites and didn't favor authoritative domains that offer better, deeper content. The main page of a five-page hobby site shouldn't have any advantage over the main directory page of a 50-page section on a 1,000-page authoritative domain.

The bottom line, though, is that building larger authoritative domains is probably still a better idea from an SEO perspective, despite the August diminishment of the value of internal linking on such domains.

caveman

7:13 am on Sep 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Trying to stay on the topic: The End of Large Sites?

My observations so far still suggest that the answer is: No.

What we see:
-- Some large sites are being hit, especially on subpages. But this is also true of some small sites we see.
-- The sorts of changes we see in the SERPs do not seem consistent across all the categories we follow (sort of like the post-Florida speculation about keywords being targeted).
-- The hardest hit categories tend to share common navigation characteristics across sites.

We have numerous large sites fed by thousands of keywords daily, in categories where we compete with mini-nets. If there were any sort of universal tweak that favored mini-sites/mini-nets and de-emphasized authoritative domains, in all likelihood we'd have seen it. We don't.

All of which suggests to me that this is not anything to do with large sites per se.

I can't get too specific, but it seems to me that these tweaks have more to do with link structure (and possibly link text).

Large sites have many pages that might potentially relate to G's perception of the importance of any given page, so certain kinds of tweaks affect large sites more than small ones...as this tweak appears to do. But it need not, depending upon how larger sites are structured.

This is the first time I've ever seen something that made me re-look at Brett's fundamental rules of site building, and wonder what G is thinking. I don't think this particular little phase of knob turning will last long. It flies in the face of developing sites that make sense for consumers.

mfishy

11:38 pm on Sep 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Pages from large sites are often slipping lately simply because of the dampening of internal anchor text.

Obviously, Google's biggest battle is currently against auto-generated sites with no useful purpose. Since these sites tend to have little outside linking to their thousands of pages, they largely rely on internal anchor text linking and PR to score well.

The "collateral damage" here is that some pretty darn good "largish" sites have slipped in ranking for many of their terms because they also relied on their PR and internal anchor. Mini-Sites have often fared better as they are more likely to focus on outside links to all of their pages, or were simply built around 1-2 keyphrases. As with every algo change, there is good and bad that comes out of it. (except for the "sandbox" which has only served to keep stale spam in the index)
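[Editor's note: the dampening idea above can be made concrete with a toy model. This is purely illustrative; the `internal_weight` knob, the function, and all the numbers are invented, and nothing here reflects Google's actual scoring.]

```python
def page_score(external_hits, internal_hits, internal_weight=1.0):
    """Toy ranking score: anchor-text matches from external links count
    in full; matches from internal links are scaled by internal_weight.
    Lowering internal_weight models a dampening of internal anchor text."""
    return external_hits + internal_weight * internal_hits

# A large site relying on internal anchors vs. a minisite with outside links:
big  = page_score(external_hits=2,  internal_hits=50)   # 52.0: big site wins
mini = page_score(external_hits=20, internal_hits=5)    # 25.0

# Dampen internal anchor text to 1/8 of its former weight:
big_damped  = page_score(2, 50, internal_weight=0.125)  # 8.25: big site drops sharply
mini_damped = page_score(20, 5, internal_weight=0.125)  # 20.625: the minisite overtakes
```

The same pages, with nothing changed but the weight given to internal anchors, swap places, which matches the "collateral damage" pattern described above.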

Advice = Work on getting more outside backlinks to your most important pages.

crobb305

11:50 pm on Sep 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What do you guys consider "large" sites? My site had about 10 pages until April, when I added about 70 new content pages. Then, in May, the entire site vanished from the SERPs; only internal links were indexed, by URL alone (no title or description). My site is now about 100 pages but can't be found for any terms in Google. Incidentally, there are two "supplemental results" pages that remain in the Google index from my site, both cached April 2. That seems to be the last time Google deep-crawled my site for its index. Could this all be related to the sudden addition of new content pages?

steveb

12:09 am on Sep 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Mostly off-topic but... crobb305, adding pages didn't hurt you. Adding pages with almost identical multi-parameter default.asp? URLs hurt you. Your internal linking is suicidal. Additionally, your home link goes to default.asp? instead of domain.com/

Clean up your linking and URLs and you'll be fine.

mfishy

12:23 am on Sep 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



crobb305, I believe you are referring to a different, more sinister phenomenon known as "slow death".

crobb305

1:00 am on Sep 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



mfishy, what is "slow death"?

caveman

4:49 pm on Sep 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



mfishy, well said, and agreed, as usual... except for one caveat. I don't see that *all* internal links have been dampened across the board. Those who have employed site-wide, blanket linking and nav appear to us to be taking the greater hits. OTOH, we have large sites that follow more classic approaches to site nav and have not only not been hurt, but seem to have been helped by this change. FWIW.

stever

5:03 pm on Sep 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Rate of growth certainly has nothing to do with anything, so we disagree.

steveb, I'd question the above statement in one sense if I may.

A moving average taken over a fixed period is a common concept in finance.

Applying this concept to the valuation of links (whether internal or external, paid or free, reciprocal or one-way) might be a way of looking at some of the recent changes, and at the theories that have sprouted about them of late.
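[Editor's note: as a purely speculative sketch of stever's idea, not any known Google mechanism, a moving average over daily link totals would make a sudden jump in links take a full window to reach its full effect. The function and numbers below are invented for illustration.]

```python
def smoothed_link_value(daily_totals, window=30):
    """Ranking value = average of the last `window` daily link totals
    (a simple moving average, as used in finance), so a sudden jump in
    links only reaches full effect after a full window has elapsed."""
    out = []
    for i in range(len(daily_totals)):
        span = daily_totals[max(0, i - window + 1):i + 1]
        out.append(sum(span) / len(span))
    return out

# A site sits at 100 links, then suddenly gains 300 more on day 11:
totals = [100] * 10 + [400] * 30
value = smoothed_link_value(totals)
# value[9] is still 100.0; the jump then ramps in gradually,
# only reaching 400.0 once the full 30-day window has passed.
```

A delay like this would blunt sudden bursts of link acquisition while barely touching sites whose links accrue steadily, which is one way to read the "sandbox" observations in this thread.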

Collieman

7:38 am on Sep 8, 2004 (gmt 0)

10+ Year Member



Caveman,

Can you explain, or direct me to a link that shows, the difference between the two types of linking schemes you refer to? Specifically, "site-wide, blanket linking" and the more "classical approach".

Are these referring to
a. a site where every page cross-links, and
b. a site where the links are more hierarchical?
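[Editor's note: purely for illustration, with invented page names, the two schemes in (a) and (b) can be sketched as link maps.]

```python
def cross_linked(pages):
    """Scheme (a): blanket linking -- every page links to every other page."""
    return {p: [q for q in pages if q != p] for p in pages}

def pyramid(home, sections):
    """Scheme (b): hierarchical linking -- home links to section pages;
    each section links back to home and down to its own subpages;
    subpages link only back up to their section."""
    links = {home: list(sections)}
    for section, subpages in sections.items():
        links[section] = [home] + list(subpages)
        for sub in subpages:
            links[sub] = [section]
    return links

flat = cross_linked(["home", "a", "a1", "a2", "b", "b1", "b2"])
tree = pyramid("home", {"a": ["a1", "a2"], "b": ["b1", "b2"]})
# flat: every page carries 6 links regardless of topic;
# tree: 1-3 links per page, following the site's topic hierarchy.
```

In the flat scheme every page passes a little weight to every other page; in the pyramid, links follow the topic structure, so each page's links say something about what it is related to.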

Many thanks

caveman

4:49 pm on Sep 8, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Collieman,

Yes, that's generally what I was referring to. As a caveat, be sure to always do your own investigation. No matter how heartfelt one's opinions/conclusions are in here, they may be wrong.

Option 'a' in your post is self explanatory. Here is Brett's post on Theme Pyramids [searchengineworld.com]
