
Google SEO News and Discussion Forum

    
Has Google increased the importance of website size as a ranking factor?
aristotle
msg:4591157
1:08 pm on Jul 8, 2013 (gmt 0)

Let me begin by saying that all of my websites are small, less than 50 pages, because I spend a lot of time researching everything I write about, and also because I hand-code all of my pages. Until about two years ago, many of my articles were number 1 in Google search for their main key phrase, often even outranking Wikipedia and large organizations. I should mention that all of my sites are strictly non-commercial with no products or ads, and that the articles generally don't relate to high-volume, commercially competitive search terms.

Anyway, about two years ago, many of the articles that had ranked number 1, 2, or 3 in Google for a long time began to slowly slip downward. Two of the sites were hit by the first rollout of Penguin and haven't recovered, but even on the three non-Penguinized sites, the rankings have slowly slipped. In all cases they have been replaced at the top of the results by Wikipedia and large organizations.

[Note: My sites are still ranking very well in Bing, with hardly any of the slippage that has occurred on Google.]

From what I've read and observed, many other small websites have also fallen in Google's rankings over the past couple of years. So my question is: has Google been adjusting its algorithm to promote the rankings of larger websites at the expense of smaller sites? And if so, why would they do this?

 

goodroi
msg:4591161
1:36 pm on Jul 8, 2013 (gmt 0)

This is funny because I have enterprise level clients that are complaining about the exact opposite. The big brand name websites complain that Google is unfairly penalizing them because Google changes the algo so fast that only the smaller sites can keep pace with the changes. The large brand name sites need to plan months in advance and coordinate with multiple departments and agencies, then hope that the corporate execs and legal team approve everything.

Back to the key issue of this thread - I do not think website size has a significant impact on rankings. I do think domain trust & authority can significantly impact rankings but not size.

You can test this by looking at many UGC websites. These sites are enormous, but the user-generated content is often so poor that these domains tend not to develop enough trust & authority. This is why I don't think sheer size is important.

If you provide great information that can't be found anywhere else, or common information that is presented in a way far superior to anywhere else, it will likely lead to your website name naturally appearing in Google auto suggest. IMHO that is a simple way to know you have a trusted and authoritative website. If you haven't hit that level, then you should revisit your website's value proposition.

For my own private websites, I focus on providing more value. My sites will naturally grow in size over time, but I don't spend time artificially inflating their size.

aristotle
msg:4591169
2:33 pm on Jul 8, 2013 (gmt 0)

The big brand name websites complain that Google is unfairly penalizing them because Google changes the algo so fast that only the smaller sites can keep pace with the changes.

Thanks for your reply. Actually, I never try to "keep pace with the changes" in Google's algorithm. I originally designed my sites based on careful consideration, I've never changed them in any significant way, and I won't unless I see a good reason to. It has nothing to do with Google.

If you provide great information that can't be found anywhere else

Many of my articles originally had information that couldn't be found anywhere else on the web. That's not true any longer for most of them. In fact there are some wikipedia articles that got virtually all of their information from my articles, and now the wikipedia articles are at the top of the rankings and get most of the traffic that my articles used to get.

Perhaps it's easier for a large site to gain trust and authority. Maybe that's the explanation for what I'm seeing.

HuskyPup
msg:4591202
4:02 pm on Jul 8, 2013 (gmt 0)

Google's lost the original author/website information and seems to merely focus upon the bloggers, scrapers and wikidictionaries that have copied stuff verbatim.

This seems to be proven by the fact that Bing, and some other engines, usually do show the original at the top.

It's certainly nothing to do with size. I have sites varying from 50-60 pages through the hundreds and into the low tens of thousands, and I've had precisely the same treatment from Google, even though they were all original authority sites when first launched.

Some of the results I see these days are not only pathetic, they are downright incorrect, but how the heck is an algo supposed to know that when it is biased towards Google's own properties and white-listed "customers"?

indyank
msg:4591208
4:11 pm on Jul 8, 2013 (gmt 0)

If you provide great information that can't be found anywhere else, or common information that is presented in a way far superior to anywhere else


Even if you do this, the powerful ones are going to catch up. This is what Aristotle is experiencing with Wikipedia. When sites like Wikipedia do it, you naturally lose the ranking to them. So what can be done about this?

goodroi
msg:4591220
4:53 pm on Jul 8, 2013 (gmt 0)

Don't stop and rest. Just because our website is making money today does not mean we are entitled to make money tomorrow. Unfortunately many webmasters forget this and get lazy.

To avoid allowing the competition to catch up, you need to innovate. Hopefully you are creative enough to come up with fresh ideas that will add value to your website. Here are some basic ideas:

People scraped your content & put it into wikipedia?
Then add a photo gallery and a video gallery with good descriptions. Users love visuals, and wikipedia does a poor job with this. This also opens up your site to more traffic, with universal search incorporating images & videos directly into the SERPs.

Haven't added any new content in the last year?
Look through your log files, PPC data and competing sites to identify new topics to cover (a quick log-mining sketch follows at the end of these ideas). Just because you have a great article about widgets does not mean you should stop writing about widgets. Add an article about how to repair widgets, or new laws impacting widgets, or finding the best coupons for widgets. Get creative and expand your content to cover what consumers are searching for today. Ten years ago people stored computer files on floppy disks, five years ago it was USB keys, now people use the cloud. Make sure your content keeps pace with your industry - actually, forget that. You want your content to set the pace for your industry. Be known for being the first to cover a new topic or concept in your industry.

Scrapers are republishing your content?
Present your content in a better way. Infographics might be a tired link strategy, but users still like looking at them. So maybe create a few infographics just to better present boring content on your site. All of your competition has the same article, but you are presenting it in a more impressive and enjoyable way. (Plus, file DMCA notices against those scrapers.)

Fortune 500 company outranking you?
Get creative and take risks. Big companies are so worried about their brands that they rarely get approval to take any risky moves that might offend people. Smaller sites can launch viral campaigns that are refreshingly honest and blunt. You may offend a few people, but overall you can gain a much wider market. Pick a famous group/person/organization and write an honest blog post about "Why I think ... is a freakin' idiot!" or "Top ways <fill in your industry> rips off consumers". Giving honest insights can connect with online users who get bored with safe corporate marketing speak.

None of these ideas may be right for your particular situation. The point is to always be creative and keep innovating. Voluntarily raise your quality standards so that by the time the competition catches up to you, you have already moved on to the next step.
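As promised above, here is a minimal log-mining sketch (Python, standard library only). It assumes a combined-format access log named access.log and that referrers still carry a q= query parameter - common in 2013 but far from guaranteed - so treat it as an illustration of the idea rather than a finished tool:

```python
# Hypothetical sketch: tally the search phrases found in the referrer query
# strings of a combined-format access log. The log file name and the "q="
# parameter are assumptions; many engines no longer pass the query at all.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

phrase_counts = Counter()

with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        # In combined log format the referrer is the second-to-last quoted field.
        quoted = re.findall(r'"([^"]*)"', line)
        if len(quoted) < 2:
            continue
        referrer = quoted[-2]
        for phrase in parse_qs(urlparse(referrer).query).get("q", []):
            phrase_counts[phrase.strip().lower()] += 1

# The most frequent phrases point at topics readers already use to find you.
for phrase, count in phrase_counts.most_common(25):
    print(f"{count:6d}  {phrase}")
```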

netmeg
msg:4591244
6:18 pm on Jul 8, 2013 (gmt 0)

This year I'm spending a lot more time on my site architecture and UI. I want to make sure my sites (and my client sites) work as well as possible on as many devices as possible - not just adequate, but *outstanding*. I'm trying to cut way down on the number of clicks/pageviews it takes to get to the main content, even on the ad-supported sites that rely on pageviews. Eliminating any unnecessary navigation, and in some cases rewording anchor text to make it abso-freaking-lutely clear what the user will get when he clicks. Making sure that my top navigation only goes to the content, and all the administrative stuff goes in the sidebar or the footer. I have tons of outbound links, but I'm checking them with a spider tool to make sure they all still work (turns out a lot of them don't). Making changes to field types (dropdowns, check boxes, radio buttons, etc.) to make it easier for people on phones. Making sure that there's a big honking site search box, and that it actually works.
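If it helps, this is roughly the check a spider tool automates for outbound links - a minimal sketch in Python using only the standard library, with the URL list and timeout invented for illustration (a real crawler would also harvest the links from your pages, respect robots.txt and rate-limit itself):

```python
# Hypothetical sketch of an outbound-link check: request each URL and flag
# anything that errors out or comes back with a 4xx/5xx status.
import urllib.request
import urllib.error

# In practice this list would be collected by crawling your own pages.
outbound_links = [
    "https://example.com/resource",
    "https://example.org/article",
]

for url in outbound_links:
    try:
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request, timeout=10) as response:
            status = response.status
    except urllib.error.HTTPError as err:
        status = err.code
    except (urllib.error.URLError, OSError):
        status = None

    if status is None or status >= 400:
        print(f"BROKEN  {url}  (status: {status})")
    else:
        print(f"OK      {url}  ({status})")
```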

Yeah, this is all basic stuff, but it's amazing how much of it gets away from you, especially if you've had a site up for five years or more. And the big sites, even with their in-house SEO and IT departments - a lot of them forget about this stuff too, so in some cases it's an opportunity.

The bottom line is, 2013 Internet isn't 2005 Internet; everything is a moving target now. Having great, unique content isn't enough. You have to have that PLUS a fantastic user experience. And it ain't easy.

Saffron
msg:4591281
10:00 pm on Jul 8, 2013 (gmt 0)

I don't know about website size; I think they're placing much more importance on "brands". My site has been hit hard, but my competitors seem to have been obliterated - all good sites. I feel sorry for them. G has decided that two sites in particular will rank no. 1 & 2 for everything.

I'm trying to find other sources of traffic and writing more quirky articles. I ask my friends and family to ask me questions about the topic of my site. I've covered all the well-known ones, but it's amazing what a child will ask about the topic, and it has given me some good ideas for new content. The two new sites whopping mine are medical, so I'm going for the non-medical articles. You have to evolve in today's market.

Dymero
msg:4591286
10:11 pm on Jul 8, 2013 (gmt 0)

I don't know that they're focusing on large-sized sites. Maybe "smart-sized" sites. If you're creating hundreds of pages of thin content that's only optimized to bring in the views, that will probably get your site clocked by Panda. If your hundreds of pages are relevant and good quality, though, then that's okay.

"But eHow," you say. I say, forget eHow. Strike your own path and focus on good content and, yes, promote it to generate interest.

johnhh
msg:4591305
10:56 pm on Jul 8, 2013 (gmt 0)

If you provide great information that can't be found anywhere else, or common information that is presented in a way far superior to anywhere else


Have actually done this - got a load of data and presented it well; my competitors ignored its public availability. It answers a "what is" question; over 2,000 extra pages linked in to our own data, so that's 6,000 extra pages, yet currently probably creating 30% of the traffic!

Now have a new database of 1 million plus rows answering another series of similar questions; still working out how to present it... :)
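For what it's worth, one minimal way to turn rows like that into crawlable pages - a sketch only, with the database file, table and column names all invented for illustration; a real build would batch the work, use a proper template, and generate an index and sitemap too:

```python
# Hypothetical sketch: render one static HTML page per database row.
import sqlite3
from pathlib import Path

output_dir = Path("pages")
output_dir.mkdir(exist_ok=True)

connection = sqlite3.connect("widgets.db")  # invented file name
rows = connection.execute("SELECT slug, title, answer FROM widget_facts")

for slug, title, answer in rows:
    page = (
        "<!DOCTYPE html>\n"
        f"<html><head><title>{title}</title></head>\n"
        f"<body><h1>{title}</h1><p>{answer}</p></body></html>\n"
    )
    (output_dir / f"{slug}.html").write_text(page, encoding="utf-8")

connection.close()
```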

edited for spellin as usuak - yawn

Rosalind
msg:4591441
7:53 am on Jul 9, 2013 (gmt 0)

Perhaps it's easier for a large site to gain trust and authority.

People can only remember so many domains, so they'll recall and link to the few that are everywhere. One way to compete with that is to consider teaming up with people, accepting high-quality guest posts, or merging related websites under one name.

aristotle
msg:4591486
1:20 pm on Jul 9, 2013 (gmt 0)

Thanks for all of the replies. After thinking about it some more, I've concluded that it probably isn't realistic for one individual like me to continue to compete against Wikipedia with its thousands of "contributors", or against large, well-known organizations with thousands of employees. There was an opportunity to do it for a number of years, and that's one of the great things about the web: it potentially enables people with minority views or unusual knowledge to reach large audiences at little cost. But that has become much harder to do now that Google has de-emphasized quality, value, and relevance.

Although my sites are non-commercial, they're important to me because I try to use them to bring attention to some of society's problems and possible solutions, and to support other individuals and groups who want to do the same. In my opinion Google's current policies reinforce the status quo and make it harder for humankind to solve the problems it faces. Fortunately there are many people who will continue to fight no matter who opposes them.

Planet13
msg:4591510
2:35 pm on Jul 9, 2013 (gmt 0)

Getting back to the original question for a moment:

Has Google increased the importance of website size as a ranking factor?

I would say without a doubt yes.

I was looking at the results for a particular style of clothing (we'll call it "widget clothing") and the SERPs were flooded with large name clothing retailers who you would normally NOT associate at all with "widget clothing". They are companies that make or sell clothing that is about 180-degrees away from "widget clothing".

Yet they had all made one page on their large domain dedicated to "widget clothing" and that page was outranking independent designers who are dedicated to "widget clothing."

Another disturbing thing is seeing the plethora of eBay pages where someone has created a username based on a keyword (we'll say "bluewidgets" for the keyword "blue widgets") and that page is ranking, even though their text is stolen from other sites.

(Can you tell I am tired of filing DMCA requests against eBay and Amazon user pages?)

EditorialGuy
msg:4591525
3:36 pm on Jul 9, 2013 (gmt 0)

The real question isn't "Has Google increased the importance of website size as a ranking factor?", but "What has Google changed that has led to higher rankings for megasites?"

As Tedster liked to say, "correlation isn't causation."

outland88
msg:4591580
6:39 pm on Jul 9, 2013 (gmt 0)

I would agree with Planet13 that site size is unquestionably a factor in the algo. In fact this pops up in health areas probably more than any other area I've seen. You can literally find large auto parts sites and even baseball card sites that have written a mediocre page about a health subject and are ranking well purely based upon the size of the site. The answer is pretty clear in those instances without testing. Many niche categories go largely untouched until the brands feel the niche can no longer be ignored. Once they move in - and they will, because their appetite is voracious - you soon fall. The idea that you can dodge this by dealing in balloons this year and clock parts the next really just shows you're chasing income, not building a commerce site. I'm not against that, but putting that out there as a workable business solution makes me cringe.

One of my favorite things to do in Health is show people the top 100 or 200 domain names and have them guess what subject was being searched for. Nobody has ever been able to give me a correct answer because site size was a large factor in the rankings. It was not the sole factor though.

Planet13
msg:4591588
7:04 pm on Jul 9, 2013 (gmt 0)

"What has Google changed that has led to higher rankings for megasites?"


It would seem to me that Google is relying less and less on its ability to understand semantics and language in its algo, and is relying more and more on other factors.

It seems strange because, really, Google's ability to understand language and semantics seems like its ace in the hole. It is a core competency that no other organization seems even remotely capable of matching.

Other factors - such as link profiles, social activity, and even user behavior - can be exploited to a certain extent by Google's competitors. For instance, Firefox and Opera could probably sell user behavior data to Bing and other search engines, and I am guessing that MS could just provide user behavior from IE to Bing.

So to me, it just seems weird that Google would marginalize its key competitive advantage in its ranking algo...

[edited by: Planet13 at 7:36 pm (utc) on Jul 9, 2013]

aristotle
msg:4591590
7:10 pm on Jul 9, 2013 (gmt 0)

From what I've seen recently, sites like eHow, Yahoo Answers, YouTube, and of course Wikipedia and Amazon, can appear on Google's page 1 in searches for almost anything. All of these are very large websites, but somehow I don't think of them as "authorities". Maybe Google is so worried about spam, and its continuing inability to stop spammers, that it's decided that the only solution is to fill up page 1 with these megasites.

Planet13
msg:4591600
7:40 pm on Jul 9, 2013 (gmt 0)

"Maybe Google is so worried about spam, and its continuing inability to stop spammers, that it's decided that the only solution is to fill up page 1 with these megasites."


That could be.

In the long run, the trend toward megasites at the top of the SERPs would probably not be overly beneficial to Google, since it is something that Bing, DDG, Yandex, and pretty much any other search engine would be able to replicate easily.

Really, the value proposition of a search engine (to its users) is to deliver quality results that would otherwise normally be hard to find.

So by delivering obvious results, how much value does a search engine have for its users?

outland88
msg:4591604
8:05 pm on Jul 9, 2013 (gmt 0)

I've argued since 2005 that Google knew the only way to deal with the spam problem was to "fix" the results, which became the brand solution. Fix meaning rig the results, for lack of a better word. Google has always known the spam problem was related to AdSense (which many honest AdSense webmasters absolutely don't want to hear) but wasn't about to kill their "golden goose". To argue it was related to AdWords is simply preposterous, because just about all search engines have a form of advertising, and it didn't yield them a $900 stock. Google wasn't smarter or better; they just gobbled up the marketplace by putting every Tom, Dick, Harry, and Violet in business via AdSense.

I now accept the fact that Google wants to be a "knowledge base" built on you giving them free information. I will not do so. Ms. Mayer said roughly the same thing five years ago. What's in play is just basic Marxist theory. The top results were previously your advertising for making money, and many no longer have that. Simply increasing site size at this point in competitive areas would be a daunting task. Personally, I choose to form relationships outside of anything related to Google.

[edited by: outland88 at 8:10 pm (utc) on Jul 9, 2013]

EditorialGuy
msg:4591605
8:07 pm on Jul 9, 2013 (gmt 0)

So by delivering obvious results, how much value does a search engine have for its users?


It's possible that market leaders don't need "unique selling propositions" in the same way that businesses playing catch-up do. Being good enough is good enough.

If you're the maker of Tide laundry detergent, you'll continue to improve the product incrementally without doing anything to alienate your existing mass audience. If, on the other hand, you're the maker of an also-ran laundry detergent, you'll come up with a USP like organic grease-eating micro-organisms because having a parity product isn't good enough to unseat or even nibble away at the market leader.

Robert Charlton
msg:4591746
9:28 am on Jul 10, 2013 (gmt 0)

Size by itself isn't sufficiently descriptive. I think you've got to look at size in combination with trust and authority, and a number of other factors. There are many tradeoffs to take into account.

If a site is too large, with insufficient trust and authority, then the size is a liability. Pages will lack enough link juice to rank. Making a site too large is analogous to overexpanding a business enterprise and not having the proper customer base and resources to design, run, and maintain it. Too large a site will not only lack link juice... it will likely also suffer in terms of content quality and user experience.

That said, there is the old joke that if you can't make it good, make it big. I think that's less true for websites than it is, say, for painting or architecture.

It may be that many of the larger sites we see ranking are there because they're survivors that have naturally expanded over time... and they often have custom infrastructure that's come from years of development and re-investment. Large sites can also take advantage of economies of scale that many smaller sites don't have, and can offer a richness of user experience which a smaller site can't provide.

The algo has, for at least a dozen years that I can remember, often favored large sites. It's amazing what you can get away with at PageRank 7 (yes, PageRank) that you can't at PageRank 3. Your page templates, e.g., can be much less unique. It's as though the algo regards templated pages with more high-quality inbound links as more unique than it would regard those same templated pages with lesser inbounds.
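For reference, the classic PageRank formula from Brin and Page's original paper, where d is the damping factor (typically 0.85) and the sum runs over the pages T1...Tn linking to page A, each with C(Ti) outbound links:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

The toolbar numbers quoted above are only a coarse projection of that underlying score, so treat the 7-vs-3 comparison as a rough indicator rather than a precise measurement.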

But smaller sites... perhaps not burdened with that legacy infrastructure... can move and adapt more quickly, and that, in the right hands, is a distinct advantage. My inclination is to grow with demand and resources... as goodroi puts it, to "naturally grow in size over time".

At the same time, in the current algo climate, I am concerned about not starting too small. I've been assuming, e.g., with a current article site I'm developing, that I should push for a certain critical mass before launching. It seems to me that if "engagement" suggests time on site (which in part I think it does), a critical mass of really good pages would be helpful in giving a new site a better start.

rajeshth02
msg:4591778
11:38 am on Jul 10, 2013 (gmt 0)

SEO Definition 2013: Just do the right job, without any expectations.

Planet13
msg:4591936
7:43 pm on Jul 10, 2013 (gmt 0)

"Size by itself isn't sufficiently descriptive. I think you've got to look at size in combination with trust and authority, and a number of other factors. There are many tradeoffs to take into account."

While I agree with this, I would say that trust and authority have been amped up, while Google's semantic signals have been SIGNIFICANTLY reduced in the recipe.

I would venture to say that if an eBay or Amazon user set up a page on eBay with the name "webmasterworld", they would start ranking very highly at a very quick rate for the phrase "web master world". I think it might be possible that they could even outrank the webmasterworld.com site.

Hmmm... I think as an experiment I will try to set up an eBay account under mattcutts...

mcskoufis
msg:4591949
8:19 pm on Jul 10, 2013 (gmt 0)

Hmmm... I think as an experiment I will try to set up an eBay account under mattcutts...


Lol... That's a great idea to raise awareness about the brand (or rather authority) signals that Google is using to present us with branded-only content at the top of the SERPs.

Truth is that working with major brands over the past 5-6 years has made my life easier on the link-building front: with very few optimized links you can work wonders on the rankings of very poor content pages - the kind where you need over 20 approvals to edit anything...

These sites already have tons of links, get frequent press coverage, bloggers link to them naturally, etc. I never conduct intensive link-building campaigns; rather, I focus on on-page factors more than anything else.

Tip for those optimizing such websites: clients (their marketing departments) can find it easier to approve changes to things like alt text and other elements that only appear on screen when the mouse hovers over them than changes to the actual on-page text that is constantly visible.

Regarding size, I am managing major multinational brands with 5-page mini websites for their products, which rank for the keywords they can rank for, given that they are full of graphics and have no text...

outland88
msg:4592012
11:49 pm on Jul 10, 2013 (gmt 0)

You don't have to make the test that complicated, Planet13. Many people already have access to free web space via their own, oftentimes large, ISP (brand). Slide a few informative pages over to subdomains and see if living off the ISP (brand) works. In some cases it works quite well if the brand and site size are still held in high esteem by Google.
