
Google SEO News and Discussion Forum

    
If website has a mix of quality and bad pages
ganeshjacharya



 
Msg#: 4625810 posted 3:10 pm on Nov 25, 2013 (gmt 0)

If a website has a mix of good-quality and bad pages, does that affect the overall ranking of the website?

Is it necessary to remove the bad pages even if the site also has good-quality pages that are popular?

 

Zivush



 
Msg#: 4625810 posted 3:31 pm on Nov 25, 2013 (gmt 0)

How do you measure 'quality'?

netmeg




 
Msg#: 4625810 posted 3:59 pm on Nov 25, 2013 (gmt 0)

And what's the percentage of each?

There's no hard-and-fast rule for this stuff, but the larger share of your pages probably needs to be of above-average quality (however you define that).

adder




 
Msg#: 4625810 posted 7:43 pm on Nov 25, 2013 (gmt 0)

What's the purpose of the "bad pages"? Do the visitors benefit from those bad pages?

Sgt_Kickaxe




 
Msg#: 4625810 posted 8:32 pm on Nov 25, 2013 (gmt 0)

Matt Cutts has definitively answered this question.

We had to move fast when attacking spam. We attacked anything that looked like a content farm. But when most of the site is spam, there is nothing we can do. Even if you have some quality content but mostly negative content… nothing can be done.

He said that after Pubcon 2013, in defense of Google, during a mini-argument with Jason Calacanis. The question then becomes: how much is too much? My answer is that ANY low-quality content is too much, since Google very likely sees more of it on my own sites than I do.

- content created just to cover keywords
- pages where affiliate content is clearly the priority
- pages with too many ads, especially above the fold
- pages with little to no content (e.g. empty profiles on forums)
- pages with content that is not very useful
- pages that rewrite, in your own words, what was already written elsewhere (it feels original to you, but it isn't)
- pages linked to by spam networks (including from your own sites, if Google feels they are low quality)
- even if your site has the best content in the world and NO spam, it can be labeled spam if you point too many links at it from your "blog network" or through other frowned-upon linking practices

Cleaning house, by deleting these pages or adding a noindex meta tag to them, can positively impact the rest of your site eventually (it may take months or years).
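For anyone doing that kind of cleanup on a static site, here is a minimal sketch of how the noindex tag could be added in bulk. It assumes plain HTML files with a bare <head> tag and a hand-curated list of thin pages; the file paths and the THIN_PAGES list are hypothetical examples, not a recommendation from this thread.

    import pathlib

    # Hypothetical list of pages judged low quality after a manual review.
    THIN_PAGES = [
        "site/profiles/empty-user-123.html",
        "site/tags/keyword-stub.html",
    ]

    NOINDEX_TAG = '<meta name="robots" content="noindex, follow">'

    for page in THIN_PAGES:
        path = pathlib.Path(page)
        html = path.read_text(encoding="utf-8")
        if NOINDEX_TAG in html:
            continue  # already tagged on an earlier run
        # Assumes a bare <head> element; adjust the match if yours carries attributes.
        html = html.replace("<head>", "<head>\n  " + NOINDEX_TAG, 1)
        path.write_text(html, encoding="utf-8")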

Quick question: do 100% of the pages attributed to you as an author have affiliate offers or advertising on them? Believe it or not, most websites have no ads on them, so that figure stands out.

Sgt_Kickaxe




 
Msg#: 4625810 posted 3:25 am on Nov 26, 2013 (gmt 0)

I just wanted to add a little to this (maybe a lot, I'll try not to ramble).

Brett Tabke wrote a long post a few years ago on how to build a website, and a key part of that post was "to post daily", or essentially to add content regularly, so that "at the end of a year" you'd have 365 pages getting some traffic, which you could then monetize (ads, affiliate offers, selling the site, etc.).

I don't think that's true anymore.

I recently performed a FULL upgrade on an old site which had only static files. EVERYTHING but the URLs, textual content and titles changed, and I do mean everything, including code, social footprint, CMS, template, look, feel, internal links, etc. The result was startling: absolutely no change in traffic, and it's been a few months. In the past a bump was virtually guaranteed; now it's as if everything but the URL, title and textual content is ignored.

I've been using this experience as a stepping stone to better understand the new way traffic is allotted to websites, and with the help of analytics and many other data sources, one factor stands out: usefulness of content.

Pages that solve a problem, provide a detailed guide or answer a question thoroughly perform best in the new Google. Pages that are merely popular with people, are personal editorials or are running commentary do not perform as well as they once did. If your approach is to offer your audience an online magazine about your favorite subject, the odds are it's not getting nearly as much love as it once did. Essentially, if your site is a cool place to hang out for editorials about a range of subjects, then you are expected to draw your own audience with less help from Google.

Example: YouTube allows video owners to monetize their content, but it informed makers of game-related content that they could not monetize their videos unless a video contained a detailed guide that would help others achieve something. A video of yourself just playing was no longer allowed. It doesn't matter if your gameplay was epic, was the most popular video on YouTube and had the biggest view count and follower count ever... it can't be monetized if it's not instructional.

Google search results are more like that than ever right now. Google's SERPs want to link to instructional content for all non-shopping queries. It's time to think about changing your approach to content creation and to look at it from Google's point of view.

Caveat: don't go to the other extreme and become a content farm! Only the best guide ranks #1, and that's your goal. If you're not into guides and want to focus on opinion/news pieces, then Google is less interested in you right now than it has been in the past. This isn't true for all subjects, of course; true authorities on every subject will always appear in the SERPs (seemingly around #4, with a G+ profile pic), but the top results tend to be very instructional in nature for non-product/shopping keywords.

Ralph_Slate




 
Msg#: 4625810 posted 4:38 am on Nov 26, 2013 (gmt 0)

Wikipedia is perhaps one reason for "bad" pages: it has stub pages which are very clearly incomplete, but which need to exist, because if they didn't, no one would know about those topics and add to them.

Zivush



 
Msg#: 4625810 posted 9:37 am on Nov 26, 2013 (gmt 0)

Quality and non-quality are subjective.
Something considered good by one person might seem bad (or simply incomprehensible) to another. Any simple survey will show the differences in how people interact with the content.
The question is: how does the damn algorithm evaluate content quality? Links? Time on page? Where the content is located?

Sgt_Kickaxe




 
Msg#: 4625810 posted 12:52 pm on Nov 26, 2013 (gmt 0)

The question is: how does the damn algorithm evaluate content quality? Links? Time on page? Where the content is located?


I wouldn't be surprised if they decided to say "OK, we have an index now, let's clean it up, grandfather what we have and focus on evaluating new content better". I say that because you can do almost anything you like to old content pages without losing much rank, but getting a new page inside the top 3-4 for anything but longtail keywords has become more difficult. If they are moving away from being so link-dependent, it's my opinion that the SERPs would need to become more set in stone for existing content, with a tougher evaluation process to see where new pages fit in.

I have worked with a couple of sites recently where random gibberish 8+ word titles ranked in positions 4-10 for exact-match searches of those titles, suggesting that the top 3-4 positions are off limits initially, or that the sites had some kind of anchor. That's not how it was even a few months ago.

Ralph_Slate




 
Msg#: 4625810 posted 5:40 pm on Nov 26, 2013 (gmt 0)

Sgt_Kickaxe, look at the situation from Google's perspective. It is using computers, not experts, to determine quality and to determine when one site should be knocked out in place of another. You may have just written the all-time best article on widgets, but how does Google know that? Their algorithms can't understand the universe of widgets.

They can test the new page by throwing it into the mix every so often. A/B test it against the established pages. If people like it (hard to determine) then they bump it up, maybe give it higher rotation. Eventually it may make it to the top.
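Just to make that rotation idea concrete, here is a toy epsilon-greedy sketch of "throw the new page into the mix occasionally and promote it if people respond". It is purely an illustration of the speculation above, not Google's actual system; the click-through rates and the 10% exploration share are invented numbers.

    import random

    # Invented "true" click-through rates; the ranker only sees observed clicks.
    TRUE_CTR = {"established-a": 0.12, "established-b": 0.10, "new-page": 0.20}
    observed = {page: {"clicks": 0, "shows": 0} for page in TRUE_CTR}
    EPSILON = 0.10  # share of impressions reserved for experimentation

    def observed_ctr(page):
        stats = observed[page]
        return stats["clicks"] / stats["shows"] if stats["shows"] else 0.0

    for _ in range(10000):
        if random.random() < EPSILON:
            page = random.choice(list(TRUE_CTR))      # explore: test a page at random
        else:
            page = max(TRUE_CTR, key=observed_ctr)    # exploit: show the current leader
        observed[page]["shows"] += 1
        if random.random() < TRUE_CTR[page]:
            observed[page]["clicks"] += 1

    # After enough impressions, the genuinely better new page earns the top slot.
    print({page: round(observed_ctr(page), 3) for page in observed})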

It actually sounds a lot like trying to break a new record onto radio. Radio stations are going to give breaks to artists they have heard of before, and it's going to be very hard to get a new artist played unless people ask for it.

So maybe the trick is to build a brand, get people to ask Google for it, and then you'll rise in the results.

n0tSEO



 
Msg#: 4625810 posted 10:22 pm on Nov 26, 2013 (gmt 0)

ganeshjacharya, can you please define "bad pages"? Are they of the kind Sgt_Kickaxe mentioned?

Sgt_Kickaxe




 
Msg#: 4625810 posted 10:42 pm on Nov 26, 2013 (gmt 0)

Sgt_Kickaxe, look at the situation from Google's perspective

Not likely; Google has floated far too many big-brand commercial sites to the top for my liking. It's almost as if they want to rid the net of non-commercial properties. I say that because it adds a whole other layer to what may be considered good and bad: the subject matter and the purpose of the site.

EditorialGuy




 
Msg#: 4625810 posted 11:59 pm on Nov 26, 2013 (gmt 0)

Wikipedia is perhaps one reason for "bad" pages: it has stub pages which are very clearly incomplete, but which need to exist, because if they didn't, no one would know about those topics and add to them.


And not just Wikipedia, either. CNET, ZDNet, and TripAdvisor are just a few of the megasites with maddening numbers of stubs. (I really, really hate it when I arrive on a CNET or ZDNet "product review" page that turns out to have no review, only a handful of price-comparison links.)

I can understand the need for UGC sites (such as Wikipedia) to have stub pages that users will eventually populate with content, but there's no reason why such pages should be indexed by Google and presented in search results.
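As a hedged sketch of how a UGC site could keep stubs reachable for contributors while asking search engines not to index them: serve an X-Robots-Tag header (or the equivalent robots meta tag) whenever the stored article falls below some length threshold. The Flask route, the ARTICLES store and the 500-character cutoff below are hypothetical placeholders, not how Wikipedia or the sites mentioned above actually handle it.

    from flask import Flask, abort, make_response

    app = Flask(__name__)

    # Hypothetical content store; a real UGC site would query its database here.
    ARTICLES = {
        "widget-history": "A long, well-researched article about widgets ... " * 50,
        "widget-stub": "Short placeholder text awaiting contributions.",
    }

    STUB_THRESHOLD = 500  # characters; an arbitrary cutoff for this sketch

    @app.route("/wiki/<slug>")
    def article(slug):
        body = ARTICLES.get(slug)
        if body is None:
            abort(404)
        resp = make_response(body)
        # Stubs stay online for editors, but search engines are asked not to index them.
        if len(body) < STUB_THRESHOLD:
            resp.headers["X-Robots-Tag"] = "noindex, follow"
        return resp

    if __name__ == "__main__":
        app.run()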
