
Google SEO News and Discussion Forum

This 98-message thread spans 4 pages; this is page 3.
25 Signals of Crap
So many threads on Signals of Quality - let's take the reverse approach...
Fribble

5+ Year Member



 
Msg#: 3233693 posted 5:34 am on Jan 27, 2007 (gmt 0)

A recent thread I read in the supporters forum mentioned the ever-popular phrase "signals of quality," and it got me thinking: why don't we try to compile a list of possible "signals of crap"?

I realize that many of the list items would logically just be a perceived "signal of quality" reversed, but if we all dig down into our experience, this exercise just might bring up a few unique and useful insights. Here's a start:

25 Signals of Crap

  1. Reciprocal link request pages.
  2. No privacy policy.
  3. Outdated copyright date or last-modified date visible on the pages.
  4. Error pages that don't send 404 headers, or that send content regardless of the page requested/query string entered (see the sketch after this list).
  5. Massive numbers of incoming links from link farms.
  6. Dead/404ing links.
  7. High link churn.
  8. No published contact address, email address or phone number.
  9. A high bounce rate (surfers clicking back on their browser and selecting another search result).
  10. Too much duplicate content.
  11. Whois info for the domain that is the same as for other domains previously penalized or banned. (Could also be true of AdSense publisher/affiliate IDs and other identifiable footprints.)
  12. Use of/links to affiliate programs that are known scams.
  13. Domains previously used for spam, or that are blacklisted.
  14. Stagnation (site never changes).
  15. Excessively long URIs/URLs (query strings or folder and file names).
  16. A high percentage of affiliate links vs. regular outbound links.
  17. No / very few outbound links.
  18. No / very few inbound links.
  19. All inbound links go to the homepage only.
  20. Outbound links to questionable/spammy/crap sites.
  21. Profanity or explicitly adult language on a non-adult site.
  22. Too many spelling errors.
  23. Contains unrelated subjects (e.g. a site that reviews toys and tries to sell insurance or Viagra).
  24. Lack of interest from social bookmarking sites.
  25. MySQL or PHP errors in the pages.
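
A couple of these, like items 3 and 4, are easy to spot-check automatically. Here's a minimal Python sketch of the idea - example.com is just a stand-in for whatever site you want to review - that flags a "soft 404" (a 200 status returned for a URL that shouldn't exist) and a stale copyright year on the homepage:

import re
import datetime
import urllib.error
import urllib.request

SITE = "http://example.com"  # stand-in domain; point this at the site under review

def status_of(url):
    """Return the HTTP status code for a URL (redirects are followed by default)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Signal 4: an obviously bogus URL should come back 404, not 200 (a "soft 404").
bogus_url = SITE + "/this-page-should-not-exist-xyz123"
if status_of(bogus_url) == 200:
    print("Possible soft 404: a bogus URL returned 200")

# Signal 3: look for an outdated copyright year on the homepage.
html = urllib.request.urlopen(SITE).read().decode("utf-8", errors="replace")
years = [int(y) for y in re.findall(r"(?:&copy;|copyright)\s*(\d{4})", html, re.I)]
if years and max(years) < datetime.date.today().year - 1:
    print("Copyright year looks stale:", max(years))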

I'm not claiming any of these are definitely a signal of crap, but they make my list of possibilities based on my own conversations and observation.

Please add/subtract/modify and let's see if we can find a new perspective and learn something.

 

readingonly

5+ Year Member



 
Msg#: 3233693 posted 2:38 pm on Feb 5, 2007 (gmt 0)

"22.Too many spelling errors."

But comments on my blogg have milions of speling erors. What can I do about it?

mattg3

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3233693 posted 3:32 pm on Feb 5, 2007 (gmt 0)

I would like to add "Private domain registration."

I don't have one of those, but I can still understand why people want one. The world is a big, bad place; any option that reduces the ability to be stalked can only be appreciated.

Suppose someone publishes a website that defends against some kind of hate site. That publisher might not want to expose their Whois information publicly, making themselves a personal target of the hate site people.

If law enforcement has access to said private registration, this would prevent criminals from using the same technique while giving normal people the chance to protect themselves.

[edited by: tedster at 6:18 pm (utc) on Feb. 5, 2007]

voices

10+ Year Member



 
Msg#: 3233693 posted 4:15 pm on Feb 5, 2007 (gmt 0)

I like old copyright dates; they show me the site has been around a while.

cnvi

10+ Year Member



 
Msg#: 3233693 posted 4:31 pm on Feb 5, 2007 (gmt 0)

Reciprocal link request pages.

I'm not claiming any of these are definitely a signal of crap, but they make my list of possibilities based on my own conversations and observation.

Fribble, good list. I would like to explain why "reciprocal link request pages" may be a signal of crap.

Many junk sites use reciprocal linking because they are desperate for traffic and link exchange is their equalizer: junk sites have few other choices for generating traffic, so they choose link exchange because it's available to them.

But just because the crooks do it doesn't mean it's bad for everyone!

I have said it here and I will say it again:

NOT ALL SITES THAT EXCHANGE LINKS ARE LOW QUALITY SITES. Pick your favorite hobby and then search those keywords along with "submit link" or "link request" and you will find hundreds of thousands of quality sites that have ongoing, active link exchange campaigns - many of which use link management software to handle the data management chore that comes along with reciprocal linking.

I think paranoia abounds in the SEO community because many SEOs think that reciprocal linking is useless as an SEO tool. Who said link exchange should be an SEO function? I have never stated that here, nor will I ever.

Link exchange, when done in low to natural volume with relevant sites, is a perfectly acceptable (and search engine friendly) method of generating quality traffic entirely apart from SE returns. Link exchange should be conducted as a branding and traffic-building function, never as an SEO function.

For those of you who were on the web pre-Google, you know that link exchange always has existed and always will.

What you should avoid are reciprocal linking services that offer FULL DUPLEX link exchange products. FULL DUPLEX means the software auto-links you, in high volume over a short period of time and without any editorial discretion, to (sometimes irrelevant) sites that have no benefit for your end user.

If you read Google's patents, you would know that Google wants you to obtain links using editorial discretion.

There is good link exchange and there is bad link exchange. If you decide to pursue a reciprocal linking campaign, use EDITOR-BASED software so that you maintain full editorial control over who you link out to.

For those of you who are rolling your eyes now, here's some undeniable proof from one of your favorite search engines:

Go to Google's webmaster guidelines and read the FAQ "How do I add my site to Google's search results?" Google answers by saying "Although Google crawls billions of pages, it's inevitable that some sites will be missed. When our spiders miss a site, it's frequently for one of the following reasons ..."

Google goes on to list four specific reasons why a site may not have been indexed. The first of these reasons, the VERY first, is "the site isn't well connected through multiple links to other sites on the web." To be "well connected" you need links "TO other sites." That's Google talking, not any self-proclaimed SEO guru. Those are Google's own words: the site isn't well connected through multiple links to other sites on the web.

Note also that Google specifically says "connected through multiple links TO other sites." The word "to" means out-going, forward. You are linking to someone, as opposed to receiving a link from someone.

How many forum posts have you read that claim forward links are bad, that only one-way incoming links are good? Google doesn't seem to agree. Google outright says that to be "well connected" you need links "TO other sites."

Here's another reference from Google: The question this time is "My site's no longer included in the search results. What happened?" After explaining how sites can sometimes be dropped because of "fluctuations" in data-center processing, Google says this: If your site is well-linked from others on the web, it's likely that we'll add it again during our next crawl.

There's that "well" word again. "well linked - well connected." In response to this question, Google is saying your site should be "WELL-linked FROM others." In the answer to the first question they said your site should be "WELL-connected through multiple links TO other sites."

The search engines have NEVER stated "don't link exchange". They have blogged (even recently) about watching volume and relevancy. But they will never tell you not to link exchange, because link exchange is what makes the web a web.

Sites that exchange links with proper relevance and in low to natural volume excel both in terms of the traffic those links generate and in whatever link juice the links provide for rankings. In my ten years of managing link exchange campaigns for literally thousands of clients, I have never seen a single case of reciprocal linking hurting a site in rankings, with a couple of exceptions when the site was linking to a vast assortment of genres that may have been viewed as irrelevant.

Link exchange is a data management challenge - thus the link request forms you are used to seeing more and more often. As long as the form allows the receiving webmaster to maintain editorial discretion, there is nothing wrong with link management software.

I know why many of you are anti link exchange: you are tired of seeing it abused. However, do know that not everyone exchanging links is abusing this time-tested marketing vehicle. Link exchange, reciprocal linking, whatever you want to call it - it's what makes the web a web, and expecting new sites to "obtain one way links from quality sites" is not realistic for most small to medium businesses.

Link exchange works when not abused.

Getting back to the 25 indicators of "crap" ... judge a site based on its overall quality and benefit to the end user, not on how it markets itself and builds its traffic.

econman

10+ Year Member



 
Msg#: 3233693 posted 5:02 pm on Feb 5, 2007 (gmt 0)

While humans can "judge a site based on its overall quality and benefit to the end user," when I see the word "signals" I immediately think of the way Google and other SEs attempt to judge a site -- which is based on the "signals" emitted by data patterns that are easily captured, stored and evaluated by computer algorithms.
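
To make that concrete, here's a tiny illustrative Python sketch of how a handful of easily measured yes/no signals could be folded into a single score. The signal names and weights are invented purely for illustration - nothing here is Google's actual algorithm.

# Invented signal names and weights, purely for illustration.
SIGNAL_WEIGHTS = {
    "soft_404_error_pages": 3.0,
    "no_contact_info": 1.0,
    "high_link_churn": 2.0,
    "excessive_spelling_errors": 1.5,
    "dead_outbound_links": 1.0,
}

def crap_score(observed_signals):
    """Sum the weights of whichever signals were observed for a site."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) for name in observed_signals)

# A site tripping two of the signals above scores 4.5.
print(crap_score({"soft_404_error_pages", "excessive_spelling_errors"}))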

sunny_kat

10+ Year Member



 
Msg#: 3233693 posted 5:24 pm on Feb 5, 2007 (gmt 0)

I feel that reciprocal linking still exists because no one would be interested in linking to you without some exchange benefit.

The only factor that would change is the way Google treats those inbound links.

Thoughts?

aleksl



 
Msg#: 3233693 posted 6:25 pm on Feb 5, 2007 (gmt 0)

Fribble, nice post.

Lots of unrelated talk, so let's define this:

Is this about Google's "perceived" quality, or Quality for the user?

No real menu or information architecture -- just a laundry list of links going on down the page

like any SERPs, right? :)

mattg3

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3233693 posted 6:25 pm on Feb 5, 2007 (gmt 0)

Thoughts?

Google wants a 1999 web where most people published just for fun, out of interest, or academically. Links were then a measure of interest in the site. Any kind of dealing, whether reciprocal or brokered, isn't what they want.

To maintain their link thingy, their only aim can be to exclude any commercial linking of any sort. Anything else will be tarnished with some selfishness. ;)

So I guess I would throw out by default all forums, wikis, blogs, MySpaces and so on. Then you remove all the .com and .whatevernewtldripoffwasinvented sites and leave the .gov, .mil and .edu sites, possibly leaving old sites valid. Pretty much what they have done.

Think 1999 :)

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3233693 posted 6:40 pm on Feb 5, 2007 (gmt 0)

Too many spelling errors

I just watched a video interview with Matt Cutts -
[zdpub.vo.llnwd.net...]

Here he mentioned that when he was working on Safe Search, one signal he found for junky adult sites was the misspelling "amature".

That's just one example from several years ago, but it made me pretty sure that Google must look for excessive spelling errors as one signal of low quality. My guess is also that they use different criteria for sites with user-generated content.
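
If you wanted to approximate that kind of signal yourself, a crude sketch is to count how many words on a page are missing from a word list. This assumes a plain-text word list at /usr/share/dict/words (common on Unix systems) and is only a rough proxy, not whatever Google actually measures:

import re

def misspelling_ratio(text, wordlist_path="/usr/share/dict/words"):
    """Rough fraction of words not found in a local word list."""
    with open(wordlist_path, encoding="utf-8") as f:
        known = {line.strip().lower() for line in f}
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    unknown = [w for w in words if w not in known]
    return len(unknown) / len(words)

# "amature" would typically be flagged; "amateur" would not.
print(misspelling_ratio("The amature photographer posted an amateur review"))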

[edited by: tedster at 6:54 pm (utc) on Feb. 5, 2007]

ken_b

WebmasterWorld Senior Member ken_b us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3233693 posted 6:46 pm on Feb 5, 2007 (gmt 0)

....no one would be interested in linking to you without any exchange benefit.

Not always true. There may well be many webmasters who won't link to a site without getting a reciprocal link back.

But there are other webmasters that actively look for sites to link to without even being asked, and certainly without asking for a link back.

That may depend on your market, of course; relevant unreciprocated inbound links may be harder to get in some markets.

mzanzig

10+ Year Member



 
Msg#: 3233693 posted 8:05 pm on Feb 5, 2007 (gmt 0)

mattg3:

on private domain registrations...

I don't have one of those, but I can still understand why people want one. The world is a big, bad place; any option that reduces the ability to be stalked can only be appreciated.

Yeah, sure.

I am keeping a list of MFA (made-for-ads) sites, and many use a private registration. To me it is a clear indicator of folks who indeed do not want to be tracked down, for whatever reason. If someone is unwilling to disclose where they work/operate, well, I do not have much confidence in the validity of the site.

reprint

5+ Year Member



 
Msg#: 3233693 posted 8:35 pm on Feb 5, 2007 (gmt 0)

Lack of original content on an informational site. The duplication rule probably works well here, but it is not perfect. AdSense tends to work against this, encouraging keyword stuffing and rewarding shorter pages or incomplete information (I'll probably get flamed for this), but incomplete information makes it likely people will look for links to get more info, e.g. AdSense ads.
Quality must be tied to the purpose of the site. If it accomplishes its goal, then it's a quality site.
Design may look outdated, but the site may still be valuable, e.g. sites targeted at countries with dial-up.
If you are a vendor and you attract customers and make sales, then it's a quality site. You are fulfilling a need and a purpose. I don't know how a bot can detect that, but natural selection will determine which sites stick around in that area.
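
On the duplication point, one common way to estimate near-duplicate content - offered here only as a sketch of the general idea, not of any engine's actual duplication rule - is shingling: compare the sets of overlapping word sequences two pages share.

def shingles(text, k=5):
    """Set of overlapping k-word sequences ("shingles") from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard overlap of two texts' shingle sets; close to 1.0 means near-duplicate."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two pages that share most of their wording score close to 1.0.
page_a = "widgets are great and here is our unique widget buying guide " * 3
page_b = page_a + "plus one extra closing line"
print(similarity(page_a, page_b))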

How about domain squatting or typo-squatting, where the site just serves up ads when people are looking for information?

Lack of archiving on informational sites, leading to the loss of valuable content and historical information.

Just some thoughts

tigertom

10+ Year Member



 
Msg#: 3233693 posted 9:15 pm on Feb 5, 2007 (gmt 0)

Have one or two 'signals of bad quality' on an otherwise popular site, no problem.

Have a few on a new or unpopular one, get nuked.

Sorry if this has been said already. I couldn't be bothered reading through the self-justifying whingeing earlier on.

annej

WebmasterWorld Senior Member annej us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3233693 posted 9:32 pm on Feb 5, 2007 (gmt 0)

To be "well connected" you need links "TO other sites."

That's very interesting. I've heard it talked about for a while as a possibility but this sounds official. How long ago was this?

JamieBrown

5+ Year Member



 
Msg#: 3233693 posted 10:30 pm on Feb 5, 2007 (gmt 0)

I like this thread - I'm going to bookmark it for regular future reference!

I don't know if a lack of (bot) accessibility counts as a signal of low quality, but I'll throw it in anyway - especially things like JavaScript links that mean Google won't get anywhere around your site, or a malformed robots.txt. It may seem obvious, and people often get it wrong, but it's probably one stage back from pure "quality"?
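
On the robots.txt point, Python's standard library can do a quick sanity check that a crawler-style parser can fetch your file and that it doesn't block pages you expect to be crawlable - a minimal sketch, with example.com standing in for your own site (it won't catch every malformed file, since the parser is fairly forgiving):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")  # stand-in domain
rp.read()  # fetch and parse the file

# Can a generic crawler fetch the homepage and a deeper page?
for path in ("/", "/some/deep/page.html"):
    url = "http://example.com" + path
    print(path, "allowed" if rp.can_fetch("*", url) else "blocked")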

Also, although I'm cynical about the impact of a Privacy Policy page, I'm not cynical about a proper P3P policy. Our sites all return their policy in an HTTP response header using something like:

Header append P3P "policyref=\"[mydomain.com]\", CP=\"NON DSP COR CURa TIA\""

I think professional sites (especially those with forms, cookies and carts) should present one of these, and I can imagine Google rating the site for it at least a little. Of course, many amateur or small sites won't, so it can't be a definite on/off factor.
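
A quick way to confirm the header actually goes out with your responses - a minimal Python sketch, with example.com standing in for one of your own pages:

import urllib.request

with urllib.request.urlopen("http://example.com/") as resp:  # stand-in URL
    p3p = resp.headers.get("P3P")

if p3p:
    print("P3P header present:", p3p)
else:
    print("No P3P header sent.")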

Not sure about stuff like the "mail a friend" link, or an e-mail address on the page etc, but maybe I'm oversimplifying Google's algorithm!

I wish I understood sensible link structure and how that affected SEO! :-)

[edited by: JamieBrown at 10:35 pm (utc) on Feb. 5, 2007]

JamieBrown

5+ Year Member



 
Msg#: 3233693 posted 10:39 pm on Feb 5, 2007 (gmt 0)

That's very interesting. I've heard it talked about for a while as a possibility but this sounds official. How long ago was this?

Although I don't know anything official, I've seen much anecdotal evidence of pages with good outbound links to respectable sites appearing much quicker than pages with just pure original content and no links, on the same domain. Usually my "site:" list is all pages with links (within content) for quite a while before the others slowly get filtered in. That must mean something.

mattg3

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3233693 posted 11:22 pm on Feb 5, 2007 (gmt 0)

especially things like JavaScript links that mean Google won't get anywhere around your site, or a malformed robots.txt

On the other hand, technical ignorance could also be an indicator that you don't have the ability to con Google. How many mom-and-pop sites, or whatever that's called in American, lack design skills, technical ability and so on?

nuevojefe

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3233693 posted 4:26 am on Feb 6, 2007 (gmt 0)

# Reciprocal link request pages. Check.
# No Privacy policy. Check.
# Outdated copyright date or last modified date visible on the pages. Check.
# error pages that don't send 404 headers or send content regardless of the page requested/querystring entered. Check.
# Massive numbers of incoming links from link farms. Check.
# dead/404ing links. Only a few.
# High link churn. Possibly.
# No published contact address, email address or phone number. Check.
# A high bounce rate (surfers clicking back on their browser and selecting another search result). Medium, although 25%+ leave through adsense on first pageview.
# Too much duplicate content. No.
# Whois info for the domain which is the same as other domains previously penalized or banned. (Could also be true of adsense publisher/affiliate ID's and other identifiable footprints) Check, Check.
# Use of/links to affiliate programs that are known scams. No.
# Domains previously used for spam or that are blacklisted. Minor.
# Stagnation (Site never changes). Some UGC growth.
# excessively long URI's/URL's (query strings or folder and file names). Check.
# A high percentage of affiliate links vs regular outbound links. Check (most of the aff links are dead too :-) )
# No / very few outbound links. Check.
# No / very few inbound links. No.
# All inbound links are to homepage only. No.
# Outbound links to questionable/spammy/crap sites. No.
# Profanity or explicitly adult language on a non-adult site. No.
# Too many spelling errors. Check.
# Contains unrelated subjects (ex: a site that reviews toys and tries to sell insurance or viagra). No.
# Lack of interest from social bookmarking sites. Check.
# MySQL or PHP errors in the pages. Occasional.

Top 10 for one of the top 200 phrases according to Wordtracker's non-adult "long-term" top phrases list. A decent amount of secondary keyword rankings as well.

And no, we don't aim to make sites this crappy. We just have some that weren't highly important and ended up this way. When we get around to it (as we currently are for this one), we completely fix these issues.

PM me if I win.

Bewenched

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3233693 posted 7:38 am on Feb 6, 2007 (gmt 0)

I read an article a few months back about how "poorly designed" or ugly websites - ones that look like an amateur did them - actually perform very well.

Their reasoning was that these types of sites are generally thought of as trustworthy by consumers because they don't look like some sort of corporate machine.

Honestly, I can't remember where I read the article. I guess I could see the benefit if you were doing a blog for AdSense or something... something that didn't really "look" like a blog, but looked more like someone's personal homepage, I guess.

Moncao

5+ Year Member



 
Msg#: 3233693 posted 8:43 am on Feb 6, 2007 (gmt 0)

Trademark 1999+ = a penalty?

glengara

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3233693 posted 11:04 am on Feb 6, 2007 (gmt 0)

We're talking about G, so the obvious major signal of crap/quality is the outgoing links; in that "Timeline" post, the first thing Matt noticed was the off-topic links...

pontifex

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3233693 posted 11:28 am on Feb 6, 2007 (gmt 0)

# Excessively long URIs/URLs (query strings or folder and file names)

Dangerous assumption, since WordPress puts the post title into the URL!

;-)
P!

gabrielk

10+ Year Member



 
Msg#: 3233693 posted 2:22 pm on Feb 6, 2007 (gmt 0)

I know the replies about misspellings in blog comments and long post titles generated by WordPress are mostly tongue in cheek, but I wanted to point out that I think blogs are different animals entirely. Different kinds of websites play by different rules: people make allowances for personal and hobby sites (such as band sites) that they don't make for business sites. For example, if Amazon.com had the kind of navigation arcadefire.com has... well.

I am wondering about things like hiding the registrant information of the domain name, etc. When I judge a site as "crap" vs. "quality" it's generally a snap judgement based on seeing many of these factors, and then I bounce. I don't really take time to see whether the domain is blacklisted, has sent spam, or has hidden its registrant. Too much digging for me.

My first look is for the amount of content versus ads. If I mostly see ads and have a hard time distinguishing ad space from content space, I bounce because to me the site is crap. It's not providing anything useful that I can't find somewhere else on the web--it's a big place!

Likewise, if I see ads masquerading as navigation, I bounce.
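
If you wanted to turn that gut check into a rough number, one very crude sketch is to compare the count of ad-network references in the HTML with the amount of visible text. The hostnames and the sample markup below are just illustrative, not a real measurement of what any engine does:

import re

# Illustrative page markup; in practice you would fetch the real HTML.
html = """
<script src="http://pagead2.googlesyndication.com/pagead/show_ads.js"></script>
<p>Here is one short paragraph of actual content.</p>
<script src="http://pagead2.googlesyndication.com/pagead/show_ads.js"></script>
"""

# Count references to a couple of example ad-network hostnames.
ad_hosts = ("googlesyndication.com", "doubleclick.net")
ad_refs = sum(html.count(host) for host in ad_hosts)

# Very rough "visible text" estimate: strip tags and count words.
text = re.sub(r"<[^>]+>", " ", html)
word_count = len(text.split())

print(ad_refs, "ad references vs", word_count, "words of text")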

Brett_Tabke

WebmasterWorld Administrator brett_tabke us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3233693 posted 6:42 pm on Feb 6, 2007 (gmt 0)

> 25 signals of crap

Why 25? All you need are three:

WordPress, Blogger, Blogspot :-)

...running and ducking for cover.

appi2

5+ Year Member



 
Msg#: 3233693 posted 6:44 pm on Feb 6, 2007 (gmt 0)

What, so MySpace is OK?!

mattg3

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3233693 posted 6:50 pm on Feb 6, 2007 (gmt 0)

What, so MySpace is OK?!

No, it's considered off the scale :)

fischermx

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3233693 posted 6:59 pm on Feb 6, 2007 (gmt 0)


Why 25? All you need are three:
WordPress, Blogger, Blogspot

Yikes! The boss is angry today... :(

fischermx

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3233693 posted 7:01 pm on Feb 6, 2007 (gmt 0)

BTW, is there a difference between Blogger and Blogspot?
There doesn't seem to be any today.

Was there any difference in the past?

shyspinner

5+ Year Member



 
Msg#: 3233693 posted 8:59 am on Feb 7, 2007 (gmt 0)

Hey, I thought it was said a year ago that business sites should have a blog...?

BeeDeeDubbleU

WebmasterWorld Senior Member beedeedubbleu us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3233693 posted 10:12 am on Feb 7, 2007 (gmt 0)

I believe that many Internet things like blogs are very much flavour of the month. They spring up from nowhere, have their ten minutes of fame, and disappear into the ether as soon as someone comes up with something more interesting.

They are still popular, but their proliferation, and the ease with which they can be created by people with no real intellect, mean that all sorts of spammers, chancers and weirdos are now publishing them. To me this means they have started to move into the realm of being a signal of crap.

groovyhippo

10+ Year Member



 
Msg#: 3233693 posted 1:28 pm on Feb 7, 2007 (gmt 0)

I don't think anyone mentioned having the words "under construction" on a page. Especially if it's accompanied by a nice animated GIF of a guy with a shovel.
