Google SEO News and Discussion Forum

This 82 message thread spans 3 pages.
Google Algorithm - What are the 200 Variables?
irldonalb - msg:4030022 - 12:54 pm on Nov 23, 2009 (gmt 0)

At PubCon, Matt Cutts mentioned that there were over 200 variables in the Google Algorithm.

I thought I’d start a list...

Domain
- Age of Domain
- History of domain
- KWs in domain name
- Sub domain or root domain?
- TLD of Domain
- IP address of domain
- Location of IP address / Server

Architecture
- HTML structure
- Use of Headers tags
- URL path
- Use of external CSS / JS files

Content
- Keyword density of page
- Keyword in Title Tag
- Keyword in Meta Description (Not Meta Keywords)
- Keyword in header tags (H1, H2, etc.)
- Keyword in body text
- Freshness of Content

Per Inbound Link
- Quality of website linking in
- Quality of web page linking in
- Age of website
- Age of web page
- Relevancy of page’s content
- Location of link (Footer, Navigation, Body text)
- Anchor text of link
- Title attribute of link
- Alt text of linking images
- Country specific TLD domain
- Authority TLD (.edu, .gov)
- Location of server
- Authority Link (CNN, BBC, etc)

Cluster of Links
- Uniqueness of Class C address.
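To make the "Class C" point concrete: two IPv4 addresses sit in the same Class C (/24) block when their first three octets match, so links from many domains hosted on one /24 look like a single neighborhood. A minimal Python sketch - my own illustration, not anything Google has published:

```python
def class_c(ip):
    """Return the /24 ("Class C") prefix of a dotted-quad IPv4 address."""
    return ".".join(ip.split(".")[:3])

def share_class_c(ip_a, ip_b):
    """True when two hosts sit in the same /24 block."""
    return class_c(ip_a) == class_c(ip_b)

# Links from the same /24 are less "unique" than links spread across blocks:
print(share_class_c("203.0.113.10", "203.0.113.77"))   # True
print(share_class_c("203.0.113.10", "198.51.100.77"))  # False
```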

Internal Cross Linking
- Number of internal links to page
- Location of link on page
- Anchor text of FIRST text link (Bruce Clay’s point at PubCon)

Penalties
- Over Optimisation
- Purchasing Links
- Selling Links
- Comment Spamming
- Cloaking
- Hidden Text
- Duplicate Content
- Keyword stuffing
- Manual penalties
- Sandbox effect (Probably the same as age of domain)

Miscellaneous
- JavaScript Links
- No Follow Links

Pending
- Performance / Load of a website
- Speed of JS

Misconceptions
- XML Sitemap (Aids the crawler but doesn’t help rankings)
- PageRank (General Indicator of page’s performance)
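Several of the on-page items above reduce to simple counting - keyword density, for instance, is roughly occurrences of the keyword over total words. A rough sketch, assuming naive regex tokenization and a single-word keyword (whatever Google actually computes is surely more involved):

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` as a fraction of all words (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

body = "Blue widgets are great. Buy blue widgets from our widget store."
print(round(keyword_density(body, "widgets"), 3))  # 0.182 (2 of 11 words)
```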

 

tedster - msg:4030261 - 6:58 pm on Nov 23, 2009 (gmt 0)

Wow, thanks for even trying this task.

I think some of those may be the final effect of specific factors, rather than being factors on their own. Here are some ideas I've had:

1. History of past penalties for this domain
2. History of past penalties for this owner
3. Semantic information (phrase-based indexing and co-occurring phrase indicators)
4. Taxonomy flag for general category (transactional, informational, navigational)
5. Taxonomy flag for freshness
6. Taxonomy flag for market niche

Reno - msg:4030289 - 7:28 pm on Nov 23, 2009 (gmt 0)

Probably some of the stuff left over from the old days is still in usage:

- Word emphasis (bold, italics, underline)
- positioning of most relevant words on page
- synonyms relating to theme of page/site
- alt tags / graphic file names / proximity of graphics to supporting text

..................

zehrila - msg:4030354 - 8:29 pm on Nov 23, 2009 (gmt 0)

Tedster: What's the difference between a penalty for a domain and a penalty for the owner? Secondly, I have seen sites that got penalised and later, once fixed, touched the sky in terms of traffic and SERPs.

Very nice thread, indeed.

tedster - msg:4030366 - 8:43 pm on Nov 23, 2009 (gmt 0)

What's the difference between a penalty for a domain and a penalty for the owner?

If several domains owned by the same owner pick up a penalty - especially a ban - there is some conjecture that this can spill over to other domains that are also owned by the same person or business.

kidder - msg:4030398 - 9:35 pm on Nov 23, 2009 (gmt 0)

You don't win the prize unless you get all 200 in correct order.... :)

HuskyPup - msg:4030404 - 9:43 pm on Nov 23, 2009 (gmt 0)

Tell me what the prize is first:-)

gouri - msg:4030428 - 10:17 pm on Nov 23, 2009 (gmt 0)

Thank you for creating this list.

Great info.

peterdaly - msg:4030445 - 10:39 pm on Nov 23, 2009 (gmt 0)

Link density:
The percentage of words on the page that are linked (all links = 100% density; no linked words = 0% link density). A page that's all links is bad.

Non-Link word count:
How many words on a page are not links? A higher non-link word count is a general indication of more "real" content on a page.

I believe these are factors, or at least that looking at the two together is a good proxy for the "is there content on this page?" evaluation.
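The two metrics above can be computed together in one pass. A rough stdlib-only sketch - treating words inside any `<a>` element as "linked" and everything else as non-link content (real extraction would also skip scripts, navigation, and so on):

```python
from html.parser import HTMLParser

class LinkDensityParser(HTMLParser):
    """Count words inside and outside <a> elements (rough sketch)."""
    def __init__(self):
        super().__init__()
        self.in_link = 0      # nesting depth inside <a> elements
        self.link_words = 0
        self.total_words = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link -= 1

    def handle_data(self, data):
        n = len(data.split())
        self.total_words += n
        if self.in_link:
            self.link_words += n

def link_density(html):
    """Fraction of words that sit inside links: 0.0 (none) to 1.0 (all)."""
    p = LinkDensityParser()
    p.feed(html)
    return p.link_words / p.total_words if p.total_words else 0.0

page = '<p>Read our <a href="/guide">widget guide</a> for details.</p>'
print(link_density(page))  # 2 linked words out of 6 total
```

The non-link word count falls out of the same pass as `total_words - link_words`.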

HuskyPup - msg:4030524 - 1:50 am on Nov 24, 2009 (gmt 0)

A page that's all links is bad.

Incorrect, I have many trade widget pages like this and they all rank #1...caveat - for MY widget industry:-)

@ peterdaly - I know where you're coming from however specialised widget sites can do extremely well with these.

tedster - msg:4030559 - 2:27 am on Nov 24, 2009 (gmt 0)

Agreed Husky, and yet there is some factor here - it's just got to be more complex. For instance, the quality of the neighborhoods being linked to probably comes into play.

barretire - msg:4030567 - 2:44 am on Nov 24, 2009 (gmt 0)

Great list. Thank you...

TheMadScientist - msg:4030595 - 3:17 am on Nov 24, 2009 (gmt 0)

Great thread. At a glance, I'd add...

Outbound Links

Thought of a couple more not mentioned:
Frequency of Edits
% of Page Affected (Changed) by Page Edits

peterdaly - msg:4030626 - 4:38 am on Nov 24, 2009 (gmt 0)

- Keyword in Title Tag
- Keyword in header tags (H1, H2, etc.)

Adding on to those...in order of descending value:
Keyword at beginning of title tag
Keyword at beginning of h1
Keyword at beginning of h2-h6

Exact matches may also be a plus...
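The descending-value idea above (keyword earlier in the title or heading counts for more) can be illustrated with a crude position score - purely a sketch of the concept, assuming a single-word keyword and naive whitespace tokenization, not a known Google formula:

```python
def position_score(text, keyword):
    """1.0 when `text` starts with `keyword`, decaying toward 0 as the
    keyword appears later in the text; 0.0 when it is absent."""
    words = text.lower().split()
    kw = keyword.lower()
    if kw not in words:
        return 0.0
    return 1.0 - words.index(kw) / len(words)

print(position_score("Widgets for sale - Example Store", "widgets"))  # 1.0
print(position_score("Example Store - widgets for sale", "widgets"))  # 0.5
```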

tedster - msg:4030641 - 5:19 am on Nov 24, 2009 (gmt 0)

I have to chime in again. Currently I don't see evidence of keyword in H tags as a ranking factor. Those days are behind us, I think - probably too much abuse. It's still a good idea, but the ranking magic seems to be gone.

gn_wendy - msg:4030700 - 8:19 am on Nov 24, 2009 (gmt 0)

Currently I don't see evidence of keyword in H tags as a ranking factor. Those days are behind us

Probably some of the stuff left over from the old days is still in usage:
- Word emphasis (bold, italics, underline)
- positioning of most relevant words on page

Since H tags are often bold, italic, or underlined, as well as prominently placed, I would be willing to say they have some relevance - not as H tags, but because of style and position. Probably in the 180-200 range on our list of top factors.

...but I do agree with tedster, the magic is gone.

Hissingsid - msg:4030704 - 8:25 am on Nov 24, 2009 (gmt 0)

You have missed most of the most important variables. And who says speed = "Performance / Load of a website"?

Cheers

Sid

tattoos - msg:4030710 - 8:35 am on Nov 24, 2009 (gmt 0)

Nice list, great effort.

I am not sure if the following one applies though.

Per Inbound Link
- Authority TLD (.edu, .gov)

From Google groups Oct 2008 [groups.google.com]

TylerDee, TX: Are .gov and .edu back links still considered more "link juice" than the common back link?

Matt Cutts: This is a common misconception--you don't get any PageRank boost from having an .edu link or .gov link automatically.
If you get an .edu link and no one is linking to that .edu page, you're not going to get any PageRank at all because that .edu page doesn't have any PageRank.

JohnMu: We generally treat all links the same - be it from .gov or .edu or .info sites.

.gov and .edu sites have generally been around a while and gained trust over time, so I guess that can make them seem more authoritative. But on the flip side, authority sites tend to get targeted by spammers - .edu sites especially get hammered all the time...

Try this search for spammer-targeted keywords:
site:.edu/ keyword

What effect that has on trust is anyone's guess, IMHO.

Cheers
James

[edited by: Robert_Charlton at 6:21 pm (utc) on Nov. 24, 2009]
[edit reason] removed specific search [/edit]

gn_wendy - msg:4030713 - 8:44 am on Nov 24, 2009 (gmt 0)

Matt Cutts: This is a common misconception--you don't get any PageRank boost from having an .edu link or .gov link automatically.
If you get an .edu link and no one is linking to that .edu page, you're not going to get any PageRank at all because that .edu page doesn't have any PageRank.

Matt said this in 2008, and he is talking about PageRank - a factor I tend to pay less and less attention to.

In my experience, not all .gov/.edu sites were created equal, but I am a believer in the trust/authority factor.

tattoos - msg:4030723 - 9:19 am on Nov 24, 2009 (gmt 0)

Matt said this in 2008, and he is talking about PageRank - a factor I tend to pay less and less attention to.

Excuse my ignorance, but isn't this discussion about PageRank and the factors that determine it? I don't mean "toolbar PageRank" (a factor I pay zero attention to), but the algorithm Google uses to rank a page in the SERPs.

I am a believer in the trust/authority factor.

I also believe in the trust/authority factor. I just don't believe TLDs have anything to do with it.

Cheers
James

irldonalb - msg:4030811 - 12:41 pm on Nov 24, 2009 (gmt 0)

Hi,

Lots of good points I'd like to comment on.

@Hissingsid
who says speed = "- Performance / Load of a website "

This was brought up by several speakers at PubCon and Matt Cutts confirmed Google might go ahead with this in 2010.

@peterdaly
Link density is something I haven't considered. I'm going to read up on this.

@tedster / gn_wendy
I think you're correct about the H tags. My 2 cents: these tags have been devalued in the past 12 months, but I believe they still carry more weight than standard text.

There have been very few additions to the link-building factors so far.

Thanks
Donal

gn_wendy - msg:4030842 - 1:44 pm on Nov 24, 2009 (gmt 0)

...PageRank and the factors that determine it...

Point taken.

I also believe in the trust/authority factor. I just don't believe TLDs have anything to do with it.

IMO .edu/.gov sites have an easier time reaching a higher level of trust/authority than websites on other TLDs. Whether that is down to the type of website or to the TLD itself is a bit beside the point ... if the shoe fits.

Hissingsid - msg:4030846 - 1:47 pm on Nov 24, 2009 (gmt 0)

This was brought up by several speakers at PubCon and Matt Cutts confirmed Google might go ahead with this in 2010.

I've listened, and he mentioned "speed" and "fast" but not "Performance / Load of a website". The point I'm making is that we don't know what Google means by speed yet - or rather, we don't yet know the most economical ways to make Google think our site is fast.

Cheers

Sid

aristotle - msg:4030894 - 3:09 pm on Nov 24, 2009 (gmt 0)

Occasionally someone will speculate about the possibility of search engines using visitor behavior as a ranking factor. For example, browsers can collect information about how many visitors bookmark a page and then return to it later.

P.S. By visitor behavior I DO NOT specifically mean "bounce rate". Instead, I'm talking about behaviors like bookmarking and return visits.

mcskoufis - msg:4030909 - 3:24 pm on Nov 24, 2009 (gmt 0)

Judging by the new tool available for optimizing CSS and page code, I'd say this is of great benefit to Google, as it may well mean less processing power is needed.

In the "Penalties for the domain" list, I'd add CSS processing (excuse my english) for "display:none" CSS and maybe others that make the text not show up to the user.

signor_john - msg:4030912 - 3:32 pm on Nov 24, 2009 (gmt 0)

P.S. By visitor behavior I DO NOT specifically mean "bounce rate". Instead, I'm talking about behaviors like bookmarking and return visits.

I'd guess that (along with "freshness" and a number of other factors) visitor behavior would be judged in context - i.e., in relation to similar pages or sites.

Take forums:

- A good pregnancy forum or rugby fans' forum should have a lot of repeat visits, because visitors are interested in the topic and can be expected to come back.

- A good Windows support forum should have fewer repeat visits, because once the visitor's problem is solved ("Help! I can't find the other computer on my home network!"), he shouldn't need to come back until the next time he has a problem.

TheMadScientist - msg:4030916 - 3:35 pm on Nov 24, 2009 (gmt 0)

In the "Penalties for the domain" list, I'd add CSS processing (excuse my english) for "display:none" CSS and maybe others that make the text not show up to the user.

How would you suggest this applies to image swaps, especially when the image and the text seen by Googlebot are identical? How do they tell the difference or know when to apply a penalty? Or are you suggesting 'display:none;' is an automatic penalty?

It's definitely not (in my experience) a site wide penalty to use an image swap on a page, which requires 'display:none;'. (Actually, in my experience, it's not even a single page penalty to use an image swap...)

[edited by: TheMadScientist at 3:45 pm (utc) on Nov. 24, 2009]

mcskoufis - msg:4030918 - 3:43 pm on Nov 24, 2009 (gmt 0)

@TheMadScientist - I'm saying that if there is a little paragraph or block with links and the display is set to none, or there is a padding or z-index element which makes this hidden to the user.

Not sure if Google is capable of doing this kind of CSS processing (as deep and as complex), but I'm pretty sure some CSS properties are getting checked automatically, whether the CSS is embedded in the page, inline on the element, or in an external file.
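The simplest slice of the check being described - inline styles only - is easy to sketch. A Python illustration of flagging text inside inline `display:none` / `visibility:hidden` elements; this is my own toy example, and a real crawler would also have to resolve embedded and external stylesheets, the full cascade, z-index, and off-screen positioning:

```python
from html.parser import HTMLParser

# Void elements never wrap text, so they must not affect the depth counter.
VOID_TAGS = {"br", "hr", "img", "input", "meta", "link", "area", "base",
             "col", "embed", "source", "track", "wbr"}

class HiddenTextFinder(HTMLParser):
    """Collect text nodes inside elements hidden by an *inline* style."""
    def __init__(self):
        super().__init__()
        self.hidden_depth = 0   # nesting depth inside a hidden subtree
        self.hidden_text = []

    def handle_starttag(self, tag, attrs):
        if tag in VOID_TAGS:
            return
        style = (dict(attrs).get("style") or "").replace(" ", "").lower()
        hidden = "display:none" in style or "visibility:hidden" in style
        if self.hidden_depth or hidden:
            self.hidden_depth += 1

    def handle_endtag(self, tag):
        if tag not in VOID_TAGS and self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        if self.hidden_depth and data.strip():
            self.hidden_text.append(data.strip())

page = '<p>Visible copy.</p><div style="display:none">stuffed keywords</div>'
finder = HiddenTextFinder()
finder.feed(page)
print(finder.hidden_text)  # ['stuffed keywords']
```

Whether anything collected this way is treated as spam is exactly the judgment call debated above - image swaps legitimately use 'display:none;' too.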

TheMadScientist - msg:4030920 - 3:47 pm on Nov 24, 2009 (gmt 0)

...or there is a padding or z-index element which makes this hidden to the user.

Personally, I can see this one, but not the first half of the statement...

I'm saying that if there is a little paragraph or block with links and the display is set to none...

I actually know of well ranking sites where that's absolutely not the case.

* I should note, WRT the preceding, they do make use of AJAX, so the content isn't 'invisible' without viewing the source code - the visitor just has to click a link to get it to display...

mcskoufis - msg:4030924 - 3:55 pm on Nov 24, 2009 (gmt 0)

I actually know of well ranking sites where that's absolutely not the case.

Have seen the same and also some blatant 302 redirects stealing rankings from very authoritative sites, which just shows the workload Googlers need to go through to be able to detect all kinds of spam...

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved