Forum Moderators: Robert Charlton & goodroi

Google Ranking Mystery - Is There a Rhyme or Reason?

Sites with no links and little content are getting top ranking.


Webmeister

11:15 pm on Jan 24, 2007 (gmt 0)

10+ Year Member



I just checked Google's results on my preferred keywords for our main sites, and I am stumped by the SERPs I am seeing. Sites with little or no inbound links and limited content are making it to the top for some very popular keywords. I ran these sites through my SiteAnalyzer program and compared them with mine, and I cannot see why Google is ranking them higher than my site.

For example, on one of my favorite keywords, the #1 ranking site shows only nine backlinks on Google (all of them internal). The site has only 79 pages (according to archive.org it had over 140 pages this time last year). Its pages are mostly PR4 and PR5, with about a dozen PR0 pages. The preferred keyword appears 384 times across the entire website.

In contrast, my website - which has gradually lost Google ranking over the last year (it used to be #1 on this keyword) - shows 118 inbound links (a mix of internal and external). It has 213 pages. Most are PR5, with a few at PR4 and about 15 at PR0. The preferred keyword appears 294 times across my website.

I used to be able to look at a website, view its source, and run a few search engine scripts to determine why it was ranking high or low on Google - but not anymore. Can any of you explain what kind of logic Google is currently using to determine a site's ranking, or is it just a roll of the dice nowadays? Here are a few questions that I would love to have answers for:

1. Does fresh content on a regular basis increase your chances for a higher ranking?

2. Do incoming links even factor into the SERPS anymore?

3. Does Google penalize for having a few 302 and 404 pages?

4. Do you score points with Google by using one- and two-tier subdirectories with keywords in the folder titles?

5. Does Google penalize for having too many image ALT tags with the same keywords in them?

6. Has Google tossed out all of the above factors and gone to a random pick to determine the top-ranking sites?

These are just a few of the questions off the top of my head. If you have answers to these or information on other factors that Google is currently using to determine a website's ranking, I would appreciate hearing from you. Thanks!

Webmeister

8:44 pm on Jan 25, 2007 (gmt 0)

10+ Year Member



Does anyone have a link to anything written about Matt Cutts' comments about random results - or better yet, a transcript?

Ditto. I would like to see it as well. I was just on his blog looking for the statement but couldn't find it. If anyone has a link to it, please post it.

Bentler

8:59 pm on Jan 25, 2007 (gmt 0)

10+ Year Member



It's just a theory, but I suspect Google mines quality indicators from e-mail correspondence, its own metrics (Analytics), and toolbar data - for example, ranking by average on-site page views per keyphrase. I also think they use a random everflux to give unproven pages enough exposure to generate statistically comparable metrics, then re-rank them accordingly.

I also think a high correlation between on-site and off-site descriptive link text is important.

So the way to rank well would be (in addition to on-site and off-site optimization) to improve quality in general, be accurate with your link naming, and try to generate good buzz about your site.

This is just theory though.

youfoundjake

9:03 pm on Jan 25, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Ditto. I would like to see it as well. I was just on his blog looking for the statement but couldn't find it. If anyone has a link to it, please post it.

[mattcutts.com...]

cabbie

9:27 pm on Jan 25, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Age of site, trusted links, and unique content work for me.
Definitely don't chase too many links too quickly.
One other myth: Google doesn't give extra raspberries for fresh content.

httpwebwitch

9:56 pm on Jan 25, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have pages ranking at the number 1 position on the SERPs

by itself means nothing. I rank #1 for thousands of keywords, like the ever-popular "regex #*$!", because my site is extremely relevant on such phrases.

Ranking is a mix of relevance*importance, where each of those is in turn a mix of hundreds of ingredients in the secret sauce. Don't concentrate solely on importance when Google has gotten very good at delivering relevance too.
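
A crude way to picture that relevance*importance mix in code (purely illustrative: the signal names, weights, and caps below are invented, not anything Google has disclosed):

```python
# Toy sketch of "ranking = relevance * importance".
# Every signal and weight here is hypothetical; the real system
# blends hundreds of undisclosed factors.

def relevance(page, query):
    """Hypothetical query-dependent relevance, averaged over two signals."""
    signals = {
        "title_match": 1.0 if query in page["title"] else 0.0,
        # Cap body frequency so keyword stuffing stops paying off.
        "body_frequency": min(page["body"].count(query) / 10.0, 1.0),
    }
    return sum(signals.values()) / len(signals)

def importance(page):
    """Hypothetical query-independent importance (e.g. link weight)."""
    return min(page["inbound_links"] / 100.0, 1.0)

def score(page, query):
    return relevance(page, query) * importance(page)

pages = [
    {"title": "blue widgets", "body": "blue widgets " * 20, "inbound_links": 9},
    {"title": "widgets", "body": "blue widgets guide", "inbound_links": 118},
]
ranked = sorted(pages, key=lambda p: score(p, "blue widgets"), reverse=True)
```

Note how the page with only 9 inbound links can still outrank the one with 118 if its relevance is much higher - which is one possible reading of the situation the original poster describes.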

Links ARE one of those factors; PageRank is the premise upon which Google was built. Are links as important now as they were 365 days ago? Yes, but with subtly changing dynamics, I believe. GOOG is getting smarter at recognizing complex link patterns and classifying them as natural or weird - hence the scare around link farms and reciprocals.

hww
bleeped text. pardon my french.

SEOold

10:43 pm on Jan 25, 2007 (gmt 0)

10+ Year Member



I have been noticing this on two of my sites that share the same concept.

Site A

5 year old site, well branded
Good number of links - alexa rank 5000
Good number of back links
Over 100K pages
Google Bar PR 6

Site B
5 year old site - only the homepage was indexed.
3 months ago I started to test the Site A concept on it. Site B is a smaller site with maybe 30 pages; I added unique content, though practically none, or very little, to the individual pages. Site B also has no backlinks to its inside pages. Site B does have a Google bar PR4.

Results: Site B and Site A both rank for the same terms on the 1st page of Google results. Many of the terms are competitive.

It doesn't make any sense how easily Site B ranked for so many terms on page 1. For some terms its page outranked Site A's.

Muskie

2:28 am on Jan 26, 2007 (gmt 0)

10+ Year Member



It seems to me that every couple of weeks or months a thread makes it to the front page, and thus my RSS feed, where people can't understand what Google is doing. Google definitely doesn't want you to know exactly what it is doing, but the thing Google is always trying to do, besides make money, is return the best results possible for every single search term entered.

Rather than complain about Google's algorithm, you can generally improve your content, add more content, do something. I'm not big on link campaigns and random link exchanges; Google has neutered a lot of the tricks people used to abuse. I also don't think relying on a site analyzer and a couple of scripts is the way to go. If your site is over-optimized, Google can tell, and other people can tell: if you have more words in AdSense ads than in content, or if you are re-publishing content you got off an RSS feed, for instance...

But since you asked a few specific questions:

1. Does fresh content on a regular basis increase your chances for a higher ranking?

Yes. Google is a greedy algorithm - it hungers for new quality content. I think if you regularly publish quality content, Google remembers and checks your site more frequently, and possibly gives you a boost. But mainly, real people are looking for new content, and they will link to it.

2. Do incoming links even factor into the SERPS anymore?

Definitely. For a lot of terms I personally look up in my daily life, Wikipedia ranks number one or in the top 3. The reason is that in many cases the content is better, but also the weight of links coming to the English Wikipedia domain.

Google has never said all incoming links are equal - that is the premise behind PageRank - but deep links, i.e. links to interior pages rather than every incoming link pointing to www.domain.com/, are probably something else Google considers.
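
For reference, the premise mentioned above can be sketched as the classic PageRank power iteration. This is the textbook simplification (uniform teleport, damping factor 0.85), not Google's production system, and the tiny link graph is made up:

```python
# Minimal PageRank power iteration over a toy link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

graph = {
    "home": ["about", "article"],
    "about": ["home"],
    "article": ["home"],
}
ranks = pagerank(graph)
# "home" accumulates the most rank, since both other pages link to it
```

The point about unequal links falls out of the math: a link's value depends on the rank of the page it comes from and how many outgoing links share that rank.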

3. Does Google penalize for having a few 302 and 404 pages?

Not likely. That is a quality control issue. It can affect Google's spider's ability to navigate your site, and it looks unprofessional, but it is hard to never have a page-not-found error.

4. Do you score points with Google by using one- and two-tier subdirectories with keywords in the folder titles?

Being over-cute won't score you points with Google over the long run. Maybe in March 2003 someone discovered some trick of setting up a directory structure that the Google algorithm rewarded disproportionately, but these things get fixed. Your directory structure should be set up logically, for ease of updating.

5. Does Google penalize for having too many image ALT tags with the same keywords in them?

This definitely sounds spammy. The alt attribute should be used to describe the image for the benefit of screen readers and the like.

6. Has Google tossed out all of the above factors and gone to a random pick to determine the top-ranking sites?

I'm not saying that if 10 pages are scored within a few decimals of each other, or exactly the same, there might not be an element of randomness - possibly defaulting to age - but all those PhDs in mathematics and whatnot are going to come up with something better than RND or bubble sort.

Muskie

tedster

2:38 am on Jan 26, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Most of what I'd like to add is already in a recent thread -- so I guess that's why links were invented.

Natural vs. Un-natural - in SEO and the Google Algorithm [webmasterworld.com]

In "the old days" you could pretty much set up a punch list, tick off the items, and voila!, you were ranking -- even for some pretty low quality stuff at times. Today it takes less effort to actually create quality than to imitate it, and the punch list has become a doctoral thesis.

NedProf

7:48 am on Jan 26, 2007 (gmt 0)

10+ Year Member



I think I have the most interesting case ;)

Site A had PageRank 0, and now, after 10 months, it has PageRank 6...?

It has 5 pages indexed and 4 of them are supplemental.

Incoming links according to Yahoo and Live search: 1*PR6 2*PR4 2*PR5 1*PR0

I'm totally confused - is it a glitch in the PageRank algorithm? That would mean I could open a new site, try to get one PR7 link, and get a 7 as well? Very strange...

[edited by: NedProf at 7:49 am (utc) on Jan. 26, 2007]

borad

1:21 am on Jan 27, 2007 (gmt 0)

10+ Year Member



"1*PR6 2*PR4 2*PR5 1*PR0" ==> PR6

Consider: your links may include a 6.9 and two 5.9s, while your 6 may be just a 6.0.

This kind of rounding error doesn't matter greatly at lower levels, but as you go up the scale, the difference becomes substantial.
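
A toy model makes the point concrete. Toolbar PR was widely believed to be a logarithmic bucketing of a continuous internal score; the base of 8 and the floor-style display below are pure guesses for illustration:

```python
import math

# Toy model: assume internal PageRank is continuous and the toolbar
# shows floor(log_BASE(score)). BASE = 8 is an arbitrary assumption.

BASE = 8

def toolbar_pr(internal_score):
    # Small epsilon guards against floating-point log landing at 5.999...
    return max(0, math.floor(math.log(internal_score, BASE) + 1e-9))

# Two links that both display as "PR6" but carry very different weight:
low_six = BASE ** 6.0    # displays as 6
high_six = BASE ** 6.9   # also displays as 6, but ~6.5x stronger
```

Under this model a "PR6" link near the top of its bucket is several times stronger than one at the bottom, which is exactly why summing displayed toolbar values tells you little about the rank a page will actually inherit.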

Kurgano

8:27 am on Feb 2, 2007 (gmt 0)

10+ Year Member



I'm seeing that site quality matters more and more and not by being judged on incoming links either.

I have a 4-week-old article on a 5-week-old site that ranks 2nd for a popular search term in my sector, out of 17,825 possible pages. No incoming links point directly to it. My article is still PR0, but it's ahead of PR4 and PR5 articles on the same subject. A noticeable difference is that my article validates and is SEO-friendly in every way I've learned here (minus mod_rewrite; I have no .htaccess on this site), but the others in the top 10 have between 32 and 108 errors according to the W3C validator.

It's too soon to draw much from that, but still... And yes, I'm paying too much attention to every little detail, but for the first few months I want to know everything; I'll stop looking in time. I don't rush to make changes regardless.

Forgive my analogy on this one - I'm a car guy - but it feels like Google is able to tell which racecar has more horsepower under its hood and isn't judging the site on its exterior as much anymore. If that's true, I'm happy. Quality shouldn't be penalized for a lack of incoming links. If it were... old pages would hang around forever.

Kurgano

8:35 am on Feb 2, 2007 (gmt 0)

10+ Year Member



Another observation.

Syndicated content: I don't add any to my site, but I do send my articles out to a hand-chosen sector that is relevant to my site.

Having a site with content taken from feeds or RSS or what have you isn't a horrible thing, but I do suspect you'd best be providing original content too. I've watched two sites fall off the radar when they cranked up the amount of incoming stuff they carry.

Google knows that an average webmaster can't generate 20 pages a day consistently; maybe they are paying a lot more attention to this type of thing now.

Is each new page ranked separately? Does your site have an average rating, like in baseball? If you have PR4 overall and add ten straight articles worth PR2 at most on their own merits, do you lower your overall PR the same way a baseball player lowers his average?

It's best to concentrate on quality and originality and attempt to offer something of real value... analyzing this stuff starts to hurt after a while.

Bocaboy

4:10 am on Feb 13, 2007 (gmt 0)

10+ Year Member



I'm a real newbie, not even an aspiring webmaster. I opened a domain using a simple server-sharing service, and when I did a G search for my domain verbatim, G couldn't even find it, much less rank it.

So I ended up here and realized there is no such thing as real search anymore for normal web-researching people who haven't been given the site's domain on TV or in print.

If I post my domain on a telephone pole or in the classifieds, I guess that is the only viable "link".

Plus ça change, plus c'est la même chose - the more things change, the more they stay the same.

Optimus

9:41 am on Feb 13, 2007 (gmt 0)

10+ Year Member



The current Google results are a complete perversion. In my sector, on a highly competitive keyword1-keyword2 search, a PR4 site stuffed with 200 hidden-text words and key phrases repeated across its entire 80 pages of miserable content is ranking at position 6. That's 16,000 hidden words in total! It was registered in 1995. Six of the top 10 sites for this search phrase are beyond abysmal in content and user experience. They are all at least five years old.

It appears that Google really couldn't give a damn...

SteveWh

12:19 pm on Feb 13, 2007 (gmt 0)

10+ Year Member



Yes. They are now going for random results. Matt Cutts spoke about it at a recent convention. He said Google is tired of SEOs trying to manipulate the search engines, so they are now pretty much relying on random results. He compared it to a lottery system. Every site now gets a chance to rank well.

Humor about the time machine aside, did he really say the part in bold above? That is the lottery in a "classifier system", a type of machine-learning / artificial-intelligence system. It allows low-ranked sites to occasionally rank high to see how well they compete. If they perform well while their ranking is temporarily, unnaturally high, their permanent ranking is slightly improved.
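
That lottery idea is essentially the explore/exploit trade-off from learning systems. A minimal epsilon-greedy sketch (the parameters, and the suggestion that Google works this way at all, are speculation, as in the post above):

```python
import random

# Epsilon-greedy "lottery": mostly show the best-scoring pages, but
# occasionally promote a random page to collect performance data on it.
# All parameters are illustrative only.

def pick_result(pages, scores, epsilon=0.1, rng=random):
    """pages: list of page ids; scores: dict id -> current quality estimate."""
    if rng.random() < epsilon:
        return rng.choice(pages)          # explore: give any page a shot
    return max(pages, key=scores.get)     # exploit: show the best known page

def update(scores, page, performed_well, step=0.05):
    """Nudge a page's estimate based on how it performed when shown."""
    scores[page] += step if performed_well else -step

pages = ["a", "b", "c"]
scores = {"a": 0.9, "b": 0.2, "c": 0.1}
shown = pick_result(pages, scores, epsilon=0.0)  # no exploration: "a" wins
```

With a small epsilon, a low-ranked page occasionally gets shown high; if it performs well during that window, its estimate creeps up - matching the "temporarily unnaturally high, then slightly improved" behavior described above.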

It is VERY hard to truly know what does and does not significantly affect Google rank. If you do nothing, your site can go up or down because things change naturally at Google and with your competitors. If you do something, the same applies. So how do you know whether what you did (or didn't do) caused a change?

Amen.