Rhonie - 8:30 pm on Feb 28, 2011 (gmt 0)
The following is common sense to me on how Google may have changed their algorithm. In no way, shape, or form am I claiming this is correct, but if I were head of programming for Google, it would run about like this, for several reasons.
Now I am not a pro, a mathematician, or anything close, but this is what I have about figured out:
If you are a retailer, it is very important that your description of a product is not the same as on another site. You need to adjust that, and you need to link to quality sites for reviews of the product you are selling, for credibility.
Now the first = 1 if someone were to enter the exact name of your site. No deduction for that, but you would get a deduction, and possibly the sandbox, if your site was something like Seers.com and your page content in any way resembled Sears.com. The same goes if your site contained too many keywords tied to the name Sears.com but had content not related to Sears.com, or to any of the main keywords Sears uses, perhaps like Craftsman or Kmart.
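To show what I mean, here is a rough sketch in Python. The function name, the thresholds, and the keyword sets are all my own inventions for illustration, nothing like Google's actual code: a site whose name nearly matches a known brand, and whose page keywords lean heavily on that brand's keywords, gets the sandbox deduction.

```python
from difflib import SequenceMatcher

def lookalike_penalty(domain, known_brand, page_keywords, brand_keywords):
    """Layman's sketch (my invented function, not Google's): penalize a site
    whose name nearly matches a known brand while its page keywords borrow
    heavily from that brand's keywords."""
    name_similarity = SequenceMatcher(None, domain.lower(),
                                      known_brand.lower()).ratio()
    overlap = len(page_keywords & brand_keywords) / max(len(page_keywords), 1)
    if name_similarity > 0.8 and overlap > 0.5:
        return -10_000  # the sandbox deduction from my point system below
    return 0

# Seers.com borrowing Sears keywords trips the penalty:
print(lookalike_penalty("seers.com", "sears.com",
                        {"craftsman", "kmart"},
                        {"craftsman", "kmart", "tools"}))  # -10000
```

An unrelated site name with unrelated keywords would score 0 under the same check.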
Now big news sites with published content are not punished, and neither are sites with a huge number of visitors. There may be an exception, or an inclusion in the script, for sites that keep up a high number of hits per page; the script may be written so that this changes on a daily basis.
Let's see if I have this about right.
I don't assume the point system itself is correct either, but again, I am just using a layman's point system to make it easier to understand.
The second "you" was placed intentionally just to show that many sites may be inserting or changing to another word in between the article content, I am sure their are many like I, my, our, etc...
The following is just my example of an algorithm for Googlebot:
Possible points for sand box = -10,000 points
Ranking on page for keywords top = 1
Penalty points in the word sequence are as follows:
if = 1
you = 2
use = 3
the = 4
same = 5
words = 6
in = 7
an = 8
article = 9
you = 10
after = 11
the = 12
first = 13
hit = 14
time = 15
and = 16
date = 17
than = 18
your = 19
site = 20
points = 21
will = 22
be = 23
deducted = 24
this = 25
many = 26
points = 27
total = 28
deduction = 29
of = 30
Total deduction = 465 points (1 + 2 + ... + 30)
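That running total is just an arithmetic series: the nth duplicated word in a row costs n points, so a 30-word copied sequence costs 1 + 2 + ... + 30 = 465. A quick sketch of that scoring in Python (my hypothetical naming, nothing official):

```python
def sequence_penalty(duplicated_words):
    """Each consecutive duplicated word costs its position in points:
    1st = 1, 2nd = 2, ..., so the penalty grows fast with longer copies."""
    return sum(range(1, len(duplicated_words) + 1))

sequence = ("if you use the same words in an article you after the first "
            "hit time and date than your site points will be deducted this "
            "many points total deduction of").split()
print(len(sequence), sequence_penalty(sequence))  # 30 words, 465 points
```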
Highest return on page due to wording sequence is on page 4 3/5th.
If the total matches another page, post the page with the newest first-hit time/date ahead for the wording sequence. Comparative analysis against other sites = below or above, by the number of sites listing the exact keywords.
Grade for site age: > one year vs. < one year.
Add 5,000 points to sites that have the exact wording content, using the first-find time/date. 0 points for an ad posted within the last 24 hours. Credit points for sites that have more than 5,000 new visits per day per page.
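As a sketch of those credits (the function is my invention; the 5,000-point bonus, the 24-hour rule, and the 5,000 visits/day threshold come from my numbers above, and the size of the traffic credit is purely my guess):

```python
def bonus_points(first_found, hours_since_posted, daily_visits_per_page):
    """My layman's reading of the bonus rules (invented helper, not Google's):
    nothing inside the first 24 hours; after that, 5,000 for the first-found
    copy of the exact wording, plus a credit for heavy per-page traffic."""
    if hours_since_posted < 24:
        return 0  # posted within 24 hours: no bonus yet
    score = 5_000 if first_found else 0
    if daily_visits_per_page > 5_000:
        score += 1_000  # size of this credit is my guess; I don't know it
    return score

print(bonus_points(True, 48, 6_000))   # first-found, busy page: 6000
print(bonus_points(True, 2, 6_000))    # too new: 0
```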
Now you all need to think outside the box. You must assume that Google is a company that needs to make money in advertising, statistics, and many other ways, and this is done by organizing sites to the best possible stats that interest their users. The problem with this: if it is figured out completely, they will just change the algorithm again.
Now another way to look at this is Craigslist. Why does Craigslist give away free postings? I would hope you all know. It is so they get a lot of hits to the site and the content stays fresh for a high Google rating, so they can charge for paid ads in major cities with minuscule effort in advertising.
I would like any of you to respond, whether you agree or disagree with this. I know it is a long way from exact, but this is about what I am understanding.