
Effects of continuous small page changes on ranking

klogger

11:20 pm on Aug 31, 2004 (gmt 0)

10+ Year Member



I target a three-word keyphrase for one of my Web site's pages, and as it is a fairly competitive phrase I have been watching the competition's Web sites.

One site in particular is interesting in that it does not appear to be well optimised (links are graphical, only 180 words on the page, and very few backlinks - 21). It does, however, stay constantly at position number 1 in Google for the keyphrase and has been there for over a year.

I got curious as to how they were doing it, and so have been running diff comparisons against a saved version of their page each day.
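(For anyone wanting to try the same kind of monitoring: a minimal sketch in Python using the standard difflib module. The fetching and saving of each day's copy is left out, and the file names and page snippets here are made up for illustration.)

```python
import difflib

def page_diff(old_html: str, new_html: str) -> list[str]:
    """Return unified-diff lines between two saved copies of a page."""
    return list(difflib.unified_diff(
        old_html.splitlines(),
        new_html.splitlines(),
        fromfile="yesterday.html",  # hypothetical file names
        tofile="today.html",
        lineterm="",
    ))

# Example: the kind of tiny, invisible daily tweak described above.
yesterday = '<div style="left: 500px;">Welcome</div>'
today = '<div style="left: 504px;">Welcome</div>'
for line in page_diff(yesterday, today):
    print(line)
```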

What I have found is that they change a very small part of the page every day, in a way that has no noticeable effect on the page. The changes are along the lines of:

1) The entire site is positioned centrally using JavaScript, but they will change its exact location by 4px.
2) They will change the name of their images folder from a date format (such as 31-08-2004) to a keyword format such as keyword1-keyword2-keyword3.

I have only been following their code changes for a little over a week, but I assume that this is why they are number 1 when, visually and content-wise, the page does not change.

I am wondering three things about this technique:

1) Is this the reason why they are number 1 in Google?
2) Does it work?
3) Is it legal or should they be banned in time?

and

4) Any other thoughts on this technique?

DerekH

9:25 am on Sep 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



klogger wrote

1) The entire site is positioned centrally using JavaScript, but they will change its exact location by 4px.
2) They will change the name of their images folder from a date format (such as 31-08-2004) to a keyword format such as keyword1-keyword2-keyword3.

1) Is this the reason why they are number 1 in Google?
2) Does it work?
3) Is it legal or should they be banned in time?

and

4) Any other thoughts on this technique?

Google probably doesn't care two hoots about the JavaScript changes, so you can discount that.
As for the rest, the first thing is to determine whether Google is actually seeing the changes. If you retrieve the Google cache of their home page, does its "date of this cache" change every single day? If it doesn't, you can discount all your ideas! <smile>

Even if their cache changes every day, Google can take a month to change PR and SERP ranking order, so I think you can discount all your ideas there too! <smile>

The principal route to achieving a good ranking is to be an authoritative site, and that means incoming links. How many do they have, and are they from well-respected sites?
DerekH

karmov

11:40 am on Sep 4, 2004 (gmt 0)

10+ Year Member



very few backlinks - 21

Are you sure there are only 21 backlinks? If you used Google to find that number, I can assure you there are most likely more, and the ones you're seeing are not that site's quality backlinks.

Also, as DerekH mentions, where those backlinks come from makes a big difference. Raw numbers of backlinks don't matter as much as a few top-quality, on-topic backlinks.

klogger

2:06 pm on Sep 4, 2004 (gmt 0)

10+ Year Member



Thanks for the comments. I thought Google showed all the backlinks that it counted towards increased PR via the link command. When I used it, these were the PRs of the linking pages:

21 links in total
---------------

14 * PR 0 or not indexed in Google
2 * PR 3
2 * PR 2
2 * PR 1
1 * PR 6 (DMOZ)

In Yahoo, a linkdomain: search gave a total of 457 backlinks, which is a little too many for me to trawl through just now.

The date of the last cached version in Google is:

23 Aug 2004 12:52:43 GMT

I also checked the page to see if it actually contains the term 'keyword1 keyword2 keyword3', and it does not. It has 'keyword1 keyword2' only twice in the body content and 3 times in the head (meta tags). The following is the number of occurrences of each keyword on the page in isolation from the other keywords:

keyword1: 17 (is on the default stop word list at Google Rankings, so no reliable density figure is available)
keyword2: 27 (density 11.25%)
keyword3: 5 (density 2.08%)
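(For reference, figures like these are just occurrences of a term divided by total words on the page. A rough Python sketch, with made-up numbers chosen to match the keyword2 figure above; real density tools also strip markup and stop words, which this skips.)

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a single keyword as a percentage of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return round(100.0 * hits / len(words), 2)

# 27 occurrences in a 240-word page gives the 11.25% figure quoted above.
page_text = " ".join(["keyword2"] * 27 + ["filler"] * 213)
print(keyword_density(page_text, "keyword2"))  # 11.25
```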

It seems a little odd that they still rank number 1 when the 'keyword1 keyword2 keyword3' term isn't on the page, and for 'keyword1 keyword2' they don't show up in the first 1000 results (using Google Rankings).

I checked a random set of links in Yahoo, and most of them are simple, without any optimisation: just images with hrefs, or links of the form


<a href="thecompany.com">thecompany.com</a>

without any other link or descriptive text. I know I have only looked at a small random selection of links out of the total, but considering that the majority of the sites (60-70%) are not even in a related field, I have to remain sceptical on this.

Today they added the following to several <div>s that had no style value before:


visibility:visible

This, of course, is the default value.

So why would they change the page in such a small way, unnoticeable to any visitor, every day or sometimes even twice daily?

isitreal

2:51 pm on Sep 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Nice little study. If one keeps in mind that Google 'likes fresh content' and that 'frequently updated pages tend to do better', and then asks how Google knows when something is 'fresh' or 'updated', the most likely answer is that Google is creating some type of generated ID based on the contents of a page, then matching it against a new generated ID from a fresh spidering. So the small changes do a few things: change the file size of the page slightly, and change the generated ID.

Since I've seen duplicate content penalties applied to sites that have different templates for the same content, I'm going to guess that Google is using some form of generated hash to determine whether a page is identical to itself or to any other page. These changes are apparently enough to make Google think the page is being changed daily. Very nice job on their webmaster's part, by the way, and very easy to implement with some type of random generator script run server side. As far as I'm concerned it's totally legitimate, but also totally easy to copy yourself.

Or it could be as simple as comparing the page's byte size daily, which would be the simplest method, or a mix of both.
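(Nobody outside Google knows how they actually detect changes, but the generated-ID idea is easy to sketch: fingerprint the page bytes, and any cosmetic edit, however invisible to visitors, produces a new fingerprint. A Python illustration, with hypothetical page snippets; the size-plus-hash combination mirrors the two methods speculated about above.)

```python
import hashlib

def page_fingerprint(html: str) -> str:
    """A naive change detector: byte size plus an MD5 hash of the page."""
    data = html.encode("utf-8")
    return f"{len(data)}:{hashlib.md5(data).hexdigest()}"

old_page = '<div>Our products</div>'
new_page = '<div style="visibility:visible">Our products</div>'

# Visually identical pages, but the fingerprints differ.
print(page_fingerprint(old_page) != page_fingerprint(new_page))  # True
```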

The easiest model I can find to explain Google's (or any other search engine's) behaviour is that these are very stupid programs, because they have to process such a huge amount of data. So complex explanations probably aren't right, no matter how much these companies want you to believe otherwise.

Lorel

1:51 am on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Klogger,

I thought Google showed all the backlinks that it counted towards increased PR via the link command.

Google has changed how it displays the link command, with very few sites above PR 3. It now appears all the other commands have been changed to emulate the link command. I checked one of my sites with the following commands yesterday, and the results were all identical:

link:www.domain.com
allinurl:www.domain.com
allintitle:www.domain.com
+www.domain.+com

If anyone knows of a better command to use in Google to reveal *all* the links to a site, please let us know.

Lori

Lorel

1:57 am on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




Nice little study, if one keeps in mind that google 'likes fresh content', and that 'frequently updated pages tend to do better', . . .

Re Google liking fresh content: I update several major pages on my site at least once or twice a month (legitimate changes in format or text), and my site continually gains traffic from Google. But as soon as I stop, the traffic slacks off again. I assume this is because Google keeps coming back to sites that are constantly changing, and thus the pages get spidered often.

Lori