| 5:48 am on Jan 11, 2005 (gmt 0)|
I'd be cautious about this. It could look like cloaking. Visit with Googlebot, get one thing. Visit with IE, get another. Of course, visit with ANYTHING and you get something different every time, but you see how they could be confused.
On a more general level, you're doing something strange to manipulate Google's ranking. That's explicitly what they're looking for, so you'd better cross your fingers.
| 6:36 am on Jan 11, 2005 (gmt 0)|
I think it's a terrible idea. It's not like Google remembers the best PR it ever gave a page and keeps it that way. If you remove a link, Google may well remove the PR. I'd settle on a stable, well-conceived site architecture and figure out how to make it work.
Search on the site for "Search engine theme pyramids and Google."
| 7:28 am on Jan 11, 2005 (gmt 0)|
Ah - both good points. I'm adapting the system to print out static pages with randomly generated links instead. It's a bit more work, but you're right that dynamic links are definitely not the way to go. I was thinking that the PR would just be recalculated differently each month (or however often they're doing it internally now), but in retrospect there would probably be other complications. Thanks for the quick input.
| 8:31 am on Jan 11, 2005 (gmt 0)|
Tried this a while back - conclusion:
Great: for getting new pages and channels picked up
Terrible: for getting them ranked consistently and therefore appearing with any conviction in the SERPs
Think of the analogy of being a jack of all trades rather than a master of one!
I would choose the key areas, get those fixed, and perhaps use random content around these 30-odd links. If you have many more, then use successive pages.
| 9:14 am on Jan 11, 2005 (gmt 0)|
It's not necessarily a bad idea but it's the implementation you need to work out.
It is most likely that google likes to see longevity of links from one URL to another URL. This would indicate the continued value of the resource thus the continued presence of the link.
Randomization is fun, just don't knock the wind out of your own sail.
| 3:27 pm on Jan 11, 2005 (gmt 0)|
I'm only going to use the randomization to shuffle link text and break up blocks of identical links now. It will only target fewer than a dozen pages total, so it should work out OK. I'll keep you posted if anything unusual happens.
| 3:33 pm on Jan 11, 2005 (gmt 0)|
I think you shouldn't randomize it on a per-visit basis.
That could cause terrible confusion for your visitors and for Googlebot alike.
Doing it on, say, a weekly or monthly basis is fine in my opinion, because good sites change daily!
You could write a PHP script that generates your content as HTML and puts randomized links into it.
Each time you regenerate the site, the links will change, and you choose the timeframe.
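A minimal sketch of that regenerate-on-a-schedule idea (the poster suggested PHP; this is Python, and the page names and link targets are invented for illustration):

```python
import random

# Hypothetical candidate links -- not from the thread, just for illustration.
CANDIDATE_LINKS = [
    ("widgets.html", "Blue Widgets"),
    ("gadgets.html", "Gadget Reviews"),
    ("gizmos.html", "Gizmo Guide"),
    ("doohickeys.html", "Doohickey FAQ"),
    ("thingamajigs.html", "Thingamajig Tips"),
]

def build_link_block(n=3):
    """Pick n random links and render them as a static HTML list."""
    picks = random.sample(CANDIDATE_LINKS, n)
    items = "\n".join('<li><a href="%s">%s</a></li>' % (url, text)
                      for url, text in picks)
    return "<ul>\n%s\n</ul>" % items

# Run this from cron on whatever timeframe you choose (weekly, monthly).
# Between runs the generated file is static, so every visitor and every
# Googlebot fetch sees identical links.
with open("index.html", "w") as f:
    f.write("<html><body>\n%s\n</body></html>\n" % build_link_block())
```

The key property is that the randomness happens at build time, not at request time, so there is nothing cloaking-like about what different visitors see.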
| 4:16 am on Jan 12, 2005 (gmt 0)|
In theory I'd worry about it too. My rule of thumb: if something is designed to fool Google, worry about it.
But, on second thought, haven't we all seen sites that list random users? That seems pretty common. Even more common are "X of the Day" links, where something is plucked out of the database and presented as news. So there certainly are sites that use similar techniques.
I say, if there's a non-SEO reason for it, go for it.
| 5:28 am on Jan 12, 2005 (gmt 0)|
nuevojefe dances on the edge of a very good point, clever chap that he is.
But add my vote to the side of 'don't do it.' In essence, it's not a real link...it's more like an ad. Little power, like a shifting breeze.
| 6:06 am on Jan 12, 2005 (gmt 0)|
>> Little power, like a shifting breeze.
Can be done very effectively if you rotate those links every 2 to 4 weeks depending on what sort of crawl you're in and how much value you can pass to those links.
Don't do it randomly with every reload. Look carefully at 10 focus nodes in your tree and point to them from your home page for 2 to 4 weeks. Leave the rest of the tree as is.
You'll soon start seeing what's effective.
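The rotation schedule described above can be made deterministic, so that rebuilding the site mid-period never shifts the links early. A sketch under assumed names (the 30 focus nodes, the 3-week period, and the batch size of 10 are all placeholders, with 3 weeks sitting inside the suggested 2-to-4-week range):

```python
import time

# Hypothetical deep pages to promote from the home page.
FOCUS_NODES = ["page%d.html" % i for i in range(1, 31)]
BATCH = 10              # link to 10 focus nodes at a time, per the advice above
ROTATE_SECONDS = 3 * 7 * 86400  # rotate roughly every 3 weeks

def current_batch(now=None):
    """Select which focus nodes the home page links to, based only on the
    calendar period: every rebuild within the same period agrees, and the
    links change only at rotation boundaries."""
    now = time.time() if now is None else now
    period = int(now // ROTATE_SECONDS)
    start = (period * BATCH) % len(FOCUS_NODES)
    return [FOCUS_NODES[(start + i) % len(FOCUS_NODES)] for i in range(BATCH)]
```

Because the selection is a pure function of the clock, a cron-driven rebuild can run daily without disturbing the current batch, and the rest of the tree is left untouched as suggested.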
| 8:29 am on Jan 12, 2005 (gmt 0)|
Thanks for the great feedback guys. I've decided to use a script to generate a single random set of links for each page that is then saved as a static HTML file. Links will be random between pages, but persistent over time. I'll let you know how this works out.
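One way to get "random between pages, persistent over time" without storing any state is to seed the generator with the page name. A Python sketch, with page and target names invented for illustration:

```python
import random

# Hypothetical site structure -- names invented for illustration.
PAGES = ["index", "about", "products", "contact"]
TARGETS = ["a.html", "b.html", "c.html", "d.html", "e.html", "f.html"]

def links_for_page(page, n=2):
    """Seed a private RNG with the page name: each page gets a different
    selection, but regenerating the site reproduces the same one."""
    rng = random.Random(page)
    return rng.sample(TARGETS, n)

for page in PAGES:
    picks = links_for_page(page)
    # In a real build, these picks would be written into the static
    # HTML file for `page` and would never change between rebuilds.
```

The design choice here is determinism: the generator script can be re-run any number of times and the static files come out identical, which is exactly the "persistent over time" property the post describes.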
| 10:00 am on Jan 12, 2005 (gmt 0)|
Thanks for the kind words.
Please do report back. Good luck too!
| 6:56 pm on Jan 12, 2005 (gmt 0)|
>Can be done very effectively if you rotate those links every 2 to 4 weeks depending on what sort of crawl you're in and how much value you can pass to those links.
Interesting variation I hadn't considered. Watching crawl patterns, however, probably requires more time and effort than I can spare, especially when dealing with many sites at once.