Forum Moderators: Robert Charlton & goodroi

Random or Structured Internal Links for Related Products?

Simsi

3:17 pm on Aug 4, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi

I am currently looking at the internal linking structure of a product site I have. There are around 200 products, many of which cross-relate. Currently, when you visit a product page, it serves up 6 random links (in a list) to other relevant products. Random as in 6 selected from perhaps 20 that match (think Amazon and the "Customers who purchased this item also purchased..." concept).
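
To make that concrete, the selection works roughly like this (a simplified Python sketch, not the actual site code; related_map is just a stand-in name for wherever the cross-relations are stored):

    import random

    def related_links(product_id, related_map, n=6):
        # related_map: {product_id: [ids of cross-related products]}, up to ~20 here
        pool = related_map.get(product_id, [])
        # a fresh random pick on every page load
        return random.sample(pool, min(n, len(pool)))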

Obviously this means each time the page is loaded, those 6 links vary and each time GBot visits it will see a different set of links. Does that have an effect on SEO? Would it be better to write a function that always displayed the same 6 alternative product links on a given page, spreading the links evenly around the site so each product has approximately the same number of internal links pointing at it? Or is random OK?

And following on, is 6 links a sensible number bearing in mind I could have up to 20?

Cheers,

Simsi

tedster

11:43 pm on Aug 4, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It will have some kind of effect. Whether that effect is disruptive or not will probably depend on at least a couple factors:

1) What percent of the total page content is made of these links. If the page is relatively thin, then there's more potential for something crazy to happen, IMO.

2) How often googlebot spiders the page. If it's several times every day, there's a higher chance of tripping a flag, rightly or wrongly. But I don't think the chance of trouble is all that high, no matter what.

If I were in this situation, I would look for a static solution. I just don't like random elements very much - they make it really hard to analyze the data a site generates. So I'd probably approach it by assuming that the 200 products have different degrees of relatedness - that is, some are more closely related than others.

That means, first of all, that I wouldn't assign the relationships randomly, even if they were going to be static. I'd set a static link list for each of the pages, and then watch the data to see whether those relationships can be tweaked.

This way you can learn a bit more about your offering and run no risk of messing up the SEO. And while you're at it, you also have a situation where you can run a solid split test of how many related links are optimal for each page.
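
As a rough sketch of what I mean (assuming you keep a relatedness score for each product pair - hand-assigned or derived from sales data; the names here are hypothetical):

    def static_related_links(product_id, scores, n=6):
        # scores: {product_id: {other_id: relatedness}} - the same input
        # always yields the same n links, so every crawl sees one stable list
        ranked = sorted(scores.get(product_id, {}).items(),
                        key=lambda kv: kv[1], reverse=True)
        return [other for other, _ in ranked[:n]]

Testing a different number of links then becomes a matter of changing one parameter per page group.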

Lorel

1:53 am on Aug 5, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How about if you have those links set up via an include? Is that considered static?

tedster

1:58 am on Aug 5, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes - the links would simply be in the source code, and Google has no way of seeing that they were placed on the page by an include. So if the included links are randomized every time the page is served, all my comments above would still apply.

Simsi

3:16 pm on Aug 5, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks Tedster. Makes sense.

jimbeetle

3:27 pm on Aug 5, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Obviously this means each time the page is loaded, those 6 links vary and each time GBot visits it will see a different set of links.

What about your site visitors? You know, like me, where, as I'm clicking on one link, I say to myself, "I'll check that other link when I come back to this page."

I'm really, really ticked if it ain't there.

Robert Charlton

9:35 pm on Aug 5, 2010 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I've seen setups like this and have never quite been able to figure out why someone would do it. My impression has been that it's an effort to make the best use of a limited site architecture, and perhaps to save the work of prioritization and organization that ought to be done. Like tedster, I don't like this kind of randomness.

That said... and playing devil's advocate here... I've occasionally seen relatively large sites with randomly rotating product menus for their less important related products which, overall, appear to do surprisingly well - well enough, in fact, that I've wondered whether they know something that I don't.

I've always assumed that when Google doesn't find a link path to a page, the link credit via that path would disappear, and with it the particular boost in the rankings.

Perhaps, though, there's a kind of residual memory for the link credit that helps the page hang in there for a while. If the residual effect is long enough, it might hold up until other pages are refreshed a sufficient number of times for the link to reappear. I should add that some sites I've seen have random menus on multiple pages, which further complicates analysis... and then there's the question of what link is in place when the spider reappears.

This isn't the kind of arrangement I could live with, but I'm wondering whether there's any thought about why it might apparently work (if it does). Maybe enough monkeys at enough typewriters.

1script

9:59 pm on Aug 5, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This isn't the kind of arrangement I could live with, but I'm wondering whether there's any thought about why it might apparently work (if it does). Maybe enough monkeys at enough typewriters.

I actually have a number of forum sites using a somewhat similar feature - related posts. I wouldn't call the links random, of course, because the degree of similarity differs and some posts really are more related, so only those are shown. But those links aren't static either, because the list of related posts updates as new discussions appear.

I've never tracked any effect on ranking except for maybe one obvious one: you can't rank for anything at all if Googlebot hasn't seen the page, and I hope that those links help Gbot learn about more URLs on my site. Getting Google to index more pages has always been an uphill battle for me, so I've kept these links purely for indexing's sake.
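
The mechanics are roughly this (a boiled-down Python sketch of the idea, not my forum software's actual code):

    def related_posts(post_words, candidates, n=6, min_overlap=3):
        # post_words: set of words from the current post
        # candidates: {post_id: set of words} for every other post
        scored = [(pid, len(post_words & words))
                  for pid, words in candidates.items()]
        # keep only genuinely similar posts, ranked by word overlap
        scored = [(pid, s) for pid, s in scored if s >= min_overlap]
        scored.sort(key=lambda kv: kv[1], reverse=True)
        return [pid for pid, _ in scored[:n]]

So the list only shifts when a new post genuinely outscores an old one, not on every page load.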

tedster

10:12 pm on Aug 5, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Another approach that you might consider is not randomizing the list of links with every page load, but keeping them the same for a week and then re-shuffling the deck for the next week, etc. That would keep googlebot's crawl experience more stable and also help the user who revisits the page, such as jimbeetle described above.
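
Something like this would do it (a minimal Python sketch; seeding a random generator per product per week keeps the list stable between reshuffles - the names are illustrative):

    import datetime
    import random

    def weekly_related_links(product_id, related_map, n=6):
        # the seed changes once a week, so every page load (and every
        # crawl) within the same week sees the same n links
        year, week, _ = datetime.date.today().isocalendar()
        rng = random.Random("%s-%s-%s" % (product_id, year, week))
        pool = list(related_map.get(product_id, []))
        rng.shuffle(pool)
        return pool[:n]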

Simsi

12:09 pm on Aug 6, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've seen setups like this and have never quite been able to figure out why someone would do this.


This particular set of pages uses a keyword sniffer and provides links that seem more relevant to the keywords contained in the referral string.
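
Roughly speaking it works like this (a simplified Python sketch of the approach; it assumes the search engine passes the query in a "q" parameter of the referrer, as Google did at the time):

    from urllib.parse import urlparse, parse_qs

    def keywords_from_referrer(referrer):
        # pull the search phrase out of the referral URL's query string
        query = parse_qs(urlparse(referrer).query)
        return set(query.get("q", [""])[0].lower().split())

    def links_for_visit(referrer, product_words, n=6):
        # product_words: {product_id: set of words describing the product}
        kw = keywords_from_referrer(referrer)
        ranked = sorted(product_words.items(),
                        key=lambda kv: len(kw & kv[1]), reverse=True)
        return [pid for pid, _ in ranked[:n]]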

HuskyPup

2:06 pm on Aug 6, 2010 (gmt 0)



There are around 200 products many of which cross-relate. Currently, when you visit a product page, it serves up 6 random links (in a list) to other products that are relevant.


It's very doubtful a site would rank highly for any of the products, since the bot would see a different result each time it spidered. Keeping the links static would make it much more of a relevant, or even an authority, site about those widgets.