|Will randomly generated links confuse rankings?|
I'm an SEO expert, but I'm now in the middle of one really big SEO project and need some advice from other experts who have already done projects like this. (I can't share the URL, because it is a client's website.)
Basically, we made a huge international business directory with more than 100,000 categories and subcategories, and we filled the directory with millions of businesses worldwide (we didn't scrape content from the net, but bought a huge companies database).
All the pages (categories + listings) are SEO optimised - they have keyword-rich URLs + meta tags, and all of them are static. No duplicate content at all.
Also, the directory has a 100% hierarchical structure.
Everything looks fine, but there are some things that bother me:
1) First, do you think that when a website appears from nowhere with millions of pages, Google and the other SEs will consider it spam? The pages are not autogenerated and all have static content, but as far as I know, new untrusted websites with more than 10,000 pages published at once are suspicious in the eyes of the SEs.
2) We place 2 blocks on all pages with randomly generated internal category and listing URLs (40 in total).
On the page mydomain.com/New_York/Category1/Subcategory1/, apart from the static content, there are 40 internal links, different every time you refresh the page:
20 internal category URLs:
... and so on
and 20 listing URLs, like this:
So my question is:
Do you think the PR value of the site will be split, and SE bots confused, because of these randomly generated internal links?
In a word - yes. Random links have often created ranking troubles, especially if the links are newly generated with each page load. Particularly at launch, when you have so many challenges to deal with just to begin establishing visibility and trust, I would not suggest doing this.
And even later on, it's better for the content to be more stable, in my opinion. Maybe the way to go is to change out the links every couple of weeks, making sure that they are still well themed with regard to the page where they appear.
It depends on the content and position of the links. For example, you could have an online store where the top sellers change often because of orders placed. This is natural, but it's best to position them after the main content and standard navigation links, so they are only a minor part of the page that changes.
I'm curious about this issue now. I have product category homepages which display a single random entry from a product database (thumbnail, title, text snippet, link). Refreshing the page generates a different product. I did this to keep the page fresh, in a manner of speaking. It's certainly not the main page content, though. Could it be detrimental nevertheless?
|Could it be detrimental nevertheless? |
I don't think that one rotating link will be any problem for the SEs, but this is actually one of my pet peeves as a site visitor. As I'm clicking on a different link, that highlighted product catches my eye. I'll be sure to check that out after I finish the page I'm going to, I say. Done, hit the back button - and it's gone. A waste of my time and possibly a lost sale for you. You might consider rotating the products per visit and not per page refresh.
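That per-visit idea can be sketched roughly like this, assuming a Python back end and a hypothetical visit_id taken from a session cookie (the product list here is made up):

```python
import hashlib
import random

# Hypothetical product list; in practice this would come from your database.
PRODUCTS = ["widget-a", "widget-b", "widget-c", "widget-d"]

def featured_product(visit_id: str) -> str:
    """Pick one featured product that stays fixed for a whole visit.

    visit_id is whatever identifies the visit (a session ID or cookie
    value, both hypothetical here). Hashing it gives a stable seed, so
    a page reload within the same visit shows the same product, while
    a new visit gets a fresh pick.
    """
    seed = int(hashlib.sha256(visit_id.encode()).hexdigest(), 16)
    return random.Random(seed).choice(PRODUCTS)
```

Reloading the page with the same session cookie then always returns the same featured product, so the back button no longer loses it.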
Well, you already have an established site - and it's not a million URL directory but an ecommerce site. So you already are rolling, and you also have the benefit of seeing what is going on. It is a different situation.
My guess would be that one randomly generated link in the midst of a lot of stable content would cause only minimal problems, if any. The main problem in all these cases is that anchor text can be an on-page factor, so there's a chance that a page will rank for words that the visitor will then not see. With many such links, the page might not be clearly "classifiable" and thus not establish stable indexing and ranking for a specific query. I'm talking about Google's back-end metadata here.
|I did this to keep the page fresh, in a manner of speaking. |
If it's not there for the visitor, then I would remove it or at least generate it for a more extended period rather than change it every time the URL is requested. Maybe that's just me, however.
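One way to do the "more extended period" version is to seed the random pick from the date, so the choice rotates only once a day - a minimal Python sketch with made-up link data:

```python
import datetime
import random

# Hypothetical pool of featured links.
LINKS = ["/property/1", "/property/2", "/property/3", "/property/4"]

def daily_featured(links, today=None):
    """Rotate the featured link once per day instead of on every request."""
    today = today or datetime.date.today()
    # The date's ordinal is the seed: same value all day -> same pick all day.
    return random.Random(today.toordinal()).choice(links)
```

Every request on a given day gets the same link; the next day the seed changes and the selection rotates.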
Hmm, food for thought indeed. Thanks for your replies. In fact the category/product explanation was only a general analogy. In reality it's more of an agency, and the reason for randomising the content is to be "fair" to the clients whose properties are being displayed. So not e-commerce as such. The idea is to stimulate interest by showing different properties each visit, and we can't be seen to be favouring one client over another. The idea of displaying the same thing for 24 hours could work, but if Google returns the next day it'll be equally troubled by the change, if it is in fact an issue.
Can you put these in an iframe and disallow the iframe pages in robots.txt?
Yeah, something like that with a parameter or two would work. Thanks.
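For the record, a minimal sketch of that setup - the /featured path and the cat parameter are only placeholders, not a specific recommendation:

```
# robots.txt - keep crawlers out of the rotating block
User-agent: *
Disallow: /featured

<!-- each page then pulls the rotating block in via an iframe -->
<iframe src="/featured?cat=new-york" frameborder="0"></iframe>
```

The main page stays stable in the crawler's eyes, while the iframe source (which is disallowed) can rotate as often as you like.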
Hi guys!
Anyway, I've recently found a need to do this with a site, though I much prefer static links whenever/wherever possible.
In my case, I used to have a few pages with some affiliate items that updated frequently, but the links pointed to the third-party website and I used nofollow. I wrote a script that instead features the items on my site, so all of the links now point to pages within my own site, with nofollow removed.
I noticed a fairly sharp and immediate drop in rankings for the pages with the list of items - presumably because they all now contain nofollow-free links? - and I found that placing links back to these pages from the item pages helped restore the balance.
Problem: the pages are fairly dynamic, though they do draw descriptive information via RSS from sections of my own site. Because they are dynamic, I couldn't easily assign a permanent link to each page, so the links are random, but relevant. I only have 100 or so random links in total and show 5 at a time.
So far so good, Google doesn't seem to be penalizing the site at all. It may take 3-6 months before I can know for sure but after 2 months all is well.
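For what it's worth, one way to make "random but relevant" links at least stable per page is to seed the randomness from the page's own URL - a rough Python sketch (the function and argument names are mine, not from any particular platform):

```python
import hashlib
import random

def related_links(page_url, candidate_links, n=5):
    """Pick n "random" related links that are stable for a given page.

    Seeding the RNG from the page URL means every request for the same
    page (including a crawler's revisit) sees the same n links, while
    different pages still get different selections from the pool.
    """
    seed = int(hashlib.md5(page_url.encode()).hexdigest(), 16)
    return random.Random(seed).sample(candidate_links, n)
```

That way the links still look varied across the site, but any single URL presents a consistent set on every load.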
"Random" linking can be a great way to sculpt PageRank. But it's hard to do right.
The links can't change when the page refreshes. The links can't change when you add new products. You have to have more links pointing to the popular products. You can't have popular products point to really poor products at all.
If you put the links in a header or footer (one that is identical across the rest of your site), Google will parse it off and not worry about it. They will, however, try to index those target URLs.
Google is brilliant at spotting patterns and fuzzy patterns. They are experts at identifying headers, footers, menus, and other "dupe snippets" on your site. If you put a random link in there, your rankings will not be affected. Scroll to the top of this screen, look to the right of the screen, and press reload - no issues.
If you do it in the middle of a page full of content, that is a different story and, as others have said above, it could be an issue.
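The "more links pointing to the popular products" rule above can be sketched too. This is a hedged illustration only - the popularity scores are made up, and the slot assignment is seeded once so it stays static between page loads:

```python
import random

# Made-up popularity scores (e.g. sales counts) - purely illustrative.
POPULARITY = {"/p/best-seller": 50, "/p/steady": 10, "/p/new-item": 2}

def weighted_link_slots(popularity, total_slots=100, rng=None):
    """Hand out internal-link slots in proportion to popularity, so
    popular products end up with more internal links pointing at them.

    The RNG is seeded once, so the assignment is static: it only
    changes if you deliberately regenerate it, not on every page load.
    """
    rng = rng or random.Random(0)
    urls = list(popularity)
    weights = [popularity[u] for u in urls]
    return rng.choices(urls, weights=weights, k=total_slots)
```

You would generate the assignment once, store it, and only rebuild it on a deliberate schedule - which also satisfies the "links can't change when the page refreshes" rule.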
The way I did this once was to create a separate script with a single parameter that was then called within a transparent iframe on each page. It randomised the links while keeping relevancy (hence the parameter input), and I didn't see any negative impact - but of course you are not passing juice either.
This looks like some very important info I have missed in the past. I too have "featured client" type items on a number of pages, including the home page, which change with every page load. It was done to give every client a chance to be featured - it seemed natural enough to me.
They go in an iframe from now on or will be replaced with static information.
Thank you for the info posted above.
No - I do this on a site and it got increased traffic. So by themselves they don't harm you.