Forum Moderators: martinibuster
I came across something weird over the past few months:
I have these guys with a huge website network who want to exchange links, and all the pages that would link back have nice PageRank.
However, the pages where they intend to put the link back, although they always have PageRank, are NEVER in Google's cache!
How is that possible? I find no robots.txt of any kind blocking them. Am I being paranoid?
They have a bunch of link pages on their sites, and it's crazy: even PR4 pages have no cache. Every time they propose a link exchange it's the same story, nice PR but no cache. The network is operated by an SEO company, which recently gained nice rankings.
<-begin google quote->
Important, high-quality sites receive a higher PageRank, which Google remembers each time it conducts a search. Of course, important pages mean nothing to you if they don't match your query.
<-end google quote->
Thus, it is easy to gain great PR for a page designed for a never-searched term, so a page made for the search-term "I will never search for this in my whole life" can have a Google PR 5+ and get absolutely no traffic.
Ok maybe not quite, but that's the gist of it.
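To make that concrete: PageRank is just link-structure math, and queries never enter the computation at all. A toy power-iteration sketch (illustrative only, nothing like Google's real scale or exact formula):

```python
# Minimal PageRank power iteration. Note that nothing here depends on
# search queries or traffic -- only on who links to whom.

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page with no outlinks: spread its rank evenly.
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# A page nobody ever searches for can still earn decent PR purely
# from inbound links:
toy = {"a": ["b"], "b": ["c"], "c": ["a"], "obscure": ["a"]}
print(pagerank(toy))
```

The "obscure" page ends up ranked lowest here because nothing links to it; flip that around and a page targeting a never-searched term can still hold high PR.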
A cache has nothing to do with PR and never will; the page only needs to be in the index. If you want an example that nobody can argue with, look at the NY Times website (nytimes dot com): it's a PR10 and there isn't one page that is cached.
However, he made a good point: why would you let someone copy all your content? He also said something to the extent of liability for user-submitted content.
After I read his statement I agreed and quickly placed code on all my pages to get rid of the cache. If someone wants to see something on my site, they can click on the link and go there themselves.
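For anyone wanting to do the same, the standard "code" for this (and presumably what that network uses too) is a single robots meta tag in each page's head:

```html
<!-- Tells crawlers not to store a cached copy;
     the page can still be indexed, ranked, and carry PR. -->
<meta name="robots" content="noarchive">
```

That keeps the cache link from appearing without touching robots.txt, which would explain finding no robots.txt at all.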
Also, TBPR isn't all that important. If the link is there and no funny stuff is happening (javascript, too many links on page, rel="nofollow", etc.) go ahead and do it. A link is a link is a link...