Forum Moderators: Robert Charlton & goodroi


Cannot see any cache links for my site -- why?


i_am_dhaval

3:09 pm on Dec 18, 2005 (gmt 0)

10+ Year Member



I have one site…

Today I checked my site. When I enter site:www.mydomain.com, I see that the cache links for all of my pages are gone.

Only the sitemap still shows a cache link; none of the other pages do. Why did this happen?

When will the cache links come back?

g1smd

9:46 pm on Dec 18, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have recently been looking at a site that has 800 pages, of which 200 are real content, and 600 are spam-laden cloaked pages.

The site has been indexed OK, and used to show a fresh date on every page every few days. The site also ranked well before this, but as of nearly two weeks ago, all of the spammy pages have lost their cache link and no longer show any fresh dates. There is a lot of content duplication within the spammy pages too.

The normal content pages still have a cache link, and still show a new fresh date in the SERPS, every few days.

It looks like the spam has been detected and Google is heading towards wiping it from their SERPs.


My guess is that if your site has already been around for a while, some sort of penalty is about to hit it. Use Xenu's Link Sleuth to check all your internal links, and WebBug to check that you have a proper 301 redirect from non-www to www for all pages of the site. Make sure that every title and meta description is different from those on every other page of the site, and that each tag accurately reflects the content of the page it is on.
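The non-www to www redirect mentioned above is usually done with Apache's mod_rewrite in an .htaccess file. A minimal sketch, assuming an Apache server and using example.com as a placeholder for your own domain:

```apache
# Site-wide 301 redirect from non-www to www (Apache mod_rewrite).
# "example.com" is a placeholder -- substitute your own domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is what makes this a permanent redirect rather than Apache's default 302; WebBug or a similar header checker will show you which status code is actually being returned.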

If the site is very new, then other factors may be at work (but do the above checks anyway).
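The "every title must be unique" check above can be automated. A minimal sketch using only Python's standard library; the page list and HTML below are hypothetical examples, not real site data:

```python
# Flag <title> values that appear on more than one page of a site.
# The same approach works for meta descriptions.
from html.parser import HTMLParser
from collections import defaultdict

class TitleParser(HTMLParser):
    """Collects the <title> text of a single HTML page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages):
    """pages: dict of URL -> HTML source. Returns titles used on more than one URL."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        seen[parser.title.strip()].append(url)
    return {title: urls for title, urls in seen.items() if len(urls) > 1}

# Hypothetical pages: two share a title, one is unique.
pages = {
    "/index.html": "<html><head><title>Widgets Inc</title></head></html>",
    "/about.html": "<html><head><title>Widgets Inc</title></head></html>",
    "/contact.html": "<html><head><title>Contact Us</title></head></html>",
}
print(find_duplicate_titles(pages))
# -> {'Widgets Inc': ['/index.html', '/about.html']}
```

In practice you would feed this the HTML fetched from each URL in your sitemap rather than inline strings.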

soapystar

9:56 pm on Dec 18, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's amusing on an ironic level... my noindex pages are all being cached.

:(

g1smd

10:19 pm on Dec 18, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Errr, if those pages are supposed to have <meta name="robots" content="noindex"> on them, then check for a typo within that tag. That tag usually causes pages that are already indexed to drop out within just a few weeks.

If you are doing the exclusion via the robots.txt file, and these pages are new, then it is possible that Google hasn't yet seen the exclusion in that file or acted on it.

Note: For old pages that are already indexed by Google, adding an exclusion in the robots.txt file will NOT cause those pages to be unindexed. They will stay indexed, because Google indexed them at a time when you allowed them to be indexed.
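For reference, a robots.txt exclusion of the kind discussed here looks like this (the path is a placeholder):

```text
User-agent: *
Disallow: /private/
```

This tells compliant crawlers not to fetch URLs under that path. As noted above, it blocks crawling of new pages but does not by itself remove URLs that are already in the index; the noindex meta tag is what drops already-indexed pages.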

soapystar

8:11 am on Dec 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



<meta name="robots" content="noindex, nofollow">

That's the exact tag. The tag has been on the pages since they went up.