Forum Moderators: Robert Charlton & goodroi

Google caching problem

Big old site having probs


Crush

5:07 pm on Jun 21, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We have a nice old site that is done in many languages and had no caching issues until now. Our main English pages are getting cached every day, but when it comes to index_fr, index_de, index_es etc., Google is not caching those pages. They still have decent PR and good internal backlinks, but Google will not cache them.

The only thing we did recently was 301 some old URLs (same domain) to the new structure. Funnily enough, those are now cached, but the important landing pages with lots of links (not 301'd) are not. We have lost SERPs in 17 languages now :(

Any ideas?

Crush

5:39 am on Jun 22, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, this got buried in all the Bourbon hoo-ha. Bump.

Clint

12:49 pm on Jun 22, 2005 (gmt 0)



Here's another "bump" for ya. ;)

All I can say is maybe remove the 301s, if that's all you've changed. It could also be that you're another victim of this Bourbon update.

lammert

1:02 pm on Jun 22, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have had this kind of problem with one site in recent months. The site had always been fully indexed until some months ago, when pages started to go URL-only. Only two languages are used on my site, not 17, but it's the same type of problem you describe: several pages unindexed in one language despite equal PR etc.

According to my investigation, Google thought there was some sort of duplicate content despite the different languages. I added a description meta tag to each page, and the problem disappeared. It seems that a unique description in the local language was enough evidence for Google that each page was different.

After adding the descriptions, all pages were fully indexed within two weeks.
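For anyone wanting to try the same fix, it's one line per page in the `<head>`. A minimal sketch of the idea (Python, purely illustrative — the language codes and wording below are made up; a real site would keep the text in its own templates, written by its translators):

```python
import html

# Hypothetical per-language blurbs -- these strings are invented for
# illustration. The point is that each language version gets text that
# is genuinely different, not a shared or machine-copied description.
DESCRIPTIONS = {
    "en": "Hand-made widgets, shipped worldwide since 1995.",
    "fr": "Widgets artisanaux, expédiés dans le monde entier depuis 1995.",
    "de": "Handgefertigte Widgets, weltweiter Versand seit 1995.",
}

def description_tag(lang):
    """Return a description meta tag unique to the given language."""
    text = html.escape(DESCRIPTIONS[lang], quote=True)
    return '<meta name="description" content="%s">' % text

for lang in ("en", "fr", "de"):
    print(description_tag(lang))
```

Each page then carries a tag Google can use to tell index_fr apart from index_de, even where the rest of the markup is near-identical.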

Clint

2:28 pm on Jun 22, 2005 (gmt 0)



>>>I added a meta description tag for each page, <<<

Are you saying you never had the "description" meta tag and you added them, or you had them then changed their text?

lammert

3:52 pm on Jun 22, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Are you saying you never had the "description" meta tag and you added them, or you had them then changed their text?

I never had description meta tags on the pages of this site.

Crush

4:24 pm on Jun 22, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's a great help. It's what we are thinking too: Google treats them as duplicates. Going to put some text back.

SlyOldDog

7:31 pm on Jun 22, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If the pages were duplicates, wouldn't Google still cache them and just hide them? (To show them, add &filter=0 to the query.)

lammert

5:10 am on Jun 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



In my case the pages went URL-only and the cache disappeared. The pages were still crawled at the same rate as the other pages, but it seemed the bot refused to cache them.

I first added more links to the missing pages to increase their PR, because I thought that could be the reason, but that didn't help. Then I read some posts on this forum suggesting that descriptions might be one of the signals Google uses to decide what is duplicate content. For some months now Google has been showing a page's description in the SERPs for many search queries, so it seems the description tag has gained value for Google. I decided to add a description to all the problematic pages, and they were all cached within two weeks.

I am now adding descriptions to all the other pages, just to be sure this type of problem won't happen again. A good description also increases the number of users who click through from Google, so the work will eventually pay for itself.
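On a big multilingual site it may help to audit which pages still lack the tag before fixing them by hand. A rough sketch (Python; the docroot path in the usage comment is an assumption — point it at your own file layout):

```python
import re
from pathlib import Path

# Only matches the common name-first form of the tag; a content-first
# variant would need a looser pattern or a real HTML parser.
DESC_RE = re.compile(r'<meta\s+name=["\']description["\']', re.IGNORECASE)

def pages_missing_description(root):
    """Yield .html files under root that have no description meta tag."""
    for path in sorted(Path(root).rglob("*.html")):
        if not DESC_RE.search(path.read_text(encoding="utf-8", errors="replace")):
            yield path

# Usage (path is hypothetical):
#   for page in pages_missing_description("/var/www/site"):
#       print(page)
```

The output is the to-do list: every file it prints is a page Google currently has to tell apart on body content alone.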