Forum Moderators: open
I have several sites with many product pages or static content pages. A lot of them recently turned into URL-only listings in the results, where a title and description of the page used to be. Why is that happening, and what can I do to fix the problem?
What I have seen since then is some updates, along with some URLs that have now gained the DMOZ details. So instead of just the URL, there is also now the DMOZ description. Yet the actual page details are still not there.
A very odd change. I will be interested to see how the sites pull out of this one.
KG
Googlebot visits have fallen from daily to once a month.
Actually, I have a plan B which is in place and looking good, but I'll report back on that when I can justify that what I see is repeatable!
DerekH
I am actually seeing daily and hourly changes for sites, so putting a finger on an exact change would be a guessing game that could hurt in the long run. I must say that my site's changes happened within the week; if you know the bot hasn't come in months, then obviously it is a different problem.
At least that's what they wrote in an email response to this question.
Whatever they say, try these things.
Remove any excessive <!-- comment tags --> that are global, especially if they are within the head tags.
Keep on-page JS and CSS to a minimum; call these as remote files.
Reduce the amount of repetitive code at the top of your page. This includes repetitive meta code, navigation code, etc. Navigation code can fall after your unique body copy using a simple tr span.
Of course make sure your html is clean.
Doing this has solved 70% of no title/description issues I have worked with.
Usually it takes under a week to see if the problem is fixed. You will start to see batches of pages reappear with the title and description. The bigger the site, the longer it takes before they are all back.
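As a rough way to put numbers on the first two points, here is a minimal sketch (the class and function names are mine, not from the thread) that tallies comment bytes and inline script/style bytes inside the head, using Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

class HeadAuditor(HTMLParser):
    """Counts comment bytes and inline script/style bytes inside <head>."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.in_inline = False
        self.comment_bytes = 0
        self.inline_bytes = 0

    def handle_starttag(self, tag, attrs):
        if tag == "head":
            self.in_head = True
        elif self.in_head and tag in ("script", "style"):
            # A src/href attribute means it is already a remote file.
            if not any(k in ("src", "href") for k, _ in attrs):
                self.in_inline = True

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False
        elif tag in ("script", "style"):
            self.in_inline = False

    def handle_comment(self, data):
        if self.in_head:
            self.comment_bytes += len(data)

    def handle_data(self, data):
        if self.in_inline:
            self.inline_bytes += len(data)

def audit(html):
    """Return (head comment bytes, inline script/style bytes) for a page."""
    parser = HeadAuditor()
    parser.feed(html)
    return parser.comment_bytes, parser.inline_bytes
```

Pages where these counts dwarf the unique body copy are the first candidates for moving scripts and styles into remote files.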
Right now even I am facing the same problem with my site: only the URL, and no description or title.
But when I tried to start a thread on it, I was kept on hold for some reviewing reason, and that thread never appeared in the WebmasterWorld forum; that was in October, I guess.
Well, anyway, no complaints on that issue.
But I guess what minnapple says is correct;
we should check that.
Minnapple says: "Remove any excessive <!-- comment tags --> that are global, especially if they are within the head tags. ... Doing this has solved 70% of no title/description issues I have worked with."
KaMran
One of the sites I had that went walkabout had no Javascript, all CSS in a remote file, modest comments, no HTML errors...
I moved the site, unmodified, to a different ISP, and 66% of it was indexed in 4 days, instead of 1/3 of a percent per month...
It's not a PR issue, it's something more subtle than that.
DerekH
There was a specific issue with my server - a bad memory chip [webmasterworld.com], fixed Dec 9 - and I had put the lack of descriptions down to this. However, after 26,000+ bot accesses in the last 3 weeks there are still only ~400 pages with a description. What is very worrying is that the PR for the domain shows as zero.
Reply from Google is just usual boiler-plate.
Googlebot access numbers for the latest 3 logs:

Totals:
66.249.64. :  1193
66.249.65. : 15052
66.249.66. :  8853
216.239.3  :   929
------------------
Dec 5 - 26 : 26027

# ls -l com-access_log*
-rw-r--r-- 1 root root 36796350 Dec 26 01:46 com-access_log
-rw-r--r-- 1 root root 36830865 Dec 19 04:02 com-access_log.1
-rw-r--r-- 1 root root 35322992 Dec 12 04:01 com-access_log.2
-rw-r--r-- 1 root root 39731196 Dec 5 04:01 com-access_log.3
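A tally like the one above can be reproduced from the raw logs. Here is a minimal sketch; the IP prefixes come from the post, while the function name and log-format assumption (Apache-style lines starting with the client IP) are mine:

```python
from collections import Counter

# Crawler IP prefixes as listed in the post above.
PREFIXES = ("66.249.64.", "66.249.65.", "66.249.66.", "216.239.3")

def tally_bot_hits(lines):
    """Count access-log hits per crawler IP prefix.

    Assumes Apache-style log lines where the client IP is the
    first space-separated field.
    """
    counts = Counter()
    for line in lines:
        ip = line.split(" ", 1)[0]
        for prefix in PREFIXES:
            if ip.startswith(prefix):
                counts[prefix] += 1
                break
    return counts
```

Feeding it the three rotated logs and summing the counter should reproduce the totals table.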
So, is this a bug in the algorithms or a policy change?
I took minnapple's advice and it worked! ... forum pages ... were URL-only in the SERPs. ... the meta description and the h1 were the same for all pages. I deleted the meta description and changed the h1 ... now in the SERPs with title, snippet, and cache.
Of 15,600 my-site-only SERPs, just 400 have title, snippet, and cache. After a quick check on the page source of a handful I can confirm that *all* 15,200 URL-only SERPs have a unique meta description + h1 title (the PHP coding was designed explicitly to do this, and the actual page source confirms it).
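Checking that kind of claim across a whole site is easy to automate. A minimal sketch (function names are mine, and the regexes assume double-quoted attributes) that fingerprints each page by its meta description and h1, then reports any pair shared by more than one page:

```python
import re
from collections import Counter

# Regexes assume double-quoted attribute values, as in the page source
# described above.
DESC_RE = re.compile(r'<meta\s+name="description"\s+content="([^"]*)"', re.I)
H1_RE = re.compile(r"<h1[^>]*>(.*?)</h1>", re.I | re.S)

def fingerprint(html):
    """Return the (meta description, h1) pair that should be unique per page."""
    d = DESC_RE.search(html)
    h = H1_RE.search(html)
    return (d.group(1) if d else "", h.group(1).strip() if h else "")

def duplicated_pairs(pages):
    """pages: {url: html}. Return fingerprints shared by more than one page."""
    counts = Counter(fingerprint(html) for html in pages.values())
    return {fp for fp, n in counts.items() if n > 1}
```

An empty result from duplicated_pairs is consistent with the "every page is unique" claim; any non-empty result points at pages worth fixing first.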
(With just one caveat) perhaps of more influence was the fact that the header text **changed** - just that, it changed.
The caveat: all of the pages on my site can give identical header + body text under 2 different urls:
In addition, the site was--some years ago--with a free host; to prevent 404s, that host redirects to my site, so the same pages are also reachable via that old address.
Are duplicated header+body text the issue here?
Plus, are any other webmasters getting totally brassed-off with Google?
It doesn't make any difference to Google.
The issue is memory based. These days it takes Google 3 or 4 cycles to completely process your pages. That means months and months of wait time.
adfree:
Do a "site:widgets.com" search and then a "site:widgets.com common keyword" search ... using the mutual keyword, all pages appear with title and description.
Sadly, the latter is the one that gives 400 title+desc SERPs; the former gives 59. Both show 15,600 results.
you may need to wait another month or two for the 'undigested' URLs to become processed.
It is evidence like this that causes me to think that this situation is far more complex.
My main complaint is that we are having to go through a stab-in-the-dark-and-listen-for-a-yelp process. My site is as clean as I can make it and, AFAIK, contravenes no published guidelines. Google is also becoming very unresponsive, and this does not augur well for the future.
Do a search for "SES URLs" for PHP - you should be able to find some info on this. Once you convert everything to PHP, make sure you create permanent redirects - otherwise you will have duplicate content.
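One way to confirm those redirects really are permanent is to look at the raw status code without following the redirect. A minimal sketch (function name is mine; plain HTTP only) using Python's standard library:

```python
import http.client
from urllib.parse import urlsplit

def redirect_status(url, timeout=10):
    """Return (status, Location header) for url without following redirects.

    A permanent redirect from an old URL should report 301 plus the
    new canonical URL; a 302 (or a 200 serving the same content) is a
    duplicate-content risk.
    """
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=timeout)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location", "")
    conn.close()
    return resp.status, location
```

Running this over the old URL scheme after the conversion quickly shows which pages still answer with something other than a 301.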
Index page is not at Google Index [webmasterworld.com] (Dec 28, 2004) should actually be in this thread, as also identical issue.
sasha
Oh, I see what your problem is: you need to make your URLs 'search engine safe'.
The one thing that I would underscore in this farrago is the way that Google is switching from being the Internet's friend to being just another corporate fiend. Such a shame, and so easy to avoid (just be upfront so that everyone knows what is happening).