Forum Moderators: Robert Charlton & goodroi

My titles are showing up lowercase in the SERPs

But are title case on the site


hannamyluv

1:11 pm on Nov 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I looked around for a thread on this, but did not find one. Found one about Yahoo, and for that they said it was a bug in Yahoo that eventually got fixed. Anyhoo, if there is a thread already on this, please point me in that direction.

I have a blog that uses WordPress. I have several sites that use WordPress, so I don't think that's the cause, but I'm noting it just in case. A few weeks ago, I started to notice that the titles from my blog in the SERPs were all lower case. I checked, and on the site itself the title tags are in title case.

It is not a huge deal, but rather annoying. The titles look sloppy in the SERPs being lower case. I looked at everything I could think of... feeds, sitemaps, code, anchor text (which was Yahoo's problem) and I can't find a reason for it.
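(A quick way to rule out the on-page markup itself is to script the check. This is a minimal sketch in Python using only the standard library; the sample HTML below is a placeholder, and you would feed it the actual source of your own pages.)

```python
# Minimal sketch: confirm the casing of the on-page <title> element.
# The sample markup below is a placeholder; in practice you would
# fetch each page's HTML and run it through the same parser.
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

def title_is_all_lowercase(html):
    title = extract_title(html)
    return title == title.lower()

# Placeholder page source standing in for a real fetched page:
page = "<html><head><title>My Title Case Post</title></head><body></body></html>"
print(extract_title(page))            # the title as actually served
print(title_is_all_lowercase(page))   # False here, since the markup uses title case
```

If this reports title case for the served HTML while the SERP shows lowercase, the problem is on Google's side of the fence, not in the markup.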

tedster

5:28 pm on Nov 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Are the titles in lowercase even on a site: query?

hannamyluv

7:35 pm on Nov 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yes, they appear in lower case for a site: query as well.

Another thing I just noticed: it seems to affect only pages that were indexed or reindexed recently. And punctuation has been removed as well.

tedster

7:41 pm on Nov 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just as in the Yahoo situation, Google does not always use the title element from a page as the clickable "title" in the SERP, although they certainly do so the great majority of the time. So now I wonder: does that same phrase appear on your page, in your feed, in internal backlinks, or even in external backlinks - but in lowercase? Also, are you using FeedBurner, which is now a Google property?

Just trying to poke around into all the corners, you know.

hannamyluv

8:27 pm on Nov 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I found something else, but it does not make sense. It appears that a few weeks ago, when I had made a change to the robots.txt, I accidentally disallowed the folder where new pages are stored.

The weird thing is, since the folder was disallowed, why were these pages showing up in the index at all, lower case title or no?

I fixed the robots.txt so my problem is fixed, I think.
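(For reference, an accidental block like the one described would look something like this in robots.txt; the folder name here is hypothetical.)

```
User-agent: *
# This one line quietly blocked every newly published page,
# since new posts were being stored under this folder:
Disallow: /archives/

# The fix is simply to delete (or narrow) that Disallow line.
```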

tedster

8:36 pm on Nov 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think you've got it! URLs can still show in search results, even though their indexing is disallowed in robots.txt. But in cases like that, Google does not display any content taken directly from the page. Instead, they adapt text that they find elsewhere, around links that point to that URL.

So the content of those urls is not indexed in a case like this, but the EXISTENCE of those urls is still noted.

hannamyluv

9:36 pm on Nov 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Funny thing, that actually answers another question I was just wondering about. Does Google count links to another page if that page is unindexable? This shows that they do.

But it does trouble me some that the pages, even just the URLs, appear in the SERPs. Granted, they would not perform well if the content is not indexable, but it has always been my thought that disallowing a folder in robots.txt was to keep the search engines out and prevent the pages from being indexed at all.

URLs showing up in the SERPs is still indexing, even if the content is not used to judge it, no?

jdMorgan

9:55 pm on Nov 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> URLs showing up in the SERPs is still indexing, even if the content is not used to judge it, no?

No, not according to the major SEs. Indexing is fetching, parsing, and analyzing the contents of the page, independent of noting the existence of the URLs. They started doing this during the big "deep Web" hoopla a few years ago.

To keep the URL out of the SERPs, you must Allow the page in robots.txt, and then add the <meta name="robots" content="noindex"> tag to the <head> of the page itself.
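(Putting that together: the tag goes in the page's own head, and robots.txt must not block the URL, or the crawler can never fetch the page to see the tag. A minimal sketch, with a placeholder title:)

```
<!-- The page must NOT be disallowed in robots.txt; otherwise the
     crawler cannot fetch it and will never see this directive. -->
<head>
  <title>Example Page</title>
  <meta name="robots" content="noindex">
</head>
```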

I liked the old way better... Much less wasted bandwidth. :(

Jim