Forum Moderators: Robert Charlton & goodroi
I have a blog that uses WordPress. Several of my other sites use WordPress too, so I don't think the platform is the issue, but I'm noting it just in case. A few weeks ago I started to notice that the titles from my blog in the SERPs were all lower case. I checked, and on the site itself the title tags are in title case.
It is not a huge deal, but it is annoying. The lower-case titles look sloppy in the SERPs. I have looked at everything I could think of... feeds, sitemaps, code, anchor text (which was Yahoo's problem), and I can't find a reason for it.
Just trying to poke around into all the corners, you know.
The weird thing is, since the folder was disallowed, why were these pages showing up in the index at all, lower case title or no?
I fixed the robots.txt so my problem is fixed, I think.
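For anyone following along, the kind of rule being talked about looks like this; the folder name here is just a hypothetical example, not the actual path from my site:

```
User-agent: *
Disallow: /private-folder/
```

A compliant crawler will not fetch anything under that path, which, as it turns out below, is not the same thing as keeping the URLs out of the index.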
So the content of those URLs is not indexed in a case like this, but the EXISTENCE of those URLs is still noted.
But it does trouble me some that the pages, even just the URLs, appear in the SERPs. Granted, they would not perform well if the content is not indexable, but I had always thought that disallowing a folder in robots.txt would keep the search engines out and prevent the pages from being indexed at all.
URLs showing up in the SERPs is still indexing, even if the content is not used to judge it, no?
No, not according to the major SEs. Indexing is fetching, parsing, and analyzing the contents of the page, independent of noting the existence of the URLs. They started doing this during the big "deep Web" hoopla a few years ago.
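That split — "may this bot fetch this URL?" versus "does this URL exist?" — is exactly the question robots.txt answers, and the only one. A quick sketch with Python's standard-library parser makes it concrete; the /private/ folder and page paths are made-up examples:

```python
# robots.txt only answers "may this bot fetch this URL?" --
# it says nothing about whether the URL itself may be listed.
# The /private/ folder and the page paths are hypothetical.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# Fetching anything under /private/ is forbidden...
print(rp.can_fetch("*", "/private/post.html"))
# ...but other pages may be crawled normally.
print(rp.can_fetch("*", "/blog/post.html"))
```

Since the engine never fetches the disallowed page, it never sees the page's content — or any meta tags on it — yet it can still note that the URL exists (from links pointing at it) and list the bare URL in the SERPs.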
To keep the URL out of the SERPs, you must allow the page to be crawled in robots.txt (no Disallow rule covering it), and then add the <meta name="robots" content="noindex"> tag to the <head> of the page itself.
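As a sketch of that setup (the path in the comment is a hypothetical example): the crawler has to be able to fetch the page, or it will never see the tag.

```html
<!-- In the <head> of the page to be kept out of the SERPs,
     e.g. /private-folder/page.html. Crawling of this URL must
     be permitted in robots.txt (no Disallow rule covering it),
     otherwise the crawler never reads this tag. -->
<meta name="robots" content="noindex">
```

The counterintuitive part is that a Disallow rule actually defeats noindex: the engine can't read the tag on a page it's forbidden to fetch.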
I liked the old way better... Much less wasted bandwidth. :(
Jim