Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

Only the URL is shown in the SERPS
Does this spell DOOM?
Watcher of the Skies

 2:37 pm on Aug 22, 2005 (gmt 0)

I have a large, sub-domained site with sections that have been languishing on the third and fourth page for a few months since popping out of the 'box. Just a few days ago, I noticed that instead of a title, a description, a cached date, etc., it merely shows "yechh.blah-blah.com/". I've seen other sites listed like this occasionally, but I'm not sure of the significance. Have we done something wrong? Are we about to go "poof"? Or is it something else? Any help/words of comfort would be appreciated, thanks.....



 2:11 pm on Aug 24, 2005 (gmt 0)

Check your log files and make sure your robots.txt file validates correctly. You may have a roadblock that is preventing the Google spider from crawling the pages.
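For reference, a minimal robots.txt that validates and permits all crawling is just two lines (a sketch; you would add Disallow paths only for anything you actually want kept out):

```
User-agent: *
Disallow:
```

An empty Disallow value means "nothing is disallowed", which is the safe default while you are diagnosing crawling problems.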


 4:48 am on Aug 26, 2005 (gmt 0)

One possible reason for a URL-only listing (a partially indexed page) is duplicate content.

or [google.com...]

There's no description of my site.

The Google index contains two types of pages -- fully indexed and partially indexed pages. Your page is currently partially indexed, which means that although we know about your site, our robots haven't read all the content on your pages in past crawls. This doesn't adversely affect your PageRank or your inclusion in our index. It does mean that we don't have detailed information about your page, so we display its URL as the title and omit a description. We understand the frustration this may cause you, and we're always working to increase the number of fully indexed pages in our search results.


 11:47 am on Aug 26, 2005 (gmt 0)

Watcher of the Skies - this happened to me on two of my sites. They became URL-only for several months.
In desperation, I moved one site to a different ISP, and it recovered instantly. Just when I felt I knew what was the matter, the other site recovered all by itself.

Just sometimes a site that doesn't have particularly high PR seems to get caught between two stools - Google notices the pages are different but doesn't come back to fetch them.

Whether there's a provable theory here or not, one thing is certain - for some sites at least, the problem will heal up by itself.


 8:05 pm on Aug 26, 2005 (gmt 0)

Yes, I think it might spell doom. This happened to my site, and I was then gone from the SERPs for 7 months. I waited and waited, then simply got fed up and moved the location of the page, i.e. www.site.com/folder/page.html to www.site.com/page.html, and was back in Google in about three days and looking healthy. The bad part is I lost 99% of my Yahoo traffic and have yet to recover. MSN picked up the change fast as well. I didn't do a permanent redirect, though perhaps I should have done one after Google picked up the new pages but before Yahoo dropped the old ones... my stupid gaffe...


 5:50 pm on Aug 27, 2005 (gmt 0)

If the pages can be accessed using both www and non-www without a redirect, then it is that duplicate content that is hurting you.

Set up the 301 redirect to fix that particular problem.
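On an Apache server, the non-www to www version of that redirect can go in the site's .htaccess file, something like this (a sketch; example.com is a placeholder and mod_rewrite must be available):

```apache
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```

The R=301 flag makes the redirect permanent, which is what tells the search engines to consolidate the two versions.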


 5:55 pm on Aug 27, 2005 (gmt 0)

>>>Set up the 301 redirect to fix that particular problem.

Yep - and Wait - and Wait and Wait........

Oh and then wait a bit more....


 5:56 pm on Aug 27, 2005 (gmt 0)

After reading this thread I realized that 2 of my sites had no robots.txt! Neither of these sites is important to me, because I had given up on them long ago, but I opted to leave them running to see what happens. Both were listed with URL only, so I added the robots.txt, 301'd the non-www to the www, and submitted a G Sitemap for both... amazingly enough, just a short 48 hours later both had listings. Thank you for starting this thread, because sometimes even experienced webmasters forget the small details, and this prompted me to check.


 6:39 pm on Aug 27, 2005 (gmt 0)

>> Yep - and Wait - and Wait and Wait........

For several sites with the redirect added in March all was fixed in the SERPs by the end of May. The trick to fix the last few "stuck" entries was to make a fake sitemap pointing to the URLs that were to be delisted and put that sitemap up on another site for a few weeks. That fixed it.

However Google then reverted the SERPs on some of their datacentres back to how they were at New Year, which unfixed things for a few weeks for some of the sites. Since then, everything has been fine for the last month or more again.

There are still a couple of datacentres running on very old data though, and I have commented on that before, in several earlier threads.

Watcher of the Skies

 10:29 am on Aug 29, 2005 (gmt 0)

Thanks all, for your help. (By the way, all of the dozens of sub-domains are PR5 except a few that are PR4. Also, we rank extremely well in MSN and are 100% ABSENT from Yahoo!) Here's what I'm hearing so far....

1.) Redirect either non-www to www or vice-versa. As 95% of my links are solicited to non-www, I will do a 301 redirect (whatever that is, but sounds extremely simple to learn) from www to non-www.

2.) Though the site is no more than three levels deep (index/detail/order), and is thoroughly and properly linked, ensure I have a sitemap to all pages.

Follow-up Questions:
1.) I have no robots.txt file. This is because we're a very small company (two plus outsourcing here and there) and I thought it a bit of a luxury. How would this help, exactly? Isn't it primarily to "exclude" things, and not necessarily to aid spidering? Is it worth it to add one just to get the spider's attention, even if the content is nominal?

2.) For the 301 redirect, I'll read about this and figure it out and do it. (Yes, I'm sure this is basic stuff, but I learn what I need as it comes up.) But, are there any non-intuitive tips I should know first, particularly as might relate to Google?



 1:32 pm on Aug 29, 2005 (gmt 0)

>> But, are there any non-intuitive tips I should know first, particularly as might relate to Google? <<

1. Find out what the server defaultname is: is it www or non-www?

2. Make sure that links to folders always include the trailing / on the end of the URL.

3. If you link to index pages, do not include the index file filename, end with the folder name and a trailing / on the end (Sidenote: Google sees "/" and index.htm etc, as being separate pages; and using "/" allows you to change to index.php, any time in the future, without having to change any internal or external links).
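The significance of the trailing slash shows up even in how relative links resolve, which can be sketched in a couple of lines of Python (example.com and the paths here are placeholders, not from the site under discussion):

```python
from urllib.parse import urljoin

# Without the trailing slash, "widget" is treated as a file, so a
# relative link resolves against /topic/ rather than /topic/widget/.
print(urljoin("http://example.com/topic/widget", "order.html"))
# -> http://example.com/topic/order.html

# With the trailing slash, "widget/" is a directory, and the relative
# link resolves inside it, as intended.
print(urljoin("http://example.com/topic/widget/", "order.html"))
# -> http://example.com/topic/widget/order.html
```

The same distinction is why the server has to issue an automatic redirect when a folder is requested without its slash.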


Problem scenario: (been there, done that {this was a site I helped out back in March}):

Default Server Name was www.domain.com

Owner wanted all listings to show up as just domain.com

There was no redirect, so both www and non-www pages were showing up in the results, many as URL only, and many pages were not indexed at all.

All pages of the site were index files in keyword-named folders, linked like /atopicfolder/akeywordfolder, that is, without a trailing / on the end.

Adding the redirect from www to non-www in the .htaccess file was a trivial job, and within days the non-www pages were being better indexed, and gaining titles and descriptions where there were none before.

I ran Xenu over the site and was amazed to find the page count to be exactly double what it was supposed to be. I let Xenu generate a sitemap, and found that exactly half of the "pages" listed had a title of "301 Moved".

What was happening was that a link to /atopicfolder/akeywordfolder was being picked up by the server, and an automatic redirect to www.domain.com/atopicfolder/akeywordfolder/ (remember, the server default includes the www on this site) was issued before the instruction in the .htaccess file then forced a second redirect to the required domain.com/atopicfolder/akeywordfolder/ page.

An easy fix was to make sure that the links always included the trailing / on the end of the URL. The Xenu report was then perfect. Any link to a page like /atopicfolder/akeywordfolder/ then took you directly to domain.com/atopicfolder/akeywordfolder/ exactly as required.

The correct way to fix it would have been to also change the defaultname for the server, but I didn't have access to that. However, if you always include the trailing / on the links then the problem "goes away".
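The two-hop chain described above can be modeled in a few lines of Python (the hostnames and folder names are hypothetical, and the logic is a heavy simplification of what Apache actually does):

```python
def hops(url):
    """Follow the simplified redirect rules and return every URL visited."""
    chain = [url]
    host, _, rest = url.removeprefix("http://").partition("/")
    path = "/" + rest
    while True:
        # Rule 1: the server's automatic redirect for a folder requested
        # without a trailing slash; it uses the server default name (www).
        if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
            path += "/"
            host = "www.domain.com"
        # Rule 2: the .htaccess rule redirecting www to non-www.
        elif host == "www.domain.com":
            host = "domain.com"
        else:
            break
        chain.append("http://" + host + path)
    return chain

# A link without the trailing slash needs two redirects...
print(hops("http://domain.com/atopicfolder/akeywordfolder"))
# ...while a link with the slash goes straight to the final URL.
print(hops("http://domain.com/atopicfolder/akeywordfolder/"))
```

The first call visits three URLs (two redirects); the second visits only the one it started with, which is why always including the trailing slash makes the problem "go away".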

Google then took about three weeks to properly index all of the non-www pages, and about six weeks to drop all the www listings; but has reverted back to old listings for a few days from time to time since.

Watcher of the Skies

 3:02 pm on Aug 29, 2005 (gmt 0)

Wow... thanks a ton for your help. I had been digging since my previous post, had found your advice for identifying the server default, and have sent an email off to my web host for help. We'll see what they say.

What I learned, btw, was that my SUBDOMAINS are all, as I wish, listed as non-www, so I won't have to change the defaults for the dozens of subdomains. Ironically, the "root"(?) domain was listed as www rather than non-www, and I have asked the host to change the default for this. So, htt*://mystuff.com actually has a default of htt*://www.mystuff.com. However, the subdomain htt*://example.mystuff.com has that as its default.

In either case, I want to do a 301 redirect of all references to any www file to the non-www "version". Apparently, as I've scanned several threads (most with very helpful suggestions from yourself and others), it may not be as "easy" as I presumed, and while I'll spend some time figuring it out, I do have one initial question: Do I have to do this on every!@#$% page, or do I do this globally/semi-globally? Isn't that what happens if you put it in an .htaccess file? Thanks... now I'm back to the other three threads with info about this... and trying, above all, to remember to clear the cache when I check, as someone suggested. :-)


 5:31 pm on Aug 29, 2005 (gmt 0)

I would make www the default for the root. I almost always do so (this one site wanted the opposite of that). Most people assume that a web URL will include a www. For other subdomains, www isn't always assumed.


Redirects are easy, and work for all pages on the sub-domain:

Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^widgets\.com$ [NC]
RewriteRule ^(.*)$ http://www.widgets.com/$1 [L,R=301]
RewriteCond %{HTTP_HOST} ^www\.red\.widgets\.com$ [NC]
RewriteRule ^(.*)$ http://red.widgets.com/$1 [L,R=301]
RewriteCond %{HTTP_HOST} ^www\.blue\.widgets\.com$ [NC]
RewriteRule ^(.*)$ http://blue.widgets.com/$1 [L,R=301]
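As a sanity check, the host mapping those rules implement can be sketched in Python (a hypothetical model using the placeholder widgets.com domains from the example; [NC] means the match is case-insensitive, hence the lower()):

```python
# Map each incoming Host header to its canonical host, mirroring the
# three RewriteCond/RewriteRule pairs above.
CANONICAL = {
    "widgets.com": "www.widgets.com",          # bare root -> www
    "www.red.widgets.com": "red.widgets.com",  # www subdomain -> bare
    "www.blue.widgets.com": "blue.widgets.com",
}

def redirect_target(host, path):
    """Return the 301 target URL, or None if the host is already canonical."""
    canonical = CANONICAL.get(host.lower())
    if canonical is None:
        return None
    return "http://" + canonical + path

print(redirect_target("widgets.com", "/index.html"))
# -> http://www.widgets.com/index.html
print(redirect_target("red.widgets.com", "/"))
# -> None (already canonical, no redirect issued)
```

Each request either gets exactly one redirect to its canonical host or is served directly, which is the behavior you want Google to see.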

Watcher of the Skies

 7:07 pm on Aug 29, 2005 (gmt 0)

Aha. So I CAN "mix and match". I will do this and can see that I must add a redirect for each individual subdomain - no problem. Hopefully this will be my last question (yeah, right) but which .htaccess file do I put this in: etc, www, public_html or all of the above?


 7:22 pm on Aug 29, 2005 (gmt 0)

Yes, it goes in the .htaccess file found in the web root: "public_html", or "htdocs", or "www", or whatever it is called on your site. This only works with Apache webservers.


Make sure you add the instructions to any existing file (rather than simply uploading a new file which overwrites whatever the host has already set up). Be aware that many FTP programs are set to hide from directory listings any filename that starts with a dot (just because you can't see a file there, doesn't mean that there is no file!).


 7:31 pm on Aug 29, 2005 (gmt 0)

g1smd, thanks for all the info. If you can't see the file, how do you change the settings to view it?


 7:42 pm on Aug 29, 2005 (gmt 0)

Look in the preferences of the FTP program.

Look for stuff like:

[X] Hide filenames that start with a dot.


 7:50 pm on Aug 29, 2005 (gmt 0)


Your information and help are a credit to you and Webmaster World


Watcher of the Skies

 2:17 am on Aug 30, 2005 (gmt 0)

Well said, Earwig!

I second that. Thank you g1smd - you're a star.


Watcher of the Skies

 7:43 am on Aug 30, 2005 (gmt 0)


htt*://www.mysite.com/subdomain-name/index.html seems to be yet ANOTHER path to the same content, as the "subdomains" seem to be set up as "subdirectories" (which I thought was standard). Must I redirect all those as well, or will the script above take care of that?

So the following:
1.) htt*://www.mysite.com/subdomain-name/index.html
2.) htt*://mysite.com/subdomain-name/index.html
3.) htt*://subdomain-name.mysite.com/index.html
(as well as all the above without the index.html) all go to the same content. Does it ever end?


 9:58 am on Aug 30, 2005 (gmt 0)

"Cheap" hosts often don't provide "real" subdomains, they alias a subdirectory name for a real folder, and leave both accessible.

You could block the folder version using a Disallow for each in the robots.txt file, or you could redirect to the subdomain using .htaccess. Alternatively, if it is a PHP site, simply get the script to self-check whether the page is being served as a folder page instead of a subdomain page, and add <meta name="robots" content="noindex"> to the page if the folder version is detected as the requested URL.
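For the robots.txt route, the Disallow entries would look something like this (the folder names are hypothetical; this blocks crawling of the aliased folder versions only, while the subdomain versions stay crawlable because each subdomain answers with its own robots.txt):

```
User-agent: *
Disallow: /subdomain-one/
Disallow: /subdomain-two/
```

One caveat: Disallow stops crawling, not necessarily indexing, so already-known folder URLs can linger as URL-only entries for a while.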

There are many options, and each can have unintended consequences if you are not careful. There may be other methods that I haven't thought of. (Can you completely block access to the folder version using a "Deny from all" statement in .htaccess too, I wonder?)

Again, whatever you do, when you link to an index page, do not include the index file filename itself in the link. For links to folders, always include the trailing / on the URL.

Watcher of the Skies

 10:40 am on Aug 30, 2005 (gmt 0)

thanks, again - i'd better get down to it, will let you know how it goes :-)


 1:20 pm on Aug 30, 2005 (gmt 0)


I wondered why my traffic went down the drain the last two weeks - now I know. I have exactly the same problem as you have.

In my case it's definitely my provider that is responsible for my losing A LOT of money this month.

What I have learned now: large website - run your own server.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved