I think the last time Google crawled my site with a non-www URL was April 20th. Oddly enough, I see evidence of this same quirk on another of my sites, so I'm wondering what happened on or about April 20th.
Hope this helped a little.
Using the www version is only because it seems to be the one most users are familiar with; one could redirect to the non-www version just the same, if that is what most of the incoming links point to. All these threads on 301s lately, and on the need for a www, have maybe got some people a little confused about the reasons for it. One just wants consistency; there's no inherent need for a www subdomain.
Have a look at your backlinks, with special attention to the PR of the pages they come from and the URL version they use. If the best ones are without the www, then reconsider your redirect choice.
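If you have your backlinks exported somewhere, even a quick tally of which host variant they point to can inform the choice. Here's a rough sketch in Python, assuming a hypothetical backlinks.txt with one target URL per line:

```python
from urllib.parse import urlparse
from collections import Counter

# Tally which hostname variant the backlinks point to.
# Assumes a hypothetical backlinks.txt with one target URL per line.
counts = Counter()
with open("backlinks.txt") as f:
    for line in f:
        host = urlparse(line.strip()).hostname or ""
        counts["www" if host.startswith("www.") else "non-www"] += 1

print(counts)
```

If the split is lopsided toward one version, that's the version worth keeping as canonical.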
The same goes for 403, 404, and 410 responses served via custom ErrorDocuments: it's a good idea to check them.
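A quick way to do that, as a minimal sketch in Python (the host and the deliberately bogus path are placeholders): request a page that should not exist and see what status actually comes back. http.client doesn't follow redirects, so a stray 302 will show up as-is.

```python
import http.client

# Request a page that should not exist and check the real status code.
# A misconfigured custom ErrorDocument can answer 200 or 302 instead of 404.
conn = http.client.HTTPConnection("www.example.com")   # placeholder host
conn.request("HEAD", "/this-page-should-not-exist")    # placeholder path
resp = conn.getresponse()
print(resp.status, resp.reason)  # you want 404 (or 410) here, not 200 or 302
conn.close()
```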
I second what Stefan mentioned. Don't rely solely on a 301; try to get as many incoming links corrected as possible.
Jim
I strongly recommend checking both the www and non-www URLs with a server header check utility. A client recently set up a 301 redirect for the non-www version successfully; however, the www version was then returning a 302 redirect, a fix worse than the problem itself.
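For anyone who wants to check this without a web-based utility, here is a minimal sketch in Python of what such a header check does (example.com stands in for your domain). You want one variant answering 301 with a Location pointing at the other, and the canonical variant answering 200:

```python
import http.client

# Show the raw status and Location header for each hostname variant.
# http.client does not follow redirects, so you see the actual response,
# not wherever the redirect chain eventually lands.
for host in ("example.com", "www.example.com"):
    conn = http.client.HTTPConnection(host)
    conn.request("HEAD", "/")
    resp = conn.getresponse()
    print(host, "->", resp.status, resp.getheader("Location"))
    conn.close()

# For a non-www to www 301 you'd hope to see something like:
#   example.com -> 301 http://www.example.com/
#   www.example.com -> 200 None
```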
The best time to add a non-www to www redirect is before the site goes live. Combined with inconsistent incoming links or (even worse) inconsistent internal linking, there is potential for problems like this. Unless you've used a lot of questionable SEO on your site, this situation should resolve itself, but it will take time. Don't reverse course, or that time will be extended. If this was the right decision for the long term, then the payoff will be worth the wait.
I lost 50% of my traffic for several months because of changes like this on one site. Now the number of visits I lost during that period amounts to only one or two percent of its current traffic. So, it's a matter of perspective.
Jim
I had a look at site:www.yourdomain.net (from your e-mail profile); there are over 69,000 pages indexed by Google. Just 3 days after you put the 301 in place is too short for it to be the cause. IMHO, it is a coincidence.
This is something new I think I'm seeing on my sites: URL-only pages that don't show a title in the SERPs but still seem to show up for keywords in the title.
I have a unique string in all my titles, and nowadays some URL-only listings are shown even when I search for that unique string. Just an odd tidbit.
Any ideas?
In addition, the IP address is open and returning a 404, but that may not be a problem. You might want to run the IP address through a header checker; there may be a 302 involved, depending on how things are set up.
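Something like this rough Python sketch would show it (the IP below is a documentation placeholder; substitute your server's address):

```python
import http.client

# Request the site by bare IP and see what comes back.
# 192.0.2.1 is a placeholder from the documentation range.
conn = http.client.HTTPConnection("192.0.2.1")
conn.request("HEAD", "/")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))
# Ideally a 301 pointing at the canonical hostname; a 302, or a 200
# serving the whole site on the bare IP, is worth fixing.
conn.close()
```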
Good luck.
Anyhow, I did do an ISAPI rewrite on all my dynamic pages, which now have a "page-38.asp" format. I'm hoping this will resolve the issue. I have over 3000 pages, so this may take some time.
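To make sure the old dynamic URLs now 301 to the new format, rather than both versions serving the same content, I've been spot-checking with something like this rough Python sketch (the host and paths are made up for illustration):

```python
import http.client

# Spot-check that an old dynamic URL 301s to its rewritten equivalent.
# Host and paths are hypothetical; adjust to your own URL scheme.
conn = http.client.HTTPConnection("www.example.com")
conn.request("HEAD", "/page.asp?id=38")  # old dynamic form (made up)
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # hope for: 301 .../page-38.asp
conn.close()
```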
In the headers there are a few things to look at:
date: should be the same as the last time you uploaded the file
any robots directives
status code: 200, 404, etc.
base: the location URL of the file
These are common areas that I pay a lot of attention to for spiderability (is that a new word?). Not to say that there couldn't be other problems showing up in the header info.
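For what it's worth, a few lines of Python will dump all of that so you can eyeball it (example.com is a placeholder host):

```python
import http.client

# Dump everything the server sends back, so you can eyeball the date,
# status code, robots directives, and location fields mentioned above.
conn = http.client.HTTPConnection("www.example.com")
conn.request("HEAD", "/")
resp = conn.getresponse()
print("Status:", resp.status, resp.reason)
for name, value in resp.getheaders():
    print(name + ":", value)
conn.close()
```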
I like that poodle predictor tool because it shows the complete header, including the meta tags, all on one page. But for the return code (or other problems) you might want to try the WW header checker too, just to make sure you are not bypassing something with poodlebot (a redirect, for instance). I'm not sure how foolproof it is for that.
It'd be good to hear a little tutorial from someone who knows how to read header info, covering other spiderability problems you could spot from that data, beyond what I've mentioned already.