Forum Moderators: Robert Charlton & goodroi
Just go to the old URL and see the change:
[google.com...]
or go to:
[google.com...]
Preferred domain [?]
If www.mysite.com and mysite.com point to the same site, you can tell us here how you want URLs to display in our index.
Display URLs as www.mysite.com (for both www.mysite.com and mysite.com)
Display URLs as mysite.com (for both www.mysite.com and mysite.com)
Don't set an association.
Note: Once you specify your preference here, it may take some time for changes to be reflected in our index. While Google doesn't guarantee that we'll show your URLs in the form that you prefer, we will use your choice as a suggestion to improve our indexing.
When I set my preferred domain I noticed an option in the subnav for "Crawl Rate" that wasn't in the subnav on other pages. Going there resulted in a 404, but it looks like something they're working on. The odd part is that they ignore the "Crawl-Delay" setting in my robots.txt file.
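As an aside, Crawl-delay was never part of the original robots.txt standard; it's a nonstandard extension that Yahoo!'s and MSN's crawlers honor but Googlebot does not, which would explain why it's ignored. A typical entry looks like this (illustrative values only):

```
# Nonstandard extension: honored by Yahoo! Slurp and msnbot, ignored by Googlebot
User-agent: Slurp
Crawl-delay: 10
```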
I also added
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydomain\.de$ [NC]
RewriteRule (.*) http://www.mydomain.de/$1 [R=301,L]
to my root .htaccess file a few months ago. I hope this makes clear that I want the www version to be the default one, but I'm somewhat afraid of running into this canonical issue. It would be helpful if one of the Googlers could tell us exactly what this feature is aiming at and what is expected from webmasters.
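For anyone unsure what those two rewrite lines do, here's a rough Python sketch of the same logic (a hypothetical helper for illustration only, not part of Apache): any request whose Host header is the bare domain gets a 301 pointing at the www equivalent; requests already on www are served normally.

```python
# Rough Python sketch of what the mod_rewrite rules above do
# (a hypothetical helper for illustration, not part of Apache).

def canonicalize(host, path):
    """Return (location, status) when a redirect is needed, else None.

    Mirrors:
        RewriteCond %{HTTP_HOST} ^mydomain.de [NC]
        RewriteRule (.*) http://www.mydomain.de/$1 [R=301,L]
    """
    if host.lower() == "mydomain.de":  # [NC] makes the match case-insensitive
        return ("http://www.mydomain.de" + path, 301)
    return None  # already the www host: serve the page normally

print(canonicalize("mydomain.de", "/page.html"))
# -> ('http://www.mydomain.de/page.html', 301)
```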
I can't think of any reason why a webmaster would define two different sitemaps for the www and non-www versions, as the Sitemaps account would suggest, but correct me if you know of any.
Glad to see this topic on the front page of ww.
Informally, I think of this as the "Dayo_UK" feature, and I'm really excited that we're providing this option. Bear in mind that you're indicating a preference, so it's not a 100% guarantee, plus it can take a few weeks for the preference to go fully into effect.
But this should help a lot of webmasters who have wanted to indicate which version (www or non-www) to use.
GoogleGuy, can you please give us a rundown on the most recent visible PR updates and perhaps request an across the board update so we can verify what is being shown in the Webmaster Central console? Thank you.
[edited by: SEOcritique at 9:38 pm (utc) on Aug. 5, 2006]
- Deep crawl complete
- PageRank update complete
- XX days have gone by since you last submitted a sitemap
Display URLs from 2005
Display URLs from 2006
The www thing is great if they obey it, but wouldn't it be just peachy if you could tell them to use URLs that ACTUALLY EXIST and not rank pages that haven't existed or have been inaccessible since December 2004.
How about:
Display only URLs that exist
Display URLs that haven't existed for 18 months
For example, I try to prune out dead URLs pretty regularly, and I was trying to remove a URL tonight using the normal URL removal tool as I always do, but it kept telling me that it still existed on the website. Well, it doesn't exist on the website; it returns a 404. I host the website and the page no longer exists, and when I click on the link in Google, it takes me to the 404. But the tool still thinks it's live for some reason.
It'd be great to have some place to report stuff like this; I'd think that Webmaster Central would be the perfect sort of place. But there's no easy way to do this.
Also, I sure wish they'd give an option to put all the sites on one page, instead of having to scroll through pages. I have 61 sites in Sitemaps right now, and it'd be a lot easier if I could just have them on one page when I want to quickly find the site I need to check.
But overall, it's good to see updates.
One concern I have about site maps particularly is if Google places any weight on the order of listings in a site map. The reason is that with really big site maps it is much easier to sort the site map alphabetically to see what is what. If Google places an importance on the order of pages in a sitemap, this is really important to know.
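Sorting a big sitemap alphabetically is only a few lines with a standard XML library. Here's a sketch (assuming the usual sitemaps.org 0.9 namespace; the example URLs are made up):

```python
# Sort a sitemap's <url> entries alphabetically by <loc>.
# Sketch using only the Python standard library; assumes the standard
# http://www.sitemaps.org/schemas/sitemap/0.9 namespace.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sort_sitemap(xml_text):
    """Reorder <url> elements by <loc> and return the sorted loc values."""
    root = ET.fromstring(xml_text)
    urls = root.findall(f"{{{NS}}}url")
    urls.sort(key=lambda u: u.findtext(f"{{{NS}}}loc", ""))
    for u in list(root):   # rebuild the tree in sorted order
        root.remove(u)
    root.extend(urls)
    return [u.findtext(f"{{{NS}}}loc") for u in urls]

sitemap = f"""<urlset xmlns="{NS}">
  <url><loc>http://example.com/zebra.html</loc></url>
  <url><loc>http://example.com/apple.html</loc></url>
</urlset>"""
print(sort_sitemap(sitemap))
# -> ['http://example.com/apple.html', 'http://example.com/zebra.html']
```

Note that the sitemaps.org protocol itself assigns no meaning to element order, so sorting for readability should be safe; whether Google infers anything from order is exactly the open question above.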
One additional thing I'd like to see with sitemaps is the ability to tell Google to not list/ignore any page that is not on the site map.
[edited by: KenB at 12:00 am (utc) on Aug. 6, 2006]
It may still emerge in wider circulation, depending on how people like it.
netmeg, I like your suggestion to roll the url removal tool into one place with GWC. I'll pass that on.
[edited by: GoogleGuy at 12:05 am (utc) on Aug. 6, 2006]
This is EXACTLY the type of communication I believe many of us wanted. Thanks GG.
Next up, the newly broken site: command, which shows pages from some other domain, and supplemental pages to boot. As if it weren't bad enough that Google forces supplementals on me for pages that don't exist, now it's ever so not-sweetly forcing some other domain's supplementals on me too!
+ Tools
Link, but it seems to be deactivated altogether; no URL shows in the status bar. No drop-down.
Is it broken already?
Or perennial beta?
Installing Sitemaps worked its magic and the second level pages were re-indexed within days and all seemed to be peachy.
Last September I noticed a problem with both www and non-www results appearing in Google, so fearing a dupe penalty, I added 301 redirection code to .htaccess, which has been working as expected. This caused all the non-www dupes to end up as Supplementals, which I hope will soon be deleted (as per message #3035710 from GoogleGuy above).
However, my current concern is this:
From June till last week, the "Statistics" tab >> Crawl stats reported a table with "Your page with the highest PageRank". It had entries for June, July and August. However, ever since I indicated a preference for www over non-www, and Sitemaps added the non-www as a separate site to "manage", the table indicating highest PageRank has disappeared from the www stats and is now showing in the non-www instead, with identical data to what was previously in the www's stats.
How can this be, as all non-www requests have been auto-redirected to www since September 2005 and all the non-www pages are Supplemental and have been for a long time?
KenB wrote:
One additional thing I'd like to see with sitemaps is the ability to tell Google to not list/ignore any page that is not on the site map.
I'd like to second this request - it'd help solve a lot of problems.
[edited by: Mokita at 3:10 am (utc) on Aug. 6, 2006]
This is the computer age; anything less than real time is worthless. It's sad that Yahoo is more up to date than they are, but hey, they "renamed our sitemaps thingy!"
Get search right, then deal with the rest of the junk.
:)
I am with Steveb. I will send flowers to Google (actually I won't, but I will praise Google) when we see sites that got ranking issues due to the www/non-www split start to improve.
Encouraging though.