Thanks for pointing that out. Hadn't been to Sitemaps for a bit. Some interesting information there.
I'm curious why they are including this information in Sitemaps. Did G just decide to display this info and figure Sitemaps was the best place to put it? It doesn't seem to have much to do with the actual sitemaps, as the crawl errors are all pages that aren't in my sitemap.
Part of a bigger plan?!
I like the information being provided on query stats the most. It's only a little glimpse, but it lets you know which keywords you're showing up for most.
Sounds like a lure to attract more people to using sitemaps?
Certainly makes it tempting!
Based on the appearance of the updated site, it seems to provide a lot more info. I'm getting metrics for the average PageRank of my sites' pages and metrics about the type of content. Very nice.
The stylesheet is different, so it required a hard refresh when I visited the page. But other than that, well done. :)
Here are the details on the new stats:
I like it!
Both Top Search Queries and Top Search Query Clicks show the top 5 for your site.
I love the feature, definitely nice to see what people are clicking on compared to our actual ranking.
Following on from the announcement earlier in the year [webmasterworld.com], Google have now updated the tools. This latest announcement was made at the WebmasterWorld Pubcon in Las Vegas today.
There are new "query stats" that show the top Google search queries that return pages from a site. They also show the top search queries that drove users to click on a site.
Crawl errors will now show, for example, the specific HTTP errors Googlebot runs into when crawling a page.
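If you want to cross-check that crawl errors report against your own server logs, a rough sketch like this would do it (assuming an Apache-style combined log at access.log and the standard Googlebot user-agent string):

```python
import re
from collections import Counter

# Matches the request, status code and user-agent fields of an
# Apache-style combined log line.
LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

errors = Counter()
with open("access.log") as log:  # log path is an assumption; adjust to yours
    for line in log:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent") and m.group("status")[0] in "45":
            errors[(m.group("status"), m.group("path"))] += 1

# The HTTP errors Googlebot actually received, most frequent first.
for (status, path), count in errors.most_common(20):
    print(status, path, count)
```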
Looks useful to me.
Log in here: https://www.google.com/webmasters/sitemaps/login
That seems to be getting very interesting, but:
"Data is not available at this time. Please check back later for statistics about your site."
"The PageRank of your pages in Google ¦ Distribution"
For me, the majority is "Low" ... #*$!IGO?
Nevertheless: thanks for the heads-up; not everybody is out drinking, it seems :-) ...
With the implementation of new Sitemaps features, we can be sure that it will be another weighting factor in the SERP algo. I can see how Google would make use of this new feature. And I suspect they've implemented another parameter not visible to webmasters: time spent on a page accessed through keywords.
If visitors reaching your page through a query string spend considerable time at your site, then Google would regard it as an indication of relevancy and would increase the weighting factor of the string in question for your site. Conversely, if the visitors spend little or no time, then Google would suspect that the association between the query string and your page is weak, that the relevancy is low, and accordingly will decrease its weight.
IMHO, this new feature is a real breakthrough in Google's fight against spam. It can easily turn out to be the spammers' nightmare if my assumption about the use of time parameter is correct.
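Just to make the speculation above concrete (to be clear, none of this is confirmed Google behaviour; the threshold and rate below are invented for illustration), the hypothesized adjustment might look something like this toy sketch:

```python
# Toy model of the *hypothesized* dwell-time signal described above.
# Threshold and rate are made-up numbers, purely for illustration.

def adjust_weight(weight, dwell_seconds, threshold=30.0, rate=0.05):
    """Nudge a query->page weight up after a long visit,
    down after a quick bounce."""
    if dwell_seconds >= threshold:
        return weight * (1 + rate)  # long visit: query looks relevant
    return weight * (1 - rate)      # quick bounce: association looks weak

w = 1.0
for dwell in (5, 120, 90, 3):  # dwell times in seconds for four visits
    w = adjust_weight(w, dwell)
print(round(w, 3))  # 0.995: two bounces roughly cancel two long visits
```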
Nice upgrade. I'm still waiting on the Sitemaps feature to let me know which pages were crawled via the sitemap vs discovered naturally. My particular problem is that G says I have 30,000 pages when my sitemap only has a few thousand.
It's great! I like the "Top search queries" vs "Top search query clicks" feature. That's data that hasn't been available before.
>> For me, the majority is "Low" ... #*$!IGO?
Pontifex: Perhaps they revealed the *real* PR for a second or two through a buggy SQL query? ;)
Looks good, but not too sure how to use it yet...
I like it ;-)
Fantastic, I'm really going to have to look at this in more detail :D
As far as I can tell, 'low' PageRank counts as either a 3 or a 2. Not quite sure which, though.
A shame it doesn't explicitly state which pages Googlebot has hit, though; I've never been able to figure that out except during a 15-day trial of Analyse Spider.
|Data is not currently available. Please check back later for statistics about your site. |
I submitted the sitemap a few days ago. Is this delay normal?
Matt Cutts has a bit about it on his blog:
|I signed up today and it's pretty sweet. For example, you can now see crawl errors, timeouts on pages, robots.txt errors, unreachable urls, etc. Just really useful hard data that tells you if you have crawl problems and what they are. And you do not need a sitemap to use this functionality. You just create an empty file to verify that you own the domain. |
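For anyone wanting to try it, the verification step really is just an empty file in your document root. A minimal sketch (the filename below is a placeholder, as Google's console shows you the exact one to use, and the docroot path is an assumption):

```python
from pathlib import Path

# Placeholder name: Google's Sitemaps console gives you the real filename.
VERIFICATION_FILE = "google0123456789abcdef.html"
DOCROOT = Path("/var/www/html")  # adjust to your server's document root

(DOCROOT / VERIFICATION_FILE).touch()  # an empty file is all that's needed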
|And you do not need a sitemap to use this functionality. You just create an empty file to verify that you own the domain. |
That is absolute gold-dust...
Are all stats shown if you only have the empty file and no sitemap? (Currently I only see query stats, but will that change?)
What's the difference between "Top search queries" and "Top search query clicks"? I'm seeing terms my site doesn't even come up for under the latter column.
Problem with the system:
It mixes image search and web search click results (at least), but has a link only to the main web search results. This makes isolating the two very difficult.
|That is absolute gold-dust |
... because it tells google who owns what domain (better than WHOIS data ever could) and therefore who is cross-linking all their sites, so that they can take this into account when assigning values to links...
|... because it tells google who owns what domain (better than WHOIS data ever could) |
That's a good point. Remembering the smart-pricing approach, it's easy to see how penalties could now be reflected across all your sites. A good case for multiple Sitemaps accounts?
I think I'll be creating one account per website unless Google can make a statement to the effect that they will not use the data we verify here to relate one site to another, and I'd suggest others do the same.
I'm getting this for most sites (no sitemap, but they are verified):
|Data is not available at this time. Please check back later for statistics about your site. |
|It's great! I like the "Top search queries" vs "Top search query clicks" feature. That's data that hasn't been available before. |
Too right it's great! So what I'm assuming it's saying is "here's a list of the queries where you're showing up strongest" (presumably a combination of position and frequency of the search?) and "here's a list of the queries where people are actually clicking on your result". Obviously we know the second one from our logs, but the first bit of information is fantastic. It shows one of my sites appearing strongly for the query "widgets" (and indeed, on checking, it does). But I never knew about this, because nobody ever clicked on my Google result. If I just go away and improve the presentation of that result, perhaps I can change all that!
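To put that idea into practice, a quick sketch like this (the lists are typed in by hand from the two report tables; the query terms here are made up) flags the queries you rank for but nobody clicks on:

```python
# Copied by hand from the report tables; these example terms are invented.
top_queries = ["widgets", "blue widgets", "widget repair",
               "cheap widgets", "widget parts"]
top_clicks = ["blue widgets", "widget repair", "widget parts",
              "green widgets", "widget shop"]

# Queries you show up strongly for but that rarely get clicked:
# prime candidates for rewriting the page title and snippet.
neglected = [q for q in top_queries if q not in top_clicks]
print(neglected)  # ['widgets', 'cheap widgets']
```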
|If visitors reaching your page through a query string spend considerable time at your site, then Google would regard it as an indication of relevancy and would increase the weighting factor of the string in question for your site. Conversely, if the visitors spend little or no time, then Google would suspect that the association between the query string and your page is weak, that the relevancy is low, and accordingly will decrease its weight. |
I don't think that would be the case, as there is no way for Google to know the amount of time a user spends on any particular site, unless of course that site is running the new Google Analytics code or using AdSense.
It would be pretty foolish for google to add a weighting factor to their algorithm that they could only determine for a fraction of their index.
>It would be pretty foolish for google to add a weighting factor to their algorithm that they could only determine for a fraction of their index
Really? Are you talking about what used to be the old, well-reputed search facility some university guys set up, or are you talking about the exceptionally large business that seeks only one thing, as all business entities do ... profits?
|... because it tells google who owns what domain (better than WHOIS data ever could) and therefore who is cross-linking all their sites, so that they can take this into account when assigning values to links... |
I have 12 domains in my sitemap account. My company owns 3 of them, the rest are our clients.
Data which was available earlier today has been removed again. Either something's wrong or the server has taken a severe battering (it is sluggish now).
It seems to be hit or miss on whether data displays for me. More often I get "Data Not Available" than any actual data.
Our huge site is part of a much larger corporate site, so I uploaded the blank HTML file for verification to [domain.com...]
The sitemap statistics tool will not let me access query stats, crawl stats, or page analysis. The error message reads:
"If you verify at [domain.com...] we will add it to the Site Overview page and show you errors with URLs for the entire site, as well as a greater variety of site statistics "
But I don't want site statistics for [domain.com...]; I want them for [domain.com...]
Google treats www.domain.com and www.domain.com/site/ differently in the index, and I actually do not have access to the main corporate directory. Very frustrating. And I was so hopeful.
We haven't submitted a sitemap because we get crawled just fine. Does anyone know if the results are better if you do have a Google sitemap?
Is anyone experiencing this same problem? Is there a workaround?
I sent Google a note asking how to get my site stats, but I have little hope I will get more than an autoresponse.