System: The following message was spliced on to this thread from: http://www.webmasterworld.com/google/3450399.htm [webmasterworld.com] by engine - 3:35 pm on Sep. 14, 2007 (utc +1)
I just saw GWT has a new look. Navigation on the left and more streamlined. Easier to find stuff.
The "What Googlebot sees" shows no new info, it's just more organized.
Looks kinda nice.
I like blue.
GWT is cool, no matter what anyone says.
A great tool to see what was happening on your site two to four weeks ago. No, I mean it, I'm kinda thankful to Google.
Not that it had made any of the sites rank better, but at least I can learn a lot about *marketing* mistakes... typos, bad wording choices, ... other markets shooting for the same phrases...
It's a great SEM tool.
I also noticed a "Subscriber Stats" option which I don't think I've seen before unless I missed it. I don't do any feeds at present so that's possible.
"If your site publishes feeds of its content, this page will display the number of users who have subscribed to these feeds using Google products such as iGoogle, Google Reader, or Orkut."
The "PR distribution" tab seems to have been removed.
The subscriber stats thing seems very useful! I use Webmaster Tools on my blogs, and it immediately told me how many subscribers each had, which is something Blogger doesn't do at all.
Full of Bugs.
Google shows more than 200 URLs as 404 not found for one of my domains.
When I looked into my access_log for that day, there was not a single 404 error. Seems like they have a new problem to fix. The affected domain has already lost its index.html from Google's index because of these bogus 404 reports.
Additionally, they mix up URLs that were not found with entirely different domains. What a mess!
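Cross-checking Webmaster Tools' error report against the server log is easy to do. A minimal sketch, assuming a standard common/combined-format access_log (the file name and format are assumptions, not from this thread):

```python
import re

def count_404s(log_path):
    """Count 404 responses in a common/combined-format access log."""
    # In these formats the status code is the field right after the
    # quoted request line, e.g.: "GET /page HTTP/1.1" 404 0
    status_re = re.compile(r'" (\d{3}) ')
    count = 0
    with open(log_path) as f:
        for line in f:
            m = status_re.search(line)
            if m and m.group(1) == "404":
                count += 1
    return count
```

If this returns zero for the day in question while Webmaster Tools reports hundreds of 404s, the discrepancy is on Google's side.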
Anyone seeing this? Why do they release a buggy tool?
[edited by: SEOPTI at 12:12 am (utc) on Sep. 15, 2007]
> The "PR distribution" tab seems to have been removed.
No, it is under crawling statistics.
> Seems like they have a new problem to fix.
In a way, yes, but this is not new. The "Problem statistics", particularly the 404s, seem to oscillate around certain values for no obvious reason. I'd suspect this has to do with broken links on crappy external websites, but I'm not sure.
In this respect I observed another thing, which I never reported to Google: in places I use very long URLs. The anchor text of those long problematic URLs gets abbreviated with dots, because otherwise it would not fit into that one-line scheme. But in some cases, clicking on such a link leads my browser to a URL with those abbreviation dots in the middle, and that URL of course does not exist. It could also be that Google has a problem distinguishing those abbreviated anchor texts from the true URLs; somewhere in their databases the two seem to get mixed up. Seems quite likely to me, but I cannot imagine that no one at the Plex ever noticed this before, though finding the cause would be another matter ;)
BTW: I also noticed quite a number of Googlebot requests in my logfiles for directories instead of full .html URLs, and some of these were also reported in Webmaster Central. Many of my directories did NOT contain an index file, because I thought my internal link structure didn't need any. But you never know what other scraper sites do when linking to you, so a few weeks ago I wrote a little PHP file which automatically creates such index files (with a noindex but follow meta tag) in order to help Googlebot crawl my site properly. There was a WebmasterWorld thread where GoogleGuy or some other insider was complaining about the rotten syntactical state and link schemes of half the web, which makes it almost impossible to write a 100% accurate crawler. Adding those index files has diminished those error reports considerably.
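The poster's actual PHP script isn't shown; a rough Python equivalent of the idea (file names, markup, and the link-listing behavior are my own assumptions) might look like this:

```python
import os

# Placeholder page: kept out of the index (noindex) but with
# followable links so the crawler can still reach the real pages.
PLACEHOLDER = """<html><head>
<meta name="robots" content="noindex,follow">
<title>Index</title>
</head><body>
{links}
</body></html>
"""

def create_index_files(root):
    """Drop a noindex,follow index.html into every directory lacking one."""
    created = []
    for dirpath, dirnames, filenames in os.walk(root):
        if "index.html" not in filenames:
            links = "\n".join(
                '<a href="%s">%s</a>' % (name, name)
                for name in sorted(filenames) if name.endswith(".html")
            )
            with open(os.path.join(dirpath, "index.html"), "w") as f:
                f.write(PLACEHOLDER.format(links=links))
            created.append(dirpath)
    return created
```

Run once over the document root and any request for a bare directory gets a harmless placeholder instead of a 404, without polluting the index.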
Regarding the 404s: some of this data may be weeks old. I've noticed that myself.
Bewenched, if the data is old, why does it affect current URLs?
Really strange, this is a serious bug.
I agree ... it is a bug. What's really scary is that if such public functionality is so seriously buggy (talking about Webmaster Tools), then just how buggy is their search algo?
[edited by: Bewenched at 1:22 pm (utc) on Sep. 17, 2007]
> Why do they release a buggy tool?
I believe it's partly because of the schedule (every project has its time frames). Maybe Google could do more testing, but it looks like webmasters (real users) do this job better than QA. Eventually they will fix the bugs. I hope :)
Google has a lot of other free tools/services to test besides this one. I'm glad they add features from time to time, and I would prefer a slightly buggy tool to nothing.
PS: I haven't seen a free tool which is not buggy :-) Maybe I use too few. As the proverb goes: if a program has no bugs, it is useless.
One more bug: looks like sometimes they use the anchor text instead of the URL. For example, on some forums long URLs are shortened, so the URL is http://example.com/blah/blah/blah/testpage.html but the anchor text becomes http://example.com/blah...blah/testpage.html. For some reason Google then tries to fetch http://example.com/blah...blah/testpage.html on the site, producing "not found" errors.
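Until Google fixes this, the bogus entries are easy to screen out of an exported 404 list, since the abbreviation dots only ever appear in the path. A small sketch (the export format and the "..." heuristic are assumptions):

```python
from urllib.parse import urlsplit

def looks_like_truncated_anchor(url):
    """True if the URL path contains '...', the tell-tale sign of an
    abbreviated anchor text mistaken for a real URL."""
    return "..." in urlsplit(url).path

def filter_bogus_404s(urls):
    """Keep only the 404 reports that could be real URLs."""
    return [u for u in urls if not looks_like_truncated_anchor(u)]
```

Note this heuristic would also drop any legitimate URL that really contains three consecutive dots, which is rare but possible.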
joelgreen, yes, this is what I meant. Sorry for using so many words myself ;)
I liked the old webmaster tools better. But it is probably a matter of getting used to the new look and feel.