Forum Moderators: Robert Charlton & goodroi
But there's still a lot of anxiety out there among everyday folks about how to implement and manage a website. It kinda reminds me of the days before Microsoft brought in Windows, when people were struggling with a complex command language to do their tasks.
It would be good to see a more systematic method of identifying key quality-control elements from a site owner's perspective, and a better way of presenting them than the way things are currently done.
Maybe a pass/fail checklist.
Some of this is well on the way - [ hats off ] - but some of it is still way off, so isn't this the time to put your 2 cents' worth in and give some valuable feedback, perhaps flagging some urgent priority areas?
For me the critical areas for systematic analysis are:
-Site Architecture [ big improvements here recently ]
-IBLs (inbound links) [ better analysis to ascertain quality ]
-Duplicate content [ and the various types ]
-Banning
-Penalties
Maybe some comments on the effects of each error would be good, so that webmasters can plan for the implications of these issues.
What would you like to see happen to support you better?
Dixon.
So reporting comprehensively on bans, penalties, etc. at this early juncture is just so important, in my view...
Errrr..... I think you missed the fact that this would be a VERY useful tool for the spammers...... more useful to spammers than to the majority of webmasters.
The majority of competent webmasters should be able to build a website and see it appear in Google and the other engines in due course.
And if they can't, most likely they ain't as competent as they like to think.
I'd rather Google just fixed their f*****g spider to stop it hammering my servers with 30,000 hits on a 1,000-page site..... (crawl delay would be nice! - see the robots.txt lines at the end of this post)
Oh, and if they'd just list the pages like other engines do - I mean, what's all this supplemental nonsense? (Don't answer!)
I don't feel a need to watch Google's attempts to get webmasters to eat, live and breathe Google - because no matter how much time you spend at Webmaster Central (or whatever they call it this week), it's far more productive to move on, look at the wider internet picture and just wait for everything to fall into place - works fine for me.
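For the record, here's the sort of thing I mean - a rough sketch of a robots.txt with a Crawl-delay line, a directive some other spiders honour but which Googlebot ignores (hence the gripe); the 10-second value is just an example:

User-agent: *
# Ask compliant crawlers to wait 10 seconds between requests.
# Note: Googlebot does not honour Crawl-delay, which is exactly the problem.
Crawl-delay: 10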
[edited by: Tomseys at 2:26 pm (utc) on Aug. 31, 2006]
I posted in this locked thread about the +Tools dropdown not working in Firefox.
[webmasterworld.com...]
Today, it is working! Without changing anything at my end.
[edited by: Angonasec at 10:28 am (utc) on Sep. 1, 2006]
I am just hoping that this is almost live, because the language on the screen has changed.
One hopes it's not a one-off update before it slides into stasis for the next few months or so.
Please don't be like that other pseudo search engine, with toolbar-generated rankings that no one pays any attention to :-)
Something is happening indeed! When I get back into the office on Tuesday (on vacation today), I'm going to do a blog post that explains the details. We just refreshed the statistics and have changed the way we aggregate them. Previously, the data was averaged over three weeks. Now, the data is averaged over one week. This more granular data should be more useful overall and will have a faster refresh.
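To illustrate the idea only - this is a toy sketch, not the actual pipeline, and the numbers are made up - the change amounts to averaging the last seven daily data points instead of the last twenty-one:

# Toy sketch: hypothetical daily crawl-stat counts, not real Google data.
def rolling_average(daily_counts, window_days=7):
    recent = daily_counts[-window_days:]  # keep only the newest window
    return sum(recent) / len(recent)

daily_hits = [1200, 1150, 1300, 900, 950, 1000, 1100,    # week 1 (hypothetical)
              1250, 1180, 1220, 980, 1010, 1050, 1120,   # week 2
              1400, 1350, 1500, 1100, 1150, 1200, 1300]  # week 3
print(rolling_average(daily_hits, 7))   # new: one-week average
print(rolling_average(daily_hits, 21))  # old: three-week average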
And great, Angonasec. I was going to ask you if that issue had been fixed. Thanks for reporting back.
More than one week old...
Are we back to the 4 refreshes per year that we're granted for PR, backlinks, the related-sites command, and the rest...
Any moment now, someone at Google will push the big red Webmaster Tools statistics refresh button... any moment now...