But there's still a lot of anxiety among everyday folks about how to implement and manage a website. It kinda reminds me of the days before Microsoft introduced Windows, when people were struggling with complex command syntax to do their tasks.
It would be good to see a more systematic method of identifying key quality control elements from a site owner's perspective, and a better way of presenting them than how things are currently done.
Maybe a pass/fail checklist.
Some of this is well on the way [ hats off! ], but some of it is still a long way off, so isn't this the time to put in your 2 cents' worth and give some valuable feedback, perhaps flagging some urgent priority areas?
For me the critical areas for systematic analysis are:
- Site Architecture [ big improvements here recently ]
- IBLs (inbound links) [ better analysis to ascertain quality ]
- Duplicate content [ and its various types ]
- Banning
- Penalties
Maybe some comments on the effects of each error would be good, so that webmasters can plan for the implications of these issues.
What would you like to see happening to support you better?
Google knows (a) what links I have on my site, and (b) what status those URLs return when Googlebot tries them. It would be nice if Google saved me the trouble of checking all those links myself by providing a list somewhere in sitemaps.
I think this would be a great help to webmasters. I've lost count of the number of times I've tried to follow a link from a site and the target isn't there any more.
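In the meantime, rolling a basic check yourself isn't hard. Here's a rough Python sketch of the idea (the urls.txt input file is a made-up example, and treating anything other than a 200 as broken is my assumption, not how Google classifies things):

```python
# Minimal link-checker sketch: HEAD each URL and report anything that
# doesn't come back 200. "urls.txt" (one URL per line) is hypothetical.
import urllib.error
import urllib.request

def status_of(url):
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as e:
        return e.code      # e.g. 404 or 500
    except urllib.error.URLError:
        return None        # DNS failure, timeout, connection refused

with open("urls.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        code = status_of(url)
        if code != 200:
            print(f"{code or 'unreachable'}\t{url}")
```

Some servers mishandle HEAD requests, so a fallback GET would make this more robust, but it covers the common case.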
Here are two thoughts that come to mind:
It would also be nice to be able to clear, filter or mark errors as acknowledged or corrected so we no longer have to look at them.
Vanessa, I suppose it would be too much to ask for a button that will update our visible PageRank. <G> :)
My suggestion?
Would be great if we could tell Google to index *ONLY* the URLs that are in the sitemap file. Why? Because some people try to generate duplicate content by adding extra parameters to my URLs.
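While waiting for an index-only-the-sitemap switch, one server-side workaround is to 301-redirect any request whose query string carries parameters you never use back to the clean URL. A rough Python sketch of just the canonicalization step (the ALLOWED_PARAMS whitelist and the example URL are invented for illustration):

```python
# Sketch: strip unrecognized query parameters so parameter-stuffed
# duplicates collapse back to one canonical URL. The whitelist below
# is a made-up example of the parameters a site actually uses.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

ALLOWED_PARAMS = {"page", "sort"}

def canonicalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in ALLOWED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(canonicalize("http://example.com/widgets?page=2&ref=scraper"))
# -> http://example.com/widgets?page=2
```

If the canonical form differs from what was requested, answer with a 301 to it; crawlers then consolidate on the clean URL.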
One odd thing I noticed with one of my sites that is missing from the index: it was last crawled Dec 31, 1969.
Edit: Forgot to mention that my actual rankings changed about three weeks ago.
1. configurable, like in Analytics
2. or weekly
Too much happens in even one day for a three-week average to be worth anything to me, anyway.
Probably good for a giant website, but would they actually be interested in this tool, given all the professional tools and personnel they might already have?
This tool must surely be for one-man or very small operations.
Let us see on which page a link exists to a non-existent page (404).
I'm with ya on this one.
My site keeps showing errors for a few pages that are dynamic, and the querystrings are broken all to heck... It would be nice to see where Google got those links, since they aren't in the sitemaps and links like that don't exist on my site.
Where did those links come from? Some scraper's website?
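You can usually answer that from your own access logs: the combined log format records the Referer on every hit, so grouping 404s by referer shows which page carries each dead link. A rough Python sketch, assuming a standard Apache/Nginx combined-format file named access.log:

```python
# Sketch: list every URL that returned 404 along with the referers
# that linked to it. "access.log" and the combined format are assumptions.
import re
from collections import defaultdict

LINE = re.compile(r'"(?:GET|HEAD) (\S+) [^"]*" (\d{3}) \S+ "([^"]*)"')

referers = defaultdict(set)
with open("access.log") as f:
    for line in f:
        m = LINE.search(line)
        if m and m.group(2) == "404":
            referers[m.group(1)].add(m.group(3))

for missing, sources in referers.items():
    print(missing)
    for src in sorted(sources):
        print("  linked from:", src or "(no referer)")
```

A referer on a domain you've never linked to is a good sign of a scraper.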
We are working to make the statistics as useful as possible. We've had a lot of feedback about that feature, so definitely let me know what your "dream" statistics would be.
Tomseys, does this happen with all of the index stats links? I can't replicate this in IE6, but I'll have the team look into it more on Monday.
We're working to get the display issue that's causing some sites to show 1969 dates fixed as quickly as possible. It would be pretty cool if we had actually mastered time travel -- I could make use of that in my personal life.
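For anyone wondering where 1969 comes from: it looks like a classic unset Unix timestamp. Time zero is Jan 1, 1970 UTC, so a missing crawl date stored as 0 and rendered in a US timezone displays as the evening of Dec 31, 1969. A quick illustration (the Pacific offset is just an example):

```python
from datetime import datetime, timedelta, timezone

# Unix time 0 is Jan 1, 1970 UTC; shifted to Pacific time (UTC-8)
# it formats as Dec 31, 1969: the date some sites were seeing.
pacific = timezone(timedelta(hours=-8))
print(datetime.fromtimestamp(0, tz=pacific).strftime("%b %d, %Y"))
# -> Dec 31, 1969
```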
[webmasterworld.com...]
I'd like to see Google develop a way to tell sites why they've been algorithmically downranked. If necessary this could be done in general terms to avoid giving away secret sauces.
Ironically, there is now better "service" via reinclusion requests for penalized sites and highly manipulative people than for sites that are downranked for secret algorithmic reasons (but still in the index) and are really trying to stay within the guidelines.
Joeduck / Seocritique - Absolutely!
It brings tears to my eyes when I see very capable webmasters taking years to crack certain issues. Whole organisations that depend on information distribution can be brought to their knees.
The implications of these failures are huge, but on the flip side the opportunity for efficiency is great.
Then there are developers building widely used internet content applications that don't coexist with the SERPs [ like g1smd's thorough coverage of a key BB software flaw ].
This whole process is fundamental to aligning good quality control right across the internet. It gives the internet a basis for operating in an ordered fashion.
And heck, when you have thousands of motivated webmasters doing the right thing [ eventually millions ], spam can be a thing of the past with their cooperation [ my Utopia! ].
So reporting comprehensively on bans, penalties, etc. at this early juncture is just so important, in my view, and although things can't be turned on overnight, it would be good to share in some of the plans that can make that happen.
"No pages from your site are currently included in Google's index due to violations of the webmaster guidelines. Please review our webmaster guidelines and modify your site so that it meets those guidelines. Once your site meets our guidelines, you can request reinclusion and we'll evaluate your site."
I don't know how this could be. The site is a simple blog. There are no tricks or anything. I wonder if my other sites that fell out of the index will show this.
I don't know what I could possibly change.
Vanessafox, could you lend any advice?
Anyone else seeing this?
I will say that it's cool that at least they're communicating what the deal is. Thank you for that.
This is a very good step in the right direction though. At least I know something is wrong.
One weird thing: my Google account keeps changing to Portuguese.
Edit: It's the Sitemaps part of the account that stays in Portuguese. I can't get it to switch back to English.
I recently added and verified a site with a Portuguese name about Brazil. Could that have set something off?
Maybe define what "last crawl date" means. Which Googlebot is it from: the AdSense bot, the supplemental bot, or the index bot?
I find it misleading that Googlebot hits my site hundreds of times a day yet my "last crawl date" was two weeks ago. Let me know which bot came two weeks ago.
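Until the report says which crawler it means, your own logs can fill the gap. A rough Python sketch that keeps the most recent hit per Google user-agent (the log path and combined format are assumptions; the UA substrings are the publicly documented ones):

```python
# Sketch: find the latest access-log entry for each Google crawler.
# Assumes "access.log" in combined format with chronological lines.
import re

BOTS = ["Googlebot/", "Mediapartners-Google", "Googlebot-Image"]
TIMESTAMP = re.compile(r'\[([^\]]+)\]')

last_seen = {}
with open("access.log") as f:
    for line in f:
        for bot in BOTS:
            if bot in line:
                m = TIMESTAMP.search(line)
                if m:
                    last_seen[bot] = m.group(1)  # later lines overwrite

for bot, when in sorted(last_seen.items()):
    print(f"{bot:25s} last seen {when}")
```

Mediapartners-Google is the AdSense crawler; comparing its timestamp against Googlebot's should show whether the two are being conflated in the report.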