Hi reseller (and everyone!),
I'm here and reading what everyone has to say (and taking lots of notes about what you would like). I'll do my best to answer questions, although I can't promise snazzy videos like Matt put together. :)
Matt and I think Google Sitemaps is a good vehicle for webmaster communication because it's a scalable way to get input from you and give information back to you. You know how we like scalable things at Google.
There's definitely a lot more we can do regarding communication, much of which has been touched on in this thread. I know that with the name "Google Sitemaps", it may sound like the product is about creating and submitting Sitemaps, and these other communication features are an afterthought. But in reality, our whole team is looking at better ways to communicate with webmasters, and creating and submitting Sitemaps is just one small piece of that overall communication.
We want this product to include all the tools and diagnostics you need to learn everything we have available about your site's crawling and indexing, as well as include ways for you to provide input to us (beyond submitting Sitemaps and filing reinclusion requests). So keep your ideas coming...
[edited by: tedster at 5:37 am (utc) on Aug. 2, 2006]
"If www.xyz.com and xyz.com point to the same site, you can tell us here how you want URLs to display in our index."
I hadn't noticed that feature before, and though I'm not sure exactly what it does, I like it.
I find it interesting that it has been added to the Sitemaps console. You could read into it that Google needs help in this area still (which we sort of know, but Google has never really admitted the problems it has had with non-www versus www).
Well I have selected the www option and we will see what happens.
There is also a crawl rate button that goes nowhere from the preferred domain page.
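For anyone who would rather settle the www/non-www question at the server instead of (or in addition to) the console setting, here is a minimal sketch. It assumes an Apache server with mod_rewrite enabled, and example.com is a placeholder for your own domain:

```apache
# Sketch: 301-redirect the bare domain to the www version so crawlers
# only ever see one canonical host. Goes in the site's .htaccess.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The preferred-domain setting only tells Google which version to display; a server-side 301 removes the ambiguity for every crawler.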
"I find it interesting that it has been added to the Sitemaps console..."
IMO, that option has been added recently as a gift from the kind Googlers to you personally. After they got tired of your messages during the last year: Listen Google... etc.. :-)
New Sitemaps Motto:
Tired? Depressed? Check Your Sitemaps :-)
>could read into it that Google needs help in this area still...
Exactly. And I have the feeling that this help is (or will be) well appreciated at Google, though we shouldn't expect anyone to admit that any problems exist. I suspect the crawling/indexing algorithm has evolved into a very complex program, with permanent quick crawls and permanent indexing as opposed to the regular monthly crawling/indexing of a few years ago. It seems as if Googlebot is not only swallowing the stuff, but has a very long tail ready to pre-digest it.
We all know how difficult it is to find errors beyond 10k of code, and I think this program is a lot more complex. It seems as if the program is working quite fine on a broad level (except for some wiped-out babies..;) but that there is nevertheless a serious glitch in the handling of very basic status codes (301, 302, 403, 404...), resulting in a lot of wasted disk space and some of the other problems reported here. If our feedback on the Sitemaps reports can help to solve these problems, we are happy to participate. The more specific the feedback sent to Google (either informally here or via the tools reseller mentioned), the sooner the reported inconsistencies will be gone.
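To illustrate the kind of basic status-code handling being discussed, here is a small sketch in Python. The fetch data is made up, and the logic is not Google's actual algorithm, just the textbook behavior a crawler is expected to follow for 200/301/302/403/404:

```python
# Hypothetical fetch results: url -> (HTTP status, Location header or None).
# 301 = permanently moved: a crawler should index the target and drop the
# old URL. 302 = temporarily moved: the original URL should be kept.
# 403/404 = the URL yields no indexable content.
FETCHES = {
    "http://example.com/":     (301, "http://www.example.com/"),
    "http://www.example.com/": (200, None),
    "http://example.com/old":  (404, None),
}

def canonical_url(url, fetches, max_hops=5):
    """Follow permanent (301) redirects to the URL a crawler should index.

    Returns None if the chain ends in an error status or runs too long.
    """
    for _ in range(max_hops):
        status, location = fetches.get(url, (404, None))
        if status == 200:
            return url
        if status == 301 and location:
            url = location          # permanent move: chase the target
            continue
        if status == 302:
            return url              # temporary move: keep the original
        return None                 # 403/404/anything else: not indexable
    return None                     # redirect loop or overly long chain

print(canonical_url("http://example.com/", FETCHES))
# -> http://www.example.com/
```

Mishandling any branch of that mapping (say, treating a 302 like a 301, or keeping 404'd URLs around) is exactly how you end up with stale pages wasting disk space in the index.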
>Well I have selected the www option and we will see what happens.
>There is also a crawl rate button that goes nowhere from the preferred domain page.
Same for me so far.
What the heck?! They actually listened! Less than 24 hours after the suggestion, they make googlewebmastercentral.blogspot.com! Now to see if they actually do the rest of the suggestions...
>>>that allowed comments
They're not doing very well at letting us communicate with them, that's for sure. Yahoo's better than them there.
>>>where you give us updates on fixes to search results, which would include, for example, the sub-domain spam stuff, and explain what a 'bad data push' is, stuff we're all interested in but Matt Cutts won't even talk about.
*yawn* So far they've been just like Matt Cutts, not telling us about stuff we're really interested in, like where they're at with fixing the current search results screw-ups.
[edited by: Jesse_Smith at 5:02 pm (utc) on Aug. 7, 2006]
The pages are still supplemental, but with caches from 2006 instead of 2005. And instead of about 300 pages showing up in the site: command, we have over 3000.
It would be better for Google to mix it up by exchanging old and new algos monthly, giving us all a fair share. This 27th June and 27th July algo is just dreadful, and I am using Yahoo for a while until results are better.
It would be nice to receive a 'We are experiencing problems at present' for once, instead of 'You are .....'.
I, for one, am just glad that there is always Ebay to fall back on for business these days.
All the best
I recommend putting guidelines in place for site owners to qualify, so that the whiners would be weeded out. Accordingly, I stand by my post at Matt's blog on May 1:
"Great to hear that Google is becoming more proactive in notifying legitimate sites about problems. However, this still remains a one-way street. I would like to suggest a way to complete the trip, making it two-way communication.
How about allowing legitimate site owners to petition Google about why their site has plummeted in the rankings for an extended period of time? I am not talking about the whiners who complain about everything, nor am I talking about short-term deterioration in the SERPs.
I am speaking about site owners who formerly were on the first page of the SERPS for at least six months who have fallen off that pedestal for at least three months. Perhaps something was done outside of their knowledge that resulted in a penalty.
That way the legitimate site owners have a solid forum to get the problem fixed. Anyone that has gone through this agony for more than 3 months deserves feedback from Google. As you can guess, my site is still going through this agony.
Thanks again for enlightening us. Keep the window shade open."
(Note: My deterioration is 30 positions in the serps and the clock is now at 8 months and ticking. Only now am I finding from my old programmer who knows SEO how badly the contract programmers screwed things up. I am sure other legitimate site owners out there can feel my pain.)
I don't mean to rag on you, I just don't see how your idea is practicable.
The Tools would be great if they correctly showed useful information: specifically, why a site with decent PageRank is being penalized or filtered in the SERPs. If honest webmasters had this information, I think you would be surprised how many of us would make a serious effort to clear up the issues that are causing these penalties and filters. This rampant speculation of why this and why that is happening, and all the testing of “theories”, is what gets us upset and in many cases creates many of the black-hat tactics used to get our pages reinstated. Put tools in place that provide us with information we can use to fix problems with our sites. Obviously Google knows very clearly why a penalty or filter is being generated; why can’t Google provide us with this information so we can fix the problem?
Here are the answers to your questions:
"Who gets to determine what a legitimate site owner is" = Google
"by what criteria?" = The cumulative track record of the site over at least one year, preferably several years. Google has this data.
"...able to count on sitting in position ... forever" = Heck, of course not. I love competition. If someone takes over my spot legitimately, I work even harder to get back in front of him. What is not fair is to be penalized for something totally out of my control and not know why I am being penalized.
"What about other legitimate site owners who come along and want to rank for the same term? Will they be SOL no matter how high quality their pages might be?" = No. Again, that is what competition is all about, and what relevancy is all about. Web sites' and SEs' mission is to look out for the end user, the visitor. If a legitimate site owner has a quality site, he will rise in the rankings on merit, not on spam, or because someone did him in, or because some outside force out of his control negatively impacted him.
My idea is not only practicable but also badly needed. Why should legitimate site owners who ranked well in G for years and then suffered a sudden drop in the SERPs not be told why the drop happened? G is the doctor and the legitimate site owner is the patient. If my health drops suddenly even though I did nothing out of my normal routine, I want the doctor to examine me, see what caused the problem, and tell me how to rectify it.
I'd like to turn the throttle wide open. I'll even throw in a bottle of NOS to give the bot a thrill ride, if it will just please spider the pages we've had set up in Sitemaps since November of last year.
" Yes, Massive updates of Supplementals by the looks of it underway....... "
This seems to have stopped, and in some cases reverted back. I just checked some DCs.
"as retrieved on 19 Aug 2005"
So much for March 2006.
They never said HOW long it would take to fix it. I wish someone would buy Google a big magnet. Take the big magnet and go over to the server farm that has all of the old pages from 2003, 2004, and 2005. Delete all the old junk; then, when their search engine takes a dump, they can't revert back to old crap. Heck, returning blank pages would be better than stuff that was 404'd two years ago. Then again, whenever I want to get nostalgic and relive the past, I just google.
The Tools would be great if they correctly showed useful information: specifically, why a site with decent PageRank is being penalized or filtered in the SERPs.
We have had what is obviously a penalty slapped on our site since April 26. Since then we picked over our site again and again, trying to discover WHAT CAUSED THE PENALTY.
What can we do to get this information from Google? We want to make our site better for everyone, why not let us know what tripped the wire?
Reinclusion requests are ignored as we have not actually disappeared from the index - it is just that no page is now placed higher than about page 4 or 5, even when searching on long, unique text strings.
If Vanessa, Adam or anyone in Google could please take the time to send me a sticky, my faith in Google would be restored. It has been impossible to get anything more than an automaton reply for over 3 months now.
"Reinclusion requests are ignored as we have not actually disappeared from the index - it is just that no page is now placed higher than about page 4 or 5, even when searching on long, unique text strings."
IMO, you can file a reinclusion request even though your site is still in the index. You just write that you have cleaned your site of .... and of ..., and you assure Google that it will never ever happen again in your physical life :-)
However, if your site has been hit by the algos, then no reinclusion request on this planet would help, unfortunately.
I understand, but as of today almost ALL DCs have reverted back and show 2005 results. I just did a check and all but 1 showed current results, and I expect that one to disappear soon.
#1. Google is just losing it. Life as a search engine has gotten the best of them, and their technology has not caught up with what they are attempting with the search algo.
#2. G is just trying to spread the wealth and give the other webmasters a chance to be on top. After all, who says once you work your way up you should be king forever? Is that some kind of right? Nooooo, so sorry, Charlie. Lots of spam sites are not run by spam kings but by little guys (junior AdSense spammers!) who can really use the spike in cash flow.
#3. G is just trying to encourage you to use AdWords to keep yourselves on a much more even keel with your FOR PROFIT website. After all, why should you get a FREE ride? That will be reserved for non-commercial sites from now on.