| 10:53 am on Aug 5, 2006 (gmt 0)|
Personally, I foresee Google Sitemaps being a must-have in any webmaster's toolbox. As reseller mentioned, I'm already seeing major improvements in terms of site-problem reporting. In fact, just now I found a 404 page I accidentally created, thanks to Sitemaps.
"If www.xyz.com and xyz.com point to the same site, you can tell us here how you want URLs to display in our index."
I hadn't noticed that feature before, and though I'm not sure exactly what it does, I like it.
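For what it's worth, the host canonicalization this option seems to describe is the same rewrite you can enforce yourself with a 301 redirect. A rough sketch in Python of the rewrite involved (the `prefer_www` helper and the `xyz.com` domain are just made up for illustration, following the example in the quote):

```python
from urllib.parse import urlparse, urlunparse

# Hypothetical helper: map any URL on the bare domain onto the
# preferred "www" host, the same canonicalization the new Sitemaps
# option asks Google to apply in its index.
def prefer_www(url, domain="xyz.com"):
    parts = urlparse(url)
    if parts.netloc.lower() == domain:      # bare domain -> add www
        parts = parts._replace(netloc="www." + domain)
    return urlunparse(parts)

print(prefer_www("http://xyz.com/page.html"))
# -> http://www.xyz.com/page.html
```

On a live site you'd do the equivalent with a server-side 301 so visitors and bots see only one host.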
| 10:56 am on Aug 5, 2006 (gmt 0)|
Not sure, arubicus.
I find it interesting that it has been added to the Sitemaps console. You could read into it that Google still needs help in this area (which we sort of know, but Google never really admits the problems they have had with non-www versus www).
Well I have selected the www option and we will see what happens.
There is also a crawl-rate button on the preferred-domain page that goes nowhere.
| 12:28 pm on Aug 5, 2006 (gmt 0)|
"I find it interesting that it has been added to the Sitemaps console..."
IMO, that option was added recently as a gift from the kind Googlers to you personally, after they got tired of your messages over the last year: Listen, Google... etc. :-)
New Sitemaps Motto:
Tired? Depressed? Check Your Sitemaps :-)
| 3:45 pm on Aug 5, 2006 (gmt 0)|
> I haven't noticed that feature before...
Yes, it's brand new as of today.
>could read into it that Google needs help in this area still...
Exactly. And I have the feeling that this help is (or will be) well appreciated at Google, though we shouldn't expect anyone to admit that any problems exist. I suspect the crawling/indexing algorithm has evolved into a very complex program, with permanent quick crawls and permanent indexing as opposed to the regular monthly crawling/indexing of a few years ago. It seems as if Googlebot is not only swallowing the stuff but has a very long tail ready to pre-digest it.
We all know how difficult it is to find errors beyond 10k of code, and I think this program is a lot more complex. It seems to be working quite well on a broad level (except for some wiped-out babies... ;) but there is nevertheless a serious glitch in the handling of very basic status codes (301, 302, 403, 404...), resulting in a lot of wasted disk space and some of the other problems reported here. If our feedback on the Sitemaps reports can help solve these problems, we are happy to participate. The more specific the feedback sent to Google (either informally here or via the tools reseller mentioned), the sooner the reported inconsistencies will be gone.
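To make the point concrete, here is a rough sketch of how a crawler ought to treat those basic status codes. The mapping is purely my own assumption about what sensible handling would look like, not Google's documented behaviour:

```python
# My own guess at sensible crawler handling of the basic status
# codes mentioned above -- NOT Google's documented behaviour.
def crawler_action(status):
    if status == 200:
        return "index the page normally"
    if status == 301:
        return "index the redirect target, drop the old URL"
    if status == 302:
        return "keep the original URL, fetch content from the target"
    if status in (403, 404):
        return "drop the URL from the index"
    return "leave the index entry alone, retry later"

for code in (200, 301, 302, 403, 404):
    print(code, "->", crawler_action(code))
```

The wasted disk space complained about above is what you'd expect if the 403/404 branch, in particular, never actually fires.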
>Well I have selected the www option and we will see what happens.
>There is also a crawl rate button that goes nowhere from the preferred domain page.
Same for me so far.
| 4:24 pm on Aug 5, 2006 (gmt 0)|
Count me among the impressed. I put my sitemap back into my root domain, submitted it to Google Sitemaps, and was verified (something I had not been able to do over repeated tries during the past six months), all within something like two minutes.
| 5:00 pm on Aug 7, 2006 (gmt 0)|
>>>maybe make a blog
What the heck?! They actually listened! Less than 24 hours after the suggestion, they made googlewebmastercentral.blogspot.com! Now to see if they actually act on the rest of the suggestions...
>>>that allowed comments
They're not doing very well at letting us communicate with them, that's for sure. Yahoo's better than them there.
>>>where you give up updates on fixes to search results, which would include for example the sub-domain SPAM stuff, and explain what a 'bad data push' is, stuff we're all interested in but Matt Cutts won't even talk about.
*yawn* So far they've been just like Matt Cutts, not telling us about the stuff we're really interested in, like where they're at with fixing the current search-results screw-ups.
[edited by: Jesse_Smith at 5:02 pm (utc) on Aug. 7, 2006]
| 6:30 pm on Aug 7, 2006 (gmt 0)|
Anyone else seeing their supplemental caches refreshed to much more current ones? I am wondering if what GoogleGuy was talking about is happening for our site right now.
The pages are still supplemental, but with caches from 2006 instead of 2005. And instead of about 300 pages showing up in the site: command, we have over 3,000.
| 7:23 pm on Aug 7, 2006 (gmt 0)|
On certain DCs I have no cache at all for supplementals; on others it has updated, so yes, I see some movement. Thank God.
| 7:26 pm on Aug 7, 2006 (gmt 0)|
Yes, massive updates of Supplementals underway, by the looks of it...
| 7:28 pm on Aug 7, 2006 (gmt 0)|
Just keep going! :)
| 7:37 pm on Aug 7, 2006 (gmt 0)|
A site of mine that was 99% supplemental, with only 10% of its content indexed in any way, is now only 90% supplemental with 25% of the content indexed.
Quite a change. Positive, for a change.
| 3:12 am on Aug 9, 2006 (gmt 0)|
OK, my 2 cents' worth. I don't think that GG, Adam, Vanessa, etc. know anything. The SE algos are tweaked all the time, and sometimes you do well, sometimes you do really badly, and sometimes you do even worse. I think it will always be mixed up, as a static search engine with the same top sites showing for certain keywords "forever" would be pretty boring.
It would be better for Google to mix it up by swapping old and new algos monthly, giving us all a fair share. This 27th June and 27th July algo is just dreadful, and I am using Yahoo for a while until results are better.
| 3:33 am on Aug 9, 2006 (gmt 0)|
It always irks me that when all hell breaks loose in the Google index, we're told the problem is ours and that we should refer to the Webmaster Guidelines. After however many weeks of trawling our sites for errors, we find that Google decides to change their setup again and, hey presto, all of 'our' site errors are forgiven.
It would be nice to receive a 'We are experiencing problems at present' for once, instead of 'You are .....'.
I, for one, am just glad that there is always eBay to fall back on for business these days.
All the best
| 11:01 am on Aug 9, 2006 (gmt 0)|
There is a very important piece of communication that Google could add to its dealings with webmasters: communicating with legitimate, whitehat webmasters who formerly ranked well in the SERPs but now, perhaps for reasons outside their personal control (e.g. programmers who do not know proper SEO and the pitfalls of bad redirects), have "fallen and can't get up".
I recommend putting guidelines in place for site owners to qualify, so that the whiners would be weeded out. Accordingly, I stand by my post at Matt's blog on May 1:
"Great to hear that Google is becoming more proactive in notifying legitimate sites about problems. However, this still remains a one-way street. I would like to suggest a way to complete the trip, making it two-way communication.
How about allowing legitimate site owners to petition Google about why their site has plummeted in the rankings for an extended period of time? I am not talking about the whiners who complain about everything, nor am I talking about short-term deterioration in the SERPs.
I am speaking about site owners who formerly were on the first page of the SERPS for at least six months who have fallen off that pedestal for at least three months. Perhaps something was done outside of their knowledge that resulted in a penalty.
That way legitimate site owners have a solid forum to get the problem fixed. Anyone who has gone through this agony for more than three months deserves feedback from Google. As you can guess, my site is still going through this agony.
Thanks again for enlightening us. Keep the window shade open."
(Note: My deterioration is 30 positions in the SERPs, and the clock is now at 8 months and ticking. Only now am I finding out from my old programmer, who knows SEO, how badly the contract programmers screwed things up. I am sure other legitimate site owners out there can feel my pain.)
| 2:15 pm on Aug 9, 2006 (gmt 0)|
Who gets to determine what a legitimate site owner is, and by what criteria? And for that matter, if a legitimate site owner is ranking highly for a search term, does that mean they should be able to count on sitting in that position (give or take a few spots) forever? What about other legitimate site owners who come along and want to rank for the same term? Will they be SOL no matter how high quality their pages might be?
I don't mean to rag on you, I just don't see how your idea is practicable.
| 2:22 pm on Aug 9, 2006 (gmt 0)|
and I say again to Vanessa, Googleguy and any other G employees that may be trawling these forums:
The tools would be great if they correctly showed useful information; specifically, why a site with decent PageRank is being penalized or filtered in the SERPs. If honest webmasters had this information, I think you would be surprised how many of us would make a serious effort to clear up the issues causing these penalties and filters. The rampant speculation about why this and that is happening, and all the testing of "theories", is what gets us upset, and in many cases it drives people to black-hat tactics to get their pages reinstated. Put tools in place that provide us with information we can actually use to fix problems with our sites. Obviously Google knows very clearly why a penalty or filter is being triggered; why can't Google share this information so we can fix the problem?
| 3:04 pm on Aug 9, 2006 (gmt 0)|
Here are the answers to your questions:
"Who gets to determine what a legitimate site owner is" = Google
"by what criteria?" = The cumulative track record of the site over at least one year, preferably several years. Google has this data.
"...able to count on sitting in position ... forever" = Heck, of course not. I love competition. If someone takes over my spot legitimately, I work even harder to get back in front of him. What is not fair is being penalized for something totally out of my control and not knowing why.
"What about other legitimate site owners who come along and want to rank for the same term? Will they be SOL no matter how high quality their pages might be?" = No. Again, that is what competition is all about, and what relevancy is all about. Web sites' and SEs' mission is to look out for the end user: the visitor. If a legitimate site owner has a quality site, he will rise in the rankings on merit, not on spam, and not because someone did him in or some outside force beyond his control negatively impacted him.
My idea is not only practicable but also badly needed. Why should legitimate site owners who ranked well in G for years and then suffered a sudden drop in the SERPs not be told why? G is the doctor and the legitimate site owner is the patient. If my health drops suddenly, even though I did nothing outside my normal routine, I want the doctor to examine me, find what caused the problem, and tell me how to rectify it.
| 5:51 pm on Aug 9, 2006 (gmt 0)|
I'm sorry, I still can't get past the idea of the legitimate site owner, because I think what Google might consider a legitimate site owner is very likely to be different from what you or I or a hundred other people might consider one. In a sense, you could say that's what they're doing now, and look at all the people who are unhappy with it. EVERYONE thinks their own site is legitimate, others' opinions of it notwithstanding.
Cumulative track record of what? All you need to do is look at the AdWords forum over the past month to see the huge difference of opinion and contention over what counts as a 'quality' site and a 'quality' user experience. Some people think it's sales conversions and nothing else. Some people think it's unique information. Some people think it's whether or not the site is pretty. Some are affiliates, and some hate affiliates.
| 6:08 pm on Aug 9, 2006 (gmt 0)|
" Yes, Massive updates of Supplementals by the looks of it underway....... "
This seems to have stopped and in some cases reverted. Just checked some DCs:
"as retrieved on 19 Aug 2005"
So much for March 2006.
| 8:54 pm on Aug 9, 2006 (gmt 0)|
When I logged into Sitemaps the other day there was a tease link, "Crawl Rate"; it went to a missing page. :(
I'd like to open the throttle wide. I'll even throw in a bottle of NOS to give the bot a thrill ride if it will just please spider the pages we've had set up in Sitemaps since November of last year.
| 12:56 am on Aug 10, 2006 (gmt 0)|
Well, I still don't get it; I used to rank really well. I recently joined Sitemaps. They seem to have all my pages listed (65 pages; it's just a small niche site), and they crawled my site on the 4th of August (the last crawl was way back in June). There is nothing wrong in the results for the crawl, so what's a person supposed to do? Obviously there's something wrong. I used to rank 7th or 8th; now my rank is 333.
| 3:55 am on Aug 10, 2006 (gmt 0)|
|" Yes, massive updates of Supplementals underway, by the looks of it... "|
|This seems to have stopped and in some cases reverted. Just checked some DCs:|
|"as retrieved on 19 Aug 2005"|
|So much for March 2006.|
They never said HOW long the fix would take. I wish someone would buy Google a big magnet. Take the big magnet over to the server farm that holds all of the old pages from 2003, 2004, and 2005 and delete all the old junk; then, when their search engine takes a dump, they can't revert to the old crap. Heck, returning blank pages would be better than stuff that was 404'd two years ago. Then again, whenever I want to get nostalgic and relive the past, I just google.
| 11:55 am on Aug 10, 2006 (gmt 0)|
|The tools would be great if they correctly showed useful information; specifically, why a site with decent PageRank is being penalized or filtered in the SERPs.|
We have had what is obviously a penalty slapped on our site since April 26. Since then we have picked over our site again and again, trying to discover WHAT CAUSED THE PENALTY.
What can we do to get this information from Google? We want to make our site better for everyone, why not let us know what tripped the wire?
Reinclusion requests are ignored, as we have not actually disappeared from the index; it is just that no page is now placed higher than about page 4 or 5, even when searching on long, unique text strings.
If Vanessa, Adam or anyone at Google could please take the time to send me a sticky, my faith in Google would be restored. It has been impossible to get anything more than an automated reply for over 3 months now.
| 12:36 pm on Aug 10, 2006 (gmt 0)|
"Reinclusion requests are ignored, as we have not actually disappeared from the index; it is just that no page is now placed higher than about page 4 or 5, even when searching on long, unique text strings."
IMO, you can file a reinclusion request even though your site is still in the index. You just write that you have cleaned your site of .... and of ..., and you assure Google that it will never, ever happen again in your physical life :-)
However, if your site has been hit by the algos, then no reinclusion request on this planet would help, unfortunately.
| 6:04 pm on Aug 10, 2006 (gmt 0)|
"They never said HOW long the fix would take. I wish someone would buy Google a big magnet. Take the big magnet over to the server farm that holds all of the old pages from 2003, 2004, and 2005 and delete all the old junk; then, when their search engine takes a dump, they can't revert to the old crap. Heck, returning blank pages would be better than stuff that was 404'd two years ago. Then again, whenever I want to get nostalgic and relive the past, I just google."
I understand, but as of today almost ALL DCs have reverted and show 2005 results. I just did a check, and all but one showed the old 2005 results; I expect that last one to revert soon.
| 7:47 pm on Aug 10, 2006 (gmt 0)|
I think there are in fact three possibilities for what is going on with Google.
#1. Google is just losing it. Life as a search engine has gotten the best of them, and their technology has not caught up with what they are attempting with the search algo.
#2. G is just trying to spread the wealth and give other webmasters a chance to be on top. After all, who says that once you work your way up you should be king forever? Is that some kind of right? Nooooo, so sorry, Charlie. Lots of spam sites are run not by spam kings but by little guys (junior AdSense spammers!) who can really use the spike in cash flow.
#3. G is just trying to encourage you to use AdWords to keep your FOR-PROFIT website on a much more even keel. After all, why should you get a FREE ride? That will be reserved for non-commercial sites from here on out.