
Sitemaps Renamed -- now it's Google Webmaster Central

     
4:47 pm on Aug 5, 2006 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



In a key piece of "rebranding," Google has renamed its Sitemaps program. It's now Google Webmaster Tools at Google Webmaster Central.

Just go to the old URL and see the change:

[google.com...]

or go to:

[google.com...]

8:16 am on Aug 7, 2006 (gmt 0)

10+ Year Member



There is also a UK version:

[google.co.uk...]

Should UK sites be managed here, or does it make no difference?

8:26 am on Aug 7, 2006 (gmt 0)

10+ Year Member



Whitenight

Interesting example with the NYTimes - it is hard to find examples that are OK to discuss at WebmasterWorld, but the NYTimes should be fine.

In a way it is interesting that they have not already been penalized like the majority of sites that have had this issue.

I guess a PR10 on the www and a PR7 on the non-www are enough to overcome a lot of penalties.

Lots of webmasters have been crying out for a fix for this issue for ages. Whether this Sitemaps preference results in improvements to the ranking of sites affected by the bug is the real question, in my opinion. If it is successful, then perhaps a general announcement from Google that Webmaster Central is the way to deal with such issues would be appropriate.

10:56 am on Aug 7, 2006 (gmt 0)

5+ Year Member



I went through it and I guess it is OK. I still have no intention of verifying my sites, but I can see some benefits to using it. Right now, my favorite thing to do is look at competitor stats and look over their robots.txt files. While it is basic, I can at the very least run the common operators on a competitor's site and also see the robots.txt disallowed directories all in one place.

Granted, that has always been out there, but being able to get it all from one place on Google in a couple of clicks is pretty cool.
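
(Side note: if you ever want that same robots.txt view without logging in anywhere, a few lines of Python will pull it. This is only a rough sketch; "example.com" is a placeholder, not any site discussed here.)

from urllib.request import urlopen

def disallowed_paths(domain):
    # Fetch a domain's robots.txt and return the paths listed in Disallow rules.
    with urlopen("http://" + domain + "/robots.txt") as response:
        text = response.read().decode("utf-8", errors="replace")
    rules = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments
        if line.lower().startswith("disallow:"):
            rules.append(line.split(":", 1)[1].strip())
    return rules

print(disallowed_paths("example.com"))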

11:48 am on Aug 7, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Vanessa asked:
"Angonasec, what browser are you using? I'll check into why the tools doesn't expand for you when you click the +."

I'm using FF 1.5.0.6 (the latest) on Mac OS X 10.4.7, also the latest.

It is a browser compatibility bug, because I do see the drop-down menu when I use Safari.

If only I could get a helpful reply to my emails asking why our NFP site is being punished by the algo...

11:59 am on Aug 7, 2006 (gmt 0)

5+ Year Member



I'm also using FF 1.5.0.6 on Mac OS X 10.4.7 and haven't noticed any problems. Specifically, the +Tools does expand on the My Sites page. But I've been all over the rest of the site without noticing any other problems.

12:14 pm on Aug 7, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks jay5r, that's helpful to know.

One additional factor that may help, Vanessa, is that I use the FF extensions "NoScript" and "Adblock Plus", but with both set to allow the Sitemaps pages to run whatever they like, i.e. unblocked and allowed.

Despite this, the +Tools menu does not drop down.

7:10 pm on Aug 7, 2006 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



>> Problem is these redirects don't always resolve the problem very fast. I've seen some sites penalized up to 9 months when trying to fix a www vs. non-www issue with many of the old domain pages still in cache up to 3 years later never being updated. <<

As far as I can tell, seeing old non-www URLs in the SERPs as Supplemental Results isn't usually a problem in and of itself. Google holds on to those for years. That is to be expected; but GoogleGuy has just confirmed that they are going to be updated to more recently spidered data soon.

The correct measure of whether the problem is fixed is to see how many www pages get listed. Every time that I have added a 301 redirect to a site that has never had one before, the number of www pages listed has rapidly increased, the URL-only www listings have turned into full listings, and the number of www Supplemental Results has rapidly decreased.

The non-www URL-only results have declined in number too. The number of non-www Supplemental Results has varied, sometimes staying static, sometimes falling, but often increasing by a small amount a month or two after the redirect was first implemented.

Where those non-www Supplemental Results appeared in the SERPs, the redirect on the site still manages to deliver the visitor to the correct www version of the page.

On most sites without the redirect in place, I have almost always also found poor usage of the <title> tag, the same meta description in use on multiple pages, and an incorrect link back to the root index page: always use the "http://www.domain.com/" format, omitting the index.html filename from the link itself.

Clearing up all of those issues has always helped a site get over its listing problems. Xenu Link Sleuth has often been a great help here too.

Finally, remember when you link to a folder to always include a trailing / on the URL. A link to just /folder on a page at www.domain.com could see the link pointing to www.domain.com/folder, which then gets automatically redirected to domain.com/folder/ (without the www!) to add the trailing /, and then redirected onwards to www.domain.com/folder/ to add the www back on again.

The intermediate step at domain.com/folder/ could kill your listings.
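
If you want to see whether a site has that intermediate non-www hop, it is easy enough to walk the redirect chain by hand rather than trusting the browser. A rough Python sketch follows (www.example.com and /folder are placeholders, not real sites from this thread); each printed line is one hop:

from urllib.parse import urlsplit
import http.client

def trace(url, max_hops=5):
    # Request each URL without following redirects and print every hop,
    # so an intermediate step like domain.com/folder/ shows up clearly.
    for _ in range(max_hops):
        parts = urlsplit(url)
        conn = http.client.HTTPConnection(parts.netloc, timeout=10)
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        print(resp.status, url)
        location = resp.getheader("Location")
        conn.close()
        if resp.status not in (301, 302, 307, 308) or not location:
            return
        url = location if "://" in location else "http://" + parts.netloc + location

for variant in ("http://www.example.com/folder",
                "http://example.com/folder",
                "http://www.example.com/folder/"):
    trace(variant)
    print()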

7:19 pm on Aug 7, 2006 (gmt 0)

5+ Year Member



All of that has been accounted for and the problem has still existed on some sites.

The URLs are redirected properly; the title, meta, and slashes are all correct. The only problem is what is showing in the index. All links have been checked and give the proper 301 through Xenu. This was all checked a very long time ago.

Google simply has problems, and it's obvious. Hopefully the new options fix the issue.

9:36 pm on Aug 7, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I just set up my site with Google's new control tool (Sitemaps) and I love it! Anyone know how long it will take G to crawl/index the content I reference in my sitemap?
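
(For anyone who hasn't built the file yet: a sitemap is just an XML list of your URLs that you upload and point Webmaster Tools at. A minimal sketch of generating one with Python's standard library; the URLs below are placeholders.)

import xml.etree.ElementTree as ET

# Sitemap protocol namespace; "pages" lists placeholder URLs only.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = ["http://www.example.com/", "http://www.example.com/folder/"]

urlset = ET.Element("urlset", xmlns=NS)
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
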
10:46 pm on Aug 7, 2006 (gmt 0)

WebmasterWorld Senior Member bigdave is a WebmasterWorld Top Contributor of All Time 10+ Year Member



a.) Of the millions of websites, how many of them have different pages for www.domain.com and domain.com?!

b.) Assuming the answer to the above is none, why haven't G engineers figured this out yet?

That is a terrible assumption to make. I can think of quite a few very important domains where that is not the case. This is especially true with domains that have several levels of subdomains, like large corporations and .edu domains.

Historically, there weren't many entities that would route traffic to different machines based on the port; you routed it based on the domain name. If anything, the default domain was used as the email machine. Then you would have a few default machines with domain names for the services that they offered, such as ftp.example.com. The new, funky thing called the World Wide Web got its own server, usually set up by some geek in a lab who just wanted to play with it. Naturally, they gave it a subdomain of www.

Granted, the majority of sites have both the primary domain and the www subdomain pointing to the same site, but that does not mean that assuming they are the same is the correct way to handle it.

While Google is aware that they have a problem that they need to sort out, I would rather have them come up with something that will work right in all cases, not only the cases where the webmasters or hosting services are ignorant.

7:17 am on Aug 8, 2006 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Okay, I believe most/all U.S. users should see radically fresher supplemental results now. The earliest page I saw was from Feb 2006, and most of the ones that I looked at averaged in the ~2 month old range.

As data gets copied to more places, the fresher supplemental results should eventually be visible everywhere, not just the U.S.

7:51 am on Aug 8, 2006 (gmt 0)



Can't wait to see my USA supplemental results from 2005 get sorted out.

"Fresher supplemental results" - is that good or bad?

8:23 am on Aug 8, 2006 (gmt 0)

10+ Year Member



GG

Is that data heading out from the 72.14.207.99 DC (it has hit a fair few more now, too)?

It is just that, for me, the data refresh on the 27th July (the second 27th one) seemed to hit all DCs but not that one - so is that DC (and associated ones) still at a less advanced stage in some respects but a more advanced stage in others?

2:37 pm on Aug 8, 2006 (gmt 0)

10+ Year Member



Woohoo, supplemental result clean-up - finally! All my pages dated up to 2 Jan 2006 are gone. Also, the ranking of my websites jumped up.

Now the million-dollar question - when / how do we get rid of the supplemental results dated after 2 Jan 2006?

Ideas for sitemaps team:

1) Real-time URL removal tool (removal from the index), or any kind of URL removal / XML support

2) Duplicate content indicator

3) https/http issue - [domain.com...] == [domain.com...] (like [domain.com...] == [domain.com...]) - this will save us a lot of CPU :)

2:47 pm on Aug 8, 2006 (gmt 0)

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Okay, the main site I was worried about went from 640 results to 12,200 overnight, most of which are supplemental cache from March 2006, which is definitely a step in the right direction. I'm pretty sure I know why they went supplemental, so I'm pretty sure I can get most of 'em out. On the other hand, another client went from 9600 pages to 78, so now I gotta figure that one out.

GG, it'd be a good thing if they added the URL Removal tool to Webmaster Central, and I'm all for it, since I really hate how many different times I have to log in to different places (under different Google accounts, mind you) in order to get things done. But my post was actually about the fact that I was having TROUBLE with the URL removal tool, and there's no place I can report it or get help with it. That's mainly what I was asking to be added to the overall picture.
