Forum Moderators: Robert Charlton & goodroi
Seems strange that Google is introducing this service now instead of trying to improve how Googlebot follows links.
[google.com...]
[google.com...]
[quote]Google Sitemaps is intended for all web site owners, from those with a single web page to companies with millions of ever-changing pages.[/quote]
[quote]You want Google to crawl more of your web pages.
You want to be able to tell Google when content on your site changes.[/quote]
Google Sitemaps at [google.com...]
Any experiences, anybody?
Google Sitemaps is an easy way for you to help improve your coverage in the Google index. It's a collaborative crawling system that enables you to communicate directly with Google to keep us informed of all your web pages, and when you make changes to these pages. With Google Sitemaps you get:
* Better crawl coverage to help people find more of your web pages
* Fresher search results
* A smarter crawl because you can provide specific information about all your web pages, such as when a page was last modified or how frequently a page changes
[google.com...]
This could be VERY helpful for sites. Using an XML formatted file you can tell Google what pages you would like crawled, when they were last updated, and what their relative importance is to the site. There are even facilities to see when the sitemap was last picked up and processed.
If this service works, it could finally solve some big problems I've had with distinguishing high-importance from low-importance content on my site. The fun part is going to be building auto-generators so that the sitemap is updated with correct info as I update my site.
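To illustrate the auto-generator idea, here's a minimal sketch in Python. The URLs, dates, and priorities below are made-up placeholders; a real generator would pull them from the site's own database or filesystem.

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical page data: (URL, last-modified date, relative priority)
pages = [
    ("http://www.example.com/", date(2005, 6, 1), 1.0),
    ("http://www.example.com/articles/foo.html", date(2005, 5, 28), 0.5),
]

def make_sitemap(pages):
    """Build a Google Sitemaps XML document from a list of pages."""
    entries = []
    for url, lastmod, priority in pages:
        entries.append(
            "  <url>\n"
            "    <loc>%s</loc>\n"
            "    <lastmod>%s</lastmod>\n"
            "    <priority>%.1f</priority>\n"
            "  </url>" % (escape(url), lastmod.isoformat(), priority)
        )
    # Namespace used by the original Google Sitemaps beta
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

print(make_sitemap(pages))
```

Hook that into whatever updates your pages and the sitemap stays current automatically.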
Is anyone noticing an improvement in crawling or ranking as a result of using one?
Does anyone have any reservations?
[google.com...]
Interested in opinions on the new XML (/text) sitemaps integration from Google. Is anyone using them yet, and do you think this is likely to help improve SERPs?
I get crawled every couple of days anyway, but wondered if it's potentially worth setting one of the Google ones up specifically. I'd be interested in people's opinions, but am thinking about trying the text version for a starter, despite it being given lower priority as I understand it.
Cheers
Simmo!
The real test will be to see how long before pages picked up through this service show up in the SERPs. This could be a very big improvement over post-it-and-wait-for-the-crawler-to-find-you operations.
Or whether it is in fact a tool aimed at identifying spam/scrapers in some way.
I doubt that this is behind the sitemaps.
A lot of small private sites have no way to run Python scripts at a shell prompt, or don't even know what that means.
Just tried it and submitted a sitemap for one of my sites; watching the logs now to see what happens ...
Google first says something about
# Better crawl coverage
# Fresher search results
# A smarter crawl
on one page and later says
# Please note that we do not add all submitted URLs to our index, and we cannot make any predictions or guarantees about when or if they will appear.
Let's see ...
Regards,
R.
Google Sitemaps
Anyone using them yet?
Looked at the info... don't think I'll touch this one with a ten-foot pole.
'Something doesn't sound right' ... Just my opinion. :)
I have run the sitemap program and it's identified 57823 pages on my site, which seems accurate.
I'm going to set up a cron job and have it run once an hour, since it doesn't seem to put any undue load on the server. We add about 20 to 30 new pages of content per day, so hopefully it'll speed up how soon those pages get indexed.
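For reference, an hourly crontab entry for such a generator might look like this (the script name and paths are hypothetical, just to show the shape):

```
# min hour dom mon dow  command -- run the sitemap generator at the top of every hour
0 * * * * /usr/bin/python /home/site/generate_sitemap.py > /dev/null 2>&1
```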
I'm still concerned about the effects of Bourbon, but hoping (as is my wife and bank manager) that this will normalize the SERPs for my site.
[quote]my wife and bank manager[/quote]
[google.com...]
7. Will participating in this program change my pages' ranking in Google search results? No. Using Google Sitemaps will not influence your PageRank; there will be no change in how we calculate the ranking of your pages.
Just noticed this... interesting how they use PageRank and "page's ranking" in the same statement.
Perhaps I'm just analyzing the FAQ too much :) I think most of us agree that PageRank has nothing to do with a page's rank, in most cases (at least for now).
But I wrote them just to be 100% clear on it. No response yet.
It's in beta, so yes, we'll try it if we're cleared to. It won't do anything to improve our post-Bourbon position, I'm sure, but hopefully some new pages will get picked up faster. Same for new sites; I hope it picks these up faster too.
Google won't take webmasters' feeds as gospel truth, however; pages will still need to be found through the normal linking structure. Otherwise massive manipulation would occur, with unscrupulous webmasters generating masses of unusable content in the hope of getting rankings.
The major downside is that it requires Python. Hopefully PHP and the other popular scripting languages are not far behind.
i think this has promise.
[quote]The major downside is it requires python. Hopefully PHP and the other popular scripting languages are not far behind.[/quote]
I wrote an app in VB in about 20 minutes to parse my log files and bam -- instant site map. I based priority off how many backlinks there were.
Took longer to parse a month's worth of logs than it took to write the parser.
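In the same spirit as the VB parser described above, here's a rough Python sketch. It assumes Apache combined-format logs, and the host name and scaling rule are my own assumptions, not the poster's: each requested path gets a 0-to-1 priority based on how many distinct external referrers pointed at it, a crude stand-in for backlink counts.

```python
import re
from collections import defaultdict

# Matches the request, status, size, and referrer fields of an
# Apache combined-format log line.
LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "(?P<ref>[^"]*)"')

def priorities_from_log(lines, host="www.example.com"):
    """Map each requested path to a 0..1 priority, scaled by the
    number of distinct external referrers seen in the log."""
    referrers = defaultdict(set)
    for line in lines:
        m = LINE.search(line)
        if not m:
            continue
        path, ref = m.group("path"), m.group("ref")
        referrers[path]  # ensure every crawled page gets listed
        if ref not in ("", "-") and host not in ref:
            referrers[path].add(ref)
    top = max((len(r) for r in referrers.values()), default=0) or 1
    return {path: len(refs) / top for path, refs in referrers.items()}
```

Feed the resulting dict into a sitemap generator and you get per-page priority values for free.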
S
[quote]The major downside is it requires python. Hopefully PHP and the other popular scripting languages are not far behind.[/quote]
Can you make a simple text based list of every URL on your site that you want crawled? That works just as well.
I'm going to be automating mine over the next week, so that when something changes, so does the sitemap; for now I just changed my Froogle feed generator slightly to also make a URL list.
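For anyone going the plain-text route, the format couldn't be simpler: one fully qualified URL per line. A minimal sketch (the file name and URLs are hypothetical):

```python
# Plain-text sitemap: one fully qualified URL per line.
urls = [
    "http://www.example.com/",
    "http://www.example.com/products/widget.html",
]

with open("urllist.txt", "w") as f:
    f.write("\n".join(urls) + "\n")
```

Then submit the resulting file's URL to Google the same way you would an XML sitemap.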
The page priority setting might be useful in some circumstances. If you have two or more pages listed on the same page of the SERPs, you should be able to use the page priority to list the better one. However, I find that Google seems to list the right one on all my sites.
Using Google Sitemaps will mean that more of your pages get indexed, but I can't see that doing much good. Any pages that weren't indexed before won't rank highly, because they won't have enough link popularity (PageRank, TrustRank, or whatever Google uses these days).
Google, on the other hand, benefits, because they'll use up less bandwidth (they'll be able to crawl pages less frequently). Also, their index size will increase quite substantially, which'll come in handy next time they want to get a bit of news coverage.
Having said all that, I'm writing an ASP.Net Control as we speak to generate a Google Sitemap for some of my sites to see what happens :-)