|XML sitemap parameters: are they required?|
I am using an XML sitemap generator to generate the XML file, and I guess many others do the same. But I don't understand the meaning of parameters like Frequency, LastMod or Priority. Are they really required?
My site is not that big, around 400-500 pages. So the first question is: is an XML sitemap required at all?
No, and the search engines will probably ignore them anyway. If it's a choice between what g### thinks is important and what you think is important, who do you think will win? :)
I think LastMod is certainly worth using if you are able to provide it accurately and dynamically (so less likely to be the case if using a generator). One of my sites has a relatively large forum, which is mapped with LastMod set to the date of the last comment on the thread, which works really well.
An intelligent search engine, assuming it has gained trust in my sitemap, can then allocate resources towards indexing new content and my main pages more frequently than it otherwise might.
I don't use Frequency (redundant anyway if you support LastMod) or Priority at all, but I think the intention is just to serve as hints to search engines, which are pretty good at working that kind of thing out for themselves!
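For reference, in the sitemaps.org protocol these parameters are optional child elements of each <url> entry; only <loc> is required. The URL below is just a placeholder:

```xml
<url>
  <loc>http://example.com/some-page.html</loc>
  <!-- all three of the following are optional -->
  <lastmod>2013-05-22</lastmod>
  <changefreq>weekly</changefreq>
  <priority>0.8</priority>
</url>
```

Per the protocol, changefreq is explicitly a hint (valid values: always, hourly, daily, weekly, monthly, yearly, never) and priority ranges from 0.0 to 1.0 with a default of 0.5 - it only expresses relative importance within your own site.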
[edited by: dmorison at 11:27 am (utc) on May 22, 2013]
Are you saying this in the context of the parameters, or the whole sitemap?
|No, and the search engines will probably ignore them anyway. |
Of course g###; it's their game after all.
|who do you think will win? |
|so less likely to be the case if using a generator |
Can you explain how you set the LastMod to the last comment date? I also have a forum section on my site.
|One of my sites has a relatively large forum, which is mapped with LastMod set to the date of the last comment on the thread, which works really well. |
|Can you explain how you set the LastMod to the last comment date? I also have a forum section on my site. |
When I created mine, I studied the code that generates the index page for humans to find out how to query the thread list and the date/time of the last comment; then it's a case of creating a script that executes that query and outputs the results in the required format.
Have a search for "[your forum script] dynamic xml sitemap lastmod" - somebody may already have created something similar for whatever system you're using.
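As a rough sketch of that approach (the table and column names here are made up for illustration - you'd substitute the query your own forum software needs):

```python
import sqlite3
from datetime import datetime, timezone

def generate_sitemap(rows):
    """Turn (url, unix_timestamp) pairs into sitemap XML with <lastmod>."""
    entries = []
    for url, ts in rows:
        lastmod = datetime.fromtimestamp(ts, tz=timezone.utc).strftime(
            "%Y-%m-%dT%H:%M:%S+00:00")
        entries.append(
            "  <url>\n"
            f"    <loc>{url}</loc>\n"
            f"    <lastmod>{lastmod}</lastmod>\n"
            "  </url>")
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(entries) + "\n</urlset>")

# Demo with a hypothetical schema: one <url> per thread, lastmod taken
# from the newest comment on that thread.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE threads (id INTEGER PRIMARY KEY, slug TEXT);
    CREATE TABLE comments (thread_id INTEGER, created_at INTEGER);
    INSERT INTO threads VALUES (1, 'welcome-thread'), (2, 'old-thread');
    INSERT INTO comments VALUES (1, 1369200000), (1, 1369220000), (2, 1300000000);
""")
rows = conn.execute("""
    SELECT 'http://example.com/forum/thread/' || t.slug, MAX(c.created_at)
    FROM threads t JOIN comments c ON c.thread_id = t.id
    GROUP BY t.id
""")
print(generate_sitemap(rows))
```

The key point is the MAX(created_at) per thread - that's what makes the sitemap reflect the last comment rather than the thread creation date.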
Don't forget that you can have multiple sitemaps - the only restriction is that a sitemap can only map URLs at or below its own location. So if you found a dynamic script for your forum software, and it is installed in the /forum/ folder, you could still use your generator script (with the /forum/ folder excluded) to make:
...and at the same time serve:
...to map forum pages with LastMod
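One way to point search engines at both files (a sketch - example.com and the paths are placeholders) is a sitemap index file, which the sitemaps.org protocol defines for exactly this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/forum/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

Alternatively, you can list each sitemap on its own `Sitemap:` line in robots.txt, or just submit them separately in the search engines' webmaster tools.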
|if you are able to provide it accurately and dynamically |
Can you tell for sure that the search engine is using the information from your sitemaps and not from its own observation? You said "gained trust in my sitemaps". I've wondered about that occasionally. Is some part of google's computer set aside for comparing site-map statements with observed fact, so after a time it can "decide" whether to trust the sitemap when a new page is listed?
Seems like it would be awfully hard to test this without deliberately putting wrong information into a sitemap and then watching the search engine's behavior. And you couldn't do it for long or you'd shoot yourself in the foot :(
Actually, I am not using any software for the forum section. There is a web page navigation system: the main forum page contains the list of all categories --> the next page contains the list of subcategories based on the category that was clicked --> the next page contains all the posts related to the category and subcategory from the previous two pages --> the next page shows the actual post and the option to reply to it. I am using URL routing and some code logic to create SEO-friendly URLs for each of these pages, so behind the scenes there is only a single page for all the posts; it shows the corresponding post, its title, and a URL for each post.
|dynamic script for your forum software |
|Can you tell for sure that the search engine is using the information from your sitemaps and not from its own observation? You said "gained trust in my sitemaps". I've wondered about that occasionally. Is some part of google's computer set aside for comparing site-map statements with observed fact, so after a time it can "decide" whether to trust the sitemap when a new page is listed? |
I totally accept your point that an intelligent search engine, in conjunction with a well structured website, should be able to work it all out for itself. Case in point: if somebody replies to a 2 year old thread it is, just like here on WebmasterWorld, "bumped" to the top of the forum index page, which is crawled regularly - so even if a search engine has known about that URL for 2 years, that should be enough of a hint that something might have changed.
Rather, I see LastMod as being more useful to a search engine in terms of saving it from going through the motions of determining a change frequency for itself (a straightforward algo, but one that does take time and resources). But no, I haven't yet studied crawling patterns before/after using a dynamic LastMod sitemap, so I could never say for sure that it makes any difference at all. That's why I would only do it if it can be done accurately - and if that's the case, it certainly can't do any harm.
Regarding the "trust" issue, yes - that's basically the assumption. I wasn't thinking so much in terms of intentional misinformation from a webmaster - telling porkies in your sitemap isn't going to get anyone anywhere. It's more for the search engine's own benefit, to cover situations where a sitemap has been neglected by a webmaster - not a good idea, but I'm sure it happens: they put one up when XML sitemaps were the latest hot topic in SEO and then forgot about it, leaving it online with quite possibly historic, manually crafted LastMod dates that don't even remotely correlate with observed fact!
If a search engine did assess sitemap quality before relying on it too much, then it is of course possible that a sitemap that sucks could be seen as a negative quality indicator for the domain overall, but I wouldn't have thought it would be that highly weighted as one. Googlebot is quite capable of saying to itself: "Hey, great content here. The sitemap is as much use as a chocolate teapot, but we'll let 'em off!"