Seems strange that Google is introducing this service now rather than trying to improve how Googlebot follows links.
[google.com...]
I've tried several - some take forever, some hang, some don't work - that one performs well.
B
[edited by: lawman at 2:56 pm (utc) on June 8, 2005]
[edit reason] No tools please [/edit]
I am using (for the moment) the lame version of Google Sitemaps (that's the .txt option). Creating a full list of my 'important' content only takes a little more time.
I saw my Google ranking rise from 6 to 7 within one day after I put the Google sitemap (.txt) online. I had some big troubles with my 'mother site' and underlying 'daughter sites'. Now I can point out in one simple way all my content that was normally too deep for Googlebot to find.
So yes, I am now working on the XML version for even better coverage with Google Sitemaps.
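For anyone wondering what the .txt option looks like: it is simply a plain text file with one full URL per line (the example URLs below are made up):

http://www.example.com/
http://www.example.com/about.html
http://www.example.com/products/widgets.html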
Bjurn
Can you please use the domain name <snip> to generate XML sitemaps, as the indext.php is a testing script (that I accidentally left live and linked :)
If you could edit the previous posts to reflect this, that would be great (I have removed that script for the minute, so it's a dead link).
Ta
Barry
[edited by: lawman at 2:58 pm (utc) on June 8, 2005]
[edit reason] No Url Drops Please [/edit]
I can't find anywhere that specifies whether it can be .txt or whether it should be .xml.
Can anyone who's been successful with their text file sitemap sticky me with the first 10 lines or so of it?
Thanks
B
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
<url>
<loc>http://www.domain.org/page.html</loc>
<lastmod>2005-05-07T02:23:14-05:00</lastmod>
<changefreq>yearly</changefreq>
<priority>0.1</priority>
</url>
</urlset>
Simply paste that into Notepad, add a new <url></url> block for each URL (before the closing </urlset> tag), tweak whatever else you need, and save it as sitemap.xml. It can't be as good as generating one, but it's got to be better than just a list of URLs.
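If you already have your URLs in a plain list, a few lines of script can do the pasting for you. This is only a minimal sketch, not one of the tools mentioned in this thread: the urls.txt filename and the fixed lastmod/changefreq/priority values are my own assumptions.

# sketch: build sitemap.xml from a plain text list of URLs (one per line)
# "urls.txt" and the fixed changefreq/priority defaults are assumptions
from xml.sax.saxutils import escape
import datetime

urls = [line.strip() for line in open("urls.txt") if line.strip()]
today = datetime.date.today().isoformat()

entries = []
for u in urls:
    entries.append(
        "  <url>\n"
        "    <loc>%s</loc>\n"
        "    <lastmod>%s</lastmod>\n"
        "    <changefreq>monthly</changefreq>\n"
        "    <priority>0.5</priority>\n"
        "  </url>" % (escape(u), today)
    )

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n')
    f.write("\n".join(entries))
    f.write("\n</urlset>\n")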
Give my website a try: you can submit a domain name or a links directory (directories only at the mo - make sure your default document is the one served when you type just the sub-directory or domain - I am fixing this).
It has the limitations of most of the online generators at the mo - i.e. it can't work out page creation date or priority. But it will produce an easy-to-edit XML document like the one you describe - it's also a little quicker than cut and paste.
If you have any problems with the generator, give me a shout and I'll help you out.
I can soon have your XML file up there and indexed!
Ta
Barry
<snip>
[edited by: lawman at 2:59 pm (utc) on June 8, 2005]
[edit reason] No Signatures Please [/edit]
I just want to add some information about the php script mentioned earlier:
phpSitemap - "create your personal google sitemap file", current version 1.2.2
Current features:
- Set filter for file and directory names for files to be excluded.
- Reads the last modification time and sets the change frequency accordingly.
- You can specify the initial (fixed) priority of each page.
- Creates a sitemap.xml file; you then submit this URL to Google (a rough sketch of this kind of generator follows below the list).
New features:
- File information can now be set manually (for each file): enable/disable a file, last modification time, change frequency, priority.
- All settings (including per-file settings) are stored and used on subsequent runs.
Known limitations:
- Not tested with huge sites
- Cannot handle dynamic links (like index.php?id=13) - this will be integrated in the near future.
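To show what a generator like this does under the hood, here is a rough sketch in Python of the same idea: walk the web root, skip excluded names, read each file's modification time, and write out the entries. The paths, exclude list and frequency thresholds below are illustrative assumptions - this is not phpSitemap's actual code.

# sketch of a directory-walking sitemap generator (not phpSitemap itself)
# DOCROOT, BASE_URL and the exclude/frequency rules are illustrative assumptions
import os, time
from xml.sax.saxutils import escape

DOCROOT = "/var/www/html"            # local web root (assumption)
BASE_URL = "http://www.example.com"  # public URL it maps to (assumption)
EXCLUDE = {"cgi-bin", "tmp"}         # directory names to skip
EXTENSIONS = (".html", ".htm", ".php")

def changefreq(mtime):
    # guess a change frequency from the file's age
    age_days = (time.time() - mtime) / 86400
    if age_days < 7:
        return "daily"
    if age_days < 90:
        return "monthly"
    return "yearly"

entries = []
for root, dirs, files in os.walk(DOCROOT):
    dirs[:] = [d for d in dirs if d not in EXCLUDE]   # apply the directory filter
    for name in files:
        if not name.endswith(EXTENSIONS):
            continue
        path = os.path.join(root, name)
        mtime = os.path.getmtime(path)
        rel = os.path.relpath(path, DOCROOT).replace(os.sep, "/")
        lastmod = time.strftime("%Y-%m-%d", time.gmtime(mtime))
        entries.append(
            "  <url>\n"
            "    <loc>%s/%s</loc>\n"
            "    <lastmod>%s</lastmod>\n"
            "    <changefreq>%s</changefreq>\n"
            "    <priority>0.5</priority>\n"
            "  </url>" % (BASE_URL, escape(rel), lastmod, changefreq(mtime))
        )

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n')
    f.write("\n".join(entries))
    f.write("\n</urlset>\n")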
<snip>
Regards,
Tobias
[edited by: lawman at 3:08 pm (utc) on June 8, 2005]
[edit reason] No URL Drops Please [/edit]
********************
The Google Services are made available for your personal, non-commercial use only. You may not use the Google Services to sell a product or service, or to increase traffic to your Web site for commercial reasons, such as advertising sales.
********************
How will this affect adsense publishers?
You don't need a Google account to participate in the Google SiteMaps program. Submitting sitemaps to Google helps Googlebot adjust the crawling of your web site. It has no impact on your rankings, and it does not directly increase traffic to your site. You give Google your sitemap.xml file for free, so you're not using the service to sell advertising.
Since it seems that related links are tolerated in this thread, here is my Google SiteMap Tutorial:
<snip>
HTH
[edited by: lawman at 3:09 pm (utc) on June 8, 2005]
[edit reason] Link Drops Are Not Tolerated [/edit]
If your working environment and language is English, your computer probably interprets characters as ASCII, "Latin-1" or UTF-8. ASCII defines the first 128 characters: unaccented letters, numbers and punctuation, such as what I am writing here. Latin-1 (a.k.a. ISO-8859-1) defines an additional 128 characters, covering some symbols and the accented characters used in languages mostly derived from Latin (French, Spanish, Italian, for example). UTF-8 defines pretty much all characters in use in modern languages today, including "CJK" (Chinese, Japanese and Korean).
ASCII is a subset of UTF-8.
URLs typically don't contain more than letters, numbers and special characters like /, :, &, ?, = and so on. So unless you see accented characters in your URLs, you should be fine. Make sure not to use MS Word to create your file, as it may do unexpected things to letters -- a text editor (like Notepad in Windows) would be better.
It's unclear (to me) what would happen if your text file had URLs that contained accented characters -- you might need to do some fancy entity escaping or encoding, in which case I recommend just using XML, which declares the character set being used and is only slightly more verbose than typing the URLs in by hand.
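As a concrete illustration of that last point, here is a small sketch showing how a URL with accented characters could be percent-encoded and then XML-escaped before it goes into a sitemap. The example URL is made up, and this is just one reasonable approach, not something the Sitemaps documentation prescribes.

# sketch: make a non-ASCII URL safe for a sitemap entry (the example URL is made up)
from urllib.parse import quote
from xml.sax.saxutils import escape

raw = "http://www.example.com/caf\u00e9/men\u00fc.html"   # contains accented characters

# percent-encode the non-ASCII bytes as UTF-8, keeping the URL structure intact
encoded = quote(raw, safe=":/?&=")

# escape &, < and > so the result is also valid inside an XML <loc> element
loc = escape(encoded)

print(loc)   # http://www.example.com/caf%C3%A9/men%C3%BC.html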
You may want to look at <snip> - Python version 2.2 [google.com...]
[edited by: wakahii at 2:25 pm (utc) on June 8, 2005]
[edited by: lawman at 3:09 pm (utc) on June 8, 2005]
[edit reason] No Link Drops Please [/edit]
lawman
Anyway, I have now added a feature that picks up the PR of your page and assigns it to the priority tag of the XML file that you want to produce. Although this is not perfect, it should make large files a lot easier to edit in SitemapsPal. (Thanks to those who have requested larger outputs and shown an interest.)
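I don't know exactly how SitemapsPal does that mapping, but the general idea of turning a toolbar PR value (0 to 10) into a sitemap priority value (0.0 to 1.0) could be as simple as this sketch; the linear scaling is my own assumption, not a description of the tool.

# sketch: map a toolbar PageRank value (0-10) onto the sitemap priority scale (0.0-1.0)
# the linear scaling is an assumption, not necessarily how SitemapsPal does it
def pr_to_priority(pr):
    pr = max(0, min(10, pr))        # clamp to the valid PR range
    return round(pr / 10.0, 1)

print(pr_to_priority(6))   # 0.6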
If anybody has any more questions on Sitemaps and how to use them, fire away. I plan to stick around this forum, not just advertise my latest creation.
It's a good job somebody posted a link to my site from here, otherwise I'd never have found ya.
Ta
Barry
SitemapsPal
I am awaiting a response from a moderator before posting the url.
Check it out here:
[ethangiffin.com...]
Opie
I created an xml sitemap, and submitted it to Google last night, but as I'm no expert at this, I'm not sure if I did it right.
However, Google has downloaded it and says it's "OK", and I have no errors in that section of "My Sitemaps".
Does this mean I've done it right, or is my sitemap in storage until Google finds the time to check it?
I'd be grateful if anyone could shed any light on this, thanks in advance, FTWB05.
I've just checked Google to see if it had cached any more of my pages since I uploaded my site map last night, and guess what?
Every single page that I included in the sitemap has been cached, whereas before I only had 5 or 6, and was pulling my hair out as to why....
The site is only a couple of months old, so I was just waiting for Google to take its natural course, but submitting my sitemap has definitely sped things up. Off to check where those pages are appearing in Google's listings now....
....and to add more pages to my sitemap (I had to do it by hand, couldn't figure out how to dynamically create one!)
Thanks, FTWB05
All is not rosy though - I haven't seen a big jump in traffic, and another, older site that I submitted a sitemap for hasn't been cached.
Maybe I'll have to wait for an update of the listings - just too impatient!
Using this, I have managed to get several of my URLs crawled and indexed that had been URL-only or supplemental results for a long time.
However, I have got into another spot of trouble now. As I used the Python-based script provided by Google, it included ALL the files in my directory. Several of these were identical to the home page (used for testing or PPC), and because I was not very diligent in editing the list produced by the Sitemap generator, these pages have also been indexed. I am afraid I will get another round of duplicate content penalty.
What next?
I removed these pages from the site location. I have also deleted the references from the sitemap.xml file. But how do I get them removed from the index? Looks like I will have to fiddle with the URL removal utility once again (scary thought, eh?).
So be very diligent in checking your auto-generated XML file and remove all references to files you don't want indexed - before you submit it to Google.
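One way to make that check easier is to run the generated file through a small filter before you submit it. The file names and the exclude patterns below are illustrative assumptions - adjust them to whatever your own test or duplicate pages look like.

# sketch: strip <url> entries whose <loc> matches an exclude pattern
# "sitemap.xml", "sitemap-clean.xml" and the EXCLUDE patterns are assumptions
import re

EXCLUDE = ("test", "backup", "index-old")

xml = open("sitemap.xml").read()

def keep(match):
    entry = match.group(0)
    loc = re.search(r"<loc>(.*?)</loc>", entry).group(1)
    return "" if any(p in loc for p in EXCLUDE) else entry

filtered = re.sub(r"<url>.*?</url>\s*", keep, xml, flags=re.DOTALL)
open("sitemap-clean.xml", "w").write(filtered)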
Someone mentioned this tool will help Google discover a large part of the hidden web. You bet!