

Google Sitemaps for the mentally challenged

meaning myself..

     
3:42 pm on Nov 17, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 14, 2003
posts:107
votes: 0


Can anyone dumb down Google Sitemaps for me? I'm limited in knowledge, but I would like some of my inner product pages crawled, and this seems like the way to do it. I would have no problem just listing the URLs by hand somewhere; is that possible?

scot

4:25 pm on Nov 18, 2005 (gmt 0)

Inactive Member
Account Expired


You could try creating a text document, say in Notepad, called urllist.txt, containing a simple list of page URLs, one per line. Upload it to your root directory.

Google robots should find it.
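For example (with example.com standing in for your own domain), urllist.txt is nothing more than one full URL per line:

http://www.example.com/
http://www.example.com/products/widgets.html
http://www.example.com/products/gadgets.html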

4:30 pm on Nov 18, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:July 18, 2002
posts:154
votes: 0


>Google robots should find it.

add "after a successful submission here: [google.com...] and it makes sense.

4:30 pm on Nov 18, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Aug 8, 2002
posts:156
votes: 0


There are a number of tools for generating sitemaps. Google gives a list in their Sitemaps FAQ [code.google.com...].

I use a free utility called GSiteCrawler, which is one of the tools listed in the Google FAQ (above link).

Basically the program crawls your site and generates a file called sitemap.xml. You then upload this to your root directory. Then you sign up for Google Sitemaps, and a spider will visit your site and retrieve the sitemap.
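For reference, the generated file follows Google's sitemap schema; a minimal one (URLs and dates below are placeholders) looks roughly like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-11-18</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/widgets.html</loc>
    <lastmod>2005-11-10</lastmod>
  </url>
</urlset>

Only <loc> is required; the other tags are optional hints.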

4:36 pm on Nov 18, 2005 (gmt 0)

New User

10+ Year Member

joined:June 26, 2005
posts:9
votes: 0


What I want to do is generate and update the sitemap automatically.

We have a content-rich site that needs crawling frequently - content changes several times a day - but we're on a shared server, so we don't have root access to install and run Python and the associated scripts.

Any advice, other than changing my useless host, that would allow us to generate the map and then not have to manually update or upload it?

The site is in my profile, by the way.

4:50 pm on Nov 18, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 14, 2003
posts:107
votes: 0


Mistah

I was able to get it done with the help of a third-party tool. Thanks!

Are there any third-party Froogle feed helpers that make it simpler than doing an Excel worksheet? Trying to get one thing done at a time here.

4:56 pm on Nov 18, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:July 18, 2002
posts:154
votes: 0


You don't need Python; there are third-party tools written in PHP, ASP ...
I'll send you a link to start with.
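Since shared hosting keeps coming up, here is a minimal sketch of what such a PHP tool does (sitemap.php is a hypothetical name, and it assumes your pages are plain .html files under the web root - adjust $base and the file test to suit your site):

<?php
// Hypothetical sitemap.php for a static, database-less site on shared
// hosting: walk the web root and print sitemap XML on the fly, so the
// "file" is always current without manual regeneration or uploads.
$base = 'http://www.example.com';               // placeholder domain
$root = rtrim($_SERVER['DOCUMENT_ROOT'], '/');  // filesystem web root

header('Content-Type: text/xml');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">' . "\n";

$dirs = array($root);
while (count($dirs)) {
    $dir = array_pop($dirs);
    $dh  = opendir($dir);
    while (($name = readdir($dh)) !== false) {
        if ($name == '.' || $name == '..') continue;
        $path = $dir . '/' . $name;
        if (is_dir($path)) {
            $dirs[] = $path;                    // descend into subfolders
        } elseif (preg_match('/\.html?$/i', $name)) {
            $loc = $base . substr($path, strlen($root));
            $mod = gmdate('Y-m-d', filemtime($path));
            echo '  <url><loc>' . htmlspecialchars($loc) . '</loc>'
               . '<lastmod>' . $mod . '</lastmod></url>' . "\n";
        }
    }
    closedir($dh);
}
echo '</urlset>' . "\n";
?>

Google Sitemaps should accept a URL like http://www.example.com/sitemap.php just as readily as a static sitemap.xml, and since the output is rebuilt on every fetch, there is nothing to re-upload.
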
5:52 pm on Nov 18, 2005 (gmt 0)

New User

10+ Year Member

joined:June 26, 2005
posts:9
votes: 0


Thanks Seb, I've had a look...

Should probably also have mentioned that our site doesn't run a database, which makes dynamic XML sitemap generation rather difficult. Starting to think we should build a database - it seems rather easier than all this, and it would let us do RSS too.

Just need someone to key in thousands of pages of content... ;-)

But all this surely still depends on having root access to the web server to install programs, doesn't it? That we haven't got.

So I need a database and a new ISP...!

6:42 pm on Nov 18, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:July 18, 2002
posts:154
votes: 0


You can create a URL list with tools like GSiteCrawler, then have it converted into a format that allows RSS output too. Although an RSS feed with thousands of items is, hmmm, unusual.
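In case it helps to picture it, an RSS 2.0 feed is just another XML file with one <item> per page (all values below are placeholders):

<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Site Updates</title>
    <link>http://www.example.com/</link>
    <description>Recently changed pages</description>
    <item>
      <title>Widgets</title>
      <link>http://www.example.com/products/widgets.html</link>
      <pubDate>Fri, 18 Nov 2005 16:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>

Most feeds carry only the 10-20 most recently changed pages, which is probably why a thousand-item feed sounds odd.
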
7:20 pm on Nov 18, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 12, 2004
posts:58
votes: 0


>GSiteCrawler

I used this tool on my three-year-old authority site and submitted the sitemap, and the next day my site was banned.

Reinclusion requests have been ignored.

7:42 pm on Nov 18, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 14, 2003
posts:107
votes: 0


I don't think they banned you because of that. They even list it as a third-party helper tool!

4:47 pm on Nov 19, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 12, 2004
posts:58
votes: 0


There is no question that it is related to the use of the sitemap tool.

That is the only thing which changed in that time frame.

The new sitemap was submitted. Googlebot grabbed the sitemap, crawled the site, and the site was banned.

site:www.theexemplifieddomaininquestion.com shows no results, and searching for www.theexemplifieddomaininquestion.com says "We have no information about that domain."

Your search - link:www.theexemplifieddomaininquestion.com - did not match any documents.

Yahoo shows this:

Results 1 - 100 of about 13,600 for link:www.theexemplifieddomaininquestion.com

5:06 pm on Nov 19, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:July 18, 2002
posts:154
votes: 0


Look at the contents of your sitemap.

11:56 pm on Nov 19, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 12, 2004
posts:58
votes: 0


The contents of the sitemap are currently exactly as I would expect them to be.

I am now using a different tool.

Trouble is, Google downloads the sitemap and then does not crawl the site at all, not even the main page.

9:11 am on Nov 21, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:July 18, 2002
posts:154
votes: 0


Look harder. Check the pages, linkage, ...
The tool used to create the XML has nothing to do with the effects you're describing.