
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Google Sitemaps for the mentally challenged
meaning myself..
SoleDrag

10+ Year Member



 
Msg#: 32076 posted 3:42 pm on Nov 17, 2005 (gmt 0)

Can anyone dumb down Google Sitemaps for me? My knowledge is limited, but I'd like some of my inner product pages crawled, and this seems like the way to do it. I'd have no problem just listing the URLs by hand somewhere; is that possible?

 

scot

10+ Year Member



 
Msg#: 32076 posted 4:25 pm on Nov 18, 2005 (gmt 0)

You could try creating a text document, say in Notepad, called urllist.txt, containing a simple list of page URLs. Upload it to your root directory.

Google robots should find it.
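For illustration, such a urllist.txt is nothing more than one fully qualified URL per line (these example.com addresses are placeholders):

```
http://www.example.com/
http://www.example.com/products/widget-1.html
http://www.example.com/products/widget-2.html
```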

SebastianX

10+ Year Member



 
Msg#: 32076 posted 4:30 pm on Nov 18, 2005 (gmt 0)

>Google robots should find it.

add "after a successful submission here: https://www.google.com/webmasters/sitemaps/" and it makes sense.

mistah

10+ Year Member



 
Msg#: 32076 posted 4:30 pm on Nov 18, 2005 (gmt 0)

There are a number of tools for generating sitemaps. Google gives a list in their Sitemap FAQs [code.google.com...]

I use a free utility called GSiteCrawler which is one of the tools listed in the Google FAQ (above link).

Basically the program crawls your site and generates a file called sitemap.xml. You then upload this to your root directory. Then you sign up to Google Sitemaps and a spider will visit your site and retrieve the sitemap.
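For reference, the generated file is plain XML. A minimal sitemap.xml at the time looked roughly like this (example.com and the date are placeholders; the 0.84 namespace was Google's sitemap schema in 2005):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-11-18</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Only the loc element is required per URL; lastmod and changefreq are optional hints.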

bigearz

5+ Year Member



 
Msg#: 32076 posted 4:36 pm on Nov 18, 2005 (gmt 0)

What I want to do is generate and update the sitemap automatically.

We have a content-rich site that needs crawling frequently (the content changes several times a day), but we're on a shared server, so we don't have root access to install or run Python and the associated scripts.

Any advice, other than changing my useless host, that would allow us to generate the map and then not have to update or upload it manually?

The site is in my profile, by the way.
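For readers who can run a script locally even when the server can't, a rough sketch of what such a generator does: walk a local copy of the site's files and emit a sitemap, which you then upload. All paths, URLs, and the function name here are hypothetical, not from any tool mentioned in the thread.

```python
import os
from urllib.parse import quote
from xml.sax.saxutils import escape

def build_sitemap(doc_root, base_url, extensions=(".html", ".htm")):
    """Walk doc_root and return sitemap XML listing every matching page.

    base_url is the site root, e.g. "http://www.example.com".
    """
    entries = []
    for dirpath, _dirnames, filenames in os.walk(doc_root):
        for name in sorted(filenames):
            if not name.lower().endswith(extensions):
                continue
            # Turn the file's path relative to the doc root into a URL.
            rel = os.path.relpath(os.path.join(dirpath, name), doc_root)
            url = base_url.rstrip("/") + "/" + quote(rel.replace(os.sep, "/"))
            entries.append("  <url><loc>%s</loc></url>" % escape(url))
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )
```

Run from a scheduled task on a local machine, this sidesteps the shared-host restriction entirely: generate locally, upload the result by FTP.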

SoleDrag

10+ Year Member



 
Msg#: 32076 posted 4:50 pm on Nov 18, 2005 (gmt 0)

Mistah

I was able to get it done with third party tool help. Thanks!

Are there any third party Froogle feed helpers that make it simpler than doing an Excel worksheet? Trying to get one thing done at a time here.

SebastianX

10+ Year Member



 
Msg#: 32076 posted 4:56 pm on Nov 18, 2005 (gmt 0)

You don't need Python; there are third-party tools written in PHP, ASP, ...
I'll send you a link to start with.

bigearz

5+ Year Member



 
Msg#: 32076 posted 5:52 pm on Nov 18, 2005 (gmt 0)

Thanks Seb, I've had a look...

Should probably also have mentioned that our site doesn't run a database, which makes dynamic XML sitemapping difficult. I'm starting to think we should build a database; it seems rather easier than all this, and it would let us do RSS too.

Just need someone to key in thousands of pages of content... ;-)

But all this surely still depends on having root access to the web server to install programs, doesn't it? That we haven't got.

So I need a database and a new ISP...!

SebastianX

10+ Year Member



 
Msg#: 32076 posted 6:42 pm on Nov 18, 2005 (gmt 0)

You can create a URL list with tools like GSiteCrawler, then get it converted into a format that allows RSS output too. Although an RSS feed with thousands of items is, hmmm, unusual.

katheesue

10+ Year Member



 
Msg#: 32076 posted 7:20 pm on Nov 18, 2005 (gmt 0)

>GSiteCrawler

I used this tool on my three year old authority site and submitted the sitemap and the next day my site was banned.

Reinclusion requests have been ignored.

SoleDrag

10+ Year Member



 
Msg#: 32076 posted 7:42 pm on Nov 18, 2005 (gmt 0)

I don't think they banned you because of that. They even list it as a third-party helper tool!

katheesue

10+ Year Member



 
Msg#: 32076 posted 4:47 pm on Nov 19, 2005 (gmt 0)

There is no question that it is related to the use of the sitemap tool.

That is the only thing which changed in that time frame.

The new sitemap was submitted. Googlebot grabbed the sitemap, crawled the site, and the site was banned.

site:www.theexemplifieddomaininquestion.com shows no results, and searching for www.theexemplifieddomaininquestion.com says "We have no information about that domain."

Your search - link:www.theexemplifieddomaininquestion.com - did not match any documents.

Yahoo shows this:

Search Results: 1 - 100 of about 13,600 for link:www.theexemplifieddomaininquestion.com

SebastianX

10+ Year Member



 
Msg#: 32076 posted 5:06 pm on Nov 19, 2005 (gmt 0)

Look at the contents of your sitemap.

katheesue

10+ Year Member



 
Msg#: 32076 posted 11:56 pm on Nov 19, 2005 (gmt 0)

The contents of the sitemap are currently exactly as I would expect them to be.

I am now using a different tool.

Trouble is, Google downloads the sitemap and then does not crawl the site at all, not even the main page.

SebastianX

10+ Year Member



 
Msg#: 32076 posted 9:11 am on Nov 21, 2005 (gmt 0)

Look harder. Check the pages, linkage, ...
The tool used to create the XML has nothing to do with the effects you're describing.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved