Forum Moderators: Robert Charlton & goodroi


Google spiders site map, but not getting listed


Storyman

12:24 am on Jan 4, 2006 (gmt 0)

10+ Year Member



Google regularly spiders my sitemap, which is a text file. The site has been verified. It's been a couple of weeks and the site still hasn't been listed on Google.
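
For reference, a plain-text sitemap is nothing more than one full URL per line. Mine looks roughly like this, with example.com and the page names standing in for the real ones:

    http://www.example.com/
    http://www.example.com/chapter-one.html
    http://www.example.com/chapter-two.html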

Going through this forum, I found a link someone provided for page validation and discovered that all of the pages had something that needed attention. First, a div tag was missing just before the closing body tag. Second, none of the images had an alt attribute. All of the pages validate now.
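
Roughly speaking, the fixes came down to markup along these lines (the file name and div are invented here just to show the idea, not my actual pages):

    <div id="content">
      <img src="photo.jpg" alt="Short description of the image">
      ...
    </div> <!-- the div tag that was missing just before the closing body tag -->
    </body>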

Up to this point, entering site:nameofsite.com has come back with nothing. When I type in nameofsite.com by itself, Google offers a link to the site, which I take to be a sign that the site hasn't been banned. Is that accurate?

The funny thing is that Google reports no errors. Does that mean Google doesn't attempt to validate the pages it spiders?

Is the reason none of the pages were listed on Google because they didn't validate?

Is there a long lag time between Google's visits and when it becomes listed?

imweb

3:39 am on Jan 4, 2006 (gmt 0)

10+ Year Member



Check your meta tags. Make sure the title and description are unique on every page. Ensure the robots meta tag content says "index, follow".
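
If it helps, the head section I mean looks something like this (the title and description text here are only placeholders -- make them different on every page):

    <head>
      <title>Unique title for this page</title>
      <meta name="description" content="A unique one- or two-sentence summary of this page.">
      <meta name="robots" content="index, follow">
    </head>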

I made a few mistakes when I edited some pages, and Google put those pages in the supplemental index. When I checked, I found the problems and corrected them. One by one the pages showed up again in the index, out of supplementals.

For a new page, my experience is that it takes 2-4 weeks to be indexed. It starts as a URL-only listing, then appears with a description and title.

Storyman

3:49 am on Jan 4, 2006 (gmt 0)

10+ Year Member



imweb, thanks for the advice.

tigger

7:02 am on Jan 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Ensure the robots meta tag content says "index, follow".

Do you really think that G will look at that tag and actually take any notice of it? I've never used it, and I do have some indexing issues, but I can't see how putting "index, follow" would make any difference.

Anyone else have any views on this?

CainIV

7:22 am on Jan 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Index, follow" is not needed any more to help site indexing today, as the SEs primarily index sites based on links and find the associated root page URLs via links to the home page.

SebastianX

11:09 am on Jan 4, 2006 (gmt 0)

10+ Year Member



"index,follow" or "all" is the default value assumed when there is no robots meta tag. Improving crawlability and linkage will help instead.

Storyman

5:36 pm on Jan 4, 2006 (gmt 0)

10+ Year Member



Do I understand this correctly? Crawlability means primarily eliminating JavaScript and tables whenever possible and using CSS to increase the content/code ratio (so it favors content). Linkage means that other sites that deal with the same topic as yours link to yours. On a lesser note it also means that all of the subordinate pages link back to the home page as well as to the other pages on the site.
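
For instance, I take the internal-linkage part to mean that every subordinate page carries a plain list of text links back to the home page and the other pages, styled with CSS rather than laid out in a table -- something like this (the file names are made up):

    <ul id="nav">
      <li><a href="/index.html">Home</a></li>
      <li><a href="/pirates.html">Pirates</a></li>
      <li><a href="/louisiana-purchase.html">The Louisiana Purchase</a></li>
    </ul>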

Something that I've been wondering about: does the index page need to reflect the material in all of the subordinate pages? For example, I've recently put up a site based on an American history textbook originally published in 1899. A dozen chapters are posted, each on a separate page. Although they all deal with historical events, the subjects are distinctly different. One page is about pirates, another about the Louisiana Purchase, and yet another about General Custer.

Should the index page address something regarding all of the subordinate pages or does it even matter?

Do search engines recognize a general topic of history, or do they see the pages as unrelated? I ask because it seems that search engines are moving in the direction of semantic indexing, which I take to mean that they look for the consistent use of related terms. So far I've only seen references to it for single pages, but it would stand to reason that it could be employed for an entire site. Just wondering.

One last thing. Is the meta tag 'aesop' used by Google?

stevelibby

12:27 pm on Jan 5, 2006 (gmt 0)

10+ Year Member



I also have mixed feelings about sitemaps.
However, I have noticed something strange. I have a standard navigation bar on the left-hand side of my pages, and because the bar is shown across the site in the root and sub folders, all the links are www.domain.co.uk/page.asp and so on. I have noticed in my errors list that G appears to be taking no notice of the www.domain.co.uk and just looking for the /page.asp! Why is this?
Am I doing something wrong?
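
To be concrete, the nav links are written more or less like this (page name made up):

    <a href="www.domain.co.uk/page.asp">Page</a>

rather than with the full http:// address:

    <a href="http://www.domain.co.uk/page.asp">Page</a>

if that makes any difference to how G reads them.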

Storyman

5:52 pm on Jan 5, 2006 (gmt 0)

10+ Year Member



SteveLibby, I'm just wading into the immense pool of search engine knowledge, and I'm finding that what I thought and what the general thinking is are quite different.

After reading through the posts here I went back to validate my HTML pages and found a couple of errors. If you haven't already done so, it might be something that could shed some light on the matter--but I'm just guessing at this point.

stevelibby

6:19 pm on Jan 5, 2006 (gmt 0)

10+ Year Member



Hi
All the HTML is validated and correct, and all the links have been checked via Link Sleuth--even that program does not show the errors that G has shown me.

Storyman

7:08 pm on Jan 5, 2006 (gmt 0)

10+ Year Member



What's confounding me is that with the sitemap Google has started to spider two sites, but still neither has made it into Google's ranks.

Two weeks have passed since installing the sitemaps. Google spiders both sites daily. One site shows crawl stats, but site:www.site1.com doesn't even come back with a link to the site the way site2 does. Although site2 does come back with a link for site:www.site2.com, it doesn't have site stats. All it says is that they aren't available at this time. In fact, it says everything is unavailable at this time except the index stats.

Am I expecting too much for Google to list the sites after two weeks? What could the reason be for site2 not having site stats?