Forum Moderators: Robert Charlton & goodroi
Going through this forum, someone provided a link to a page validator, and I discovered that all of my pages had something that needed attention. First, a div tag was left unclosed just before the closing body tag. Second, none of the images had an alt attribute. All of the pages validate now.
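For anyone who hits the same validator errors, the fixes looked roughly like this (a minimal sketch; the file names and id are illustrative, not my actual markup):

```html
<!-- Before: the div was never closed before </body>,
     and the image had no alt attribute -->
<body>
  <div id="content">
    <img src="custer.jpg">
</body>

<!-- After: every img gets alt text, and the div is
     closed before the closing body tag -->
<body>
  <div id="content">
    <img src="custer.jpg" alt="Portrait of General Custer">
  </div>
</body>
```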
Up to this point, entering site:nameofsite.com has come back with nothing. When I type in nameofsite.com, Google offers a link to the site, which I take to be a sign that the site hasn't been banned. Is this accurate?
The funny thing is that Google said that there were no reported errors. Does that mean Google doesn't attempt to validate the pages it spiders?
Is the reason none of the pages were listed on Google because they didn't validate?
Is there a long lag time between Google's visit and when a page becomes listed?
I made a few mistakes when I edited my pages, and Google put those pages in the supplemental index. When I checked, I found the problems and corrected them. One by one, the pages are showing up in the index again, out of the supplementals.
For a new page, my experience is that it takes 2-4 weeks to be indexed. It starts as a URL-only listing, then appears with a description and title.
Something I've been wondering about: does the index page need to reflect the material on all of the subordinate pages? For example, I've recently put up a site based on an American history textbook originally published in 1899. A dozen chapters are posted, each on a separate page. Although they all deal with historical events, the topics are distinctly different. One page is about pirates, another about the Louisiana Purchase, and yet another about General Custer.
Should the index page address something regarding all of the subordinate pages or does it even matter?
Do search engines recognize a general topic of history, or do they see the pages as unrelated? I ask because it seems that search engines are moving in the direction of semantic indexing, which I take to mean that they look for the consistent use of related terms. So far I've only seen references to it for single pages, but it would stand to reason that it could be applied to an entire site. Just wondering.
One last thing. Is the meta tag 'aesop' used by Google?
After reading through this thread I went back and validated my HTML pages, and found a couple of errors. If you haven't already done so, validating might shed some light on the matter--but I'm just guessing at this point.
Two weeks have passed since installing Sitemaps. Google spiders both sites daily. One site shows crawl stats, but site:www.site1.com doesn't even come back with a link to the site the way site2 does. Although site2 does show a link for site:www.site2.com, it doesn't have site stats; all it says is that they aren't available at this time. In fact, it says that for everything except index stats.
Am I expecting too much for Google to list the sites after two weeks? What could the reason be for site2 not having site stats?