| 5:07 pm on Dec 13, 2006 (gmt 0)|
As for myself, all my sites get indexed in a matter of days... a week at most, I'd say. That's pretty much standard.
| 6:27 pm on Dec 13, 2006 (gmt 0)|
I've found them sometimes quick to index, other times very slow to index. I haven't any idea why.
| 10:07 pm on Dec 13, 2006 (gmt 0)|
Are you talking about getting new websites indexed? In that case I'd associate it with what has been called the "sandbox," though discussion of that term has gone quite quiet recently.
If you are talking about getting new pages of existing websites indexed: my last new page dates back to Dec 5th and, though not well established yet, a search for a very specific combination of terms on that page, with only 6 competing pages in the results, shows that it is at least in the index.
did you submit a sitemap through your webmaster central console?
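For anyone who hasn't done that yet: a Sitemap submitted through webmaster central is just an XML file following the sitemaps.org protocol. A minimal sketch below; the domain, path, and dates are placeholders, not anything from this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want crawled; example.com is a placeholder -->
  <url>
    <loc>http://www.example.com/articles/my-new-page.html</loc>
    <lastmod>2006-12-05</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Upload it to the site root (e.g. as sitemap.xml) and point webmaster central at it. It doesn't guarantee indexing, but it at least tells Googlebot the new URLs exist.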
| 11:04 pm on Dec 13, 2006 (gmt 0)|
I launched a non-commercial wiki site in September, and it still has not been added to the index. The site has original content, plenty of good inbound links, a hundred visitors a day and no spam or ads. It gets spidered by Googlebot every day, but it never shows up in the index.
Maybe wikis have to sit in the penalty box due to all of the Wikipedia clones out there.
| 11:13 pm on Dec 13, 2006 (gmt 0)|
The age of the site is a huge factor.
I have brand new sites with millions of URLs that hardly get crawled.
I have old sites with millions of pages that get crawled millions of times a day.
I have small new sites with great content that take forever to get indexed, and I also have small old sites with useless content that get crawled to death.
It seems to be mostly about the age of the site; popularity could play a large part in the factoring as well.
| 7:56 am on Dec 14, 2006 (gmt 0)|
I was talking about new pages on existing sites. My experience is that if they don't have high PR, it's hard to get them indexed...
| 8:19 am on Dec 14, 2006 (gmt 0)|
Thinkprog, I can't agree. New pages on existing sites (PR 5, 6, but even a blog with PR 3) do get indexed quite quickly, in a matter of days. Once indexed, they often don't rank as high as they eventually will, but that's another issue. If I search by inurl: or some unique phrase, I can see they're in.
I am not saying you are making this up, but most likely it's not an issue of Google's indexing speed in general, but of Google's rules of index inclusion for some specific type of content/domains...
| 10:12 am on Dec 14, 2006 (gmt 0)|
On 6 December I changed all my sites.
The major change was to replace 3 news links with a link to the high-resolution version of the main picture on the page.
When I search
site:example.com "Words of the link"
There are now 453 pages indexed.
But Google has not visited a single one of the newly linked high-resolution pictures.
search.livebot.com has already indexed maybe more than 90% of them, and goo.ne.jp has also started heavy spider activity on the high-res pictures.
| 10:57 am on Dec 14, 2006 (gmt 0)|
Googlebot is just more of a weakling than before. New pages on sites with good PR still get in permanently in a matter of days, but slower than the 24 hours you could count on before.
Good PR gets you less indexing these days because Google is focusing so heavily on crawling blog comment links. Link a new page from 10,000 PR0 blog comments and you'll get indexed immediately, and refreshed almost every day.
Google's crawl priorities are screwed up and its spam-filled index is the result.
| 9:32 pm on Dec 14, 2006 (gmt 0)|
We continually see Google include a page in its index that wasn't there before and start sending it good traffic, and then the page gets dropped from the index. The site is a little over four months old.
From all my research, the two best options I've heard are: first, get more deep links, which is great but challenging since we add 50-70 new pages of content per day.
Second, keep the homepage fresher. We'll start doing this.
Our site is over 15K pages. The site: command shows 24K pages, but using negative matches it appears that there are only about 3,500 actual pages in the index, and just 800 article pages.
Any other suggestions on getting more article pages into the index? We have about 5K of them. The other 10K pages are the ways we organize the articles: best, popular, etc. Each list is different.
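To be concrete, the negative-match counting I mean is just stacking -inurl: exclusions onto a site: query. The URL words below are only examples based on our category names; substitute your own patterns:

```
site:example.com                               all pages Google reports for the site
site:example.com inurl:article                 pages with "article" in the URL
site:example.com -inurl:best -inurl:popular    indexed pages excluding the list/category pages
```

Comparing the counts on these queries against what you know is on the server gives a rough estimate of how many real pages are actually in the index, since the raw site: count is often inflated.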
| 3:44 am on Dec 15, 2006 (gmt 0)|
So now we have a new little twist.
The site: command now shows 24K pages, but only about 3,000 are in the main index. The rest are supplemental.
| 6:18 am on Dec 15, 2006 (gmt 0)|
I agree, PR is no longer a factor. Age and quality seem to be the main factors.
My newest sites are crawled far less than my old ones. Links seem to encourage crawling as well: my oldest sites with the most QUALITY links perform better than my old sites with fewer links, strictly in terms of crawling.