Has anyone else experienced it?
What do you think the causes are? Insufficient servers?
If you are talking about getting new pages of existing websites indexed: my last new page dates back to Dec 5th and, though not well established yet, a search for a very specific combination of terms from that page (with only 6 competing pages in the results) shows that it is at least in the index.
Did you submit a sitemap through your Webmaster Central console?
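If not, even a bare-bones XML sitemap is usually enough to get started; something along these lines (example.com, the path and the dates are just placeholders, of course):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/articles/new-page.html</loc>
    <lastmod>2006-12-05</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>

Upload it to the document root and submit its URL in the console.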
Maybe wikis have to sit in the penalty box due to all of the Wikipedia clones out there.
I have brand new sites with millions of URLs that hardly get crawled.
I have old sites with millions of pages that get crawled millions of times a day.
I have small new sites with great content that take forever to get indexed, and I also have small old sites with useless content that get crawled to death.
It seems to come down to the age of the site, and possibly popularity plays a large part as well.
I am not saying you are making this up, but most likely it's not an issue of Google's indexing speed in general; it's an issue of Google's rules for index inclusion for some specific types of content/domains...
The major change was to replace 3 news links with a link to the high-resolution version of the main picture on the page.
When I search
site:example.com "Words of the link"
There are now 453 pages indexed.
But Google has not visited a single one of the newly linked high-resolution pictures.
search.livebot.com has already indexed maybe more than 90% of them, and goo.ne.jp has also started heavy spider activity on the high-res pictures.
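(In case anyone wonders how I can tell which spiders have fetched the pictures: I just grep the raw access log for the picture directory and look at the user agents, roughly

grep "/hires/" access.log | grep -i "googlebot"

where /hires/ stands in for whatever path your high-resolution files actually live under. Googlebot-Image shows up there as soon as the images are fetched; so far it hasn't.)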
Good PR gets you less indexing these days because Google is focusing so heavily on crawling blog comment links. Link a new page from 10,000 PR0 blog comments and you'll get indexed immediately, and refreshed almost every day.
Google's crawl priorities are screwed up and its spam-filled index is the result.
From all the research, the two best options I have heard are these. First, get more deep links, which is great but challenging, since we add 50-70 new pages of content per day.
Second, keep the homepage fresher. We'll start doing this.
Our site has over 15K pages. The site: command shows 24K pages, but using negative matches it appears there are only about 3,500 actual pages in the index and only 800 article pages.
Any other suggestions on getting more article pages into the index? We have about 5K of them. The other 10K pages are made up of the ways we organize the articles: best, popular, etc. Each list is different.
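(For anyone curious what I mean by negative matches: I stack exclusions onto the site: query to strip out the list pages and see what's left, roughly

site:example.com -inurl:best -inurl:popular

where the -inurl: patterns are just stand-ins for however your list URLs are actually built.)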
My newest sites are crawled far less than my old ones. Links seem to encourage crawling as well. My oldest sites with the most QUALITY links perform better than my old sites with fewer links. Strictly in terms of crawling.