

How to Increase Google Crawling Rate?

     
6:55 am on June 10, 2008 (gmt 0)

New User

5+ Year Member

joined:Oct 27, 2006
posts: 39
votes: 0


Hello friends,

I have optimized one of my websites.

But on that site, I don't know why Google is taking so much time to crawl my pages. Google crawls my home page every 10 to 12 days, and all the inner pages every 15 to 20 days.

I have already done deep-link submission for all my inner pages, and I also tweak my pages every 3 days.

My site is one year old. My home page is 38 KB and all the inner pages are 20 KB to 30 KB. What do you think? Is a 38 KB page difficult for Google to crawl? Could that be causing the crawling problem?

I also checked in Webmaster Tools that my crawl rate is normal.

Please suggest your ideas on how I can increase my crawl rate in Google.

7:18 am on June 10, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


One step that can help is if you get new backlinks from quality sites. Also make sure that your titles and meta-descriptions are unique and specific to the page they appear on.

Make sure that you have no technical errors creating more than one URL for the same content. Googlebot crawls with a budget, and if it spends part of that budget downloading duplicate content from different URLs on your site, then it has less budget for crawling the rest of your URLs.

7:31 am on June 10, 2008 (gmt 0)

New User

5+ Year Member

joined:Oct 27, 2006
posts:39
votes: 0


All my meta tags are unique, and the content of all my pages is unique as well.

7:54 am on June 10, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts: 37301
votes: 0


Are you 100% sure? Does the server handle all of these URL issues without ever creating a second 200 response?

1. incorrect query strings
2. 404 pages (at both the server level and the platform level)
3. https protocol requests
4. different order of the parameters or virtual folders
5. index.html vs. directory root
6. mixed up letter cases
7. double slashes in the file path

If so, you have a rare creation. I have had many sites come to me for help, and almost all of them have at least one of these issues.
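For illustration, here is a minimal Python sketch of the kind of URL normalization that collapses several of the variants in that list into a single canonical form. The function name and the exact rules are my own, not from this thread, and lowercasing the path assumes a case-insensitive server:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url):
    """Normalize a URL so common duplicate variants collapse to one form.

    Covers a few of the issues above: mixed letter cases, double slashes
    in the file path, index.html vs. the directory root, and the order
    of query-string parameters.
    """
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()
    # Collapse double slashes in the path.
    while '//' in path:
        path = path.replace('//', '/')
    # Lowercase the path (assumes the server is case-insensitive).
    path = path.lower()
    # Treat /index.html as the directory root.
    if path.endswith('/index.html'):
        path = path[:-len('index.html')]
    # Sort parameters so their order no longer creates a new URL.
    query = urlencode(sorted(parse_qsl(query)))
    return urlunsplit((scheme, netloc, path, query, ''))
```

Running every variant through a normalizer like this (and 301-redirecting anything that changes) is one way to keep Googlebot from spending crawl budget on duplicates.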

The best way to increase crawl rate, besides telling Google that you want it with your Webmaster Tools account and within your sitemap, is to have a very prominent site, well respected in your field and widely linked to. The more important Google sees your site, the better your crawl will be.

Also, make sure your server handles If-Modified-Since requests properly, has no DNS problems at all, no major server delays, no load-balancing issues, makes proper use of ETags, and uses compression appropriately, especially on text files.
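As a rough sketch of the If-Modified-Since logic a server should apply (the function and its name are illustrative, not any real server's API): when the crawler's cached copy is still current, answer 304 with no body instead of re-sending the page.

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def conditional_status(last_modified, if_modified_since=None):
    """Decide between 200 and 304 for a conditional GET.

    `last_modified` is when the resource last changed (an aware datetime);
    `if_modified_since` is the raw header value the crawler sent, or None.
    """
    if if_modified_since is not None:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            since = None
        # Unchanged since the crawler's copy: 304, send no body.
        if since is not None and last_modified <= since:
            return 304
    return 200  # Send the full response, with a Last-Modified header.
```

A server that always answers 200, even to conditional requests, makes every recrawl as expensive as the first one.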

If you've got all that handled on your side, then the ball is in Google's court. You cannot force a more frequent spidering, you can only invite it by building a technically solid site with truly world-class content.

8:10 am on June 10, 2008 (gmt 0)

Junior Member

5+ Year Member

joined:May 19, 2006
posts:109
votes: 0


You can force more frequent spidering through a variety of black-, grey-, pink- and white-hat techniques. There are literally hundreds of ways to get more backlinks to your site if the purpose of those backlinks is to increase spider activity rather than to build PR. Mathematically speaking, the more often your pages land in Google's queue of pages to be spidered, the more often you will be spidered.

10:22 am on June 10, 2008 (gmt 0)

New User

5+ Year Member

joined:Oct 27, 2006
posts:39
votes: 0


Yes, I am sure all my meta tags and content are unique. I have checked all my content on copyscape.com.

In Webmaster Tools, the radio button for a higher crawl rate is disabled, so no, that is not possible.

And on the server side my site is OK right now; there are no issues.

10:43 am on June 10, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts: 37301
votes: 0


radio button of high crawl rate is disabled

Only more backlinks/PageRank around your site will undo that grayed out choice. Build some buzz, as they say!

10:55 am on June 10, 2008 (gmt 0)

New User

5+ Year Member

joined:Oct 27, 2006
posts:39
votes: 0


My home page is PR4 and all the inner pages are PR3. I have already done directory submissions, article submissions, and social bookmarking (SBM), and I regularly comment on blogs related to my website's category. The article submission, directory submission, and SBM work is still ongoing.

1:29 pm on June 10, 2008 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 7, 2008
posts:164
votes: 0


One thing that affects how often Google is willing to come (for us, at least) is reducing page execution time to a bare minimum (we shoot for 16 ms) and ensuring that the extra load from increased crawling is handled without an increase in page execution time (the database is most often the bottleneck).
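As an illustrative sketch of tracking that number (the decorator is my own; only the 16 ms target comes from the post above), you can wrap each page handler and record its execution time per request:

```python
import time

def timed(handler):
    """Wrap a page handler and record its execution time in milliseconds."""
    timings = []
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = handler(*args, **kwargs)
        # Append elapsed wall-clock time for this request, in ms.
        timings.append((time.perf_counter() - start) * 1000.0)
        return result
    wrapper.timings = timings
    return wrapper
```

Watching those timings under crawl load is one way to spot when the database, not the page code, is what slows the response down.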