Google SEO News and Discussion Forum

Current optimal page size for Google?
ichthyous
msg:3693425
4:51 pm on Jul 8, 2008 (gmt 0)

I recently put my site on a diet, as I had some pages at 300-500K. I cut out unnecessary CSS and JavaScript, reduced image sizes and quality, split pages that had too much content into two, and so on. Since my site is basically an image gallery, I need to have very high quality, full-size images on the site. That means many of my pages are still 150K, and a few are still around 200K. Are these file sizes still considered problematic for Google and the other search engines?
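
For anyone who wants to put a number on their own pages, here is a minimal Python sketch (standard library only; the gallery URL is a placeholder) that totals a page's HTML plus the images, scripts, and stylesheets it references:

    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class AssetFinder(HTMLParser):
        # Collect the URLs of images, scripts, and stylesheets on the page.
        def __init__(self):
            super().__init__()
            self.assets = []
        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag in ("img", "script") and attrs.get("src"):
                self.assets.append(attrs["src"])
            elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
                self.assets.append(attrs["href"])

    def transferred_bytes(url):
        # Bytes actually sent over the wire for one asset (0 on failure).
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return len(resp.read())
        except OSError:
            return 0

    page = "http://www.example.com/gallery.html"   # placeholder URL
    html = urllib.request.urlopen(page, timeout=10).read()
    finder = AssetFinder()
    finder.feed(html.decode("utf-8", errors="replace"))

    total = len(html) + sum(transferred_bytes(urljoin(page, a))
                            for a in set(finder.assets))
    print("HTML alone:    %.1f KB" % (len(html) / 1024.0))
    print("HTML + assets: %.1f KB" % (total / 1024.0))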

Thanks

 

Ajaxunion
msg:3693598
7:22 pm on Jul 8, 2008 (gmt 0)

I would say that it depends on the speed of your server. If you don't have a fast server and your site is slow, don't make your pages heavy: the big G likes speedy sites. Many people say to try to limit your pages to 100K, but I don't think that rule applies anymore. There are so many more important things to worry about than page size. If most of your pages are under 100K, you should be safe.

I'll give you a hint: go to www.google.com and do a random search. Look at each result and see how many KB each one is. If many people in your industry have 500K pages ranking number one (please show me this if you find it!), then you don't have to worry, but in most cases you don't see pages of more than 100K for very competitive keywords.
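
If you'd rather script that check than eyeball it, a minimal Python sketch along these lines would do (the result URLs are placeholders you'd paste in by hand from your own search):

    import urllib.request

    # Paste in the URLs that rank for your keyword; these are placeholders.
    ranking_urls = [
        "http://www.example.com/",
        "http://www.example.org/some-page.html",
    ]

    for url in ranking_urls:
        try:
            html = urllib.request.urlopen(url, timeout=10).read()
            print("%6.1f KB  %s" % (len(html) / 1024.0, url))
        except OSError as err:
            print("failed: %s (%s)" % (url, err))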

Good luck!

ichthyous
msg:3693610
7:37 pm on Jul 8, 2008 (gmt 0)

My server is very fast, fortunately. It would be hard for me to get most of my pages under 150K... my home page is now 158K. I'm continuing to whittle away at it, though. I can't wait to see the effect that slashing my page sizes has on indexing speed and ranking.

Receptional Andy
msg:3693623
7:55 pm on Jul 8, 2008 (gmt 0)

If many people in your industry have 500K pages ranking number one (please show me this if you find it!), then you don't have to worry, but in most cases you don't see pages of more than 100K for very competitive keywords.

But then, 500k pages are certainly not the norm, so it's not expected that you would see many prominently placed. They should be a rarity, purely based on natural distribution.

I need to have very high quality, full-size images on the site

Image size is irrelevant to how a page performs in search results: images are collected and indexed separately. The size in KB shown next to results is purely the HTML size, and even Google AdWords only measured HTML download time the last time I checked.

Make the page load as fast as humanly possible, since every visitor likes a fast page, and being popular is great SEO. But don't stress too much about an overall size figure: look at Wikipedia articles for competitive single words, or at many major brand websites, for quick evidence that page size alone doesn't stop you from performing for competitive phrases.

IMO, there's no optimal page size for Google (if indeed there ever was!): there isn't even an optimal amount of text. It's a judgement call based on the topic, the author and who's competing.

ichthyous
msg:3693652
8:13 pm on Jul 8, 2008 (gmt 0)

If images are not included in the mix with Google, then there are no worries, as my HTML, CSS, and scripts are around 50-70K combined. Still, some of my pages were verging on 600K before, so I expect cutting that down by over two thirds will provide a bump. I always thought my bounce rates were unusually high, and now I know why! Hopefully the site will be stickier now and more people will link to it and bookmark it.

Receptional Andy
msg:3693695
8:53 pm on Jul 8, 2008 (gmt 0)

Excessive page load times remain a serious usability problem, especially for sites that depend on turning first-time visitors into repeat visitors (first impressions count).

It's also easy for a site owner to forget that when they view their own site, nearly everything is already cached and loads extremely quickly. A 500K page is likely to take longer than ten seconds to fully load for the majority of users, and most new users don't have that kind of patience.

Of course, the numbers aren't really the problem for usability either: content below the fold can take considerably longer to load without much harm, provided the user immediately sees a top half of the page that exactly matches their needs.

ayalon
msg:3694543
4:35 pm on Jul 9, 2008 (gmt 0)

Do you use gzip output compression? It makes your pages much smaller on the wire...
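
A quick way to verify that compression is actually being served, as a minimal Python sketch (standard library only; the URL is a placeholder): request the page with an Accept-Encoding: gzip header and compare the on-the-wire and uncompressed sizes.

    import gzip
    import urllib.request

    url = "http://www.example.com/"   # placeholder URL
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = resp.read()
        if resp.headers.get("Content-Encoding") == "gzip":
            raw = gzip.decompress(body)
            print("gzip is on: %.1f KB on the wire, %.1f KB uncompressed"
                  % (len(body) / 1024.0, len(raw) / 1024.0))
        else:
            print("no gzip: %.1f KB transferred as-is" % (len(body) / 1024.0))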

Robert Charlton
msg:3695988
3:48 am on Jul 11, 2008 (gmt 0)

File size may well affect how many pages get spidered during a Googlebot visit.

The higher your PageRank, the more time Googlebot will spend on the site. So if you have a lot of pages and low PR, you don't want your files to be large.
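
As a back-of-the-envelope illustration (the budget figure below is an invented assumption, not a published number): if the per-visit budget were roughly fixed, halving your page size would double the pages fetched per visit.

    # Hypothetical, invented crawl budget per Googlebot visit, in KB;
    # purely for illustration, since Google doesn't publish one.
    budget_kb = 5000

    for page_kb in (25, 100, 200, 500):
        print("%3d KB pages -> roughly %d pages per visit"
              % (page_kb, budget_kb // page_kb))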

Patrick Taylor
msg:3696141
8:52 am on Jul 11, 2008 (gmt 0)

... how many pages get spidered during a Googlebot visit.

Is this how it works? I mean, doesn't Googlebot visit pages rather than a 'website'?

I agree with the benefits of fast-loading pages for humans, who don't like to wait. I'm curious to know, though, how long Googlebot is prepared to 'wait' on a particular page. I assume Googlebot is basically a program that reads pages and puts the relevant content into a database (without downloading images), and that it has a timeout set by Google.

tedster
msg:3696502
5:39 pm on Jul 11, 2008 (gmt 0)

As I understand it, Googlebot has a "crawl budget" for each domain. Most (all?) of its URL requests come from a list of URLs for the domain that were previously discovered, prioritized, and filed away for future spidering adventures.

The crawl team has some pretty complex and evolving algorithms for prioritizing and budgeting, and it's not clear to me whether the budget sets a total time, a total bandwidth, or (most likely) some combination of the two. Consideration is given both to Google's needs and to accommodating the server's abilities.

g1smd
msg:3696799
12:01 am on Jul 12, 2008 (gmt 0)

I rarely have an HTML page with combined HTML code and worded content over about 25 to 30 KB.

All CSS and JS goes in external files, and images are of course separate too.

The bot retrieves the HTML and nothing else. Browsers pull everything needed to render.

Google seems happy to index 95% or more of the pages, though it takes a few weeks to get everything, with coverage growing slowly.
