Forum Moderators: Robert Charlton & goodroi

Page size limit for Google indexing?

migthegreek

12:39 pm on Dec 3, 2008 (gmt 0)

10+ Year Member



I can't find any recent reliable sources on Google's current limit for indexing pages. An SEO guy told me it's 100KB, but he doesn't actually know much about technical web dev, so I'm reluctant to simply take his word for it. That sounds very outdated to me... it's 2008, and plenty of perfectly acceptable web pages exceed 100K these days.

What's the general consensus at the moment? Most of what I've found dates from around 2005/6.

Also, does this size limit include page components such as external JavaScript files (jQuery alone adds 75KB of JS for me), or does it count only the HTML, with images and CSS measured separately?

tedster

6:38 pm on Dec 3, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hello migthegreek, and welcome to the forums.

I can find many pages in search results where the HTML file alone is more than 100K these days. And when I click on the cached page link, I can see the entire text content of those pages. If a page went past some cut-off point, the cached copy would be truncated; I do sometimes see that with very large PDF files, for instance.
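If you want to check the raw HTML weight of one of your own pages, a quick sketch using curl (the URL is just a placeholder, substitute your own):

```shell
# Fetch a page and report the size of the HTML document alone, in bytes.
# http://www.example.com/ is a placeholder URL.
curl -s http://www.example.com/ -o page.html
wc -c < page.html
```

Note this counts only the HTML itself, not external JavaScript, CSS, or images, which is the same distinction migthegreek is asking about.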

Perhaps your SEO friend was more concerned with ranking than with indexing. Very long pages can have a lot more trouble ranking, for several reasons. They are often not very user friendly, especially for people on slower connections. And a very long page often covers more than one theme, which can cause ranking problems too: too much semantic variation on a single page.

If the source code's size includes a lot of inline scripts or CSS, those can be moved into linked external files to help bring down the HTML size. But even 100K of pure text makes for a lot of user challenges; I'd say breaking it up into more than one page would make sense in almost every case.