|Google and 100K page size limit?|
Two questions that came up in this week's rants from cold-calling SEO sales reps:
1) Google originally stated a 100K page size limit "recommendation". Does that exist any more?
2) Does that only include code, or images?
One guy was adamant that my 69 KB animated GIF would make Google view my site as slower and therefore rank it lower.
Total WebPage Size - 17594 (bytes)
Visible Text Size - 6376 (bytes)
Size of HTML Tags - 11218 (bytes)
Text to HTML Ratio - 36.73%
Number of Images - 19
Largest Image Size - 71372 (bytes)
Size of All Images - 128690 (bytes)
Grand Total: 146284 (bytes)
Total Size: 425874 bytes
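A quick sanity check on those figures (just the arithmetic, using the byte counts reported above; the remaining ~273 KB in the 425874-byte total is presumably other resources such as scripts and CSS, which the tool doesn't break out):

```python
# Figures reported by the page-size tool above (bytes).
visible_text = 6376
html_tags = 11218
total_webpage = 17594   # HTML only
all_images = 128690
grand_total = 146284    # HTML + images

# The HTML total is visible text plus markup.
assert visible_text + html_tags == total_webpage

# The grand total is HTML plus images.
assert total_webpage + all_images == grand_total

# Text-to-HTML ratio: the tool reports 36.73%, though a straight
# visible-text / total-HTML division gives about 36.24% -- the tool
# presumably counts "text" slightly differently.
print(f"{visible_text / total_webpage:.2%}")
```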
|These days, Google will index more than 100K of a page, but there’s still a good reason to recommend keeping to under a hundred links or so: the user experience. |
If your page is close to half a megabyte, there are probably things you can slim down or simply remove. This will improve the experience for users on slower internet connections and mobile users.
|While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal |
Technically Google does look at site speed, but I doubt a 69 KB image will significantly impact your rankings.
I would suggest trying to balance user experience with page size. Is the animated gif just pretty or does it demonstrate an important detail that users want to know? If your site is a wedding photographer's gallery then you probably need to load a large amount of images, but if your site is a local car repair shop you probably don't need many high resolution images.
The original 100K recommendation was for text and HTML, not including images. When Google originally made that recommendation, Googlebot would only index the first 100K of a page.
Today Googlebot WILL index far more than 100K of a page. 100K downloads very quickly for most users, even those on cell phones.
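To put "downloads very quickly" in rough numbers, here is a back-of-envelope transfer-time calculation for a 100 KB payload (the link speeds are illustrative assumptions, and latency, DNS lookups, and rendering time are all ignored, so real-world load times will be higher):

```python
# Rough transfer time for a 100 KB payload at a few link speeds.
# Ignores latency, DNS, and rendering -- raw transfer only.
page_bytes = 100 * 1024

speeds_bps = {
    "56k dial-up": 56_000,
    "~1 Mbps (slow mobile)": 1_000_000,
    "~5 Mbps DSL": 5_000_000,
}

for name, bps in speeds_bps.items():
    seconds = page_bytes * 8 / bps
    print(f"{name}: {seconds:.1f} s")
```

Even on dial-up that's under 15 seconds, and on any modern connection it is well under a second, which is why the raw 100K figure matters much less than it used to.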
I would also suggest paying attention to load time under real world conditions. There are lots of things to do to decrease page load time, only some of which involve decreasing the amount of content.
First, make sure your web server enables gzip compression. That can shrink bandwidth bills by 50% or more and really speed up downloads for visitors.
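As a quick illustration of why gzip helps so much with HTML (a self-contained sketch using Python's standard gzip module on some made-up, repetitive markup):

```python
import gzip

# HTML is full of repeated tags and attributes, so it compresses
# extremely well. This sample markup is invented for the demo.
html = ("<div class='item'><a href='/page'>Link text here</a></div>\n" * 200).encode()
compressed = gzip.compress(html)

ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0%} of original)")
```

On real pages the savings are smaller than on this artificially repetitive sample, but 50%+ is typical for HTML. Enabling it server-side is usually a small config change (e.g. mod_deflate in Apache, or the gzip module in nginx).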
Use Google's Page Speed browser plugin or Yahoo's YSlow to see what is actually slowing your site down. Most of the time the culprit isn't large file sizes.
I don't know what the problem is, but when I check my site's performance in WMT, it shows that my average page load time is about 8-10 seconds!
However, when I check it with the Google Page Speed extension, I get a 94/100 score and a load time of about 1-1.5 seconds. What is the problem? Is WMT's site performance analysis fake?
WMT uses real users with the Google Toolbar or Chrome browser installed. Their load time will include DNS lookup time and network latency that you may not see when measuring your site yourself locally. Try some other browser testing sites and see if your site is slow to load from a variety of cities around the world.
WMT also may not be measuring many users. It will say either "These estimates are of high accuracy (more than 1000 data points)" or "These estimates are of low accuracy (fewer than 100 data points)". If it is the latter, then you may not need to pay much attention to it, especially if the graph varies widely. It may just have happened to hit a few users from North Korea browsing on 2400 baud modems.
|These estimates are of high accuracy (more than 1000 data points). |
So that's not good.
What I'd like to add is that my server is in the USA, but I'm in Europe, and I still get 1-1.5 second load times.
I also tested it with several online page testers such as zoomp and GTmetrix. They both looked all right.
In the old days, Google used to report the HTML page size in green after the URL. At least several years ago they were reporting sizes up to at least 300 KB, and then they removed the annotation, presumably because size didn't matter any more.
I found something:
- Some of my ads are acting strangely: on something like every 10th-15th pageload, one of the ads responds very slowly. I'll test it and see what happens.
Also, I have two other questions:
1) Delayed loading of the page:
Some parts of my site (pictures and ads) load lazily in the browser: these parts only appear once you have actually scrolled to that part of the page (done by an Ajax script). So how does Google measure these parts when it's calculating page speed?
2) Google Analytics script:
In Google Analytics, the bounce rate and time-on-page figures are misleading, but there is a script that re-triggers Google Analytics periodically. Since I put this script in (about 3 days ago), my bounce rate is really low and page visit length is about 4 times what it was. But because of this script there is a never-ending pageload, since it keeps loading GA every now and then. Does this affect the measured page speed? (I only just put this script on my site, so the WMT page speed problem isn't because of it.)
I'd be tempted to insert a few lines of code into your page generation script that say: if the user agent is a bot, don't bother serving the ads.
Well, first, that would be cloaking, which can get my whole website penalized for good, so never do that!
And secondly, it wouldn't do anything about page speed, because Google calculates page speed based on real people using the Google Chrome browser.
Cloaking is where you fill the page with text that the search engine sees and indexes but which a regular user never sees.
I don't see a problem here.
Given that they now evaluate pages for "too many ads above the fold", I'm pretty sure that they would consider it cloaking if you didn't show the ads to the bot but did show them to the visitors.
Their help page defines cloaking in terms of "content", not in terms of "text": [support.google.com...] Although the examples they give are all about the addition of text when serving to the bot.
Why show an ad to something that will never click it? It's a waste of time, and it falsely boosts the impression figures: the ad was shown 40,000 times, but 35,000 of those were search engine bots. And now that Google can run JS and perform POSTs, I guess they'll even appear to click the ads in some cases.
|And now that Google can run JS and perform POSTs, I guess they'll even appear to click the ads in some cases. |
I can see where that could cause some huge problems, too. Will AdSense disregard clicks on AdSense ads from Googlebot, or will Googlebot be programmed not to crawl AdSense code? Some affiliate programs track the number of clicks and compare them to the number of actions or sales, and pay per click based on that ratio. A bot running through a site can get your account closed due to invalid clicks.
I agree that ads shouldn't be cloaked to reduce page size; that could cause a lot of problems. Google will likely penalize sites that do this, determining that the ads are being hidden to avoid "above the fold" detection. Plus, I'm not sure page size really matters so much these days. One site I know of has been listed at #1 for years and the page is huge. They just keep posting new content to the one page, and have done so for years. Last time I checked, it was well over 100K, and that doesn't seem to have had a negative impact on it.