| 11:26 am on Mar 23, 2005 (gmt 0)|
Google usually only indexes the first 100K of a page. Anything above that won't appear in the index, so it won't be found.
| 11:37 am on Mar 23, 2005 (gmt 0)|
He is correct.
By reducing the image weight you make your web site easier for the user to navigate: statistics show that if a page takes more than 15 seconds to open, it gets abandoned.
So if you reduce the image weight the page opens more quickly and the user has a better experience navigating it.
Not only that: search engines "prefer" and index better simple pages rich in good content rather than pages full of images.
Code in general has a "weight" for the search engines; that is beyond question: too much code, or improper code, can result in a penalty or at least in difficulty ranking.
Usually, to make a page "search engine friendly" its code has to be validated with purpose-made software that cleans it up and slims it down.
You can get an idea by visiting this web site: www.
But it's not something you should do unless you know exactly what you're doing.
| 12:25 pm on Mar 23, 2005 (gmt 0)|
<statistics show that if a page takes more than 15 seconds>
Off topic but I have it as 8 seconds.
| 1:08 pm on Mar 23, 2005 (gmt 0)|
The quicker a page loads, the better. You might consider not just shrinking the page but splitting it into several pages.
| 1:19 pm on Mar 23, 2005 (gmt 0)|
That's a very good suggestion;
That way you can also increase the search terms you cover and the general visibility of the site.
| 1:46 pm on Mar 23, 2005 (gmt 0)|
Keep in mind that this is all relative. You can have a heavy page; the structure of that page will determine how it loads for the user.
For example, if you utilize source ordered html/xhtml, that 20k left hand navigation menu can be placed at the bottom of the html code. Or, that 30k top navigation menu can be placed down there too.
If for some reason you have pages that cross the 100k mark, source ordered html/xhtml might be in order. If that is not an option, then you should take your SEO's advice and start trimming out the fat. ;)
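For anyone unfamiliar with source-ordered markup, here is a minimal sketch (my own illustration, not code from this thread): the main content comes first in the HTML so a spider reads it early, and CSS positions the navigation back where the visitor expects it.

```html
<!-- Hypothetical layout: the nav is LAST in the source, FIRST on screen. -->
<html>
<head>
<style type="text/css">
  #nav  { position: absolute; top: 0; left: 0; width: 150px; }
  #main { margin-left: 160px; }
</style>
</head>
<body>
  <div id="main">
    Main copy and keywords come first in the source ...
  </div>
  <div id="nav">
    <!-- the heavy 20-30K navigation menu lives down here -->
  </div>
</body>
</html>
```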
| 1:56 pm on Mar 23, 2005 (gmt 0)|
the size of the images is a usability issue (becoming less so with the wider spread of broadband),
but it is NOT an SEO issue. the 100k limit for pages that gets bandied about does NOT include images; a spider does not care how big they are.
having said that, smaller pages do often rank better (although this may be for reasons other than the fact that the page is 'small'), but the size of the images is not a consideration in ranking
| 3:28 pm on Mar 23, 2005 (gmt 0)|
okay who do i believe?
| 3:32 pm on Mar 23, 2005 (gmt 0)|
>>the 100k limit for pages that is bandied about does NOT include images
This is true. Image size does not affect spidering. It is the size of the code alone.
| 3:50 pm on Mar 23, 2005 (gmt 0)|
I think Specter was referring to image problems WRT page load time as opposed to SE spidering. SE's definitely do not care about the size of images on your site.
| 4:13 pm on Mar 23, 2005 (gmt 0)|
i'd agree that images can be a problem wrt page load times ...
however in nearly all cases this can be addressed by compressing the images properly; buying good compression software such as Ulead SmartSaver Pro (i find it much easier to use than ImageReady/Photoshop) can work wonders.
i often see 200k graphics which could have the same dimensions and quality at a fraction of the overhead if they were compressed properly.
sometimes there are too many on a page ... my favourite competitor has product image galleries showing about 50 per page, with each one weighing in at around 100k+ - now that's one hell of a download for a single web page! i can't believe they sell anything from the website
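As a rough sketch of the same idea with a free tool instead of a commercial one (this is my illustration, not the poster's workflow), the Pillow library can re-save an image at the same pixel dimensions with a lower JPEG quality and a much smaller file:

```python
# Sketch: re-compressing an image in memory and comparing file weights.
# The flat-colour image below is a stand-in for a real product photo.
from io import BytesIO
from PIL import Image

photo = Image.new("RGB", (800, 600), (180, 40, 40))

heavy = BytesIO()
photo.save(heavy, "JPEG", quality=95)                  # what often gets uploaded

light = BytesIO()
photo.save(light, "JPEG", quality=60, optimize=True)   # visually close, far lighter

print(f"q95: {len(heavy.getvalue())} bytes, q60: {len(light.getvalue())} bytes")
```

The same dimensions reach the browser either way; only the download weight changes.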
| 4:13 pm on Mar 23, 2005 (gmt 0)|
okay so is my SEO incorrect to advise reducing the image weights and HTML code for the purpose of better indexing by search engines?
| 4:20 pm on Mar 23, 2005 (gmt 0)|
simple: do your favourite search on Google and look at the page sizes of the results - they are always shown in K (this is the size minus the images)
compare the sizes with the size of your pages, if they are significantly smaller, then consider that he may have a point.
... but remember its not just the size its what you do with it that counts.
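To put numbers on that comparison, here is a rough sketch (mine, not the poster's) of checking a page's HTML weight the same way Google reports it: raw HTML only, since image files are fetched separately and are not counted.

```python
import urllib.request

INDEX_LIMIT_KB = 100  # the roughly-100K figure discussed in this thread

def html_weight_kb(html_bytes: bytes) -> float:
    """Weight of the raw HTML in kilobytes (images are separate files
    and are not counted, matching the 'k' figure in Google's results)."""
    return len(html_bytes) / 1024

def report(url: str) -> None:
    # Fetch just the HTML document, as a spider would.
    with urllib.request.urlopen(url) as resp:
        kb = html_weight_kb(resp.read())
    verdict = "over" if kb > INDEX_LIMIT_KB else "under"
    print(f"{url}: {kb:.1f}K ({verdict} the {INDEX_LIMIT_KB}K mark)")
```

Run `report()` on your own pages and compare the figure with the sizes shown next to competing results.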
| 4:36 pm on Mar 23, 2005 (gmt 0)|
He was not wrong in advising you that it is probably better to keep your page size below 100K. Similarly, he was correct in telling you to reduce the size of your images, but if that advice was based purely on SEO, then ...?
If you have a page with 200K of text, only 100K will be indexed, meaning that your site will not be found for any of the keywords or phrases that appear in the "bottom" 100K.
Google normally spiders up to 100K of a web page's text. Images are not included, so they have no impact on your final position in the SERPs. As has already been stated, large images mean that your page will take longer to load. Any longer than 8 seconds and you risk your visitors surfing off into the sunset.
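A small sketch (my illustration, assuming the ~100K truncation described above) of checking whether a phrase falls inside the portion a spider would actually read:

```python
SPIDER_LIMIT = 100 * 1024  # bytes of HTML Google was said to index per page

def indexed_portion(html: str, limit: int = SPIDER_LIMIT) -> str:
    """The first ~100K of the page -- the only part assumed to be indexed."""
    return html.encode("utf-8")[:limit].decode("utf-8", errors="ignore")

def phrase_is_indexed(html: str, phrase: str) -> bool:
    return phrase.lower() in indexed_portion(html).lower()

# A 150K+ page: a keyphrase buried in the "bottom" part is invisible.
page = "x" * (150 * 1024) + "blue widget reviews"
print(phrase_is_indexed(page, "blue widget reviews"))  # prints False
```

Anything you want to rank for should sit in the portion above the cutoff.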
| 7:11 am on Mar 24, 2005 (gmt 0)|
|I think Specter was referring to image problems WRT page load time as opposed to SE spidering. |
Exactly. I aim to optimize a web site not only for the SEs but also for the users; you could have thousands of visitors per day, but if your pages are "heavy" to open, or the site is complicated or difficult to navigate, you have failed your goal.
|SE's definitely do not care about the size of images on your site. |
This is true. As I said:
|search engines "prefer" and index better simple pages rich in good content rather than pages full of images. |