| 1:43 pm on Sep 3, 2012 (gmt 0)|
Welcome to the forums, ceazer2801.
There are many factors that play into page rendering speed and it can be difficult to focus on what your site's issues are with just a discussion. There are a few utilities online that can help you zero in on page speed problems - often based on the add-ons that were developed by Yahoo and Google.
I like GTMetrix [gtmetrix.com], which shows you page speed scores from both Yahoo's YSlow tool and Google's Page Speed tool in one analysis. In addition to your page's overall score, you get a lot of details about each of the potential areas that might be slowing down your site.
| 4:26 pm on Sep 3, 2012 (gmt 0)|
Thanks for the welcome, tedster, and the useful info. OK, so I have checked out GTMetrix... awesome. Now there are a few things causing a page speed score of 66 (D):
#1 is image optimization. I used ImageReady to "optimize" the images, but I have a score of 1 (F) for that section, so I tried Yahoo's Smush.it, which helped and got that section up to 28 (F), but it still says I have a long way to go. Is there a better piece of software for lossless compression?
#2 Leverage browser caching... after a little googling I came up with this bit of code:
ExpiresByType text/html "access plus 1 seconds"
ExpiresByType image/gif "access plus 1 years"
ExpiresByType image/jpeg "access plus 1 years"
ExpiresByType image/png "access plus 1 years"
ExpiresByType image/x-icon "access plus 1 years"
ExpiresByType text/css "access plus 1 years"
ExpiresByType application/x-shockwave-flash "access plus 1 years"
but no direction as to where it should go. I have messed around with it a bit, but to no avail. Is this heading in the right direction for caching, or should I start over on this issue? If this is good code, should it be in the header or with the JS at the bottom?
#3 Last but not least was my gzip compression. This one is new to me, but from what I can tell it's basically putting your page into a "zip" file for the web... is this correct, or do I just need to quit while I'm ahead?
Sorry if this is a bit much. I only sit down to do web design once every 3-4 years, and it seems like every time I do, everything about it changes completely... I miss the easy web design days of 2002... lol, 5 or 6 hours with FrontPage and BAM, you're done! haha
| 7:32 pm on Sep 3, 2012 (gmt 0)|
You may be able to leverage browser caching, but it's not where I would focus my first efforts. The information you posted is for Apache server settings, not something you place in the source code of your page.
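For reference, directives like the ones posted above would normally go in the main server configuration or in an .htaccess file at the site root, not in the HTML. A minimal sketch, assuming an Apache server with mod_expires available (worth confirming with the host):

```apache
# .htaccess sketch - assumes mod_expires is enabled on this host
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/html "access plus 1 second"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType text/css "access plus 1 year"
</IfModule>
```

Note the ExpiresActive On line: without it, the ExpiresByType directives are ignored, which could explain why experimenting with them seemed to do nothing.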
Images are one of the most frequent causes of slow page loading. From what I've seen, Photoshop is still the king of image compression. With a JPG file, using Photoshop's "Save for Web" (it taps into ImageReady), you can go to 40% quality most of the time and not see visible degradation of the image.
For gif and 8-bit png files, 16 colors is often all you need. But I've never seen an online tool that can do what Photoshop's compression algorithms can do - especially after a little learning and practice.
You don't need the latest version of Photoshop if cost is an issue. Its compression algorithms have been awesome for many versions.
| 7:40 pm on Sep 3, 2012 (gmt 0)|
I do have an older copy of PS (CS2), so I will try that instead. Thanks!
If gzip is a server function, then how would I utilize it? Looks like I need to do some homework on this!
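To sketch one answer, hedged because it depends on the host: on Apache, gzip output is usually handled by mod_deflate, configured on the server or in .htaccess rather than in the page itself. Something like:

```apache
# .htaccess sketch - assumes mod_deflate is available on this host
<IfModule mod_deflate.c>
  # compress the text-based responses; images are already compressed
  AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>
```

One way to check whether it took effect is to request a page with curl -H "Accept-Encoding: gzip" -I and look for a Content-Encoding: gzip response header.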
Tedster, you have been very helpful as is this forum! Thanks to you and the rest of the gang that created, operates, and moderates this site!
| 8:56 pm on Sep 3, 2012 (gmt 0)|
Examine the image properties to make sure that you are not trying to maintain the original resolution. If you are on Windows, download IrfanView (free), a desktop program that will help you set the image to a proper web resolution. I have seen people using images that were far too big for any screen and still at 500 dpi, so the file size might be several MB. For most web images, 96 or even 72 dpi is sufficient for a good representation of an image. If your PS lets you "save for web," that will help.
| 5:33 pm on Sep 9, 2012 (gmt 0)|
OK, so I have verified that all the resolutions are the appropriate size and saved them all "for web" with Photoshop at 30%, significantly reducing the total image size for the site. This raised my score with GTMetrix, but I still fail at the browser cache and gzip. I have looked around the web for some info, and for the life of me I cannot find how to set the cache expiration for the browser. Can someone give me a nudge in the right direction here? Is this just a command that is put into the header, or what?
| 5:35 pm on Sep 9, 2012 (gmt 0)|
Oh, BTW, it still takes almost 2 minutes to load the page (any of them). I really think there is something screwy in the code... perhaps the background images... I really don't know.
| 5:53 am on Sep 10, 2012 (gmt 0)|
If heavy images are loading on your web site, then remove them.
| 12:08 pm on Sep 10, 2012 (gmt 0)|
2 minutes is absurd.
What are the actual image sizes in KB?
I'd consider other issues, like the server you are using being bottlenecked - are you paying for hosting?
Or maybe your own internet connection is the problem.
| 12:24 pm on Sep 10, 2012 (gmt 0)|
How long does it take a blank HTML page to load - that is,
just the html, head, and body tags?
| 2:09 pm on Sep 10, 2012 (gmt 0)|
This is how I know it must be code... total PAGE size 83.8K! The server is great; I have another site with them and no problems whatsoever. My connection is also pretty good; I get 1.2 MB/sec down and half that up on a speed test. I have also tried the site on several other computers at different locations... still the same. I know I'm not supposed to post URLs, but I want y'all to see what I see as to how basic the site is: www.(removed so I don't piss off mods).com <---- and no, I did not choose the world's longest URL... lol, that was their decision.
[edited by: ceazer2801 at 2:38 pm (utc) on Sep 10, 2012]
| 2:19 pm on Sep 10, 2012 (gmt 0)|
Mods will remove your domain, but meanwhile...
From France everything loads reasonably fast ..under 4 seconds..except the "Spanish" button "hangs it", as do the other 3 buttons..takes load time to 20 seconds to get the "Spanish" one and the 2 on the left ..another 10 seconds to get the one on the right..
Those are your bottlenecks ..right there ..HTH
These days 83K is not big at all ..most people are on fast pipes..sure, the smaller the amount of total data transferred on page call, the better ..but it is your buttons that are causing you the grief..
Once the buttons are in "cache" then a click on them to get elsewhere means they are instant on the next page..the new images then appear slower than them, but the new page still loads in under 5 seconds..
[edited by: Leosghost at 2:25 pm (utc) on Sep 10, 2012]
| 2:22 pm on Sep 10, 2012 (gmt 0)|
Ah ha... thanks Leosghost... I wonder if it's the CSS for "on hover" then.
| 2:35 pm on Sep 10, 2012 (gmt 0)|
Quick look at source and code makes me think it might be because your on-page code makes "relative" calls to all images (including buttons)..but your CSS makes "canonical" calls (full site URL ..http://www.example.com/imagefolder/image.jpg)..
edit ..on a further look ..your page code is calling some of your buttons "canonical"..full path (full site URL ..http://www.example.com/imagefolder/image.jpg)
Make all the "on page" calls (all the <img src= stuff) to images ..buttons, the whole works, "relative", and do the same in the CSS ..
Switch your CSS to reference "relative" to site root ..like your page code does..it may well get faster..
I'd also put the jscript call to the images at the end rather than at the beginning..let 'em have something to look at while jscript is working..
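To illustrate the relative-vs-canonical point (imagefolder/button1.jpg is a made-up path, not one from the actual site):

```html
<!-- "canonical" call: full site URL -->
<img src="http://www.example.com/imagefolder/button1.jpg" alt="home button">

<!-- "relative" call to the same file, referenced from the site root -->
<img src="/imagefolder/button1.jpg" alt="home button">
```

The CSS equivalent would be background-image: url(/imagefolder/button1.jpg); instead of the full http:// form.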
[edited by: Leosghost at 2:44 pm (utc) on Sep 10, 2012]
| 2:41 pm on Sep 10, 2012 (gmt 0)|
Awesome, thanks......I will do that once I get home (at work right now)
| 9:25 pm on Sep 10, 2012 (gmt 0)|
A simple way to see what is taking how long is to use Chrome's inspector:
Right click on background > Inspect element > Network
Now reload the page
You'll see what is loaded in what order and how long it took.
I like that far more than YSlow's letter grading of speed.
As for compression: take care, it can break some things in Firefox.