
Google SEO News and Discussion Forum

    
Site Speed Data Seems Off for PHP Redirects in WMT
smallcompany
msg:4045160 - 11:52 pm on Dec 17, 2009 (gmt 0)

I see that the HTML pages (of my sites, at least) have a loading time of around 1 second, which Google considers fast.

But my PHP redirects that track outgoing links are, according to Google, tremendously slow.
Google shows 4, 5, even 10 seconds, which is a bit odd to me as I've never experienced that myself, and I'm located far from the servers just like any other ordinary user.
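For reference, a redirect tracker of this kind is typically just a short script that logs the click and then issues a 302. A minimal sketch (the file name, link table and log path below are made up):

    <?php
    // out.php - hypothetical sketch of an outgoing-link tracker of this kind:
    // log the click, then send a 302 redirect to the real destination.
    $links = array(
        1 => 'http://www.example.com/',
        2 => 'http://www.example.org/page.html',
    );
    $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
    if (!isset($links[$id])) {
        header('HTTP/1.1 404 Not Found');
        exit;
    }
    // The logging step (DB insert, file append, etc.) is the extra work done
    // before the redirect header goes out.
    error_log(date('c') . " click on link $id\n", 3, '/tmp/clicks.log');
    header('Location: ' . $links[$id], true, 302);
    exit;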

1. Do you see similar reporting for your PHP files that are nothing but redirects? How about regular PHP pages?

2. Is there a legitimate way to exclude files like PHP redirects from Google's speed calculation?

Thanks

 

KenB
msg:4045229 - 3:15 am on Dec 18, 2009 (gmt 0)

I think the site speed calculations are being screwed up by the fact that Google Analytics code and some other Google JavaScript-based widgets don't get gzipped when they are served up to Google's bots. At least this seems to be the primary reason I'm getting dinged by Google Webmaster Tools' page speed report. I've run tests using Opera's Developer Tools and my page speed is way faster than what Google reports.

I really wish they'd fix the gzipping issue with Page Speed so we'd get real page speed data. Of course, I'd also like Google to clean up and reduce the file size of their widgets while they're at it.

OutdoorWebcams
msg:4045392 - 10:50 am on Dec 18, 2009 (gmt 0)

1. Yes, and it drags down the site performance tremendously.

2. I would like to know that too; excluding the redirect scripts via robots.txt, at least, does not help (example below).
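For reference, the robots.txt exclusion meant here - assuming the tracker lives at /out.php - is just:

    User-agent: *
    Disallow: /out.php

It keeps Googlebot from crawling the script, but as the quotes later in this thread note, pages disallowed from crawling can still show up in the load time data.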

It seems that Google thinks the landing page belongs to our websites, and includes the time spent on the DNS lookup, finding a route to the destination host, etc. in its page speed calculation - something like the old hijacking problem.

FranticFish
msg:4045460 - 1:58 pm on Dec 18, 2009 (gmt 0)

It seems that Google thinks the landing page belongs to our websites

If the destination url is hidden from Google then it's hard to see how they could think anything else.

levo
msg:4045518 - 3:17 pm on Dec 18, 2009 (gmt 0)

AFAIK the time calculations are based on users with the Google Toolbar (with advanced features enabled).

The gzip issue is related to Googlebot testing pages with Firebug, and they said it will be fixed soon.

jdMorgan
msg:4045571 - 4:18 pm on Dec 18, 2009 (gmt 0)

This may be a good time to mention that the method of exit link tracking makes a difference, too.
Take a look at how Google does it (or used to do it, anyway). If JS is enabled, replace the exit link tracking with a JavaScript onmousedown event referring to a small object, such as a 1x1 transparent gif image that is marked as non-cacheable on your server, and whose URL has a query string appended with the data you need to track.

If the link is clicked, it fetches that object *and* does a click-through to the target URL. In this way, the delay of running your server-side tracking and redirection script is removed from the time to click through to the target site. In other words, there is no redirect; it is replaced by a simple fetch of a tiny object from your server.

If JS is disabled, proceed as before with your current click-tracking-and-redirect script.
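A rough sketch of that approach (the markup and file names here are illustrative, not the exact code from the linked thread):

    <?php
    // track.php - hypothetical click beacon. The link points straight at the
    // destination; onmousedown merely fetches this tiny image, so no redirect
    // sits between the click and the target site. Example markup:
    //
    //   <a href="http://www.example.com/"
    //      onmousedown="(new Image()).src='/track.php?id=123&r='+Math.random();">example</a>
    //
    $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
    error_log(date('c') . " click on link $id\n", 3, '/tmp/clicks.log');

    // Return a non-cacheable 1x1 transparent GIF.
    header('Content-Type: image/gif');
    header('Cache-Control: no-cache, no-store, must-revalidate');
    header('Pragma: no-cache');
    header('Expires: 0');
    echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
    exit;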

Details of the implementation are in this thread [webmasterworld.com].

Jim

smallcompany
msg:4045692 - 6:50 pm on Dec 18, 2009 (gmt 0)

JavaScript onmousedown event

This means each link has to be maintained separately, right?

The point of PHP, at least for me, was:

- I maintain links in only one or a few files
- I don't need to change links in the HTML code (frequently)
- It neatly connects the programming "rumble": the incoming script sets a cookie and enters data into the DB, and the exiting script picks the data back up from that cookie (see the sketch below)
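A minimal sketch of that cookie hand-off between the incoming and exiting scripts (script names, cookie name and storage are hypothetical):

    <?php
    // Hypothetical sketch - two tiny scripts tied together by one cookie.

    // in.php (landing script): tag the visit with a cookie and record it.
    $visit = uniqid('v', true);
    setcookie('visit_id', $visit, time() + 86400, '/');
    // ...INSERT a row for $visit with referrer/campaign data here...

    // out.php (exit script): read the same cookie back when logging the click.
    $visit = isset($_COOKIE['visit_id']) ? $_COOKIE['visit_id'] : 'unknown';
    // ...UPDATE the row for $visit with the outgoing link id, then redirect as usual...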

Still not sure where anything over 3 seconds is coming from for the PHP files. I'm looking into different servers with different hosts.

OutdoorWebcams
msg:4045710 - 7:31 pm on Dec 18, 2009 (gmt 0)

It seems that Google thinks the landing page belongs to our websites

If the destination url is hidden from Google then it's hard to see how they could think anything else.

As Google's data is (also) collected via the Google Toolbar, the destination URL is not hidden from Google, and something as simple as comparing the destination's domain name with ours could do the trick.

aakk9999
msg:4045753 - 8:21 pm on Dec 18, 2009 (gmt 0)

If you redirect across domains, would the page speed be attributed to the requested URL or to the destination URL? Or to both?

E.g. the page greenwidget.com/mywidgets.html redirects to bluewidget.com/mywidgets.html

Obviously, the redirect takes more time, plus an extra DNS lookup. Would this timing be counted against the greenwidget.com site or against the bluewidget.com site?

KenB
msg:4045760 - 8:31 pm on Dec 18, 2009 (gmt 0)

AFAIK time calculations are based on users with google toolbar (with advanced features enabled)

That would be an interesting way to handle it, but it would only cover a narrow subset of pages for any website. I'd think a more robust way to deal with it would be to use a combo of the Google Toolbar and Google's bots.

levo
msg:4045773 - 8:58 pm on Dec 18, 2009 (gmt 0)

A combo would be "better," since my avg. page serving time to Googlebot is ~40ms, but they are aiming at user experience. If most of your visitors (with the toolbar) are located in India on dial-up connections, you'll get much higher page loading times.

"The performance overview shows a graph of the aggregated speed numbers for the website, based on the pages that were most frequently accessed by visitors who use the Google Toolbar with the PageRank feature activated ... For example, if your site is in Germany and all your users are in Germany, the chart will reflect the load time as seen in Germany."

[googlewebmastercentral.blogspot.com...]

If the destination url is hidden from Google then it's hard to see how they could think anything else.

"As the page load times are based on actual accesses made by your users, it's possible that it includes pages which are disallowed from crawling. While Googlebot will not be able to crawl disallowed pages, they may be a significant part of your site's user experience."

adamxcl
msg:4045976 - 5:41 am on Dec 19, 2009 (gmt 0)

Every negative they give me is related to Google services, like Analytics and Custom Search. And as mentioned, the load times reported are longer than what I see myself as well.

hugh
msg:4046273 - 12:08 am on Dec 20, 2009 (gmt 0)

I've been looking at this metric recently and it makes no sense to me. Rating a website on a client's ability to load a page rather than the server's ability to send the data is just plain silly. If this ever becomes part of the algorithm, what's to stop rivals from installing the toolbar on 300 baud dial-up connections and making a beeline for their competitors' sites?

hugh
msg:4046282 - 12:24 am on Dec 20, 2009 (gmt 0)

It also ignores the fact that many pages will be usable to the reader long before they've finished loading - if image dimensions are specified and JavaScript is loaded at the bottom of the page, for instance...

levo
msg:4046314 - 2:54 am on Dec 20, 2009 (gmt 0)

It also ignores the fact many pages will be usable to the reader long before they've finished loading if image dimensions are in use and javascript has been added at the bottom the page for instance...

I think they want to devalue pages with too many images/external resources, hence the focus on total load time.

If this ever becomes part of the algorithm what's to stop rivals installing it on 300 baud dial up connections and making a bee line for their competitors?

exactly!

smallcompany
msg:4057645 - 7:16 am on Jan 10, 2010 (gmt 0)

BTW,

If I have a setting in my .htaccess like this:

AddType x-mapp-php5 .php .html .htm

which makes HTML files get parsed as PHP, would that contribute to slowing down the site?

Thanks

jdMorgan
msg:4058054 - 2:31 am on Jan 11, 2010 (gmt 0)

Yes, that will slow things down a bit, as the server has to parse each file character-by-character before sending it to the client, looking for <?php or <? tags and then interpreting anything after them.

So obviously, if you don't have any PHP in your .html or .htm files, then remove those extensions from the list.

If you have only *very few* .html and .htm files with PHP code in them, and if you're running PHP as CGI, consider using the XBitHack method (see Apache mod_include for details). Simply put, if you use this option instead of including the filetype in your AddType list, then you set the Owner "Execute" permissions bit (the "X" bit) on each .htm or .html file to indicate that Apache should parse that particular file.

If you've got all of those PHP-containing .htm and .html files off in a separate directory or directory structure, remember that you can have separate config code on a per-directory basis -- either in your server config files within <Directory> containers, or in individual, per-directory .htaccess files. This allows you to set and override the AddType for any directory or directories if your site is so organized.

If you're stuck somewhere in between "a few" and "many" .htm and .html files that need to be parsed, then consider renaming those files to .php as they should have been -- and remember that you can use mod_rewrite to map the old (.htm and .html) URLs to the new .php files, so that you won't have to change your on-page links or worry about losing ranking while the search engines re-index them. You can do this with internal rewrites, and the SEs won't even notice anything has changed.
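A minimal .htaccess sketch of that last approach (assuming the files have been renamed to .php and the rules live in the site root):

    RewriteEngine On
    # Internally serve the renamed .php file whenever an old .htm/.html URL
    # is requested and the .html file itself no longer exists on disk.
    # No redirect is issued, so on-page links and indexed URLs stay the same.
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^(.+)\.html?$ $1.php [L]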

Jim

smallcompany
msg:4058084 - 3:25 am on Jan 11, 2010 (gmt 0)

Jim to the rescue!

It looks like it's time to do the internal rewrites and start running those pages as native PHP.

I use a few includes in the templates (meaning all pages). I could switch those back to HTML.

I'll take two similar sites on the same host, keep one in PHP, and put the other into pure HTML with the PHP parsing taken out.

So far, the only thing I've noticed is that my shared host is faster than my VPSs from two prominent brands.
That told me that PHP alone was not the issue.

PHP files get reported as slow only on the VPSs.

helpnow
msg:4060113 - 6:54 pm on Jan 13, 2010 (gmt 0)

"These estimates are of medium accuracy (between 100 and 1000 data points)."

Gorg has adjusted their message in the WMT Site Performance report. ;)
