But according to Google, my PHP redirects that track outgoing links are tremendously slow.
Google shows 4, 5, even 10 seconds, which is a bit odd to me, as I've never experienced that myself. I'm located away from the servers, just like any other ordinary user.
1. Do you see similar reporting for your PHP files that are nothing but redirects? How about PHP pages?
2. Is there a way of excluding files like PHP redirects from Google's speed calculation - legitimately?
Thanks
I really wish they'd fix the gzipping issue with Page Speed so we'd get real page speed data. Of course, I'd also like Google to clean up and reduce the file size of their widgets while they're at it.
2. I would like to know that too; at least, excluding the redirect scripts via robots.txt does not help.
It seems that Google thinks the landing page belongs to our websites (including the time spent on the DNS lookup, finding a route to the destination host, etc. in the page speed calculation), something like the old hijacking problem.
If the link is clicked, the browser fetches that tiny tracking object *and* does a click-through to the target URL. In this way, the delay of running your server-side tracking and redirection script is removed from the time to click through to the target site. In other words, there is no redirect; it is replaced by a simple fetch of a tiny object from your server.
If JS is disabled, proceed as before with your current click-tracking-and-redirect script.
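A minimal sketch of that onmousedown approach, as I understand it. The tracker path (/t.gif) and the function names are illustrative placeholders of my own, not from this thread:

```javascript
// Hypothetical sketch of the onmousedown beacon technique; the tracker
// path (/t.gif) and function names are illustrative, not from the thread.
var TRACKER_PATH = '/t.gif'; // tiny image served by your logging script

// Build the beacon URL carrying the destination for server-side logging.
function buildBeaconUrl(targetUrl) {
  return TRACKER_PATH + '?to=' + encodeURIComponent(targetUrl);
}

// Attach as: <a href="http://example.com/" onmousedown="trackClick(this)">
function trackClick(anchor) {
  // Requesting the tiny image logs the click; the browser then follows
  // the plain href directly, so there is no redirect in the click path.
  new Image().src = buildBeaconUrl(anchor.href);
  return true;
}
```

Since the href now points straight at the destination, Googlebot and the Toolbar see an ordinary outbound link rather than a slow redirect on your domain.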
Details of the implementation are in this thread [webmasterworld.com].
Jim
JavaScript onmousedown event
This means each link has to be maintained separately, right?
The point of PHP, at least for me, was:
- I maintain links in only one or a few files
- I don't need to change links in the HTML code (frequently)
- I neatly connect the programming "plumbing": the incoming script posts a cookie and enters data into the DB, and the exiting script picks the data up from the cookie
Still not sure where anything over 3 seconds is coming from for PHP files. And I'm looking into different servers with different hosts.
It seems that Google thinks the landing page belongs to our websites
If the destination URL is hidden from Google, then it's hard to see how they could think anything else.
As Google's data is (also) collected from the Google Toolbar, the destination URL is not hidden from Google, and something like comparing the destination's domain name with ours could do the trick.
E.g. the page greenwidget.com/mywidgets.html redirects to bluewidget.com/mywidgets.html
Obviously, the redirect takes more time + extra DNS lookup. Would this timing be counted against greenwidget.com site or against bluewidget.com site?
AFAIK, the time calculations are based on users with the Google Toolbar (with advanced features enabled).
"The performance overview shows a graph of the aggregated speed numbers for the website, based on the pages that were most frequently accessed by visitors who use the Google Toolbar with the PageRank feature activated ... For example, if your site is in Germany and all your users are in Germany, the chart will reflect the load time as seen in Germany."
[googlewebmastercentral.blogspot.com...]
If the destination url is hidden from Google then it's hard to see how they could think anything else.
"As the page load times are based on actual accesses made by your users, it's possible that it includes pages which are disallowed from crawling. While Googlebot will not be able to crawl disallowed pages, they may be a significant part of your site's user experience."
It also ignores the fact that many pages will be usable to the reader long before they've finished loading, if image dimensions are in use and JavaScript has been added at the bottom of the page, for instance...
I think they want to devalue pages with too many images/external resources, hence the total load time.
If this ever becomes part of the algorithm, what's to stop rivals installing it on 300 baud dial-up connections and making a beeline for their competitors?
exactly!
So obviously, if you don't have any PHP in your .html or .htm files, then remove that filetype from the list.
If you have only a *very few* .html and .htm files with PHP code in them, and if you're running PHP as CGI, consider using the XBitHack method (see Apache mod_include for details). Simply put, if you use this option instead of including the filetype in your AddType list, then you set the owner "execute" permissions bit (the "X bit") on each .htm or .html file to indicate that Apache should parse that particular file for CGI code.
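As a rough sketch of the two options above, assuming a per-directory .htaccess and a typical PHP handler name (the exact handler string varies by PHP setup, so treat these lines as illustrative):

```apache
# Option 1: have Apache parse every .html/.htm file as PHP.
AddType application/x-httpd-php .php .html .htm

# Option 2 (the XBitHack method): parse only files whose owner-execute
# bit is set, leaving the rest of your .html files untouched.
XBitHack On
# Then, for each file that contains code:  chmod u+x mypage.html
```

Option 1 costs you parsing overhead on every .html request; option 2 confines it to the files you explicitly flag.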
If you've got all of those PHP-containing .htm and .html files off in a separate directory or directory structure, remember that you can have separate config code on a per-directory basis -- either in your server config files within <Directory> containers, or in individual, per-directory .htaccess files. This allows you to set and override the AddType for any directory or directories, if your site is so organized.
If you're stuck somewhere in between "a few" and "many" .htm and .html files that need to be parsed, then consider renaming those files to .php as they should have been -- and remember that you can use mod_rewrite to rewrite the old (.htm and .html) URLs to the new .php files, so that you won't have to change your on-page links or worry about losing ranking while the search engines re-index them; you can do some internal rewrites, and the SEs won't even notice anything has changed.
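A minimal .htaccess sketch of that internal rewrite, assuming the renamed .php files sit at the same paths as the old .htm/.html files (adjust the pattern to your own layout):

```apache
# Serve old .htm/.html URLs from the renamed .php files.
# This is an internal rewrite, not a redirect, so the URL the
# visitor (and the search engine) sees never changes.
RewriteEngine On
RewriteRule ^(.+)\.html?$ /$1.php [L]
```

Because there is no [R] flag, no HTTP redirect is issued; Apache simply fetches the .php file internally.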
Jim
It looks like it's time to do the internal rewrite and start running those pages as native PHP.
I use a few includes in the templates (meaning all pages). I could switch those back to HTML.
I'll take two similar sites on the same host, do one in PHP, and put the other into pure HTML with PHP parsing taken out.
So far, the only thing I noticed was that my shared host is faster than my VPSs on two prominent brands.
That told me that PHP alone was not the issue.
PHP files get reported as slow only on VPSs.