Forum Moderators: Robert Charlton & goodroi
I myself find this frustration common on busy sites. Even when I know a site is worth the wait, there are times the wait exceeds my tolerance. Over time, as I can no longer stand the constant waiting in line, I go elsewhere, and eventually I am gone for good.
I fully believe errors in code slow the browser down. After all, the browser is an engine that must process the code and render it on a display: the fewer errors it encounters, the less it hitches. I further believe that Google and other engines are aware of this issue and can (and very likely do) adjust the ranking of sites whose code is not up to par. It would be a simple process, really: a code-sensitive robot could easily be tuned to give a higher ranking to sites that validate.
So I ran my site through a recursive validator (capable of processing up to 100 pages at once), and wouldn't you know it: thousands of errors. It was bad.
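If anyone wants a feel for what a validator is catching, here is a minimal sketch using Python's built-in html.parser to flag mismatched open/close tags. This is only an illustration of one class of error, not a substitute for the full recursive validator linked below; the tag-matching logic and error counting are my own rough approximation.

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so they are excluded from matching.
VOID = {"br", "hr", "img", "input", "meta", "link", "area", "base",
        "col", "embed", "source", "track", "wbr"}

class TagChecker(HTMLParser):
    """Counts mismatched open/close tags -- a rough proxy for markup errors."""
    def __init__(self):
        super().__init__()
        self.stack = []   # currently open tags
        self.errors = 0   # close tags that did not match the top of the stack

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors += 1  # close tag with no matching open tag

def count_errors(html):
    checker = TagChecker()
    checker.feed(html)
    checker.close()
    # Tags left open at end-of-document count as errors too.
    return checker.errors + len(checker.stack)

print(count_errors("<p>ok</p>"))           # 0
print(count_errors("<div><p>oops</div>"))  # mismatches reported
```

A real validator checks far more than tag nesting (attributes, doctype conformance, character encoding), but unclosed and crossed tags are exactly the sort of thing that makes a rendering engine hitch.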
Now, I could be wrong, but my main page has always had a PR5, and I wondered why other pages had a lower PR. I also knew my main page was the best on the whole site (i.e., fully validated); however, it isn't necessarily the page that gets the most traffic from the engines.
I just fixed a slew of these errors, and it will be another few days before the site is at 100%, but I am fairly certain a 100% validated site gets more traffic. I also noticed that on the pages that are fixed, the hitches are gone; now it loads like BLAM! Eight seconds on a 14.4k connection...
I feel it is far better to invest my time in fixing errors, a process for which I cannot be penalized and which can only result in more traffic (as opposed to SEO techniques, for example). This, I believe, ultimately affects PR in a direct way (and if nothing else, should certainly result in a rise in the SERPs).
For those of you interested, here is the link:
[htmlhelp.com...]
I am fairly certain a 100% validated site gets more traffic
That's an interesting supposition.
Whilst I take care to validate all my sites, and although I work on accessibility issues too, I'm not sure your supposition is much more than a guess.
Sure, visitors *might* stay longer if a site is faster to load (though good use of CSS is another way to achieve that), but I don't see how it would affect traffic much at all. Can you offer any *real* reason, apart from download time, why *traffic* would increase?
This, I believe, ultimately affects PR in a direct way
I don't think I follow...
And apropos of nothing much at all, I fed Google's homepage into the validation link you cited.
Not a pretty sight... <grin>
DerekH
But having pages that function well IS an issue for ranking. If the errors create page load problems, then lots of visitors may quickly hit the Back button to return to the search results and click on something else -- and THAT behavior can be tracked by Google and may hurt you eventually.
However, the title of this thread asked about PR. PR has only to do with links -- not content, text, coding errors, or anything else. If a mark-up error is bad enough that googlebot can't read a link, then fixing that error can help PR for the page the link points to. That's the only connection I can see between validation and PR.
I feel it is far better to invest my time in fixing errors, a process for which I cannot be penalized and which can only result in more traffic (as opposed to SEO techniques, for example).
And I'd rather invest my time in developing content that will increase targeted traffic and revenues. To each his own. :-)
Because PR is only about links between pages, validation alone doesn't boost PR. It certainly can boost traffic, and your overall ranking may thereby improve -- but no direct PR influence comes from having valid mark-up.
If internal pages have PR, that means the Google spider has gotten to them, and as such, it wouldn't matter whether they loaded slowly or not; they already got the PR they deserved.
If your pages load too slowly on 56k modems, visitors will simply leave. It then follows that if you do not optimize at least your index page to be small in size (HTML, divs, images and all), you will lose visitors. I would say that is webmastering 101.
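For what it's worth, the "8 seconds on a 14.4k connection" figure mentioned earlier is roughly consistent with simple arithmetic. A minimal sketch (the 80% effective-throughput factor is my own assumption to account for protocol overhead, not a measured value):

```python
def load_time_seconds(page_bytes, modem_bps):
    """Rough estimate: ignores latency, TCP overhead, and render time."""
    # Modems are rated in bits per second; assume ~80% effective
    # throughput for protocol overhead (an assumption, not a measurement).
    effective_bps = modem_bps * 0.8
    return (page_bytes * 8) / effective_bps

# An ~11 KB page on a 14.4k modem comes out to roughly 8 seconds:
print(round(load_time_seconds(11 * 1024, 14_400)))  # 8
```

The same 11 KB page on a 56k modem takes about 2 seconds, which is why keeping the index page lean (markup and images combined) matters so much for dial-up visitors.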