I'll make yet a further clarification by reversing yours a bit, BlobFisk. ;) The true statement should be:
One in ten UK websites messes up in Firefox.
Hopefully this is being presented by the BBC as a problem with the websites, not the browser. But somehow I doubt it's being handled that way (haven't found the story myself to check).
Did the report mention how badly things get messed up? I mean, it could be something as simple (and relatively minor) as using the TOPMARGIN-type attributes (or whatever those things are, never used them myself) on the <body> tag.
It is being presented by the BBC as a problem with the websites, not the browser.
So they think that the millions of websites created before Firefox came along, all working fine in everyone's browser (a.k.a. Internet Explorer), are at fault. Yes, heaven forbid that Firefox would actually be developed with the capability of displaying those millions of websites - now that would just be daft ;)
This is largely because web developers are used to testing their sites only in IE rather than in so-called standards-compliant browsers, which render code according to the specifications ratified by the World Wide Web Consortium.
Anyway, the BBC and SciVisum talk about sites not working because of invalid HTML. I had a quick peek at some of the listed sites and noticed that they use awfully outdated browser-sniffing techniques. Fix that, and many of the issues would be solved.
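To illustrate the kind of sniffing I mean: a minimal sketch of the classic pattern where a site checks the user-agent string for "MSIE" and rejects everything else, versus detecting the capability it actually needs. The function names and UA strings here are illustrative, not taken from the report or from any of the listed sites.

```javascript
// Hypothetical example of the outdated sniffing many sites still ship:
// look for an exact product token and lock out every other browser.
function oldSniff(userAgent) {
  // Firefox's UA string contains no "MSIE" token, so a perfectly
  // capable browser gets treated as unsupported.
  if (userAgent.indexOf("MSIE") !== -1) return "supported";
  return "unsupported";
}

// The fix: test for the feature the page actually uses, not the
// browser's name. document.getElementById is the classic example.
function featureDetect(doc) {
  return typeof doc.getElementById === "function";
}

// IE passes the sniff; Firefox fails it despite supporting the DOM:
oldSniff("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)");
oldSniff("Mozilla/5.0 (Windows NT 5.1; rv:1.7.8) Gecko/20050511 Firefox/1.0.4");
```

Feature detection keeps working when a new browser (or a new version) shows up, which is exactly what UA sniffing fails at.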