Forum Moderators: phranque
More than 10,000 websites have become unwitting hosts of malicious software, say security experts.
Those visiting the hijacked pages risk having keylogging software installed on their PC if it is not protected with the latest patches.
The webpages compromised are all legitimate sites devoted to subjects such as tax, jobs, tourism and cars. The sites are thought to have been booby-trapped using a malware kit, called MPack, sold commercially online.
10,000 Sites Are Unwitting Hosts of Malicious Software [news.bbc.co.uk]
10,000 hacked sites sounds like a very, very low figure to me - doesn't it?
It's all relative. Don't forget the leveraging aspect: 10,000 sites x ___ visitors per day. At 100 visitors per day per site that's 1,000,000 exposures. At 1,000/day/site that's 10,000,000 exposures/day. More than likely the hackers can find a way to further exploit the machines they infect.
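The multiplication is easy to sanity-check. A quick sketch (the per-site visitor counts are illustrative assumptions, not measured figures):

```python
# Back-of-envelope exposure estimate: compromised sites x daily visitors.
compromised_sites = 10_000

for visitors_per_day in (100, 1_000):
    exposures = compromised_sites * visitors_per_day
    print(f"{visitors_per_day:>5} visitors/day/site -> {exposures:,} exposures/day")
```

Even at modest traffic per site, the aggregate exposure runs into the millions per day, which is the leveraging point above.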
One in 10 web pages scrutinised by search giant Google contained malicious code that could infect a user's PC.
Researchers from the firm surveyed billions of sites, subjecting 4.5 million pages to "in-depth analysis".
About 450,000 were capable of launching so-called "drive-by downloads", sites that install malicious code, such as spyware, without a user's knowledge.
The report seems vague to me. I did not see any information that helps webmasters check their sites.
No solutions, hints, or what to look for - just be scared unless you own updated anti-virus software...
The report is not vague. It is just public news.
For the security of your webservers, you shouldn't depend solely on BBC-news. There is other IT security related information worth checking, too.
So the solution/hint is to check your own page sources for any IFRAME someone planted in them that shouldn't be there ...
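That check is easy to automate: grep your published pages for IFRAMEs you never put there. A minimal sketch in Python - the suspicious patterns (zero-sized or hidden iframes, the kind MPack-style injections tended to use) and the `public_html` document root are assumptions, so adapt both to your own site:

```python
import os
import re

# Illustrative signatures for injected iframes, not an exhaustive list:
# iframes sized to be invisible, or hidden via inline CSS.
SUSPICIOUS = [
    re.compile(r'<iframe[^>]*(width|height)\s*=\s*["\']?0', re.IGNORECASE),
    re.compile(r'<iframe[^>]*style\s*=\s*["\'][^"\']*display\s*:\s*none',
                re.IGNORECASE),
]

def scan_file(path):
    """Return suspicious iframe snippets found in one HTML/PHP file."""
    with open(path, encoding="utf-8", errors="replace") as fh:
        html = fh.read()
    return [m.group(0) for pat in SUSPICIOUS for m in pat.finditer(html)]

def scan_tree(root):
    """Walk a document root and map file paths to their suspicious matches."""
    hits = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.lower().endswith((".html", ".htm", ".php")):
                path = os.path.join(dirpath, name)
                matches = scan_file(path)
                if matches:
                    hits[path] = matches
    return hits

if __name__ == "__main__":
    for path, matches in scan_tree("public_html").items():  # assumed docroot
        print(path)
        for snippet in matches:
            print("  ", snippet)
```

A clean report doesn't prove the site is safe (injections can be obfuscated in script, not plain iframes), but it catches the straightforward case the thread is describing, and diffing your live pages against your local copies catches the rest.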
Google's figures in that report are VERY deceptive. Many news outlets interpreted this to mean one in ten of all sites, but this is NOT what Google's report says.
If you look at the small print, it's only one in ten of the sites that Google had already flagged as suspicious, so they aren't representative of the web as a whole. Google deliberately sought out sites that were already thought to be compromised, so one in ten was actually a surprisingly low figure.
To use a non-computing analogy: If you hang around a court and one in ten drink drive cases results in a conviction, that doesn't mean that one in ten people is a drunk driver, it just means one in ten people already suspected of being a drunk driver have been convicted. The actual figure for the entire population would be much lower.
Can you be less vague and more specific?
... more specific? Didn't you see the attached 3 *specific* links with further *specific* information? What else do you need?
And today, SANS came out with an authorized reprint of an analysis authored by iDefense.
Sorry, I can't be more specific than these specific links I provided.
The links you provided are just that and nothing else. They may or may not be related...
If you go back and read the original post and follow the link to the press release there are no such links. There are some semi-related and related links on the right side of the page.
The press release - editorial or whatever it is - is inconclusive, vague, and doesn't give any specifics.
I do not doubt the possibility that the editorial is correct; however, without specifics, evidence, case studies and other supporting information, I have to take the article/editorial at face value.
Again, a general editorial on a security risk. I submit to you: so, what's new? Give me solutions, not hype or type fodder!