|Is Linux more secure than Windows?|
Let's move the conversation here.
| 9:10 am on Mar 22, 2014 (gmt 0)|
In several recent threads the issue of whether Linux is more secure than Windows has come up.
My view is that desktop Linux is definitely safer than Windows (or MacOS). Whether it is because of better architecture or because it is less often targeted, the end result is that a Linux desktop is MUCH less likely to be compromised than a Windows one.
Servers are harder to judge. Windows advocates argue its security has improved a great deal - but so has Linux's. Are they now similar in terms of design? I have seen no stats on exploits (and the stats I have seen in the past have been misleading).
I can think of a few advantages Linux has that apply to both desktops and servers:
1) The way Linux software is installed and updated means you often have a single copy of critical libraries that are dynamically linked by applications. This means not every application that uses a library needs an update. A good example of this is the recent GNUTLS flaw: updates were available from all the major distros very quickly, and once installed every app that used GNUTLS used the updated version (except for the few that statically linked it).
2) It is not a monoculture. There are multiple distros with different libraries and versions installed by default.
3) Although not all open source benefits from many eyes, widely used software is examined, and flaws can be found by anyone who carries out a security audit. The same goes for overall code quality - GNUTLS, for example, had lots of critics before the flaws were discovered because it looked badly written.
4) Open development processes and licences make it harder for vendors to conceal bugs they found and fixed themselves. They also make it easier to evaluate the track record of software.
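Point 1 can be seen directly on any Linux box: assuming a glibc-based system where `ldd` is available, listing a binary's dynamic dependencies shows exactly which shared libraries a single package update would fix for it.

```shell
# List the shared libraries /bin/sh is dynamically linked against.
# Patching any one of these (libc, for example) fixes every program
# that links it, with no per-application update needed.
ldd /bin/sh
```

A statically linked binary, by contrast, shows no such dependencies and has to be rebuilt to pick up a library fix - the exception noted above for GNUTLS.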
| 1:31 pm on Mar 22, 2014 (gmt 0)|
Linux was designed after Unix. MacOSX IS Unix.
|My view is that desktop Linux is definitely safer than Windows (or MacOS). |
Being less targeted does not make security better. Access is still access.
No. Windows and Linux do not share the same or similar architecture, whether you talk about desktop or server.
While your four points are valid, many people forget the very methodology by which executable software gets installed and executed on a *nix/BSD system, which is unlike Windows and is one of the first-line security measures.
Another thing people don't know is there has been no successful virus on a Linux system since 2002, and that was squashed the same day.
Any "infections" of *nix/BSD systems you read about are caused by user/admin error allowing installation of rogue programs or access and not the drive by installation of malware pervasive in Windows systems.
|wa desert rat|
| 6:45 pm on Mar 22, 2014 (gmt 0)|
My issue with Windows stems from spending far too much time charging customers for sitting on my butt and watching some utility scan Windows computers for malware. Frankly, it's embarrassing.
My wife and I have an old Windows XP machine that we use for income tax preparation. That machine sits in her sewing room and is only turned on for taxes (and only exists because the tax software requires Windows). Interestingly enough, the tax software we use would work just fine on every version of Windows from XP to 8.1 with no alterations.
Last year I downloaded a new copy of Malwarebytes and installed it and, inevitably, ran a scan. Even though the machine had not been turned on for almost a year, the scan found a few bits and pieces. Then I ran a spybot scanner and it found a few issues. We did our taxes, turned off the machine and went to bed.
The next day a client called and I went out to their site to install and configure a new machine for an existing employee. The machine had Windows 7 on it (almost no businesses use Windows 8 or 8.1, and even though Dell and HP claimed to no longer offer V7 they will when a customer demands it or threatens to go elsewhere). I spent the usual few hours adding the computer to the domain, moving data, re-installing applications, etc.
Then I installed an anti-virus utility and Malwarebytes. Interestingly enough, it was exactly the same Malwarebytes as I had installed the day before on our XP machine. And the spyware app was also the same. They installed the same and worked the same - and found the same types of malware - across a decade and a half of every iteration of Windows.
You cannot do that with MacOS. Nor can you do that with Linux. You might be able to create virtual machines that would accommodate all these iterations but that would require a lot of fiddling.
It didn't even make sense because, as we all know, MS (and MS's partners) make a great deal of money from requiring a new version for every iteration of the OS. Not to mention the money MS makes for the new credentials needed by the techs and the new training so users can learn all the little cosmetic changes. So why would scanning utilities, arguably more complex than simple "programs", work virtually unchanged across such a wide spectrum of OSes?
Then I remembered that MS has a free - but not bundled - malware application of its own. Users have to learn about it and download it and install it separately. Why wouldn't MS just bundle it?
More importantly, why wouldn't MS - presumably with the source code to all of these operating systems - simply FIX the flaws that give all this malware free rein inside their systems instead of shipping yet another add-on?
The answer is simple. I don't think they can without disrupting the business model they've built around their operating systems. I think that a basic decision made in 1994 by Bill Gates when MS was moving from Windows 3 to Windows 95 precludes a simple fix and that decision was perpetuated when they moved from their old DOS-based systems into the NT era in the late 1990s.
Gates did not see the Internet as a big deal. In fact, he was not hesitant to label it as a fad. It was for this reason that the early versions of Windows 95 (and all versions of Windows 3) had no tcp/ip stack. If you ran Windows 3 you had to find your own tcp/ip stack and install it if you wanted Internet access.
Gates thought his market was small businesses (big business used big computers) and home users. People using spreadsheets and word processors who mostly did not need to share files; and if they did, they'd use SMB (Windows for Workgroups) or Novell. The other big market Gates envisioned was video games.
But games needed fast graphics and the only way to get that was to tie the graphics more closely to the core of the OS; what was known then as "ring zero" and now as "kernel mode". Here is a brief essay on the idea of security "rings" (along with an even better commentary at the bottom): https://www.osronline.com/article.cfm?article=224
So instead of running in ring 3 (or "user mode"), the graphics for Windows 95 were tied to kernel mode, or ring zero. This is a basic difference between *nix and Windows operating systems, and this design received a great deal of flak back in the day.
If a hacker can design a program that will bridge the gap between user mode and kernel mode then it will have complete control over that system. And putting graphics in kernel mode facilitates that and explains why we had Windows exploits that loaded and executed by having the victim simply mouse over a pixel on a malicious web site.
Enter NT. In Windows, all the kernels from XP onwards are known as "NT kernels"; based on "New Technology" and designed in the beginning to operate with RISC (Reduced Instruction Set Computer) CPUs. But because of a need to preserve some backwards compatibility, MS did not execute the change between the older systems and NT perfectly, and there are many ways to bridge the gap between user mode and kernel mode in NT.
Here is an excellent essay on just exactly how to do it and why it works on all iterations of Windows from XP through to at least version 7: [blackhat.com...]
And here is an example of how to do it... with source code: [exploit-db.com...]
You will notice that last post claims that it will work for all versions of the NT kernel from XP through to version 8.
And here is an explanation of how it all gets handled in Linux: [duartes.org...]
With all of this data available - and presumably available to Microsoft - one would be led inescapably to wondering why MS has not actually fixed the problem instead of pasting fixes onto the OS as time passed.
I don't know the answer to that other than to speculate that it might have been because of costs, reluctance to force all of its partners to change their products, or some antipathy to losing face. All I know is that Windows is insecure by design.
Apple's original OS for the Mac was also insecure but Steve Jobs was smart enough to bite the bullet and move to OS X before the situation became critical. And his experience with NeXT led him to the Mach kernel which is, basically, Unix.
So if you have read this far you should understand why all of those "conventional wisdom" remarks are misleading. MS operating systems are not targeted just because they occupy so many desktops... they are targeted because they are easy to crack AND there are lots of them. Linux is not secure because there are fewer of them (really, there are millions of Linux boxes on the Internet and many with static IP addresses) but because the design of *nix makes it much more difficult to crack.
| 9:52 pm on Mar 22, 2014 (gmt 0)|
@graeme_p - gnutls - not everyone updated linux for that or many previous anti-exploit updates. It depends on the user and the environment. If you get someone running a server farm on a shoe-string they are possibly not going to take the trouble to update their machines every few days - or even every few years. Same applies to windows. Auto-updates? Fine, if they are enabled, but are they? And what happens if the server fails to reboot afterwards? Easier to not update.
An additional problem with windows, both desktop and server, is that a lot of copies are illegal rip-offs which MS does not patch anyway. Hence the high number of compromised XP machines in the "poorer" countries such as China and the US. :)
And then there is the type of person I once met in a shop whilst obtaining a mobile-broadband USB device. While waiting to be served he asked if I knew anything about viruses on laptops. Seems he'd bought one "in a pub" and run it the previous night only to have it use up all his mobile bandwidth. He'd taken it to a shop where they told him it had hundreds (literally) of viruses. Which, of course, used up all his bandwidth. Get it de-loused? No. Can't afford it. Not concerned with it being used as part of a botnet. I'm sure he's not alone. And then there are the people who do not even know about such things.
These are typical problems with windows that do not seem to be relevant to linux desktops, but a lot of linux servers are infected, usually through poorly designed PHP web pages which are allowed to have full access to databases and merrily shove exploitable code into them. Not just PHP: there are a lot of exploits for CMS systems, as well. And for control panels. This can also be the case on windows servers but is less prevalent as many windows-based web sites do not have databases behind them. They have to be compromised a different way.
@drhowarddrfine - you are citing a kernel-based virus. As noted above, on servers there are other ways of infecting a LOT of linux servers: bear in mind there are far more linux than windows servers and many are running MySQL and PHP to serve pages. Windows can also do this, of course. Possibly this method could be blamed on lax administrators but most of it can be blamed on site designers who often have only a rudimentary knowledge of designing dynamic web sites.
@wa desert rat - Much of that is a direct reprint from another thread, which I have answered in part. And I reiterate: I would not trust any anti-malware to get it right. I've seen a lot of "this is bad" reports from them where the item in contention was actually benign and sometimes just a leftover from a poorly-written software installation (MSIE User-Agent strings, anyone?). I eventually stopped running them.
Windows NT was NOT an upgrade of previous windows software. It was designed from the ground up by a different software group. It was probably based on the Microsoft compilers (can't recall for certain, now) but from personal experience of their C compiler I can state that at the time MS compilers were top of the range. The only fault I ever found in their C compiler was a buffer-overflow problem, which I covered with software of my own. Odd that many MS exploits were due to buffer-overflows; I often wonder if the compiler was the reason.
I accept there were several flaws in MS operating systems - and still are, though far fewer than even five years ago. I also agree that linux servers are far safer - but not entirely safe: old kernels have bugs that are now being exploited (agreed, an administrator failure but it's still the software).
If you are considering MSIE as a dangerous beast then also consider Google Chrome, which is patched every few days for sometimes horrendous faults. Safari and Firefox too, of course, though I consider the latter to be the safest.
On the desktop/laptop/whatever scene, if you take such a device, be it windows, linux, mac, phone, whatever through customs at an air or sea port it's only a minute's work to compromise the OS and/or the BIOS. Same if it falls into the hands of an unscrupulous repairer. This is a known hazard of travelling and although not necessarily always carried out there is always the possibility. And if you have a BIOS back-doored then you may as well put the machine in a crusher.
I have no choice than to run MS web servers (and, until now, mail servers) for my clients' use. I have not had an intrusion on my online servers for years - last and only, as I said elsewhere, was several years ago with access via (I believe) a stolen FTP account for a non-MS FTP server which had an exploit hole. Not an MS app!
I am currently upgrading to a Windows 2012 server. Frustrating (couldn't even find the Start menu or the Restart option without an online trawl!) but it "feels" safer than the 2003 it is to supersede.
I am NOT saying windows servers are as sturdy as linux variants. They are not. But they are still pretty solid if they are updated regularly and security precautions are taken.
Desktop/laptop: I run linux Mint (previously Ubuntu) with Windows 2000 Pro and Home for necessary windows apps. Once again, I have never had a virus on either linux (no surprise) or windows, which I ran online since V3.1 through various implementations until about 5 years ago (and before that I ran MS-DOS, from V1 up, offline).
|wa desert rat|
| 10:52 pm on Mar 22, 2014 (gmt 0)|
|If you are considering MSIE as a dangerous beast then also consider Google Chrome, which is patched every few days for sometimes horrendous faults. Safari and Firefox too, of course, though I consider the latter to be the safest. |
Chrome is nowhere near as insecure as MSIE. As I said, MSIE and the NT kernels are so insecure that in Server 2008 and Server 2012 the ability to use that browser to actually look at web content has to be explicitly allowed on a site-by-site basis.
This is at least partly due to the fact that MS created MSIE with hooks directly into the registry. Something that other browsers cannot do.
NT was created as a better version of Windows but because they had to cater to their developers they left a lot undone. Including the ability to cross the line between user space and kernel space.
The inherent (and I do not use that term lightly) insecure nature of Windows OSes from XP through the latest version is responsible for the botnets we see today which do everything from denial of service to click-fraud.
One of my partners in a network engineering firm (which lasted a decade) went to work as the IT director of a local hospital where his major job was to deny hospital workers access to the Internet at large; or at least to most of it. His second major job was to clean the inevitable exploits on the Windows computers. He hired me, since I was the only actual engineer in our company, to run their routing and WAN services.
The costs to business for this are immense; and not commensurate with the benefits in my opinion. When IT's job is to deny users access to the Internet because of the high likelihood of a malware introduction to the system and not just because employees should be working not web surfing, then something is seriously out of whack.
If you have an Exchange server with port 25 open to the world then you are a far braver man than I am. I always have an SMTP server upstream to feed email to the Exchange box and the router set to only accept and forward port 25 connections from that IP address. At the very least I would use spam services.
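The router rule described above can also be sketched as a host firewall ruleset (a sketch only, needing root; 203.0.113.10 is a placeholder for the hypothetical upstream relay's address):

```shell
# Accept inbound SMTP only from the upstream relay host...
iptables -A INPUT -p tcp --dport 25 -s 203.0.113.10 -j ACCEPT
# ...and drop direct SMTP connections from everywhere else,
# so the Exchange box only ever talks to the filtered relay.
iptables -A INPUT -p tcp --dport 25 -j DROP
```

The ACCEPT rule must come first: iptables evaluates rules in order, so the relay matches before the catch-all DROP.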
Are there exploited l.a.m.p. servers? Of course. But nowhere near as many as apologists for MS products would have us believe. Most of the issues are no longer issues simply because of the advent of hosting companies with competent staff to keep their servers up-to-date.
There has never been a linux desktop user who has been infected with a virus because he simply opened an email or moused over a malicious pixel. Cryptolocker, for instance, is no threat to a Linux user but it, or something worse, could turn out to be a game changer.
There is bound to be a day of reckoning for MS over this. They cannot possibly continue to turn out one product after another with the same vulnerabilities patched only by a popup that asks, "Are you sure you want to do this?". Someone has to seriously begin to talk about this and if those of us involved in the industry don't do it then it will only get worse.
|wa desert rat|
| 10:58 pm on Mar 22, 2014 (gmt 0)|
I would like to add that many malicious web servers running Linux are not hacked boxes but, rather, explicitly created to serve malicious payloads to Windows computers that connect to them. Hosting companies are constantly on the lookout for these and when they find one and knock it out another "company" shows up looking for server space and soon there is another one. Or a spammer. Bluehost admins have mentioned this to me a number of times.
I think it's time to stop apologizing for the issues Windows presents and start demanding real fixes. And that's why I'm posting these. I admit that they must be annoying, but we have to start somewhere.
| 2:17 am on Mar 23, 2014 (gmt 0)|
I think the better question is which is more secure for novices. They can both be locked down by somebody who knows what they are doing. Right now there is a rootkit on a bunch of linux boxes. It is all about keeping patches up to date and knowing how to set things up.
|wa desert rat|
| 4:29 am on Mar 23, 2014 (gmt 0)|
|I think the better question is which is more secure for novices. They can both be locked down by somebody who knows what they are doing. Right now there is a rootkit on a bunch of linux boxes. It is all about keeping patches up to date and knowing how to set things up. |
How many naive users set up servers of any stripe? It requires setting static IP addresses, DNS pointers, firewall rules, etc. Not to mention the trauma of actually installing the OS. Sure, you can buy a server with Linux on it already but it's pretty clear that they don't want to sell it to you very badly. Isn't it more likely that a novice user would have purchased a Dell server with Windows 2008 or 2012 on it? And if you've never set one of THOSE up, you're in for a real treat in terms of complexity what with policies upon policies, etc. Heck, a Linux server is a piece of cake in comparison.
I think that today a novice wanting a server is more likely to get a VM at a hosting service than trying to set one up himself. After all, no one just runs a server to have a server; a server exists to serve something. And many ISPs block all the ports that let you do that on a server at home, anyway.
So we are really talking desktops or laptops here. Maybe at home where he/she is behind a Netgear router/firewall and on a DHCP IP address to boot. In this particular case the Linux OS is far superior to a Windows OS even if the Linux box is unpatched. A Linux user - even as root - can't execute an email attachment simply by opening it; Linux has no mechanism for doing that. Linux executes programs based on an executable bit not on the basis of a suffix like .exe. You have to make it executable on purpose and then execute it.
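The execute-bit point is easy to demonstrate with a throwaway script (a minimal sketch; the path and filename are arbitrary):

```shell
# A freshly written file is not executable, whatever its contents.
rm -f /tmp/demo.sh
printf '#!/bin/sh\necho hello\n' > /tmp/demo.sh
/tmp/demo.sh 2>/dev/null || echo "refused: no execute bit"

# Only after the owner deliberately sets the execute bit will it run.
chmod +x /tmp/demo.sh
/tmp/demo.sh
```

So a mail client saving an attachment to disk produces an inert file; a deliberate `chmod` is needed before it can run at all.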
A Windows user, on the other hand, can execute an attachment. And a moment's inattention (he is a novice, after all) will install that exploit right into his machine, patched and updated or not.
Rootkits need to be aimed at a Linux box, and since the desktop is behind a Netgear router with no port forwarding set up (unless the user specifically goes in and enables it), no port scan will reveal that there is an unpatched Linux box behind that little router.
And even if a naive user does set up SMTP or HTTP on his/her own home desktop, most ISPs block those ports anyway (as I indicated above). So the Linux user can only get rooted by purposely installing a piece of exploited software (by going out and looking for it). While the Windows user can get rooted by opening the wrong attachment.
Which explains why there are so many more botnets of Windows desktops and laptops and why most malicious Linux web servers were configured that way by their admins.
| 7:16 am on Mar 23, 2014 (gmt 0)|
|not everyone updated linux for that or many previous anti-exploit updates. |
Desktops: I am sure almost everyone has. Most non-geek distros have a clear warning you need updates.
Servers: If you cannot either login occasionally and update OR install and enable unattended upgrades (which should take you a few minutes per server) you should not be running an internet facing server.
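On Debian-family systems that setup really is a few minutes' work: install the `unattended-upgrades` package, then make sure these two settings are present (a sketch of the stock configuration; the file path is the Debian/Ubuntu default):

```shell
# /etc/apt/apt.conf.d/20auto-upgrades
# "1" enables a daily package-list refresh and a daily run of
# unattended-upgrade, which installs pending security updates.
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

Running `dpkg-reconfigure -plow unattended-upgrades` will write this file for you.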
AFAIK GNUTLS is mostly used by desktop software - it had a bad reputation for quality, so most server software developers used OpenSSL instead.
|If you get someone running a server farm on a shoe-string |
Then they should set up unattended upgrades right at the start. In any case, a server FARM on a shoe-string?
|But they are still pretty solid if they are updated regularly and security precautions are taken. |
Agreed. The question is which is more secure under the same circumstances. This may vary for different scenarios.
1) Home and small office desktop: Easy win for Linux because of the ease of software install and updates.
2) Large office deployments: assuming a competent IT department (not sure that is a good assumption...), no idea. Windows has a reputation for having better tools to manage really large numbers of desktops, but I have no experience of it.
3) Small (low cost and effort) servers. Probably Linux for the ease and speed of updates, multiple suitable distros reducing monoculture. As an aside, the cost advantage of Linux is significant here.
4) Larger servers. No idea how good Windows is so cannot compare. My main argument against Windows here is economic, not security (the hidden cost of vendor lock-in).
It looks like I have produced at least half an argument in favour of what dstiles does (Linux desktop and Windows server). The other half is simply that the value of his existing codebase outweighs any reasonable advantages Linux may have in any case (right?).
I also realise that the role in which I consider Linux relatively best is exactly where it has the least market penetration.
| 10:38 am on Mar 23, 2014 (gmt 0)|
|How many naive users set up servers of any stripe? It requires setting static IP addresses, DNS pointers, firewall rules, etc. Not to mention the trauma of actually installing the OS. Sure, you can buy a server with Linux on it already but it's pretty clear that they don't want to sell it to you very badly |
Lots of hosting companies will rent you a server with Linux (or Windows, or BSD or whatever you want) pre-installed.
VPSs are even easier (and bottom-end ones are very cheap these days). You can install an image from a nice web-based control panel, and these often include Apache. Even if it is not included, it is very easy to install something like the LAMP stack. For example, on Ubuntu:
sudo apt-get install tasksel
sudo tasksel install lamp-server
On Debian or Red Hat you have to type another three or four lines.
After that it's not much more difficult than using shared hosting - and there are lots of step-by-step guides.
Hardly a high level of competence required.
Maybe the real security issue is that running a server is now too easy?
| 3:23 pm on Mar 23, 2014 (gmt 0)|
|there are far more linux than windows servers and many are running MySQL and PHP to serve pages. |
And infections coming from those and their settings are their problem, not Linux.
|wa desert rat|
| 4:03 pm on Mar 23, 2014 (gmt 0)|
The real security issue is that Windows is easily hacked. The irony is that we even talk about Linux security when there are over 30 MILLION Windows PCs in botnets out there.
What if there was no such thing as Linux or Unix or MacOS?
THEN would you be concerned enough about Windows' lack of security? What does it take? It doesn't matter if there are 20,000 Linux servers in a botnet (and if you don't think MS is paying people to inflate those numbers then I have some ocean-front land in Arizona to sell to you). A botnet in 2010 had 30 MILLION PCs and every single one of them was a Windows PC. And that was just ONE botnet.
Read the note at the bottom of that page in which it says that all of those botnets were infecting Windows operating systems.
|wa desert rat|
| 4:18 pm on Mar 23, 2014 (gmt 0)|
|It looks like I have produced at-least half an argument in favour of what dstiles does (Linux desktop and Windows server). The other half is simply that the value of his existing codebase outweighs any reasonable advantages of Linux may have in any case (right?). |
Good grief. Let me give you a real-world example of how insecure a well set-up Windows server is...
One of our clients called us to say that their Server 2003 computer was "locked up". So I sent one of my partners out there. We discovered that one of their janitors had discovered the admin password (written on a sticky-note) and was surfing #*$! at night on the server.
The machine was completely unusable. I managed to get the files off using Knoppix and then we put new (bigger) hard drives into the box, reinstalled the OS, reconfigured the system, copied back their files, and took the machine back and installed it. The cost to them was over $2,000!
Now a couple of you would argue that leaving the admin password out there was a mistake. And it was. You'd be surprised how often that sort of mistake happens.
But they did not have a static IP. And did not have any ports open to the Internet. Just surfing malicious sites with Internet Explorer took their server down.
And that Server 2003 kernel is an NT kernel; almost identical to the one in current servers. I refer you back to a couple of links I have already posted, with source code, that demonstrate exploits which allow an attacker to cross the border between user space and kernel space on every iteration of Windows from XP to at least 7.
How did MS "fix" the new Server versions? Well the biggest fix was to make it almost impossible to surf any web site with Internet Explorer (except the MS site) without explicitly changing the settings and allowing that site.
But you can install Chrome pretty easily and then go anywhere you like. A janitor with the admin password can completely destroy a Windows server just by browsing.
In my book that is the very definition of insecure.
| 9:48 pm on Mar 23, 2014 (gmt 0)|
> Most non-geek distros have a clear warning you need updates.
But they are sometimes ignored. On Mint (at least) it's a small blue exclamation mark in the taskbar which is easily overlooked for a few days. I've seen one (Mint) installation where the update shield was missing from the taskbar entirely. It happens.
I agree about online server updates of any type.
Shoestring - I know it costs to set up a server farm. It costs less if the farm is not properly/regularly maintained, which means more customers at a cheaper rate and still a nice cream-off from the top.
Codebase - yes, I have several Mbytes of library code used by a couple of dozen sites (used to be more: I'm slowly closing them). To convert ASP Classic to PHP (or anything else) is prohibitive. It would be cheaper and quicker to rebuild every site from scratch, building a new library as I go along. Quicker - not entirely sure about that as I'm out of practice on PHP. On the upside, all the data and page structures are in MySQL apart from a handful of text files on older sites. I started building web sites on a linux server but was persuaded that windows was a better bet - persuaded by an MS "agent". It was early days. I was internet-naive. :)
Agreed. As I said, it's mostly bad site design, but it would be nice if such problems were coded out of (eg) apache.
@wa desert rat
I do not concur with a lot of your arguments but I'll let you have the last word.
|wa desert rat|
| 4:07 am on Mar 24, 2014 (gmt 0)|
|I do not concur with a lot of your arguments but I'll let you have the last word. |
Then refute it with data not speculative theories about new users or small server farms and what might happen if they did this or that (or didn't).
30 million Windows PCs in one botnet and someone wants to know which operating system is less secure for new users. And posits that it might be Linux because of 20,000 rootkits. Really?
I have posted links which explain how all NT-based kernels are vulnerable. I have demonstrated that MS has done little other than pasting on "are you sure you want to do that" warnings (to which a user merely has to click "yes") and crippling Internet Explorer as "fixes".
I've posted descriptions of botnets as discovered by the computer scientists at UC Berkeley and how each and every one of the 30 million were on Windows PCs. None were Linux-based.
I can explain, if you like, how the "server" versions of Windows demonstrate only a minor difference from the "user" versions; other than the addition of routines to require "seats" and not to allow certain services. Indeed, Linux and Unix are very similar in this regard. It is not difficult to turn a desktop Linux box into a server; or a router, for that matter.
The argument is not that Linux is more secure than Windows (even you guys have stipulated that). The argument is that windows operating systems are so insecure that even the company that created them has been able to do little more than make yet another add-on that looks for malware's fingerprints and then blocks it or removes it.
Windows is so insecure that an entire billion-dollar industry has been created - ex-Microsoft (for the most part) - around trying to make it secure enough to use.
Yet we still get botnets, email attacks, viruses and spyware. All aimed at Microsoft. All people do is say, "well of course... because Windows is on so many PCs it is bound to get attacked more often".
Why is it no one wonders why so many of those attacks succeed?
And when someone does - like me - everyone gets into protective mode around Windows. It's not like everyone who has Windows on a PC doesn't run spyware and malware scans or have anti-virus.
Even the people who write thousands of posts on Windows forums about how to add security to their PCs will rise up in righteous anger when I contend that the very existence of all that demonstrates that Windows is too insecure to trust.
MS can almost certainly make an OS as secure as *nix. But instead they get everyone to make excuses for the insecurity. And to point at Linux and say, "20,000 rootkits out there for Linux". And for 14 years we get the same exploits. And click-bots that cost advertisers money, and email attachments that extort money from users - and not chump change, either: CryptoLocker, which can only affect Windows operating systems (including servers), has started asking for $500 before they remove it.
Even if every single Linux computer were hacked, how would that make Windows secure?
| 5:00 am on Mar 24, 2014 (gmt 0)|
|A janitor with the admin password can completely destroy a Windows server just by browsing. In my book that is the very definition of insecure. |
A janitor having an admin password is my definition of insecure.
|Agreed. As I said, it's mostly bad site design, but it would be nice if such problems were coded out of (eg) apache. |
I do not think much more than is already available is possible at the server or language level. Using better platforms at a higher level (good frameworks) does a lot to improve security.
|To convert ASP Classic to PHP (or anything else) |
If you ever do it, please not PHP. There are alternatives that are a lot more productive and nicer to work with.
|wa desert rat|
| 6:05 am on Mar 24, 2014 (gmt 0)|
|A janitor having and admin password is my definition of insecure. |
Not even close to the first time I've found the admin password in less-than-secret places. With the door open. And the room marked "Server Room". Right off the main entryway.
With a Linux server the janitor can surf nasty sites all day long, download attachments, click on ads... no issues at all.
Just an example of how a well set-up, properly updated and configured windows server can be brought down. Not by an attack, but by simply browsing.
| 9:27 pm on Mar 24, 2014 (gmt 0)|
> please not PHP
I actually began web site programming using perl but that is rather complex. I've looked at alternatives and PHP seems to be more cross-platform than some others. Still, it ain't gonna happen. :)
| 11:00 am on Mar 25, 2014 (gmt 0)|
Perl um.... not for me these days - although some people still love it.
There are LOTS of alternatives. I mentioned some here: [webmasterworld.com ] but it really needs a whole thread.
|Still, it ain't gonna happen |
Yes, I guessed. I would not do a complete rewrite in your shoes either.
Funnily enough I was discussing converting a relatively simple site from Classic ASP to Django with a potential client. They eventually decided against doing anything that drastic (at least for the moment). Lost income for me, but probably the right call for them.
|brotherhood of LAN|
| 2:06 pm on Mar 25, 2014 (gmt 0)|
(ANSI) C all the way ;o)