coppro wrote:
Aaah, but if you are targeting individual workstations, what do you do?
Just pointing out a wee little flaw in your argument there.
Of course, there is a big difference between tech-savvy server wizards and the great unwashed masses of the desktop. Not just because many desktop users are less knowledgeable (and in a perfect world, they shouldn't have to be tech wizards to be able to surf safely (and freely)), but because many of the security measures you use on a server would make things less convenient in day-to-day desktop use.
Like, say, user management and privileges. Servers (among other things) typically run their services under unprivileged accounts, so any exploit has the least effect possible. Now, user privileges are a common thing on Linux (and *BSD, etc.), even on the desktop. To be fair, Windows has that, too (well, kinda), but most Windows desktop users still use the administrator account for normal use, or a similarly privileged account. It's more convenient, after all.
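(If you want to see what that looks like in practice, here's a minimal Python sketch of the usual pattern for a Unix daemon: start as root, do the privileged bits, then drop to an unprivileged account before touching anything untrusted. The account name "nobody" is just an illustrative assumption; real daemons usually get their own dedicated user.)

```python
import os
import pwd

def drop_privileges(username="nobody"):
    """Give up root and keep running as an unprivileged user.

    Order matters: supplementary groups and the gid have to go
    before the uid, otherwise we no longer have permission to
    change them at all.
    """
    if os.geteuid() != 0:
        return  # already unprivileged, nothing to do

    pw = pwd.getpwnam(username)
    os.setgroups([])        # drop supplementary groups
    os.setgid(pw.pw_gid)
    os.setuid(pw.pw_uid)
    os.umask(0o077)         # be conservative about files we create

# A daemon would typically bind its privileged port (say, 80) first,
# then call drop_privileges() before handling any untrusted input.
```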
Anyway, it's not hard to make a virus that would work on a Linux system. From the OS's point of view, there's nothing wrong with an application that harvests executables and modifies them so that they execute malicious code when run -- the program could have a legitimate reason to do what it does, after all. Who's the OS to tell?
The problem for the virus is getting run in the first place (uncritical execution of randomly downloaded applications by the user helps there), and then getting access to those other executables to infect. On a system with differently privileged users, a normal user doesn't have write access to any system-wide files, so the virus simply can't do anything outside its little current-user playground. It can run just fine, maybe infect some local executables owned by that user, delete some files here and there, maybe mail your mount point list to Korea and generally do a bit of damage -- but the damage it can do is limited to the user who got the virus, and it won't spread farther than that. It can't format your hard drive, or even cause the OS to stop working. Unless, of course, the virus takes advantage of a bug/exploit -- but those get fixed rather quickly in the Open Source world. Literally anyone can fix it once the exploit is known, after all ... well, provided they know how to code, but hey.
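(A rough illustration of that "playground" boundary, with purely hypothetical target paths: ordinary file permissions are the only thing a user-level process has to respect, and they're enough to keep it out of the system-wide binaries.)

```python
import os

# Hypothetical targets a naive file infector might try to touch.
targets = [
    "/usr/bin/ls",                         # system-wide, owned by root
    os.path.expanduser("~/bin/myscript"),  # owned by the current user
]

for path in targets:
    if os.access(path, os.W_OK):
        print(f"{path}: writable -- a process running as this user could modify it")
    else:
        print(f"{path}: not writable -- blocked by plain old file permissions")
```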
(The openness of the source code isn't just positive, though -- it can be both a pro and a con. Pro, because potentially everyone can know how things work and find and fix exploits, sometimes even before the exploit is, well, exploited; but also con, because potentially anyone can find exploits in the code and not tell anyone, and with commented code at their disposal instead of just some disassembled, symbol-less machine code. Since an exploit has to be known before it can be fixed, every little bug can potentially bite you if it's not fixed as soon as it's discovered. Security through obscurity is hard to pull off in an open source application, because everything is open, after all. Of course, if security through obscurity is your only security, you've got a problem anyway.)
Well, I got a little sidetracked there, but what's my point, anyway? To be honest, I'm not completely sure -- I just started writing and came up with this, and I think I might just have rehashed, less coherently, what others have already said. Go me. In any case, you're partially right. Viruses can exist on Linux, they can do damage and even spread over the network (sending stuff over the net isn't a privileged operation) unless you've got a firewall. The damage is usually limited, but that doesn't mean it can't be a disaster for the users it affects. However, unless you got infected by a virus exploiting some as-yet unpatched vulnerability or have unusually lax permissions, your system won't go down.
The moral of the story: keep backups, and keep /etc/shadow unreadable by non-root.
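(If you want to check that last bit on your own box, here's a quick sketch -- assuming a typical Linux layout where /etc/shadow exists:)

```python
import os
import stat

st = os.stat("/etc/shadow")
mode = stat.S_IMODE(st.st_mode)

# /etc/shadow should carry no permissions for "other"; at most a
# read bit for a dedicated group such as "shadow".
if mode & stat.S_IROTH:
    print("warning: /etc/shadow is world-readable -- 'chmod o-r /etc/shadow' fixes that")
else:
    print(f"/etc/shadow mode is {oct(mode)}; ordinary users can't read it")
```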
-
While I'm at it, I prefer BSD-style licenses over GPL-style ones. I'm not against the idea of the GPL per se -- I even think it can be the license of choice in some cases -- but the BSD license, as I see it, means more freedom for everyone, and not just the users. So what if that means someone can take advantage of the code I write, even if (shock! horror!) it could be Microsoft*? The original code will still be there, free for all. Like-minded people will send patches, and the code will survive.
Btw, even RMS has said that the GPL was never intended for things other than source code.
-
Also, as for trusted computing, who said it never surfaced? Can people run whatever they want on their unmodded PlayStations and Xboxes and GameCubes, and play their DVDs and "copy-protected" CDs and iTunes music on "unauthorized clients", such as, say, Linux, without using what are, in some parts of the world, legally questionable hacks? Sure, the former examples are proprietary systems that aren't meant for general use anyway, and the latter can be worked around since they're just software-based, but can we be sure that's the end of it? They're there to combat piracy, sure, but at what cost? And if new restrictions are introduced gradually enough, and software solutions moved to hardware ...
Where, and when, do we draw the line? Say Microsoft had Windows print a warning whenever an unsigned application attempted to run. The warning could be turned off in the control panel, and everyone could have their applications signed for free in any case! For free! Gratis, zip, nada, nothing. Just send it to MS, wait a few weeks or so, and voila. Surely this can only be a good thing, since viruses will never be signed, and everyone can get their apps signed. No problems there, right?
So, after this has generally been accepted, people don't have to turn off the warning. Because of the large amount of work for the software signers, the signing period is increased to a month. Two. A small one-time fee is introduced for faster software signing. But hey, that's understandable, right? They're doing this for our own good, and it's a wonder they could keep signing all those applications for free anyway. The warning is changed to an error. The small fee is made mandatory, which raises a few mutters, but it's still understandable -- the fee is pretty small, and signing is so much work. In fact, even the per-application fee that surfaces some time later is understandable. However, it's still too much work, so now only approved software publishers can get their applications signed. Oh, and did I mention the OS must be signed, too? A modified OS could run pirated games, after all.
Yes, I know that's paranoid. Do I really believe this could happen? Well, no. Or maybe. I don't know. I think it's at least possible that they -- that's the unidentifiable, ominous they, btw -- could try. Greed is a scary thing.
- Gerry
* It's no secret that Microsoft uses BSD-licensed code (or used to, at least, last time I checked). The BSD license still requires you to include the license text, so a list can be found in plain sight in one of their readmes.