Jacques B. wrote:
> I had a look at rsync and it is a very handy tool no doubt. I had some idea what it was about but had never played with it. Further to my previous posting on md5deep, I had a momentary brain hiccup. You don't need a full backup to compare with. Rather you generate a file containing all the hashes of your trusted system.
Yes, that can be a quicker check, but I start with the premise that you need the backup anyway, since other things can go wrong. I happen to like backuppc for this when another system is available to run it, because it is completely automatic and has an efficient storage format that can keep a long history online in less space than you would expect. It also has a history view where you can see what files changed and when, over the interval for which you keep backups.
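The trick behind the efficient storage is pooling: identical file contents are stored only once, and every backup that contains the file just links to the pooled copy. Very roughly, the idea is something like this (just a sketch with made-up paths, not backuppc's actual on-disk format):

    # Rough illustration of content pooling: a file's contents are stored
    # once in a pool keyed by its hash, and each backup tree hard-links
    # to the pooled copy.  Hypothetical layout, not the real backuppc format.
    import hashlib, os, shutil

    POOL = '/var/backups/pool'              # made-up path for illustration

    def pool_file(src, backup_dir):
        with open(src, 'rb') as f:
            digest = hashlib.md5(f.read()).hexdigest()
        pooled = os.path.join(POOL, digest)
        if not os.path.exists(pooled):      # first backup pays the disk cost
            shutil.copy2(src, pooled)
        dest = os.path.join(backup_dir, os.path.basename(src))
        os.link(pooled, dest)               # later backups are just links

A hundred nightly backups of a file that never changes cost you one stored copy plus a hundred directory entries, which is why the on-line history can be much longer than you would expect.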
> You could later on run md5deep in check mode using the hash file you generated, and md5deep would report back which files no longer match. Of course you'd have to restore any such file from a backup or re-install from a trusted online repository. The advantage of this for a home user is that it doesn't require a full backup of your system (hence doesn't require all that disk space). md5deep, much like md5sum, simply generates a checksum file, so that is the extent of the additional footprint on your system. It's actually pretty much how Tripwire and similar tools work.
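If anyone wants to see how little machinery is involved, the whole manifest-and-check idea is only a few lines of python (an untested sketch of the concept, not md5deep itself - the paths and usage are made up):

    # Sketch of the md5deep/Tripwire idea: build a manifest of file hashes
    # from a trusted system, then later report anything that changed.
    import hashlib, os, sys

    def file_hash(path):
        h = hashlib.md5()
        with open(path, 'rb') as f:
            for block in iter(lambda: f.read(65536), b''):
                h.update(block)
        return h.hexdigest()

    def build_manifest(root, out):
        with open(out, 'w') as manifest:
            for dirpath, _, names in os.walk(root):
                for name in names:
                    path = os.path.join(dirpath, name)
                    if os.path.isfile(path):
                        try:
                            manifest.write('%s  %s\n' % (file_hash(path), path))
                        except (IOError, OSError):
                            pass            # unreadable file, skip it

    def check_manifest(manifest):
        for line in open(manifest):
            digest, path = line.rstrip('\n').split('  ', 1)
            try:
                if file_hash(path) != digest:
                    print('MODIFIED: %s' % path)
            except (IOError, OSError):
                print('MISSING:  %s' % path)

    if __name__ == '__main__':
        # e.g.  integrity.py build /usr/bin trusted.md5
        #       integrity.py check trusted.md5
        if sys.argv[1] == 'build':
            build_manifest(sys.argv[2], sys.argv[3])
        else:
            check_manifest(sys.argv[2])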
A rootkit will typically replace your md5sum, ps, ls, netstat and similar programs with ones that lie about the programs that were replaced, so you need to be running from a bootable CD to trust the results. It might be possible to make rsync lie the same way, but I doubt that it has been done, since rsync has to match block-checksums through the file against a real copy - and I'd start by restoring an old copy of rsync anyway.
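To see why that would be hard to fake, think of what a trojaned rsync would have to do: for every block of the local file it has to produce checksums that agree with the genuine copy on the other end. Stripped of the clever rolling-checksum parts, the comparison boils down to something like this (a toy sketch, not the actual rsync algorithm):

    # Toy version of block-checksum matching: walk two copies of a file in
    # fixed-size blocks and report the offsets where they disagree.  The
    # real rsync protocol is much smarter (rolling checksums, offsets that
    # can shift), this just shows the basic idea.
    import hashlib

    def differing_blocks(trusted_path, suspect_path, block_size=4096):
        diffs = []
        with open(trusted_path, 'rb') as a, open(suspect_path, 'rb') as b:
            offset = 0
            while True:
                block_a, block_b = a.read(block_size), b.read(block_size)
                if not block_a and not block_b:
                    break
                if hashlib.md5(block_a).digest() != hashlib.md5(block_b).digest():
                    diffs.append(offset)
                offset += block_size
        return diffs

    # e.g. differing_blocks('/mnt/trusted-backup/bin/ps', '/bin/ps')

A replaced binary can't pass that kind of check unless the trojaned tool already knows the contents of the genuine copy, which is why comparing against an offline backup (or booting from read-only media) is worth the trouble.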
> Having said all that, when you get right down to it, all a home user needs to do to be safe is keep the system updated, exercise good judgement (vis-a-vis email attachments, downloading from untrusted sources, phishing attacks), use very good passwords, and put in a cheap home router/gateway (not applicable for dial-up, of course). That, combined with the fact that they are running Linux, does an excellent job of keeping them safe in their single-user environment. Even a home user who runs a web server with a static site, or has ssh enabled but not for root, will be pretty safe if they follow the above.
The updates are the real key here. There have been a huge number of exploitable vulnerabilities fixed over the last several years, and keeping up with those should be your first line of defense. This is a particular problem for distributions like fedora that have a fast life cycle and don't ensure an easy upgrade path from one version to the next. If it is difficult to stay up to date, some number of people will keep running old versions.
> SELinux is an additional layer of security that certainly can't hurt.
The place it can hurt is if it causes enough problems that some number of users don't upgrade to the versions that use it, or don't do timely updates because the updates have a history of introducing new problems. That drops your first and best line of defense.
> In a corporate environment it's obviously very different. Using different means of access control, using other layers of security such as SELinux, implementing physical security measures - these are all things that need to be done, and done properly.
If you are introducing Linux as something new, you can do that. Otherwise, you have to be very careful not to break existing programs and infrastructure with changes and updates.
> I read somewhere online a while back about a test where they hooked up various unpatched Windows systems (different generations of it) and unpatched Linux systems (I don't remember the distros) to the web, totally unprotected. The various Windows versions were all compromised within minutes to hours. None of the Linux ones were. However, when all the updates were applied to these boxes, none of them were compromised (no Windows boxes and no Linux boxes).
Exactly, but the thing they should have compared is the life-span over which you can do this without a re-install from scratch, or the user time involved over the life of a computer. If you had installed Windows 2000 or XP around their SP2 time (or whenever MS introduced on-line updates) and an RH9 or fedora box, how much user time/effort would it have taken to keep those boxes within a few days of available updates? With Windows there would have been a lot of reboots, but nothing with more effort than clicking the update link. With fedora, you'd have gone through perhaps 7 re-installs and, in my case at least, 5 or 6 updates that required selecting an older kernel to even reboot.
If you want a distribution to be more secure in actual use, you have to make it painless to update and never break anything that previously worked - otherwise some number of people just won't do it.
--
  Les Mikesell
   lesmikesell@xxxxxxxxx