Dotan Cohen wrote:
On 6/8/05, john bray <jmblin@xxxxxxxxxxx> wrote:
On Wed, 2005-06-08 at 01:06 +0300, Dotan Cohen wrote:
On 6/8/05, Alexander Dalloz <ad+lists@xxxxxxxxx> wrote:
Am Di, den 07.06.2005 schrieb Dotan Cohen um 23:39:
Last week I wrote that I somehow filled 7 out of 10 megs in my linux
partition. Today that last bit was filled- I am at 100% capacity.
I cannot download email or create new files. What could be the cause
of this? Where should I look for bloat? What can I delete?
Dotan
This can easily happen if log files fill up quickly. For instance, if you
have Apache running, a fault in your page plus quite a few hits can make
the error_log grow rapidly. So watch out for large log files.
Alexander
--
Alexander Dalloz | Enger, Germany | GPG http://pgp.mit.edu 0xB366A773
legal statement: http://www.uni-x.org/legal.html
Fedora Core 2 GNU/Linux on Athlon with kernel 2.6.11-1.27_FC2smp
Serendipity 23:54:54 up 14 days, 22:32, load average: 0.38, 0.53, 0.49
/var/log is 23 megs (same as last week)
/var is 1.3 gigs (same as last week)
/usr is 3.7 gigs (same as last week)
/proc is 480 megs (same as last week)
I only checked those because those were the biggies last week. The
system is so slow now that it takes a long time for it to calculate
those values. Where else should I look?
Dotan
hey dotan - have you checked the size of /tmp and how many files are in
it? i've seen systems do that when some dolt of a program creates tons
of files in /tmp and doesn't clean them up
john
Yes, when I checked the directory tree I checked /tmp. It is empty. As
about the only thing that I can do on this machine is browse the web,
I have been looking for a command that will show me all large
files/directories. I thought that df would do it, but man doesn't seem
to know of any option that would do this. Nor does google!
How does one go about searching for bloat? All the obvious (logs, tmp,
yum clean all) leave no hints. Last thing that I want to do is to tell
Ety to log onto the windows machine because linux is broken!!!
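[For reference: du, not df, is the tool that reports per-directory usage
(df only works per filesystem). A minimal sketch, using a throwaway tree
under /tmp/du_demo (a made-up path for this demo) so the output is
predictable; on a real box you would point it at / or /var instead:]

```shell
# du (not df) reports usage per directory. Build a tiny throwaway tree
# (the /tmp/du_demo paths are made up for this demo), then sort so the
# biggest subtree floats to the top.
mkdir -p /tmp/du_demo/big /tmp/du_demo/small
dd if=/dev/zero of=/tmp/du_demo/big/file bs=1k count=512 2>/dev/null
dd if=/dev/zero of=/tmp/du_demo/small/file bs=1k count=8 2>/dev/null
# -k reports in kilobytes; --max-depth=1 keeps one line per subdirectory
du -k --max-depth=1 /tmp/du_demo | sort -rn
rm -rf /tmp/du_demo
```

[The total for /tmp/du_demo prints first, then big, then small.]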
Figure out what you consider a "bloated" file, then
find / -size +nnn[c|k] -print
For example, if you feel 200MB is bloated:
find / -size +200000k -print
will display all files 200MB (200,000kB) or larger.
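[To see the recipe above in action on a predictable target, here is a
small self-contained demo; /tmp/bloat_demo is a made-up filename for
illustration only:]

```shell
# Create a 2 MB scratch file, then use find's -size test to locate it.
# +1000k matches files strictly larger than 1000 kB.
dd if=/dev/zero of=/tmp/bloat_demo bs=1k count=2048 2>/dev/null
find /tmp -maxdepth 1 -name bloat_demo -size +1000k   # prints /tmp/bloat_demo
rm -f /tmp/bloat_demo
```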
I thought of waiting for FC4 and just rm'ing the whole drive (I
managed to back up /home/whats_important_to_me onto a 256 XD card with
a usb card reader), but we need the machine daily.
Please, any other ideas about what to check, and what I can erase are
seriously needed right about now. If someone I trust (the names
Dalloz, Lynch and a few others come to mind) want to ssh in, I'll see
if I can set up the daemon. Thanks all, I always appreciate your help.
And your willingness to teach patiently.
Have you checked to see if your machine is swapping? Use "free" and
see how much free memory you have. "vmstat 5" will also show you if
you're swapping.
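[The swap check above can also be sketched by reading /proc/meminfo
directly, which is the same source free(1) reports from; swap in use is
simply SwapTotal minus SwapFree, both in kB:]

```shell
# Compute swap in use from /proc/meminfo (the same data free(1) shows).
awk '/^SwapTotal:/ {t=$2} /^SwapFree:/ {f=$2}
     END {printf "swap in use: %d kB\n", t - f}' /proc/meminfo
```

[A large, growing "swap in use" figure while the box crawls is the
classic sign of thrashing.]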
----------------------------------------------------------------------
- Rick Stevens, Senior Systems Engineer rstevens@xxxxxxxxxxxxxxx -
- VitalStream, Inc. http://www.vitalstream.com -
- -
- Is that a buffer overflow or are you just happy to see me? -
----------------------------------------------------------------------