I don't really have numbers to show this, but it's pretty easy to
reproduce these sorts of situations. For example, create a couple of
512MB files and delete them at the same time, while trying to accomplish
anything else (like a compile in the background).
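A minimal sketch of that reproduction (my own wrapper; the file size is
reduced to 4MB each so it finishes quickly -- scale MB up to 512 to match
the scenario above and watch other disk users stall during the deletes):

```shell
#!/bin/sh
# Create two large files, then delete them concurrently.
# MB=4 is a quick demo size; the scenario above used 512MB files.
MB=4
dd if=/dev/zero of=big1 bs=1048576 count=$MB 2>/dev/null
dd if=/dev/zero of=big2 bs=1048576 count=$MB 2>/dev/null
sync
rm big1 &
rm big2 &
wait
echo done
```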
Other similar nonsense can be created with stuff like:
#!/bin/sh
while dd if=/dev/zero of=foo.$$ bs=4096 count=16384 1>/dev/null 2>&1 \
        && cat foo.$$ >/dev/null; do
    :
done
Run a few of those at the same time. They will saturate the disk, and
anything else trying to use the disk will have no hope.
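Running a few copies in parallel might look like this sketch (my own
wrapper, not from the script above; the iteration count and file size
are cut down so it terminates quickly -- the original loop writes 64MB
per pass and never exits):

```shell
#!/bin/sh
# Start three copies of a bounded version of the write/read loop,
# wait for them all, then clean up the scratch files.
for i in 1 2 3; do
    ( n=0
      while [ $n -lt 5 ] \
            && dd if=/dev/zero of=foo.$i bs=4096 count=256 2>/dev/null \
            && cat foo.$i >/dev/null; do
          n=$((n+1))
      done ) &
done
wait
rm -f foo.1 foo.2 foo.3
echo finished
```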
I know this last example is abusive. But I don't think deleting large
files is abusive.
I can guess what's going on... but I wouldn't mind hearing it from folks
more familiar with the code.
Dean
-
To unsubscribe from this list: send the line "unsubscribe linux-kernel" in
the body of a message to majordomo@vger.rutgers.edu