blocking pipe read from zombie process?

From: Nick Cabatoff (ncc@cs.mcgill.ca)
Date: Thu Jun 15 2000 - 14:03:17 EST


Apologies if this isn't the right place for this question... I guess
I'm still trying to localize my problem, and would appreciate any
advice.

In my perl script I read from a file handle which happens to be a pipe.
At some point the child process I'm reading from terminates normally
(and becomes a zombie), but the final read, which should return
immediately with no data (EOF), instead never returns at all.
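
The relevant part of the script is just a piped open and a line-by-line
read loop, roughly like this (a simplified stand-in, not the real code;
the command name here is made up):

    open(CHILD, "some_command |") or die "can't start child: $!";
    while (defined(my $line = <CHILD>)) {
        chomp $line;
        # ... process $line ...
    }
    # the loop should end with <CHILD> returning undef (EOF)
    close(CHILD);   # close() on a piped open also waits for the child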

I'd be inclined to blame this on perl except for a couple of oddities.
One is that ps lax shows the perl process sleeping with a WCHAN of
pipe_read. Also, the ltrace output shows:

_IO_getc(0x081e69f8) = '\n'
memmove(0x081e7700, 0xbfffdc70, 63, 63, 0x080c14ac) = 0x081e7700
realloc(0x08251ce8, 8367) = 0x08251ce8
memmove(0x08253d57, 0x081e7700, 63, 0x081de980, 0x080c1d4c) = \
  0x08253d57
strncmp("0 packages upgraded, 0 newly ins"..., "0 packages upgraded, \
  0 newly ins"..., 73) = 0
_IO_getc(0x081e69f8 <unfinished ...>
--- SIGTERM (Terminated) ---

The first line cited is the last byte read from the pipe. At that point
the other process has already exited, and I know it didn't print anything
after that newline. Then we try to read another byte and block
indefinitely, instead of returning immediately with no data. So it
doesn't look like perl's fault to me.
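
For comparison, here's the behaviour I'd expect from pipe semantics: a
read returns EOF as soon as every copy of the write end is closed, and
only blocks while some process still holds a write end open. A minimal
standalone sketch (not my actual script) that does see EOF promptly:

    use strict;

    pipe(RD, WR) or die "pipe: $!";
    my $pid = fork();
    die "fork: $!" unless defined $pid;

    if ($pid == 0) {
        close RD;                   # child is the writer only
        print WR "last line\n";
        close WR;
        exit 0;                     # child exits; it's a zombie until reaped
    }

    close WR;                       # parent drops its copy of the write end;
                                    # if it kept it open, the read below would
                                    # block forever instead of seeing EOF
    while (defined(my $line = <RD>)) {
        print "read: $line";
    }
    print "EOF, no blocking\n";     # reached as soon as the child has exited
    waitpid($pid, 0);               # reap the zombie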

This is on 2.0.35/6 and 2.2.12 (perl 5.00404, if that matters).

Any suggestions?



