Any reason not to split perl filtering?

Russ Allbery rra at stanford.edu
Tue Dec 2 03:14:27 UTC 2003


bill davidsen <davidsen at tmr.com> writes:
> Russ Allbery  <rra at stanford.edu> wrote:

> | You take a fairly sizable I/O performance hit by doing this since you
> | have to start talking back and forth with a separate process (hence
> | going through the kernel).  Whether that's better or not depends a lot
> | on the system configuration.

> Other news programs do this,

Yup, I know.  They take a sizable performance hit by doing so.  :)
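
For concreteness, the round trip under discussion looks roughly like
this.  A minimal sketch, not how innd is actually structured: the
one-byte verdict protocol and the trivial stand-in filter child are
invented for illustration.

    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        int to_child[2], from_child[2];
        pid_t pid;

        if (pipe(to_child) < 0 || pipe(from_child) < 0) {
            perror("pipe");
            return 1;
        }
        pid = fork();
        if (pid < 0) {
            perror("fork");
            return 1;
        }
        if (pid == 0) {
            /* Child: stand-in for the filter.  Read the article and
               write back a one-byte verdict ('Y' = accept). */
            char buf[8192];

            close(to_child[1]);
            close(from_child[0]);
            if (read(to_child[0], buf, sizeof buf) > 0)
                write(from_child[1], "Y", 1);
            _exit(0);
        }

        /* Parent: each article costs at least one write() and one
           read(), i.e. two trips through the kernel, plus the copies
           into and out of the pipe buffers. */
        {
            const char article[] = "Path: example!not-for-mail\r\n...";
            char verdict = 'N';

            close(to_child[0]);
            close(from_child[1]);
            write(to_child[1], article, sizeof article - 1);
            close(to_child[1]);     /* EOF tells the child we're done */
            read(from_child[0], &verdict, 1);
            printf("verdict: %c\n", verdict);
            waitpid(pid, NULL, 0);
        }
        return 0;
    }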

> but I was actually going to go pthreads. As long as I use a lock to keep
> from running more than one at a time and explore how thread-safe the
> perl stuff might be, I think I can do what I need.

Threading and Perl do not mix at all well.  Some people have made the
combination work, but be prepared for a lot of instability.
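
If you do try it, the serialization you describe is at least easy to
express.  A minimal sketch; run_perl_filter() here is a hypothetical
stand-in for whatever perl_call_* sequence the hook performs, not
anything that exists in INN:

    #include <pthread.h>
    #include <stddef.h>

    /* Stand-in for the real call into the embedded interpreter. */
    static int run_perl_filter(const char *article, size_t len)
    {
        (void) article;
        (void) len;
        return 1;               /* pretend the article passed */
    }

    /* One global lock so that no two threads are ever inside the
       interpreter at the same time. */
    static pthread_mutex_t perl_lock = PTHREAD_MUTEX_INITIALIZER;

    int filter_article_locked(const char *article, size_t len)
    {
        int status;

        pthread_mutex_lock(&perl_lock);
        status = run_perl_filter(article, len);
        pthread_mutex_unlock(&perl_lock);
        return status;
    }

Note that the lock serializes all of the filtering anyway, so the
threads only buy you overlap for the non-Perl work.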

> My fallback is to use two processes sharing a memory segment large
> enough to hold the largest possible article.  That would cut the
> overhead down to one memcpy, although even that is not desirable.

Actually, I think writing to a pipe is going to be comparable in speed
to using shared memory, if not faster.  But my information may be dated.
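
For what it's worth, the writer's side of that scheme would look
something like this.  A minimal sketch using System V shared memory;
the key, the size limit, and the reader handshake (which I've omitted
entirely) are all invented for illustration.

    #include <stdio.h>
    #include <string.h>
    #include <sys/ipc.h>
    #include <sys/shm.h>
    #include <sys/types.h>

    #define MAX_ART_SIZE (1024 * 1024)  /* invented article size limit */

    int main(void)
    {
        key_t key;
        int shmid;
        char *seg;
        const char article[] = "Path: example!not-for-mail\r\n...";

        key = ftok("/tmp", 'A');        /* invented key */
        if (key == (key_t) -1) {
            perror("ftok");
            return 1;
        }
        shmid = shmget(key, MAX_ART_SIZE, IPC_CREAT | 0600);
        if (shmid < 0) {
            perror("shmget");
            return 1;
        }
        seg = shmat(shmid, NULL, 0);
        if (seg == (char *) -1) {
            perror("shmat");
            return 1;
        }

        /* The single copy mentioned above.  Telling the reader the
           article is there (a semaphore, a byte down a pipe) still
           costs a system call, which is why a plain pipe may be just
           as fast in practice.  A real pair of processes would also
           arrange shmctl(IPC_RMID) cleanup. */
        memcpy(seg, article, sizeof article);

        shmdt(seg);
        return 0;
    }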

-- 
Russ Allbery (rra at stanford.edu)             <http://www.eyrie.org/~eagle/>

    Please send questions to the list rather than mailing me directly.
     <http://www.eyrie.org/~eagle/faqs/questions.html> explains why.

