huge innfeed process

Russ Allbery rra at stanford.edu
Wed May 24 09:49:33 UTC 2000


Ilya Etingof <ilya at glas.net> writes:

> I've noticed that the size of my innfeed process (taken from
> inn-BETA-20000518 distribution) grows from 40M to 250M overnight. It
> normally serves about 100 connections.

> Why does it become so huge? Is this normal, or is there a memory leak
> somewhere?

I've normally seen this with an innfeed process that's serving out a lot
of binaries to slower peers; it ends up loading those articles into memory
and has quite a few of them in its internal queue.  I separate the innfeed
for large articles from the one for small articles, so I just decreased
max-queue-size from 25 to 5 in the large-article innfeed and that seemed
to bring the memory usage under control.
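For reference, here's a minimal sketch of what that setting looks like in
innfeed.conf (INN 2.x syntax assumed; the peer name below is hypothetical,
not from my actual configuration):

```
# Cap how many articles innfeed queues in memory per connection.
max-queue-size:    5

# A per-peer block can override the global value for a slow peer.
peer bigfeed.example.com {
    max-queue-size:    5
}
```

With 100 connections, each queue slot holding a multi-megabyte binary adds
up quickly, which is why lowering this one knob has such a large effect.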

Currently on newsfeed.stanford.edu, the large article innfeed is consuming
37MB of core instead of around 8MB like the other innfeed processes.  At
25 articles per queue instead of 5 articles, I'd expect it to often use
five times as much memory, which would be right in the same ballpark as
the 250MB you're seeing.

Note that some of that memory is memory mapped article files and therefore
may not be "real."

> I also noticed a great number of temp files left in the innfeed/
> directory. They typically look like newsfeed.domain.com.output.syaWzP so
> there are no lost temp files for "input". What causes innfeed to leave
> these files there? How do I prevent this from happening?

I've never seen this....

-- 
Russ Allbery (rra at stanford.edu)             <http://www.eyrie.org/~eagle/>


