Any reason not to split perl filtering?
davidsen at tmr.com
Tue Dec 2 20:52:23 UTC 2003
In article <6F100264-247E-11D8-A872-000A9589374A at meowing.net>,
Greg Andruk <gja at meowing.net> wrote:
| On Dec 2, 2003, at 2:51 AM, bill davidsen wrote:
| > Other news programs do this, but I was actually going to go pthreads.
| > As long as I use a lock to keep from running more than one at a time
| > and exploring how thread-safe the perl stuff might be, I think I can
| > do what I need.
| Maybe I'm missing something, but unless the filter is spending a lot of
| time counting things between articles, isn't this still going to end up
| running pretty much sequentially?
If I can't make this machine run faster I have to disable filtering
completely. Given that as the choice, it's time to at least try to get
something running in parallel. The single-threaded, small-memory design
is showing its age; moving to faster and faster CPUs to prop up a
single-thread design just doesn't scale. And my next target will be the
input logic: big selects are slow, and the per-socket throughput is
poor as well.
I have one more idea, but that one is so odd I'm not ready to even
admit to having such a thought ;-)
bill davidsen <davidsen at tmr.com>
CTO, TMR Associates, Inc
Doing interesting things with little computers since 1979.
More information about the inn-workers mailing list