[bind10-dev] Thoughts on Architecture of Xfrin & Notify-In/Out

Shane Kerr shane at isc.org
Mon May 17 16:23:27 UTC 2010


On Mon, 2010-05-17 at 10:42 -0500, Michael Graff wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
> 
> On 2010-05-16 11:53 PM, Shane Kerr wrote:
> 
> > BUT... if we assume reasonable zone refresh rates and/or relatively
> > decent connectivity, then this won't be such a big problem. Consider
> > that with 100k zones and a 2-hour refresh rate, we should see only a few
> > 10s (20? 30?) of transfers in progress during normal operation. Someone
> > with 100k zones won't be surprised to see lots of processes! ;)
> 
> Wait, I thought xfer-in was called only when there was something to
> transfer, not for a SOA check or simple refresh operation.

Good point... during "normal" operation there should be only a very few
zones being transferred.
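For what it's worth, the "few 10s" figure quoted above can be reproduced
with Little's law (concurrency = arrival rate x duration), under the
worst-case assumption that every refresh check triggers a transfer and a
hypothetical average transfer time of about 2 seconds (both numbers are
illustrative, not measured):

```python
# Back-of-the-envelope estimate of concurrent transfers (Little's law:
# L = lambda * W), assuming every refresh check triggers a transfer.
zones = 100_000
refresh_interval_s = 2 * 3600      # 2-hour refresh cycle
transfer_duration_s = 2            # assumed average transfer time

checks_per_second = zones / refresh_interval_s            # ~13.9/s
concurrent_transfers = checks_per_second * transfer_duration_s

print(round(checks_per_second, 1))     # 13.9
print(round(concurrent_transfers))     # 28
```

Which lands right in the "20? 30?" range; and as Michael points out, the
real number should be lower still, since most checks find nothing to
transfer.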

> > So I think it is not always possible to limit the number of XFR in
> > progress to a "reasonable" number. So I think a threaded (or
> > event-driven) model makes sense here.
> 
> I will still throw in for the "keep it simple, make it complicated
> later" approach of single threaded design and implementation.

I'm not sure it is any simpler. For the most part, good threaded design
and coding is just good coding, at least for problems like this one,
which are largely parallel. But see below...

> > The XFR code is relatively simple now, and if it remains simple we can
> > be more sure that it is robust. I propose we go forward with the
> > threaded model until we discover that it is broken, and then fix it if
> > necessary.
> 
> I propose the opposite, for the very same reasons:  the code is simple,
> keep it simple until there is a reason to complicate it.

Ideally the two are 99% the same in Python. In the threaded case you use
the threading module:

----------------------------------------------
import threading

def work(data):
    print("I got some data:", data)

t = threading.Thread(target=work, args=(123,))
t.start()
t.join()
----------------------------------------------

In the case of multiple processes you use the multiprocessing module:

----------------------------------------------
import multiprocessing

def work(data):
    print("I also got some data:", data)

# The __main__ guard is required when the start method is "spawn"
# (e.g. on Windows), where the child re-imports this module.
if __name__ == '__main__':
    p = multiprocessing.Process(target=work, args=(123,))
    p.start()
    p.join()
----------------------------------------------

This works fine as long as you don't rely on shared global state and the
like.

http://docs.python.org/py3k/library/threading.html
http://docs.python.org/py3k/library/multiprocessing.html
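The global-data caveat above is easy to demonstrate: a thread mutates
the same module-level object as its parent, while a child process only
mutates its own copy. A minimal sketch (the names here are just for
illustration):

```python
import multiprocessing
import threading

results = []  # module-level state, shared only within one process

def work(data):
    # In a thread this mutates the parent's list; in a child process
    # it mutates the child's private copy, invisible to the parent.
    results.append(data)

def run_thread():
    t = threading.Thread(target=work, args=(1,))
    t.start()
    t.join()

def run_process():
    p = multiprocessing.Process(target=work, args=(2,))
    p.start()
    p.join()

if __name__ == '__main__':
    run_thread()
    run_process()
    print(results)  # [1] -- the process's append never reaches us
```

So code that is "99% the same" under both modules still has to treat all
cross-worker state as message-passing, not shared memory.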

I guess we could try to support both models from the beginning. Since
the multiprocessing module was designed to be as close to the threading
module as practical, this shouldn't be too tricky.
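Since the two classes share the start()/join() interface, supporting
both could be as small as picking the class from a configuration flag.
A sketch of what I mean (start_worker and use_processes are made-up
names, not anything in the BIND 10 tree):

```python
import multiprocessing
import threading

def work(data):
    print("got data:", data)

def start_worker(target, args, use_processes=False):
    # Thread and Process expose the same constructor and the same
    # start()/join()/is_alive() interface, so one code path serves both.
    cls = multiprocessing.Process if use_processes else threading.Thread
    worker = cls(target=target, args=args)
    worker.start()
    return worker

if __name__ == '__main__':
    for flag in (False, True):
        w = start_worker(work, (123,), use_processes=flag)
        w.join()
```

The rest of the XFR code would then never need to know which model it is
running under.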

--
Shane
