Forrest J. Cavalier III
mibsoft at epix.net
Fri Dec 8 04:37:02 UTC 2000
I think some have missed my key point:
The point is not how sites are providing service today. The point
is how full sites will continue to do a good job in the long term.
Full feed growth is outstripping Moore's law, and that is a problem.
Moore's law is approximate, of course. Keeping costs fixed,
and looking at historical trends:
CPU speeds double every 18 to 24 months.
Magnetic storage capacity doubles every 16 months.
Bandwidth doubles every 12 months.
Suppose we take Joe St. Sauver's estimate that Usenet volume
doubles every 6 months to be true. That means Usenet is
outstripping Moore's law, which is the challenge I hoped
would be discussed.
Is everyone just planning to subset Usenet? No one carries a
full feed? Even that doesn't scale as well as you think.
The current transport mechanism (flood-fill) isn't designed
to work well in a fragmented system. Peering arrangements
and the protocol are too static and coarse to allow selective
distribution. How do you find a peer willing to exchange a feed
of a particular newsgroup, and assure that all sites carrying
that newsgroup remain connected to each other?
Everyone is so used to having a very well connected mesh
of Usenet sites. You dump an article to your peers,
and you can assume it reaches everywhere.
When most sites are carrying and propagating only subsets
of Usenet, that assumption is invalid. How can that be fixed?
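A toy sketch of the failure mode, with invented site names and
peerings: under flood-fill, a site only offers a group to peers that
also carry it, so if the carriers of a group don't form a connected
subgraph of the peering mesh, some of them never see the article.

```python
from collections import deque

# Hypothetical peering mesh: A-B-C-D in a chain.
peers = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}

# Only A and D carry this newsgroup; the sites between them don't.
carries = {"A": True, "B": False, "C": False, "D": True}

def flood(origin):
    """Sites that receive an article for this group, posted at origin."""
    seen, queue = {origin}, deque([origin])
    while queue:
        site = queue.popleft()
        for peer in peers[site]:
            # flood-fill only forwards the group to peers carrying it
            if carries[peer] and peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return seen

print(flood("A"))  # D carries the group but never receives the article
```

In a well-connected full-feed mesh the carriers are always connected,
which is exactly the assumption that breaks when most sites carry only
subsets.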