From: Brad Templeton (email@example.com)
Date: Tue Sep 09 1997 - 12:33:40 CDT
Hmm. My feeling was (and I didn't see much disagreement) that multi-part
articles were a bad idea, to be strongly deprecated.
The reason is this. If a site puts on an article size limit, it puts it
on for a reason, and that reason is to control use of its own resources,
which means the goal is not to have people "get around" the limit by
breaking an article up into pieces.
Multiparts are a pain. The software to reassemble them is complex because,
among other things, the parts can come in any order. Often single parts
are missing, either temporarily or forever, making the entire multipart
useless in many cases.
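To see why, here's a minimal sketch of the reassembly bookkeeping in
Python (the class, the field names, and the "part N of M" numbering are
my own illustration, not anything from a spec): parts have to be buffered
under a shared identifier, and one part that never shows up strands all
the others forever.

    from typing import Dict, Optional

    class Reassembler:
        def __init__(self) -> None:
            # ident -> {part number -> body}; parts arrive in any order
            self.pending: Dict[str, Dict[int, str]] = {}

        def add_part(self, ident: str, num: int, total: int,
                     body: str) -> Optional[str]:
            parts = self.pending.setdefault(ident, {})
            parts[num] = body
            # Can't emit anything until every part from 1..total is here,
            # which may be never if a single part was lost in transit.
            if any(i not in parts for i in range(1, total + 1)):
                return None
            del self.pending[ident]
            return "".join(parts[i] for i in range(1, total + 1))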
The limit on the size of an article a site can take is not supposed to be
a limit in the software, at least not any limit below, say, 5 megabytes.
There is no excuse for writing software that can't handle a 5 megabyte
article today, not when RAM is $3 per megabyte and disk is 10 cents.
So if there is a limit, it's a software-configured site limit, in which
case doing a multipart to get around it should be considered net abuse.
I would have the spec say multipart MAY be used ONLY if an article
exceeds 5 megabytes, and that readers should assemble such multiparts
if they wish to, but local sites may also discard them.
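Here's a rough sketch of the check I have in mind, again in Python (the
names, the 64 megabyte site ceiling, and the idea of a declared total
size are illustrative assumptions, not draft language): reject anything
over the locally configured limit, and treat a multipart whose declared
whole is under 5 megabytes as an end run around somebody's limit.

    from typing import Optional

    MULTIPART_FLOOR = 5 * 1024 * 1024   # proposed: no multiparts below this
    SITE_LIMIT = 64 * 1024 * 1024       # locally configured, not hardwired

    def accept(size: int, multipart_total: Optional[int]) -> bool:
        # Over the site's own configured ceiling: refuse it outright.
        if size > SITE_LIMIT:
            return False
        # A multipart whose declared whole is under the floor exists
        # only to get around a limit; count that as net abuse.
        if multipart_total is not None and multipart_total < MULTIPART_FLOOR:
            return False
        return True

A site that wants no multiparts at all just discards on the presence of
the declared-total field, which the spec would permit.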
Multiparts were a kludge developed in the days when 50kb was "a lot," mostly
to handle binaries and the odd very long text article. They are little
but a pain in the butt today.
On internal subnets, people will start sending video around using USENET
(it's actually the right way to send video, since it's on very large
files that the efficiency of pre-distribution really shines through), but
that's the only thing likely to push modern limits.