Re: Source code in programming faq

---------

Vic Metcalfe (vam@brutus.tlug.org)
Tue, 23 Apr 1996 22:05:55 -0400 (EDT)


Thanks for all your suggestions. I should clarify a couple of things.

- The size of the source code is about 41K of uncompressed text, which
compresses to a 10K .gz file, or about 14K once uuencoded (uuencoding
expands data by roughly a third, since it encodes every three bytes as
four printable characters).

- My idea for providing the source in shell-script format was to write
my own shar-like utility that would generate more human-readable
scripts. Here is a short example of what I mean:

cat << END_OF_SAMPLE > sample.c
/****************************
** This is a sample file. **
****************************/
int main() {
return(0);
}
END_OF_SAMPLE

Most of this is easy for anyone to read. Shar files aren't too bad,
but they aren't quite this readable.
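
The generator itself could be nearly trivial. Here is a rough sketch
(the script name is made up, and a real version would also need to
cope with a file that happens to contain its own delimiter line):

#!/bin/sh
# mkreadable -- sketch of the shar-like generator described above.
# For each file named on the command line, emit a plain here-document
# that recreates the file when the output is fed back through sh.
for f in "$@"
do
    # Quote the delimiter so $ and ` in the source are not expanded
    # when the generated script is run.
    echo "cat << 'END_OF_$f' > $f"
    cat "$f"
    echo "END_OF_$f"
    echo
done

Running "mkreadable sample.c > samples.sh" would produce output much
like the fragment above.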

- I'm not really too worried about the ethics of including binaries,
since I'm convinced that they are a really important part of the faq,
and reader feedback has confirmed that opinion. Besides, at 14K they
aren't large enough to generate much complaint.

I think I am left with these options:

1 - Provide the source by ftp, and point those without ftp access to
an ftp-mail gateway. I like this idea. I just called a friend with an
ftp server, and I can put it there.

2 - Put it in the faq, as it was, and go over 64K. I run brutus here,
so I can update/reconfigure/whatever the news software to handle it,
but since some systems reportedly can't handle incoming news articles
bigger than 64K, I don't think this is a good idea.

3 - Go multi-part, and put the source in a separate section, either
tarred, gzipped and uuencoded, or as a non-shar, human-readable shell
script.

I like 1 and 3.

1 is good because it is easy, but on the other hand the faq is
growing, and I think it'll be bigger than 64K soon anyway, so I might
have to go multi-part regardless. 1 also saves bandwidth on usenet.
The problem with 1 is that I want the faq to be user-friendly: when
Jane Blow obtains it, I don't want to make her chase down the samples
that are referred to in the faq. This problem is worst for the
uucp/fidonet people, but even the PPP/SLIP people will be annoyed when
they've downloaded it, hung up, and then find they have to reconnect
because they didn't get it all after all.

3 is good because it keeps the thing together as a unit, and people
are more likely to get it all at once. Having it in fragments isn't a
bad thing, since it has to be split into chunks below 64K anyway. The
question then is whether to stay with the tar/gz/uu format, or to do
the shell script for human readers. I think the tar/gz/uu format is
better since it is much smaller, and my audience is technical enough
that I haven't had a single person email to ask me what the heck it
means.
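
For comparison, building and unpacking the tar/gz/uu bundle comes down
to a handful of standard commands (a sketch only; the examples/
directory and the file names here are just for illustration):

# Pack: archive the sample directory, compress it, and encode the
# result as printable text suitable for posting.
tar cf - examples | gzip -9 | uuencode examples.tar.gz > examples.uue

# Unpack: decode back to examples.tar.gz, then decompress and extract.
uudecode examples.uue
gzip -dc examples.tar.gz | tar xf -

So a technical reader needs only two commands to unpack it, versus
none for the shell-script form; that is really the whole trade-off.
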
I'm only concerned because of this quote from an email I received from
a contributor (andrewg@microlise.co.uk):

"As far as the examples go, *don'* post them uuencoded as a separate
article. If you do, it will fall foul of cancel-bots targeted at binary
postings. (I think the previous versions escaped this fate since they
contained a large proportion of non-binary data)."

Does anyone know if systems really still choke on large usenet
postings? Are these cancel-bots a reality, or myths from ancient
times? It seems to me that with things like the alt.binaries.* groups
being popular, anything that couldn't handle big articles would be
rare. Should I just forget the problem and go bigger than 64K?

Thanks once again for your comments,
Vic.


