Tuesday, November 1, 2011

The Cure For Argument List Too Long

This is one of those things that drives me
crazy. Why does Linux have an artificial limit
on the size of an argument list? So many things
in Linux are dynamic. Why not shell argument
lists too?
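
For anyone who hasn't hit it, the error looks
something like this (the directory name here is just
an example):

$ rm /var/tmp/cache/*
-bash: /bin/rm: Argument list too long

The shell expands the * into one huge argument list,
and the kernel refuses to start a program whose
arguments take up more space than the limit allows.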

Here's someone who proposes 4 different ways to
solve this problem:

Argument List Too Long

His last solution, solution #4, puts the whole
problem in a nutshell. Apparently this limit is
set when you compile your Linux kernel.

Since most of us use precompiled kernels, we're stuck
unless we wish to recompile the kernel.

The variable that controls this is really a C
preprocessor constant (not a true variable). It
is defined like this:

#define MAX_ARG_PAGES 32
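
Assuming the usual 4 KB page size (a kernel build
detail, so your system may differ), that means the
entire argument list, environment included, has to
fit in 32 x 4096 = 131072 bytes, about 128 KB.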

Perhaps there is a philosophical reason for keeping
things the way they are. Perhaps long command lines
that cause you to over-run the maximum allotment for
an argument list are, in themselves, bad ideas.

An overly long command line could well be a bad idea.

For this reason, I feel that the author of the above
article is probably right. Even though using the
find command takes a long time, it does, by its very
nature, allow you to process an unlimited number of files.

At least, I assume it does. Since the find command
locates one file at a time and then spawns and waits
on one process at a time to handle that file, it
trades speed for a small footprint: the memory used
stays tiny, but the find command will take a long,
long time to do what it does.
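
To make that concrete, here is the sort of find-based
command the article has in mind (the directory and
file pattern are just placeholders):

find /var/tmp/cache -name '*.tmp' -exec rm {} \;

Because find hands rm one file name at a time, the
argument list never comes anywhere near the limit,
but you pay for a fresh rm process per file, which is
exactly the speed trade-off described above.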

Perhaps things are the way they are because someone has
put a lot of thought into it and has decided not to change
things. This is often the case.

Ed Abbott