Frequently I need to run a process for each item in a list stored in a text file, one item per line: usernames, filenames, e-mail addresses, etc. Obviously there are more than three ways to do this, but here are three I have found useful:
Bash
sh prog1.sh list.txt
Source: prog1.sh
while read -r line
do
echo "$line"
done < "$1"
4 lines. Not bad.
Perl
perl prog2.pl list.txt
Source: prog2.pl
while(<>) {
print `echo $_`;
}
3 lines. Pretty good.
Perl -n
perl -n prog3.pl list.txt
Source: prog3.pl
print `echo $_`;
1 line! The -n switch basically wraps your Perl code in a loop that processes each line of the input file. I just discovered this while flipping through my 17-year-old copy of Programming Perl (link is to a newer edition).
I really like this method because you can write a script that processes a single input that could easily be reused by another script, but can also easily be used to process an entire list by adding just the -n switch. (There’s also a similar -p switch that does the same thing, but additionally prints out each line.)
I should note that in the examples above, I am using echo as a substitute for any command external to the script itself. In the Perl examples, there would be no need to call echo to merely print the contents of the line, but it’s a convenient stand-in for a generic command.
As suggested by a comment on a previous post, I have made these examples available in a git repository: iterate over lines.
You can put the bash loop on one line:
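The one-line form the comment describes might look like this (the same while/read loop as prog1.sh, collapsed with semicolons; the sample list.txt is only there to make the snippet self-contained):

```shell
# Sample input, one item per line:
printf 'alice\nbob\ncarol\n' > list.txt
# prog1.sh's loop on a single line:
while read -r line; do echo "$line"; done < list.txt
```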
If you're dealing with a whitespace-delimited file (all items on one line) you can use a for loop instead:
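A sketch of that for-loop variant: the unquoted $(cat ...) is split on whitespace, so every whitespace-separated word becomes one iteration, which is why this only suits lists whose items contain no spaces:

```shell
# Sample whitespace-delimited input:
printf 'alice bob carol\n' > list.txt
# Word splitting turns each word into one loop iteration.
for item in $(cat list.txt); do echo "$item"; done
```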
xargs is another option.
- -L 1 limits the number of items from the input file per execution
- -a input_file specifies the file (or arg-file)
Note that the -a option for xargs is available in the GNU version. If you are using another version of xargs, try this instead:
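A sketch of both invocations, again with echo standing in for the real command (the sample list.txt is just for self-containment):

```shell
# Sample input, one item per line:
printf 'alice\nbob\ncarol\n' > list.txt
# GNU xargs: -a reads items from the file, -L 1 runs the command once per line.
xargs -L 1 -a list.txt echo
# Portable alternative for non-GNU xargs: feed the file on stdin instead.
xargs -L 1 echo < list.txt
```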
The Perl option can be used directly from the command-line as well: