I am downloading files (in parallel) that contain very large amounts of data (fastq files) into a directory, and I am running out of space quickly. So I got the following script (from here, modified slightly) to compress files as they are downloaded:
inotifywait -m ./ -e create -e moved_to |
while read -r dir action filepath; do
    echo "The file '$filepath' appeared in directory '$dir' via '$action'"
    # compress the file (match the .fastq extension, not just the suffix "fastq")
    if [[ "$filepath" =~ \.fastq$ ]]; then
        pigz --best "$dir$filepath"
    fi
done
This helped in that I now run out of hard drive space later than before, but I'm still downloading files faster than I'm compressing them. Is there a way to parallelize the compression so that I'm compressing multiple files at the same time? (I'm assuming the above code doesn't do that.)
One way I can think of to (perhaps) accomplish this is by running the script multiple times from different terminals, but I'm pretty sure that's a lousy way to do it.
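For context, here is a minimal sketch of the "multiple jobs at once" idea without multiple terminals: launch each compression in the background with `&` and cap how many run concurrently. This assumes bash >= 4.3 (for `wait -n`), and it uses `gzip -9` on dummy files as a self-contained stand-in for `pigz --best` on real downloads; `max_jobs` is a made-up tuning knob, not anything from the original script.

```shell
#!/usr/bin/env bash
# Sketch: run compressions in the background, but never more than
# $max_jobs at a time. gzip stands in for pigz so this runs anywhere.
max_jobs=4

# Create a few dummy .fastq files to stand in for downloaded data.
for f in a.fastq b.fastq c.fastq; do printf 'ACGT\n' > "$f"; done

for f in *.fastq; do
    # If $max_jobs compressors are already running, wait for one to exit.
    while (( $(jobs -rp | wc -l) >= max_jobs )); do
        wait -n
    done
    gzip -9 "$f" &    # stand-in for: pigz --best "$f"
done
wait    # let the remaining background jobs finish
```

In the real watcher loop, the `gzip` line would be the `pigz` call, backgrounded the same way. Note also that `pigz` itself already uses multiple cores to compress a single file, so backgrounding mainly helps when single-file compression can't keep up with the download rate.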