These files are sent in via SFTP or FTP and may be encrypted with gpg; they may also be zipped or compressed with gzip. Once the files are unzipped, decompressed, or decrypted, I'm expecting a plain ASCII text file.
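For concreteness, the ingest step you describe might be sketched roughly like this; the helper name, paths, and gpg flags here are my assumptions, not anything from your pipeline:

```shell
# Hypothetical helper: strip one layer of wrapping (gpg, then gzip)
# from an incoming file and print the resulting plain-text path.
# gpg decryption assumes the key is already in the keyring.
normalize() {
    f=$1
    case "$f" in
        *.gpg) gpg --batch --quiet --decrypt "$f" > "${f%.gpg}" && f=${f%.gpg} ;;
        *.gz)  gunzip -f "$f" && f=${f%.gz} ;;
    esac
    echo "$f"
}
```

A file named `report.txt.gz.gpg` would need two passes through a helper like this, one per wrapping layer.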
Without knowing all the ins and outs of your pipeline, I'd almost say (in my admitted ignorance) that the whole "check for ASCII" test is superfluous.
Wouldn't the upstream process be responsible for checking if the decryption/decompression was successful, and wouldn't the downstream process be responsible for checking for well-formed data? Is there a particular case you are trying to guard against?
It seems to me that a plain -T $file should be sufficient to catch a rogue encrypted and/or compressed file that made it past the first process without triggering an error (although what happens if the file is Base64 or otherwise ASCII-armored?).
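To illustrate both the check and its blind spot, here is a small sketch of the -T test wrapped in shell (file names are made up for the demo):

```shell
# Demonstration of Perl's -T file test.  -T samples the start of the
# file and guesses "text" from the byte mix, which is why gzip/gpg
# output fails it but Base64 / ASCII-armored data (all printable
# ASCII) would still pass.
printf 'hello world\n' > /tmp/sample_text.txt
printf '\000\001\002\377' > /tmp/sample_binary.bin   # NUL byte => binary

for f in /tmp/sample_text.txt /tmp/sample_binary.bin; do
    if perl -e 'exit(-T $ARGV[0] ? 0 : 1)' "$f"; then
        echo "text:   $f"
    else
        echo "binary: $f"
    fi
done
# prints:
#   text:   /tmp/sample_text.txt
#   binary: /tmp/sample_binary.bin
```

So as a backstop against an undecrypted or uncompressed file slipping through it works well, but it is a heuristic, not a validator.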