http://www.perlmonks.org?node_id=1073777


in reply to Removing non-printing (hex codes) from text files

Do you know what encoding these files are using? There are tools to convert from one encoding to another.
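If a straight re-encoding turns out to be all you need, Perl's core Encode module can do it. A minimal sketch, assuming the files are Latin-1 and you want UTF-8 (the encodings and file names are placeholders; swap in the real ones):

use Encode qw(decode encode);

open my $in,  '<:raw', 'input.txt'  or die "input.txt: $!";
open my $out, '>:raw', 'output.txt' or die "output.txt: $!";
while (my $line = <$in>) {
    # decode the source bytes, then re-encode into the target encoding
    print $out encode('UTF-8', decode('ISO-8859-1', $line));
}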

If you really need to filter out unprintable characters line by line, then

$\ = "\n";
while (<>) {
    s/[^[:print:]]//g;
    print;
}
does this, but it also converts everything to Unix line endings.
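The same filter works as a command-line one-liner; the file names here are just placeholders:

perl -e '$\ = "\n"; while (<>) { s/[^[:print:]]//g; print }' dirty.txt > clean.txt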

tr/// will be faster, however. Characters \177-\377 aren't printable ASCII either, so you may want to filter those too. There's also no need to chomp line endings only to add them back later.

while (<WORK>) {
    tr/\000-\011\013\014\016-\037\177-\377//d;
    print $file $_;
}
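If you want to check the speed claim on your own data, the core Benchmark module can compare the two approaches. A rough sketch, using a made-up sample string:

use Benchmark qw(cmpthese);

# sample line salted with a few control characters and high bytes
my $sample = "foo\tbar\000baz\033[7mqux\r\n" x 50;

cmpthese(-2, {
    's///' => sub { (my $s = $sample) =~ s/[^[:print:]]//g },
    'tr'   => sub { (my $s = $sample) =~ tr/\000-\011\013\014\016-\037\177-\377//d },
});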

Even better is to specify the good characters and remove everything else. Consider using fixed-length input to avoid arbitrarily big line buffers. Finally, you probably want raw binmode to avoid problems with ^Z and the like; see the open pragma.

use open IO => ':bytes';
$/ = \4096;
...
while (<WORK>) {
    tr/\012\015\040-\176//cd;
    print $file $_;
}
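Putting the pieces together, a fuller sketch might look like the following. The file names are invented, the lexical handles stand in for the WORK and $file handles in the original script, and the :raw layer on the two opens is used here instead of the global open pragma:

use strict;
use warnings;

$/ = \4096;   # read fixed-length 4 KB blocks instead of lines

# raw binmode via the :raw layer: no CRLF translation, no ^Z cutoff
open my $work, '<:raw', 'dirty.dat' or die "dirty.dat: $!";
open my $file, '>:raw', 'clean.txt' or die "clean.txt: $!";

while (<$work>) {
    tr/\012\015\040-\176//cd;   # keep LF, CR and printable ASCII, delete everything else
    print $file $_;
}

close $work;
close $file or die "clean.txt: $!";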