There are places where embedded \000 characters will get you into trouble, but Perl
can read and write such files without problems, since Perl uses counted strings.
Be warned, though, that if you pass such strings to anything like a system routine (such as open() or -x) that takes null-terminated strings as arguments, you will have problems or security holes.
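A minimal sketch of why that matters. Perl's counted strings keep every byte, but a C-level syscall would stop at the first NUL, so a defensive check before handing the string to open(), system(), etc. is prudent (the filename here is purely illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A hypothetical filename with an embedded NUL: Perl counts all 14
# bytes, but a C-level syscall would see only "safe.txt".
my $name = "safe.txt\0.evil";
print length($name), "\n";   # 14 -- Perl counts every byte

# Defensive check before passing the string to open(), -x, system(), etc.
die "refusing filename with embedded NUL\n" if $name =~ /\0/;
```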
| [reply] |
That answers it nicely. As is, I'm planning to slurp it all
in and then do a
@msgid = split /\000/, $data;
foreach $j (@msgid)
{ $mid{$j}++; }
(reason? I want to filter my POP account by already-downloaded
message-IDs)
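Putting the pieces together, a complete sketch of that filter might look like this (the filename msgids.dat is a hypothetical stand-in; the post doesn't name one):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical file of NUL-separated message-IDs.
my $file = 'msgids.dat';

open my $fh, '<', $file or die "Can't open $file: $!";
my $data = do { local $/; <$fh> };   # slurp: undef $/ reads the whole file
close $fh;

# Split on the NUL separators and count each message-ID.
my %mid;
$mid{$_}++ for split /\000/, $data;

# Later, while talking to the POP server, skip anything already seen:
# next if $mid{$msgid};
```

Using `local $/` inside a `do` block confines the change to the slurp, so the rest of the program keeps normal line-based reads.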
--
Perl is intergalactic! WolfSkunks use it!
| [reply] [d/l] |
If by "slurp it all in" you mean something like:
$/ = undef;
$_ = <>;
then Perl will certainly have no trouble at all reading all the data and assigning it all to $_, no matter what it contains. It will print it all out to STDOUT or any file handle as well (but you may need to be careful if the file handle is actually a pipe to a less forgiving process).
Plain old line-based I/O will also treat nulls just like any other character that isn't (part of) a line termination -- lines containing nulls will be fully read and written.
Of course, sometimes it makes sense to use nulls as the input record separator:
# one way to replace nulls with newlines:
$/ = "\x00";
while (<>) {
    s/\x00/\n/;
    print;
}
In addition to matching nulls exactly with "\x00" in regexes, Perl also matches them via the "." wildcard, and the negated character classes "\S", "\D", "\W", and so on.
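A quick demonstration of those claims, assuming a throwaway string with an embedded NUL:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $s = "a\x00b";

print "\\x00 matches NUL\n" if $s =~ /a\x00b/;  # exact match
print ". matches NUL\n"     if $s =~ /a.b/;     # NUL is not a newline
print "\\D matches NUL\n"   if $s =~ /a\Db/;    # NUL is not a digit
print "\\S matches NUL\n"   if $s =~ /a\Sb/;    # NUL is not whitespace
print "\\W matches NUL\n"   if $s =~ /a\Wb/;    # NUL is not a word char
```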