Let me first say that I almost always prefer Perl over the shell as well. But I don't always agree with your arguments; they are a bit too black and white for me.
Writing portable shell scripts is hard. I don't think it's hard, although it is easy to write something that is not portable. But you have several options to make porting easier. The first is to limit yourself to the POSIX standard; most Unix vendors support at least POSIX compliance in their shell tools. The second is to use the GNU tools, and install them on all the platforms your program needs to run on. The GNU tools have been ported to all major Unix platforms, and most run on Windows as well. Is each external command available on each of our platforms? That's a valid question to ask when porting a shell program, but an equally valid question is whether each Perl module you use is available on each of the platforms you use. That may usually be true for pure-Perl modules, but it is not always true for XS modules. Especially not if those modules use third-party libraries.
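To illustrate the first option, here's a minimal sketch of sticking to POSIX constructs (the helper name and the *.log example are mine, purely for illustration) - no bashisms like [[ ]], arrays, or local:

```shell
#!/bin/sh
# POSIX-portable sketch: count the *.log files in a directory.
# Avoids bashisms: uses [ ] instead of [[ ]], $(( )) arithmetic,
# and plain variables instead of arrays or 'local'.
count_logs() {
    dir=${1:-.}
    n=0
    for f in "$dir"/*.log; do
        # When the glob matches nothing, it stays literal,
        # so check that the file actually exists.
        [ -e "$f" ] && n=$((n + 1))
    done
    echo "$n"
}
```

Any POSIX-compliant /bin/sh - dash, ksh, bash in sh mode - runs this unchanged.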
Shell scripts tend to be fragile. Running commands and scraping their output is an inherently fragile technique, breaking when the format of the command output changes. My experience is that the behaviour of shell commands tends to change less than that of Perl modules. Sure, a shell command may output something different if you upgrade it, but a Perl module may change its behaviour as well if you upgrade it. And Perl modules don't have a standard - shell programs do: POSIX. And then there's perl itself. In my 25+ years of programming, I cannot recall a program breaking because of an upgrade of the shell. An upgrade of Perl always tends to break at least one of my programs in some way.
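To be fair to the original point, scraping output *can* be fragile - but you can usually avoid the scraping. A small sketch (function names are mine): getting a file's size by parsing ls -l versus asking wc -c directly:

```shell
#!/bin/sh
# Fragile: scrape the size column out of `ls -l`. The column layout
# and date format can vary with platform and locale.
size_fragile() {
    ls -l "$1" | awk '{print $5}'
}

# Less fragile: wc -c reads the file itself, so there is no formatted
# listing to scrape. POSIX allows leading blanks in wc output, hence
# the tr to strip them.
size_robust() {
    wc -c < "$1" | tr -d ' '
}
```

The second form depends only on behaviour POSIX actually specifies, which is the discipline that keeps shell scripts from breaking.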
Shell scripts tend to be slow. That's 'the pot calling the kettle black', isn't it? Agreed, the shell doesn't shine when it comes to speed, but neither does Perl, does it? Luckily, on modern machines it's usually the disk or the network (or the memory) that's the bottleneck, so the relative slowness of the language itself doesn't really matter. But in the few cases where it did, I've always been amazed by the speed of C compared to Perl (every now and then I rewrite a Perl program in C for speed reasons).
Shell scripts tend to be insecure. Running an external command in a global environment is inherently less secure than calling an internal function. Eh, not necessarily. Languages don't make programs (in)secure; programmers do. And frankly, I have more trust in what my vendor puts in /usr/bin than in what I download from CPAN. I do not agree that running an "external command" is "inherently" less secure than calling an "internal function". And I've no idea what you mean by "global environment".
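The difference is in how the programmer handles untrusted input, not in where the code lives. A small sketch of my own to make the point (safe_grep is a hypothetical helper): the same external command, called unsafely or safely:

```shell
#!/bin/sh
# Hostile-looking input: starts with a dash and embeds a command
# substitution, as an attacker might supply.
needle='-rf $(date)'

# Unsafe pattern (shown as a comment only): interpolating input into
# a string that gets re-parsed by a shell would run the $(date).
#   sh -c "grep $needle file"

# Safe pattern: the input travels as one quoted argument.
# -F matches it literally; -- stops option parsing, so a leading
# dash cannot be mistaken for a grep flag.
safe_grep() {
    grep -F -- "$1" "$2"
}
```

A Perl program that interpolates the same input into a string eval or a one-argument system() has exactly the same problem - the language isn't what decides.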
As a programming language, shell is primitive. Shell does not have namespaces, modules, objects, inheritance, exception handling, complex data structures and other language facilities and tools (e.g. Perl::Critic, Perl::Tidy, Devel::Cover, Pod::Coverage, ...) needed to support programming in the large. Up to version 5, Perl didn't have namespaces, modules, objects, inheritance or exception handling. And C still doesn't have them. And depending on what you mean by 'complex data structures', C doesn't have them either. As for Perl::Critic, Perl::Tidy, Devel::Cover and Pod::Coverage, I disagree that they are needed - in fact, I wouldn't want to use Perl if they were needed to program in Perl. And in the history of Perl, they are the new kids on the block. People preferred perl over the shell long before those tools were first created.
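And "primitive" is not "absent": like pre-5 Perl or C, shell has rough stand-ins for several of those facilities. A sketch of my own (function names are illustrative): functions for modularity, and exit status as a crude form of exception handling:

```shell
#!/bin/sh
# Functions give modularity; a nonzero exit status is the shell's
# crude "throw", and || is its "catch".
die() { echo "error: $*" >&2; exit 1; }

divide() {
    # "Throw" on division by zero instead of crashing.
    [ "$2" -ne 0 ] || return 1
    echo $(( $1 / $2 ))
}

result=$(divide 10 2) || die "division failed"
```

Crude compared to Perl 5's die/eval, certainly - but then so was Perl 4, and people happily programmed in it.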
If you're sure the script will "never" need to be ported to Windows. Been there, done that. Shells (and shell-like tools) have been ported to Windows, just as Perl has. And remember the time when the perl port on Windows was different from the one on Unix? At that time, that wasn't true for some of the shells. "Oneperl", the perl that merged the Unix and Windows ports, only dates from 1998; macperl merged even later than that (and only because MacOS went Unix).