|We don't bite newbies here... much|
OK, then the first step is to learn about the different types of "DOS" commands. DOS and Windows have a long and ugly history, and you need to learn a little bit of that history if you want to do more than just push the mouse around the desktop. My ancient DOS manuals distinguished between three types of DOS commands: internal commands built into command.com, external commands stored as COM files, and external commands stored as EXE files.
One day, the DOS developers decided to look at the file data instead of the file extension when running external commands, so it was possible to rename EXE files to *.COM and vice versa. This allowed them to compile and link commands as EXE files while still using a COM file extension for backwards compatibility.
So, you really need to distinguish between internal and external commands.
With Windows 2000, XP, and newer, things become a little more complicated: these systems inherited a very similar, but still different, command shell named cmd.exe from the NT line, which behaves differently and has different built-in commands. Of course, each Windows version has its own version of cmd.exe, all with different features. cmd.exe is the default shell on 2000 and newer, but command.com is still there.
dir, cls, echo and a few others are internal commands, i.e. you have to invoke them using command.com or cmd.exe. Typically, you prefix the desired command with command.com /c or cmd.exe /c, depending on the OS version and on which implementation of the command you want to execute.
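To make that concrete, here is a minimal sketch. It assumes cmd.exe is on the PATH on Windows; on Unix it substitutes an ordinary external command (ls) so the demo still runs — that substitution is mine, not part of any Windows behavior.

```perl
use strict;
use warnings;

# Sketch: "dir" only exists inside cmd.exe, so it must be started
# through the shell with /c; external programs can be run directly.
# The Unix branch is just a stand-in so this demo runs anywhere.
my @cmd = $^O eq 'MSWin32'
    ? ('cmd.exe', '/c', 'dir')   # internal command: needs cmd.exe /c
    : ('ls', '-1');              # Unix stand-in for this demo
my $status = system(@cmd);
my $ok = ($status == 0);
print $ok ? "command succeeded\n" : "command failed: $?\n";
```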
The majority of commands are external commands; most of them are EXE files, a few are renamed to *.COM for backwards compatibility. These commands can be executed directly, just like you would execute any other application (because that's what they are).
The next DOS / Windows history lesson is the command line. Unix shells expand, interpret and split the command line and pass an array of parameters to the invoked program. If you type foo *.txt *.asc into a Unix shell, the foo program is typically invoked with a list of all txt and asc files in the current directory. DOS and Windows shells do not expand or split the command line; instead, they pass it as a single string to the invoked program. If you type foo *.txt *.asc into a DOS or Windows shell, the foo program gets *.txt *.asc as a single command line string, and has to expand and split it in its very own way.
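In a Perl program, you can do that expansion yourself with the built-in glob(). A small self-contained sketch (the temp directory and file names are just for the demo):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Sketch: a DOS/Windows shell passes "*.txt" to the program verbatim,
# so the program must expand wildcards itself, e.g. with glob().
my $dir = tempdir(CLEANUP => 1);
for my $name ('a.txt', 'b.txt', 'c.asc') {
    open my $fh, '>', "$dir/$name" or die "open: $!";
    close $fh;
}

# This is what a Unix shell would have done before the program started:
my @txt = sort map { glob($_) } ("$dir/*.txt");
print scalar(@txt), " .txt files matched\n";
```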
Most DOS and Windows programs delegate that to the C runtime library (or equivalent), and so most DOS and Windows programs expand the command line in a very similar way. But there is no guarantee for that.
Under DOS, the command line was limited to 126 characters, because the PSP (Program Segment Prefix) reserves only 128 bytes for the command tail, and sometimes you still run into this limit.
The DOS / Windows shells still process the command line, e.g. they replace environment variables or remove redirections from the command line, so the command line is not passed completely unmodified to the invoked program. The rules to prevent this from happening (i.e. quoting) differ from shell to shell, and from OS version to OS version.
So invoking the DOS shell is really asking for trouble. (Not that Unix shells are paradise; their quoting rules are a little simpler, but still differ from shell to shell.) Obviously, avoiding the shell as much as possible prevents problems. For exec and system, this means using the multi-argument form. When you call exec or system with a single argument, perl may pass that argument to some shell, leaving you with a lot of problems.
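Here is the multi-argument form in action. It re-invokes perl itself ($^X) as the child so the demo is self-contained; note that 'a b' and 'c*' arrive in the child as exactly two literal arguments — no shell, so no word splitting and no wildcard expansion:

```perl
use strict;
use warnings;

# Sketch: the multi-argument form of system() hands the arguments to
# the child without involving any shell. 'a b' and 'c*' arrive
# literally, as two arguments.
my $status = system($^X, '-e', 'print "got ", scalar(@ARGV), " args"',
                    'a b', 'c*');
print "\n";
my $clean_exit = ($status == 0);
```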
On Windows, things again become problematic here, as there is no way to pass an array of parameters to the invoked program. Perl has to generate a single command line string from your arguments, and you have to hope that perl's algorithm for constructing the string is compatible with the invoked program's algorithm for splitting it back into an array.
I/O redirection is once again problematic on Windows. On Unix, you fork, modify I/O and environment as needed in the forked copy of the current process, and then execute the real child, which inherits I/O redirection, environment and so on. On Windows, you have to do all this when creating a new process using the CreateProcess API function, and for more tricks, you need the CreateProcessEx API function. (I wonder when Microsoft will reach CreateProcessExExExEx ;-)) Don't worry, perl.exe will handle most of this for you. But it is not perfect.
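Since the thread is about IPC::Open3: that module is one of the places where perl does this plumbing for you, via fork/exec on Unix and CreateProcess on Windows. A minimal sketch, again using $^X as the child so it runs anywhere:

```perl
use strict;
use warnings;
use IPC::Open3;
use Symbol qw(gensym);

# Sketch: open3 wires up stdin/stdout/stderr of the child for you,
# hiding the fork/exec vs. CreateProcess difference.
my $err = gensym;    # open3 needs a real glob for the error handle
my $pid = open3(my $in, my $out, $err,
                $^X, '-e', 'print "child says hi\n"');
close $in;
my $line = <$out>;
waitpid($pid, 0);
print $line;
```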
So, your best bet on Windows really is NOT to use external programs at all, if you can avoid them.
You do not need dir, as I've explained before. copy and xcopy can be replaced by File::Copy and File::Copy::Recursive. For ftp, you have a load of CPAN modules, like Net::FTP, LWP::Simple, LWP::UserAgent. cd won't work at all, but you can use chdir and Cwd.
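For example, copy and cd reduce to a module call and a built-in. A self-contained sketch (temp directory and file names are just for the demo):

```perl
use strict;
use warnings;
use File::Copy qw(copy);
use File::Temp qw(tempdir);
use Cwd qw(getcwd);

# Sketch: pure-Perl replacements for the shell's "copy" and "cd".
my $dir = tempdir(CLEANUP => 1);
open my $fh, '>', "$dir/src.txt" or die "open: $!";
print $fh "hello\n";
close $fh;

copy("$dir/src.txt", "$dir/dst.txt") or die "copy failed: $!";

my $origin = getcwd();
chdir $dir or die "chdir failed: $!";     # the "cd" replacement
my $copied = -e 'dst.txt';
chdir $origin or die "chdir back failed: $!";
print $copied ? "copied\n" : "missing\n";
```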
For many other commands, you will find superior perl replacements, either built-in (e.g., the DOS find command is a crippled grep) or on CPAN. Only for very low-level access (fdisk, format, network config) do you need either an external command or access to the Windows API. Often, Win32::API can be helpful, sometimes DBD::WMI works, and if you are lucky, someone has already written a module and published it on CPAN, typically named Win32::Something.
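To illustrate the find-vs-grep point: what DOS find does to a file's lines is essentially perl's built-in grep over a list (the sample lines here are made up for the demo):

```perl
use strict;
use warnings;

# Sketch: DOS `find "needle" somefile` is roughly the built-in grep
# applied to the file's lines.
my @lines = ("no match here\n", "one needle\n", "still nothing\n");
my @hits  = grep { /needle/ } @lines;
print scalar(@hits), " matching line(s)\n";
```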
Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
In reply to Re^3: why IPC::Open3 can't execute MS-DOS "dir" command?