
Removing Directory and Contents with Perl

by htmanning (Monk)
on Mar 01, 2007 at 22:09 UTC ( #602793=perlquestion )
htmanning has asked for the wisdom of the Perl Monks concerning the following question:

I'm trying to have my script remove a directory and its contents (if any), but I've tried these and they don't work:

    $deleteregdir = "$rootpath/$username";
    rmdir($deleteregdir);

I've also tried this...

    $deleteregdir = "$rootpath/$username";
    rmdir -r ($deleteregdir);

and I've tried this...

    $deleteregdir = "$rootpath/$username";
    system("rmdir -r $deleteregdir");

All three leave the directory there if it is not empty. If it's empty, it gets deleted.


Replies are listed 'Best First'.
Re: Removing Directory and Contents with Perl
by bobf (Monsignor) on Mar 01, 2007 at 22:16 UTC

    If you want to delete a directory tree (including the contents), use the rmtree function in File::Path.

    In addition, you may be interested in File::Spec, which can be used to portably perform operations on path names.

    Finally, if you want to stick with rmdir, you should also look at unlink.
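    To illustrate the rmdir/unlink route, here is a minimal sketch (the scratch directory and file name are invented for the example): it unlinks the plain files in a flat directory and then removes the now-empty directory itself.

```perl
use strict;
use warnings;
use File::Spec;
use File::Temp qw(tempdir);

# Set up a scratch directory with one file so the example is self-contained.
my $dir = tempdir();
open my $fh, '>', File::Spec->catfile($dir, 'a.txt') or die "open: $!";
close $fh;

# rmdir() only succeeds on an empty directory, so unlink the plain files first.
opendir(my $dh, $dir) or die "Cannot open $dir: $!";
for my $entry (readdir $dh) {
    next if $entry eq '.' or $entry eq '..';
    my $path = File::Spec->catfile($dir, $entry);
    next unless -f $path;
    unlink($path) or warn "Could not unlink $path: $!";
}
closedir $dh;

rmdir($dir) or die "Could not rmdir $dir: $!";
```

    Note this only handles one level; for nested subdirectories you are back to File::Path.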

Re: Removing Directory and Contents with Perl
by ptum (Priest) on Mar 01, 2007 at 22:31 UTC

    Perhaps what you want, for at least some unix variants, is something like this:

    #!/usr/local/bin/perl -w
    use strict;

    my $deletedir = '/tmp/test1';
    my $result = system("rm -rf $deletedir");
    print "Result: $result\n";

      I would put some strict testing around an "rm -rf" command, especially if you're running this script as root/Admin. We recently lost several TB of data because a script didn't set the directory variables correctly and ran "rm -rf /*" (instead of "rm -rf $path_to_home/*") as root.
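      Guarding against exactly that kind of accident might look like the sketch below. The specific guard conditions are examples of assumptions worth checking, not a complete safety net, and the scratch directory is created just so the example runs end to end.

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Scratch directory standing in for the real target.
my $deletedir = tempdir();

# Refuse obviously dangerous values before shelling out.
die "refusing: empty path\n"      unless defined $deletedir && length $deletedir;
die "refusing: relative path\n"   unless $deletedir =~ m{^/};
die "refusing: filesystem root\n" if $deletedir =~ m{^/+\z};
die "refusing: not a directory\n" unless -d $deletedir && !-l $deletedir;

# The list form of system() bypasses the shell, so spaces or shell
# metacharacters in the path cannot be reinterpreted; '--' ends option parsing.
system('/bin/rm', '-rf', '--', $deletedir) == 0
    or die "rm -rf $deletedir failed: $?";
```

      The list form of system() is the important part: with the single-string form, an unset variable silently turns the command into "rm -rf " plus whatever follows.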

Re: Removing Directory and Contents with Perl
by grinder (Bishop) on Mar 02, 2007 at 08:38 UTC

    If you want to delete an entire tree of directories in the file system, you want to use rmtree from the File::Path module (it's part of the standard distribution, you already have it on your system).

    It takes a reference to a list of directory starting points, so it looks like this:

    use File::Path 'rmtree';
    rmtree([ "$rootpath/$username" ]);

    You don't actually have to specify the 'rmtree' function in the use statement, as it gets exported (made available) by default, but I like to do so anyway, so that subsequent maintainers can see quickly where a function came from. This is especially important when you have half a dozen modules bringing in bits and pieces.

    If you had an array of user directories you wanted to remove, you could fill the list via a map:

    rmtree([ map {"$rootpath/$_"} @user ]);

    The reason for this slightly odd interface is that there are two little-known, rarely used positional control parameters you can tack on the end:

    rmtree([ map {"$rootpath/$_"} @user ], 1);

    The above call will behave "verbosely", printing out what it's doing as it goes about its job.

    There's a second parameter that you can also set to a true (non-zero, 1 by convention) value, that will instruct rmtree to skip the attempt at deleting something for which the process lacks the necessary delete or write privileges (on platforms like VMS where that makes sense).
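    For reference, the two positional flags might be exercised like this (scratch tree invented for the example):

```perl
use strict;
use warnings;
use File::Path qw(rmtree);
use File::Spec;
use File::Temp qw(tempdir);

# Build a small tree so there is something to remove.
my $root = tempdir();
mkdir File::Spec->catdir($root, 'sub') or die "mkdir: $!";
open my $fh, '>', File::Spec->catfile($root, 'sub', 'f.txt') or die "open: $!";
close $fh;

# Second argument (1): verbose, print each deletion as it happens.
# Third argument (1): skip entries the process lacks permission to delete.
# rmtree returns the number of items successfully deleted.
my $count = rmtree([$root], 1, 1);
print "deleted $count items\n";
```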

    I've never really understood the purpose of this, since whether you ask the function to skip the files, or let it attempt to do so anyway, and fail, the result is the same: there will be some files that will not be deleted. But it usually works out as it should. If the rmtree fails, an equivalent rm -rf would as well.

    The uselessness of the second parameter notwithstanding, it should be pointed out that this sort of interface is positively archaic, and not to be emulated if you're just starting out. The reason is that if you want to change the default skipping behaviour by setting the second control parameter, you are forced into a decision about what to set the first control parameter to: 0 or 1. What does it do again? Which one sets the print behaviour, the first or the second?

    There are better interfaces available: either using objects with chained methods to change the behaviour, or just simply a series of public routines like rmtree, rmtree_print and so on.
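    As an aside, later versions of File::Path (2.x, core since perl 5.10, i.e. after this thread) grew remove_tree, which takes named options and sidesteps the which-flag-was-which problem entirely; a sketch:

```perl
use strict;
use warnings;
use File::Path qw(remove_tree);
use File::Temp qw(tempdir);

# Scratch tree to remove.
my $root = tempdir();
mkdir "$root/sub" or die "mkdir: $!";

# Named options instead of positional flags: 'safe' skips undeletable
# entries, 'error' collects failures into an array ref for inspection.
remove_tree($root, { verbose => 0, safe => 1, error => \my $err });

if ($err && @$err) {
    warn "could not remove some entries\n";
}
```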

    • another intruder with the mooring in the heart of the Perl

Re: Removing Directory and Contents with Perl
by Moron (Curate) on Mar 02, 2007 at 14:54 UTC
    Also, I think one of the most important lessons to learn from this is to check the return value of the builtin and library functions you call, and to inspect $! on failure, e.g.:

        rmdir $deleteregdir or die "$!: for directory $deleteregdir\n";

    ...would at least have told you in fairly plain English why that use of rmdir didn't work.
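    To see that advice in action, the following sketch deliberately provokes the failure the original poster hit, rmdir on a non-empty directory, and lets $! explain it:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Scratch directory that still contains a file, so rmdir is expected to fail.
my $dir = tempdir();
open my $fh, '>', "$dir/keep.txt" or die "open: $!";
close $fh;

if (!rmdir $dir) {
    # On Linux, $! typically reads "Directory not empty" here,
    # which is exactly the diagnosis that was missing.
    warn "$!: for directory $dir\n";
}
```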


    Free your mind

      #!/usr/bin/perl
      deldir("test"); # or deldir($ARGV[0]) to make it commandline

      sub deldir {
          my $dirtodel = pop;
          my $sep = '\\'; # change this line to "/" on linux.
          opendir(DIR, $dirtodel);
          my @files = readdir(DIR);
          closedir(DIR);
          @files = grep { !/^\.{1,2}/ } @files;
          @files = map { $_ = "$dirtodel$sep$_" } @files;
          @files = map { (-d $_) ? deldir($_) : unlink($_) } @files;
          rmdir($dirtodel);
      }
      This is the code I wrote for my friend to delete directories. Simple: no modules, just plain pure PERL.

        --, because that code just hurts my eyes, even more than "PERL". (The name of the language is "Perl", the name of the executable for running scripts is "perl", but there is no "PERL".)

        Problems with the code, from first to last line:

        • neither "use warnings;" nor a -w parameter in the shebang line.
        • no "use strict;"
        • Hardcoded name of directory tree to be removed in call to deldir().
        • Misleading function name "deldir". This function attempts to delete an entire directory tree, not just a directory. "deltree" would be a better name.
        • my $dirtodel=pop; instead of my $dirtodel=shift; or my ($dirtodel)=@_;, which accesses the last argument instead of the first one, without any explanation and without any need to do so.
        • Non-portable backslash as path separator. DOS and Windows have no problems with a forward slash.
        • Misleading comment after that assignment makes a naive reader think that Linux needs a special treatment (the forward slash), whereas all other operating systems work with a backslash. In fact, the forward slash works with most operating systems including DOS, Windows, and MacOS X. Only some exotic or old systems (VMS, classic MacOS) need a different character. Delegating that problem to File::Spec (a core module delivered with perl) would completely eliminate the need to modify this code for different operating systems.
        • opendir without error checks. The code does not use autodie, so errors are not automatically handled.
        • Directory handle DIR is not localised.
        • Use of a bareword handle (DIR) instead of the modern form opendir(my $handle,$name)
        • No error check for closedir. (I regularly omit that check too, simply because closedir does not return any useful errors. You could pass a wrong handle to it, but that's the only error condition for closedir that I know.)
        • grep filters out all directory content that starts with one or two dots, followed by any amount of arbitrary characters. So, grep also filters out all hidden directory entries on Unix-like systems. The regular expression lacks the final $.
        • grep should be moved into the readdir line, so that @files does not have to be re-assigned using a temporary list. (I think modern perl versions should be able to optimize the temporary list away, but I would not bet on it.)
        • The first map modifies each entry in the @files array returned by grep by uselessly assigning to the aliased $_, and copies the modified entry into a temporary list, which finally overwrites @files. (Again, modern perl versions may be able to optimize a little bit, but again I would not bet on it.)
        • The first map should be moved into the grep { ... } readdir line, again avoiding useless re-assignments to @files.
        • The second map again overwrites the @files array, without any need to do so. @files is never used again after this map. So, in fact, this map would better be written as an ordinary foreach loop iterating over @files.
        • The ternary ?: operator fills the @files array with the return values of unlink and deldir, but no piece of the code checks the return values. Instead of ?:, better use if (...) { ... } else { ... } here.
        • unlink without error checks, and no implicit error check by use autodie.
        • unlink called for each single file in a large list of files. unlink accepts a list of files, and after filtering out directories, unlink could delete all files at once. Of course, calling unlink with more than one file makes it very hard (or impossible) to give an exact error message with the name of the files that could not be deleted.
        • The map / ?: line suggests that deldir has a return value with the same semantics as unlink. The very first call to deldir suggests that it is a void function returning nothing useful. So, what does deldir return?
        • The very last line in deldir implicitly returns whatever rmdir returns. I have the impression that this is accidental, because there is no explicit return and the very first call to deldir is a void call. Also, the second map is effectively a void call. On the other hand, the usual error check (rmdir ... or die ...) is also missing, autodie is not used, and the return value of rmdir has the same semantics as unlink. So, either insert an explicit return and document that the caller has to check for errors, or add an error check.
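        The bulk-unlink point above can be sketched like this: unlink returns the number of files actually removed, so comparing counts catches partial failures (scratch files invented for the example).

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Populate a scratch directory with three files.
my $dir = tempdir();
for my $n (1 .. 3) {
    open my $fh, '>', "$dir/f$n" or die "open: $!";
    close $fh;
}

# Collect the plain files, skipping only '.' and '..'.
opendir my $dh, $dir or die "opendir $dir: $!";
my @files = grep { -f $_ } map { "$dir/$_" } grep { !/^\.\.?$/ } readdir $dh;
closedir $dh;

# One unlink call for the whole list; the return value is the count removed.
my $deleted = unlink @files;
warn "only deleted $deleted of ", scalar @files, " files\n"
    if $deleted != @files;
```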

        You might notice that the code contains exactly one non-empty line that I did not criticize. It's the final line "}".

        Problematic recursion: In the worst case (deep tree where readdir returns subdirectories before files), this keeps a list of all files in the entire directory tree in memory.

        Wrong tool:

        • Most, if not all, Unix-like systems have an external rm command that can recursively delete (parameter -r) an entire directory tree without asking questions (parameter -f). It is optimized for this job, running at least as fast and as efficiently as any perl code you could write. So, on Unix-like systems, system('/bin/rm','-rf',$directoryName) and die "rm -rf $directoryName failed" is often the best solution.
        • On DOS / Windows, you can get the same effect either with the old deltree utility or the (enhanced) del command with the /s /f /q parameters.
        • For cross-platform code, File::Path::rmtree() is the portable solution. File::Path is a core module, so it is available whereever you run perl.

        Using no modules here is not only stupid, but dangerous. Your code is broken in every single line except for the final bracket, uses far too many resources, and does not do what it promises to do. If your friend really runs this code, he would do better to get someone else to write code for him until you have learned to write safe and clean code.
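        For completeness, here is a corrected version along the lines of the critique above: lexical handles, error checks, portable path joins, a symlink guard, and a grep that skips only '.' and '..'. It is a sketch; File::Path::rmtree remains the better choice in real code.

```perl
use strict;
use warnings;
use File::Spec;
use File::Temp qw(tempdir);

# Recursively delete an entire directory tree, dying on the first failure.
sub deltree {
    my ($dir) = @_;
    opendir my $dh, $dir or die "opendir $dir: $!";
    my @entries = grep { $_ ne '.' && $_ ne '..' } readdir $dh;
    closedir $dh;

    for my $entry (@entries) {
        my $path = File::Spec->catfile($dir, $entry);
        if (-d $path && !-l $path) {
            # Real subdirectory: recurse. Symlinks are unlinked, not followed.
            deltree($path);
        }
        else {
            unlink $path or die "unlink $path: $!";
        }
    }
    rmdir $dir or die "rmdir $dir: $!";
    return;
}

# Self-contained demonstration on a scratch tree.
my $root = tempdir();
mkdir File::Spec->catdir($root, 'sub') or die "mkdir: $!";
open my $fh, '>', File::Spec->catfile($root, 'sub', 'f') or die "open: $!";
close $fh;
deltree($root);
```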


        Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)

Node Type: perlquestion [id://602793]
Approved by bobf