Permission denied error from dirmove function of File::Copy::Recursive.

by Gulliver (Monk)
on Jul 22, 2011 at 18:43 UTC ( [id://916189] )

Gulliver has asked for the wisdom of the Perl Monks concerning the following question:

I have written a Perl application that downloads several directories to a temporary download folder before doing some operations and then moving subfolders from the download folder to various locations. The problem is that other people browse through the download folder during the ftp transfer and occasionally cause the dirmove function to fail.

The download folder gets created with a timestamp (12 digits in the actual application), so it is unique and easy to find if something goes wrong. I read about flock, but it doesn't seem to work on directories when I tried it. I'm thinking about setting the hidden bit (this is a Windows 2008 server) during the download. I already have a handler that is triggered by __DIE__, so I could unhide the folder if something happens.
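
This is roughly what I have in mind for hiding the folder (an untested sketch; it assumes the Win32::File module from the libwin32 bundle, and the path is just the example used further down):

use strict;
use warnings;
use Win32::File qw( GetAttributes SetAttributes HIDDEN );

# Toggle the HIDDEN attribute on the download folder.
sub set_hidden {
    my ($dir, $on) = @_;
    my $attr;
    GetAttributes($dir, $attr) or return 0;              # read the current attribute bits
    $attr = $on ? ($attr | HIDDEN) : ($attr & ~HIDDEN);
    return SetAttributes($dir, $attr);
}

my $download_dir = 'D:/Users/bas/ftp_20110722';          # example path from the script below

set_hidden($download_dir, 1) or warn "can't hide $download_dir\n";
# ... ftp transfer and processing ...
set_hidden($download_dir, 0);    # the __DIE__ handler would also call this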

The code below allowed me to recreate the problem. I found that if I opened the directory in Windows Explorer, there was no error. If I opened the file with Notepad at the target location, my script would move the file but then die without moving the directory. If I change to the directory in a command prompt before the dirmove, I get the "Permission denied" error and nothing is moved.

Has anyone else dealt with this? Any suggestions?

#!/usr/bin/perl
use warnings;
use strict;
use File::Copy::Recursive qw( dirmove fmove );

# 'D:/Users/bas/ftp_20110722/1/logs/ACREROCK TIE_SUB_log.txt';
my $frompath = 'D:/Users/bas/ftp_20110722';
my $topath   = 'D:/Users/bas/logs';
my $target   = '/1/logs';

print "\nabout to move $frompath$target\n";
print "press <enter>";
<>;

dirmove "$frompath$target", "$topath$target"
    or die "can't move $frompath$target to $topath$target :",$!;

Results in this:

d:\Users\bas>perl dirmove_test.pl

about to move D:/Users/bas/ftp_20110722/1/logs
press <enter>
can't move D:/Users/bas/ftp_20110722/1/logs to D:/Users/bas/logs/1/logs :Permission denied at dirmove_test.pl line 22, <> line 1.

Re: Permission denied error from dirmove function of File::Copy::Recursive.
by ikegami (Patriarch) on Jul 22, 2011 at 20:09 UTC

    There's nothing you can do to force the files to get deleted.

    Windows does have a means of requesting that a file gets deleted as soon as possible. This is done by calling CreateFile (which is kinda like Perl's open) with FILE_FLAG_DELETE_ON_CLOSE. I can help you implement this if you're interested in using it.
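
    A minimal sketch of what that could look like from Perl (untested; it assumes the Win32API::File module, and it will still fail if another process holds the file open without FILE_SHARE_DELETE):

    use strict;
    use warnings;
    use Win32API::File qw( :ALL );

    # Ask Windows to delete $path as soon as every open handle to it is closed.
    sub mark_for_delete_on_close {
        my ($path) = @_;
        my $DELETE = 0x00010000;    # standard Windows DELETE access right
        my $handle = CreateFile(
            $path,
            $DELETE,                                                   # delete rights only
            FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,    # don't block other readers
            [],                                                        # default security attributes
            OPEN_EXISTING,                                             # the file must already exist
            FILE_FLAG_DELETE_ON_CLOSE,                                 # delete when all handles close
            0,
        ) or return 0;
        CloseHandle($handle);       # deletion happens once the last handle goes away
        return 1;
    }

    # mark_for_delete_on_close($file) or warn "still locked: ", fileLastError(), "\n";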

      So instead of moving the directories, I could copy them and then delete each file one at a time. For any file that can't be deleted, I could use CreateFile to mark it to be deleted when it is closed.

      Your link led me to GetSecurityInfo, which can be used to get security information about a directory. If deleting a directory failed, I could get information about who has it open. It doesn't look like CreateFile will let me mark a directory for deletion, though. But I think I'm getting a little in over my head now.

      From Creating and Deleting Directories:

      To delete an existing directory, use the RemoveDirectory or RemoveDirectoryTransacted function. Before removing a directory, you must ensure that the directory is empty and that you have the delete access privilege for the directory. To do the latter, call the GetSecurityInfo function.

        It doesn't look like CreateFile will let me mark a directory for deletion.

        I didn't see anything either, but that means you'll be left with empty directories at worst. Is that really so bad? They can get removed the next time around.
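
        A rough sketch of that copy-then-delete-what-you-can approach (untested; dircopy is from File::Copy::Recursive, and mark_for_delete_on_close() is the hypothetical helper sketched above):

        use strict;
        use warnings;
        use File::Copy::Recursive qw( dircopy );
        use File::Find ();

        my $from = 'D:/Users/bas/ftp_20110722/1/logs';    # example paths from the question
        my $to   = 'D:/Users/bas/logs/1/logs';

        dircopy($from, $to) or die "can't copy $from to $to: $!";

        # Depth-first, so files are removed before their directories; anything
        # still locked is queued for delete-on-close, and its (now empty)
        # directory is simply left behind for the next run to sweep up.
        File::Find::finddepth({ wanted => \&cleanup, no_chdir => 1 }, $from);

        sub cleanup {
            my $path = $File::Find::name;
            if (-d $path) {
                rmdir $path or warn "left behind directory $path: $!\n";
            }
            else {
                unlink $path
                    or mark_for_delete_on_close($path)    # hypothetical helper from above
                    or warn "left behind file $path: $!\n";
            }
        }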

Re: Permission denied error from dirmove function of File::Copy::Recursive.
by roboticus (Chancellor) on Jul 22, 2011 at 19:17 UTC

    Gulliver:

    I'd suggest making a list of the operations that fail, so when you get to the end of your script, you can periodically retry the items on the list until they either succeed or your script decides it has tried enough times. Then report the remaining problems at the end for a human to fix.
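
    Something along these lines, perhaps (an untested sketch; the path and retry counts are made up for illustration):

    use strict;
    use warnings;
    use File::Copy::Recursive qw( dirmove );

    my @failed;    # [ description, code ref ] pairs that did not succeed the first time

    my $op = sub { dirmove('D:/Users/bas/ftp_20110722/1/logs', 'D:/Users/bas/logs/1/logs') };
    $op->() or push @failed, [ 'move 1/logs', $op ];
    # ... the rest of the script's operations, queued the same way ...

    # At the end of the run, retry a few times with a pause between rounds,
    # then report whatever still fails for a human to fix.
    for my $attempt (1 .. 5) {
        last unless @failed;
        sleep 30;
        @failed = grep { !$_->[1]->() } @failed;
    }
    warn "still failing after retries: $_->[0]\n" for @failed;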

    ...roboticus

    When your only tool is a hammer, all problems look like your thumb.

      Each step in this application builds on previous steps so your suggestion doesn't really apply. It would be like building a house without all the pieces and going back later to fix things. Plus there are up to 250,000 files in up to 5000 directories involved.

      What I have already done is make the script look for an existing download folder and process any files there.

      In the problem I described of people browsing the directory, if someone consistently does this after I deploy the script, then the script will consistently fail, which would be an embarrassment. If I can produce a more descriptive error message than "permission denied", then hopefully I can change this behavior or find out if there is something else to blame, as other responders have suggested.

Re: Permission denied error from dirmove function of File::Copy::Recursive.
by Marshall (Canon) on Jul 23, 2011 at 03:43 UTC
    I found the Unlocker freeware utility, which can delete "locked" files. There is a command-line interface to it. I do not know the details of how it works.

      This is very interesting. I don't want to install freeware on a corporate server, but if this VB or C# app can do it, then maybe I could do the same. If I could find something similar that is open source, I could see how it is done.

        I did some more snooping around and found unlocking-files-that-are-in-use. The article actually mentions Unlocker and shows how to unlock a file with Microsoft's Sysinternals Process Explorer tool. I don't know if there is a command-line interface to this or not, but it certainly looks like something worth more investigation. I presume that your company would not have a problem with freeware from Microsoft's TechNet.

        I still was not able to find the underlying mechanism these tools use to perform the unlocking, but evidently enough people have figured it out that, with more effort, it could be discovered. I hope the above tool can be controlled from the command line; if so, I think you are in good shape.

Re: Permission denied error from dirmove function of File::Copy::Recursive.
by flexvault (Monsignor) on Jul 23, 2011 at 12:46 UTC

    I don't usually comment on Windows-specific problems, but this problem could exist in any computing environment.

    I suspect that you have some daemon process accessing the files for indexing purposes; if you could find out how that daemon locks the files, you could duplicate its approach. Such access would normally happen very fast, so I doubt a human user is your problem.

    Some possible solutions that might work:

    • If your operation is not 24x7, then run this script off-shift in single-user mode.
    • Can you change the owner of the directory/files so they can be accessed only by your script?
    • Can you mount a separate drive and copy the directory there? Obviously this drive needs to be restricted from your users.

    This is a common problem in multi-user, multi-tasking operating systems, so there has to be a way to fix it. Perl is not the problem, but Perl has tools to access your system and invoke an operating-system-specific solution.

    Good Luck

    "Well done is better than well said." - Benjamin Franklin
