Re: Optimize file renaming.

by trammell (Priest)
on May 26, 2005 at 13:49 UTC


in reply to Optimize file renaming.

Nobody has mentioned the rename(1) script (installed into /usr/bin/rename on my Debian machine, part of the perl package) written in Perl. IIRC it's been around since Perl 4 days....
% touch 1.png 11.png 1111.png
% rename 's/(\d+)/sprintf("%03d",$1)/e' *.png
% ls *.png
001.png 011.png 1111.png
%

Replies are listed 'Best First'.
Re^2: Optimize file renaming.
by Fletch (Bishop) on May 26, 2005 at 14:08 UTC

    And if you don't have it and don't feel like grabbing the entire source tarball:

    #!/usr/bin/perl -w
    # rename - Larry's filename fixer
    $op = shift or die "Usage: rename expr [files]\n";
    chomp(@ARGV = <STDIN>) unless @ARGV;
    for (@ARGV) {
        $was = $_;
        eval $op;
        die $@ if $@;
        rename($was,$_) unless $was eq $_;
    }

    Always handy. Don't leave $HOME without it.
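    For illustration, a couple of hypothetical invocations (assuming the script above has been saved as ./rename and made executable). File names can be passed as arguments or, because of the chomp(@ARGV = <STDIN>) line, piped in on standard input:

    % ./rename 'y/A-Z/a-z/' *.JPG
    % find . -name '*.jpeg' | ./rename 's/\.jpeg$/.jpg/'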

Re^2: Optimize file renaming.
by blazar (Canon) on May 27, 2005 at 08:54 UTC
    On the system I'm currently on:
    $ file $(which rename)
    /usr/bin/rename: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), for GNU/Linux 2.0.0, dynamically linked (uses shared libs), stripped
    However, from the description you give of perl's rename, it seems it lets you run arbitrary code. I hope it uses Safe, although I've heard some experienced and knowledgeable perl hackers say that even that isn't really bulletproof...
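
    For illustration, a minimal sketch of the idea blazar raises: the same filename fixer, but compiling the user-supplied expression in a Safe compartment instead of a bare eval. This is hypothetical (the script name rename_safe is made up, and it is not what the stock rename script does), and as noted above Safe restricts opcodes rather than providing a complete sandbox:

    #!/usr/bin/perl -w
    # rename_safe - hypothetical variant of Larry's filename fixer that
    # runs the user-supplied expression inside a Safe compartment
    # (restricted opcode set) rather than a bare eval.
    use Safe;

    my $op = shift or die "Usage: rename_safe expr [files]\n";
    chomp(@ARGV = <STDIN>) unless @ARGV;

    my $cpt = Safe->new;        # default restricted opcode mask
    $cpt->share('*_');          # let the expression read and modify $_

    for my $file (@ARGV) {
        $_ = $file;             # expose the current name as $_
        $cpt->reval($op);       # evaluate the expression under the mask
        die $@ if $@;
        rename($file, $_) unless $file eq $_;
    }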
