I maintain a badly designed website and often have to make a single change to over 400 static webpages. Perl to the rescue! I cobbled this together, so it doesn't have a lot of fancy error-checking, but it does open every file in every subdirectory (iteratively, without recursion) and replaces every instance of a search string with a replace string, both of which it prompts for on the tty. Before saving the new file, it backs up the original (by adding .byProc to the end of the filename). Not the best, but it certainly works. You call it as ./progName /directory/path; adding the -clean switch will instead search the directories for the .byProc backups and unlink them. No safety net!
    #!/usr/bin/perl -w

    my @dirList;
    my @fileList;
    my $totalEdits   = 0;
    my $changedFiles = 0;
    my ($cleanUp, $unlinkedFiles, $canGo);

    my $procDir = $ARGV[0];
    if (!$procDir) {
        print "Give directory: ";
        chomp($procDir = <STDIN>);
    }
    if ($ARGV[1] && $ARGV[1] eq "-clean") {
        $cleanUp       = 1;
        $unlinkedFiles = 0;
    }

    push @dirList, $procDir;
    print "Processing.";

    # Walk the tree iteratively (a queue of directories, no recursion).
    while (my $thisDir = shift @dirList) {
        opendir(CURRDIR, $thisDir) or next;
        my @entries = readdir(CURRDIR);
        closedir(CURRDIR);
        print ".";

        foreach my $entry (@entries) {
            next if substr($entry, 0, 1) eq ".";    # skip '.', '..', dotfiles
            my $completeHandle = "$thisDir/$entry";
            if (-d $completeHandle) {
                push @dirList, $completeHandle;
            }
            elsif (-T $completeHandle) {
                push @fileList, $completeHandle;
                if ($cleanUp && index($completeHandle, ".byProc") > 0) {
                    unlink $completeHandle
                        or print "Error unlinking $completeHandle: $!\n";
                    $unlinkedFiles++;
                }
            }
        }
    }
    print "\n";

    if ($cleanUp) {
        printf "\n\n%s files deleted.\n\n", $unlinkedFiles || "No";
    }
    else {
        printf "\n\n%d files found.\n\n", scalar @fileList;

        until ($canGo) {
            print "Process all files found? <y|n>\n";
            chomp(my $goOn = <STDIN>);
            if ($goOn eq "y") {
                $canGo = 1;
            }
            elsif ($goOn eq "n") {
                print "Exiting...\n";
                exit;
            }
            else {
                print "Must be either yes [y] or no [n]\n";
            }
        }

        print "Please enter search criteria (can be RegExp):\n";
        chomp(my $toFind = <STDIN>);
        print "Please enter what you would like to replace it with:\n";
        chomp(my $toReplace = <STDIN>);

        foreach my $file (@fileList) {
            open(FH, $file) or next;
            my $fileContents = do { local $/; <FH> };    # slurp whole file
            close(FH);

            my $edits = () = $fileContents =~ /$toFind/g;
            next unless $edits;

            # Back up the original before touching it.
            my $oldFile = $file . ".byProc";
            open(FH, ">$oldFile") or die "Cannot write $oldFile: $!";
            print FH $fileContents;
            close(FH);

            $fileContents =~ s/$toFind/$toReplace/g;
            open(FH, ">$file") or die "Cannot write $file: $!";
            print FH $fileContents;
            close(FH);

            $totalEdits += $edits;
            $changedFiles++;
            print "Replaced \"$toFind\" with \"$toReplace\" $edits times in $file\n";
        }
        print "$totalEdits replacements made in $changedFiles files.\n";
    }
Feedback appreciated!

Fates! We will know your pleasures: That we shall die, we know; 'Tis but the time, and drawing days out, that men stand upon. - Act III,I, Julius Caesar

Replies are listed 'Best First'.
Re: Search and Replace Entire Directories
by kwoff (Friar) on Nov 16, 2001 at 22:50 UTC
    I'm glad to see these kinds of solutions, because I also maintain a big, badly-designed site, with some HTML designers who usually don't design their pages with maintenance in mind. Like, they should use server-side includes more, but it requires extra effort to create a separate file and #include that. Then if they want to change something, they'll open up the page in their browser, edit the resulting HTML, and overwrite the nicely done page with a hard-coded copy. By the time I know about it, that page has been copied like a template to dozens of other (unknown) files. ARGH..

    Anyway, I have similar problems, for which I use the following (it's not quite the same as yours, and I usually end up tweaking it). I just offer it as another solution. Sorry it's not completely in perl. :)

    #!/bin/sh
    # query-replace files recursively, substituting in place
    STRING1=$1
    STRING2=$2
    find . -type f | xargs -n 255 perl -p000i.bak -e "s/${STRING1}/${STRING2}/g"

    And a little explanation:

    `find` recursively descends the file system, and its "-type f" argument means it only looks at regular files (sometimes I insert "-name '*.html'" as well). The output of `find` is a list of filenames, which are piped to `xargs`. I use `xargs` for efficiency, so that you're not starting and stopping the `perl` interpreter hundreds of times. "-n 255" shovels only 255 file names at a time, though, so that the shell's maximum command-line length isn't exceeded. Finally, those files are passed as arguments (by `xargs`) to `perl`. The -p switch prints every line, but in this case each "line" is a paragraph, because the -0 switch with its '00' special-case value sets the input record separator to paragraph mode (I did that so you could match regexps over multiple lines). The -i switch edits "in place", with each file backed up to a file with '.bak' appended to the name.

    Example usage:

    $ '\<(\/)?[hH]3\>' '\<${1}H1\>'

    would take <H3> or </h3> and replace with <H1> and </H1>, respectively. It's a bit ugly, I admit. I end up using emacs usually.
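    To make the paragraph-mode point above concrete, here is a minimal sketch with a made-up file and pattern: a heading split across two lines can only be matched because -00 hands perl a whole blank-line-delimited paragraph at a time.

    ```shell
    # A heading broken across two lines within one paragraph (illustrative).
    printf '<h3>Old\nheading</h3>\n\nnext paragraph\n' > page.html

    # With -00 each record is a whole paragraph, so \s+ can match the
    # embedded newline; -i.bak keeps a backup of the original.
    perl -00 -pi.bak -e 's/<h3>Old\s+heading<\/h3>/<h1>New heading<\/h1>/' page.html
    ```

    Without -00, -p would read line by line and the pattern could never span the line break.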

Re: Search and Replace Entire Directories
by ask (Pilgrim) on Nov 19, 2001 at 16:43 UTC
    you should try something like
    find ./ -type f | xargs perl -i.bak -pe 's!replace!with this!g';
    next time. =)

     - ask

    ask bjoern hansen,   !try; do();