PerlMonks  

Re^2: Janitors Thread Retitler v1

by tye (Sage)
on Dec 02, 2004 at 19:17 UTC ( [id://412006] )



in reply to Re: Janitors Thread Retitler v1
in thread Janitors Thread Retitler v1

I'd prompt for a 'from' and a 'to', and not try to parse "Re^2:" but just do s/\Q$from/$to/ on the titles. Perhaps slightly more robust would be

s/(?!(?<=\w)\w)\Q$from\E(?!(?<=\w)\w)/$to/

This is like \b\Q$from\E\b, except that it is more DWIM in cases where $from starts or ends with \W characters.
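A minimal sketch of how the two substitutions differ, using made-up titles (the retitle helper and the sample strings are illustrative, not from the actual script). The trick is that (?!(?<=\w)\w) only fails when the position sits between two word characters, so unlike \b it doesn't demand a word character next to a \W edge of $from:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper wrapping tye's "soft word boundary" substitution.
sub retitle {
    my ($title, $from, $to) = @_;
    $title =~ s/(?!(?<=\w)\w)\Q$from\E(?!(?<=\w)\w)/$to/;
    return $title;
}

# Behaves like \b\Q$from\E\b for an all-word $from:
print retitle("Re: foo bar", "foo", "baz"), "\n";         # Re: baz bar

# Here $from starts and ends with \W. \b\Q(tm)\E\b would require a
# word character immediately *before* the "(", so it fails to match
# " (tm) " -- the lookaround version still matches:
print retitle("Widget (tm) help", "(tm)", "[tm]"), "\n";  # Widget [tm] help

# Still refuses to match inside a longer word, just like \b:
print retitle("food fight", "foo", "baz"), "\n";          # food fight
```
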

And I'd prompt for what to add to the bottom of retitled nodes (with what castaway proposed as a default) since I'd only rarely want to include the full original title, especially that many times.

I didn't check, but I hope it allows me to retitle subtrees of a thread.

- tye        

Replies are listed 'Best First'.
Re^3: Janitors Thread Retitler v1
by ysth (Canon) on Dec 03, 2004 at 01:46 UTC
    "that many times"? I thought we only put the janitor-comment on the top node. Or perhaps I have no idea what you mean?

      Sorry, I missed "root". I still would rather not quote the entire previous title in many (most) cases.

      I was thinking it might be nice to have two default attributions to pick from, the second being something like "... by tye: $title =~ s/$old/$new/...". Actually, I'd like [id://22609] used instead of [tye], but I don't mind maintaining my own copy of the script for such quirks. :)

      - tye        
