Re^2: How To Remove Warnings From Regex

by roho (Bishop)
on Feb 04, 2024 at 04:44 UTC ( [id://11157516] )


in reply to Re: How To Remove Warnings From Regex
in thread How To Remove Warnings From Regex

Thank you Athanasius! That works perfectly. For as long as I can remember, I've avoided the backslash numbers in favor of the dollar-sign numbers, but I see that in this case the backslash form is exactly what is called for. Thanks again.

"It's not how hard you work, it's how much you get done."


Replies are listed 'Best First'.
Re^3: How To Remove Warnings From Regex
by choroba (Cardinal) on Feb 04, 2024 at 18:19 UTC
    \1 works inside the regex where the capture group was defined. $1 works everywhere else, e.g. in the replacement part of a substitution. $1 inside a regex interpolates the capture from a previous match, or undef if there was none (which explains the uninitialized-value warnings).

    map{substr$_->[0],$_->[1]||0,1}[\*||{},3],[[]],[ref qr-1,-,-1],[{}],[sub{}^*ARGV,3]
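    A minimal sketch of the distinction (the sample string and pattern below are illustrative, not from the thread):

    use strict;
    use warnings;

    my $text = "hello hello world";

    # \1 is a backreference inside the pattern: it matches whatever
    # capture group 1 matched earlier in the same match attempt.
    if ( $text =~ /(\w+) \1/ ) {
        print "repeated word: $1\n";    # $1 is valid here, after the match
    }

    # The replacement side of a substitution is ordinary Perl, so $1
    # (not \1) is how to refer to the capture there.
    ( my $copy = $text ) =~ s/(\w+) \1/$1/;
    print "$copy\n";                    # prints "hello world"

    # Writing $1 inside the pattern instead would interpolate whatever
    # a previous match captured (undef if none), hence the warnings.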
      Thanks for the explanation choroba. It always helps to know the "why" behind the way things work. It makes it easier to remember in the future when a similar situation arises. Perl Monks are the best!

      "It's not how hard you work, it's how much you get done."
