 
PerlMonks  

Re^3: What is the most efficient way to split a long string (see body for details/constraints)?

by Anonymous Monk
on Jun 21, 2019 at 18:46 UTC


in reply to Re^2: What is the most efficient way to split a long string (see body for details/constraints)?
in thread What is the most efficient way to split a long string (see body for details/constraints)?

I tried to edit my comment above, but I can't see any way to do that once a comment is posted (here on PerlMonks). Anyway, I really appreciate the amount of time and effort you put into creating (and running) these benchmarks. They really opened my eyes to the pros and cons of the approaches presented.

The how-to/cookbook format of the various implementations will definitely help guide me and others in deciding when each of these approaches is the best fit for the task at hand.
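For context, the kind of split discussed in this thread can be sketched in a few lines of Perl. This is a minimal illustration, not one of the benchmarked implementations from the parent thread; the input string and chunk length here are made up, and it assumes the goal is cutting a long string into equal-length pieces:

```perl
use strict;
use warnings;

# Hypothetical input: a 30-character string, to be cut into 5-char chunks.
my $long = 'abcdefghij' x 3;
my $n    = 5;

# A repeated fixed-width unpack template splits without the regex engine.
my @chunks = unpack "(a$n)*", $long;

# An equivalent regex-based alternative for comparison.
my @chunks_re = $long =~ /.{1,$n}/gs;
```

Both yield ("abcde", "fghij", ...); which one is faster depends on string length and Perl version, which is the sort of trade-off the benchmarks in the parent thread explore.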


Re^4: What is the most efficient way to split a long string (see body for details/constraints)?
by mikegold10 (Acolyte) on Jun 21, 2019 at 18:47 UTC

    Figured out why I couldn't edit my comment - I wasn't logged in! Duh... (hiding head in shame)
