PerlMonks  

Re^2: Are beheaded strings known to be slow?

by Anonymous Monk
on Oct 10, 2025 at 17:58 UTC ( [id://11166459]=note )


in reply to Re: Are beheaded strings known to be slow?
in thread Are beheaded strings known to be slow?

Thanks for the clarification. So we'd better be careful with the 4-arg (or lvalue) variant of substr?
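For reference, these are the two in-place replacement forms in question; a minimal illustration in plain core Perl (my own addition, not from the thread):

```perl
use strict;
use warnings;

# 4-arg substr: replace 5 chars at offset 0, modifying $s in place
my $s = "Hello, world";
substr($s, 0, 5, "Howdy");
print "$s\n";    # Howdy, world

# lvalue substr: assign directly to the selected slice of $t
my $t = "Hello, world";
substr($t, 0, 5) = "Howdy";
print "$t\n";    # Howdy, world
```

Both forms edit the scalar's string buffer in place, which is what can trigger the head-move/tail-move decision discussed below.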

It's funny: in case "0" Perl rightly decides it's cheaper to move the 499_998 initial bytes _up_ than to move the trailing 500_000 bytes _down_ (and so the scalar gets OOK-"contaminated"); and in case "2" it rightly does the opposite, so the scalar stays OOK-free and works fine with the regex engine. But when ("1") both halves are 499_999 bytes, then... ??? it moves the 1st half UP and actually _complicates_ things (doing the additional work of setting the OFFSET)?

use Devel::Peek;
$Devel::Peek::pv_limit = 1;
for ( 0 .. 2 ) {
    my $s = 'a' x 1e6;
    substr $s, 499_998 + $_, 2, 'b';
    Dump $s;
}
__END__
SV = PV(0xdbb198) at 0x26f8ae0
  REFCNT = 1
  FLAGS = (POK,OOK,pPOK)
  OFFSET = 1
  PV = 0x2954309 ( ""... . ) "a"...\0
  CUR = 999999
  LEN = 1000001
SV = PV(0xdbb198) at 0x26f8ae0
  REFCNT = 1
  FLAGS = (POK,OOK,pPOK)
  OFFSET = 1
  PV = 0x2954309 ( ""... . ) "a"...\0
  CUR = 999999
  LEN = 1000001
SV = PV(0xdbb198) at 0x26f8ae0
  REFCNT = 1
  FLAGS = (POK,pPOK)
  PV = 0x2954308 "a"...\0
  CUR = 999999
  LEN = 1000002
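If the OOK offset ever becomes a problem downstream, one way to shed it is to force a fresh copy of the string, since copying into a new scalar allocates a new, offset-free buffer. A minimal sketch (my own workaround, not from the thread; the Dump comments describe what the flags should look like per the output above):

```perl
use strict;
use warnings;
use Devel::Peek;

my $s = 'a' x 1_000_000;
substr $s, 0, 1, '';    # behead: Perl shifts the start pointer, sets OOK + OFFSET
Dump $s;                # expect FLAGS to include OOK, with an OFFSET line

my $copy = "$s";        # stringify into a new scalar: fresh buffer, no offset
Dump $copy;             # expect FLAGS without OOK and no OFFSET line
```

The copy costs one pass over the data, so whether it is worth it depends on how the scalar is used afterwards.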
