Best Nodes


Best Nodes of the Day, Week, Month, and Year

Are they really the best nodes? Maybe, maybe not. You be the judge!

Also check out Selected Best Nodes — today's random selection of 50 of the top 2000 nodes of all time.

By the way, please don't upvote these nodes just because other monks thought they were good. Doing so artificially inflates their reputations, and then this page no longer showcases the genuinely best nodes.


Best Nodes of The Day

#  | Node                                                        | Author         | Rep
1  | Re^3: joining string content                                | Corion         | 15
2  | Re: joining string content                                  | johngg         | 14
3  | Re: joining string content                                  | Corion         | 14
4  | Re: joining string content                                  | choroba        | 13
5  | Re^5: joining string content                                | tybalt89       | 12
6  | Re: joining string content                                  | hippo          | 12
7  | Re^5: joining string content                                | Corion         | 11
8  | Re: HTTP::Cookies::Chrome doesn't seem to decrypt properly  | brian_d_foy    | 7
9  | Re^2: joining string content                                | GrandFather    | 6
10 | Re^2: joining string content                                | Anonymous Monk | 6
As of Jun 25, 2025 at 07:42 UTC; next refresh in 30 mins ±15 min

Best Nodes of The Week

#  | Node                                          | Author   | Rep
1  | Re: Control C; into a running system / exec?  | tybalt89 | 24
2  | Re: Control C; into a running system / exec?  | haukex   | 19
3  | Re: Control C; into a running system / exec?  | hippo    | 17
4  | Re: Obtain the full path of a json key (gron) | Arunbear | 16
5  | Re: Obtain the full path of a json key        | haukex   | 16
6  | Re: Obtain the full path of a json key        | Corion   | 15
7  | Win32::Netsh tests repaired (-:               | Intrepid | 15
8  | Re: Control C; into a running system / exec?  | swl      | 14
9  | Re: Obtain the full path of a json key        | tybalt89 | 14
10 | Re^3: Obtain the full path of a json key      | tybalt89 | 13
As of Jun 25, 2025 at 06:42 UTC; next refresh in 2 hours and 30 mins ±15 min

Best Nodes of The Month

#  | Node                                                | Author               | Rep
1  | Re: Speed comparison of foreach vs grep + map       | Corion               | 32
2  | Re: Zipping the contents of a directory by filename | choroba              | 31
3  | Re: Zipping the contents of a directory by filename | Fletch               | 29
4  | Re: LWP file size                                   | hippo                | 28
5  | Re: Speed comparison of foreach vs grep + map       | GrandFather          | 28
6  | Re^3: Speed comparison of foreach vs grep + map     | eyepopslikeamosquito | 27
7  | Re: Speed comparison of foreach vs grep + map       | ikegami              | 27
8  | Re: Nooo!... Have I trashed my Strawberry?          | swl                  | 25
9  | Re: Flipping the Sign Bit in pack()'ed Value        | choroba              | 25
10 | Re: Control C; into a running system / exec?        | tybalt89             | 24
As of Jun 24, 2025 at 15:26 UTC; next refresh in 7 hours and 15 mins ±15 min

Best Nodes of The Year

#  | Node                                                                                           | Author     | Rep
1  | DBI revived                                                                                    | Tux        | 53
2  | Re: Perl Best Practices -- 20 years later                                                      | choroba    | 47
3  | Re: Why is using binmode on a file handle 77 times slower than just closing it and re-opening? | choroba    | 43
4  | The recent outage                                                                              | Co-Rion    | 41
5  | Re: Interrupting a loop                                                                        | choroba    | 41
6  | Re: Returning to Perl after almost 3 decades                                                   | talexb     | 40
7  | DuckDuckGo Donates $25,000 to The Perl and Raku Foundation                                     | marto      | 39
8  | Re: Issue of formatting columns of data                                                        | choroba    | 39
9  | Re: Returning to Perl after almost 3 decades                                                   | NERDVANA   | 39
10 | Perl Best Practices -- 20 years later                                                          | pfaut      | 38
11 | Re: Does perl have a builtin limit to the size of shared memory segments I can write to?       | NERDVANA   | 37
12 | Does anyone use Perl on Windows?                                                               | stevieb    | 37
13 | Re: Does anyone use Perl on Windows?                                                           | Discipulus | 36
14 | Who's still around?                                                                            | stevieb    | 36
15 | Re: Two variables changed by tr                                                                | hippo      | 36
16 | Generate random strings from regular expression                                                | bliako     | 36
17 | CPANSec is now CNA!                                                                            | Tux        | 35
18 | Re: Why does this compile?                                                                     | choroba    | 35
19 | Re: How do I call a sub using a variable as its name, objectively                              | choroba    | 35
20 | Re: Hash syntax                                                                                | hippo      | 34
As of Jun 25, 2025 at 06:42 UTC; next refresh in 22 hours and 30 mins ±15 min