PerlMonks
Re: Using threads to process multiple files

by sundialsvc4 (Abbot)
on Jan 30, 2015 at 20:38 UTC [id://1115133]


in reply to Using threads to process multiple files

This node falls below the community's threshold of quality.

Replies are listed 'Best First'.
Re^2: Using threads to process multiple files
by BrowserUk (Patriarch) on Jan 30, 2015 at 22:11 UTC

    Downvoted: because the OP stated that his non-threaded version works fine, and he's trying to use threading to speed it up.

    How could moving from his two in-memory hashes to disk-based tied hashes "speed things up", when tied hashes are at least 1000 times slower?


    With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority". I'm with torvalds on this
    In the absence of evidence, opinion is indistinguishable from prejudice. Agile (and TDD) debunked
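
    A minimal sketch of the in-memory, threaded approach BrowserUk is pointing at, using threads and Thread::Queue: a pool of workers pulls filenames from a queue, each keeps its own ordinary Perl hash, and the parent merges the results afterwards. The worker count and the per-line work are placeholders, since the OP's actual code isn't reproduced in this thread.

    <code>
    use strict;
    use warnings;
    use threads;
    use Thread::Queue;

    # Files to process come from the command line (assumption).
    my @files = @ARGV;

    # Pre-load the queue with the filenames and mark it finished.
    my $queue = Thread::Queue->new(@files);
    $queue->end();    # needs Thread::Queue 3.01+ (ships with modern Perls)

    my $n_workers = 4;    # placeholder worker count

    my @workers = map {
        threads->create(
            { 'context' => 'scalar' },    # each worker returns one hashref
            sub {
                my %counts;               # per-thread, purely in-memory hash
                while ( defined( my $file = $queue->dequeue() ) ) {
                    open my $fh, '<', $file
                        or do { warn "Can't open '$file': $!"; next };
                    while ( my $line = <$fh> ) {
                        chomp $line;
                        $counts{$line}++;    # placeholder for the real per-line work
                    }
                    close $fh;
                }
                return \%counts;
            }
        );
    } 1 .. $n_workers;

    # Merge the per-thread hashes in the parent thread; no shared data needed.
    my %total;
    for my $thr (@workers) {
        my $part = $thr->join();
        $total{$_} += $part->{$_} for keys %$part;
    }

    printf "%-40s %d\n", $_, $total{$_} for sort keys %total;
    </code>
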
Re^2: Using threads to process multiple files
by eyepopslikeamosquito (Archbishop) on Feb 05, 2015 at 23:11 UTC

    Upvotes all around, sirs ... (as BrowserUK excellently describes)

    Downvotes all around you, sir ... for being obsequious. Eew'd.
