PerlMonks  

Re^4: Ultimate anti-leech, anti-proxy, anti-bot, CAPTCHA works, link does not (code included)

by taint (Chaplain)
on Apr 20, 2013 at 00:14 UTC [id://1029602]


in reply to Re^3: Ultimate anti-leech, anti-proxy, anti-bot, CAPTCHA works, link does not (code included)
in thread Ultimate anti-leech, anti-proxy, anti-bot, CAPTCHA works, link does not (code included)

Greetings BrowserUk, and thank you for the reply.
Yes, performing copies does seem a bit inefficient. The files are "mini" operating systems, intended for various embedded systems. Most are < 120MB -- but still.
A search on CPAN for Symlink::Temp yielded no results. Looks like I'll have to create the module myself. File::Symlink && File::Symlink::Temp -- coming soon to a CPAN mirror near you! :)
Maybe I'll crack open File.pm and see if I can figure out how I might create a Symlink module with some of the properties File::Temp provides. Or am I just wasting my time?
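For what it's worth, the core of the idea -- a symlink that cleans itself up the way File::Temp's UNLINK => 1 does -- can be sketched in a few lines. This is a hand-rolled sketch only; Symlink::Temp::Sketch is a made-up name, not a CPAN module:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use File::Spec;

# Hand-rolled sketch of a "temporary symlink" object: create a symlink
# with a unique name, and remove it automatically when the object goes
# out of scope -- the moral equivalent of File::Temp's UNLINK => 1.
package Symlink::Temp::Sketch;

sub new {
    my ($class, $target, $dir) = @_;
    my $link = File::Spec->catfile($dir, "dl-$$-" . int rand 1e9);
    symlink $target, $link or die "symlink failed: $!";
    return bless { link => $link }, $class;
}

sub path { $_[0]{link} }

sub DESTROY {                       # auto-cleanup on scope exit
    my $self = shift;
    unlink $self->{link} if -l $self->{link};
}

package main;

my $dir    = tempdir(CLEANUP => 1);
my $target = File::Spec->catfile($dir, 'image.iso');
open my $fh, '>', $target or die $!;
close $fh;

my $path;
{
    my $tmp = Symlink::Temp::Sketch->new($target, $dir);
    $path = $tmp->path;
    print -l $path ? "link exists\n" : "no link\n";
}                                   # $tmp destroyed; symlink removed

print -l $path ? "link survived\n" : "link removed\n";
```

The target file itself is untouched; only the link goes away, which is the whole point of serving symlinks instead of copies.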

Thanks again for taking the time to respond BrowserUk.

--chris

#!/usr/bin/perl -Tw
use strict;
use warnings;
my $perl_version = "5.12.4";
print $perl_version;

Replies are listed 'Best First'.
Re^5: Ultimate anti-leech, anti-proxy, anti-bot, CAPTCHA works, link does not (code included)
by taint (Chaplain) on Apr 20, 2013 at 00:46 UTC
    I may have spoken too soon.
    File::Symlink::Atomic appears to come pretty close to meeting my needs.
    If I could add the UNLINK feature of File::Temp to this, I'd be set!

    --chris

    #!/usr/bin/perl -Tw
    use strict;
    use warnings;
    my $perl_version = "5.12.4";
    print $perl_version;
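For reference, the atomic-symlink trick that a module like File::Symlink::Atomic is built around can be sketched by hand: create the new symlink under a staging name, then rename() it over the live name, since rename() is atomic on POSIX filesystems. This is a sketch of the technique, not that module's actual API:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use File::Spec;

# Replace (or create) $link so it points at $target, atomically:
# readers never observe a missing or half-made link.
sub atomic_symlink {
    my ($target, $link) = @_;
    my $staging = "$link.tmp.$$";            # staging name beside $link
    symlink $target, $staging or die "symlink: $!";
    rename $staging, $link    or die "rename: $!";  # atomic swap on POSIX
}

my $dir = tempdir(CLEANUP => 1);
my ($old, $new) = map { File::Spec->catfile($dir, $_) } qw(v1.img v2.img);
my $link = File::Spec->catfile($dir, 'current.img');

atomic_symlink($old, $link);
atomic_symlink($new, $link);   # repoint without ever removing the link first
print readlink($link), "\n";
```

Combining this swap with a scope-based unlink would give the UNLINK-style cleanup mentioned above.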
      If I could add the UNLINK feature of File::Temp to this, I'd be set!

      Just one thought.

      If you've ever had a long download interrupted just before it completed, only to discover that the link became invalid the moment the download was interrupted -- preventing you from finishing the partial download and forcing you to start the whole process over from scratch -- you might consider your users and abandon that idea.

      I remember having to restart a 400MB download from IBM eight times before it completed properly. Given that I was using a 56k modem, and it always seemed to get to around the 350MB mark before aborting, it was intensely frustrating.


      With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
      Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
      "Science is about questioning the status quo. Questioning authority".
      In the absence of evidence, opinion is indistinguishable from prejudice.

        I remember having to restart ...

        I've been there, but a lot of the good anti-leech ... will let you resume with a new link -- well, using wget anyway, since it does its byte-range requests based on filename/filesize rather than the URL.
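The resume mechanics are simple enough to sketch: the client checks how much of the file it already has on disk and asks the server for the remainder with an HTTP Range header, which is why a fresh link to the same file still works. A minimal sketch -- range_header_for is a made-up helper name, and real clients also validate with ETag/If-Range:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build the Range header a resuming client (e.g. wget -c) would send:
# request everything from the first missing byte onward.
sub range_header_for {
    my ($partial_file) = @_;
    my $have = (-s $partial_file) // 0;   # bytes already downloaded
    return "Range: bytes=$have-";         # ask for the rest of the file
}

# Simulate a partial download of 350 bytes, then build the request.
open my $fh, '>', 'partial.bin' or die $!;
print $fh 'x' x 350;
close $fh;

print range_header_for('partial.bin'), "\n";   # prints: Range: bytes=350-
unlink 'partial.bin';
```

This also shows why a link scheme that invalidates on interruption hurts: the byte-range request is trivial, but only if the server will still hand out the file at all.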
