Re^4: 5.10 imminent?

by grinder (Bishop)
on Apr 12, 2007 at 15:37 UTC


in reply to Re^3: 5.10 imminent?
in thread 5.10 imminent?

Delta Copy appears to be a Windows wrapper around rsync (and doesn't require you to drink the Cygwin Kool-Aid beforehand).

• another intruder with the mooring in the heart of the Perl

Re^5: 5.10 imminent?
by BrowserUk (Patriarch) on Apr 12, 2007 at 15:55 UTC

        Okay, 20 minutes later and I have a version of rsync that seems to run, so I switched into a directory where I'd like the sources put and typed the advised command:

        c:\Perl\src>rsync -avz rsync://public.activestate.com/perl-current/
        receiving file list ... done
        drwxr-xr-x        6592 2007/04/12 16:55:20 .
        -rw-r--r--           6 2007/04/12 16:55:20 .patch
        -r--r--r--       31992 2007/03/20 16:06:44 AUTHORS
        -r--r--r--        6111 2006/06/13 20:28:51 Artistic
        -r--r--r--     3776973 2007/02/27 13:20:45 Changes
        ...
        -r--r--r--        1363 2006/06/13 20:29:38 x2p/str.h
        -r--r--r--        3637 2006/06/13 20:29:38 x2p/util.c
        -r--r--r--         944 2006/06/13 20:29:38 x2p/util.h
        -r--r--r--       48960 2006/06/13 20:29:38 x2p/walk.c

        sent 99 bytes  received 86198 bytes  232.29 bytes/sec
        total size is 62055224  speedup is 719.09

        Result. A 10-minute wait and exactly nothing transferred!

        Of course, you say: if you don't have a copy of bleadperl to be updated, then there is nothing to do?

        Or maybe I'm using the wrong options?

        Or...
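
        (In hindsight, and assuming the stock rsync command line: giving rsync only a source and no destination makes it list the remote module rather than transfer anything, which matches the output above exactly. A minimal sketch of a corrected invocation, assuming the current directory is the intended target:)

        rem (sketch) the trailing "." names the assumed local destination;
        rem without a destination argument, rsync only lists the module.
        c:\Perl\src>rsync -avz rsync://public.activestate.com/perl-current/ .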

        Could I work out what I'm doing wrong (besides following the instructions slavishly)? Maybe, but if the desire is to encourage widespread participation in the smoking (Is that legal?) of bleadperl, then it would make sense to make it reasonably easy for people to do so, without having to ask 20 questions and end up feeling foolish in the process. Otherwise, those few (non-usual suspects) who decide they'd like to have a go will stop before they get started.

        From my perspective, I prefer a link to an archive file that I can download, because wget knows how to resume a broken download without needing to go through an ever more laborious process of working out how far it got last time and what has changed in the meantime. A download of 15 MB can take me 2 or 3 days. Stuff changes. ISPs drop connections. Each time it resumes, there is more stuff at this end to sync, and more likelihood that stuff at the other end has changed. wget doesn't have this problem.
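
        (For what it's worth, resuming a broken download is exactly what wget's -c/--continue flag does; the URL below is hypothetical, just to show the shape of the command:)

        rem (sketch) hypothetical archive URL; -c resumes a partial download
        c:\Perl\src>wget -c http://example.com/perl-current.tar.gz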


        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        "Science is about questioning the status quo. Questioning authority".
        In the absence of evidence, opinion is indistinguishable from prejudice.
