PerlMonks

Re^2: Diff'ing dates

by LighthouseJ (Sexton)
on Oct 02, 2007 at 14:33 UTC ( [id://642117] )


in reply to Re: Diff'ing dates
in thread Diff'ing dates (solved)

Using your inspiring one-liner, I morphed my script with yours and came up with this:

/bin/perl -MTime::Local -ne '{$t[1]-- && push(@epochs,timelocal(@t[5,4,3,2,1,6])) if (@t=split)} END { print $epochs[1] - $epochs[0]; }'

The END block is what will most likely change, since the value of the diff will be used to determine the exit status.

Thanks a lot for the fresh perspective, it was very helpful.

"The three principal virtues of a programmer are Laziness, Impatience, and Hubris. See the Camel Book for why." -- `man perl`

Re^3: Diff'ing dates
by ikegami (Patriarch) on Oct 02, 2007 at 14:50 UTC

    The exit code can be controlled as follows:

    END { exit 1+($epochs[0] <=> $epochs[1]) }
    0: first time is earlier
    1: both times are the same
    2: second time is earlier
    perl -MTime::Local -lane'END{exit 1+($d<=>0)}$F[1]--;$d+=(3-$.*2)*timelocal@F[5,4,3,2,1,6]'
      Oh, I'm not interested in which is earlier; I know the second one will always be later than (or the same as) the first. I'm more interested in the actual value of the difference, but I can handle that easily enough.
