http://www.perlmonks.org?node_id=392267


in reply to improve ugly flow control

Is anyone saying that Aristotle's clean and simple (IMHO) solution has any problems?

Re^2: improve ugly flow control
by dragonchild (Archbishop) on Sep 22, 2004 at 12:19 UTC
    For one item, Aristotle's solution is excellent. However, I believe my solution scales better when performing this kind of test over and over. *shrugs* It all depends on what you want to do.

    ------
    We are the carpenters and bricklayers of the Information Age.

    Then there are Damian modules.... *sigh* ... that's not about being less-lazy -- that's about being on some really good drugs -- you know, there is no spoon. - flyingmoose

    I shouldn't have to say this, but any code, unless otherwise stated, is untested

      How so? You've basically turned my TRY block into a function and deferred the failure handling through an exception. That doesn't seem more efficient to me — am I missing something?
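
      If I follow, your version amounts to something like this (the find_and_call name is just my stand-in for whatever you called it, and as ever, untested):

      sub find_and_call {
          my ( $hash, $options, $callback ) = @_;
          foreach my $try ( @$options ) {
              next unless exists $hash->{ $try };
              $callback->( $try );
              return $try;
          }
          die "no usable option found\n";    # failure is deferred to the caller as an exception
      }

      eval {
          find_and_call( \%hash1, \@options1, \&do_something1 );
          find_and_call( \%hash2, \@options2, \&do_something2 );
          find_and_call( \%hash3, \@options3, \&do_something3 );
      };
      log_failure() if $@;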

      I don't particularly like exceptions as a mechanism to deal with soft failures though. If I did need to handle multiple cases, I'd do something much along the lines of my first post, like this:

      OPTION_LIST: for(
          [ \%hash1, \@options1, \&do_something1, ],
          [ \%hash2, \@options2, \&do_something2, ],
          [ \%hash3, \@options3, \&do_something3, ],
      ) {
          my ( $hash, $options, $callback ) = @$_;

          foreach my $try ( @$options ) {
              next unless exists $hash->{ $try };
              $callback->( $try );
              next OPTION_LIST;
          }

          log_failure();
          last;
      }

      Note that both this and your code are deficient if you need atomic behaviour: do_something1 will already have been called by the time a failure to find any of the @options2 in %hash2 is detected. If that is undesirable, a proper exception-based solution will hardly differ from the non-exception solution.
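
      If atomicity did matter, the simplest fix (untested, as always) would be to do all the lookups up front and only run the callbacks once every one of them has succeeded:

      my @todo;
      for (
          [ \%hash1, \@options1, \&do_something1, ],
          [ \%hash2, \@options2, \&do_something2, ],
          [ \%hash3, \@options3, \&do_something3, ],
      ) {
          my ( $hash, $options, $callback ) = @$_;
          my ( $found ) = grep { exists $hash->{ $_ } } @$options;
          if ( not defined $found ) {
              log_failure();
              @todo = ();            # discard everything; nothing gets called
              last;
          }
          push @todo, [ $callback, $found ];
      }

      # callbacks only run if every lookup succeeded
      $_->[ 0 ]->( $_->[ 1 ] ) for @todo;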

      Makeshifts last the longest.

        My understanding of the OP's question was that this wasn't a "soft failure" situation. This was an out-and-out "Bad Thing"™ kind of failure. Exceptions aren't for everyone - this is true. However, a recommended method of working with DBI is to use RaiseError and eval/$@ blocks.
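
        Something along these lines (the DSN, table, and credentials are placeholders, and it's untested, of course):

        use DBI;

        my $dbh = DBI->connect(
            'dbi:Pg:dbname=foo', 'user', 'secret',
            { RaiseError => 1, AutoCommit => 0 },
        );

        eval {
            $dbh->do( 'UPDATE widgets SET qty = qty - 1 WHERE id = ?', undef, 42 );
            $dbh->commit;
        };
        if ( $@ ) {
            warn "transaction aborted: $@";
            $dbh->rollback;
        }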

        The reason to defer to a function is that this method can now be used by multiple scripts / modules in multiple situations, providing the same handling in all installs for a given company. "Efficient" can mean multiple things. In my case, I look for efficiency in developer time. I am almost always less concerned with CPU/RAM efficiency[1], because they almost always cost less than the equivalent developer time.

        [1] Except, of course, in the pathological case where the tradeoff is grossly prejudiced against the hardware. It's a standard maximization problem that all first-year calculus students learn to solve.
