http://www.perlmonks.org?node_id=278562

dragonchild has asked for the wisdom of the Perl Monks concerning the following question:

In Re: Inlining method/function calls?, Zaxo describes using the C preprocessor for macro definitions. All of a sudden, I'm wondering why constant is needed. Could someone tell me why constant is preferable to #define in the following code?
#!/usr/bin/perl -P
#define ABC 1
use constant DEF => 1;
print ABC, $/;
print DEF, $/;
------
$ ./abcd
gcc: file path prefix `/usr/ccs/bin/' never used
1
1
One thing I notice is that gcc throws a warning. But wouldn't using the C preprocessor be faster and use less memory than constant, especially with a large number of constants? I'm thinking of sites with mod_perl and a large number of hits ...

Also, what's the difference in using #include vs. require? And, doesn't #define provide first-class macros where there is no real provision for them in Perl?

------
We are the carpenters and bricklayers of the Information Age.

Don't go borrowing trouble. For programmers, this means Worry only about what you need to implement.

Please remember that I'm crufty and crochety. All opinions are purely mine and all code is untested, unless otherwise specified.

Replies are listed 'Best First'.
Re: C pre-processor in Perl
by adrianh (Chancellor) on Jul 28, 2003 at 19:46 UTC
    Could someone tell me why constant is preferable to #define in the following code?
    • constant will work on platforms without cpp (that's the big one for me :-)
    • constant is package scoped. #define is not.
    • With constant you have access to perl at compile time (a small sketch follows this list). With #define you do not.
    • Different versions of cpp on different platforms do different things.
    • It's unlikely to be much faster, if at all. The constants are expanded at compile time, so there is no run-time cost with either system. Starting up a cpp process may actually slow down your compile time.
    • require() will include your code only once, compile it in a separate lexical file scope, and place it in package main by default. #include will do none of these.
    • C macros are very much not first class macros IMHO. See Macros, LFSPs and LFMs for more on this.
    • :-)
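
    For what it's worth, here is a small sketch of the second and third points (the names are made up): each package gets its own constant, and the value can be computed with ordinary Perl when the pragma runs at compile time, which #define cannot do:

    package Alpha;
    use constant LIMIT => 10;

    package Beta;
    # any Perl expression, evaluated at compile time
    use constant LIMIT => 5 * 2 + length("abc");

    package main;
    print Alpha::LIMIT, " ", Beta::LIMIT, "\n";   # prints "10 13"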

Re: C pre-processor in Perl
by simonm (Vicar) on Jul 28, 2003 at 19:18 UTC

    Could someone tell me why constant is preferable to #define in the following code?

    Perhaps in the case you've shown, they're more or less equivalent, but as things grow more complex, the benefits of using constant instead start to show up.

    Constants are scoped to Perl packages, not to a source file, which minimizes namespace conflicts:

    package Potato;
    use constant tuber => 1;

    package Peach;
    use constant tuber => 0;

    package main;
    print Potato::tuber, Peach::tuber;

    You can also define constant references to an array or hash:

    use constant visited_urls => {};
    ...
    unless ( visited_urls->{$my_url}++ ) {
        ...
    }

    Of course, there are tricks you can pull with pre-processor macros that go beyond what use constant can do, but for simple cases like the one you show, I'd encourage you to standardize on constant instead.

      That constants are scoped to a Perl package vs. a source file seems to be a straw man for a few reasons:
      1. Good development techniques usually indicate that one package should exist in one file.
      2. For those cases where 2+ packages should co-exist in one file, I have often found that I want the same constants for all the packages within the file. But then I have to write things like Parent::SOME_CONSTANT instead of just SOME_CONSTANT, precisely because of that package scoping.
      Your two examples also seem to have contrivance problems.
      1. If I want to associate tuber-ness with Potatoes vs. Peaches, I would make it accessible via some method. So, instead of Potato::tuber, I would have Potato->is_tuber, or some such.
      2. Constant references to hashes/arrays don't seem to have a huge benefit. Why not just use %visited_urls instead (a plain-hash version is sketched below)? You gain scoping and the ability to treat it like any other variable. Instead, by making it a constant, I still don't have constant-ness, and I'm requiring my maintainers to keep track of the fact that this name is supposedly a constant even though its contents are variable.
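
      (For comparison, the plain-hash version I have in mind would be something like this; the names just mirror the example above:)

      my %visited_urls;
      ...
      unless ( $visited_urls{$my_url}++ ) {
          ...
      }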
      I'm not attempting to shoot down all of your statements. I'm attempting to question what's behind them, so that you could potentially explain your thought processes. This is a completely new feature of Perl to me, and one that seems to have huge benefits. But, I am a little concerned with the fact that I've been programming in Perl for over 8 years and I've never heard of it. That raises a red flag to me and I wanted to know more about it.

      ------
      We are the carpenters and bricklayers of the Information Age.

      Don't go borrowing trouble. For programmers, this means Worry only about what you need to implement.

      Please remember that I'm crufty and crochety. All opinions are purely mine and all code is untested, unless otherwise specified.

        Good development techniques usually indicate that one package should exist in one file.

        Agreed, but what if I want to use those constants from a subclass, or in some other piece of code? More pre-processor trickery? Ick.

        The Perl pragma constants are accessible at runtime, and can be exposed to other modules in a normal Perl fashion.

        For those cases where 2+ packages should co-exist in one file, I have often found that I want the same constants for all the packages within a file. But, I have to do things like Parent::SOME_CONSTANT instead of just SOME_CONSTANT, for that specific reason.

        One technique you might find applicable is to define a constant and then export it to your other modules, the same way you would if they were spread out over multiple files:

        package MyFlags;
        use constant FooFlag => 42;
        use constant BarFlag => 23;
        use base 'Exporter';
        BEGIN { @MyFlags::EXPORT_OK = qw( FooFlag BarFlag ); }
        BEGIN { $::INC{'MyFlags.pm'} ||= __FILE__; }

        package MyWidgetFactory;
        use MyFlags qw( FooFlag );
        print FooFlag . "\n";

        package MyFlyingMonkey;
        use MyFlags qw( FooFlag BarFlag );
        print BarFlag . "\n";

        If I want to associate tuber-ness with Potatoes vs. Peaches, I would make it accessible via some method. So, instead of Potato::tuber, I would have Potato->is_tuber, or some such.

        Sure, by all means, go ahead and treat it as a method; after all, use constant is mostly just shorthand for making simple subroutines:

        package Potato;
        use constant tuber => 1;

        package main;
        print Potato->tuber;

        Constant references to hashes/arrays don't seem to have a huge benefit.

        Agreed, there's no breakthrough here -- just some syntactic sweetener... If your module's interface features a lot of public methods and only a couple of public package variables, it can be attractive to wrap those references in constant subs.
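
        For example (module and variable names invented here), instead of documenting a package variable directly, you can expose a constant sub that returns a reference to it, so callers use the same arrow syntax as the rest of the interface:

        package MyModule;
        our %Defaults = ( retries => 3, timeout => 60 );
        use constant defaults => \%Defaults;   # constant sub returning the hash ref

        package main;
        # method-style access, consistent with the rest of the API
        print MyModule->defaults->{retries}, "\n";   # prints 3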

        This is a completely new feature of Perl to me, and one that seems to have huge benefits.

        Yup; both the CPP #defines and Perl constants are useful tools, each with their own ups and downs; I hope this post has clarified why, if given the choice, I'd typically use constant.

Re: C pre-processor in Perl
by Rhandom (Curate) on Jul 28, 2003 at 19:24 UTC
    I agree with perrin on the require portion. However - on the constant portion - I have used schemes similar to the following many times:

    use constant foo => do {
        my $foo;
        # do something that is really hard to lookup or calculate
        $foo; # return of the do
    };


    This way, if you are in a daemon, a long-running process, mod_perl, and so on, you get the compile-time benefits even for something that might take a minute or so to calculate the first time. It doesn't fit all situations, but where it does, it is nice to run a Deparse and see chunks of code removed when doing a

    if (foo) { } else { }
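
    For the curious, one way to watch that happen (the exact output varies a bit between Perl versions) is to deparse a snippet with a false constant, something like:

    perl -MO=Deparse -e 'use constant foo => 0; if (foo) { print "big\n" } else { print "small\n" }'

    With foo false, the deparsed output should contain only the else branch; the if (foo) test and its block are folded away at compile time.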


    my @a=qw(random brilliant braindead); print $a[rand(@a)];
      I would use globals for that, since they don't get messed up by use in a quoted context.

      BEGIN {
          use vars qw($FOO);
          $FOO = ... # some slow action here
      }
      if ($FOO) {
      }
Re: C pre-processor in Perl
by sauoq (Abbot) on Jul 28, 2003 at 19:34 UTC
    #!/usr/bin/perl -P
    #define FOO bar
    print "FOO\n";
    print qq(FOO\n);
    prints
    FOO
    bar
    and that's reason enough for me to avoid it.

    -sauoq
    "My two cents aren't worth a dime.";
    
      Well, that doesn't work with the constant pragma either. Back to globals...

        I think you missed the point that code makes. Whether or not a constant define'd via preprocessing is replaced inside a quoted string depends on the quoting mechanism used. It isn't replaced inside actual quotes ("" or '') but it is inside the generic quote operators¹ q() and qq().

        So, change how you quote your literals and you might end up changing the literals themselves. I don't like things that change unexpectedly like that.

        1. This is, of course, because it is a C preprocessor and not a Perl preprocessor.

        -sauoq
        "My two cents aren't worth a dime.";
        
Re: C pre-processor in Perl
by chromatic (Archbishop) on Jul 28, 2003 at 19:20 UTC

    In a simple program, you could certainly get away with #define. You'd be sunk if you tried to use it in a module that you expected other people to use, though.

    I'm not sure calling C macro processing 'first-class' is very accurate though. It feels more like third-class to me. :)

Re: C pre-processor in Perl
by perrin (Chancellor) on Jul 28, 2003 at 19:16 UTC
    Well, constant is lame and should be avoided, so I'd certainly prefer a #define there. I doubt it has any significant speed or memory advantages though. It just lets you avoid the syntax traps of the constant pragma. The #include one seems like a bad idea. If you include that code in more than one place, it will take up more memory. You will also lose the dependency tracking that require gives you.
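
    For anyone who hasn't hit those traps, a quick sketch of the usual ones (the names here are made up):

    use constant MAX => 10;

    print "The limit is MAX\n";         # no interpolation: prints the word MAX, not 10
    print "The limit is @{[ MAX ]}\n";  # a common workaround

    my %limit_for = ( MAX, 'rows' );    # note the plain comma; MAX => 'rows' would quote MAX as a string
    print $limit_for{+MAX}, "\n";       # $limit_for{MAX} would look up the string 'MAX'; +MAX forces the constant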

    Also, note that you can do this with a source filter by using the Filter::cpp module.
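
    If I remember the Filter::cpp interface correctly (untested here), the usage is roughly:

    use Filter::cpp;   # the rest of the file is run through the C preprocessor
    #define MAX_USERS 100
    print "max users: ", MAX_USERS, "\n";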

    UPDATE: Why not just use the -P option instead of a source filter? Well, you mentioned mod_perl. Environments that use a persistent interpreter often won't let you use command-line options like -P.

Re: C pre-processor in Perl
by TomDLux (Vicar) on Jul 28, 2003 at 20:58 UTC

    Pre-processor replacements are incompatible with debugging, whether in C, C++, or Perl: the debugger has no idea what your macro names meant. Plain Perl code, on the other hand, works very well with the Perl debugger.

    --
    TTTATCGGTCGTTATATAGATGTTTGCA

Re: C pre-processor in Perl (breaks)
by tye (Sage) on Jul 29, 2003 at 14:31 UTC

    Because it doesn't work very well. In particular, the C preprocessor doesn't know Perl syntax. Take this tiny example and save it into cpp.pl:

    #define FOO "bar" print qq(The value of FOO is "), FOO, qq("\n);
    and then try it out:
    $ perl cpp.pl
    The value of FOO is "FOO"
    $ perl -P cpp.pl
    The value of "bar" is "FOO"
    $
    It expands the FOO inside the qq() text, where it shouldn't, and leaves the bare FOO alone, where it should have expanded it.

    I expect that you could get many C preprocessors to choke or issue warnings on fairly common Perl syntax as well.

    This is enough of a reason for me not to use such a trick. There are several other good reasons given in the thread not to use it, one of which I ran into just trying to demonstrate how my tiny example fails. First, on Win32, even with MS VC++ installed:

    > perl -P cpp.pl
    'cl' is not recognized as an internal or external command,
    operable program or batch file.
    > vcvars32
    Setting environment for using Microsoft Visual C++ tools.
    > perl -P cpp.pl
    Command line error D2003 : missing source filename
    >
    Random Unix box:
    $ perl -P cpp.pl
    ngcc: -: No such file or directory
    ngcc: No input files specified.
    $

                    - tye