PerlMonks  

Re: best way to inline code? (i.e. macro)

by hecroce (Friar)
on Oct 18, 2005 at 03:21 UTC [id://500855]


in reply to best way to inline code? (i.e. macro)

Wouldn't Memoize help with that?

Replies are listed 'Best First'.
Re^2: best way to inline code? (i.e. macro)
by Tanktalus (Canon) on Oct 18, 2005 at 04:14 UTC

    Memoize speeds up large or long-running subroutines by caching their results, and it only works for subroutines with no side effects, where all outputs can be computed solely from the parameters. Reading from a database, for example, involves inputs that are not parameters, and thus fails this requirement.
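    To make the caching requirement concrete, here is a minimal sketch of Memoize on a pure function (the fib example is mine, not from the thread):

```perl
use strict;
use warnings;
use Memoize;

# A pure function: its result depends only on its arguments,
# so caching the results cannot change program behaviour.
sub fib {
    my ($n) = @_;
    return $n if $n < 2;
    return fib($n - 1) + fib($n - 2);
}

memoize('fib');    # replaces fib() with a caching wrapper

print fib(30), "\n";    # prints 832040, and fast: recursive
                        # calls hit the cache instead of recomputing
```

    Note this is the opposite trade-off from inlining: Memoize adds a wrapper (more call overhead), and wins only when the body is expensive enough that skipping it pays for the wrapper.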

    The OP seems to be saying that there is an oft-called small routine he would like to inline. That is something one would do with the "inline" keyword in C++, but in perl, you simply write a constant sub and hope that perl figures out it's inlineable. Generally speaking, other than for constants, that hasn't sounded useful to me. These small routines are already small, and hopefully fast; I don't see that Memoize would be useful in a scenario where someone wants to remove even the overhead of a subroutine call.
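    The constant-sub case looks like this (a sketch; the names MAX_RETRIES and TIMEOUT are made up): a sub with an empty () prototype whose body is a constant is eligible for compile-time inlining, which is also what the constant pragma arranges.

```perl
use strict;
use warnings;

# Empty prototype + constant body: perl can inline the value
# at compile time, so there is no sub call at runtime.
sub MAX_RETRIES () { 5 }

# Equivalent, via the constant pragma:
use constant TIMEOUT => 30;

print MAX_RETRIES() * TIMEOUT(), "\n";    # prints 150
```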

    Personally, I think what the OP is looking for is a source filter that can take function definitions and splice them into the code where they are called. For a single module, it shouldn't be too bad: just take everything inside the function (which should be very small) and put it in a do block, and take all the arguments that were being passed, and put them at the top of the block as a localised @_ list.
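    A hand-done version of that transformation might look like this (add2 is a hypothetical sub, just to show the shape such a filter would emit at a call site):

```perl
use strict;
use warnings;

# The small sub a filter would target:
sub add2 { return $_[0] + $_[1] }

my $x = add2(3, 4);    # normal call, pays the sub-call overhead

# What the filter would emit instead: the sub's body in a do
# block, with the arguments installed as a localised @_.
my $y = do { local @_ = (3, 4); $_[0] + $_[1] };

print "$x $y\n";    # prints "7 7" -- same result, no call frame
```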

    Mind you, if I were that interested in speed, I'd be doing this whole thing in C or C++ anyway. Given that I'm already running perl, I'm not going to worry about the overhead of function calls. The speed benefit I, as the programmer, am getting is already huge. I'll just tell my boss he needs a faster 'puter. He'll probably save money that way anyway.

      The speed benefit I, as the programmer, am getting is already huge. I'll just tell my boss he needs a faster 'puter. He'll probably save money that way anyway.

      For the disbelievers, let's do a quick numbers game. The average cost to a company for an hour of a programmer's time (salary, benefits, lights, deskspace, etc) is around $70. For a top developer, that goes up to $100/hr or more. So, for a week's work (40 hours), that costs your company $2800. Now, a quick google found that you can get a dual-core Xeon server in a base configuration for about $3k from Dell. Let's say tricking it out puts that price up to about $7k. That's 100 hours, or 2.5 weeks of work.

      In general, the server your app or db is running on is not a tricked-out dual-Xeon. I've seen performance boosts of 4-10x just by moving servers. Can you provide a similar boost from 2.5 weeks of coding, especially with no new bugs and no additional maintenance burden?

      Additionally, you can often get a 2-4x boost just by realigning your database. Yeah ... code optimizations are often the last refuge of the incompetent. (Bonus points to whoever can identify the source of that misquote.)


      My criteria for good software:
      1. Does it work?
      2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
        In general, the server your app or db is running on is not a tricked-out dual-Xeon. I've seen performance boosts of 4-10x just by moving servers. Can you provide a similar boost from 2.5 weeks of coding, especially with no new bugs and no additional maintenance burden?

        I've provided 100x speed increases in perl code with just a day or two of work in the past. Never underestimate the power of the right optimisation, applied carefully.

        In the OP's case, I doubt that inlining his functions will be *quite* as productive, but it is at least worth trying, and he has, apparently, done at least some basic profiling of his code. He might be able to play tricks with his source code by using the C pre-processor to fake up macros. Read the warnings about it in perldoc perlrun, and I suggest running the code through cpp separately instead of using -P.

        code optimizations are often the last refuge of the incompetent. (Bonus points to whoever can identify the source of that misquote.)

        Samuel Johnson...

        And next month it'll need a quad Xeon, and the month after that a cluster of them. And the month after that, the company will let all its Perl programmers go and switch to using Java for performance reasons.

        Here's another misquotation for you: "Quotations are the refuge of the timid and inexpert".


        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
        "Science is about questioning the status quo. Questioning authority".
        The "good enough" may be good enough for now, and perfection may be unobtainable, but that should not preclude us from striving for perfection, when time, circumstance or desire allow.
