 
PerlMonks  

Re: Optimizing into the Weird Zone

by dragonchild (Archbishop)
on Aug 12, 2003 at 12:56 UTC ( [id://283163] )


in reply to Optimizing into the Weird Zone

Perhaps it suggests that, as processors approach quantum levels of computing, chaos theory starts to come into play. Right now, things are very predictable, except for the tiny points where they're not. In 10 years, who knows?

I agree with the other respondents - optimize for the programmer, not the compiler. If I can get in, do my thing, and get out in under 8 hours, and be guaranteed there's no action at a distance ... that's optimized. Anything less is poor.

------
We are the carpenters and bricklayers of the Information Age.

The idea is a little like C++ templates, except not quite so brain-meltingly complicated. -- TheDamian, Exegesis 6

Please remember that I'm crufty and crochety. All opinions are purely mine and all code is untested, unless otherwise specified.

Replies are listed 'Best First'.
Re: Optimizing into the Weird Zone
by Abigail-II (Bishop) on Aug 12, 2003 at 13:17 UTC
    Perhaps it suggests that as processors approach quantum levels of computing, that chaos theory starts to come into play.

    I don't think you mean what you just said. One outstanding property of quantum-level physics is that things are no longer deterministic. In chaos theory, everything is still very deterministic. It's just that very tiny disturbances in the starting configuration can lead to very different outcomes. But each input leads to a fully determined outcome, and reproducibly so.

    I agree with the other respondents - optimize for the programmer, not the compiler. If I can get in, do my thing, and get out in under 8 hours, and be guaranteed there's no action at a distance ... that's optimized.

    That depends. If you write a program that calculates how much flap an airplane should have during the landing, and you write the program in 8 hours, but the resulting program takes 15 minutes to calculate the flap output, then that's not optimized. Programmer time and run time are trade-offs, and the best trade-off isn't always "minimize programmer time".
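
    A minimal sketch of that trade-off (not from the thread; the Fibonacci example and all names are my own illustration): the quick-to-write version and the slightly-more-work version compute the same answer, but in exponential versus linear time.

    ```c
    /* Illustration only: programmer time vs. run time.
       Both functions return the n-th Fibonacci number. */
    #include <stdio.h>

    /* Quick to write: O(2^n) time. */
    static unsigned long fib_naive(unsigned n) {
        return n < 2 ? n : fib_naive(n - 1) + fib_naive(n - 2);
    }

    /* A little more programmer effort: O(n) time. */
    static unsigned long fib_iter(unsigned n) {
        unsigned long a = 0, b = 1;
        for (unsigned i = 0; i < n; i++) {
            unsigned long t = a + b;
            a = b;
            b = t;
        }
        return a;
    }

    int main(void) {
        /* Same answer either way; only the run time differs. */
        printf("%lu %lu\n", fib_naive(30), fib_iter(30));
        return 0;
    }
    ```

    For small n the difference is invisible; by n around 40 the naive version takes seconds while the iterative one is still instantaneous, which is Abigail's point in miniature.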

    Abigail

Re: Re: Optimizing into the Weird Zone
by hardburn (Abbot) on Aug 12, 2003 at 14:42 UTC

    I don't think approaching "quantum levels of computing" has anything to do with it. Yes, processors are getting smaller, but I don't think that's the problem the original poster is hitting.

    It's not just that CPU manufacturers are approaching sizes of a single atom, but the way the processor works is no longer easily determined. IIRC, Intel gave up publishing op code timings back with the PII 400--the timings fluctuated so much between runs that there was no point.

    Branch prediction, pipelining, and now hyperthreading basically make processors into run-time code optimizers. I've studied some hand-optimized ASM compared to the output of GCC, and while the hand-optimized version can shave off an instruction or two, it will suffer more if the processor makes a branch misprediction.
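
    A hedged sketch of the branch-misprediction effect (my own example, not from the thread; the array size and threshold are arbitrary): summing only the large elements of an array. With sorted data the branch is almost perfectly predictable; with shuffled data it mispredicts roughly half the time, which on real hardware typically makes the identical loop measurably slower.

    ```c
    /* Illustration only: the same loop, predictable vs. unpredictable branch. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 1000000

    static long sum_over(const int *a, int n, int threshold) {
        long sum = 0;
        for (int i = 0; i < n; i++)
            if (a[i] >= threshold)   /* the branch the predictor must guess */
                sum += a[i];
        return sum;
    }

    static int cmp_int(const void *p, const void *q) {
        return *(const int *)p - *(const int *)q;
    }

    int main(void) {
        int *shuffled = malloc(N * sizeof *shuffled);
        int *sorted   = malloc(N * sizeof *sorted);
        srand(42);
        for (int i = 0; i < N; i++)
            shuffled[i] = sorted[i] = rand() % 256;
        qsort(sorted, N, sizeof *sorted, cmp_int);

        clock_t t0 = clock();
        long s1 = sum_over(shuffled, N, 128);
        clock_t t1 = clock();
        long s2 = sum_over(sorted, N, 128);
        clock_t t2 = clock();

        /* Identical sums; the times usually differ noticeably. */
        printf("shuffled: %ld (%.0f ms)\n", s1,
               1000.0 * (t1 - t0) / CLOCKS_PER_SEC);
        printf("sorted:   %ld (%.0f ms)\n", s2,
               1000.0 * (t2 - t1) / CLOCKS_PER_SEC);
        free(shuffled);
        free(sorted);
        return 0;
    }
    ```

    The effect varies by CPU and compiler flags, which is exactly why hand-counting instructions stopped being a reliable optimization strategy.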

    So all in all, it's better if programmers leave that stuff alone if we can get away with it.

    ----
    I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
    -- Schemer

    Note: All code is untested, unless otherwise stated

      I've studied some hand-optimized ASM compared to the output of GCC, and while the hand-optimized can shave off an instruction or two, it will suffer more if the processor makes a branch misprediction.

      That just means the guys that optimised the code generators of the GCC compiler did an excellent job.

      So all in all, it's better if programmers leave that stuff alone if we can get away with it.

      I bet everyone who uses the GCC compiler is glad that its programmers didn't take a hands-off attitude.

      I guess it comes down to where in the food chain your code sits. If you know it will always be top predator--consuming memory and cycles--then you can afford to apply no bounds, nor waste effort trying to curtail its appetite. However, if your code needs to live in a competitive environment sharing limited resources, and especially if your code lives only part way up the food chain (i.e. libraries and modules), then your efforts to limit the consumption by your code will have knock-on benefits for every run of every top predator written to use it.


      Examine what is said, not who speaks.
      "Efficiency is intelligent laziness." -David Dunham
      "When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller
      If I understand your problem, I can solve it! Of course, the same can be said for you.

        That just means the guys that optimised the code generators of the GCC compiler did an excellent job.

        I bet everyone who uses the GCC compiler is glad that its programmers didn't take a hands-off attitude.

        Yup. The GCC people work hard so the rest of us don't have to.

        ----
        I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
        -- Schemer

        Note: All code is untested, unless otherwise stated
