PerlMonks  

Re: pissed off about functional programming

by spurperl (Priest)
on Apr 24, 2005 at 13:43 UTC ( #450955 )


in reply to pissed off about functional programming

It appears that in the world of programming, something comes up every few years and is believed by many to be the "silver bullet": OOP, Agile methodologies, and so on. But generally, a lot of hype is generated for little merit.

There is no silver bullet, but from each such hype we learn. OOP isn't a silver bullet, but we do gather a few useful techniques and design ideas from it. Agile/XP isn't a silver bullet, but it also has many interesting ideas.

Let's just say that if a new "silver bullet" is on its way now, I sure hope it will be FP - it has marvelous lessons to teach any programmer.

Update: I forgot to bring your attention to the fact that Church's thesis certainly doesn't say that "all computer languages are equivalent". A more or less formal definition would be: "any real-world computation can be translated into an equivalent computation involving a Turing machine".


Re^2: pissed off about functional programming
by merlyn (Sage) on Apr 24, 2005 at 14:03 UTC
Re^2: pissed off about functional programming
by mstone (Deacon) on Apr 25, 2005 at 15:27 UTC
    It appears that in the world of programming, something comes up every few years and is believed by many to be the "silver bullet".

    The problem is that we had one. The shift from programming in assembly language to programming with high level languages produced an order-of-magnitude increase in programmer productivity. And since it happened once, there's always the vague hope that it might happen again.

    Let's just say that if a new "silver bullet" is on its way now, I sure hope it will be FP - it has marvelous lessons to teach any programmer.

    Sorry. We've already been using FP long enough to know that it isn't the next silver bullet. There is a well-defined body of problems which FP handles really well, but there's also a well known body of problems FP doesn't handle so well.

    More to the point, FP hasn't lived up to the vision John Backus stated in 1978, of creating a sort of algebra for software.. a world where we could build ever more complicated programs by hooking existing programs together in well-defined ways. There seems to be an upper limit on how much stuff we can integrate before the result becomes too domain specific to be useful for further integration. There's also a problem with each new level of integration leaving us handcuffed to a new form of data representation, which also limits further combination. They're basically the same problems that kept OOP from being the next silver bullet. There may be a way to crack those barriers, but neither FP nor OOP has done it so far.

    Doesn't mean they're useless.. far from it. They just aren't the conceptual nuke that will boost us to the next 10x level of productivity.

    I forgot to bring your attention to the fact that Church's thesis certainly doesn't say that "all computer languages are equivalent". A more or less formal definition would be: "any real-world computation can be translated into an equivalent computation involving a Turing machine".

    Okay, I admit to doing a hand-wave with regard to Church's thesis. What actually happened is this:

    In 1934, Kurt Gödel defined the theory of mu-recursive functions. It's an elegant theory that explains how to create new functions by combining existing ones. It uses three basic functions: zero(), next(), and project() (selecting a specific item from an N-tuple), and three operations: function composition, primitive recursion, and mu-recursion. The set of functions that can be produced by applying those operations to the base functions is called 'the set of general recursive functions'. In mathematical terms, this was basically a Theory of Everything.
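    The base functions and operations described above can be sketched directly in code. Here is a minimal illustration (in Python, purely for concreteness; the names are mine, not from any of the papers cited):

```python
# Sketch of the mu-recursive building blocks: three base functions
# (zero, successor, projection) and three operations (composition,
# primitive recursion, mu-recursion).

def zero(*args):
    return 0

def succ(n):
    return n + 1

def proj(i):
    """project(): select the i-th item (0-based) from an N-tuple."""
    return lambda *args: args[i]

def compose(f, *gs):
    """Composition: (f . g1..gk)(args) = f(g1(args), ..., gk(args))."""
    return lambda *args: f(*(g(*args) for g in gs))

def prim_rec(base, step):
    """Primitive recursion on the first argument:
       h(0, x) = base(x);  h(n+1, x) = step(n, h(n, x), x)."""
    def h(n, *xs):
        acc = base(*xs)
        for i in range(n):
            acc = step(i, acc, *xs)
        return acc
    return h

def mu(pred):
    """Mu-recursion: the least n with pred(n, x) == 0.
       This is the unbounded search that may loop forever."""
    def f(*xs):
        n = 0
        while pred(n, *xs) != 0:
            n += 1
        return n
    return f

# Addition built only from the pieces above: add(n, x) = n + x.
add = prim_rec(lambda x: x, lambda i, acc, x: succ(acc))
```

    The mu operator is what lifts the system from the primitive recursive functions (always terminating) to the full general recursive functions (partial, possibly non-terminating) - which is exactly where it meets Turing machines.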

    In 1933, Alonzo Church published the rules for lambda calculus. In '36, he published a paper{1} where he speculated, but couldn't prove, that the set of functions generated by lambda calculus was actually the set of general recursive functions. About the same time, Kleene{2} and Post{3} published papers approaching the same problem from different angles, and Alan Turing came out with a paper{4} that proved the Turing machine was computationally equivalent to lambda calculus.. or more precisely, that the two are interdefinable, meaning you can define either one in terms of the other. That's the standard mathematical way of showing that two systems are fundamentally the same, despite any apparent differences in their crunchy candy shells.

    {1} - with the whizzer of a title, _A note on the Entscheidungsproblem_

    {2} - _General recursive functions of natural numbers_

    {3} - _Finite combinatory processes - formulation 1_

    {4} - _On computable numbers, with an application to the Entscheidungsproblem_ -- I think they just liked saying that word.

    Church is the one who defined what he called 'effective calculability' (which we call 'computability' today) and linked lambda calculus to the mu-recursive functions. Turing was the one who actually linked Turing machines to lambda calculus. His paper served as the template for bringing all the pieces together into a unified theory of computation.
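    To make the lambda-calculus side of that equivalence concrete, here is a small sketch (again in Python, for readability) of Church numerals - numbers encoded as nothing but functions, with arithmetic built from application alone:

```python
# Church numerals: the number n is "the function that applies f n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Arithmetic falls out of composition: add applies f m times, then n times;
# mul applies "apply f n times" m times.
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul = lambda m: lambda n: lambda f: m(n(f))

def to_int(n):
    """Decode a Church numeral by applying (+1) to 0."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
```

    Everything here is a pure, one-argument function - the same austere setting Church was working in when he conjectured it covered all of effective calculability.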

    So.. it was wrong of me to say that Church's thesis 'explicitly' links lambda calculus to Turing machines. I might've gotten away with it if I'd said 'Church-Turing thesis', though. "The theory of computability says ..." would definitely have worked.

      The shift from programming in assembly language to programming with high level languages produced an order-of-magnitude increase in programmer productivity.
      ...snip...
      Sorry. We've already been using FP long enough to know that it isn't the next silver bullet.
      Hmm. Are you sure there isn't another 10x productivity boost from functional languages? Or are you just refusing to see it? Take a web browser. Code one up in C. Then do the same thing in O'Caml. There you go.
        Except that C isn't an HLL, it's assembly language. Your experiment would only substantiate his point.

        Why bother with C? That just invites arguments about whether it's fair for me to use gecko instead of writing my own layout engine from scratch.

        O'Caml supports both functional and imperative programming, so we can eliminate all that quibbling by doing both versions in the same language and using the same libraries. Then we can be really sure that we're seeing a valid comparison between the functional and imperative styles.

        So.. you spend five weeks writing a web browser in a strictly O'Caml functional style, then I'll see if I can write an O'Caml imperative version that matches your feature set in one year.

        Or how about you do your version in O'Caml, and I'll do my version in Java? Again, you get five weeks, and I get a year.

        If you want to compare library sets, we'll compare library sets. If you want to compare languages, we'll compare languages. Either way, I think you'll have a hard time demonstrating even a consistent 1.25x increase in productivity strictly from using the functional style.
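        For a sense of what such a same-language comparison looks like in miniature, here is one task written in both styles (a Python sketch; the task and function names are invented for illustration, not a claim about either side of the bet):

```python
from functools import reduce

# One task, two styles: word-frequency counting.

def word_counts_imperative(text):
    """Imperative style: mutate an accumulator in a loop."""
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

def word_counts_functional(text):
    """Functional style: fold over the words, building fresh dicts."""
    return reduce(
        lambda acc, word: {**acc, word: acc.get(word, 0) + 1},
        text.lower().split(),
        {},
    )
```

        Both produce identical results; whether the functional version buys a 1.25x productivity gain, let alone 10x, is exactly the question being argued above.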

      Let me get this straight. You're saying that Perl isn't 100x more productive than assembly?
      Sorry. We've already been using FP long enough to know that it isn't the next silver bullet. There is a well-defined body of problems which FP handles really well, but there's also a well known body of problems FP doesn't handle so well.

      I don't think that's the point :-) I quite agree that FP is no more a silver bullet than any other development style. However, IMHO, FP has yet to have its "silver bullet" moment in the limelight - which means that the really nice things that the FP style can give you still haven't been absorbed by the world at large.

      Before the Java "silver bullet" I was always having arguments with people who said automatic memory management was impractical, and that VM-based platforms would never scale for real-world use - despite the fact that people had been using them for years, decades even. Post-Java these discussions have become a rarity.

      In fact I think that this will be the lasting legacy of Java - it'll be the language that made garbage collection and VMs popular.

      I still encounter people who think that useful bits from the FP toolbox, like lambdas, map, currying, etc. are a complete waste of time since they're not in their toolbox and they've never used them in anger. An FP language could do with a bit of the silver bullet treatment just to get useful concepts out there and into people's heads.
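      For anyone who hasn't used those toolbox items in anger, here's roughly what they look like in practice (a Python sketch; `log` and `warn` are made-up names for illustration):

```python
from functools import partial

# A lambda passed to map: apply an anonymous function across a list.
squares = list(map(lambda n: n * n, [1, 2, 3, 4]))

# Currying (strictly, partial application) via functools.partial:
# specialize a two-argument function by fixing its first argument.
def log(level, msg):
    return f"[{level}] {msg}"

warn = partial(log, "WARN")  # a one-argument logger, level pre-filled
```

      None of this is exotic - which is rather the point: these ideas travel well even into languages nobody would call functional.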

        Adding to that: Automatic memory management is the primary reason Java has tons (more than any other platform or language that I'm aware of) of readily sharable open source libraries, and C, C++, etc. have very few by comparison.
