in reply to Evolving formulae

While this is of course cool, curve fitting (and techniques like Taylor series) seems a better way to attack this problem, at least when dealing with functions.
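To show what I mean by "better ways": for a straight line, ordinary least squares gives you the best-fit coefficients in closed form, no evolution needed. A minimal sketch in plain Python (the data points are made up for illustration):

```python
# Ordinary least-squares fit of y = a*x + b to sample points,
# using the closed-form normal-equation solution for a line.
def fit_line(points):
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Noise-free data on the line y = 2x + 1 recovers the coefficients exactly.
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
a, b = fit_line(data)
```

No mutation, no generations: one pass over the data and you're done -- which is the whole point about brute force versus mathematics.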

I remember reading long ago that "The two worst ways to solve a problem are genetic algorithms and neural networks" -- where "worst" means "slowest". This isn't a knock against GA and NN; it's the truth, as written by AI purists themselves.

Loosely: you use the former (GA) when you know what "closer to right" means, but you don't know how to make something "more right". You use the latter (NN) when you know what the right answer is, but you don't know how to get there.

The day we consign math to genetic algorithms, IMHO, is the day we stop understanding math. If you are trying to fit a function to data, there are better ways. Your write-up is of course cool and well worth the ++; just be advised that the brute-force approach isn't mathematics anymore.

Just like an infinite number of monkeys with typewriters will eventually come up with a Shakespearean tragedy, we could try to have today's fast computers mutate a formula until it gets close to a predefined goal.
FYI -- when the predefined goal is the numerical value of pi, you already know the answer, so you are there before you start. Hence the quote above about better methods for solving a problem. Pi is probably a very bad test case; curve fitting would serve as a better example and would have a more interesting fitness function.
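To make that concrete: when the goal is the target value itself, the fitness function degenerates into "distance from the answer", and the whole search is a roundabout way of writing down a number you already have. A toy mutate-and-select loop (a hypothetical sketch, not the node's actual code -- the mutation width and iteration count are arbitrary choices):

```python
import math
import random

random.seed(1)  # fixed seed so the run is reproducible

def fitness(x):
    # "Closer to right" is just distance from the known answer --
    # which is exactly why pi is a degenerate goal: we already have it.
    return -abs(x - math.pi)

best = 0.0
for _ in range(20000):
    child = best + random.gauss(0, 0.1)  # mutate the current candidate
    if fitness(child) > fitness(best):   # keep the child only if it improved
        best = child
```

Twenty thousand mutations to rediscover a constant we fed in as the goal -- compare that with simply writing `math.pi`.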

Replies are listed 'Best First'.
Re: Re: Evolving formulae
by tsee (Curate) on Feb 26, 2004 at 09:19 UTC

    I wholeheartedly agree that the example given is nothing to write home about. Curve fitting is about what I had in mind when I first started hacking this, but since my periods of spare time are getting rare, I chose to aim lower about 1/3 of the way through.

    What inspired me in the first place was an article in the German equivalent of Scientific American ("Spektrum der Wissenschaft") which demonstrated how somebody had written a reasonably similar program that evolved *code* in a simple assembly language (which ran on a virtual machine). The small programs were rewarded for performing logic functions like a^b (XOR), etc.

Re: Re: Evolving formulae
by awwaiid (Friar) on Feb 27, 2004 at 15:42 UTC
    The day we consign math to genetic algorithms, IMHO, is the day we stop understanding math.

    I suppose that depends on two things. First, how well the programs are able to describe their new discoveries to us. This is already the case for all of mathematics: just because you personally didn't solve some problem doesn't mean you can't understand the solution, and solving a problem yourself doesn't mean you can make everyone else understand it.

    Second, you (and everyone should do this exercise with us) need to decide whether you consider mathematics (or programming, for that matter!) to be a discipline of creation or discovery. Either people are making this stuff up and it wouldn't exist otherwise, or we are just finding out what is already out there.

    I personally am a discoverist!

    Hmm... actually, now that I think about it, that second point could well be irrelevant. Who cares whether the algorithms are discovering mathematics or creating it? Oh well, it's a fun thought anyway.

    What does everyone think... are programs discovered or created?

      Personally, I prefer thinking of programs being created - I seem to work so much more productively that way ;)

      Fun aside, I think mathematics is neither completely human-created nor completely discovered; it's a mixture of the two. Now, if Hilbert had been right and we could deduce everything from a created set of axioms, I'd say we created the axioms and discovered the rest.

      Unfortunately, Hilbert was wrong, and on the grand scale of things, to me, mathematics is a steaming pile of dung if you consider that perfectly logical things contradict themselves. Thus, mathematics must have been created by humans.


        ... if you consider that perfectly logical things contradict themselves.

        I think there might be a flaw in your argument! :)

        Actually you have a wonderful point.