Just another Perl shrine PerlMonks

### Re^3: Random Math Question

 on Oct 11, 2005 at 03:30 UTC ( #499017 )

in reply to Re^2: Random Math Question

At the risk of devolving into a purely theoretical, impractical exercise (if it's not already too late (which it is)), here goes nothing ;)

There are two cases...

1. If pseudorandom generation is impossible, then we can tell (by sampling its output) how much true randomness any algorithm uses (call this its true entropy). In this case, the Mersenne Twister is nowhere near big enough. The MT has 2^19937 configurations, so a single MT has at most 19937 bits of entropy. That is nowhere near the 1.5 million bits required to sample the space in question, so there would be a polynomial-time algorithm able to tell (by sampling its output) whether or not your algorithm was using the MT.
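To make the state-counting concrete (a small sketch of my own, not from the thread): picking a uniform random permutation of n items needs log2(n!) bits of entropy, and the MT's 19937 bits of state run out surprisingly early, between 2080 and 2081 items.

```python
from math import lgamma, log

MT_STATE_BITS = 19937  # the Mersenne Twister has at most 2^19937 internal states

def perm_entropy_bits(n):
    """Bits of entropy needed to choose a uniform permutation of n items:
    log2(n!) = ln(n!) / ln(2), with ln(n!) computed via lgamma(n + 1)."""
    return lgamma(n + 1) / log(2)

# By the pigeonhole principle, one MT seeding cannot reach every
# permutation once log2(n!) exceeds 19937 bits; the crossover falls
# between 2080 and 2081 items.
print(perm_entropy_bits(2080))  # just under 19937
print(perm_entropy_bits(2081))  # just over 19937
```

The same arithmetic applies to any sample space: compare log2 of its size against the generator's state bits.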

2. On the other hand, if the MT is really pseudorandom in the strong sense of my previous comment, then we can talk about not only its true entropy but also its computational entropy, that is, the amount of entropy it can "fool" all polynomial-time algorithms into thinking it uses.

From what I recall, if pseudorandom generation turns out to be possible in this strong sense, it is quite reasonable for a function's computational entropy to be much higher (say, by a squared factor) than its true entropy. In this case, MT could be sufficient to sample the desired space.

Essentially, if pseudorandom generation is possible, then bits from the pseudorandom generator are completely interchangeable with truly random bits in the polynomial-time realm. If there were ever a case where it made a (statistically) significant difference in an algorithm's output, then that would already give you a distinguishing algorithm, contradicting the definition of the pseudorandom generator! Neat, huh?
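For flavour, here is a toy distinguisher of my own construction (nothing to do with the MT specifically): a distinguisher is just an efficient test that behaves measurably differently on one bit source than on another. A generator with even a small bias is trivially told apart from fair coin flips by a frequency count.

```python
import random

def bit_stream(n, one_prob, seed):
    """n bits, each 1 with probability one_prob (a deliberately biased 'generator')."""
    rng = random.Random(seed)
    return [1 if rng.random() < one_prob else 0 for _ in range(n)]

def looks_biased(sample, threshold=0.52):
    """Polynomial-time distinguisher: flag the stream if the fraction
    of ones strays past the threshold."""
    return sum(sample) / len(sample) > threshold

# With 100,000 samples the standard error is about 0.0016, so a
# 0.55-biased stream is flagged and a fair one is not, with
# overwhelming probability.
print(looks_biased(bit_stream(100_000, 0.55, seed=1)))  # True
print(looks_biased(bit_stream(100_000, 0.50, seed=2)))  # False
```

A strong pseudorandom generator is, by definition, one for which no such efficient test exists at all.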

Re^4: Random Math Question
by BrowserUk (Pope) on Oct 11, 2005 at 05:58 UTC

Thanks. For your amusement, I came across this paragraph whilst looking around for further reading on entropy (my emphasis):

Another major source of confusion about entropy change as the result of simply rearranging macro objects comes from information theory "entropy" [2]. Claude E. Shannon's 1948 paper began the era of quantification of information, and in it he adopted the word "entropy" to name the quantity that his equation defined [2]. This occurred because a friend, the brilliant mathematician John von Neumann, told him "call it entropy; no one knows what entropy really is, so in a debate you will always have the advantage" [3]. Wryly funny for that moment, Shannon's unwise acquiescence has produced enormous scientific confusion due to the increasingly widespread usefulness of his equation and its fertile mathematical variations in many fields other than communications [4, 5].

Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
"Science is about questioning the status quo. Questioning authority".
The "good enough" may be good enough for now, and perfection may be unobtainable, but that should not preclude us from striving for perfection, when time, circumstance or desire allow.
