
session keys: how far to take it

by saskaqueer (Friar)
on Dec 28, 2004 at 11:22 UTC ( #417743=perlmeditation )

I know that there are at least a few others besides me here who do the whole web applications thing. I'm pretty confident that there are also other areas of programming that require a nice randomly generated key, for whatever purpose. The question is, how should one generate these keys in a fast and efficient manner, while at the same time ensuring that the keys are extremely difficult to guess or compute?

Methods vary from person to person of course, which is the primary reason I'm posting this to begin with. I'm interested to see which methods people use to generate keys for their applications. Everything from simple methods to the extremes. How much work isn't enough, and how much work in generating such keys is considered overdone and redundant?

Quick examples and tests follow. Notice how each of the first three examples hit repeat hashes after several thousand loops. My own code example is still running steadfast.

# CGI::Session::SHA1
# repeat session key hit at 11435
use Digest::SHA1;
my $key = Digest::SHA1->new()->add( $$, time(), rand(9999) )->hexdigest();

# CGI::Session::MD5
# repeat session key hit at 10481
use Digest::MD5;
my $key = Digest::MD5->new()->add( $$, time(), rand(9999) )->hexdigest();

# Apache::Session::Generate::MD5
# repeat session key hit at 4398
use Digest::MD5 qw( md5_hex );
my $key = md5_hex( md5_hex( time() . {} . rand() . $$ ) );

# My own concoction
# currently at count 80699, no repeat keys yet
use Digest::SHA qw( sha224_base64 );
my $key = sha224_base64( join( '', map { chr( (1..127)[rand 127] ) } 1..1000 ) );
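For reference, repeat counts like the ones above come from a brute-force test: generate keys in a loop, remember each one, and stop at the first duplicate. A minimal sketch of such a harness (bounded at 50,000 iterations here, and using the Apache::Session::Generate::MD5 recipe with the core Digest::MD5 module as the generator):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::MD5 qw( md5_hex );

# Generate keys until one repeats (or we give up), recording
# every key we have seen so far in a hash.
my %seen;
my $repeat_at = 0;

for my $i ( 1 .. 50_000 ) {
    my $key = md5_hex( md5_hex( time() . {} . rand() . $$ ) );
    if ( exists $seen{$key} ) {
        $repeat_at = $i;
        last;
    }
    $seen{$key} = 1;
}

print $repeat_at
    ? "repeat session key hit at $repeat_at\n"
    : "no repeats in 50,000 keys\n";
```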

Re: session keys: how far to take it
by BrowserUk (Pope) on Dec 28, 2004 at 12:22 UTC

    This will generate a million unique 128-bit random numbers in around 20 seconds, and happily goes up to 10 million (and I suspect very, very much higher, but I ran out of memory) without duplication.

    If you want them hexified, use unpack as in the commented-out print statement.

    #! perl -slw
    use strict;
    use Time::HiRes qw[ time ];
    use Math::Random::MT qw[ rand srand ];

    $| = 1;
    our $MAX ||= 1000000;

    srand( time );

    my %hash;
    for ( 1 .. $MAX ) {
        my $rand = pack 'N4', map{ rand 0xffffffff } 1..4;
        die "Got duplicate after $_ attempts" if exists $hash{ $rand };
        $hash{ $rand } = ();
        # print unpack 'H*', $rand;
        printf "\r$_\t" unless $_ % 1000;
    }

    __END__
    [12:20:07.29] P:\test>417743 -MAX=1000000
    1000000

    Examine what is said, not who speaks.
    Silence betokens consent.
    Love the truth but pardon error.

      If I understand the math at all (and it is quite possible I don't!), the odds against the above producing a duplicated ID at any one choice are something like: 0.000000000000000000054

      Which, when compared with your odds of being hit by falling space debris: 0.0000000000000166

      Or getting struck by lightning: 0.000001763

      Or that you would be one of those caught up in the recent tsunami: 0.000008333

      Seem pretty good odds to me.
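      For anyone wanting to sanity-check figures like these, the usual birthday-bound approximation is p ≈ n²/2m for n keys drawn uniformly at random from a space of m values. A quick sketch of the formula itself (not a reconstruction of the exact calculation above):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Birthday-bound approximation: probability of at least one
# collision among $n values drawn from a space of size $m.
# Accurate when $n is much smaller than sqrt($m).
sub collision_prob {
    my ( $n, $m ) = @_;
    return ( $n * $n ) / ( 2 * $m );
}

my $space = 2 ** 128;    # 128-bit session keys
for my $n ( 1_000_000, 10_000_000 ) {
    printf "%d keys: p ~ %.3g\n", $n, collision_prob( $n, $space );
}
```

      Even at ten million 128-bit keys, the approximation stays vanishingly small, which matches BrowserUk's experience of never hitting a duplicate.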

Re: session keys: how far to take it
by daddyefsacks (Pilgrim) on Dec 28, 2004 at 13:19 UTC

    I'm currently trying out Data::UUID as a way to generate unique ids for an application I'm working on after reading about it here.

    So far it has suited my needs very well.
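    For reference, typical Data::UUID usage looks like this (a sketch; Data::UUID is a CPAN module, not core, so it needs installing first):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Data::UUID;    # CPAN module, not core

my $ug = Data::UUID->new;

# create_str() returns the canonical 36-character string form,
# e.g. "4162F712-1DD2-11B2-B17E-C09EFE1DC403"
my $uuid = $ug->create_str;
print "$uuid\n";

# Successive UUIDs from the generator are unique -- but, as
# noted below, unique is not the same as unguessable.
my $another = $ug->create_str;
print "distinct\n" if $uuid ne $another;
```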

      Note that although UUIDs are guaranteed unique, they do not guarantee unguessability.

      "There is no shame in being self-taught, only in not trying to learn in the first place." -- Atrus, Myst: The Book of D'ni.

        Yep, I should have mentioned that; I'm not using the module for anything like session keys, where I'd have to worry about someone guessing my keys.
Re: session keys: how far to take it
by exussum0 (Vicar) on Dec 28, 2004 at 13:50 UTC
    People seem to be addressing the problem of speed via brute force. Why not just pregenerate the keys and queue them up? If you are getting millions of new sessions an hour, I'm sure you can get enough iron to pregenerate that many.

    A good function that goes through a range of numbers and doesn't repeat until some huge N isn't hard. Using that as a seed to SHA-1, MD5, or even RC4 isn't killer. Queueing isn't hard. There you go. :) And if you are smart about your function, the next key is hard to guess, but THAT is a science all in itself, and a harder one. Making sure the scheme can't be analyzed from one session key to the next is also hard, but it is doable as well.
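    One way to read this suggestion: keep a pool of keys pregenerated ahead of time, so handing one out at request time is just a shift(). A minimal in-process sketch (a plain counter stands in for the "smart" non-repeating function, and the core Digest::SHA module does the digest step):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::SHA qw( sha1_hex );

# A pregenerated key queue: fill a pool in batches, so serving
# a session key at runtime is a cheap shift() off the array.
my @pool;
my $seed = 0;

sub refill_pool {
    my ( $n ) = @_;
    # Assumption: a plain counter as the non-repeating seed; a
    # smarter hard-to-predict sequence would slot in here.
    push @pool, sha1_hex( 'pool:' . $seed++ ) for 1 .. $n;
}

sub next_key {
    refill_pool( 1_000 ) unless @pool;    # top up when the queue runs dry
    return shift @pool;
}

refill_pool( 10 );
print scalar @pool, " keys queued; first is ", next_key(), "\n";
```

    In a real deployment the refill would happen out-of-band (a cron job or separate process feeding a shared queue), which is the point of the suggestion: the expensive generation never sits on the request path.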

    Give me strength for today.. I will not talk it away..
    Just for a moment.. It will burn through the clouds.. and shine down on me.

Re: session keys: how far to take it
by Anonymous Monk on Dec 28, 2004 at 12:10 UTC
    It depends on who you can trust. If you can trust anyone with root (or physical) access to the box, you could simply use a counter and a secret key. Concatenate the secret key with the counter, and make a digest of them. Instead of a counter, you could use the timestamping feature of a database (anything that gives you a unique number - you don't have to care whether it's guessable).

    What you are doing with this technique is basically combining two things, each of them supplying one of the requirements you need. The secret key gives you something that's hard to guess - and the counter or timestamp gives you uniqueness.
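    The technique described above is only a couple of lines with the core Digest::SHA module (the secret value and the use of a plain counter are placeholders for whatever the server actually keeps private and unique):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::SHA qw( sha1_hex );

# Hard-to-guess half: a secret that never leaves the server.
my $secret = 'replace-with-a-real-secret';    # placeholder

# Uniqueness half: any never-repeating number will do -- a
# counter here, but a DB sequence or timestamp works equally.
my $counter = 0;

sub new_session_key {
    # Separator avoids accidental collisions between
    # ("secret1", 23) and ("secret", 123)-style inputs.
    return sha1_hex( $secret . ':' . ++$counter );
}

print new_session_key(), "\n" for 1 .. 3;
```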

Re: session keys: how far to take it
by perrin (Chancellor) on Dec 28, 2004 at 22:20 UTC
    It is not important for session keys to be hard to guess. If you include a MAC (probably using SHA1) in your cookie, knowing a valid session key will not be enough information to forge a valid session. Instead, you should focus on making one that will definitely not repeat, because a repeat would be a potential disaster for your application. I recommend using mod_unique_id (an Apache module) or a sequence from a database, because both are good enough for use in a cluster of machines.

    Nearly all of the "hard to guess" key generation tools have problems with duplicates, as you have seen. You also have to be careful when making something out of concatenated pieces like the process ID, which can vary in length: you can get accidental repeats unless you force each piece to a specific size or put a separator character between them. Really, it's much easier to use mod_unique_id or a database.
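    A sketch of the MAC idea, using the hmac_sha1_hex function from the core Digest::SHA module (the secret and the cookie layout here are illustrative, not perrin's exact scheme):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::SHA qw( hmac_sha1_hex );

# The cookie carries both the session id and a MAC over it, so
# a guessed or enumerated id alone is useless to an attacker.
my $server_secret = 'replace-with-a-real-secret';    # placeholder

sub issue_cookie {
    my ( $session_id ) = @_;
    my $mac = hmac_sha1_hex( $session_id, $server_secret );
    return "$session_id:$mac";
}

sub verify_cookie {
    my ( $cookie ) = @_;
    my ( $session_id, $mac ) = split /:/, $cookie, 2;
    return hmac_sha1_hex( $session_id, $server_secret ) eq $mac
        ? $session_id
        : undef;
}

my $cookie = issue_cookie( 'abc123' );    # id from e.g. a DB sequence
print defined verify_cookie( $cookie ) ? "valid\n" : "forged\n";
```

    With this in place the session id itself only needs to be unique, so a predictable database sequence or mod_unique_id value is fine.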

Node Type: perlmeditation [id://417743]
Approved by BrowserUk
Front-paged by Arunbear