Re^3: Store a huge amount of data on disk

by erix (Vicar)
on Oct 18, 2011 at 17:08 UTC ( #932198=note )


in reply to Re^2: Store a huge amount of data on disk
in thread Store a huge amount of data on disk

Whether it is fast enough depends, I think, as much on the disks in your system as on the software you use to write to them.

From what you mentioned, I suppose the total size to be something like 300 GB? It's probably useful/necessary (for postgres, or any other RDBMS) to have some criterion (date, perhaps) by which to partition.
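
A minimal sketch of what date-based partitioning could look like from Perl/DBI (postgres partitions via table inheritance plus CHECK constraints; the events table, its columns and the connection details here are made up):

    use strict;
    use warnings;
    use DBI;

    # Connect to a hypothetical database; adjust DSN and credentials.
    my $dbh = DBI->connect('dbi:Pg:dbname=bigstore', 'user', 'secret',
                           { RaiseError => 1, AutoCommit => 1 });

    # Parent table; it holds no rows itself.
    $dbh->do(q{
        CREATE TABLE events (
            id      bigserial PRIMARY KEY,
            created date NOT NULL,
            payload text
        )
    });

    # One child table per month. The CHECK constraint lets the planner
    # skip partitions that cannot match a date-qualified query
    # (constraint exclusion).
    $dbh->do(q{
        CREATE TABLE events_2011_10 (
            CHECK ( created >= DATE '2011-10-01'
                AND created <  DATE '2011-11-01' )
        ) INHERITS (events)
    });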

(FWIW, a 40 GB table that we use intensively, accessed by unique id, gives access times of less than 100 ms. The system has 32 GB of RAM and an 8-disk RAID 10 array.)

Btw, postgresql *does* have a limit on text column values: 1 GB, where you need 2 GB. But I suppose that could be worked around by splitting the value, or something like that.
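
For the splitting, a minimal sketch (the doc_chunks table, its layout and the 256 MB chunk size are all made up; the point is just to keep every stored piece well under the 1 GB column limit):

    use strict;
    use warnings;

    # Keep chunks well below postgres' 1 GB text limit.
    my $CHUNK_SIZE = 256 * 1024 * 1024;

    sub store_big_value {
        my ($dbh, $doc_id, $value) = @_;
        my $sth = $dbh->prepare(
            'INSERT INTO doc_chunks (doc_id, seq, chunk) VALUES (?, ?, ?)'
        );
        my $seq = 0;
        for (my $off = 0; $off < length $value; $off += $CHUNK_SIZE) {
            $sth->execute($doc_id, $seq++, substr($value, $off, $CHUNK_SIZE));
        }
    }

    # Reassembly is an ordered concatenation, e.g.
    #   SELECT string_agg(chunk, '' ORDER BY seq)
    #   FROM doc_chunks WHERE doc_id = ?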


Re^4: Store a huge amount of data on disk
by Sewi (Friar) on Oct 18, 2011 at 18:41 UTC
    Thanks for those numbers. A 1 GB upper limit would be OK too, as we don't want to reach that limit, though it might happen. I expect I'll need to split at some high limit anyway, so 1 GB or 2 GB doesn't matter.
