Data compression by 50%+: is it possible?
by baxy77bax (Deacon) on May 11, 2019 at 20:51 UTC
baxy77bax has asked for the wisdom of the Perl Monks concerning the following question:
I have a very simple dataset: ints from 1..90, each occurring only once in a given line. They are not consecutive (i.e. there are no runs in the dataset). Example:
What I am trying to do is reduce the generated dataset by 50% or more (extra savings never hurt). It needs to be a lossless compression scheme; the order does not necessarily need to be preserved, though it would be cool to keep it. Is this possible? Any trick is allowed, no matter how complicated or simple: if it gets 50%+ and the data can be reconstructed, the goal has been achieved.
Thank you!!
no need for code if you do not feel like coding...
Just to settle the "is the code right" issue: the code is right. I have an automaton that, depending on some input, spits out ints (coded exactly as I demonstrated: ASCII 33+n, up to 33+90). It does this 9 times, in batches of 10. Sometimes it even skips an output batch, but this is extremely rare. The output it produces is large (3-4 GB) and I was just wondering whether it is an option to shrink it. gzip and other compressors are not an option, as the system where it resides is old and has only a bare minimum of additional tools on it: basically a naked CPU with C and Perl.