in reply to
Re^6: On showing the weakness in the MD5 digest function and getting bitten by scalar context
in thread On showing the weakness in the MD5 digest function and getting bitten by scalar context
People are rat-holing on "I can make two random-data chunks which have the same MD5 hash." Sure, that's a collision attack, and it's an interesting result.
But my point is (and has been) that the complexity is far higher if you want to (1) engineer a new data stream which matches the hash of a *given* original stream (a second-preimage attack, not a collision between two streams you control), while (2) maintaining a plausible protocol and formatting.
- Forge a new JPEG image (same hash as the original image) which has no broken data fields.
- Forge a new tarball which can still be decompressed.
- Forge a new text file that reads as plausible prose, not random line noise or dictionary gibberish.
If you can do that, even once, THEN I will be impressed and will reconsider the value of MD5 as a distribution fingerprint.
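For the record, the fingerprinting I mean is the routine check a downloader does against a published digest. A minimal sketch with md5sum (the file names here are made-up stand-ins, not any real distribution):

```shell
# Publisher side: record the MD5 fingerprint of the release file.
printf 'pretend this is a tarball\n' > release.txt   # stand-in for a real tarball
md5sum release.txt > release.txt.md5                 # published alongside the file

# Downloader side: re-hash the file and compare against the fingerprint.
md5sum -c release.txt.md5                            # prints "release.txt: OK" on a match
```

An attacker who swaps in a trojaned file must beat exactly the second-preimage-plus-formatting problem above: the replacement has to hash to the already-published digest AND still unpack and run as a plausible release.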
[ e d @ h a l l e y . c c ]