Memory Full


Best crunchers in 2020

BSC * 10 Feb 2020 17:28:16

Hi and hello everyone,

I was wondering which crunchers give the best results in terms of both compression ratio and decrunch speed. I don't care if the crunching is slow, and the decrunch does not have to be blazing fast either, but the compression ratio should be excellent. I am also interested in articles covering how to improve the compression ratio based on how the input file is organized, if that's feasible.

Thanks in advance!

Beb * 10 Feb 2020 19:33:20

Maybe Aplib or LZSA2.
https://www.pouet.net/prod.php?which=81573

Hicks * 15 Feb 2020 13:40:00

Hardcore crunching & sloooow decrunching: Shrinkler for 4k (7 seconds of decrunch for The Third Kind), or Exomizer for bigger files (decrunching about 2 times slower than Aplib).

BSC * 04 Mar 2020 20:59:35

Shrinkler yields impressive results. But I will still try Exomizer, too. Any other suggestions?

ast * 17 Mar 2020 12:57:57

My personal choice would be PuCrunch. I especially love that one.

BSC * 17 Mar 2020 17:04:39

Wow, never heard of that! Thanks, will give it a shot.

ast * 17 Mar 2020 21:15:26 * Modified at 21:15:39

You're welcome, BSC.

Targhan * 27 Mar 2020 20:43:20

Is there a "definitive" cruncher for short data (from, let's say, 50 bytes to 2 KB)? Or is it "try them all and test"? Thanks.

Hicks * 28 Mar 2020 14:34:02

Try them all!
If the data is 2 KB, I think Shrinkler will be the best in all cases.
If it's 50 bytes, don't crunch it, since any decruncher will be bigger than the data itself.
If it's around 1 KB, try a cruncher with a very short decruncher like ZX7 (the smallest one?); it could beat Shrinkler once the decruncher size is counted...
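
To make the break-even concrete, here is a minimal Python sketch of the size arithmetic. The decruncher stub sizes and the 60% ratio are illustrative assumptions, not measured values for any of these tools:

```python
# Crunching only pays off when compressed_size + decruncher_size < raw_size.
# Stub sizes below are illustrative assumptions, not measured values.
CRUNCHERS = {
    "Shrinkler": 250,  # assumed stub size in bytes
    "LZSA2":     100,  # assumed
    "ZX7":        70,  # assumed; ZX7 is known for a very short decruncher
}

def worth_crunching(raw: int, compressed: int, stub: int) -> bool:
    """True if the crunched data plus its decruncher beats the raw data."""
    return compressed + stub < raw

for name, stub in CRUNCHERS.items():
    for raw in (50, 1024, 2048):
        compressed = int(raw * 0.6)  # hypothetical 60% ratio
        verdict = "crunch" if worth_crunching(raw, compressed, stub) else "keep raw"
        print(f"{name:9}  raw={raw:4}  ->  {verdict}")
```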

Targhan * 28 Mar 2020 16:34:14 * Modified at 16:34:57

Thanks. But something I didn't say is that there will be many pieces of data to decrunch at various points, so there would be one decruncher and many tiny-to-small chunks of data. As for Shrinkler: no way, it is too slow (this is for a game).
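
With one resident decruncher serving many chunks, the stub cost is paid only once, so the break-even arithmetic above changes: even tiny chunks can be worth crunching in aggregate. A quick sketch, with made-up chunk sizes and ratio:

```python
# One resident decruncher, many chunks: the stub is paid once.
chunk_sizes = [60, 200, 512, 900, 1500]  # hypothetical raw chunk sizes
stub_size = 70                           # assumed resident decruncher size
ratio = 0.7                              # hypothetical compression ratio

raw_total = sum(chunk_sizes)
crunched_total = stub_size + sum(int(s * ratio) for s in chunk_sizes)
print(f"raw total:      {raw_total} bytes")
print(f"crunched total: {crunched_total} bytes (including one decruncher)")
```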

Hicks * 29 Mar 2020 09:30:19

Take a look here: https://raw.githubusercontent.com/emmanuel-marty/lzsa/master/pareto_graph.png
I suppose LZSA2, MegaLZ, or ZX7 would do the job well... and it's easy to try them on a batch of files.
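
Something like this little harness makes the batch comparison easy. zlib is only a stand-in compressor here so that the sketch runs as-is; swap in invocations of the actual crunchers once they are installed:

```python
# Batch size comparison over a folder of data files.
# zlib at two levels stands in for the real crunchers.
import pathlib
import zlib

CANDIDATES = {
    "zlib-1": lambda d: zlib.compress(d, 1),
    "zlib-9": lambda d: zlib.compress(d, 9),
}

def batch_compare(folder: str, pattern: str = "*.bin") -> None:
    files = sorted(pathlib.Path(folder).glob(pattern))
    for name, compress in CANDIDATES.items():
        total = sum(len(compress(f.read_bytes())) for f in files)
        print(f"{name}: {total} bytes over {len(files)} files")

batch_compare(".")  # point this at the folder holding your chunks
```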

BSC * 05 Apr 2020 01:03:19 * Modified at 01:06:26

I have some data that has a lot of inherent, repetitive patterns and thus probably compresses quite well regardless of the cruncher used (at least that's my guess). I was wondering, mainly with Shrinkler in mind, whether it would pay off to pre-compress that data somehow (more generally: re-arrange it to reduce redundancy) so that my code expands/recreates it at runtime. The uncompressed binary would become smaller, even though I'd have to add the code for blowing the data back up. I figure the entropy of that pre-packed version would be higher than that of the untreated one. Do you think that's worth trying? I am sure you guys have already dealt with similar problems.

Hicks * 06 Apr 2020 15:06:57

According to my tests, the lower the entropy, the better Shrinkler's results. Consequently, whenever I took the time to pre-crunch some data, the results were always worse than before. My conclusion is that we should test the opposite: unroll code, repeat data, and the entropy will then be lower even if the file is bigger.

The lesson is that Shrinkler is much more sensitive to entropy than to raw file size (that's true for all crunchers, but Shrinkler is very, very sensitive to it).
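
One caveat on measuring this: plain order-0 entropy does not see repetition at all, so a quick LZ-style ratio (zlib below, purely as a stand-in, not a Z80 cruncher) is the more telling proxy. Here is a sketch comparing a compact layout against an unrolled/repeated one; the real test is still running Shrinkler on both versions:

```python
# Two redundancy proxies for comparing data layouts before crunching.
import math
import zlib
from collections import Counter

def order0_entropy(data: bytes) -> float:
    """Shannon entropy of the byte histogram, in bits per byte."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def zlib_ratio(data: bytes) -> float:
    """Compressed/raw size with zlib, a crude proxy for LZ redundancy."""
    return len(zlib.compress(data, 9)) / len(data)

looped   = bytes(range(64))       # compact form
unrolled = bytes(range(64)) * 64  # 64x bigger, but highly repetitive

for name, blob in (("looped", looped), ("unrolled", unrolled)):
    # Same byte histogram, so identical order-0 entropy, yet the
    # repeated layout is far more compressible for an LZ cruncher.
    print(f"{name:8}  size={len(blob):5}  "
          f"entropy={order0_entropy(blob):.2f} bits/byte  "
          f"zlib ratio={zlib_ratio(blob):.3f}")
```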

BSC * 06 Apr 2020 19:57:43

Some really interesting and helpful insights, thanks Hicks!