Archive: NSIS Compression


Hi,

There may be an undocumented command for this, but I have noticed that NSIS compresses each file individually. It would do much better if it "batched" files within a section and compressed the lot in one go (only one table for starters).

It would probably be better to "batch" by extension within each group, since similar files compress better together (text files are completely different from graphics, etc.), or to allow the user to specify where batching occurs.

If there is such an option already, please let me know. I did search the newsgroups, as I noticed that the doco appears out of date: I can't see some commands I'm pretty sure I've seen mentioned in recent posts...

Thanks


Well...
--------------------------
SetCompress auto|force|off
This command sets the compress flag, which is used by the installer to determine whether or not data should be compressed. Typically the SetCompress flag will affect the commands after it, and the last SetCompress command in the file also determines whether or not the install info section and uninstall data of the installer are compressed. If compressflag is 'auto', then files are compressed only if the compressed size is smaller than the uncompressed size. If compressflag is set to 'force', then the compressed version is always used. If compressflag is 'off', then compression is not used (which can be faster).
---------------------------

By "It would do much better if it 'batched' files" do you mean build time or space --or-- installer time/space? I was under the impression (and that doesn't mean I think I am right) that by setting the flag to 'auto' you'd get the best of both worlds. 'Force' might work well if you have a lot of similar files. And 'Off' would work well if you have files that don't compress all that well (like files that are used in a highly compressed format to begin with).


Sorry, but my files are being compressed; I made sure of that, and I tried the data block compression (whatever that is)!

The compression has an overhead (a table which is used to recreate the data); this is why a file originally 100 bytes long could get larger when "compressed". This is what the 'auto' flag is for: it ensures that if the compressed version of a file is larger, it is not used (that is, you are trying for minimal size - I'm not sure why anyone would choose 'force' - surely not as an extremely poor man's encryption?).

The larger the chunk of data, the smaller the percentage of it taken up by the table. Larger data should also produce a better, more efficient table (though depending on how the compression works, it may not look ahead far enough for this to help).

Basically I was hoping that if I had 100 small gif files in a section, then rather than possibly having 100 poorly (if at all) compressed files, paying the overhead 100 times, I could pay the overhead once and get well-compressed data too.
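
To put rough numbers on that, here's a quick Python sketch (my own illustration, nothing NSIS actually does) comparing per-file deflate against compressing the same files as one batch:
---------------------------
import zlib

# 100 tiny, near-identical fake files standing in for small gifs.
files = [b"GIF89a" + bytes(80) + bytes([i]) for i in range(100)]

per_file = sum(len(zlib.compress(f, 9)) for f in files)
batched = len(zlib.compress(b"".join(files), 9))

print("per-file total:", per_file)  # pays the stream overhead 100 times
print("batched total: ", batched)   # pays it once, reuses matches across files
---------------------------
The per-file total carries the stream overhead 100 times and can never exploit redundancy between files; the batched run pays it once.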


dbareis means that he doesn't want NSIS to compress every file separately, but instead compress groups of files at the same time - it's like creating a separate zip file for each of 1000 small files: you pay the overhead every time.
So dbareis wants them to be compressed together, thus saving overhead (and improving compression).

You could write a tool for this which would combine all the files into one file and append that to an exe which extracts the files again upon execution. But that's not the best solution to the problem.
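
A minimal sketch of such a combiner in Python, assuming a made-up length-prefixed format (the name/data framing here is purely hypothetical):
---------------------------
import struct, zlib

def pack(paths):
    # Concatenate all files into one blob, then compress the lot once.
    blob = b""
    for p in paths:
        data = open(p, "rb").read()
        name = p.encode("utf-8")
        blob += struct.pack("<II", len(name), len(data)) + name + data
    return zlib.compress(blob, 9)

def unpack(packed):
    # Reverse of pack(): yield (name, data) pairs from the blob.
    blob = zlib.decompress(packed)
    pos = 0
    while pos < len(blob):
        nlen, dlen = struct.unpack_from("<II", blob, pos)
        pos += 8
        name = blob[pos:pos + nlen].decode("utf-8")
        pos += nlen
        data = blob[pos:pos + dlen]
        pos += dlen
        yield name, data
---------------------------
The self-extracting part would then just be a stub exe that runs the unpack step on whatever is appended to it.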


This ties into the proposed BZIP2 compression, as well, because BZ2 works really well on large blocks but not on smaller files.

If the datablock could be completely reorganized to compress in 900 KB chunks instead of on a per-file basis, BZ2 would probably kick ZLIB's ass on ratio. (I say 900 KB because BZIP2 has a built-in maximum block size of 900 KB, after which it resets.)
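
The ratio claim is easy enough to test with Python's zlib and bz2 modules (in bz2, compresslevel 9 selects the 900 KB block size); the payload below is just made-up filler:
---------------------------
import bz2, zlib

# One big blob, standing in for a reorganized 900 KB datablock chunk.
chunk = (b"Some repetitive installer payload text. " * 25000)[:900 * 1024]

print("zlib:", len(zlib.compress(chunk, 9)))
print("bz2: ", len(bz2.compress(chunk, 9)))
---------------------------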

I think this is a task best left to Justin, though, because I know that the datablock gets slightly reorganized with every release of NSIS and a patch would probably break between versions.