- NSIS Discussion
- Big problem with big files...
Archive: Big problem with big files...
DEADFACE
30th January 2003 17:07 UTC
Big problem with big files...
I cannot create an installer for a big distribution.
for example:
--- part of file ---
File: Descending to: "files\e2driver" -> "$INSTDIR\e2driver"
File: "e2_d3d8_driver_mfc.dll" [compress] 199060/540736 bytes
File: Returning to: "files" -> "$INSTDIR"
File: "x_level1.ras" [compress] 115238381/123196327 bytes
File: "x_level2.ras" [compress] 66137012/70958070 bytes
File: "x_level3.ras" [compress] 95316992/102462128 bytes
File: "e2mfc.dll" [compress] 167396/442368 bytes
File: "grphmfc.dll" [compress] 47684/98304 bytes
File: "MaxPayne.exe" [compress] 17169/49152 bytes
File: "MFC42.DLL" [compress] 463287/995383 bytes
File: "MSVCIRT.DLL" [compress] 23554/77878 bytes
File: "MSVCP60.DLL" [compress] 116939/401462 bytes
File: "MSVCRT.DLL" [compress] 131857/266293 bytes
File: "rlmfc.dll" [compress] 106290/282624 bytes
File: "sndmfc.dll" [compress] 38417/98304 bytes
File: "x_data.ras" [compress] 71616378/143134507 bytes
File: "x_music.ras" [compress] 120877314/144606272 bytes
File: "x_polish.ras" [compress]
Internal compiler error #12345: error mmapping datablock to 776490929.
Note: you may have one or two (large) stale temporary file(s)
left in your temporary directory (Generally this only happens on Windows 9x).
--- part of file ---
1. x_polish.ras is 220 megabytes. The full distribution is 830 MB unpacked; packed with bzip2 it is 615 MB.
2. My temp directory is empty before running NSIS (!).
I have:
HDD: disk C: 6GB (FAT32 with 1,5GB free space) <- temp directory
other disks FAT32 +/- 36 GB on 5 partition
RAM: 128 MB
OS: Windows 98 and Windows XP (always the same error)
I use the zlib algorithm; bzip2 works, but its decompression is too slow for me.
Questions:
1. What is wrong? (FAT32 ?!)
2. Can I change default (detected by nsis c:\win98\temp)
temp directory (in NSIS of course)?
3. When 7z will be integrated in NSIS?
Suggestion:
On very big files the progress bar stops for 1-5 minutes; can someone change that?
kichik
30th January 2003 17:18 UTC
http://forums.winamp.com/showthread....733#post850733
If this guy is using FAT too, then that's the reason. It might be that Windows NT only supports large files on NTFS. It's a pretty good possibility.
You can't change the default temporary directory returned by NSIS because NSIS just takes it from the system. What you can do is ask NSIS to use the all-users shell folders instead of the current user's (see SetShellVarContext).
7z should be available before NSIS 2 final is out.
That suggestion is already on the todo list.
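kichik's SetShellVarContext suggestion might look like this in a script; the folder and application names are made up for illustration:

```nsis
Section "Install"
  ; Switch shell folder constants ($APPDATA, $SMPROGRAMS, ...) to the
  ; all-users context instead of the current user's.
  SetShellVarContext all
  CreateDirectory "$APPDATA\MyApp"
  SetShellVarContext current   ; back to per-user folders
SectionEnd
```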
AnalogKid
30th January 2003 22:35 UTC
Hello all,
I have a similar problem; the file it takes a dump on is only about 39MB, but the overall installer will be over 900MB at that point (mod package that has lots of gfx, etc.). I've tried both BZip2 and the standard (LZW?). 7z would be a nice addition but he keeps changing file formats so it's been hard to use so far.
Anyways, this is what I get when it drops out:
Internal compiler error #12345: error mmapping datablock to 907605624.
System:
P4 1.5GHz, 512MB, 30GB (~4GB free)
Win2K Pro +SP3
(oops forgot this) NSIS v2.0b0 and 1.98 were tried ... (/oops)
My page file is 384MB ... do you think that the total combined physical & page file size is the limit? That would come out to just about 900MB, which is where it's choking. I'll increase my swap to 1GB and get back to you.
Thanks for such a kick ass installer btw! :up:
...AnalogKid
AnalogKid
30th January 2003 23:04 UTC
... increasing my page file to 512MB (so that I have 1GB combined) didn't have any effect. It seems to me that if the combined physical/page file size were the limit of the installer that it would at least be able to get past that 39MB file now, since I increased the swap by more than the file size (I went from 384 to 512, so +128MB).
Any ideas? :cry:
...AnalogKid
kichik
30th January 2003 23:11 UTC
The other compression type is zlib deflate.
I saw your post in the other thread but I would still want to know, are you using FAT or NTFS?
AnalogKid
30th January 2003 23:17 UTC
Ooops I knew I was forgetting something ...
C: 13.9GB partition, FAT32 3.83GB free
(system drive)
D: 13.9GB partition, FAT32 1.53GB free
(Dev env. & Data drive)
hope that helps troubleshoot this!
NTFS isn't worth the pain in the neck unless you require the secure filesystem or have huge files (for DV editing). My $.02
...AnalogKid
DEADFACE
31st January 2003 09:49 UTC
Thanks
kichik
1st February 2003 13:48 UTC
It is now confirmed: the problem is with FAT. I have tried it on two of my FAT partitions, and both failed. Later, with the help of killahbite, we tested a 1.4GB installer on a very fragmented NTFS partition and it worked perfectly fine (although a bit slowly, because of the fragmentation). So a Windows NT kernel is not enough; NTFS is required.
virtlink
1st February 2003 14:57 UTC
It should be mentioned in the documentation, under a section named 'known issues' or something, along with other problems that might occur with some Windows versions, installed programs, etc.
AnalogKid
3rd February 2003 17:10 UTC
Bummer. Are you saying that the system drive needs to be NTFS, or just the drive where NSIS is trying to write the installer?
I keep forgetting that the source is open -- if I get some time I'll wander through it and see what's up on this end. I'm sure you're right, but I'm just wondering why FAT16/32 is a problem.
...AnalogKid
kichik
3rd February 2003 17:16 UTC
The drive where the system's temporary directory is should be NTFS. The problem is with memory mapped files on FAT drives. You can have a look in Source\strlist.h - class MMapBuf.
AnalogKid
3rd February 2003 19:59 UTC
Unfortunately, it still doesn't work after CONVERTing both drives to NTFS. No big loss for me in converting them but I'm kind of bummed that it still won't work.
I did some digging in the NTKernel docs in MSDN and found this statement (under remarks):
"Windows NT/2000/XP: If the file mapping object is backed by the paging file (hFile is INVALID_HANDLE_VALUE), the paging file must be large enough to hold the entire mapping. If it is not, MapViewOfFile fails."
The way the code is structured, it looks like m_hFile can still be invalid after your CreateFile() call, which will skip the m_mapping=MapViewOfFile call and fall right into the if (!m_mapping). I'm not sure that's what you really meant to do. Basically, it looks like you'll get the same error message if you hit any of the following:
1) m_hFile == INVALID_HANDLE_VALUE (if the CreateFile() failed)
2) m_hFileMap == NULL (if m_hFile == INVALID_HANDLE_VALUE)
3) m_mapping == NULL (if MapViewOfFile() is never called)
If I get time later in the week I'll step through the mess on my end and let you know at which step everything goes to shit over here.
Thanks!
...AnalogKid
Jacob Metro
18th February 2003 20:40 UTC
#12345 error mmapping datablock
Why?
I'm using a Windows 2000 (SP3) system (NTFS) with 768M RAM, 120 GB HDD, 3900 MB virtual memory set aside, and clean temp directories. I'm running into the #12345 error mapping 753554251. I moved my temp directories to the root so I could watch them fill up. I am using the Windows task manager to watch my processor level. I set aside so much VM just so I could prove (to myself) that VM isn't at fault.
My questions are these: What causes this error? Why are NSIS installers limited to 2 GB?
Thanks AnalogKid for giving me some idea of where to look.
If anyone has more info, let me know.
Sincerely,
Jacob Metro
liquidmotion
18th February 2003 21:12 UTC
why should installers be over 2gb? if you need to use large files (say for a game installer), why not just keep them external and just copy the file to the outdir, rather than compressing the files into the installer?
shouldn't be too hard.
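A minimal sketch of liquidmotion's workaround, assuming the large file ships uncompressed next to setup.exe on the install media (the file name is taken from the example log above):

```nsis
Section "Game Data"
  SetOutPath "$INSTDIR"
  ; Copy the big file from the media instead of compressing it into
  ; the installer; /SILENT suppresses the copy progress window.
  CopyFiles /SILENT "$EXEDIR\x_level1.ras" "$INSTDIR"
SectionEnd
```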
Jacob Metro
18th February 2003 22:39 UTC
Re:liquidmotion Response
Well,
I guess that would work for large individual files. All I would need to do is use the installer for the little stuff, and use CopyFiles (or whatever the function is) to copy from my installation media to the destination drive. But what would you do about thousands of little dlls and hundreds and hundreds of data folders?
Suppose my company makes accounting and data control software for major banks. Each bank is different and requires a personalized set of DLLs and a personalized installer. NSIS is the best because I can write a cheap, quick script, copy it several times, and change the data source used by the File command. Suppose that in my application the end result is, say, 10GB worth of files (system files to be added to the destination system, application software, and a small set of "starter data").
I was just wondering if there was a quick answer to why I can't make an installer bigger than 2 GB, and why I'm getting that error even before I reach 1 GB worth of source. (And hopefully the quick answer will be more useful than "because it can't" and "I don't know.")
Suppose also that I might want to propagate a whole WINNT directory such that in the event of a media or software failure I can quickly "reinstall" it (kind of like a COMPAQ QuickRestore) and be up and working again. I can see how, without that limitation, so many more things could be done. NSIS might become a de facto standard of sorts in the software distribution model.
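The per-customer script copying Jacob describes could be sketched like this; the define name and the path are hypothetical:

```nsis
; One generic script per customer: only this define changes per bank.
!define DATADIR "..\customers\bank-a"

Section "Program Files"
  SetOutPath "$INSTDIR"
  ; Recursively pack everything under the customer's data directory.
  File /r "${DATADIR}\*.*"
SectionEnd
```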
virtlink
20th February 2003 11:02 UTC
In the earlier days, a PIMP installer of 8 MB took 8 MB of memory, and a 512 MB installer also took 512 MB of memory. I think that part of that code is still inside NSIS. It doesn't need that amount of memory anymore, but it definitely uses memory for each file extracted.
I know this isn't an answer to your question, just info.
kichik
20th February 2003 13:04 UTC
Well, as it seems it doesn't always work on NTFS, so I will have to give it a deeper look when I get the time.
Jacob Metro
20th February 2003 17:51 UTC
Thanks
Actually, virtlink, your answer was a good description of the problem. Thanks, kichik, for your steadfast and excellent work.
Sincerely,
Jacob Metro
Jacob Metro
3rd March 2003 19:27 UTC
I have another example:
Windows2000 Pro NTFS
Installer w/ dlls etc.....
1.10GB SIZE
616MB Size on Disk
785904KB Physical Memory
2707820KB Total Virtual Memory
Error #12345
virtlink
4th March 2003 10:30 UTC
Well, KiCHiK, tell me:
What is the problem with big files? Do they need so much memory while extracting? Or is something else the problem?
For example:
Is it possible to create a 1.10 GB file on an NTFS system without NSIS? I think the answer is yes.
Then what is the problem?
I keep hearing that different filesystems allow larger or smaller files, but not what's causing the error in NSIS.
Sunjammer
4th March 2003 10:34 UTC
I *think* part of the issue is with memory mapped files which NSIS uses.
kichik
4th March 2003 16:33 UTC
Yep, that's it. Need to get some free time to look really deep into it to solve this once and for all.
Jacob Metro
5th March 2003 17:48 UTC
Info on the problem
I've done a little research
I don't have the best of all systems to test with, but I use it for customer work so I guess it will have to do for this as well.
I am using Windows 2000 Pro, NTFS, AMD 850, 768MB RAM.
Attached is a file named NSIStst.RAR. THIS IS NOT A RAR FILE. It is a CSV file produced by the W2K Performance Monitor. Simply download this file, rename it to .csv, and open it with MS Excel (or your spreadsheet viewer). I asked the Performance Monitor to watch several counters while I ran NSIS.
Here's what I have found.
A 1.57GB set of files using STANDARD compression (zlib) compresses fine.
However, the same set using (I'm guessing) ENHANCED compression (bzip2) does not, and errors out with the #12345 error.
The CSV file shows a substantial memory leak using BZIP2. The \Memory\Available MBytes counter begins by showing 640MB ram at the start of the program and ends with 16MB ram before the error.
Secondly, using bzip2 compression (and I'm a newbie programmer so I don't really know what this means, if anything), there is a problem with the Windows 2000 Task Manager's CPU Time column for MAKENSIS on the Process page. Let me put it another way. My System Idle Process just keeps on ticking. My CPU usage is between four and ten percent. But disregarding all these factors, the CPU Time column is not updated as regularly as I would like (even after putting the process' priority at "real-time"). In fact, before failure the CPU Time column shows 33 seconds worth of work time. My stopwatch shows 10 min 19.8 sec worth of work time.
For control purposes my test run using STANDARD (ZLIB) compression used 18 minutes 30 seconds worth of processor time.
As far as I've seen (a 400MB test, 800MB test, 1GB test, and this 1.57GB test) this "leak" does not occur using standard compression.
So my guess is that for some reason bz2 isn't leaning as heavily on the processor as it could and is thus eating up too much memory. That, in itself, could break things well before the 2GB barrier.
I left other counters in the CSV file which I thought may be of benefit to a more advanced programmer.
I really appreciate the NSIS Scripting Language. It is a powerful tool.
Sincerely,
Jacob Metro
kichik
5th March 2003 17:55 UTC
Thanks for the information.
What version of NSIS have you used to do the testing? Can you please try with 1.98 too?
Jacob Metro
5th March 2003 17:57 UTC
Reply to kichik
I'm using 2.0b1
I'm trying now with 2.0a7 and 1.98
Jacob Metro
6th March 2003 00:20 UTC
More Information
More Information about the problem
Attached is a file named Perflogs.ZIP. [I]This is a ZIP file.[/I] (...entirely unlike the RAR file from my last post...)
This zip file contains quite a few pieces of data:
NSIS20b1.bz2.csv - counters watching the operation of NSIS v2.0b1 with bz2 compression.
NSIS198.zlib.csv - counters watching the operation of NSIS v1.98 with zlib compression.
NSIS20a7.bz2.csv - counters watching the operation of NSIS v2.0a7 with bz2 compression.
NSIS20a7.zlib.csv - counters watching the operation of NSIS v2.0a7 with zlib compression.
NSIS20a7.zlib.2nd.csv - counters watching the operation of NSIS v2.0a7 with zlib compression (a second run for comparison's sake).
NSIS198.bz2.csv - counters watching the operation of NSIS v1.98 with bz2 compression.
NSISfail.bz2.rtf - System Monitor image showing the last moments of life of NSIS v2.0b1 using bz2 with 768 MB RAM (standard).
NSISFAIL2.bz2.rtf - System Monitor image showing the last moments of life of NSIS v2.0b1 using bz2 with 512 MB RAM (test).
NSISTest_000013.csv - More counters watching the second fail (above) with 512MB ram.
Okay, it's really late (for me) and my wife is ready to go home. I only have a few minutes to explain my methodology and findings. I really hope a programmer with more experience might be able to interpret and use this information.
Following kichik's suggestion I tested NSIS1.98, NSIS20a7, and NSIS20b1. The csv's above are the results of these experiments.
I have added another counter since the last post, \Objects\Sections, to watch the "virtual memory created by a process for storing data" as it changes over the run of the program. (It doesn't change.)
What I have found is interesting (to me at least)...
All tested versions of NSIS have this problem when using bz2 compression. I did note an increase of CPU usage from NSIS 2.0b1 (4%-10%) to NSIS 1.98 (46%-78%). All versions appear to fail when available memory drops to 27MB. The "CPU time counter problem" mentioned in the last post does not occur using zlib on any version tested. When Avail MBytes falls to 27MB using bz2, the compression bombs. I noted that using zlib, when Avail MBytes falls to zero, the DPC and APC processor interrupts increase drastically, letting me know that paging is being used.
For a final test I lowered my paging file size to 1.5GB and removed 256MB of RAM from my PC. I would expect the failure to occur earlier in the process. IT DOES NOT. The compile fails at the exact same place, the exact same file and datablock, regardless of the amount of physical RAM in the PC.
Thus the failure is not directly dependent on RAM.
I'm not sure what that means, but...
For what it's worth.
Thanks,
Jacob Metro
P.S. Because of maximum attachment size issues, I will be posting another 7 times to get this material out there for you guys.
P.P.S. I understand and respect the right for rules to exist and the necessity for obedience to law. In that spirit I hereby state "I am not post pumping" per the Post Pumping law
"- If a post is suspected to serve no other purpose than to increase the post count of the poster, then the moderators may, at their discretion, issue a PM to the member in question enquiring about the nature of the post. Posts/Threads that are blatantly post pumping will be removed/locked without recourse to the poster."
Jacob Metro
6th March 2003 00:22 UTC
Next ZIP
Jacob Metro
6th March 2003 00:23 UTC
Next ZIP
I didn't know about a 60 second rule (it's not on the books) :)
Jacob Metro
6th March 2003 00:24 UTC
Next ZIP
I'm just being thorough...
Jacob Metro
6th March 2003 00:31 UTC
Last one
I cut quite a bit from that one to get it through the max size limit, I have the original if it is needed.
Jacob Metro
8th March 2003 17:46 UTC
More interpretation
I've had more time to look at the logs
What I think this may be showing is that zlib uses the available physical memory and then automatically begins using available virtual memory. That is borne out when you note that the zlib CSVs show available physical memory dropping to zero and hanging in the zero range for several minutes, while the processor interrupts and memory swapping take over memory's job. Which is why zlib compression works.
However, I note that the processor interrupts begin immediately before free memory reaches 32MB (I can't prove that 32 is the point, but I think I can prove that the swapping begins before we run out of memory). During bz2 compression, on the other hand, I don't note any appreciable change in processor interrupts or swap file usage prior to running out of physical memory.
Now, with that said, I'm pretty sure that an application this "simple" (no disrespect intended) does not differentiate between physical memory and virtual memory. It would take a savvy programmer to force the program to look only at certain ranges of memory. I haven't yet begun to look through the source myself (I want to duplicate and understand the problem first), so I can't state definitively that "a genius programmer has added code to NSIS's compression routines to cause only physical memory to be addressed." Realizing that I am a novice programmer, the person responsible for the addition of bz2 should not take offense at my wording "genius" above, for indubitably it requires more skill to differentiate (in code) between virtual and physical memory than to simply use virtual memory.
Therefore, if bz2 is loosely based on the Wheeler block-sorting process, it might be that process itself which is responsible for using only available physical memory.
Sincerely,
Jacob Metro
kichik
9th March 2003 16:41 UTC
Just so you'll know, I am not ignoring this; I am reading it, and I will have a look at all of your supplied details when I get enough free time.
Thanks
kichik
15th September 2003 23:02 UTC
OK, done in latest CVS version, FINALLY! I have improved NSIS's file mapping so it won't map the entire file into memory (including the installer data block which is saved in a memory mapped file too). I have successfully compiled 3 installers of random movie trailers, 1GB total. Aside from my computer begging for mercy everything went fine ;)
23 minutes with no compression, 1 hour and 4 minutes with bzip2 compression and 28 minutes with zlib compression.
All installers installed all of the trailers with no problems and with the new extraction progress I could even see that it's not stuck :)
Operation choke computer completed successfully :D
Almost forgot... Use FileBufSize to let NSIS feast on your 20TB 5GHz QUAD DDR-III ;)
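For reference, FileBufSize takes the per-file buffer size in megabytes; a sketch (the value is illustrative, not a recommendation from the thread):

```nsis
; Let makensis use 64 MB buffers when reading each source file.
; Larger buffers use more RAM but can speed up very large compiles.
FileBufSize 64
```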
virtlink
16th September 2003 13:19 UTC
I think that this improvement brings NSIS again a little bit closer to the top of most used IDK's.
Good work! :up:
Jacob Metro
16th September 2003 15:36 UTC
Awesome
This is awesome. Thanks for the patience and hard work.
Moasat
9th January 2004 16:50 UTC
I still get this error when using XP on NTFS while compiling more than 2GB of many small files. I'm trying to use LZMA compression because I believe it will compress the whole installer down below 700MB, but I guess NSIS needs to create one large temp file before LZMA compression, and it is failing at the ~2GB limit of that temp file.
I get:
Internal compiler error #12345: error mapping file (2103166549, 30836736) is out of range.
Any way to increase this or work around it?
Also, I see in the source that ints were used for the file mapping and buffers instead of unsigned ints. Maybe someone 'in the know' could change these to unsigned and increase the limit to at least 4GB.
Joost Verburg
9th January 2004 19:17 UTC
Large file handling has been improved, but 2GB is still the current limit.
You'd better use different archives instead of one 2GB executable.
Moasat
13th January 2004 15:39 UTC
That's really a downer. With the LZMA support it would compress down to fit on one CD. Now, with external archives, the whole thing looks very kludgy.
I've tried to edit the source myself, but I don't have as good an understanding of the whole project. Changing the offset in the GrowBuf and MMap objects would obviously need to be done. Other than that, just the external code that uses those objects has to understand unsigned offsets instead of signed ones. Why have signed offsets into a buffer or file anyway?
I hope this is strongly considered in a near-future release.
Joost Verburg
13th January 2004 15:44 UTC
Try the version without solid compression (NSIS 2.0 RC2):
http://prdownloads.sourceforge.net/n...d.zip?download
Moasat
13th January 2004 15:51 UTC
I just saw that right before I replied. I'm going to give it a try shortly. Thanks.
Moasat
13th January 2004 17:44 UTC
Doesn't cut it. I need the solid compression to get the 2.2GB down to about 700MB.
Joost Verburg
13th January 2004 18:13 UTC
So non-solid gives no error but solid does? Maybe the system for solid compression can be changed so it will be able to compress more than 2GB of data. Having a compressed installer larger than 2GB won't be useful because most Windows file systems have a 2GB file size limit.
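For context, in NSIS releases after this thread the solid/non-solid choice became a flag on SetCompressor rather than a separate build; a sketch:

```nsis
SetCompressor lzma          ; non-solid: each file compressed on its own
; SetCompressor /SOLID lzma ; solid: everything in one compressed stream,
;                             giving a better ratio at higher memory cost
```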
Moasat
13th January 2004 21:08 UTC
I'm building it now but it's taking quite a long time. I'm assuming it will give no error, since I can do this same process with zlib, but that won't generate a setup.exe that fits on a CD (zlib ends up around 1GB, IIRC). I know from using 7-zip manually that the 2.2GB of files will compress down to <700MB, so this would work if the datablock could grow beyond 2GB and then get compressed solidly via LZMA. I'm just at the mercy of the developers and whether they feel this is an important enough feature to change/implement. I'm hoping they do.
Joost Verburg
13th January 2004 21:12 UTC
LZMA non-solid is definitely not the same as zlib. It will probably produce a slightly larger result than LZMA solid, but usually it won't make a huge difference.
Moasat
13th January 2004 21:14 UTC
I'll know in a couple of hours. Besides this 2GB issue, I think the rest of it is fantastic. Along with the HM NIS Editor, it is very nice. I'm trying to think of all the things I need to make installers for so I can use this more. Problem is, I have many other installs that also require 2GB of data files. :(
Moasat
14th January 2004 21:11 UTC
Here's the final report. It didn't quite make it.
EXE header size: 49664 / 34816 bytes
Install code: 19821 / 86427 bytes
Install data: 736102408 / -1999567682 bytes
CRC (0xA0163DCE): 4 / 4 bytes
Total size: 736171897 / -1999446435 bytes (-36.-8%)
Joost Verburg
14th January 2004 21:37 UTC
Well, converted to megabytes it's 702 MB.
Most 80 minute CD's have a capacity of about 703 MB, so it will probably work.
Joost Verburg
17th January 2004 16:25 UTC
You can also increase the dictionary size a bit to make it fit.
Koen van de Sande
18th January 2004 12:44 UTC
Perhaps the install data size variable shouldn't be signed to avoid negative numbers?
Joost Verburg
18th January 2004 13:52 UTC
Yes, it should be unsigned.
Comm@nder21
6th March 2004 20:29 UTC
ok, i'm getting this error:
Processed 1 file, writing output:
Internal compiler error #12345: deflateInit() failed(-1).
Note: you may have one or two (large) stale temporary file(s)
left in your temporary directory (Generally this only happens on Windows 9x).
i'm compiling a few files (about 20), each one below 1 kb (only dummy files for testing), and some gfx. the whole installer is about 500 kb, with lzma.
the installer compiles correctly as long as SetCompressorDictSize isn't used. every time i use this setting, the error occurs (with any dictsize).
the nsis script is running on a FAT32 partition with 3gb free space.
the partition with windows xp has about 5gb free disk space, and the partition especially for temporary files (including virtual ram/swap) has another 3gb of space, about 2gb free.
i'm using the latest cvs version of nsis.
why does this error occur with this configuration?
Joost Verburg
6th March 2004 21:43 UTC
I can't reproduce it; please attach a script including the files. Btw, this is not related to this topic.
Did you recompile makensis.exe? Note that the LZMA source code was updated today (it's smaller and faster) and needs some testing.
When compiling installers for release we always recommend using a stable version; however, we really appreciate that you're helping to test the development version :)
Comm@nder21
6th March 2004 21:51 UTC
i'm always using the latest cvs because, when you look at the changelog, it seems to be more stable (and more up to date - look at the language files) than the "stable" release.
also, this problem has existed for some versions now.
and, yes, i think this is related to this topic, because i found this topic (and read all posts) when i searched for error #12345.
i don't know how to recompile makensis.exe, sorry.
hmm, i'll see if i can attach an example.
kichik
6th March 2004 22:11 UTC
You must have used a dictionary size which your computer cannot handle. Remember that SetCompressorDictSize takes the number of megabytes to use, not the number of bytes.
All errors related to memory or failed compression are marked as error #12345.
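The unit mix-up kichik describes can be sketched like this; SetCompressorDictSize takes megabytes, so a value meant as kilobytes requests a gigantic dictionary:

```nsis
SetCompressor lzma
SetCompressorDictSize 8      ; 8 MB dictionary (the default)
; SetCompressorDictSize 4096 would request a 4 GB dictionary
; and fail with error #12345 on most machines.
```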
Joost Verburg
7th March 2004 08:54 UTC
Because you did not recompile makensis it is indeed not related to the new CVS source. I missed that line about SetCompressorDictSize.
Comm@nder21
7th March 2004 10:29 UTC
oh, damn, it's in mbytes?
i thought that was kbytes. kbytes would make more sense, i think.
but, ok, then that's the mistake: i entered 4096 as the dict size :)
edit:
so, does this command support floating-point numbers?
edit2:
it worked with 'SetCompressorDictSize 8' :)
kichik
11th March 2004 18:57 UTC
8 is the default. It doesn't support floating-point numbers; I see no real need for that. In 7-zip all of the options are in megabytes too...
Thorondor
15th April 2004 19:03 UTC
Hi all,
my problem is this:
I've got about 1.5 GB of files to install (multiple files; the biggest is no larger than 200 MB). I can only compile using zlib compression. When I try to use bzip2 or lzma I get this error:
Using lzma (compress whole) compression.
EXE header size: 49152 / 35328 bytes
Install code: (38548 bytes)
Install data: (1444031845 bytes)
Uninstall code+data: (61041 bytes)
Compressed data: Error: deflateToFile: deflate()=-3
Error - aborting creation process
I tried it with bigger dictsizes, but that didn't work. These results were with a dictsize of 48 MB (I was desperate :).
P.S. I use Windows 2000 SP4 and an NTFS drive with about 10 GB free.
I used the latest NSIS build (non-CVS).
Joost Verburg
15th April 2004 19:20 UTC
A larger dictionary size improves compression but needs more memory; a smaller size makes compression faster and reduces the memory requirements.
You can also download a special build from the website that uses non-solid compression.
Thorondor
17th April 2004 11:00 UTC
Thanks non-solid and smaller dictionary size did the trick...
Sometimes :( Most of the time I get the same error with some of the files: deflate()=-3. The weird thing is that if I try compressing these files alone in NSIS, I don't get an error message. Weirder still, it's not always the same files. Sometimes the compiler stops on a file and sometimes not.
I'm not considering doing these files separately; I want one big file. Please help.
P.S. I also tried compressing them outside of NSIS with 7-zip, and no problem.
kichik
17th April 2004 11:07 UTC
-3 means a memory error for bzip2 and an exception for lzma which usually means a memory error too. Are you sure you have enough free memory? Try setting a lower dictionary size, way lower than 48MB...
Thorondor
17th April 2004 12:03 UTC
In the last run I used smaller dictionary sizes, <8MB.
And I've got 576MB of physical memory, of which usually about 50% is free. Total system memory is about 1.3GB, with usually about 80% free. So the problem is not a lack of free memory.
Joost Verburg
17th April 2004 12:06 UTC
8MB is still quite large, try 4MB.
Thorondor
17th April 2004 12:14 UTC
I used 8, 4, 3, 2 and even 1.
kichik
17th April 2004 12:17 UTC
What error code do you get for bzip2? Is it -3 too?
Thorondor
17th April 2004 12:19 UTC
I haven't tried bzip2 in a while, but I'll try it next
Thorondor
17th April 2004 12:58 UTC
I tried bzip2 and unfortunately I didn't get an error ;). The problem is solved for me, but I think there is something wrong with the lzma compression for big files (solid/non-solid) (or else I'm doing something wrong :))
kichik
17th April 2004 13:17 UTC
Last question before I prepare a test which will give some more details: have you used FileBufSize? If you have, with what value? If you haven't, try setting it to 8.
Thorondor
17th April 2004 15:08 UTC
I didn't use FileBufSize, but with it lzma compression works like a charm. Even with a dictionary size of 8MB.
Thanks for your help kichik and Joost Verburg.:up: :up:
(Looks like I WAS doing something wrong ;) )
kichik
17th April 2004 15:27 UTC
If setting it to a lower value helped, you surely had a memory problem. If not a shortage of memory, then maybe corrupted memory? Worth a check...
Thorondor
17th April 2004 18:58 UTC
My memory is fine: enough free and not corrupted. I checked it with multiple programs (both the amount free and whether it's corrupted). I think it purely was the size of the file buffer. When I changed it I had no more problems.
kichik
8th May 2004 18:40 UTC
I have added meaningful error strings in the latest version. Can you please test your script on makensis.exe from:
http://nsis.sourceforge.net/err.zip
and let me know what the error string is?
Thorondor
10th May 2004 19:34 UTC
I can't tell you the error string, because there is none. *
I think you solved the problem without realizing it ;)
I've tried multiple times, using different settings, even setting the dictionary size to 8, but no problems.
I'll keep on trying though. I'll post if I get an error.
*I mean I don't get any problems/errors anymore.
SonicSD
22nd May 2004 13:40 UTC
Hi!
I'm also having the size problem. I need to compress ~3 GB. I tried to compile an installer, but makensis stopped at about ~2.1 GB. I also tried it with the non-solid build, but that also stopped at some point.
What I want to know is: would it be possible to compress the installation files externally with 7-Zip, and then add the NSIS data separately?
This would be a good workaround for now, until NSIS supports larger sizes. :D
Anyway, NSIS is a great installer; keep up the good work.
(using it since v1.98 :D)
SonicSD
Joost Verburg
22nd May 2004 15:36 UTC
You can use external data files and the 7-zip command line tool to decompress the files on the user's system.
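Joost's workaround might be sketched like this, assuming data.7z and the 7za.exe command line tool ship next to the installer (both file names are illustrative):

```nsis
Section "Data"
  ; Extract the externally compressed archive into the target directory.
  ; x = extract with full paths, -y = assume Yes on all queries.
  ExecWait '"$EXEDIR\7za.exe" x -y -o"$INSTDIR" "$EXEDIR\data.7z"'
SectionEnd
```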
SonicSD
23rd May 2004 16:18 UTC
I did it that way before NSIS got native LZMA support.
But I thought that it might be possible to compress the data externally and make use of the native functions for decompression (especially for the installation options :D)
SonicSD
Joost Verburg
23rd May 2004 16:29 UTC
NSIS does not support the 7-zip file format or any other archive structure.
If the compressed data size is below 2 GB, it should work with non-solid compression.