|
Post by Remy on Nov 5, 2004 5:40:15 GMT -5
Hi!
Using D7 on XP (but the problem seems to exist on every platform)
1) When updating or refreshing files, the resulting zip file is sometimes bigger than the original source. No mpegs, zips, etc. are included, only files that can be compressed. If nothing is being refreshed and the zip file doesn't exist yet, the size is as expected. But if it already exists and the archive is refreshed or updated, the content of the archive is OK, yet the size of the resulting archive grows every time. Any thoughts?
2) Out of memory. When zipping LARGE files, my program will sometimes use a huge amount of memory, and sometimes it just dies with an Out of memory message. Does the zipping occur completely in memory, or is there any possibility of swapping to disk? Or maybe I am doing something wrong here?
|
|
|
Post by Kevin on Nov 5, 2004 12:51:50 GMT -5
1) Is the archive that is growing always one that you had created with VCLZip? Does it grow whether or not any files are actually freshened or updated?
2) There is a problem with the Borland memory manager causing fragmentation when MANY objects are created, and this comes into play when you work with archives containing a huge number of files. However, I have found that replacing the memory manager fixes this problem. I have had great success using a free memory manager called MultiMM. You can download it at www.vclzip.net/multimm.zip. Simply add the two files to your application's source files and add MultiMM to your application source's uses list. Not your main form... add it to your application source file (the .dpr).
Kevin
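For illustration only (the project and unit names below are hypothetical, not from this thread): a Delphi application source file (.dpr) with MultiMM added to its uses list might look like this. Replacement memory managers in Delphi generally have to appear first in the uses clause, before any unit that allocates memory.

```pascal
program MyZipApp; // hypothetical project name

uses
  MultiMM,   // replacement memory manager -- keep it FIRST in the uses list
  Forms,
  MainUnit in 'MainUnit.pas' {MainForm}; // hypothetical main form unit

begin
  Application.Initialize;
  Application.CreateForm(TMainForm, MainForm);
  Application.Run;
end.
```

The reason it goes in the .dpr rather than the main form's unit is ordering: units listed in the project source are initialized first, so the new memory manager is installed before any other code allocates memory.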
|
|
|
Post by Remy on Nov 5, 2004 18:52:15 GMT -5
Hi
1) No, sometimes even when the archive doesn't exist yet (nothing to refresh), the size is bigger than the original folder. It doesn't happen every time, but significantly often.
2) Thank you for the multimm.zip. I will take a look at this.
CU
|
|
|
Post by Kevin on Nov 6, 2004 9:31:05 GMT -5
Does this happen with smaller archives, or are you just working with Zip64 type of archives? That is, greater than 65535 compressed files or greater than 4 gig in size?
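For reference, the Zip64 limits Kevin mentions come from the zip format itself: the classic headers can only express up to 65535 entries and 4 GB (0xFFFFFFFF bytes) of archive size. A rough check for whether an archive would need Zip64 (a hypothetical helper, not part of the VCLZip API) could look like this:

```pascal
// Sketch: an archive needs Zip64 extensions when it exceeds the
// classic zip limits of 65535 entries or 4 GB total size.
function NeedsZip64(EntryCount: Integer; TotalBytes: Int64): Boolean;
begin
  Result := (EntryCount > 65535) or (TotalBytes > $FFFFFFFF);
end;
```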
Any chance you are using different packlevels in some cases than what the original files were zipped with?
Kevin
|
|
|
Post by Remy on Nov 6, 2004 19:41:19 GMT -5
Kevin, the memory manager seems to solve the Out of memory problem (even if the operation seems to be much slower; but anyway, that is better than OoM, I suppose).
About the other problem, I have discovered that sometimes the size problem exists even when a fresh archive is created. I have reproduced this with Zip64, but I will try with smaller archives and see if the problem exists there too.
|
|
|
Post by Kevin on Nov 7, 2004 8:32:09 GMT -5
Really? I have not noticed things being slower, especially since things got very slow once memory began fragmenting without it.
Let me know what you find; it will be much easier to find the time to track this down if you find that it happens for smaller archives.
One thing I don't quite understand, though, you say it happens sometimes even when a fresh archive is created. How do you know there is a size problem with a new archive?
|
|
|
Post by Remy on Nov 8, 2004 4:11:55 GMT -5
Kevin! The memory manager is a bit slower in the tests I have done, but maybe not considerably slower. I did have problems the first time I used it, but it turned out to be me compressing a very badly fragmented 2 GB file. So now I am doing a better test, and in comparison with the Borland memory manager the difference is minimal (2-5%), though this is in no way a scientific benchmark.
About the second issue, the problem is that sometimes the resulting archive is larger than the uncompressed source. For example, I was zipping a 3.4 GB directory (that doesn't contain any avi, rar, zip, jpg or other non-compressible data), and sometimes the archive size is OK (about 2.1 GB), but sometimes the archive is 3.8-4 GB!!! It is difficult to reproduce this error because it doesn't happen every time, but it does happen in my case. The archive itself is OK: all the files have a good CRC and every file can be extracted without problems. Compression is on level 6 and is not changed. I'll try to reproduce this on small archives, and if it happens, I'll send you a test archive. The risk is that this may only happen with Zip64.
|
|
|
Post by Kevin on Nov 8, 2004 7:54:43 GMT -5
That does sound pretty strange. And this is the same data each time?
Kevin
|
|