Fast LZMA2 compression algorithm wrapper for .NET
Built on the Fast LZMA2 library.
Requires the Windows x86/x64 .NET 8 runtime. Using x64 is recommended; the x86 build may malfunction.
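If your application might run as a 32-bit process, a minimal startup guard (a sketch using only standard .NET APIs) can surface the architecture issue early:

```csharp
// Warn early if running as 32-bit, where FastLZMA2Net may malfunction.
if (!Environment.Is64BitProcess)
{
    Console.Error.WriteLine("Warning: x64 is recommended for FastLZMA2Net; x86 may malfunction.");
}
```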
```
PM> Install-Package FastLZMA2Net
```
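Or, assuming the same package ID, via the .NET CLI:

```
dotnet add package FastLZMA2Net
```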
When compressing only occasionally, the one-shot static API is simplest:
```csharp
string SourceFilePath = @"D:\dummy.tar";
string CompressedFilePath = @"D:\dummy.tar.fl2";
string DecompressedFilePath = @"D:\dummy.recovery.tar";

// Simple compression
byte[] origin = File.ReadAllBytes(SourceFilePath);
byte[] compressed = FL2.Compress(origin, 0);
byte[] decompressed = FL2.Decompress(compressed);
```
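As a quick sanity check (illustrative only; uses the paths defined above):

```csharp
// Persist both results and verify the round trip byte-for-byte.
File.WriteAllBytes(CompressedFilePath, compressed);
File.WriteAllBytes(DecompressedFilePath, decompressed);
Console.WriteLine(origin.AsSpan().SequenceEqual(decompressed)); // True
```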
When you have many small files, consider reusing a compression context to avoid allocation overhead:

```csharp
// Context compression; the context can be reused.
Compressor compressor = new(0) { CompressLevel = 10 };
compressed = compressor.Compress(origin);
compressed = compressor.Compress(origin);
compressed = compressor.Compress(origin);

Decompressor decompressor = new Decompressor();
decompressed = decompressor.Decompress(compressed);
decompressed = decompressor.Decompress(compressed);
decompressed = decompressor.Decompress(compressed);
```
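For example, a single reused Compressor can batch a folder of small files. This is a sketch; the directory path and ".fl2" naming are illustrative:

```csharp
// Compress every file in a folder with one reused context.
Compressor compressor = new(0) { CompressLevel = 10 };
foreach (string path in Directory.EnumerateFiles(@"D:\small-files"))
{
    byte[] data = File.ReadAllBytes(path);
    byte[] packed = compressor.Compress(data);
    File.WriteAllBytes(path + ".fl2", packed);
}
```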
When you have a very large file (>2 GB) or slow I/O, use the stream API:

```csharp
byte[] buffer = new byte[256 * 1024 * 1024]; // a 256 MB input buffer is suitable for most cases

// compress
using (MemoryStream ms = new MemoryStream())
{
    using (CompressStream cs = new CompressStream(ms))
    {
        cs.Write(origin);
    }
    compressed = ms.ToArray();
}

// decompress
using (MemoryStream recoveryStream = new MemoryStream())
{
    using (MemoryStream ms = new MemoryStream(compressed))
    {
        using (DecompressStream ds = new DecompressStream(ms))
        {
            ds.CopyTo(recoveryStream);
        }
    }
    decompressed = recoveryStream.ToArray();
}
```

Note that .NET byte arrays are limited to slightly under 2 GB.
When processing very large files, reading all the data into memory is not acceptable. Direct file access (DFA) streaming is recommended instead.
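As a rough guard, you can decide between the in-memory and streaming paths up front. This is a sketch based on the documented .NET array-length limit:

```csharp
const long MaxByteArrayLength = 0x7FFFFFC7; // largest byte[] length .NET allows
long fileLength = new FileInfo(SourceFilePath).Length;
bool useStreaming = fileLength >= MaxByteArrayLength; // stream rather than File.ReadAllBytes
```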
Streaming Compression
```csharp
// large-file streaming compression using direct file access (>2 GB)
byte[] buffer = new byte[64 * 1024 * 1024]; // a 64 MB buffer is recommended
using (FileStream compressedFile = File.OpenWrite(CompressedFilePath))
{
    using (CompressStream cs = new CompressStream(compressedFile))
    {
        using (FileStream sourceFile = File.OpenRead(SourceFilePath))
        {
            // DO NOT use sourceFile.CopyTo(cs) with a block buffer:
            // CopyTo() calls Write() internally, which terminates the stream after one cycle.
            long offset = 0;
            while (offset < sourceFile.Length)
            {
                long remaining = sourceFile.Length - offset;
                int bytesToWrite = (int)Math.Min(buffer.Length, remaining);
                int bytesRead = sourceFile.Read(buffer, 0, bytesToWrite);
                cs.Append(buffer, 0, bytesRead);
                offset += bytesRead;
            }
            // Always call Flush() after the last Append().
            // Flush() writes the trailing checksum and finishes the streaming operation.
            cs.Flush();
        }
    }
}
```
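The loop above can be wrapped in a small helper. CompressFileStreaming below is a sketch, not part of the library:

```csharp
// Hypothetical helper: streams sourcePath into destPath in 64 MB blocks.
static void CompressFileStreaming(string sourcePath, string destPath)
{
    byte[] buffer = new byte[64 * 1024 * 1024];
    using FileStream destination = File.OpenWrite(destPath);
    using CompressStream cs = new CompressStream(destination);
    using FileStream source = File.OpenRead(sourcePath);
    int bytesRead;
    while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        cs.Append(buffer, 0, bytesRead); // Append keeps the stream open, unlike Write
    }
    cs.Flush(); // writes the checksum and ends the stream
}
```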
Streaming Decompression

```csharp
// large-file streaming decompression (>2 GB)
using (FileStream recoveryStream = File.OpenWrite(DecompressedFilePath))
{
    using (FileStream compressedFile = File.OpenRead(CompressedFilePath))
    {
        using (DecompressStream ds = new DecompressStream(compressedFile))
        {
            ds.CopyTo(recoveryStream);
        }
    }
}
```

Setting advanced compression parameters:

```csharp
Compressor compressor = new(0) { CompressLevel = 10 };
compressor.SetParameter(FL2Parameter.FastLength, 48);
```

Estimating compression memory usage:

```csharp
Compressor compressor = new(0) { CompressLevel = 10 };
nuint size = FL2.EstimateCompressMemoryUsage(compressor.ContextPtr);
// or estimate directly from parameters
size = FL2.EstimateCompressMemoryUsage(compressionLevel: 10, nbThreads: 8);
```

Finding the decompressed size of compressed data:

```csharp
nuint size = FL2.FindDecompressedSize(data);
```
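FindDecompressedSize can also serve as a pre-allocation guard. A sketch, where data is the compressed input from above:

```csharp
nuint size = FL2.FindDecompressedSize(data);
if (size <= (nuint)int.MaxValue)
{
    byte[] output = FL2.Decompress(data); // result fits in a single byte[]
}
else
{
    // too large for one array: fall back to DecompressStream as shown earlier
}
```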
Found a problem? Open an issue. PRs are welcome.
