QuickLZ - Fastest Compression Library in C, C# and Java


QuickLZ is the world's fastest compression library, reaching 308 MB/s per core. It supports a streaming mode that preserves a good compression ratio even for small packets down to 200-300 bytes in size.

http://www.quicklz.com/
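A minimal sketch of single-shot (non-streaming) use of QuickLZ's C API is shown below. It assumes the documented qlz_compress/qlz_decompress calls and the manual's rule that the destination buffer be at least the input size plus 400 bytes; check the details against the quicklz.h shipped with your chosen QLZ_COMPRESSION_LEVEL.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include "quicklz.h"   /* compile together with quicklz.c */

    int main(void)
    {
        const char *text = "QuickLZ compresses small buffers quickly.";
        size_t len = strlen(text) + 1;

        /* The manual requires the destination to be at least input size + 400 bytes. */
        char *compressed = malloc(len + 400);
        char *restored   = malloc(len);

        /* Scratch states; zero-initializing them is a conservative choice. */
        qlz_state_compress   *sc = calloc(1, sizeof(qlz_state_compress));
        qlz_state_decompress *sd = calloc(1, sizeof(qlz_state_decompress));

        size_t csize = qlz_compress(text, compressed, len, sc);
        size_t dsize = qlz_decompress(compressed, restored, sd);

        printf("original %zu bytes, compressed %zu bytes, restored %zu bytes\n",
               len, csize, dsize);

        free(compressed); free(restored); free(sc); free(sd);
        return 0;
    }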


Related Projects

Zstandard - Fast real-time compression algorithm


Zstandard is a real-time compression algorithm, providing high compression ratios. It offers a very wide range of compression/speed trade-offs, while being backed by a very fast decoder. It also offers a special mode for small data, called dictionary compression, and can create dictionaries from any sample set.
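A minimal sketch of Zstandard's one-shot C API (not the dictionary mode) might look like the following; it assumes the standard ZSTD_compress/ZSTD_decompress entry points and should be checked against the zstd.h you build against. Dictionary training itself goes through the separate ZDICT API.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <zstd.h>   /* link with -lzstd */

    int main(void)
    {
        const char *src = "Zstandard trades compression level against speed.";
        size_t srcSize = strlen(src) + 1;

        size_t bound = ZSTD_compressBound(srcSize);   /* worst-case compressed size */
        void *dst = malloc(bound);

        /* Level 1 favours speed; higher levels favour ratio. */
        size_t csize = ZSTD_compress(dst, bound, src, srcSize, 1);
        if (ZSTD_isError(csize)) { fprintf(stderr, "%s\n", ZSTD_getErrorName(csize)); return 1; }

        void *back = malloc(srcSize);
        size_t dsize = ZSTD_decompress(back, srcSize, dst, csize);
        if (ZSTD_isError(dsize)) { fprintf(stderr, "%s\n", ZSTD_getErrorName(dsize)); return 1; }

        printf("%zu -> %zu -> %zu bytes\n", srcSize, csize, dsize);
        free(dst); free(back);
        return 0;
    }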

py-quicklz - python binding for quicklz compression library


Python binding for the QuickLZ compression library.

zlib - A Massively Spiffy Yet Delicately Unobtrusive Compression Library


zlib is a general-purpose data compression library. All of the code is thread safe. It has been ported to many programming languages, including Java, C#, Python, and Perl.
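A minimal sketch using zlib's convenience functions compress()/uncompress() (DEFLATE with a zlib wrapper); gzip output and streaming compression go through the lower-level deflate()/inflate() API instead.

    #include <stdio.h>
    #include <string.h>
    #include <zlib.h>   /* link with -lz */

    int main(void)
    {
        const char *src = "zlib is the reference DEFLATE implementation.";
        uLong srcLen = (uLong)strlen(src) + 1;

        Bytef out[256];
        uLongf outLen = compressBound(srcLen);      /* worst-case size for this input */
        if (compress(out, &outLen, (const Bytef *)src, srcLen) != Z_OK)
            return 1;

        Bytef back[256];
        uLongf backLen = sizeof back;
        if (uncompress(back, &backLen, out, outLen) != Z_OK)
            return 1;

        printf("%lu -> %lu -> %lu bytes\n",
               srcLen, (unsigned long)outLen, (unsigned long)backLen);
        return 0;
    }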

Lz4 - Extremely Fast Compression algorithm


LZ4 is a very fast lossless compression algorithm based on the well-known LZ77 (Lempel-Ziv) scheme, providing compression speed of 300 MB/s per core and scaling with multi-core CPUs. It also features an extremely fast decoder, with speeds at and beyond 1 GB/s per core, typically reaching RAM speed limits on multi-core systems.
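A minimal sketch of LZ4's block API, assuming the standard LZ4_compress_default/LZ4_decompress_safe calls; the framed format used by the lz4 command-line tool goes through the separate LZ4F (lz4frame.h) API.

    #include <stdio.h>
    #include <string.h>
    #include <lz4.h>    /* link with -llz4 */

    int main(void)
    {
        const char *src = "LZ4 favours speed over maximum ratio.";
        int srcSize = (int)strlen(src) + 1;

        char compressed[256];
        int csize = LZ4_compress_default(src, compressed, srcSize, (int)sizeof compressed);
        if (csize <= 0) return 1;          /* 0 means the output buffer was too small */

        char restored[256];
        int dsize = LZ4_decompress_safe(compressed, restored, csize, (int)sizeof restored);
        if (dsize < 0) return 1;           /* negative means malformed input */

        printf("%d -> %d -> %d bytes\n", srcSize, csize, dsize);
        return 0;
    }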

SHARC - Fastest lossless compression algorithm


SHARC is an extremely fast lossless dictionary-based compression algorithm. It is capable of an unprecedented compression speed of more than 500 MB/s per core on modern Intel CPUs! It scales across multiple cores/CPUs, is written in pure C99, and is easily portable to many platforms.

Basic Compression Library


Basic Compression Library is a set of open source implementations of RLE (Run Length Encoding), Huffman, Rice, Lempel-Ziv (LZ77) and Shannon-Fano compression algorithms.

heatshrink - data compression/decompression library for embedded/real-time systems


data compression/decompression library for embedded/real-time systems

7-Zip - File archiver with a high compression ratio


7-Zip is a file archiver with a high compression ratio. The program supports 7z, XZ, BZIP2, GZIP, TAR, ZIP, WIM, ARJ, CAB, CHM, CPIO, CramFS, DEB, DMG, FAT, HFS, ISO, LZH, LZMA, MBR, MSI, NSIS, NTFS, RAR, RPM, SquashFS, UDF, VHD, XAR, and Z.

Givati-Compression


A smart and lossless data compression algorithm with a focus on repeating data structures like JSON/JSONH, network and language permeability, decompression without requiring a pre-agreed dictionary, and minimum size over speed. Its compression ratio and speed are comparable to Lempel-Ziv-Welch (LZW).

Huffman-Compression - Implementation of the Huffman compression and decompression algorithms.


Implementation of the Huffman compression and decompression algorithms.

pigz - A parallel implementation of gzip for modern multi-processor, multi-core machines


pigz, which stands for parallel implementation of gzip, compresses using threads to make use of multiple processors and cores. The input is broken up into 128 KB chunks, and each chunk is compressed in parallel. The individual check value for each chunk is also calculated in parallel. The compressed data is written in order to the output, and a combined check value is calculated from the individual check values.
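pigz itself is a full program, but the "combined check value" idea can be illustrated with zlib's crc32_combine(): per-chunk CRC-32 values computed independently (in pigz, by worker threads) are merged in order and match the CRC of the whole stream. The sketch below uses a tiny chunk size purely for illustration.

    #include <stdio.h>
    #include <string.h>
    #include <zlib.h>   /* link with -lz */

    #define CHUNK 8      /* pigz uses 128 KB; small here for illustration */

    int main(void)
    {
        const char data[] = "independent chunk checksums can be merged afterwards";
        size_t len = strlen(data);

        /* CRC of the whole buffer, as a single-threaded tool would compute it. */
        uLong whole = crc32(crc32(0L, Z_NULL, 0), (const Bytef *)data, (uInt)len);

        /* Per-chunk CRCs (in pigz these come from worker threads),
           merged in order with crc32_combine(). */
        uLong combined = crc32(0L, Z_NULL, 0);
        for (size_t off = 0; off < len; off += CHUNK) {
            size_t n = (len - off < CHUNK) ? len - off : CHUNK;
            uLong part = crc32(crc32(0L, Z_NULL, 0), (const Bytef *)data + off, (uInt)n);
            combined = crc32_combine(combined, part, (z_off_t)n);
        }

        printf("whole: %08lx  combined: %08lx\n", whole, combined);
        return 0;
    }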

Info ZIP - Compressor Archiver Utilities


Info-ZIP's purpose is to provide free, portable, high-quality versions of the Zip and UnZip compressor-archiver utilities that are compatible with the DOS-based PKZIP.

PeaZip - Cross-platform file and archive manager


PeaZip is a free file archiver utility and RAR extractor for Windows and Linux. It works with 150+ archive types and variants (7z, ace, arc, bz2, cab, gz, iso, paq, pea, rar, tar, wim, zip, zipx...), handles spanned archives, and supports multiple archive encryption standards. The project aims to provide a cross-platform, portable GUI frontend for multiple open-source technologies (7-Zip, FreeArc, PAQ, PEA, UPX) focused on file and archive management and security.

dictionary - High-performance dictionary coding


Suppose you want to compress a large array of values with (relatively) few distinct values. For example, maybe you have 16 distinct 64-bit values. Only four bits are needed to store a value in the range [0,16) using binary packing, so if you have long arrays, it is possible to save 60 bits per value (compress the data by a factor of 16).

We consider the following (simple) form of dictionary coding. We have a dictionary of 64-bit values (could be pointers) stored in an array. In the compression phase, we convert the values to indexes and binary pack them. In the decompression phase, we try to recover the dictionary-coded values as fast as possible.
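As a toy illustration of the idea (not the project's SIMD code), the sketch below packs each value's dictionary index into four bits, two indexes per byte, and recovers the original values by dictionary lookup.

    #include <stdint.h>
    #include <stdio.h>

    /* Pack one 4-bit index per value, two indexes per byte (dictionary of <= 16 entries). */
    static void pack4(const uint64_t *vals, size_t n,
                      const uint64_t *dict, size_t dict_n, uint8_t *out)
    {
        for (size_t i = 0; i < n; i++) {
            uint8_t idx = 0;
            for (size_t d = 0; d < dict_n; d++)          /* linear lookup, for clarity only */
                if (dict[d] == vals[i]) { idx = (uint8_t)d; break; }
            if (i % 2 == 0) out[i / 2] = idx;                       /* low nibble  */
            else            out[i / 2] |= (uint8_t)(idx << 4);      /* high nibble */
        }
    }

    static void unpack4(const uint8_t *in, size_t n,
                        const uint64_t *dict, uint64_t *out)
    {
        for (size_t i = 0; i < n; i++) {
            uint8_t idx = (i % 2 == 0) ? (in[i / 2] & 0x0F) : (in[i / 2] >> 4);
            out[i] = dict[idx];
        }
    }

    int main(void)
    {
        const uint64_t dict[4] = { 10, 20, 30, 40 };     /* 4 distinct 64-bit values */
        const uint64_t vals[6] = { 10, 40, 20, 20, 30, 10 };
        uint8_t packed[3] = {0};
        uint64_t restored[6];

        pack4(vals, 6, dict, 4, packed);                 /* 48 bytes shrink to 3 */
        unpack4(packed, 6, dict, restored);

        for (int i = 0; i < 6; i++) printf("%llu ", (unsigned long long)restored[i]);
        printf("\n");
        return 0;
    }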

zlib - Compression and decompression in the gzip and zlib formats


Compression and decompression in the gzip and zlib formats

abbrev - A library of lossless compression/decompression algorithms, written in Clojure.


A library of lossless compression/decompression algorithms, written in Clojure.

huffman_coding - compression and decompression


compression and decompression

LZW - LZW Compression/Decompression in C


LZW Compression/Decompression in C

huffman - Compression (and decompression) utility implemented with Huffman coding algorithm


Compression (and decompression) utility implemented with Huffman coding algorithm

node-tar.gz - Native gzip compression and decompression utility for Node.js.


Native gzip compression and decompression utility for Node.js.