Search results for: huffman

Number of results: 1461

Journal: IEEE Transactions on Information Theory 1976

2009
A. O. Oke

With the recent trend in information and communication technology, storage and transfer of data are two vital issues, with cost and speed implications respectively. Large volumes of data (text or images) are constantly being processed on the internet or on personal computers, which has driven upgrades of current systems. Hence the need for compression, which reduces storage ...

2005
Stephen Huffman

Dr. Stephen Huffman [email protected] [Editor’s note: This paper describes a series of maps produced by Dr. Huffman using a pre-release version of GMI’s World Language Mapping System (WLMS) GIS data set. Most of these maps have been adjusted to use the current released version of the WLMS along with GMI’s Seamless Digital Chart of the World. Images of the maps, Portable Document File (PDF) files of ...

2016
Sona Khanna Suman Kumari

Lossless compression of a sequence of symbols is a crucial part of data and signal compression. Huffman coding is lossless in nature; it is also widely used in lossy compression as the final step after decomposition and quantization of a signal. In signal compression, the decomposition and quantization stages seldom manage to produce a sequence of completely independent symbols....

2001
Yi-Ping You Shi-Chun Tsai

Though Huffman codes [2,3,4,5,9] have shown their power in data compression, some issues have gone unnoticed. In the present paper, we address the randomness properties of data compressed via Huffman coding. Randomized computation is the only known method for many notoriously difficult #P-complete problems, such as computing the permanent and some network reliability problems [1,...

Journal: :Comput. J. 1998
Roberto De Prisco Alfredo De Santis

While compressing a file with a Huffman code, it is possible that the size of the file grows temporarily. This happens when the source letters with low frequencies (to which long codewords are assigned) are encoded first. The maximum data expansion is the average growth in bits per source letter resulting from the encoding of a source letter with a long codeword. It is a measure of the worst ca...
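The temporary-growth effect described above can be sketched numerically. The following is an illustrative Python sketch, not the paper's exact formalism; the code lengths, probabilities, and message are hypothetical examples chosen so that long codewords come first.

```python
# Illustrative sketch: with a fixed Huffman code, encoding rare letters
# (long codewords) first makes the running output temporarily longer than
# the average codeword length predicts.
# The code lengths and probabilities below are hypothetical examples.
lengths = {"a": 1, "b": 2, "c": 3, "d": 3}            # bits per codeword
probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}
avg = sum(probs[s] * lengths[s] for s in probs)        # 1.75 bits per letter

def max_expansion(msg, lengths, avg):
    """Largest excess of emitted bits over avg * letters, over all prefixes."""
    emitted, worst = 0, 0.0
    for k, s in enumerate(msg, start=1):
        emitted += lengths[s]
        worst = max(worst, emitted - k * avg)
    return worst

# Rare letters first: after "dc" we have emitted 6 bits where the average
# predicts only 3.5, a temporary expansion of 2.5 bits.
peak = max_expansion("dcab", lengths, avg)
```

The peak expansion occurs on the prefix containing only the long codewords, matching the paper's observation that low-frequency letters encoded first cause the worst temporary growth.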

1998
Nick B. Body Donald G. Bailey

The lossless entropy coding used in many image coding schemes is often overlooked, as most research focuses on the lossy stages of image compression. This paper examines the relative merits of using static Huffman coding with a compact optimal table versus more sophisticated adaptive arithmetic methods. For very low bit rate image compression, the computationally simple Huffman method is sh...

2005
Soheil Mohajer Payam Pakzad Ali Kakhbod

Consider a discrete finite source with N symbols and probability distribution p := (u_1, u_2, ..., u_N). It is well known that the Huffman encoding algorithm [1] provides an optimal prefix code for this source. A D-ary Huffman code is usually represented by a D-ary tree T, whose leaves correspond to the source symbols; the D edges emanating from each intermediate node of T are lab...
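The tree construction the abstract describes specializes, for D = 2, to the familiar repeated merge of the two least-probable subtrees. A minimal Python sketch assuming binary codes; the symbols and probabilities are illustrative, not from the paper:

```python
# Minimal binary (D = 2) Huffman code construction using a heap.
# Each heap entry carries the subtree's total probability, a tie-breaking
# counter, and a partial mapping {symbol: codeword-so-far}.
import heapq

def huffman_code(probs):
    """Return a prefix code (symbol -> bit string) for {symbol: probability}."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)      # least probable subtree
        p1, _, c1 = heapq.heappop(heap)      # second least probable
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

# Hypothetical source: four symbols with dyadic-ish probabilities.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
```

Because codewords are built by prefixing a bit at each merge, the result is prefix-free by construction, mirroring the leaf-labelled tree representation in the abstract.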

1996
David W. Redmill David R. Bull

This paper examines the use of arithmetic coding in conjunction with the error resilient entropy code (EREC). The constraints on the coding model are discussed and simulation results are presented and compared to those obtained using Huffman coding. These results show that without the EREC, arithmetic coding is less resilient than Huffman coding, while with the EREC both systems yield comparabl...

Journal: :CoRR 2015
Song Han Huizi Mao William J. Dally

Deep Compression is a three-stage compression pipeline: pruning, quantization, and Huffman coding. Pruning reduces the number of weights by 10x; quantization further improves the compression rate to between 27x and 31x. Huffman coding gives additional compression: between 35x and 49x. The compression rate already includes the metadata for the sparse representation. Deep Compression does not incur loss of accu...
