Analysis of the Sufficient Path Elimination Window for the Maximum-Likelihood Sequential-Search Decoding Algorithm for Binary Convolutional Codes
Authors
(Affiliations) SunplusMM Technology Co., Ltd., HsinChu City, Taiwan 300, ROC; Dept. of Communications Eng., National Chiao-Tung Univ., HsinChu City, Taiwan 300, ROC; Graduate Institute of Communication Eng., National Taipei Univ., Taipei, Taiwan 237, ROC
Abstract
In this work, the priority-first sequential-search decoding algorithm proposed in [8] is revisited. By replacing the conventional Fano metric with one derived from the Wagner rule, the sequential-search decoding in [8] guarantees maximum-likelihood (ML) performance and was hence named the maximum-likelihood sequential decoding algorithm (MLSDA). Simulations then showed that the software computational complexity of the MLSDA is in general considerably smaller than that of the Viterbi algorithm. A common problem with sequential-type decoding is that, at signal-to-noise ratios (SNRs) below the one corresponding to the cutoff rate, the average decoding complexity per information bit and the required stack size grow rapidly with the information length [13]. This problem to some extent prevents the practical use of sequential-type decoding for convolutional codes with long information sequences at low SNRs. In order to alleviate this problem in the MLSDA, we propose to directly eliminate any top path whose end node is ∆ trellis levels prior to the farthest node among all nodes that have been expanded thus far by the sequential search; we term this early elimination. Following a random-coding argument, we analyze the early-elimination window ∆ that results in negligible performance degradation for the MLSDA. Our analytical results indicate that the early-elimination window required for negligible performance degradation is just twice the constraint length for rate one-half convolutional codes. For rate one-third convolutional codes, the required early-elimination window even reduces to the constraint length. The theoretically suggested window thresholds almost coincide with the simulation results. As a consequence of the small early-elimination window required for near-maximum-likelihood performance, the MLSDA with the early-elimination modification avoids a considerable computational burden, as well as memory requirements, by directly eliminating a large number of top paths, which makes the MLSDA with early elimination very suitable for applications that dictate a low-complexity software implementation with near-maximum-likelihood performance.

Index Terms: Sequential decoding, maximum likelihood, soft decision, random coding
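To illustrate the modified search, the following is a minimal sketch of a priority-first (stack) trellis search with an early-elimination window ∆. The callbacks `successors` and `branch_metric` are hypothetical stand-ins for the code trellis and for the Wagner-rule path metric, and the sketch is not the authors' implementation.

```python
import heapq

def mlsda_with_early_elimination(successors, branch_metric, n_levels, delta):
    """Priority-first (stack) trellis search with an early-elimination window.

    `successors(state, level)` yields (next_state, branch_bits) pairs of the
    code trellis, and `branch_metric(branch_bits, level)` returns the metric
    increment to be minimized (e.g. a Wagner-rule metric).  Both callbacks are
    hypothetical stand-ins; this is only a sketch of the search skeleton.
    """
    # Stack (priority queue) entries: (path metric, trellis level, state, decoded bits)
    stack = [(0.0, 0, 0, ())]
    farthest = 0                    # deepest level among all nodes expanded so far

    while stack:
        metric, level, state, bits = heapq.heappop(stack)

        # Early elimination: discard the top path whose end node lies `delta`
        # or more trellis levels behind the farthest node expanded thus far.
        if farthest - level >= delta:
            continue

        if level == n_levels:       # reached the end of the trellis
            return list(bits)

        farthest = max(farthest, level)
        for next_state, branch_bits in successors(state, level):
            heapq.heappush(stack, (metric + branch_metric(branch_bits, level),
                                   level + 1, next_state, bits + tuple(branch_bits)))

    return None                     # every path was eliminated (window too small)
```

A small ∆ keeps the stack short because lagging paths are dropped outright; the abstract's analysis suggests ∆ on the order of one to two constraint lengths already gives near-ML performance.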
Similar Resources
Sequential Decoding of Convolutional Codes
This article surveys many variants of sequential decoding in the literature. Rather than introducing them chronologically, this article first presents Algorithm A, a general sequential search algorithm. The stack algorithm and the Fano algorithm are then described in detail. Next, trellis variants of sequential decoding, including the recently proposed maximum-likelihood sequential decoding al...
On Optimality Test in Low Complexity Maximum Likelihood Decoding of Convolutional Codes
This paper considers the average complexity of maximum likelihood (ML) decoding of convolutional codes. ML decoding can be modeled as finding the most probable path taken through a Markov graph. Integrated with the Viterbi algorithm (VA), complexity reduction methods such as the sphere decoder often use the sum log likelihood (SLL) of a Markov path as a bound to disprove (or test) the optimalit...
A Study of Viterbi Decoder Algorithm for Wireless LANs
Viterbi Decoders are commonly used to decode convolutional codes in communications systems. This Viterbi Decoder is a fully parallel implementation which gives fast data throughput. The decoder is targeted for WiMAX and Wireless LAN applications. Input symbol metric pairs are decoded into output data bits by the maximum likelihood Viterbi processor core. Decoder supports both hard and soft inpu...
Syndrome decoding of binary convolutional codes
This thesis concerns the decoding of conventional convolutional codes. Two new decoding methods are developed that both use a channel-noise-generated syndrome. The first decoder is based on the polynomial parity matrix H^T, whereas the second one uses both the matrices B^T and (H^-1)^T, i.e., the transpose of the right inverse of H. Hence, the code generator G does not play the promin...
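As a small, self-contained illustration of the syndrome idea mentioned in the last entry above: because every codeword c satisfies c·H^T = 0 (mod 2), the syndrome of a received word depends only on the channel noise. The toy parity-check matrix below is purely illustrative (a short block code rather than the convolutional parity matrix of that thesis), and the variable names are chosen here for the example only.

```python
import numpy as np

# Toy parity-check matrix H for illustration (not the convolutional H of the thesis).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]], dtype=int)

codeword = np.array([1, 0, 1, 1, 1, 0])   # satisfies codeword @ H.T % 2 == 0
noise    = np.array([0, 0, 1, 0, 0, 0])   # single channel error
received = (codeword + noise) % 2

# The syndrome depends only on the noise pattern, not on the transmitted codeword:
syndrome_from_received = received @ H.T % 2
syndrome_from_noise    = noise    @ H.T % 2
assert (syndrome_from_received == syndrome_from_noise).all()
print(syndrome_from_received)              # prints [0 1 1]
```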
Journal:
CoRR
Volume: abs/cs/0701080
Pages: -
Publication date: 2007