Abstract

Sequential decoding has been found to be an efficient means of communicating at low undetected error rates from deep space probes, but a failure mechanism known as erasure, or computational overflow, remains a significant problem. A block is erased when the decoder has not finished decoding it by the time it must be output. A recent article developed a technique for scheduling the operations of a sequential decoder that can significantly reduce erasure probability relative to a decoder with the same parameters using the conventional method of scheduling. Previously reported performance results depend upon the accuracy of an accepted model for the number of computations needed to decode a block of data. This article presents a reevaluation of decoder performance using actual sequential decoding data. The results are essentially unchanged: a decoder with a 10-block buffer will achieve less than a 10“ erasure probability with the new scheduling technique whenever a similar decoder achieved less than a 10-* erasure probability in conventional operation.
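The erasure mechanism described above can be sketched in a short simulation. Nothing below is taken from the article itself: the Pareto-tailed computation model (the kind of "accepted model" the abstract refers to), the decoder speed, and the first-in, first-out scheduling baseline are all illustrative assumptions, chosen only to show how a finite buffer turns heavy-tailed decoding effort into erasures.

```python
import random

def erasure_probability(n_blocks=100_000, buffer_blocks=10, speed=5.0,
                        pareto_alpha=1.0, c_min=1.0, seed=1):
    """Estimate the erasure probability of a conventionally scheduled
    (first-in, first-out) sequential decoder -- a toy model, not the
    article's scheduler.

    Each block requires C computations with a Pareto tail,
    P(C > x) = (c_min / x)**pareto_alpha for x >= c_min.  The decoder
    performs `speed` computations per block time, blocks arrive one per
    block time, and a block is erased if it is still undecoded
    `buffer_blocks` block times after its arrival, i.e., when it must
    be output.
    """
    rng = random.Random(seed)
    backlog = 0.0                    # computations queued ahead of a new block
    budget = speed * buffer_blocks   # computations available before a deadline
    erasures = 0
    for _ in range(n_blocks):
        c = c_min * rng.random() ** (-1.0 / pareto_alpha)  # Pareto draw
        if backlog + c > budget:
            # The block cannot finish before it must be output: erase it
            # and (as an approximation) abandon it at once, charging only
            # the work that fit within its deadline.
            erasures += 1
            c = max(budget - backlog, 0.0)
        # One block time elapses before the next arrival.
        backlog = max(backlog + c - speed, 0.0)
    return erasures / n_blocks
```

Under these assumptions, enlarging the buffer lowers the simulated erasure probability, which is the motivation for buffering; the article's contribution is a scheduling discipline that does better than this FIFO baseline at equal buffer size.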

Details

Volume
IX
Published
June 15, 1972
Pages
88–96
File Size
920.7 KB