Hard and soft decision decoding
Hard decision decoding takes a stream of bits, say from the 'threshold detector' stage of a receiver, where each bit is considered definitely one or zero. E.g. for binary signalling, received pulses are sampled and the resulting voltages are compared with a single threshold. If a voltage is greater than the threshold it is considered to be definitely a 'one', say, regardless of how close it is to the threshold. If it is less, it is definitely a 'zero'.
Soft decision decoding requires a stream of 'soft bits', where we get not only the 1 or 0 decision but also an indication of how certain we are that the decision is correct.
One way of implementing this would be to make the threshold detector generate, instead of 0 or 1, say:
000 (definitely 0), 001 (probably 0), 010 (maybe 0), 011 (guess 0),
100 (guess 1), 101 (maybe 1), 110 (probably 1), 111 (definitely 1).
We may call the last two bits'confidence'bits.
This is easy to do with seven voltage thresholds rather than one.
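The quantiser described above can be sketched as follows. This is a minimal illustration, not a particular receiver design; the voltage range and function name are assumptions.

```python
# Sketch of a 3-bit soft-decision threshold detector.  Seven equally
# spaced thresholds split the voltage range into eight levels:
# 000 = definitely 0 ... 111 = definitely 1.

def soft_detect(voltage, v_min=-1.0, v_max=1.0):
    """Quantise a received voltage into a 3-bit soft decision (0..7)."""
    step = (v_max - v_min) / 8           # eight levels -> seven thresholds
    level = int((voltage - v_min) / step)
    return max(0, min(7, level))         # clamp out-of-range voltages

# A voltage only just above mid-range is merely a "guess 1":
print(format(soft_detect(0.1), '03b'))   # -> 100
```

Hard decision decoding is the special case of keeping only the first of the three bits.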
This helps when we anticipate errors and have some 'forward error correction' (FEC) coding built into the transmission. Define FEC precisely.
Example: A receiver receives a bit stream consisting of sequences of 8-bit words, each containing 7 information bits and one parity bit. The parity bit is set at the transmitter in such a way that the total number of ones in each 8-bit word is even. Even parity.
A soft decision threshold detector as described above generates the following outputs.
(i) 000 110 010 111 001 011 110 111
(ii) 000 110 010 111 001 011 110 001
What is the most likely 8-bit word in each case?
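One way to answer this mechanically is to treat each 3-bit soft value (0..7) as a distance and search all even-parity 8-bit words for the one closest to the received values. The helper name and the brute-force search are illustrative, not part of the exercise:

```python
from itertools import product

# Each soft value is 0..7, where 0 = definitely 0 and 7 = definitely 1.
# Hypothesising bit b against soft value v costs v if b == 0, else 7 - v.

def most_likely_word(soft_values):
    """Return the even-parity word with the smallest total soft distance."""
    best = None
    for bits in product((0, 1), repeat=len(soft_values)):
        if sum(bits) % 2:                 # skip odd-parity words
            continue
        dist = sum(v if b == 0 else 7 - v
                   for b, v in zip(bits, soft_values))
        if best is None or dist < best[0]:
            best = (dist, bits)
    return ''.join(map(str, best[1]))

stream_i  = [0b000, 0b110, 0b010, 0b111, 0b001, 0b011, 0b110, 0b111]
stream_ii = [0b000, 0b110, 0b010, 0b111, 0b001, 0b011, 0b110, 0b001]
print(most_likely_word(stream_i))   # -> 01010011 (parity already even)
print(most_likely_word(stream_ii))  # -> 01010110
```

In case (ii) the hard decisions fail the parity check, and the cheapest repair is to flip the bit we were least sure about: the 'guess 0' (011) in position six.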
Say what convolutional coding is, and describe a half-rate coder and a 3/4-rate coder.
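The notes do not fix a particular coder, so as one concrete sketch, here is a common textbook half-rate encoder: constraint length 3, generator polynomials 111 and 101 (octal 7 and 5). The function name is an assumption:

```python
def conv_encode_half_rate(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder: two output bits per input bit."""
    state = 0                                      # two-bit shift register
    out = []
    for b in bits:
        reg = (b << 2) | state                     # new bit + past two bits
        out.append(bin(reg & g1).count('1') % 2)   # parity against g1
        out.append(bin(reg & g2).count('1') % 2)   # parity against g2
        state = reg >> 1                           # shift register along
    return out

print(conv_encode_half_rate([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

A 3/4-rate coder can be obtained from a half-rate one by 'puncturing', i.e. deleting some of the output bits according to a fixed pattern.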
The Viterbi algorithm can take the 'soft bit' words and compute distances etc. as easily as it deals with hard bits. There is no great additional complexity apart from dealing with words (in this example, 3-bit words) rather than one-bit words. But the decisions are likely to be much better, with greater reliability being placed on bits we are certain about than on bits we are more uncertain about.
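The distance computation the Viterbi algorithm needs is the same soft metric as before, applied per branch. A minimal sketch, assuming 3-bit soft values (0..7) and an illustrative function name:

```python
# A Viterbi branch metric using 3-bit soft values instead of hard bits.
# The expected bits on a trellis branch are compared with the received
# soft words; the rest of the algorithm is unchanged.

def soft_branch_metric(expected_bits, soft_words):
    """Distance between a branch's expected bits and received soft words."""
    return sum(v if b == 0 else 7 - v
               for b, v in zip(expected_bits, soft_words))

# "definitely 1" (111) then "guess 0" (011), against expected bits (1, 0):
print(soft_branch_metric((1, 0), (0b111, 0b011)))  # -> 3
```

With hard decisions both received bits would simply agree with the branch, distance 0; the soft metric records that the second bit was nearly a coin toss.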