Special cases.
– Noiseless channel: the X and Y symbols are deterministically linked, so: I(X, Y) = H(X) = H(Y)
– Channel with maximum noise power: the X and Y symbols are independent, therefore: I(X, Y) = 0
2.7. Capacity, redundancy and efficiency of a discrete channel
Claude Shannon introduced the concept of channel capacity to measure the efficiency with which information is transmitted over a channel and to establish its upper limit.
The capacity C of a channel (in information bits per symbol) is the maximum value of the mutual information I(X, Y) over the set of input symbol probabilities:
[2.54] C = max I(X, Y), the maximum being taken over the input probabilities {p(xi)}
The maximization of I(X, Y) is performed under the constraints that: p(xi) >= 0 and the sum of the p(xi) equals 1.
The maximum value of I(X, Y) occurs for well-defined values of these probabilities, which thus define a so-called secondary source.
The capacity of the channel can also be expressed per unit of time (bitrate Ct of the channel); in this case, one has:
[2.55] Ct = C/τ   (bit/s)
where τ is the average duration of one channel symbol.
The channel redundancy Rc and the relative channel redundancy ρc are defined by:
[2.56] Rc = C - I(X, Y)
[2.57] ρc = 1 - I(X, Y)/C
The efficiency ηc of the use of the channel is defined by:
[2.58] ηc = I(X, Y)/C
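To make relations [2.54] to [2.58] concrete, here is a minimal Python sketch (not from the book) for a hypothetical binary channel: it computes I(X, Y) from the noise matrix and an input distribution, approximates C by a simple grid search over p(x1) as in [2.54], and then evaluates Rc, ρc and ηc for a non-optimal input distribution.

```python
import numpy as np

def mutual_information(p_x, P_trans):
    """I(X, Y) in bits for input probabilities p_x and transition
    matrix P_trans, where P_trans[i, j] = p(y_j / x_i)."""
    p_xy = p_x[:, None] * P_trans              # joint distribution p(x_i, y_j)
    p_y = p_xy.sum(axis=0)                     # output probabilities p(y_j)
    p_prod = p_x[:, None] * p_y[None, :]       # p(x_i) * p(y_j)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / p_prod[mask])))

# Hypothetical (asymmetric) binary channel noise matrix
P_trans = np.array([[0.9, 0.1],
                    [0.2, 0.8]])

# Estimate C = max over p(x) of I(X, Y) by a grid search on p(x1)
grid = np.linspace(0.001, 0.999, 999)
C = max(mutual_information(np.array([q, 1 - q]), P_trans) for q in grid)

# Channel used with an arbitrary (non-optimal) input distribution
I_used = mutual_information(np.array([0.5, 0.5]), P_trans)

R_c = C - I_used          # channel redundancy [2.56]
rho_c = 1 - I_used / C    # relative channel redundancy [2.57]
eta_c = I_used / C        # channel efficiency [2.58]
print(f"C = {C:.4f} bit/symbol, Rc = {R_c:.4f}, rho_c = {rho_c:.4f}, eta_c = {eta_c:.4f}")
```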
2.7.1. Shannon’s theorem: capacity of a communication system
Shannon also expressed the capacity of a communication system by the following relation:
[2.59] C = B log2(1 + Ps/PN)   (bit/s)
where:
– B: is the channel bandwidth, in hertz;
– Ps: is the signal power, in watts;
– N0: is the power spectral density of the noise, assumed white and Gaussian over the frequency band B, in W/Hz;
– PN = N0 B: is the noise power, in watts.
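As a quick numerical illustration of relation [2.59], the following sketch uses hypothetical values for the bandwidth B and the signal-to-noise ratio Ps/PN:

```python
import math

B = 3000.0                      # channel bandwidth, Hz (hypothetical value)
snr_db = 30.0                   # Ps / PN in decibels (hypothetical value)
snr = 10 ** (snr_db / 10)       # linear signal-to-noise ratio

C = B * math.log2(1 + snr)      # Shannon capacity [2.59], bit/s
print(f"C = {C:.0f} bit/s")     # about 29 900 bit/s
```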
EXAMPLE.– Binary symmetric channel (BSC).
Any binary channel is characterized by the noise matrix:
[P(Y/X)] = | p(y1/x1)   p(y2/x1) |
           | p(y1/x2)   p(y2/x2) |
If the binary channel is symmetric, then one has:
p(y1/x2) = p(y2/x1) = p
p(y1/x1) = p(y2/x2) = 1 − p
Figure 2.5. Binary symmetric channel
The channel capacity is:
C = max [H(Y) - H(Y/X)]
The conditional entropy H(Y/X) is:
H(Y/X) = -p log2 p - (1 - p) log2(1 - p)
which does not depend on the input probabilities. Hence:
C = max H(Y) + p log2 p + (1 - p) log2(1 - p)
But max H(Y) = 1, obtained for p(y1) = p(y2). It follows from the symmetry of the channel that if p(y1) = p(y2), then p(x1) = p(x2) = 1/2, and C is given by:
C = 1 + p log2 p + (1 - p) log2(1 - p)   (information bit/symbol)
Figure 2.6. Variation of the capacity of a BSC according to p
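The short sketch below (not part of the book) evaluates this capacity formula for a few values of p; it reproduces the behavior of Figure 2.6: C = 1 information bit/symbol for p = 0 or p = 1, and C = 0 for p = 0.5.

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with error probability p."""
    if p in (0.0, 1.0):
        return 1.0  # noiseless (or deterministically inverted) channel
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"p = {p:4.2f}  ->  C = {bsc_capacity(p):.4f} bit/symbol")
# C is maximal (1) at p = 0 and p = 1, and vanishes at p = 0.5
```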
2.8. Entropies with k random variables
The joint entropy of k random variables is written:
[2.60] H(X1, X2, …, Xk) = -∑ p(x1, x2, …, xk) log2 p(x1, x2, …, xk)
where the sum is taken over all possible k-tuples (x1, x2, …, xk).
Furthermore, by applying the relationship H(X, Y) ≤ H(X) + H(Y) recursively, setting Y = (X2, X3, …, Xk), then Y = (X3, …, Xk), and so on, one has:
[2.61] H(X1, X2, …, Xk) ≤ H(X1) + H(X2) + … + H(Xk)
Equality occurs when the variables are independent.
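As an illustrative check of relations [2.60] and [2.61], the following sketch uses a hypothetical joint distribution of two binary variables to compare H(X1, X2) with H(X1) + H(X2), and shows that equality holds when the variables are independent.

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability array p (zero entries are ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical joint distribution of two dependent binary variables (X1, X2)
p_joint = np.array([[0.4, 0.1],
                    [0.1, 0.4]])

H_joint = entropy(p_joint)                    # H(X1, X2), relation [2.60]
H_sum = entropy(p_joint.sum(axis=1)) + entropy(p_joint.sum(axis=0))
print(H_joint, "<=", H_sum)                   # inequality [2.61]

# With independent variables the two sides coincide
p_indep = np.outer([0.5, 0.5], [0.8, 0.2])
print(entropy(p_indep), "==", entropy([0.5, 0.5]) + entropy([0.8, 0.2]))
```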