
Shannon–Hartley theorem

The Shannon–Hartley theorem in information theory is the application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communication channel distorted by Gaussian noise. The theorem establishes the Shannon channel capacity, an upper bound on the maximum amount of error-free digital data (that is, information) that can be transmitted over such a communication link with a specified bandwidth in the presence of noise, under the assumptions that the signal power is bounded and that the Gaussian noise is characterized by a known power or power spectral density. The law is named after Claude Shannon and Ralph Hartley.


Statement of the theorem

Considering all possible multi-level and multi-phase encoding methods, the Shannon–Hartley theorem states that the channel capacity $C$, the theoretical upper bound on the rate at which data can be transmitted with a given average signal power $S$ over an analog communication channel subject to additive white Gaussian noise of power $N$, equals

$$C = B \log_2 \left( 1 + \frac{S}{N} \right),$$

where

$C$ - channel capacity, bit/s;
$B$ - channel bandwidth, Hz;
$S$ - total signal power over the passband, W or V²;
$N$ - total noise power over the passband, W or V²;
$S/N$ - signal-to-noise power ratio (SNR).
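
As a concrete illustration of the formula, here is a minimal Python sketch; the 3.1 kHz telephone-grade bandwidth and 30 dB SNR are assumed example values, not figures from the article:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of an AWGN channel, in bit/s."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative values: a 3.1 kHz telephone-grade channel at 30 dB SNR.
snr = 10.0 ** (30.0 / 10.0)  # convert dB to a linear power ratio
print(f"C = {shannon_capacity(3100.0, snr):.0f} bit/s")  # ~30898 bit/s
```

Note that the SNR enters the formula as a linear power ratio; values quoted in decibels must be converted first.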

Development History

During the late 1920s, Harry Nyquist and Ralph Hartley developed fundamental ideas related to the transmission of information by telegraph as a communication system. These were breakthroughs at the time, but information theory as a science did not yet exist. In the 1940s, Claude Shannon introduced the concept of channel capacity, building on the ideas of Nyquist and Hartley, and then formulated a complete theory of information transmission.

Nyquist rate

In 1927, Nyquist established that the number of independent pulses per unit time that can be transmitted over a telegraph channel is limited to twice the channel bandwidth (this limiting frequency corresponds to an alternating sequence of zeros and ones; other signal combinations correspond to lower frequencies):

$$f_p \leqslant 2B,$$

where $f_p$ is the pulse frequency (pulses per second) and $B$ is the bandwidth (Hz).
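
A one-line sketch of this bound, with an assumed 3.1 kHz bandwidth for illustration:

```python
def nyquist_pulse_rate(bandwidth_hz: float) -> float:
    """Nyquist bound: at most 2B independent pulses per second."""
    return 2.0 * bandwidth_hz

print(nyquist_pulse_rate(3100.0))  # 6200.0 pulses/s for an assumed 3.1 kHz channel
```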

Hartley formula

Hartley quantified the achievable line rate $R$ (bit/s) of a channel of bandwidth $B$ when $M$ distinguishable signal levels are used (Hartley's law):

$$R = 2B \log_2 M.$$
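
A small sketch of Hartley's law, again with illustrative values (a 3.1 kHz channel and 4 levels, i.e. 2 bits per pulse):

```python
import math

def hartley_rate(bandwidth_hz: float, levels: int) -> float:
    """Hartley's law: line rate 2B*log2(M) in bit/s for M distinguishable levels."""
    return 2.0 * bandwidth_hz * math.log2(levels)

print(f"R = {hartley_rate(3100.0, 4):.0f} bit/s")  # 12400 bit/s
```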

Shannon's theorems for a noisy channel

Shannon's theorems for a noisy channel (Shannon's theorems for transmission over a channel with noise) relate the capacity of an information transmission channel to the existence of a code that can be used to transmit information over the channel with an error probability tending to zero (as the block length increases).

If the message transmission rate $R$ is less than the channel capacity $C$,

$$R < C,$$

then there exist codes and decoding methods such that the average and maximal decoding error probabilities tend to zero as the block length tends to infinity.

If

$$R > C,$$

then there is no code with which an arbitrarily small error probability can be achieved.
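
In practice the two statements reduce to a feasibility test. A sketch under illustrative assumptions (a 50 kbit/s target rate over a 10 kHz channel at 20 dB SNR):

```python
import math

def reliable_coding_exists(rate_bps: float, bandwidth_hz: float,
                           snr_linear: float) -> bool:
    """True iff R < C, i.e. codes with vanishing error probability exist."""
    capacity = bandwidth_hz * math.log2(1.0 + snr_linear)
    return rate_bps < capacity

# Illustrative: does 50 kbit/s fit into a 10 kHz channel at 20 dB SNR?
print(reliable_coding_exists(50_000.0, 10_000.0, 10.0 ** 2.0))  # True, C ~ 66.6 kbit/s
```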

Shannon–Hartley theorem

The theorem establishes that the maximum rate (bit/s) is achieved by increasing the bandwidth and signal power while simultaneously reducing the noise.

The Shannon–Hartley theorem bounds the information rate (bit/s) for a given bandwidth and signal-to-noise ratio. To increase the rate, the level of the useful signal must be raised relative to the noise level.

If a noiseless analog channel with infinite bandwidth existed, an unlimited amount of error-free data could be transmitted over it per unit time. Real channels, however, have limited bandwidth, and noise is always present in them.

Surprisingly, bandwidth limitations alone do not restrict the amount of information transmitted. When the noise and bandwidth limitations are combined, however, we do see a limit on the amount of information that can be transmitted, even with multi-level encoding methods. In the channel considered by the Shannon–Hartley theorem, noise and signal are additive: the receiver perceives a signal equal to the sum of the signal encoding the desired information and a continuous random signal representing the noise.

This addition creates uncertainty about the value of the original signal. If the receiver knows the probability distribution of the unwanted signal that constitutes the noise, the information can in principle be restored to its original form by considering all possible effects of the noise process. In the case of the Shannon–Hartley theorem, the noise is produced by a Gaussian process with known variance in the transmission channel. Such a channel is called an additive white Gaussian noise (AWGN) channel, since the Gaussian noise is added to the wanted signal. "White" implies an equal amount of noise at all frequencies within the channel bandwidth. Such noise can arise from random energy sources and can also be associated with errors introduced during coding. Knowing the probability distribution of the Gaussian noise greatly simplifies recovery of the useful signal.
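
The additive model is straightforward to simulate. The following NumPy sketch (an antipodal unit-power signal and an assumed noise variance of 0.1, both illustrative) shows the received signal as the sum of the useful signal and white Gaussian noise:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# The receiver observes signal + noise; "Gaussian" fixes the amplitude
# distribution of the noise, "white" spreads its power evenly over the band.
n = 100_000
signal = rng.choice([-1.0, 1.0], size=n)       # unit-power antipodal signal
noise = rng.normal(0.0, np.sqrt(0.1), size=n)  # assumed noise variance 0.1
received = signal + noise                      # additive channel model

snr = signal.var() / noise.var()
print(f"measured SNR ~ {snr:.1f} ({10 * np.log10(snr):.1f} dB)")  # ~10, ~10 dB
```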

Significance of the theorem

Channel capacity and Hartley formula

By comparing the channel capacity with the Hartley formula, we can find the effective number $M$ of distinguishable levels:

$$2B \log_2 M = B \log_2 \left( 1 + \frac{S}{N} \right),$$
$$M = \sqrt{1 + \frac{S}{N}}.$$

Taking the square root effectively converts the power ratio back into a voltage ratio, so the number of levels is approximately equal to the ratio of the RMS signal amplitude to the noise standard deviation. This similarity in form between the Shannon capacity and the Hartley formula should not be read literally as saying that $M$ signal levels suffice for error-free transmission. Redundant coding to eliminate errors requires more levels, but the maximum data rate that can be approached with coding is equivalent to using that $M$ in the Hartley formula.
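
A minimal sketch of this relation, with an assumed 30 dB SNR:

```python
import math

def effective_levels(snr_linear: float) -> float:
    """Effective number of distinguishable levels, M = sqrt(1 + S/N)."""
    return math.sqrt(1.0 + snr_linear)

# Illustrative: at 30 dB SNR roughly 31-32 levels are distinguishable.
print(f"M = {effective_levels(10.0 ** 3.0):.1f}")  # ~31.6
```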

See also

  • Kotelnikov theorem


