The Capacity of the Band-Limited Gaussian Channel
01 March 1966
As an idealized model for the time-continuous Gaussian channel (with bandwidth W cycles per second, two-sided noise spectral density N₀/2, and average power P₀), Shannon¹,² employed the mathematical time-discrete channel which passes 2W real numbers x per second, with the average of x² restricted to be P₀. Each input x is perturbed by an independent "noise" random variable which is Gaussian with mean zero and variance N₀W. If by "channel capacity" we mean the maximum rate at which a channel is capable of transmitting information with arbitrarily small error probability as the coding and decoding delay becomes large, then the capacity of this time-discrete channel is given by the celebrated formula W log₂(1 + P₀/N₀W) bits per second (or W ln(1 + P₀/N₀W) nats per second). In order to show that the capacity is given by this formula, it is necessary to prove a coding theorem (showing the possibility of achieving "error-free" communication at any rate less than W log₂(1 + P₀/N₀W)), and a "converse" (showing the impossibility of achieving "error-free" coding at a rate exceeding this quantity). For this purely mathematical channel these theorems have been proved, and there is no question as to the meaning and validity of the capacity formula.
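As a small numerical illustration (a sketch, not part of the original note; the function name and the example figures are chosen for convenience), the capacity formula W log₂(1 + P₀/N₀W) can be evaluated directly, along with its equivalent in nats per second:

```python
import math

def capacity_bits_per_sec(W, P0, N0):
    """Capacity of the band-limited Gaussian channel:
    W * log2(1 + P0 / (N0 * W)) bits per second, where
    W  = bandwidth in cycles per second,
    P0 = average signal power,
    N0 = one-sided noise spectral density (two-sided density N0/2),
    so that N0 * W is the noise power in the band."""
    return W * math.log2(1.0 + P0 / (N0 * W))

def capacity_nats_per_sec(W, P0, N0):
    """Same quantity expressed in nats: W * ln(1 + P0 / (N0 * W))."""
    return W * math.log(1.0 + P0 / (N0 * W))

# Illustrative numbers (assumed, not from the text): W = 3000 cycles/s
# and a signal-to-noise ratio P0/(N0*W) = 15, so the capacity is
# 3000 * log2(16) = 12000 bits per second.
C_bits = capacity_bits_per_sec(3000.0, 45.0, 0.001)
C_nats = capacity_nats_per_sec(3000.0, 45.0, 0.001)
```

Note that the two expressions differ only by the factor ln 2, since log₂ x = ln x / ln 2.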