On the capacity of channels with Gaussian and non-Gaussian noise
We evaluate the information capacity of channels for which the noise process is a Gaussian measure on a quasi-complete locally convex space. The coding capacity is calculated in this setting, and for time-continuous Gaussian channels, using the information capacity result. The coding capacity of channels with non-Gaussian noise having finite entropy with respect to Gaussian noise of the same covariance is shown not to exceed the coding capacity of the Gaussian channel. The sensitivity of the information capacity to deviations from normality in the noise process is also investigated.
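The abstract's general measure-theoretic setting can be illustrated by its most elementary special case: a discrete-time additive white Gaussian noise channel, whose capacity per channel use is ½ log₂(1 + SNR). A minimal sketch (the function name is ours, not the paper's):

```python
import math

def awgn_capacity(snr: float) -> float:
    """Capacity, in bits per channel use, of a discrete-time
    additive white Gaussian noise channel at the given linear SNR."""
    return 0.5 * math.log2(1.0 + snr)

print(awgn_capacity(1.0))  # 0.5 bits per use at 0 dB SNR
```

The paper's comparison result says, roughly, that replacing the Gaussian noise by non-Gaussian noise of the same covariance (with finite relative entropy) cannot increase this coding capacity.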
Probabilistic Monads, Domains and Classical Information
Shannon's classical information theory uses probability theory to analyze
channels as mechanisms for information flow. In this paper, we generalize
results of Martin, Allwein and Moskowitz for binary channels to show how some
more modern tools - probabilistic monads and domain theory in particular - can
be used to model classical channels. As initiated by Martin et al., the point of
departure is to consider the family of channels with fixed inputs and outputs,
rather than trying to analyze channels one at a time. The results show that
domain theory has a role to play in the capacity of channels; in particular,
the (n x n)-stochastic matrices, which are the classical channels having the
same sized input as output, admit a quotient compact ordered space which is a
domain, and the capacity map factors through this quotient via a
Scott-continuous map that measures the quotient domain. We also comment on how
some of our results relate to recent discoveries about quantum channels and
free affine monoids.
Comment: In Proceedings DCM 2011, arXiv:1207.682
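For concreteness, the capacity map on (n x n)-stochastic matrices discussed above can be evaluated numerically with the classical Blahut-Arimoto iteration (a standard algorithm, not the paper's domain-theoretic machinery). A self-contained sketch:

```python
import math

def capacity(W, iters=2000):
    """Blahut-Arimoto estimate of the capacity (in bits) of a channel
    given by a row-stochastic matrix W[x][y] = P(y | x)."""
    n, m = len(W), len(W[0])
    p = [1.0 / n] * n                      # input distribution, start uniform
    d = [0.0] * n
    for _ in range(iters):
        # output distribution induced by p
        q = [sum(p[x] * W[x][y] for x in range(n)) for y in range(m)]
        # d[x] = D(W[x] || q), the information carried by input x
        d = [sum(W[x][y] * math.log2(W[x][y] / q[y])
                 for y in range(m) if W[x][y] > 0) for x in range(n)]
        w = [p[x] * 2.0 ** d[x] for x in range(n)]
        s = sum(w)
        p = [v / s for v in w]
    return sum(p[x] * d[x] for x in range(n))

# Binary symmetric channel with crossover probability 0.1:
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(capacity(bsc))  # close to 1 - H(0.1), about 0.531 bits
```

By symmetry the uniform input is optimal here, so the iteration converges immediately; for asymmetric matrices it converges monotonically to the capacity.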
Information capacity in the weak-signal approximation
We derive an approximate expression for mutual information in a broad class
of discrete-time stationary channels with continuous input, under the
constraint of vanishing input amplitude or power. The approximation describes
the input by its covariance matrix, while the channel properties are described
by the Fisher information matrix. This separation of input and channel
properties allows us to analyze the optimality conditions in a convenient way.
We show that input correlations in memoryless channels do not affect channel
capacity, since their effect decays rapidly as the input amplitude or power
vanishes. On the other hand, for channels with memory, properly matching the input
covariances to the dependence structure of the noise may lead to almost
noiseless information transfer, even for intermediate values of the noise
correlations. Since many model systems described in mathematical neuroscience
and biophysics operate in the high-noise regime under weak-signal conditions, we
believe that these results are of potential interest to researchers in these
areas.
Comment: 11 pages, 4 figures; accepted for publication in Physical Review
Delays and the Capacity of Continuous-time Channels
Any physical channel of communication offers two potential reasons why its
capacity (the number of bits it can transmit in a unit of time) might be
unbounded: (1) Infinitely many choices of signal strength at any given instant
of time, and (2) Infinitely many instances of time at which signals may be
sent. However, channel noise cancels out the potential unboundedness of the
first aspect, leaving typical channels with only a finite capacity per instant
of time. The latter source of infinity seems less studied. A potential source
of unreliability that might restrict the capacity also from the second aspect
is delay: signals transmitted by the sender at a given point in time may be
received with an unpredictable delay at the receiving end. Here we examine this
source of uncertainty by considering a simple discrete model of delay errors.
In our model the communicating parties get to subdivide time as microscopically
finely as they wish, but still have to cope with communication delays that are
macroscopic and variable. The continuous process becomes the limit of our
process as the time subdivision becomes infinitesimal. We taxonomize this class
of communication channels based on whether the delays and noise are stochastic
or adversarial; and based on how much information each aspect has about the
other when introducing its errors. We analyze the limits of such channels and
reach somewhat surprising conclusions: The capacity of a physical channel is
finitely bounded only if at least one of the two sources of error (signal noise
or delay noise) is adversarial. In particular, the capacity is finitely bounded
only if the delay is adversarial, or the noise is adversarial and acts with
knowledge of the stochastic delay. If both error sources are stochastic, or if
the noise is adversarial and independent of the stochastic delay, then the
capacity of the associated physical channel is infinite.
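A toy version of the discrete delay model above (our own simplification, not the authors' formal construction) already shows the basic phenomenon: when time is subdivided much more finely than the delay spread, symbols arrive out of order at the receiver.

```python
import random

def deliver(send_times, max_delay, rng):
    """Toy stochastic delay channel: a symbol sent at micro-slot t arrives
    at t + D, with D drawn uniformly from {0, ..., max_delay}; ties are
    broken by send order. Returns symbol indices in arrival order."""
    arrivals = [(t + rng.randint(0, max_delay), i)
                for i, t in enumerate(send_times)]
    arrivals.sort()
    return [i for _, i in arrivals]

# Micro-slots far finer than the delay spread: reception order is scrambled.
print(deliver(range(10), max_delay=30, rng=random.Random(1)))
```

Every symbol still arrives (the output is a permutation of the input), so the question the paper studies is how much information survives this reordering, and how the answer changes when the delays or the noise are chosen adversarially rather than at random.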
On the Sensitivity of Continuous-Time Noncoherent Fading Channel Capacity
The noncoherent capacity of stationary discrete-time fading channels is known
to be very sensitive to the fine details of the channel model. More
specifically, the measure of the support of the fading-process power spectral
density (PSD) determines if noncoherent capacity grows logarithmically in SNR
or slower than logarithmically. Such a result is unsatisfactory from an
engineering point of view, as the support of the PSD cannot be determined
through measurements. The aim of this paper is to assess whether, for general
continuous-time Rayleigh-fading channels, this sensitivity has a noticeable
impact on capacity at SNR values of practical interest.
To this end, we consider the general class of band-limited continuous-time
Rayleigh-fading channels that satisfy the wide-sense stationary
uncorrelated-scattering (WSSUS) assumption and are, in addition, underspread.
We show that, for all SNR values of practical interest, the noncoherent
capacity of every channel in this class is close to the capacity of an AWGN
channel with the same SNR and bandwidth, independently of the measure of the
support of the scattering function (the two-dimensional channel PSD). Our
result is based on a lower bound on noncoherent capacity, which is built on a
discretization of the channel input-output relation induced by projecting onto
Weyl-Heisenberg (WH) sets. This approach is interesting in its own right as it
yields a mathematically tractable way of dealing with the mutual information
between certain continuous-time random signals.
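The AWGN benchmark against which the noncoherent capacity is compared above is the classical band-limited Shannon capacity C = W log₂(1 + SNR). A quick sketch for evaluating that baseline at a given bandwidth and SNR (the numbers are illustrative, not from the paper):

```python
import math

def awgn_capacity_bits_per_second(bandwidth_hz: float, snr: float) -> float:
    """Shannon capacity of a band-limited AWGN channel: W * log2(1 + SNR),
    with W in hertz and snr the linear (not dB) signal-to-noise ratio."""
    return bandwidth_hz * math.log2(1.0 + snr)

# 1 MHz of bandwidth at 10 dB SNR (linear SNR = 10):
print(awgn_capacity_bits_per_second(1e6, 10.0))
```

The paper's claim is that, for underspread WSSUS Rayleigh-fading channels at practically relevant SNR, the noncoherent capacity stays close to this benchmark regardless of the support of the scattering function.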