Interplay Between Transmission Delay, Average Data Rate, and Performance in Output Feedback Control over Digital Communication Channels
The performance of a noisy linear time-invariant (LTI) plant, controlled over
a noiseless digital channel with transmission delay, is investigated in this
paper. The rate-limited channel connects the single measurement output of the
plant to its single control input through a causal, but otherwise arbitrary,
coder-controller pair. An information-theoretic approach is utilized to analyze
the minimal average data rate required to attain a prescribed quadratic
performance level when the channel imposes a known constant delay on the transmitted data. This
infimum average data rate is shown to be lower bounded by the minimum of the
directed information rate over a set of LTI filters and an additive white
Gaussian noise (AWGN) channel. It is demonstrated that the presence of time
delay in the channel increases the data rate needed to achieve a certain level
of performance. The applicability of the results is verified through a
numerical example. In particular, we show by simulations that when the optimal
filters are used but the AWGN channel (used in the lower bound) is replaced by
a simple scalar uniform quantizer, the resulting operational data rates are at
most around 0.3 bits above the lower bounds.

Comment: A less-detailed version of this paper has been accepted for
publication in the proceedings of ACC 201
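As a rough illustration of the final claim (not the paper's actual simulation setup, whose filters and signals are not given here), the sketch below applies a mid-rise scalar uniform quantizer to a Gaussian stand-in signal and estimates its operational data rate as the empirical entropy of the quantizer output; the step size `step` is a hypothetical choice.

```python
import numpy as np

def uniform_quantize(x, step):
    """Mid-rise uniform scalar quantizer: maps x to the center of its cell."""
    return step * (np.floor(x / step) + 0.5)

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)   # stand-in for the signal fed to the quantizer
step = 0.5                     # hypothetical quantizer step size

q = uniform_quantize(x, step)

# Empirical entropy of the quantizer output (bits per sample) approximates
# the operational average data rate of entropy-coded quantized samples.
_, counts = np.unique(q, return_counts=True)
p = counts / counts.sum()
rate = -(p * np.log2(p)).sum()
```

Comparing such an empirical rate against an information-theoretic lower bound is the kind of gap (here, reported as at most around 0.3 bits) that the paper's numerical example quantifies.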