Exploiting the Superposition Property of Wireless Communication for Max-Consensus Problems in Multi-Agent Systems
This paper presents a consensus protocol that achieves max-consensus in
multi-agent systems over wireless channels. Interference, a feature of the
wireless channel, is exploited: each agent receives a superposition of
broadcast data, rather than individual values. With this information, the
system endowed with the proposed consensus protocol reaches max-consensus in a
finite number of steps. A comparison with traditional approaches shows that the
proposed consensus protocol converges faster.
Comment: Submitted to the IFAC Workshop on Distributed Estimation and Control in
Networked Systems
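The core idea of iterative max-consensus can be sketched in a few lines. The sketch below is a minimal illustration in which each agent repeatedly replaces its value with the maximum of its own value and its neighbors' values; it does not model the wireless superposition the paper exploits, where an agent would receive one aggregate of all simultaneous broadcasts rather than individual values.

```python
def max_consensus(values, neighbors, max_steps=100):
    """Iterative max-consensus on a fixed communication graph.

    values      -- initial scalar value held by each agent
    neighbors   -- neighbors[i] lists the agents whose broadcasts agent i hears
    Returns the agents' values after convergence (all equal to the global
    maximum if the graph is connected).
    """
    x = list(values)
    for _ in range(max_steps):
        # Each agent takes the max over its own value and its neighbors'.
        new_x = [max([x[i]] + [x[j] for j in neighbors[i]])
                 for i in range(len(x))]
        if new_x == x:  # fixed point reached: max-consensus achieved
            break
        x = new_x
    return x
```

On a connected graph this converges in at most diameter-many steps, which is why max-consensus is a natural fit for a channel that delivers an aggregate of all broadcasts at once.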
Computability and Algorithmic Complexity in Economics
This is an outline of the origins and development of the way computability theory and algorithmic complexity theory were incorporated into economic and finance theories. We try to place, in the context of the development of computable economics, some of the classics of the subject as well as those that have, from time to time, been credited with having contributed to the advancement of the field. Speculative thoughts on where the frontiers of computable economics are, and how to move towards them, conclude the paper. In a precise sense - both historically and analytically - it would not be an exaggeration to claim that both the origins of computable economics and its frontiers are defined by two classics, both by Banach and Mazur: the one-page masterpiece by Banach and Mazur ([5]), built on the foundations of Turing’s own classic, and the unpublished Mazur conjecture of 1928, with its unpublished proof by Banach ([38], ch. 6 & [68], ch. 1, #6). For the undisputed original classic of computable economics is Rabin’s effectivization of the Gale-Stewart game ([42]; [16]); the frontiers, as I see them, are defined by recursive analysis and constructive mathematics, underpinning computability over the computable and constructive reals and providing computable foundations for the economist’s Marshallian penchant for curve-sketching ([9]; [19]; and, in general, the contents of Theoretical Computer Science, Vol. 219, Issues 1-2). The former work has its roots in the Banach-Mazur game (cf. [38], especially p. 30), at least in one reading of it; the latter in ([5]), as well as other, earlier contributions, not least by Brouwer.
A Computable Economist’s Perspective on Computational Complexity
A computable economist's view of the world of computational complexity theory is described. This means the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unification that emerged in the modern era was codified by means of the notions of efficiency of computations, non-deterministic computations, completeness, reducibility and verifiability - the latter three concepts had their origins in what may be called "Post's Program of Research for Higher Recursion Theory". Approximations, computations and constructions are also emphasised. The recent real model of computation as a basis for studying computational complexity in the domain of the reals is also presented and discussed, albeit critically. A brief sceptical section on algorithmic complexity theory is included in an appendix.
Over-The-Air Computation in Correlated Channels
This paper presents and analyzes a one-shot coding scheme for over-the-air
(OTA) computation over a fast-fading multiple-access wireless channel. The assumed
channel model incorporates correlations both in fading and noise over time as
well as among users. The model also allows for non-Gaussian components in
fading and noise, provided that the distributions are sub-Gaussian (as is the
case for a sum of Gaussian and bounded random variables), rendering the
proposed scheme robust to a large class of non-Gaussian interference and noise
known to occur in many practical scenarios. OTA computation has a huge
potential for reducing communication cost in applications such as Machine
Learning (ML)-based distributed anomaly detection in large wireless sensor
networks. We illustrate this potential through extensive numerical simulations
- …
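The principle behind OTA computation can be illustrated with a minimal sketch: if every user pre-inverts its channel gain, the superposition the receiver observes is itself the desired sum, up to additive noise. This toy model assumes independent Rayleigh fading and Gaussian noise and omits the correlated, sub-Gaussian channel components the paper actually handles; the channel-inversion precoding shown is a standard textbook device, not the paper's coding scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def ota_sum(x, snr_db=20.0):
    """Toy one-shot over-the-air sum of the users' values x.

    All users transmit simultaneously; the receiver observes the
    superposition sum_i h_i * tx_i + noise. With channel-inversion
    precoding tx_i = x_i / h_i, the channel itself computes sum_i x_i.
    """
    n = len(x)
    h = rng.rayleigh(scale=1.0, size=n) + 0.1  # fading gains, bounded away from 0
    tx = x / h                                 # channel-inversion precoding
    noise_std = 10 ** (-snr_db / 20)
    y = np.sum(h * tx) + noise_std * rng.standard_normal()
    return y
```

The communication saving is that the sum is obtained in a single channel use regardless of the number of users, rather than in one transmission per user.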