An intractability result for multiple integration
Our aim is to show that, in the worst-case setting, the integration problem is intractable. The implications of the intractability result for lattice methods are considered briefly in Section 3.
Bayesian model selection for exponential random graph models via adjusted pseudolikelihoods
Models with intractable likelihood functions arise in areas including network
analysis and spatial statistics, especially those involving Gibbs random
fields. Posterior parameter estimation in these settings is termed a
doubly-intractable problem because both the likelihood function and the
posterior distribution are intractable. The comparison of Bayesian models is
often based on the statistical evidence, the integral of the un-normalised
posterior distribution over the model parameters, which is rarely available in
closed form. For doubly-intractable models, estimating the evidence adds
another layer of difficulty. Consequently, the selection of the model that best
describes an observed network among a collection of exponential random graph
models for network analysis is a daunting task. Pseudolikelihoods offer a
tractable approximation to the likelihood but should be treated with caution
because they can lead to unreasonable inferences. This paper specifies a
method to adjust pseudolikelihoods in order to obtain a reasonable, yet
tractable, approximation to the likelihood. This allows implementation of
widely used computational methods for evidence estimation and pursuit of
Bayesian model selection of exponential random graph models for the analysis of
social networks. Empirical comparisons to existing methods show that our
procedure yields similar evidence estimates, but at a lower computational cost.
Comment: Supplementary material attached. To view attachments, please download
and extract the gzipped source file listed under "Other formats".
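For orientation, the standard (unadjusted) pseudolikelihood that the paper starts from treats each dyad as conditionally independent given the rest of the graph, turning the ERGM into a logistic regression on change statistics. The sketch below illustrates this textbook construction only, not the paper's adjustment procedure; the edge/triangle statistics, function names, and toy graph are all illustrative assumptions.

```python
import numpy as np

def change_stats(y, i, j):
    """Change in the (edges, triangles) statistics when the dyad (i, j)
    is toggled from 0 to 1, with the rest of the graph held fixed."""
    d_edges = 1.0
    d_triangles = float(y[i] @ y[j])  # common neighbours of i and j
    return np.array([d_edges, d_triangles])

def log_pseudolikelihood(theta, y):
    """Log pseudolikelihood: each dyad is treated as conditionally
    independent given the rest of the graph, so each term is a
    Bernoulli log-likelihood with logit theta . change_stats."""
    n = y.shape[0]
    lpl = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            eta = theta @ change_stats(y, i, j)
            lpl += y[i, j] * eta - np.log1p(np.exp(eta))
    return lpl

# Small undirected example graph (symmetric adjacency matrix)
y = np.zeros((5, 5))
for a, b in [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4)]:
    y[a, b] = y[b, a] = 1.0

theta = np.array([-0.5, 0.2])  # illustrative (edges, triangles) coefficients
print(log_pseudolikelihood(theta, y))
```

As a sanity check, with the triangle coefficient set to zero every dyad has the same logit, so the pseudolikelihood coincides with the exact Bernoulli-graph likelihood; for denser statistics the two diverge, which is where the caution about unreasonable inference comes in.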
On the Parameterized Intractability of Monadic Second-Order Logic
One of Courcelle's celebrated results states that if C is a class of graphs
of bounded tree-width, then model-checking for monadic second-order logic
(MSO_2) is fixed-parameter tractable (fpt) on C by linear time parameterized
algorithms, where the parameter is the tree-width plus the size of the formula.
An immediate question is whether this is best possible or whether the result
can be extended to classes of unbounded tree-width. In this paper we show that
in terms of tree-width, the theorem cannot be extended much further. More
specifically, we show that if C is a class of graphs which is closed under
colourings and satisfies certain constructibility conditions and is such that
the tree-width of C is not bounded by \log^{84} n, then MSO_2-model checking is
not fpt unless SAT can be solved in sub-exponential time. If the tree-width of
C is not poly-logarithmically bounded, then MSO_2-model checking is not fpt
unless all problems in the polynomial-time hierarchy can be solved in
sub-exponential time.
Noisy Hamiltonian Monte Carlo for doubly-intractable distributions
Hamiltonian Monte Carlo (HMC) has been progressively incorporated within the
statistician's toolbox as an alternative sampling method in settings where
standard Metropolis-Hastings is inefficient. HMC generates a Markov chain on an
augmented state space with transitions based on a deterministic differential
flow derived from Hamiltonian mechanics. In practice, the evolution of
Hamiltonian systems cannot be solved analytically, requiring numerical
integration schemes. Under numerical integration, the resulting approximate
solution no longer preserves the measure of the target distribution, so
an accept-reject step is used to correct the bias. For doubly-intractable
distributions -- such as posterior distributions based on Gibbs random fields
-- HMC suffers from an additional computational difficulty: both the gradients
driving the differential flow and the accept-reject ratio involve the
intractable likelihood and cannot be computed exactly. In this paper, we study
the behaviour of HMC when these quantities are replaced by Monte Carlo
estimates.
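The mechanics described above (leapfrog integration of the Hamiltonian flow followed by an accept-reject correction for the discretisation bias) can be sketched for a tractable target; this is the exact-HMC baseline, not the paper's noisy variant, and all names, step sizes, and the Gaussian target are illustrative assumptions.

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, step_size=0.1, n_leapfrog=20, rng=None):
    """One HMC transition on the augmented (position, momentum) space:
    leapfrog-integrate the Hamiltonian flow, then accept/reject to
    correct the bias from numerical integration."""
    rng = rng if rng is not None else np.random.default_rng()
    p0 = rng.standard_normal(x.shape)   # resample auxiliary momentum
    x_new, p = x.copy(), p0.copy()

    # Leapfrog integration: half step in momentum, alternating full steps
    p += 0.5 * step_size * grad_log_prob(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p
        p += step_size * grad_log_prob(x_new)
    x_new += step_size * p
    p += 0.5 * step_size * grad_log_prob(x_new)

    # Metropolis correction: accept with probability exp(H_old - H_new),
    # where H(x, p) = -log_prob(x) + |p|^2 / 2
    h_old = -log_prob(x) + 0.5 * p0 @ p0
    h_new = -log_prob(x_new) + 0.5 * p @ p
    return x_new if np.log(rng.uniform()) < h_old - h_new else x

# Example target: standard 2-D Gaussian
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x

rng = np.random.default_rng(0)
x = np.zeros(2)
samples = []
for _ in range(2000):
    x = hmc_step(x, log_prob, grad_log_prob, rng=rng)
    samples.append(x.copy())
samples = np.array(samples)
print(samples.mean(axis=0))
```

In the doubly-intractable setting studied in the paper, `grad_log_prob` and the ratio `h_old - h_new` involve the unknown normalising constant, which is what motivates replacing them with Monte Carlo estimates.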
Artificial Noise-Aided Biobjective Transmitter Optimization for Service Integration in Multi-User MIMO Gaussian Broadcast Channel
This paper considers an artificial noise (AN)-aided transmit design for
multi-user MIMO systems with integrated services. Specifically, two sorts of
service messages are combined and served simultaneously: one multicast message
intended for all receivers and one confidential message intended for only one
receiver and required to be perfectly secure from other unauthorized receivers.
Our interest lies in the joint design of the input covariances of the multicast
message, the confidential message and the AN, such that the
achievable secrecy rate and multicast rate are simultaneously maximized. This
problem is identified as a secrecy rate region maximization (SRRM) problem in
the context of physical-layer service integration. Since this bi-objective
optimization problem is inherently complex to solve, we put forward two
different scalarization methods to convert it into a scalar optimization
problem. First, we propose to fix the multicast rate at a constant, and
accordingly, the primal biobjective problem is converted into a secrecy rate
maximization (SRM) problem with quality of multicast service (QoMS) constraint.
By varying the constant, we can obtain different Pareto optimal points. The
resulting SRM problem can be iteratively solved via a provably convergent
difference-of-concave (DC) algorithm. In the second method, we aim to maximize
the weighted sum of the secrecy rate and the multicast rate. By varying
the weight vector, one can also obtain different Pareto optimal points. We
show that this weighted sum rate maximization (WSRM) problem can be recast into
a primal decomposable form, which is amenable to alternating optimization (AO).
Then we compare these two scalarization methods in terms of their overall
performance and computational complexity via theoretical analysis as well as
numerical simulation, based on which new insights can be drawn.
Comment: 14 pages, 5 figures
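The two scalarization strategies can be illustrated on a toy bi-objective problem: a single power split trades one rate against the other, and either fixing one rate (the QoMS constraint) or sweeping a weight traces out Pareto optimal points. The rate expressions and gains below are made-up stand-ins, not the paper's MIMO secrecy-rate formulas.

```python
import numpy as np

# Toy stand-in for the achievable (secrecy, multicast) rate pair:
# both rates are driven by one power split p in [0, 1]; gains a, b
# are arbitrary illustrative numbers.
a, b = 4.0, 2.0
p = np.linspace(0.0, 1.0, 1001)
r_secrecy = np.log2(1.0 + a * p)            # grows as p -> 1
r_multicast = np.log2(1.0 + b * (1.0 - p))  # shrinks as p -> 1

def qoms_point(tau):
    """First scalarization: enforce the QoMS constraint R_m >= tau and
    maximize the secrecy rate over the feasible grid points."""
    objective = np.where(r_multicast >= tau, r_secrecy, -np.inf)
    k = int(np.argmax(objective))
    return r_secrecy[k], r_multicast[k]

def wsrm_point(w):
    """Second scalarization: maximize the weighted sum
    w * R_s + (1 - w) * R_m."""
    k = int(np.argmax(w * r_secrecy + (1.0 - w) * r_multicast))
    return r_secrecy[k], r_multicast[k]

# Varying tau (or the weight w) traces out different Pareto optimal points.
for w in (0.2, 0.5, 0.8):
    rs, rm = wsrm_point(w)
    print(f"w = {w:.1f}: R_s = {rs:.3f}, R_m = {rm:.3f}")
```

The grid search stands in for the paper's DC and alternating-optimization solvers; the point is only that each scalar problem yields one Pareto point, and sweeping the scalarization parameter recovers the rate-region boundary.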
Computers and Liquid State Statistical Mechanics
The advent of electronic computers has revolutionised the application of
statistical mechanics to the liquid state. Computers have permitted, for
example, the calculation of the phase diagram of water and ice, the folding of
proteins, the behaviour of alkanes adsorbed in zeolites, the formation of
liquid crystal phases, and the process of nucleation. Computer simulations
provide, on one hand, new insights into the physical processes in action, and
on the other, quantitative results of greater and greater precision. Insights
into physical processes facilitate the reductionist agenda of physics, whilst
large scale simulations bring out emergent features that are inherent (although
far from obvious) in complex systems consisting of many bodies. It is safe to
say that computer simulations are now an indispensable tool for both the
theorist and the experimentalist, and in the future their usefulness will only
increase.
This chapter presents a selective review of some of the incredible advances
in condensed matter physics that could only have been achieved with the use of
computers.
Comment: 22 pages, 2 figures. Chapter for a book