A new graph perspective on max-min fairness in Gaussian parallel channels
In this work we are concerned with the problem of achieving max-min fairness
in Gaussian parallel channels with respect to a general performance function,
including channel capacity or decoding reliability as special cases. As our
central results, we characterize the laws which determine the value of the
achievable max-min fair performance as a function of channel sharing policy and
power allocation (to channels and users). In particular, we show that the
max-min fair performance behaves as a specialized version of the Lovász
function, or Delsarte bound, of a certain graph induced by the channel sharing
combinatorics. We also prove that, beyond this graph, only a certain 2-norm
distance, determined by the admissible power allocations and the performance
functions used, suffices to characterize the max-min fair performance up to a
candidate interval. Our results also reveal a specific role played by odd
cycles in the graph induced by the channel sharing policy, and we present an
interesting relation between max-min fairness in parallel channels and optimal
throughput in an associated interference channel.
Comment: 41 pages, 8 figures. Submitted to IEEE Transactions on Information Theory on August 6th, 200
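As a toy illustration of the max-min fairness notion the abstract studies (restricted here to capacity as the performance function, and ignoring the channel sharing combinatorics that drive the paper's graph-theoretic results): for a single user splitting a power budget across parallel Gaussian channels, the max-min fair point equalizes the per-channel rates log2(1 + p_i * g_i), which forces p_i * g_i to be constant. The function name and inputs below are hypothetical.

```python
import math

def maxmin_fair_power(gains, total_power):
    """Split total_power across parallel Gaussian channels with SNR
    gains g_i so that the per-channel rates log2(1 + p_i * g_i) are
    all equal (the max-min fair point when the performance function
    is capacity). Equal rates force p_i * g_i to be a constant c,
    so p_i = c / g_i, with c chosen to exhaust the power budget."""
    c = total_power / sum(1.0 / g for g in gains)
    powers = [c / g for g in gains]
    rate = math.log2(1.0 + c)  # common rate achieved on every channel
    return powers, rate

# Three channels with decreasing quality; the weakest gets the most power.
powers, rate = maxmin_fair_power([1.0, 0.5, 0.25], total_power=7.0)
```

Note the characteristic max-min behaviour: the worst channel (gain 0.25) absorbs the largest share of the budget so that no channel lags behind.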
Non-Convex Optimization and Applications to Bilinear Programming and Super-Resolution Imaging
Bilinear programs and phase retrieval are two instances of nonconvex problems that arise in engineering and physical applications, and each comes with its own fundamental difficulties. In this thesis, we consider various methods and algorithms for tackling these challenging problems and discuss their effectiveness. Bilinear programs (BLPs) are ubiquitous in engineering applications, economics, and operations research, and have a natural encoding to quadratic programs. They appear in the study of Lyapunov functions used to deduce the stability of solutions to differential equations describing dynamical systems. For multivariate dynamical systems, the problem formulation for computing an appropriate Lyapunov function is a BLP. In electric power systems engineering, one of the most practically important and well-researched subfields of constrained nonlinear optimization is Optimal Power Flow, wherein one attempts to optimize an electric power system subject to physical constraints imposed by electrical laws and engineering limits; this, too, can be naturally formulated as a quadratic program. In a recent publication, we studied the relationship between data flow constraints for numerical domains, such as polyhedra, and bilinear constraints. The problem of recovering an image from its Fourier modulus, or intensity, measurements emerges in many physical and engineering applications. The problem is known as Fourier phase retrieval, wherein one attempts to recover the phase information of a signal in order to accurately reconstruct it from estimated intensity measurements by applying the inverse Fourier transform. The problem of recovering phase information from a set of measurements can be formulated as a quadratic program. This problem is well studied but still presents many challenges. The resolution of an optical device is defined as the smallest distance between two objects such that the two objects can still be recognized as separate entities.
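A minimal sketch of why BLPs are hard, using alternating minimization on a tiny box-constrained instance (this is a standard heuristic, not necessarily a method from the thesis; the matrix and function name are illustrative):

```python
def alt_min_blp(Q, iters=20):
    """Alternating minimization for the toy bilinear program
        min_{x, y in [0,1]^n}  x^T Q y.
    For fixed y the objective is linear in x, so each x_i moves to
    the box endpoint minimizing its coefficient (Q y)_i, and
    symmetrically for y. This converges to a stationary point, which
    need not be the global optimum -- the fundamental difficulty of
    bilinear (nonconvex) programs."""
    n = len(Q)
    x = [1.0] * n
    y = [1.0] * n
    for _ in range(iters):
        coeff_x = [sum(Q[i][j] * y[j] for j in range(n)) for i in range(n)]
        x = [0.0 if c > 0 else 1.0 for c in coeff_x]
        coeff_y = [sum(Q[i][j] * x[i] for i in range(n)) for j in range(n)]
        y = [0.0 if c > 0 else 1.0 for c in coeff_y]
    obj = sum(Q[i][j] * x[i] * y[j] for i in range(n) for j in range(n))
    return x, y, obj

Q = [[1.0, -2.0], [-3.0, 4.0]]
x, y, obj = alt_min_blp(Q)  # stalls at objective -2.0
```

On this instance the alternation stops at objective -2.0, while the global optimum over the box vertices (x = [0,1], y = [1,0]) achieves -3.0, illustrating the local-optimum trap.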
Due to the physics of diffraction, and the way that light bends around an obstacle, the resolving power of an optical system is limited. This limit, known as the diffraction limit, was first introduced by Ernst Abbe in 1873. Obtaining the complete phase information would enable one to perfectly reconstruct an image; however, the problem is severely ill-posed, and it leads to a specialized type of quadratic program, known as super-resolution imaging, wherein one attempts to learn phase information beyond the limits of diffraction and the limitations imposed by the imaging device.
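The Fourier phase retrieval problem described above is classically attacked by alternating-projection schemes such as the Gerchberg-Saxton / error-reduction iteration: alternately enforce the measured Fourier magnitudes and an object-domain constraint (here, real and nonnegative). This is a standard illustrative algorithm, not necessarily the thesis's method; all names below are hypothetical, and a hand-rolled DFT keeps the sketch self-contained.

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]

def error_reduction(mags, x0, iters=100):
    """Error-reduction iteration for Fourier phase retrieval:
    alternate between (a) keeping the current Fourier phase while
    replacing the magnitude by the measurement, and (b) projecting
    the signal back onto real, nonnegative values."""
    x = list(x0)
    for _ in range(iters):
        X = dft(x)
        X = [m * (Xj / abs(Xj)) if abs(Xj) > 1e-12 else m
             for m, Xj in zip(mags, X)]
        x = [max(v.real, 0.0) for v in idft(X)]
    return x

truth = [1.0, 0.5, 0.0, 0.0]
mags = [abs(v) for v in dft(truth)]          # intensity-only measurements
recovered = error_reduction(mags, [1.0, 0.8, 0.2, 0.1])
```

A known property of error reduction is that the Fourier-magnitude mismatch is non-increasing across iterations, though the iteration can stagnate at local minima, echoing the nonconvexity theme of the thesis.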
Managing uncertainty in modelling of wicked problems: theory and application to Sustainable Aquifer Yield
This thesis presents two approaches to help manage uncertainty in
modelling for the resolution of wicked problems, which have no
clear problem definition, solution or measure of success. It
focuses on Sustainable Aquifer Yield (SAY) as an example. SAY is
defined as the pumping volume obtained by a management plan that
is expected to satisfy objectives under future conditions within
a groundwater system. Integrated modelling can help express,
systematise and use knowledge of relevant behaviour of the
system, while engaging diverse stakeholders and addressing their
interests. Uncertainty is however a key and multifaceted issue
when dealing with wicked problems. While many modelling methods
exist to help address this uncertainty, there is a need for
modellers to be able to integrate these methods purposefully for
an applied problem.
The research presented involved iteratively proposing two
approaches to manage uncertainties in integrated modelling that
supports decision making, and exploring the value of each
approach by applying it to case studies. For each approach, the
applications specifically a) address a technical problem, b) push
boundaries on how the problem is viewed, specifically identifying
hitherto neglected aspects, and c) address a context where
accounting for contested views and surprise is imperative. This
research process is described in terms of Critical Systems
Practice and resulted in a compilation of linked publications.
The first approach proposed is an Uncertainty Management
Framework that can be used to help audit the treatment of
uncertainty in a step-wise description of an analysis (e.g.
evaluating a management plan). The framework provides a formal
structure for managing uncertainty by incorporating an
uncertainty typology and a set of fundamental uncertainty
management actions, but may be too restrictive and demanding for
some contexts.
To address these limitations, a complementary second approach,
designated Iterative Closed Question Modelling, addresses
uncertainty by constructing models to test whether each possible
answer to a closed question is plausible. The question,
assumptions about plausibility and the process of constructing
models are all considered uncertain and therefore themselves
iteratively critiqued. This approach is formalised in terms of
Boundary Critique such that it provides a philosophical
foundation justifying the use of a broad range of methods to
manage uncertainty in predictive modelling.
The thesis concludes that uncertainty needs to be embraced as a
natural part of researchers, policy makers and community coming
to grips with an evolving situation, rather than being an
obstacle to be eliminated. Training of modellers to manage
uncertainty needs to specifically address: identification of
model scenarios that contradict dominant conclusions; critique of
model assumptions and questions from multiple stakeholders’
points of view; and negotiation of the modeller’s role in
anticipating surprise (e.g. through understanding consequences of
error, design of monitoring, contingency planning and adaptive
management). The resulting emphasis on critical thinking about
alternative models helps to remind the user that modelling is not
a magic trick for seeing the future, but a structured way to
reason about both what we do and do not know.