Towards More Practical Linear Programming-based Techniques for Algorithmic Mechanism Design
R. Lavi and C. Swamy (FOCS 2005, J. ACM 2011) introduced a general method for obtaining truthful-in-expectation mechanisms from linear programming based approximation algorithms. Due to its use of the ellipsoid method, a direct implementation of the method is unlikely to be efficient in practice. We propose to use the much simpler and usually faster multiplicative weights update method instead. The simplification comes at the cost of slightly weaker approximation and truthfulness guarantees.
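The multiplicative weights update method mentioned above maintains a weight per constraint (or "expert") and shrinks the weight of each one in proportion to its loss in every round. A minimal sketch of that core update rule, with made-up loss data (the abstract does not give the authors' actual algorithm or parameters):

```python
def mwu(losses_per_round, eta=0.1):
    """Multiplicative weights update over n experts (e.g. LP constraints).

    losses_per_round: a list of rounds, each a list of per-expert losses
    in [-1, 1].  Returns the final normalized weight vector, which
    concentrates on experts with low cumulative loss.
    """
    n = len(losses_per_round[0])
    w = [1.0] * n
    for losses in losses_per_round:
        # Penalize each expert multiplicatively by its loss this round.
        w = [wi * (1.0 - eta * li) for wi, li in zip(w, losses)]
    total = sum(w)
    return [wi / total for wi in w]

# Expert 0 always incurs loss 1, expert 1 loss 0: weight shifts to expert 1.
p = mwu([[1.0, 0.0]] * 5)
print(p[1] > p[0])  # True
```

The learning rate `eta` trades off how aggressively weight moves versus how robust the method is to noisy losses; smaller values give the weaker-but-safer guarantees the abstract alludes to.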
Mechanism Design via Dantzig-Wolfe Decomposition
In random allocation rules, an optimal fractional point is typically computed first by solving a linear program. The calculated point represents a fractional assignment of objects, or more generally packages of objects, to agents. In order to implement an expected assignment, the mechanism designer must decompose the fractional point into integer solutions, each satisfying the underlying constraints. The resulting convex combination can then be viewed as a probability distribution over feasible assignments from which a random assignment can be sampled. This approach has been successfully employed in combinatorial optimization as well as in mechanism design with or without money. In this paper, we show that finding the optimal fractional point and decomposing it into integer solutions can both be done at once. We propose an appropriate linear program which provides the desired solution, and show that it can be solved via Dantzig-Wolfe decomposition. Dantzig-Wolfe decomposition is a direct application of the revised simplex method, which is well known to be highly efficient in practice. We also show how to use Benders decomposition as an alternative method to solve the problem. The proposed method can also find a decomposition into integer solutions when the fractional point is already available, perhaps as the outcome of algorithms other than linear programming. The resulting convex decomposition in this case is tight in terms of the number of integer points, in accordance with Carathéodory's theorem.
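The decomposition step described above can be illustrated on the simplest special case: a fractional assignment given by a doubly stochastic matrix, which by the Birkhoff-von Neumann theorem is a convex combination of permutation matrices. The greedy sketch below is only an illustration of that decomposition idea (it brute-forces permutations, so it suits tiny instances only); the paper's actual method is the Dantzig-Wolfe approach, not this:

```python
from itertools import permutations

def birkhoff_decompose(X, tol=1e-9):
    """Greedily decompose a doubly stochastic matrix X into a convex
    combination of permutation matrices.

    Returns a list of (coefficient, perm) pairs, where perm maps row i
    to column perm[i] and the coefficients sum to 1.
    """
    n = len(X)
    X = [row[:] for row in X]  # work on a copy
    parts = []
    remaining = 1.0
    while remaining > tol:
        # Find a permutation supported on strictly positive entries.
        perm = next(p for p in permutations(range(n))
                    if all(X[i][p[i]] > tol for i in range(n)))
        # Peel off as much of that permutation as possible.
        coeff = min(X[i][perm[i]] for i in range(n))
        for i in range(n):
            X[i][perm[i]] -= coeff
        parts.append((coeff, perm))
        remaining -= coeff
    return parts

# 0.5/0.5 fractional assignment of two objects to two agents:
parts = birkhoff_decompose([[0.5, 0.5], [0.5, 0.5]])
print(parts)  # [(0.5, (0, 1)), (0.5, (1, 0))]
```

Sampling a permutation with probability equal to its coefficient then implements the expected assignment exactly, which is the "probability distribution over feasible assignments" the abstract refers to.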
Design for Operability: A Review of Approaches and Solution Strategies
In recent decades the chemical engineering research community has extensively addressed the design-for-operability problem. This interest responds to the fact that the operability quality of a process is determined by its design, making evident the convenience of considering operability issues in early design stages rather than later, when the impact of modifications is less effective and more expensive. The need to integrate design and operability is dictated by the increasing complexity of processes resulting from progressively stringent economic, quality, safety, and environmental constraints. Although the design-for-operability problem concerns practically every technical discipline, it has achieved a particular identity within chemical engineering due to the economic magnitude of the processes involved. The literature on design and analysis for operability in chemical engineering is vast, and a complete paper-by-paper review is beyond the scope of this contribution. Instead, two major approaches are addressed, and the papers that, in our view, have been most significant to the development of the field are described in some detail.
Blanco, Anibal Manuel; Bandoni, Jose Alberto. Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Científico Tecnológico CONICET - Bahía Blanca, Planta Piloto de Ingeniería Química, Universidad Nacional del Sur; Argentina.
Preparing sparse solvers for exascale computing.
Sparse solvers provide essential functionality for a wide variety of scientific applications. Highly parallel sparse solvers are essential for continuing advances in high-fidelity, multi-physics and multi-scale simulations, especially as we target exascale platforms. This paper describes the challenges, strategies and progress of the US Department of Energy Exascale Computing Project towards providing sparse solvers for exascale computing platforms. We address the demands of systems with thousands of high-performance node devices, where exposing concurrency, hiding latency and creating alternative algorithms become essential. The efforts described here are works in progress, highlighting current successes and upcoming challenges. This article is part of a discussion meeting issue 'Numerical algorithms for high-performance computational science'.
Dagstuhl Reports : Volume 1, Issue 2, February 2011
Online Privacy: Towards Informational Self-Determination on the Internet (Dagstuhl Perspectives Workshop 11061): Simone Fischer-Hübner, Chris Hoofnagle, Kai Rannenberg, Michael Waidner, Ioannis Krontiris and Michael Marhöfer
Self-Repairing Programs (Dagstuhl Seminar 11062): Mauro Pezzé, Martin C. Rinard, Westley Weimer and Andreas Zeller
Theory and Applications of Graph Searching Problems (Dagstuhl Seminar 11071): Fedor V. Fomin, Pierre Fraigniaud, Stephan Kreutzer and Dimitrios M. Thilikos
Combinatorial and Algorithmic Aspects of Sequence Processing (Dagstuhl Seminar 11081): Maxime Crochemore, Lila Kari, Mehryar Mohri and Dirk Nowotka
Packing and Scheduling Algorithms for Information and Communication Services (Dagstuhl Seminar 11091): Klaus Jansen, Claire Mathieu, Hadas Shachnai and Neal E. Youn
A Data Science Course for Undergraduates: Thinking with Data
Data science is an emerging interdisciplinary field that combines elements of mathematics, statistics, computer science, and knowledge of a particular application domain for the purpose of extracting meaningful information from the increasingly sophisticated array of data available in many settings. These data tend to be non-traditional, in the sense that they are often live, large, complex, and/or messy. A first course in statistics at the undergraduate level typically introduces students to a variety of techniques to analyze small, neat, and clean data sets. However, whether or not they pursue more formal training in statistics, many of these students will end up working with data that are considerably more complex, and will need facility with statistical computing techniques. More importantly, these students require a framework for thinking structurally about data. We describe an undergraduate course in a liberal arts environment that provides students with the tools necessary to apply data science. The course emphasizes modern, practical, and useful skills that cover the full data analysis spectrum, from asking an interesting question to acquiring, managing, manipulating, processing, querying, analyzing, and visualizing data, as well as communicating findings in written, graphical, and oral forms.
Comment: 21 pages total including supplementary material
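The "messy data" pipeline the abstract describes (acquire, clean, query, analyze) can be sketched in a few lines of standard-library Python; the records and field names below are invented purely for illustration, not taken from the course:

```python
from collections import defaultdict
from statistics import mean

# Messy incoming records, as non-traditional data often looks:
# inconsistent types and a missing value. (Values are made up.)
records = [
    {"city": "Boston", "temp_f": "41"},
    {"city": "Boston", "temp_f": "39"},
    {"city": "Austin", "temp_f": "73"},
    {"city": "Austin", "temp_f": None},   # missing value to clean
]

# Manage/clean: drop missing values and coerce strings to numbers.
clean = [{"city": r["city"], "temp_f": float(r["temp_f"])}
         for r in records if r["temp_f"] is not None]

# Query/analyze: group by city and summarize.
by_city = defaultdict(list)
for r in clean:
    by_city[r["city"]].append(r["temp_f"])
summary = {city: mean(temps) for city, temps in by_city.items()}
print(summary)  # {'Boston': 40.0, 'Austin': 73.0}
```

Each step maps to a stage of the course's data analysis spectrum; in practice a course like this would use richer tooling for the acquisition and visualization stages, which this sketch omits.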