Combining Voting Rules Together
We propose a simple method for combining together voting rules that performs
a run-off between the different winners of each voting rule. We prove that this
combinator has several good properties. For instance, even if just one of the
base voting rules has a desirable property like Condorcet consistency, the
combination inherits this property. In addition, we prove that combining voting
rules together in this way can make finding a manipulation more computationally
difficult. Finally, we study the impact of this combinator on approximation
methods that find close-to-optimal manipulations.
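The run-off idea in the abstract can be illustrated with a small sketch. The base rules (plurality and Borda here), the ballot format, and the choice of plurality on restricted ballots as the run-off are all our own illustrative assumptions, not the paper's exact construction.

```python
# Hypothetical sketch of a run-off combinator for voting rules: apply each
# base rule, collect the distinct winners, then decide among those winners
# by plurality over ballots restricted to them.

from collections import Counter

def plurality(profile):
    # Winner = candidate ranked first by the most voters.
    return Counter(ballot[0] for ballot in profile).most_common(1)[0][0]

def borda(profile):
    # Winner = candidate with the highest Borda score
    # (a candidate in position p on a ballot of m candidates scores m-1-p).
    scores = Counter()
    m = len(profile[0])
    for ballot in profile:
        for pos, cand in enumerate(ballot):
            scores[cand] += m - 1 - pos
    return scores.most_common(1)[0][0]

def runoff_combinator(profile, rules):
    winners = []
    for rule in rules:
        w = rule(profile)
        if w not in winners:
            winners.append(w)
    # Run-off: restrict each ballot to the base winners, then use plurality
    # on the restricted ballots (one simple way to realize a run-off).
    restricted = [[c for c in ballot if c in winners] for ballot in profile]
    return plurality(restricted)

profile = [
    ["a", "b", "c"],
    ["a", "c", "b"],
    ["b", "c", "a"],
    ["c", "b", "a"],
    ["c", "b", "a"],
]
print(runoff_combinator(profile, [plurality, borda]))  # → "c"
```

On this profile the two base rules disagree (plurality elects "a", Borda elects "c"), and the run-off between their winners settles the election.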
Wisdom of artificial crowds feature selection in untargeted metabolomics: An application to the development of a blood-based diagnostic test for thrombotic myocardial infarction
Introduction: Heart disease remains a leading cause of global mortality. While acute myocardial infarction (MI; colloquially, a heart attack) has multiple proximate causes, proximate etiology cannot be determined by a blood-based diagnostic test. We enrolled a suitable patient cohort and performed a non-targeted quantification of plasma metabolites by mass spectrometry in order to develop a test that can differentiate between thrombotic MI, non-thrombotic MI, and stable disease. A significant challenge in developing such a diagnostic test is solving the NP-hard problem of feature selection for constructing an optimal statistical classifier.
Objective: We employed a Wisdom of Artificial Crowds (WoAC) strategy to solve the feature selection problem and evaluated the accuracy and parsimony of the downstream classifiers against traditional feature selection techniques, including the Lasso and selection by Random Forest variable importance criteria.
Materials and methods: Artificial crowd wisdom was generated by aggregating the best solutions from independent and diverse genetic algorithm populations that were initialized with bootstrapping and a random-subspaces constraint.
Results/Conclusions: We observed strong evidence that a statistical classifier using WoAC feature selection can discriminate between human subjects presenting with thrombotic MI, non-thrombotic MI, and stable coronary artery disease given the abundances of selected plasma metabolites. Using the abundances of twenty selected metabolites, a leave-one-out cross-validated misclassification rate of 2.6% was observed. However, the WoAC feature selection strategy did not outperform the Lasso in the current study.
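The aggregation ("wisdom") step described in the methods can be sketched as follows. Only the crowd-aggregation step is shown; the genetic algorithm, the bootstrapping, and the random-subspace constraint are elided, and all feature names and counts are hypothetical.

```python
# Illustrative sketch of Wisdom-of-Artificial-Crowds aggregation: given the
# best feature subsets found by several independent GA runs ("crowd members"),
# vote features by how often they were selected and keep the top k.

from collections import Counter

def aggregate_crowd(best_subsets, k):
    """Return the k features most frequently selected across crowd members."""
    votes = Counter(f for subset in best_subsets for f in subset)
    return [feature for feature, _ in votes.most_common(k)]

# Best subsets from five hypothetical independent GA populations.
crowd = [
    ["m1", "m3", "m7"],
    ["m1", "m3", "m9"],
    ["m3", "m5", "m7"],
    ["m1", "m3", "m5"],
    ["m2", "m3", "m7"],
]
print(aggregate_crowd(crowd, 3))  # → ['m3', 'm1', 'm7']
```

The consensus features (here the three most-voted metabolites) would then feed the downstream classifier, rather than any single run's best subset.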
Human Computation and Convergence
Humans are the most effective integrators and producers of information,
directly and through the use of information-processing inventions. As these
inventions become increasingly sophisticated, the substantive role of humans in
processing information will tend toward capabilities that derive from our most
complex cognitive processes, e.g., abstraction, creativity, and applied world
knowledge. Through the advancement of human computation - methods that leverage
the respective strengths of humans and machines in distributed
information-processing systems - formerly discrete processes will combine
synergistically into increasingly integrated and complex information processing
systems. These new, collective systems will exhibit an unprecedented degree of
predictive accuracy in modeling physical and techno-social processes, and may
ultimately coalesce into a single unified predictive organism, with the
capacity to address society's most wicked problems and achieve planetary
homeostasis.
Comment: Pre-publication draft of chapter. 24 pages, 3 figures; added
references to pages 1 and 3, and corrected typos.
Efficiency Theory: a Unifying Theory for Information, Computation and Intelligence
The paper serves as the first contribution towards the development of the
theory of efficiency: a unifying framework for the currently disjoint theories
of information, complexity, communication and computation. Observing that the
brute-force approach underlies the fundamental concepts of all of these
fields, the paper suggests using efficiency, that is, improvement over the
brute-force algorithm, as the common unifying factor needed to create a
unified theory of information manipulation. By
defining such diverse terms as randomness, knowledge, intelligence and
computability in terms of a common denominator we are able to bring together
contributions from Shannon, Levin, Kolmogorov, Solomonoff, Chaitin, Yao and
many others under a common umbrella of the efficiency theory
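The abstract's central notion, efficiency as improvement over brute force, can be made concrete with a toy measurement. The task, the baseline, and the improvement ratio below are our own illustrative choices, not definitions taken from the paper: the brute-force baseline is a linear scan over a sorted list, the efficient algorithm is binary search, and "efficiency" is the ratio of their step counts.

```python
# Toy illustration of "efficiency" as improvement over brute force:
# count the primitive steps each algorithm takes on the same search task.

def linear_search_steps(xs, target):
    # Brute-force baseline: scan every element until the target is found.
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            break
    return steps

def binary_search_steps(xs, target):
    # Efficient algorithm: halve the candidate range each iteration.
    lo, hi, steps = 0, len(xs) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            break
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

xs = list(range(1_000_000))
brute = linear_search_steps(xs, 999_999)   # 1,000,000 steps (worst case)
smart = binary_search_steps(xs, 999_999)   # roughly log2(1,000,000) steps
print(brute / smart)  # improvement factor over brute force
```

In this framing, a task admits "knowledge" or "intelligence" to the extent that an algorithm beats exhaustive search by a large factor; here the factor is on the order of 50,000.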
Construction of an NP Problem with an Exponential Lower Bound
In this paper we present the Hashed-Path Traveling Salesperson Problem
(HPTSP), a new type of problem with the interesting property of having no
polynomial-time solutions. We then show that HPTSP is in the class NP by
demonstrating that local information about sub-routes is insufficient to
compute the complete value of each route. As a consequence, via Ladner's
theorem, we show that the class NPI is non-empty.
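The "hashed path" idea can be illustrated with a toy version. Scoring a tour by a hash of the entire vertex sequence means sub-route values carry no information about the full route, so nothing obvious prunes the search; yet verifying a claimed (tour, score) pair takes one hash. The scoring function below is our own stand-in, not the paper's exact construction.

```python
# Toy sketch of a hashed-path tour score: the value of a route depends on
# the whole vertex sequence at once, via a cryptographic hash, so it cannot
# be composed from sub-route values.

import hashlib
from itertools import permutations

def route_score(route):
    # Hash the entire path; any change anywhere scrambles the score.
    digest = hashlib.sha256("-".join(map(str, route)).encode()).hexdigest()
    return int(digest, 16)

def best_route_brute_force(n):
    # Exhaustive search over all tours: local sub-route information
    # offers no help in pruning this enumeration.
    return min(permutations(range(n)), key=route_score)

def verify(route, claimed_score):
    # Certificate check is polynomial: recompute a single hash.
    return route_score(route) == claimed_score

best = best_route_brute_force(6)
assert verify(best, route_score(best))
```

The asymmetry is the point: `verify` is one hash, while `best_route_brute_force` enumerates all n! tours, and the hash destroys any structure a dynamic-programming or branch-and-bound method would exploit.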