1,525 research outputs found

    Monetary Neutrality

    Prize Lecture to the memory of Alfred Nobel, December 7, 1995.

    Multiple Relevant Feature Ensemble Selection Based on Multilayer Co-Evolutionary Consensus MapReduce

    Although feature selection for large data has been intensively investigated in data mining, machine learning, and pattern recognition, the challenge is not just to invent new algorithms that handle noisy and uncertain large data in applications, but rather to link multiple relevant feature sources, structured or unstructured, to develop an effective feature reduction method. In this paper, we propose a multiple relevant feature ensemble selection (MRFES) algorithm based on multilayer co-evolutionary consensus MapReduce (MCCM). We construct an effective MCCM model to handle feature ensemble selection for large-scale datasets with multiple relevant feature sources, and explore the unified consistency aggregation between the local solutions and global dominance solutions achieved by the co-evolutionary memeplexes that participate in the cooperative feature ensemble selection process. This model attempts to reach a mutual decision agreement among the co-evolutionary memeplexes, which calls for mechanisms that detect noncooperative co-evolutionary behaviors and achieve better Nash equilibrium resolutions. Extensive experimental comparisons on well-known benchmark datasets substantiate the effectiveness of MRFES for large-scale datasets with complex noise and multiple relevant feature sources. The algorithm greatly facilitates the selection of relevant feature subsets from the original feature space with better accuracy, efficiency, and interpretability. Moreover, we apply MRFES to human cerebral cortex-based classification prediction. Such successful applications are expected to significantly scale up classification prediction for large-scale and complex brain data in terms of efficiency and feasibility.
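    The following Python sketch illustrates, under simplifying assumptions, the general map/reduce pattern behind consensus-based feature ensemble selection: local feature scores are computed per data partition (map) and aggregated into a global consensus subset (reduce). It is not the MCCM/MRFES algorithm itself; the relevance measure, function names, and aggregation rule are hypothetical placeholders.

    ```python
    # A minimal, hypothetical sketch of consensus-based feature ensemble
    # selection in a map/reduce style. It is NOT the MCCM/MRFES algorithm from
    # the paper; it only illustrates combining local feature rankings computed
    # on data partitions into one global consensus subset.
    from collections import defaultdict

    import numpy as np


    def map_local_scores(partition: np.ndarray, labels: np.ndarray) -> dict[int, float]:
        """Map step: score every feature on a single data partition.

        Absolute Pearson correlation with the label stands in for whatever
        relevance measure a real system would use.
        """
        scores = {}
        for j in range(partition.shape[1]):
            col = partition[:, j]
            scores[j] = 0.0 if col.std() == 0 else abs(np.corrcoef(col, labels)[0, 1])
        return scores


    def reduce_consensus(local_scores: list[dict[int, float]], k: int) -> list[int]:
        """Reduce step: average the local scores and keep the top-k features."""
        agg = defaultdict(list)
        for scores in local_scores:
            for j, s in scores.items():
                agg[j].append(s)
        consensus = {j: float(np.mean(v)) for j, v in agg.items()}
        return sorted(consensus, key=consensus.get, reverse=True)[:k]


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X, y = rng.normal(size=(300, 10)), rng.integers(0, 2, size=300)
        partitions = np.array_split(np.arange(300), 3)
        local = [map_local_scores(X[idx], y[idx]) for idx in partitions]
        print(reduce_consensus(local, k=4))   # indices of the consensus features
    ```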

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed by humans because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to think as humans do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models remains to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification, and Information-Based Complexity.
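    As a rough illustration of the kind of object the paper is after, an "optimal" estimator in the Wald/decision-theoretic sense can be written schematically as a minimax problem over the admissible set of scenarios; the notation below is generic and not taken verbatim from the paper.

    ```latex
    % Schematic worst-case (minimax) formulation in the spirit of Wald's
    % decision theory: \mathcal{A} is the set of admissible scenarios
    % (models/measures consistent with the available information), \theta
    % ranges over candidate estimators, and \mathcal{E} is an error/loss
    % functional.
    \[
      \theta^{\star} \;\in\; \arg\min_{\theta}\;
      \sup_{\mu \in \mathcal{A}} \; \mathcal{E}(\theta, \mu)
    \]
    ```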

    Shared Nearest-Neighbor Quantum Game-Based Attribute Reduction with Hierarchical Coevolutionary Spark and Its Application in Consistent Segmentation of Neonatal Cerebral Cortical Surfaces

    The unprecedented increase in data volume has become a severe challenge for conventional patterns of data mining and learning systems tasked with handling big data. The recently introduced Spark platform is a new processing method for big data analysis and related learning systems, which has attracted increasing attention from both the scientific community and industry. In this paper, we propose a shared nearest-neighbor quantum game-based attribute reduction (SNNQGAR) algorithm that incorporates the hierarchical coevolutionary Spark model. We first present a shared coevolutionary nearest-neighbor hierarchy with self-evolving compensation that considers the features of nearest-neighborhood attribute subsets and calculates the similarity between attribute subsets according to the shared neighbor information of attribute sample points. We then present a novel attribute weight tensor model to generate ranking vectors of attributes and apply them to balance the relative contributions of different neighborhood attribute subsets. To optimize the model, we propose an embedded quantum equilibrium game paradigm (QEGP) to ensure that noisy attributes do not degrade the big data reduction results. A combination of the hierarchical coevolutionary Spark model and an improved MapReduce framework is then constructed so that it can better parallelize the SNNQGAR and efficiently determine the preferred reduction solutions of the distributed attribute subsets. The experimental comparisons demonstrate the superior performance of the SNNQGAR, which outperforms most of the state-of-the-art attribute reduction algorithms. Moreover, the results indicate that the SNNQGAR can be successfully applied to segment overlapping and interdependent fuzzy cerebral tissues, and it exhibits stable and consistent segmentation performance for neonatal cerebral cortical surfaces.
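    As a rough, self-contained illustration of the shared nearest-neighbor idea mentioned above (and only that part: the quantum game paradigm and the Spark/MapReduce parallelization are not modeled), the following Python sketch scores two attribute subsets by how much of the sample neighborhood structure they induce in common; all names and the similarity measure are hypothetical.

    ```python
    # A minimal, hypothetical sketch of a shared nearest-neighbor (SNN)
    # similarity between two attribute subsets: each subset induces a
    # k-nearest-neighbor structure on the samples, and the subsets are scored
    # by how much of that neighborhood structure they share. This illustrates
    # only the SNN idea, not the SNNQGAR algorithm, its quantum game paradigm,
    # or its Spark parallelization.
    import numpy as np


    def knn_sets(X: np.ndarray, k: int) -> list[set[int]]:
        """k nearest neighbors (Euclidean distance) of every sample, as index sets."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)          # a sample is not its own neighbor
        return [set(np.argsort(row)[:k]) for row in d]


    def snn_similarity(X: np.ndarray, subset_a: list[int], subset_b: list[int], k: int = 5) -> float:
        """Average Jaccard overlap of the neighborhoods induced by two attribute subsets."""
        na = knn_sets(X[:, subset_a], k)
        nb = knn_sets(X[:, subset_b], k)
        return float(np.mean([len(a & b) / len(a | b) for a, b in zip(na, nb)]))


    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 12))       # 100 samples, 12 attributes
        print(snn_similarity(X, subset_a=[0, 1, 2], subset_b=[2, 3, 4]))
    ```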

    On constructions of quantum-secure device-independent randomness expansion protocols

    Device-independent randomness expansion protocols aim to expand a short uniformly random string into a much longer one whilst guaranteeing that their output is truly random. They are device-independent in the sense that this guarantee does not depend on the specifics of an implementation. Rather, through the observation of nonlocal correlations we can conclude that the outputs generated are necessarily random. This thesis reports a general method for constructing these protocols and evaluating their security. Using this method, we then construct several explicit protocols and analyse their performance on noisy qubit systems. With a view towards near-future quantum technologies, we also investigate whether randomness expansion is possible using current nonlocality experiments. We find that, by combining recent theoretical and experimental advances, it is indeed now possible to reliably and securely expand randomness.
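    For a concrete sense of how nonlocal correlations certify randomness, one well-known bound of this type (due to Pironio et al., Nature 2010; not necessarily the bound derived in this thesis) relates the observed CHSH value S to the min-entropy of each output bit:

    ```latex
    % Min-entropy bound per output bit as a function of the CHSH value S
    % (for 2 < S <= 2\sqrt{2}); it vanishes at the local bound S = 2 and
    % reaches 1 at the Tsirelson bound S = 2\sqrt{2}.
    \[
      H_{\min} \;\ge\; 1 - \log_{2}\!\left(1 + \sqrt{\,2 - \tfrac{S^{2}}{4}\,}\right)
    \]
    ```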

    A Game-theoretical Perspective

    The design of economic policy is presented as a process of repeated interaction between a centralized policy authority and private individuals. This interaction is inherently strategic and therefore lends itself to game-theoretic treatment. The macroeconomic models developed in this work on the basis of game theory serve to formalize a dynamic process of policy-making. In these games, an autonomously acting private sector forms expectations about the future actions of a government. The resulting feedback from the private sector's expectation formation to the government's incentives has far-reaching consequences for the evolution of both nominal and real quantities in the economy. Building on the fundamental principle of rational expectations, the analysis develops a conceptualization of macroeconomic credibility (time consistency) and reputation against the background of structural reform and stabilization policy in developing countries. Particular attention is devoted to different approaches to monetary and exchange-rate policy for fighting inflation.
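    A textbook Barro-Gordon example (a standard teaching model, not the dissertation's own specification) makes the credibility and time-consistency mechanism concrete: the policymaker is tempted to exploit given private-sector expectations, and rational expectations neutralize that temptation at the cost of an inflation bias.

    ```latex
    % Textbook Barro-Gordon illustration of the time-consistency problem
    % (a standard teaching model, not the dissertation's own specification).
    % The policymaker chooses inflation \pi, taking private expectations
    % \pi^{e} as given, to minimize
    \[
      L = \tfrac{1}{2}\,\pi^{2} \;-\; \lambda\,\bigl(\pi - \pi^{e}\bigr),
      \qquad \lambda > 0 .
    \]
    % Discretionary optimization gives \pi = \lambda; under rational
    % expectations \pi^{e} = \lambda as well, so there is no output gain but
    % inflation is inefficiently high (the inflation bias). A credible
    % commitment would instead sustain \pi = \pi^{e} = 0 with strictly lower loss:
    \[
      \pi^{\text{discretion}} = \pi^{e} = \lambda,
      \qquad
      \pi^{\text{commitment}} = 0 .
    \]
    ```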