
    Informational Substitutes

    We propose definitions of substitutes and complements for pieces of information ("signals") in the context of a decision or optimization problem, with game-theoretic and algorithmic applications. In a game-theoretic context, substitutes capture diminishing marginal value of information to a rational decision maker. We use the definitions to address the question of how and when information is aggregated in prediction markets. Substitutes characterize "best-possible" equilibria with immediate information aggregation, while complements characterize "worst-possible", delayed aggregation. Game-theoretic applications also include settings such as crowdsourcing contests and Q&A forums. In an algorithmic context, where substitutes capture diminishing marginal improvement of information to an optimization problem, substitutes imply efficient approximation algorithms for a very general class of (adaptive) information acquisition problems. In tandem with these broad applications, we examine the structure and design of informational substitutes and complements. They have equivalent, intuitive definitions from disparate perspectives: submodularity, geometry, and information theory. We also consider the design of scoring rules or optimization problems so as to encourage substitutability or complementarity, with positive and negative results. Taken as a whole, the results give some evidence that, in parallel with substitutable items, informational substitutes play a natural conceptual and formal role in game theory and algorithms.
    Comment: Full version of FOCS 2016 paper. Single-column, 61 pages (48 main text, 13 references and appendix).
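One of the equivalent views of substitutes mentioned in the abstract is submodularity: the marginal value of an extra signal shrinks as the already-observed set grows. A minimal sketch (not from the paper) that checks this diminishing-returns condition for a toy coverage-style value function; `coverage` and `universe_map` are illustrative stand-ins for a real value-of-information function:

```python
from itertools import combinations

def coverage(signals, universe_map):
    """Value of observing a set of signals: number of distinct
    outcomes they jointly cover (a classic submodular function)."""
    covered = set()
    for s in signals:
        covered |= universe_map[s]
    return len(covered)

def marginal(f, x, S, universe_map):
    """Marginal value of adding signal x to the observed set S."""
    return f(S | {x}, universe_map) - f(S, universe_map)

def is_substitutes_like(f, signals, universe_map):
    """Check diminishing marginal value: for all S <= T and x not in T,
    the gain of x given S is at least the gain of x given T."""
    subsets = [set(c) for r in range(len(signals) + 1)
               for c in combinations(signals, r)]
    for S in subsets:
        for T in subsets:
            if S <= T:
                for x in signals - T:
                    if (marginal(f, x, S, universe_map)
                            < marginal(f, x, T, universe_map)):
                        return False
    return True

universe_map = {"a": {1, 2}, "b": {2, 3}, "c": {3, 4}}
print(is_substitutes_like(coverage, {"a", "b", "c"}, universe_map))  # True
```

Note how signal "b" is worth 2 on its own but only 1 once "a" is observed, because the outcome they both cover is already known.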

    An alternative approach to firms’ evaluation: expert systems and fuzzy logic

    Discounted Cash Flow techniques are the generally accepted methods for valuing firms. Such methods do not provide explicit acknowledgment of the value determinants and overlook their interrelations. This paper proposes a different method of firm valuation based on fuzzy logic and expert systems. It represents a conceptual transposition of Discounted Cash Flow techniques but, unlike the latter, it takes explicit account of quantitative and qualitative variables and their mutual integration. Financial, strategic and business aspects are considered by focusing on twenty-nine value drivers that are combined via “if-then” rules. The output of the system is a real number in the interval [0,1], which represents the value-creation power of the firm. To corroborate the model, a sensitivity analysis is conducted. The system may be used for rating and ranking firms as well as for assessing the impact of managers’ decisions on value creation and as a tool of corporate governance.
    Keywords: firms’ evaluation, fuzzy logic, expert system, rating, acquisition, sensitivity analysis
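As a rough two-driver illustration (not the paper's actual rule base of twenty-nine drivers), the following sketch shows how fuzzy "if-then" rules over membership degrees can produce a crisp value-creation score in [0,1]; all driver names, membership shapes and thresholds are invented:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def firm_value_score(profitability, market_position):
    """Toy fuzzy rule base with two value drivers, both normalised
    to [0, 1]; the output is a crisp score in [0, 1]."""
    high_prof  = tri(profitability,   0.4, 1.0, 1.6)   # 'high' peaks at 1
    strong_pos = tri(market_position, 0.4, 1.0, 1.6)
    weak_prof  = tri(profitability,  -0.6, 0.0, 0.6)   # 'weak' peaks at 0
    # Rule 1: IF profitability is high AND position is strong THEN value is high
    r1 = min(high_prof, strong_pos)
    # Rule 2: IF profitability is weak THEN value is low
    r2 = weak_prof
    # Defuzzify: weighted average of rule consequents (high = 1, low = 0)
    if r1 + r2 == 0:
        return 0.5
    return (r1 * 1.0 + r2 * 0.0) / (r1 + r2)
```

With strong inputs the "high value" rule dominates and the score approaches 1; with weak inputs the "low value" rule pulls it toward 0.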

    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, for each reference an author-supplied abstract, a number of keywords and a classification are provided. In some cases our own comments are added. The purpose of these comments is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication) and the language of the document. After a description of the scope of the overview, classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstract, classification and comments.

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual temporally distributed events within a multiple data stream environment is explored, and a range of techniques is examined, covering model-based approaches, `programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to `learn' the features of event patterns that constitute normal behaviour, and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise.
Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together to update each other to increase detection rates and lower false positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule or state based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
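A minimal sketch of the rule-based event-correlation idea described above: a rule fires when all of its required event types co-occur inside a sliding time window over the event stream. The rule name and event types here are invented for illustration, not taken from the report:

```python
from collections import deque

class CorrelationRule:
    """Fires when all required event types are seen within a
    sliding time window (in seconds)."""
    def __init__(self, name, required_events, window):
        self.name = name
        self.required = set(required_events)
        self.window = window

class RuleCorrelator:
    def __init__(self, rules):
        self.rules = rules
        self.history = deque()  # (timestamp, event_type) pairs

    def observe(self, timestamp, event_type):
        """Record one event and return the names of rules that fire."""
        self.history.append((timestamp, event_type))
        # Drop events older than the widest rule window.
        horizon = max(r.window for r in self.rules)
        while self.history and timestamp - self.history[0][0] > horizon:
            self.history.popleft()
        fired = []
        for rule in self.rules:
            recent = {e for t, e in self.history
                      if timestamp - t <= rule.window}
            if rule.required <= recent:
                fired.append(rule.name)
        return fired

# Hypothetical misuse pattern: the same subscriber logging in at two
# sites within 60 seconds suggests cloning.
rules = [CorrelationRule("sim_cloning",
                         {"login_site_A", "login_site_B"}, 60)]
correlator = RuleCorrelator(rules)
```

The weaknesses noted in the report are visible even at this scale: the rule catches only the pattern it encodes, and the window and event vocabulary must be maintained by hand.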

    Social Dilemmas, Revisited from a Heuristics Perspective

    The standard tool for analysing social dilemmas is game theory. These dilemmas are reconstructed as prisoner's dilemma games, which is helpful for understanding the incentive structure. Yet this analysis is based on the classic homo oeconomicus assumptions. In many real-world dilemma situations, these assumptions are misleading. A case in point is the contribution of households to climate change. Decisions about using cars instead of public transport, or about extensive air conditioning, are typically not based on ad hoc calculation. Rather, individuals rely on situational heuristics for the purpose. This paper does two things: it offers a model of heuristics, in the interest of making behaviour that is guided by heuristics comparable to behaviour based on rational reasoning; based on this model, it then determines the implications for the definition of social dilemmas. In some contexts, the social dilemma vanishes. In other contexts, it must be understood, and hence solved, in substantially different ways.
    Keywords: heuristic, social dilemma, public good, prisoner’s dilemma
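The contrast between rational best response and a situational heuristic can be made concrete with a toy prisoner's dilemma sketch; the payoff values and the cue-based heuristic are assumptions for illustration, not the paper's model:

```python
# Prisoner's dilemma payoffs for the row player, with the standard
# ordering T > R > P > S (here 5 > 3 > 1 > 0).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def best_response(opponent_action):
    """Classic homo oeconomicus: choose the action maximising own payoff."""
    return max("CD", key=lambda a: PAYOFF[(a, opponent_action)])

def heuristic_response(context_cue, default="D"):
    """A situational heuristic: cooperate whenever a social cue is
    present, without running the payoff calculation at all."""
    return "C" if context_cue else default

# Defection dominates for the rational player against either action,
# so the rational outcome is (D, D) with payoff (1, 1).  Two heuristic
# players responding to a cooperation cue instead reach (C, C) with
# the Pareto-superior payoff (3, 3) -- the dilemma vanishes.
```

This is the sense in which, under heuristic-guided behaviour, the incentive structure alone no longer predicts the outcome.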

    Non-Invasive Ambient Intelligence in Real Life: Dealing with Noisy Patterns to Help Older People

    This paper aims to contribute to the field of ambient intelligence from the perspective of real environments, where noise levels in datasets are significant, by showing how machine learning techniques can contribute to knowledge creation by promoting software sensors. The created knowledge can be made actionable to develop features that help deal with problems related to minimally labelled datasets. A case study is presented and analysed with the aim of inferring high-level rules that can help to anticipate abnormal activities, and the potential benefits of integrating these technologies are discussed in this context. The contribution also analyses the usage of the models for the transfer of knowledge when different sensors with different settings contribute to the noise levels. Finally, based on the authors’ experience, a framework proposal for creating valuable and aggregated knowledge is depicted.
    Funding: This research was partially funded by Fundación Tecnalia Research & Innovation, and J.O.-M. also wants to recognise the support obtained from the EU RFCS program through project number 793505 ‘4.0 Lean system integrating workers and processes (WISEST)’ and from the grant PRX18/00036 given by the Spanish Secretaría de Estado de Universidades, Investigación, Desarrollo e Innovación del Ministerio de Ciencia, Innovación y Universidades.