
    Optimal strategies for a game on amenable semigroups

The semigroup game is a two-person zero-sum game defined on a semigroup S as follows: Players 1 and 2 choose elements x and y in S, respectively, and player 1 receives a payoff f(xy) defined by a function f from S to [-1,1]. If the semigroup is amenable in the sense of Day and von Neumann, one can extend the set of classical strategies, namely countably additive probability measures on S, to include some finitely additive measures in a natural way. This extended game has a value and the players have optimal strategies. This theorem extends previous results for the multiplication game on a compact group or on the positive integers with a specific payoff. We also prove that the procedure of extending the set of allowed strategies preserves classical solutions: if a semigroup game has a classical solution, this solution also solves the extended game. Comment: 17 pages. To appear in International Journal of Game Theory.
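A finite group is a trivially amenable case in which the uniform measure plays the role of the invariant mean. The following minimal sketch (the group Z_5 and the payoff values are illustrative choices, not from the paper) shows why the uniform strategy is optimal there: because x ↦ xy is a bijection of the group, player 1's expected payoff under the uniform measure equals the mean of f no matter what player 2 plays.

```python
from fractions import Fraction

# Finite cyclic group Z_n under addition mod n: a trivially amenable
# semigroup whose invariant (Haar) measure is the uniform measure.
n = 5
f = {0: Fraction(1), 1: Fraction(-1, 2), 2: Fraction(1, 3),
     3: Fraction(-1), 4: Fraction(1, 4)}   # illustrative payoff f : S -> [-1, 1]

def expected_payoff(y):
    """Player 1 plays the uniform measure; player 2 plays the fixed element y."""
    return sum(f[(x + y) % n] for x in range(n)) / n

# Since x -> x + y permutes the group, the expectation is the mean of f
# for every y, so the uniform measure guarantees the value mean(f).
value = sum(f.values()) / n
assert all(expected_payoff(y) == value for y in range(n))
```

For genuinely infinite amenable semigroups the analogous object is a finitely additive invariant mean, which is exactly the extension of strategies the abstract describes.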

    The Value of p-Value in Biomedical Research

Significance tests and the corresponding p-values play a crucial role in decision making. In this commentary the meaning, interpretation and misinterpretation of p-values are presented. Alternatives for evaluating the reported evidence are also discussed.
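The operational meaning of a p-value can be made concrete with a sketch: it is the probability, computed under the null hypothesis, of a test statistic at least as extreme as the one observed. A two-sample permutation test makes this definition executable with the standard library alone; the two samples below are hypothetical.

```python
import random
from statistics import mean

random.seed(0)
a = [5.1, 4.8, 5.6, 5.3, 5.0]   # hypothetical sample A
b = [4.2, 4.5, 4.1, 4.7, 4.4]   # hypothetical sample B
observed = mean(a) - mean(b)

# Under H0 the group labels are exchangeable, so the (two-sided) p-value
# is the fraction of random relabellings whose mean difference is at
# least as extreme as the observed one.
pooled = a + b
n_perm = 10_000
count = 0
for _ in range(n_perm):
    random.shuffle(pooled)
    stat = mean(pooled[:len(a)]) - mean(pooled[len(a):])
    if abs(stat) >= abs(observed):
        count += 1
p_value = count / n_perm
# A small p-value says the observed difference would be rare if H0 held;
# it is NOT the probability that H0 is true -- a common misinterpretation.
```

The final comment is the central point of such commentaries: the p-value conditions on the null, not on the data.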

    A weak characterization of slow variables in stochastic dynamical systems

We present a novel characterization of slow variables for continuous Markov processes that provably preserve the slow timescales. These slow variables are known as reaction coordinates in molecular dynamics applications, where they play a key role in system analysis and coarse graining. The defining characteristic of these slow variables is that they parametrize a so-called transition manifold, a low-dimensional manifold in a certain density function space that emerges with progressive equilibration of the system's fast variables. The existence of said manifold was previously predicted for certain classes of metastable and slow-fast systems. However, in the original work, the existence of the manifold hinges on the pointwise convergence of the system's transition density functions towards it. We show in this work that convergence in average with respect to the system's stationary measure is sufficient to yield reaction coordinates with the same key qualities. This allows one to accurately predict the timescale preservation in systems where the old theory is not applicable or would give overly pessimistic results. Moreover, the new characterization is still constructive, in that it allows for the algorithmic identification of a good slow variable. The improved characterization, the error prediction and the variable construction are demonstrated on a small metastable system.
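The slow-fast setting the abstract refers to can be illustrated with a toy stochastic system (not the paper's construction): a slow variable x and a fast variable y that relaxes towards x on the timescale eps. After the fast transient, y fluctuates tightly around x, so the projection (x, y) ↦ x is a reaction coordinate that retains the slow dynamics. A minimal Euler-Maruyama simulation, with all coefficients chosen for illustration:

```python
import math
import random

random.seed(1)
# Toy slow-fast SDE (eps << 1 makes y fast):
#   dx = -x dt             + 0.1 dW1                  (slow variable)
#   dy = -(y - x)/eps dt   + (0.1/sqrt(eps)) dW2      (fast, slaved to x)
eps, dt, steps = 0.01, 1e-4, 50_000
x, y = 1.0, -1.0                      # start with a large gap |y - x| = 2
for _ in range(steps):
    x += -x * dt + 0.1 * math.sqrt(dt) * random.gauss(0, 1)
    y += (-(y - x) / eps) * dt + (0.1 / math.sqrt(eps)) * math.sqrt(dt) * random.gauss(0, 1)
# After a transient of order eps, y tracks x up to small fluctuations,
# so the slow coordinate x alone parametrizes the long-time behaviour.
gap = abs(y - x)                      # small compared to the initial gap of 2
```

In the paper's language, the equilibrated conditional law of y given x is what traces out the low-dimensional transition manifold.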

    Bayesian modeling of recombination events in bacterial populations

Background: We consider the discovery of recombinant segments jointly with their origins within multilocus DNA sequences from bacteria representing heterogeneous populations of fairly closely related species. The currently available methods for recombination detection capable of probabilistic characterization of uncertainty have a limited applicability in practice as the number of strains in a data set increases. Results: We introduce a Bayesian spatial structural model representing the continuum of origins over sites within the observed sequences, including a probabilistic characterization of uncertainty related to the origin of any particular site. To enable a statistically accurate and practically feasible approach to the analysis of large-scale data sets representing a single genus, we have developed a novel software tool (BRAT, Bayesian Recombination Tracker) implementing the model and the corresponding learning algorithm, which is capable of identifying the posterior optimal structure and of estimating the marginal posterior probabilities of putative origins over the sites. Conclusion: A multitude of challenging simulation scenarios and an analysis of real data from seven housekeeping genes of 120 strains of genus Burkholderia are used to illustrate the possibilities offered by our approach. The software is freely available for download at URL http://web.abo.fi/fak/mnf//mate/jc/software/brat.html.
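The core per-site computation behind such origin assignment can be sketched as a plain Bayes update: the posterior probability of each candidate origin population is proportional to its prior times the probability of the observed base under that population's allele frequencies. This sketch deliberately omits the spatial structure along the sequence that is the key feature of the BRAT model; the population names and frequencies are hypothetical.

```python
# Minimal per-site posterior over putative origins:
#   P(origin | base) ∝ P(origin) * P(base | origin)
allele_freqs = {                      # hypothetical per-origin base frequencies
    "pop_A": {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    "pop_B": {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
}
prior = {"pop_A": 0.5, "pop_B": 0.5}  # flat prior over candidate origins

def site_posterior(base):
    """Normalized posterior over origins for one observed base at one site."""
    unnorm = {o: prior[o] * allele_freqs[o][base] for o in prior}
    z = sum(unnorm.values())
    return {o: p / z for o, p in unnorm.items()}

post = site_posterior("T")            # an observed T strongly favours pop_B
```

A spatial model additionally couples neighbouring sites, so that contiguous runs of sites share an origin rather than flipping independently.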

    A frequentist framework of inductive reasoning

Reacting against the limitation of statistics to decision procedures, R. A. Fisher proposed for inductive reasoning the use of the fiducial distribution, a parameter-space distribution of epistemological probability transferred directly from limiting relative frequencies rather than computed according to the Bayes update rule. The proposal is developed as follows using the confidence measure of a scalar parameter of interest. (With the restriction to one-dimensional parameter space, a confidence measure is essentially a fiducial probability distribution free of complications involving ancillary statistics.) A betting game establishes a sense in which confidence measures are the only reliable inferential probability distributions. The equality between the probabilities encoded in a confidence measure and the coverage rates of the corresponding confidence intervals ensures that the measure's rule for assigning confidence levels to hypotheses is uniquely minimax in the game. Although a confidence measure can be computed without any prior distribution, previous knowledge can be incorporated into confidence-based reasoning. To adjust a p-value or confidence interval for prior information, the confidence measure from the observed data can be combined with one or more independent confidence measures representing previous agent opinion. (The former confidence measure may correspond to a posterior distribution with frequentist matching of coverage probabilities.) The representation of subjective knowledge in terms of confidence measures rather than prior probability distributions preserves approximate frequentist validity. Comment: major revision.
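A standard textbook instance of a confidence measure (not specific to this paper) is the normal mean with known standard deviation: the confidence distribution is N(x̄, σ/√n), and its quantiles are exactly the one-sided confidence bounds, which is the coverage-matching property the abstract invokes. A sketch with the standard library, using hypothetical summary statistics:

```python
import math
from statistics import NormalDist

# Confidence measure for a normal mean with known sigma: the confidence
# CDF is C(theta) = Phi((theta - xbar) / (sigma / sqrt(n))), i.e. the
# distribution N(xbar, sigma / sqrt(n)) over the parameter space.
xbar, sigma, n = 10.0, 2.0, 25        # hypothetical data summary
se = sigma / math.sqrt(n)
conf = NormalDist(mu=xbar, sigma=se)  # the confidence distribution itself

upper_95 = conf.inv_cdf(0.95)         # one-sided 95% upper confidence bound
level = conf.cdf(upper_95)            # confidence assigned to "mu <= upper_95"
```

By construction the confidence level assigned to the hypothesis mu ≤ upper_95 equals the frequentist coverage rate of that bound, which is the equality the betting-game argument turns into a minimax optimality property.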

    Bayesian calibration, validation and uncertainty quantification for predictive modelling of tumour growth: a tutorial

In this work we present a pedagogical tumour growth example, in which we apply calibration and validation techniques to an uncertain, Gompertzian model of tumour spheroid growth. The key contribution of this article is the discussion and application of these methods (that are not commonly employed in the field of cancer modelling) in the context of a simple model, whose deterministic analogue is widely known within the community. In the course of the example we calibrate the model against experimental data that is subject to measurement errors, and then validate the resulting uncertain model predictions. We then analyse the sensitivity of the model predictions to the underlying measurement model. Finally, we propose an elementary learning approach for tuning a threshold parameter in the validation procedure in order to maximize predictive accuracy of our validated model.
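The calibration step can be sketched in miniature: take the deterministic Gompertz growth curve V(t) = V0 exp((a/b)(1 − e^{−bt})), assume Gaussian measurement error, and compute a posterior over (a, b) on a grid with a flat prior. Everything below is a self-contained illustration, not the paper's data or algorithm; the "measurements" are synthetic, with fixed offsets standing in for noise so the example is reproducible.

```python
import math

def gompertz(t, v0, a, b):
    """Deterministic Gompertz volume V(t) = V0 * exp((a/b) * (1 - exp(-b*t)))."""
    return v0 * math.exp((a / b) * (1.0 - math.exp(-b * t)))

# Synthetic measurements: true curve plus fixed offsets standing in for
# Gaussian measurement error (all values hypothetical).
true_a, true_b, v0, noise_sd = 1.0, 0.5, 1.0, 0.05
times = [0.5, 1.0, 2.0, 3.0, 4.0]
offsets = [0.03, -0.02, 0.04, -0.05, 0.01]
data = [gompertz(t, v0, true_a, true_b) + e for t, e in zip(times, offsets)]

# Grid posterior over (a, b): flat prior, Gaussian likelihood, so the
# log-posterior is the log-likelihood up to a constant.
grid = [(a / 20, b / 20) for a in range(10, 31) for b in range(5, 21)]

def log_lik(a, b):
    return sum(-0.5 * ((d - gompertz(t, v0, a, b)) / noise_sd) ** 2
               for t, d in zip(times, data))

a_map, b_map = max(grid, key=lambda p: log_lik(*p))  # posterior mode
```

Validation in the paper's sense then compares the calibrated model's predictive distribution against held-out data; the grid posterior above is the simplest stand-in for the Bayesian machinery that produces those predictions.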

    The ethics of entrepreneurial philanthropy

A salient if under-researched feature of the new age of global inequalities is the rise to prominence of entrepreneurial philanthropy, the pursuit of transformational social goals through philanthropic investment in projects animated by entrepreneurial principles. Super-wealthy entrepreneurs in this way extend their suzerainty from the domain of the economic to the domains of the social and political. We explore the ethics and ethical implications of entrepreneurial philanthropy through systematic comparison with what we call customary philanthropy, which favours support for established institutions and social practices. We analyse the ethical statements made at interview by 24 elite UK philanthropists, 12 customary and 12 entrepreneurial, to reveal the co-existence of two ethically charged narratives of elite philanthropic motivations, each instrumental in maintaining the established socio-economic order. We conclude that entrepreneurial philanthropy, as an ostensibly efficacious instrument of social justice, is ethically flawed by its unremitting impulse toward ideological purity.

    Different Drivers: Exploring employee involvement in corporate philanthropy

Corporate Philanthropy (CP) is multi-dimensional, differs between sectors and involves both individual and organizational decision-making to achieve business and social goals. However, the CP literature characteristically focuses on strategic decisions made by business leaders and ignores the role of employees, especially those in lower status and lower paid positions. To redress this imbalance, we conducted a qualitative study of employees' involvement in CP processes in ten workplaces in the South East of England to identify whether and how they are involved in CP decision-making and to capture their perspective on the nature of CP and the benefits generated by such activities. We specifically chose to study workplaces where employees are involved in the actual execution of the CP strategy, prioritising companies with a visible presence on the high street. The results illustrate the benefits of involving employees in CP decision-making, which we argue derive in part from the 'liminal-like states' that typify CP activities organised by shop floor staff, involving the temporary overturning of hierarchies, humanising of workplaces and opportunities for lower-level staff to prioritise their personal philanthropic preferences and signal their charitable identity to colleagues and customers. Whilst the data also suggests that CP decision-making remains predominantly top-down and driven by profit-oriented goals, we conclude that employees should be involved in choosing charitable causes as well as in designing and implementing workplace fundraising, in order to maximise the advantages of CP for the company and for wider society.