
    Evolution of Fairness in the Not Quite Ultimatum Game

    The Ultimatum Game (UG) is an economic game in which two players (proposer and responder) decide how to split a certain amount of money. While traditional economic theories based on rational decision making predict that the proposer should make a minimal offer and the responder should accept it, human subjects tend to behave more fairly in UG. Previous studies suggested that extra information such as reputation, empathy, or spatial structure is needed for fairness to evolve in UG. Here we show that fairness can evolve without additional information if players make decisions probabilistically and may continue interactions when an offer is rejected, a variant we call the Not Quite Ultimatum Game (NQUG). Evolutionary simulations of NQUG showed that probabilistic decision making pushes proposers' offers upward to avoid rejection, while repetition of the game works to responders' advantage because they can wait until a good offer arrives. These simple extensions greatly promote the evolution of fairness in both proposers' offers and responders' acceptance thresholds.
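    The mechanics described above are simple enough to sketch directly. The following is a minimal illustrative simulation of an NQUG-style population, assuming a logistic (soft) acceptance rule, a fixed probability that a rejected interaction is retried, and fitness-proportional selection with Gaussian mutation; the parameter values and functional forms are assumptions for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

POP = 100        # population size (assumed)
GENS = 500       # number of generations (assumed)
REPEAT_P = 0.8   # probability a rejected interaction is retried (assumed)
NOISE = 0.05     # softness of the probabilistic acceptance rule (assumed)
MUT = 0.02       # mutation step size (assumed)

# each individual carries an offer p and an acceptance threshold q in [0, 1]
offers = rng.uniform(0, 1, POP)
thresholds = rng.uniform(0, 1, POP)

def play(p, q):
    """One (possibly repeated) interaction; returns proposer and responder payoffs."""
    while True:
        accept_prob = 1.0 / (1.0 + np.exp(-(p - q) / NOISE))  # soft acceptance
        if rng.random() < accept_prob:
            return 1.0 - p, p      # offer accepted: split the unit pie
        if rng.random() > REPEAT_P:
            return 0.0, 0.0        # interaction ends with no agreement

for gen in range(GENS):
    payoff = np.zeros(POP)
    responders = rng.permutation(POP)      # everyone proposes once and responds once
    for i, j in enumerate(responders):
        prop_pay, resp_pay = play(offers[i], thresholds[j])
        payoff[i] += prop_pay
        payoff[j] += resp_pay
    # fitness-proportional selection with small Gaussian mutations
    fitness = payoff - payoff.min() + 1e-9
    parents = rng.choice(POP, POP, p=fitness / fitness.sum())
    offers = np.clip(offers[parents] + rng.normal(0, MUT, POP), 0, 1)
    thresholds = np.clip(thresholds[parents] + rng.normal(0, MUT, POP), 0, 1)

print(f"mean offer {offers.mean():.2f}, mean acceptance threshold {thresholds.mean():.2f}")
```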

    Statistical Analysis of the Efficiency of an Integrated Voltage Regulator by Means of a Machine Learning Model Coupled with Kriging Regression

    This paper presents a preliminary version of a probabilistic model for the uncertainty quantification of complex electronic systems, obtained by combining the least-squares support vector machine (LS-SVM) with Gaussian process (GP) regression. The proposed model, trained with a limited set of training pairs provided by expensive full-wave simulations, is used to predict the efficiency of an integrated voltage regulator (IVR) with 8 uniformly distributed random parameters. The accuracy and feasibility of the proposed model are assessed by comparing its predictions and confidence intervals with the results of a Monte Carlo (MC) full-wave simulation of the device.
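    As a rough illustration of the surrogate-modeling idea, the sketch below fits a plain Gaussian process regressor (standing in for the paper's LS-SVM/GP combination) to a small set of 8-parameter training points from a placeholder "full-wave" function, then uses the cheap surrogate and its predictive standard deviation for a Monte Carlo sweep; the toy response function, sample sizes, and kernel choice are all assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def fullwave_efficiency(x):
    """Placeholder for the expensive full-wave solver (toy response, not the real IVR)."""
    return 0.85 - 0.05 * np.sum((x - 0.5) ** 2, axis=1)

# a limited training set of 8-parameter points, as if from expensive simulations
X_train = rng.uniform(0.0, 1.0, (40, 8))
y_train = fullwave_efficiency(X_train)

gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=np.ones(8)),
    normalize_y=True,
)
gp.fit(X_train, y_train)

# cheap surrogate-based Monte Carlo sweep with predictive confidence intervals
X_mc = rng.uniform(0.0, 1.0, (10_000, 8))
mean, std = gp.predict(X_mc, return_std=True)
print(f"mean predicted efficiency {mean.mean():.3f}, "
      f"average 95% CI half-width {1.96 * std.mean():.3f}")
```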

    Sensitivity of Climate Change Projections to Uncertainties in the Estimates of Observed Changes in Deep-Ocean Heat Content

    Abstract and PDF report are also available on the MIT Joint Program on the Science and Policy of Global Change website (http://globalchange.mit.edu/). The MIT 2D climate model is used to make probabilistic projections of changes in global mean surface temperature and of thermosteric sea level rise under a variety of forcing scenarios. The uncertainties in climate sensitivity and in the rate of heat uptake by the deep ocean are quantified using probability distributions derived from observed 20th century temperature changes. The impact on climate change projections of using the smallest and largest estimates of 20th century deep-ocean warming is explored. The impact is large in the case of global mean thermosteric sea level rise: in the MIT reference ("business as usual") scenario the median rise by 2100 is 27 and 43 cm in the respective cases. The impact on increases in global mean surface air temperature is more modest, 4.9°C and 3.9°C in the two respective cases, because of the correlation between climate sensitivity and ocean heat uptake required by 20th century surface and upper-air temperature changes. The results are also compared with the projections made by the IPCC AR4 multi-model ensemble for several of the SRES scenarios. The multi-model projections are more consistent with the MIT projections based on the largest estimate of ocean warming. However, the range for the rate of heat uptake by the ocean suggested by the lowest estimate of ocean warming is more consistent with the range suggested by the 20th century changes in surface and upper-air temperatures combined with an expert prior for climate sensitivity. This work was supported in part by the Office of Science (BER), U.S. Dept. of Energy Grant No. DE-FG02-93ER61677, NSF, and by the MIT Joint Program on the Science and Policy of Global Change.
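    A minimal sketch of the kind of Monte Carlo propagation the abstract describes is shown below: sample correlated values of climate sensitivity and ocean heat uptake from a joint distribution and push them through a crude emulator for 2100 warming and thermosteric sea level rise. The distribution, the emulator formulas, and every number are purely illustrative stand-ins, not the MIT 2D model or its calibrated probability distributions.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# toy joint distribution of climate sensitivity S (K) and effective ocean heat
# uptake Kv (arbitrary units); the positive correlation mimics the constraint
# from 20th-century observations (all numbers are illustrative)
mean = [3.0, 1.0]
cov = [[0.8, 0.3],
       [0.3, 0.5]]
S, Kv = rng.multivariate_normal(mean, cov, N).T
S = np.clip(S, 0.5, None)
Kv = np.clip(Kv, 0.05, None)

# crude emulators: warming grows with sensitivity and shrinks with ocean uptake;
# thermosteric sea level rise grows with both warming and uptake (toy forms)
warming_2100 = 1.4 * S / (1.0 + 0.4 * Kv)
slr_2100_cm = 8.0 * warming_2100 * (0.5 + 0.5 * Kv)

p = np.percentile(warming_2100, [5, 50, 95])
print(f"warming by 2100: median {p[1]:.1f} K, 5-95% range {p[0]:.1f}-{p[2]:.1f} K")
print(f"thermosteric sea level rise by 2100: median {np.percentile(slr_2100_cm, 50):.0f} cm")
```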

    Probabilistic modeling of one dimensional water movement and leaching from highway embankments containing secondary materials

    Predictive methods for contaminant release from virgin and secondary road construction materials are important for evaluating potential long-term soil and groundwater contamination from highways. The objective of this research was to describe the field hydrology in a highway embankment and to investigate leaching under unsaturated conditions using a contaminant fate and transport model. The HYDRUS2D code was used to solve the Richards equation and the advection–dispersion equation with retardation. Water flow in a Minnesota highway embankment was successfully modeled in one dimension for several rain events after Bayesian calibration of the hydraulic parameters against water content data at a point 0.32 m from the surface of the embankment. The hypothetical leaching of cadmium from coal fly ash was probabilistically simulated in a scenario where the top 0.50 m of the embankment was replaced by coal fly ash. Simulation results were compared to the percolation equation method, in which the solubility is multiplied by the liquid-to-solid ratio to estimate total release. If a low solubility value is used for cadmium, the release estimates obtained using the percolation/equilibrium model are close to those predicted by HYDRUS2D simulations (10^-4 to 10^-2 mg Cd/kg ash). If a high solubility is used, the percolation equation overpredicts the actual release (0.1–1.0 mg Cd/kg ash). At the 90th percentile of uncertainty, the 10-year liquid-to-solid ratio for the coal fly ash embankment was 9.48 L/kg, and the fraction of precipitation that infiltrated the coal fly ash embankment was 92%. Probabilistic modeling with HYDRUS2D appears to be a promising and realistic approach to predicting field hydrology and subsequent leaching in embankments.
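    The percolation/equilibrium estimate mentioned above is just the product of solubility and the liquid-to-solid ratio. A small worked example, using the 9.48 L/kg 10-year liquid-to-solid ratio quoted in the abstract and assumed (illustrative) solubility bounds for cadmium:

```python
# percolation/equilibrium estimate: total release ≈ solubility × liquid-to-solid ratio
ls_ratio_10yr = 9.48   # L of percolate per kg of coal fly ash over 10 years (from the abstract)

# illustrative low and high solubility cases for cadmium (assumed values, mg Cd / L)
solubilities = {"low solubility": 1.0e-5, "high solubility": 0.05}

for label, s in solubilities.items():
    release = s * ls_ratio_10yr   # mg Cd released per kg of ash
    print(f"{label}: {release:.2e} mg Cd/kg ash")
```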

    Astrobiological Complexity with Probabilistic Cellular Automata

    The search for extraterrestrial life and intelligence constitutes one of the major endeavors in science, yet it has so far been quantitatively modeled only rarely, and then in a cursory and superficial fashion. We argue that probabilistic cellular automata (PCA) represent the best quantitative framework for modeling the astrobiological history of the Milky Way and its Galactic Habitable Zone. The relevant astrobiological parameters are modeled as the elements of the input probability matrix for the PCA kernel. With the underlying simplicity of the cellular automata constructs, this approach enables a quick analysis of a large and ambiguous input parameter space. We perform a simple clustering analysis of typical astrobiological histories and discuss the relevant boundary conditions of practical importance for planning and guiding actual empirical astrobiological and SETI projects. In addition to showing how the present framework is adaptable to more complex situations and updated observational databases from current and near-future space missions, we demonstrate how numerical results could offer a cautious rationale for the continuation of practical SETI searches.
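    A minimal probabilistic cellular automaton of this flavor can be sketched in a few lines: each cell holds a discrete astrobiological state and advances according to a row of an input probability matrix. The states, transition probabilities, grid size, and the absence of neighbor coupling (a realistic PCA kernel would condition transitions on neighboring cells, e.g., for interstellar colonization) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# assumed cell states: 0 = no life, 1 = simple life, 2 = complex life, 3 = technological
# each row of P holds the per-step transition probabilities out of one state
# (the numbers are placeholders, not astrobiological estimates)
P = np.array([
    [0.990, 0.010, 0.000, 0.000],
    [0.005, 0.985, 0.010, 0.000],
    [0.010, 0.000, 0.985, 0.005],
    [0.020, 0.000, 0.000, 0.980],
])

N, STEPS = 200, 500          # grid size and number of time steps (assumed)
grid = np.zeros((N, N), dtype=int)

for _ in range(STEPS):
    u = rng.random((N, N))
    cum = np.cumsum(P[grid], axis=-1)                   # cumulative probabilities per cell
    grid = (u[..., None] > cum[..., :-1]).sum(axis=-1)  # inverse-CDF sampling of next state

counts = np.bincount(grid.ravel(), minlength=4)
print("fraction of sites per state:", counts / counts.sum())
```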

    Probabilistic simulation for the certification of railway vehicles

    The present dynamic certification process, which is based on experiments, has been built essentially on the basis of experience. The introduction of simulation techniques into this process would be of great interest. However, an accurate simulation of complex, nonlinear systems is a difficult task, in particular when rare events (for example, unstable behaviour) are considered. After analysing the system and the currently used procedure, this paper proposes a method to achieve, in some particular cases, a simulation-based certification. It focuses on the need for precise and representative excitations (running conditions) and on their variable nature. A probabilistic approach is therefore proposed and illustrated with an example. First, this paper presents a short description of the vehicle/track system and of the experimental procedure. The proposed simulation process is then described, and the requirement to analyse a set of running conditions at least as large as the one tested experimentally is explained. In the third section, a sensitivity analysis to determine the most influential parameters of the system is reported. Finally, the proposed method is summarized and an application is presented.
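    As a toy illustration of the sensitivity-analysis step, the sketch below samples a few assumed running-condition parameters, evaluates a placeholder assessment quantity standing in for the vehicle/track dynamics code, and ranks the inputs by rank correlation with the output; the parameter names, ranges, and response function are assumptions, not the paper's model.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
N = 5000

# illustrative running-condition parameters (names and ranges are assumptions)
speed = rng.uniform(80, 160, N)          # vehicle speed, km/h
track_irreg = rng.uniform(0.5, 2.0, N)   # track irregularity scaling factor
friction = rng.uniform(0.2, 0.5, N)      # wheel-rail friction coefficient

# placeholder for the vehicle/track dynamics code: a toy assessment quantity
# standing in for, e.g., a guiding-force or stability criterion
y = 0.004 * speed * track_irreg + 0.5 * friction + rng.normal(0, 0.05, N)

# crude global sensitivity measure: rank correlation of each input with the output
for name, x in [("speed", speed), ("track_irreg", track_irreg), ("friction", friction)]:
    rho, _ = spearmanr(x, y)
    print(f"{name:12s} Spearman rho = {rho:+.2f}")
```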

    Great SCO2T! Rapid tool for carbon sequestration science, engineering, and economics

    CO2 capture and storage (CCS) technology is likely to be widely deployed in the coming decades in response to major climate and economic drivers: CCS is part of every clean energy pathway that limits global warming to 2°C or less and receives significant CO2 tax credits in the United States. These drivers are likely to stimulate the capture, transport, and storage of hundreds of millions or billions of tonnes of CO2 annually. A key part of the CCS puzzle will be identifying and characterizing suitable storage sites for vast amounts of CO2. We introduce a new software tool called SCO2T (Sequestration of CO2 Tool, pronounced "Scott") for rapidly characterizing saline storage reservoirs. The tool is designed to rapidly screen hundreds of thousands of reservoirs, perform sensitivity and uncertainty analyses, and link sequestration engineering (injection rates, reservoir capacities, plume dimensions) to sequestration economics (costs constructed from around 70 separate economic inputs). We describe the novel science developments supporting SCO2T, including a new approach to estimating CO2 injection rates and CO2 plume dimensions, as well as key advances linking sequestration engineering with economics. Next, we perform a sensitivity and uncertainty analysis over geology combinations (including formation depth, thickness, permeability, porosity, and temperature) to understand their impact on carbon sequestration. Through the sensitivity analysis we show that increasing depth and permeability both lead to increased CO2 injection rates, increased storage potential, and reduced costs, while increasing porosity increases reservoir capacity and thereby reduces costs without affecting the injection rate (CO2 is injected at a constant pressure in all cases).
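    A rough sketch of the geology sensitivity sweep described above: sample formation depth, thickness, permeability, and porosity, push them through toy engineering and cost relations standing in for SCO2T's reduced-order models, and correlate each input with cost per tonne. The sampling ranges and the scaling relations are illustrative assumptions, not SCO2T's actual models.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 100_000

# sampled geology (ranges are illustrative, not SCO2T defaults)
depth = rng.uniform(1000, 4000, N)           # m
thickness = rng.uniform(10, 300, N)          # m
perm = 10.0 ** rng.uniform(-14.5, -12.0, N)  # permeability, m^2
porosity = rng.uniform(0.05, 0.30, N)

# toy relations standing in for SCO2T's reduced-order models: injectivity scales
# with transmissivity (k*h) and, weakly, with depth at constant injection pressure;
# capacity scales with pore volume; cost per tonne falls as injectivity rises
inj_rate = 3.0e15 * perm * thickness * (depth / 2000.0)   # tonnes CO2/yr (toy)
capacity = 7.0e5 * porosity * thickness                   # tonnes CO2/km^2 (toy)
cost = 5.0 + 2.0e4 / np.maximum(inj_rate, 1.0)            # $/tonne (toy)

print(f"median injection rate {np.median(inj_rate):.2e} t/yr, "
      f"median capacity {np.median(capacity):.2e} t/km^2")

# simple sensitivity check: correlation of each sampled input with cost per tonne
for name, x in [("depth", depth), ("thickness", thickness),
                ("permeability", perm), ("porosity", porosity)]:
    r = np.corrcoef(x, cost)[0, 1]
    print(f"{name:13s} correlation with cost = {r:+.2f}")
```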
