
    The diversity of residential electricity demand – a comparative analysis of metered and simulated data

    A comparative study between simulated residential electricity demand data and metered data from the UK Household Electricity Survey is presented. For this study, a high-resolution probabilistic model was used to test whether this increasingly widely used modelling approach provides an adequate representation of the statistical characteristics of the most comprehensive dataset of metered electricity demand available in the UK. Both the empirical and simulated electricity consumption data have been analysed on an aggregated level, paying special attention to the mean daily load profiles, the distribution of households with respect to total annual demand, and the distributions of the annual demands of particular appliances. A thorough comparison, using both qualitative and quantitative methods, was made between the simulated datasets and their metered counterparts. Significant discrepancies were found in the distribution of households with respect to both overall electricity consumption and the consumption of individual appliances. Parametric estimates of the distributions of the metered data were obtained, and analytic expressions for both the density function and the cumulative distribution are given. These can be incorporated into new and existing modelling frameworks, as well as used as tools for further analysis.
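
The parametric-estimation and comparison steps described above can be illustrated with a minimal sketch. The demand figures below are hypothetical (not the Household Electricity Survey data): a lognormal is fitted by maximum likelihood to "metered" annual demands, and a simulated sample is then compared against the fitted distribution.

```python
import math
import random
import statistics

random.seed(42)

# Hypothetical annual household demands in kWh (NOT the survey data):
# a lognormal-like spread is typical for such totals.
metered = [random.lognormvariate(8.2, 0.45) for _ in range(1000)]

# Parametric estimate: maximum-likelihood fit of a lognormal to the sample.
logs = [math.log(x) for x in metered]
mu = statistics.fmean(logs)
sigma = statistics.stdev(logs)

def lognorm_cdf(x, mu, sigma):
    """Cumulative distribution function of the fitted lognormal."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

# Quantitative comparison: Kolmogorov-Smirnov-style distance between a
# simulated sample and the distribution fitted to the metered sample.
simulated = sorted(random.lognormvariate(8.0, 0.30) for _ in range(1000))
ks = max(abs((i + 1) / len(simulated) - lognorm_cdf(x, mu, sigma))
         for i, x in enumerate(simulated))
```

A large `ks` value, as here, flags exactly the kind of distributional discrepancy between simulated and metered data that the study reports.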

    A Spatio-Temporal Point Process Model for Ambulance Demand

    Ambulance demand estimation at fine time and location scales is critical for fleet management and dynamic deployment. We are motivated by the problem of estimating the spatial distribution of ambulance demand in Toronto, Canada, as it changes over discrete 2-hour intervals. This large-scale dataset is sparse at the desired temporal resolutions and exhibits location-specific serial dependence as well as daily and weekly seasonality. We address these challenges by introducing a novel characterization of time-varying Gaussian mixture models. We fix the mixture component distributions across all time periods to overcome data sparsity and accurately describe Toronto's spatial structure, while representing the complex spatio-temporal dynamics through time-varying mixture weights. We constrain the mixture weights to capture weekly seasonality, and apply a conditionally autoregressive prior on the mixture weights of each component to represent location-specific short-term serial dependence and daily seasonality. While estimation may be performed using a fixed number of mixture components, we also extend the model to estimate the number of components using birth-and-death Markov chain Monte Carlo. The proposed model is shown to give higher statistical predictive accuracy and to reduce the error in predicting EMS operational performance by as much as two-thirds compared to a typical industry practice.
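
The core construction, fixed Gaussian components shared across all periods combined with per-period mixture weights, can be sketched as follows. The component locations and weights below are illustrative stand-ins, not the fitted Toronto values.

```python
import math

# Fixed mixture components, shared across ALL time periods (illustrative
# 2-D Gaussians with diagonal covariance, not the fitted Toronto components).
components = [
    {"mean": (0.0, 0.0), "var": (1.0, 1.0)},
    {"mean": (3.0, 1.0), "var": (0.5, 0.8)},
    {"mean": (-2.0, 2.5), "var": (0.7, 0.4)},
]

# Time-varying mixture weights: one probability simplex per 2-hour period.
weights_by_period = {
    0: [0.6, 0.3, 0.1],   # e.g. overnight demand concentrated downtown
    1: [0.2, 0.5, 0.3],   # e.g. morning demand shifted outward
}

def gauss2d(x, y, mean, var):
    """Density of a 2-D Gaussian with diagonal covariance."""
    dx, dy = x - mean[0], y - mean[1]
    return math.exp(-0.5 * (dx * dx / var[0] + dy * dy / var[1])) / (
        2.0 * math.pi * math.sqrt(var[0] * var[1]))

def demand_density(x, y, period):
    """Spatial demand density at (x, y) for a given 2-hour period."""
    w = weights_by_period[period]
    return sum(wk * gauss2d(x, y, c["mean"], c["var"])
               for wk, c in zip(w, components))
```

Only the weight vectors change from period to period; the seasonality constraints and conditionally autoregressive prior in the paper act on these weights, while the shared components keep the spatial structure estimable from sparse data.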

    Estimating healthcare demand for an aging population: a flexible and robust Bayesian joint model

    In this paper, we analyse two frequently used measures of the demand for healthcare, namely hospital visits and out-of-pocket healthcare expenditure, which have been analysed separately in the existing literature. Given that these two measures of healthcare demand are highly likely to be closely correlated, we propose a framework to jointly model hospital visits and out-of-pocket medical expenditure. Furthermore, the joint framework allows for non-linear effects of covariates, using splines to capture the effects of aging on healthcare demand. Sample heterogeneity is modelled robustly, with the random effects following Dirichlet process priors with explicit cross-part correlation. The findings of our empirical analysis of the U.S. Health and Retirement Survey indicate that the demand for healthcare varies with age and gender and exhibits significant cross-part correlation, which provides a rich understanding of how aging affects healthcare demand and is of particular policy relevance in the context of an aging population.
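
The mechanism by which a shared random effect induces cross-part correlation can be illustrated by simulation. This is a deliberately simplified stand-in: a single Gaussian random effect rather than Dirichlet process priors, and illustrative coefficients throughout.

```python
import math
import random
import statistics

random.seed(1)

def poisson(lam):
    """Knuth's algorithm for a Poisson draw (the stdlib has no poissonvariate)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

visits, log_spend = [], []
for _ in range(2000):
    b = random.gauss(0.0, 0.5)      # shared individual random effect
    age_effect = 0.3                # stand-in for a spline-in-age term
    visits.append(poisson(math.exp(0.2 + age_effect + b)))           # count part
    log_spend.append(4.0 + 0.5 * age_effect + b + random.gauss(0.0, 0.3))  # expenditure part

def corr(a, b):
    """Pearson correlation without external libraries."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den

# The shared effect b links the two parts, producing positive correlation.
r = corr(visits, log_spend)
```

Modelling the two outcomes separately would discard exactly this correlation, which is the motivation for the joint framework.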

    RoboJam: A Musical Mixture Density Network for Collaborative Touchscreen Interaction

    RoboJam is a machine-learning system for generating music that assists users of a touchscreen music app by performing responses to their short improvisations. This system uses a recurrent artificial neural network to generate sequences of touchscreen interactions and absolute timings, rather than high-level musical notes. To accomplish this, RoboJam's network uses a mixture density layer to predict appropriate touch interaction locations in space and time. In this paper, we describe the design and implementation of RoboJam's network and how it has been integrated into a touchscreen music app. A preliminary evaluation analyses the system in terms of training, musical generation and user interaction.
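
A mixture density output layer of the kind described can be sketched in plain Python. The weights below are drawn at random purely for illustration; RoboJam's actual layer sits on top of a trained recurrent network.

```python
import math
import random

random.seed(0)

# Minimal sketch of a mixture density layer: a hidden vector h is mapped to
# the parameters of a K-component Gaussian mixture over the next touch (x, y).
K = 3   # number of mixture components
H = 4   # hidden-state size (illustrative)

# Hypothetical "trained" weights, randomly initialised for illustration only.
W_pi  = [[random.gauss(0.0, 0.5) for _ in range(H)] for _ in range(K)]
W_mu  = [[random.gauss(0.0, 0.5) for _ in range(H)] for _ in range(2 * K)]
W_sig = [[random.gauss(0.0, 0.5) for _ in range(H)] for _ in range(K)]

def dot(w, h):
    return sum(wi * hi for wi, hi in zip(w, h))

def mdn_params(h):
    """Map a hidden vector to mixture weights, means, and scales."""
    logits = [dot(w, h) for w in W_pi]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    pi = [e / sum(exps) for e in exps]                 # softmax -> valid weights
    mu = [(dot(W_mu[2 * k], h), dot(W_mu[2 * k + 1], h)) for k in range(K)]
    sigma = [math.exp(dot(w, h)) for w in W_sig]       # exp -> positive scales
    return pi, mu, sigma

def sample_touch(h):
    """Sample a touch location from the predicted mixture."""
    pi, mu, sigma = mdn_params(h)
    k = random.choices(range(K), weights=pi)[0]
    return (random.gauss(mu[k][0], sigma[k]),
            random.gauss(mu[k][1], sigma[k]))

x, y = sample_touch([0.1, -0.3, 0.7, 0.2])
```

Sampling from a mixture, rather than taking a single point prediction, is what lets the system generate varied but plausible touch locations and timings.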

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions.
Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios that can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
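
The point that probabilistic multiplication of sub-factors depends on the assumed distributions can be illustrated numerically. The geometric means and spreads below are illustrative only, not regulatory defaults.

```python
import math
import random

random.seed(7)

def combined_factor(gsd):
    """One Monte Carlo draw of the product of four lognormal sub-factors.

    Each sub-factor has geometric mean ~3.16, so the median of the product
    is close to the conventional overall factor of 100; gsd is the assumed
    geometric standard deviation of each sub-factor (an assumption).
    """
    return math.prod(random.lognormvariate(math.log(3.16), math.log(gsd))
                     for _ in range(4))

def p95(gsd, n=5000):
    """95th percentile of the combined factor under one distributional choice."""
    draws = sorted(combined_factor(gsd) for _ in range(n))
    return draws[int(0.95 * n)]

p95_narrow = p95(1.5)   # tighter assumed sub-factor spread
p95_wide = p95(2.5)     # wider assumed sub-factor spread
```

Both choices share the same median near 100, yet their upper percentiles differ by several-fold: the "conservatism" of the combined factor is not a fixed property but a function of the assumed distributions.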

    Real Option Valuation of a Portfolio of Oil Projects

    Various methodologies exist for valuing companies and their projects. We address the problem of valuing a portfolio of projects within companies that have infrequent, large and volatile cash flows. Examples of this type of company exist in oil exploration and development, and we use this example to illustrate our analysis throughout the thesis. The theoretical interest in this problem lies in modeling the sources of risk in the projects and their different interactions within each project. Initially we look at the advantages of real options analysis and compare this approach with more traditional valuation methods, highlighting the strengths and weaknesses of each approach in the light of the thesis problem. We give the background to the stages in an oil exploration and development project and identify the main common sources of risk, for example commodity prices. We discuss the appropriate representation for oil prices; in short, do oil prices behave more like equities or more like interest rates? The appropriate representation is used to model the oil price as a source of risk. A real option valuation model based on market uncertainty (in the form of oil price risk) and geological uncertainty (reserve volume uncertainty) is presented and tested for two different oil projects. Finally, a methodology to measure the inter-relationship between the oil price and other sources of risk, such as interest rates, is proposed using copula methods.
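
The "equities or interest rates?" question amounts to choosing between a random-walk and a mean-reverting representation for log oil prices. A simulation sketch with illustrative parameters shows the qualitative difference in how price uncertainty accumulates over a one-year horizon.

```python
import math
import random
import statistics

random.seed(3)

def terminal_logprice_variance(kappa, n_paths=1000, n_steps=250,
                               dt=1.0 / 250, sigma=0.3, mu=math.log(60.0)):
    """Variance of the one-year-ahead log price for mean-reversion speed kappa.

    kappa = 0 gives a driftless random walk in logs (GBM-like, equity-style);
    kappa > 0 gives a Schwartz-style mean-reverting process (rate-style).
    All parameter values are illustrative.
    """
    finals = []
    for _ in range(n_paths):
        x = mu
        for _ in range(n_steps):
            x += kappa * (mu - x) * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        finals.append(x)
    return statistics.pvariance(finals)

var_gbm = terminal_logprice_variance(kappa=0.0)   # grows linearly with horizon
var_mr = terminal_logprice_variance(kappa=5.0)    # saturates near sigma**2 / (2 * kappa)
```

Under mean reversion the terminal variance is an order of magnitude smaller, which is why the choice of representation materially changes real option values.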

    The time-varying asymmetry of exchange rate returns: a stochastic volatility-stochastic skewness model

    While the time-varying volatility of financial returns has been extensively modelled, most existing stochastic volatility models either assume a constant degree of return shock asymmetry or impose symmetric model innovations. However, accounting for time-varying asymmetry as a measure of crash risk is important for both investors and policy makers. This paper extends a standard stochastic volatility model to allow for time-varying skewness of the return innovations. We estimate the model by extensions of traditional Markov chain Monte Carlo (MCMC) methods for stochastic volatility models. When applying this model to the returns of four major exchange rates, skewness is found to vary substantially over time. In addition, stochastic skewness can help to improve forecasts of risk measures. Finally, the results support a potential link between carry trading and crash risk.
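
A data-generating process with both stochastic volatility and stochastic skewness can be sketched as follows. This is a simplified two-piece-normal illustration with made-up parameters, not the paper's exact specification or its MCMC estimation procedure.

```python
import math
import random

random.seed(5)

def two_piece_normal(delta):
    """Skewed draw: left/right scales (1 - delta, 1 + delta).

    delta < 0 skews left (crash risk), delta > 0 skews right.
    The heavier side is drawn proportionally more often.
    """
    z = abs(random.gauss(0.0, 1.0))
    p_right = (1.0 + delta) / 2.0
    return z * (1.0 + delta) if random.random() < p_right else -z * (1.0 - delta)

h = math.log(0.01)   # log-variance state
d = 0.0              # skewness state
returns, skew_state = [], []
for _ in range(1000):
    # Stochastic volatility: AR(1) log-variance (illustrative persistence 0.98).
    h = 0.98 * h + 0.02 * math.log(0.01) + 0.1 * random.gauss(0.0, 1.0)
    # Stochastic skewness: a second, slowly varying AR(1) state.
    d = 0.99 * d + 0.05 * random.gauss(0.0, 1.0)
    d = max(-0.9, min(0.9, d))
    returns.append(math.exp(h / 2.0) * two_piece_normal(d))
    skew_state.append(d)
```

Because the asymmetry state `d` drifts over time, the simulated return series exhibits periods of pronounced left skew and periods of near symmetry, which is the stylised fact the model is built to capture.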

    Production of a diluted solid tracer by dry co-grinding in a tumbling ball mill

    This paper presents a study on the production, by co-grinding, of a diluted solid tracer, sized below 10 mm and containing less than 2 wt.% of active product, for use in the field of ground contamination and decontamination. Co-grinding was performed in a tumbling ball mill and makes it possible to produce a diluted tracer easily without requiring several pieces of apparatus. The two products were first ground separately and then together. Monitoring of particle size and morphology, together with modelling of the grinding kinetics, allowed a mechanism to be proposed by which the diluted solid tracer is produced. The influence of the operating conditions (nature and initial size of the diluting medium, ball and powder filling rates, proportion of the polluting tracer) on the grinding of the products was studied. We thus defined optimum co-grinding conditions that yield a tracer with the required properties; these conditions are standard for tumbling ball mills. This kind of mill is of particular interest because it can easily be scaled up to meet industrial demand.
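
Grinding kinetics in tumbling ball mills are commonly modelled as first-order breakage, which the abstract's kinetic modelling builds on. A minimal sketch with an illustrative rate constant (not the paper's fitted value):

```python
import math

def oversize_fraction(t_min, k_per_min=0.05, r0=1.0):
    """First-order breakage: mass fraction coarser than the target size.

    R(t) = R0 * exp(-k * t); k_per_min is an illustrative rate constant,
    not a value fitted in the study.
    """
    return r0 * math.exp(-k_per_min * t_min)

# Grinding time needed to bring the oversize fraction below 2 %:
t_req = -math.log(0.02) / 0.05
```

Fitting the rate constant `k` for each set of operating conditions (media size, filling rates, tracer proportion) is how such a kinetic model supports the choice of optimum co-grinding conditions.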