
    Estimating Demand Uncertainty Using Judgmental Forecasts

    Measuring demand uncertainty is a key activity in supply chain planning. Of the various methods for estimating the standard deviation of demand, one that has been employed successfully in the recent literature uses dispersion among experts' forecasts. However, there has been limited empirical validation of this methodology. In this paper we provide a general methodology for estimating the standard deviation of a random variable using dispersion among experts' forecasts. We test this methodology using three datasets: demand data at item level, sales data at firm level for retailers, and sales data at firm level for manufacturers. We show that the standard deviation of a random variable (demand and sales for our datasets) is positively correlated with dispersion among experts' forecasts. Further, we use longitudinal datasets with sales forecasts made 3-9 months before the earnings report date for retailers and manufacturers to show that the effects of dispersion and scale on the standard deviation of forecast error are consistent over time. Operations Management Working Papers Series

    Isolation of microsatellite loci in the Capricorn silvereye, Zosterops lateralis chlorocephalus (Aves: Zosteropidae)

    The Capricorn silvereye (Zosterops lateralis chlorocephalus) is ideally suited to investigating the genetic basis of body size evolution. We have isolated and characterized a set of microsatellite markers for this species. Seven out of 11 loci were polymorphic. The number of alleles detected ranged from two to five, and observed heterozygosities ranged from 0.12 to 0.67. One locus, ZL49, was found to be sex-linked. This moderate level of diversity is consistent with that expected in an isolated, island population.

    Detecting fractions of electrons in the high-T_c cuprates

    We propose several tests of the idea that the electron is fractionalized in the underdoped and undoped cuprates. These include the ac Josephson effect, and tunneling into small superconducting grains in the Coulomb blockade regime. In both cases, we argue that the results are qualitatively modified from the conventional ones if the insulating tunnel barrier is fractionalized. These experiments directly detect the possible existence of the chargon - a charge e spinless boson - in the insulator. The effects described in this paper provide a means of probing whether the undoped cuprate (despite its magnetism) is fractionalized. Thus, the experiments discussed here are complementary to the flux-trapping experiment we proposed in our earlier work (cond-mat/0006481). Comment: 7 pages, 5 figures

    Effects of BG9719 (CVT-124), an A1-Adenosine receptor antagonist, and furosemide on glomerular filtration rate and natriuresis in patients with congestive heart failure

    OBJECTIVES: To determine the effects of furosemide and the selective A1-adenosine receptor antagonist BG9719 on renal function in patients with congestive heart failure (CHF). BACKGROUND: Studies suggest that adenosine may affect renal function by various mechanisms, but the effects of blockade of this system in humans are unknown. In addition, the effects of a therapeutic dose of furosemide on glomerular filtration rate (GFR) and renal plasma flow (RPF) in heart failure patients are controversial. METHODS: On different days, 12 patients received placebo, BG9719 and furosemide. Glomerular filtration rate, RPF and sodium and water excretion were assessed immediately following drug administration. RESULTS: Glomerular filtration rate was 84 ± 23 ml/min/1.73 m² after receiving placebo, 82 ± 24 following BG9719 administration and a decreased (p < 0.005) 63 ± 18 following furosemide. Renal plasma flow was unchanged at 293 ± 124 ml/min/1.73 m² on placebo, 334 ± 155 after receiving BG9719 and 374 ± 231 after receiving furosemide. Sodium excretion increased from 8 ± 8 mEq following placebo administration to 37 ± 26 mEq following BG9719 administration. In the six patients in whom it was measured, sodium excretion was 104 ± 78 mEq following furosemide administration. CONCLUSIONS: Natriuresis is effectively induced by both furosemide and the adenosine A1 antagonist BG9719 in patients with CHF. The doses of the two drugs used in this study did not cause equivalent sodium and water excretion, but only furosemide decreased GFR. These data suggest that adenosine is an important determinant of renal function in patients with heart failure.

    Come back Marshall, all is forgiven?: Complexity, evolution, mathematics and Marshallian exceptionalism

    Marshall was the great synthesiser of neoclassical economics. Yet with his qualified assumption of self-interest, his emphasis on variation in economic evolution and his cautious attitude to the use of mathematics, Marshall differs fundamentally from other leading neoclassical contemporaries. Metaphors inspire more specific analogies and ontological assumptions, and Marshall used the guiding metaphor of Spencerian evolution. But unfortunately, the further development of a Marshallian evolutionary approach was undermined in part by theoretical problems within Spencer's theory. Yet some things can be salvaged from the Marshallian evolutionary vision. They may even be placed in a more viable Darwinian framework. Peer reviewed. Final Accepted Version

    Efficient Multi-site Data Movement Using Constraint Programming for Data Hungry Science

    For the past decade, HENP experiments have been moving towards a distributed computing model in an effort to concurrently process tasks over enormous data sets that have been increasing in size as a function of time. In order to optimize all available resources (geographically spread) and minimize the processing time, it is also necessary to address the question of efficient data transfers and placements. A key question is whether the time penalty for moving the data to the computational resources is worth the presumed gain. Moving towards truly distributed task scheduling, we present a technique based on a Constraint Programming (CP) approach. The CP technique schedules data transfers from multiple resources, considering all available paths of diverse characteristics (capacity, sharing and storage), with minimum user waiting time as the objective. We introduce a model for planning data transfers to a single destination (data transfer) as well as its extension to an optimal data set spreading strategy (data placement). Several enhancements for a solver of the CP model will be shown, leading to a faster schedule computation time using symmetry breaking, branch cutting, well-studied principles from the job-shop scheduling field and several heuristics. Finally, we will present the design and implementation of a cornerstone application aimed at moving datasets according to the schedule. Results will include a comparison of performance and trade-offs between the CP techniques and a Peer-2-Peer model from a simulation framework, as well as a real case scenario taken from practical usage of a CP scheduler. Comment: To appear in proceedings of Computing in High Energy and Nuclear Physics 200
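The optimisation problem described in this abstract can be sketched in miniature. The toy model below (all sites, files and bandwidths hypothetical) picks one source site per file so that the user's waiting time, i.e. the time until the last transfer finishes, is minimal; plain exhaustive search stands in for the paper's CP solver with symmetry breaking and heuristics.

```python
from itertools import product

# Toy instance: each file is replicated at several sites, each site has a
# link of given bandwidth to the destination, and transfers sharing a link
# run sequentially while different links run in parallel.
files = {"f1": 40, "f2": 10, "f3": 30}                      # sizes in GB
sources = {"f1": ["A", "B"], "f2": ["B", "C"], "f3": ["A", "C"]}
bandwidth = {"A": 10, "B": 5, "C": 10}                      # GB/s per link

def waiting_time(assignment):
    # Time until the most-loaded link finishes all its transfers.
    load = {}
    for f, site in assignment.items():
        load[site] = load.get(site, 0) + files[f] / bandwidth[site]
    return max(load.values())

# Exhaustive search over all source assignments (a CP solver would prune
# this space instead of enumerating it).
names = list(files)
best = min(
    (dict(zip(names, choice)) for choice in product(*(sources[f] for f in names))),
    key=waiting_time,
)
print(best, waiting_time(best))
```

On this instance the search spreads the three files across the three links, finishing in 4 seconds; routing everything through the fastest links alone would serialise transfers and take longer.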

    Combining estimates of interest in prognostic modelling studies after multiple imputation: current practice and guidelines

    Background: Multiple imputation (MI) provides an effective approach to handling missing covariate data within prognostic modelling studies, as it can properly account for the missing data uncertainty. The multiply imputed datasets are each analysed using standard prognostic modelling techniques to obtain the estimates of interest. The estimates from each imputed dataset are then combined into one overall estimate and variance, incorporating both the within- and between-imputation variability. Rubin's rules for combining these multiply imputed estimates are based on asymptotic theory. The resulting combined estimates may be more accurate if the posterior distribution of the population parameter of interest is better approximated by the normal distribution. However, the normality assumption may not be appropriate for all parameters of interest when analysing prognostic modelling studies, such as predicted survival probabilities and model performance measures. Methods: Guidelines for combining the estimates of interest when analysing prognostic modelling studies are provided. A literature review is performed to identify current practice for combining such estimates in prognostic modelling studies. Results: Methods for combining all reported estimates after MI were not well reported in the current literature. Rubin's rules without applying any transformations were the standard approach used, when any method was stated. Conclusion: The proposed simple guidelines for combining estimates after MI may lead to a wider and more appropriate use of MI in future prognostic modelling studies.
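Rubin's rules, as summarised in the abstract, pool the point estimate as the mean across imputations and the variance as the within-imputation variance plus an inflated between-imputation variance. A minimal sketch (the example numbers are hypothetical):

```python
import math

def pool(estimates, variances):
    """Rubin's rules for m multiply-imputed estimates."""
    m = len(estimates)
    qbar = sum(estimates) / m                              # pooled point estimate
    w = sum(variances) / m                                 # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)  # between-imputation variance
    t = w + (1 + 1 / m) * b                                # total variance
    return qbar, t

# Hypothetical log-hazard-ratio estimates from m = 3 imputed datasets.
qbar, t = pool([0.50, 0.54, 0.46], [0.010, 0.012, 0.011])
print(f"pooled estimate {qbar:.3f}, standard error {math.sqrt(t):.3f}")
```

As the abstract's guidelines suggest, parameters whose sampling distribution is far from normal (e.g. survival probabilities or odds ratios) should first be transformed to a scale where normality is more plausible (such as the log or complementary log-log scale), pooled there, and back-transformed.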

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed \emph{by humans} because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to \emph{think} as \emph{humans} do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity. Comment: 37 pages

    Association between tocilizumab, sarilumab and all-cause mortality at 28 days in hospitalised patients with COVID-19: A network meta-analysis

    BACKGROUND: A recent prospective meta-analysis demonstrated that interleukin-6 antagonists are associated with lower all-cause mortality in hospitalised patients with COVID-19, compared with usual care or placebo. However, emerging evidence suggests that clinicians are favouring the use of tocilizumab over sarilumab. A new randomised comparison of these agents from the REMAP-CAP trial shows similar effects on in-hospital mortality. Therefore, we initiated a network meta-analysis to estimate pairwise associations between tocilizumab, sarilumab and usual care or placebo with 28-day mortality, in COVID-19 patients receiving concomitant corticosteroids and ventilation, based on all available direct and indirect evidence. METHODS: Eligible trials randomised hospitalised patients with COVID-19 and either compared tocilizumab or sarilumab with usual care or placebo in the prospective meta-analysis or directly compared tocilizumab with sarilumab. Data were restricted to patients receiving corticosteroids and either non-invasive or invasive ventilation at randomisation. Pairwise associations between tocilizumab, sarilumab and usual care or placebo for all-cause mortality 28 days after randomisation were estimated using a frequentist contrast-based network meta-analysis of odds ratios (ORs), implementing multivariate fixed-effects models that assume consistency between the direct and indirect evidence. FINDINGS: One trial (REMAP-CAP) was identified that directly compared tocilizumab with sarilumab and supplied results on all-cause mortality at 28 days. This network meta-analysis was based on 898 eligible patients (278 deaths) from REMAP-CAP and 3710 eligible patients (1278 deaths) from 18 trials in the prospective meta-analysis. Summary ORs were similar for tocilizumab (0·82 [0·71–0·95], p = 0·008) and sarilumab (0·80 [0·61–1·04], p = 0·09) compared with usual care or placebo. The summary OR for 28-day mortality comparing tocilizumab with sarilumab was 1·03 (95% CI 0·81–1·32, p = 0·80). The p-value for the global test of inconsistency was 0·28. CONCLUSIONS: Administration of either tocilizumab or sarilumab was associated with lower 28-day all-cause mortality compared with usual care or placebo. The association is not dependent on the choice of interleukin-6 receptor antagonist.
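The logic of the network can be reproduced with a back-of-envelope calculation (this is a simplified two-step sketch, not the multivariate fixed-effects model actually fitted): an indirect tocilizumab-vs-sarilumab odds ratio is derived from the two reported comparisons against usual care, then pooled with the direct REMAP-CAP estimate by fixed-effect inverse-variance weighting on the log scale.

```python
import math

def log_or_se(lo, hi):
    # Standard error of a log-OR recovered from a reported 95% CI.
    return (math.log(hi) - math.log(lo)) / (2 * 1.96)

# Reported comparisons against usual care or placebo (from the abstract).
toci, se_t = math.log(0.82), log_or_se(0.71, 0.95)   # tocilizumab vs usual care
sari, se_s = math.log(0.80), log_or_se(0.61, 1.04)   # sarilumab vs usual care

# Indirect comparison: log-ORs subtract, variances add.
ind, var_ind = toci - sari, se_t**2 + se_s**2

# Direct head-to-head estimate from REMAP-CAP (from the abstract).
direct, se_d = math.log(1.03), log_or_se(0.81, 1.32)
var_dir = se_d**2

# Fixed-effect inverse-variance pooling of direct and indirect evidence.
w_ind, w_dir = 1 / var_ind, 1 / var_dir
pooled = (w_ind * ind + w_dir * direct) / (w_ind + w_dir)
print(f"pooled OR (tocilizumab vs sarilumab) = {math.exp(pooled):.2f}")
```

The indirect OR (0.82/0.80 ≈ 1.03) and the direct OR (1.03) agree closely, which is what the non-significant global test of inconsistency (p = 0·28) reflects.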

    Coccidioidomycosis among Workers at an Archeological Site, Northeastern Utah

    In 2001, an outbreak of acute respiratory disease occurred among persons working at a Native American archeological site at Dinosaur National Monument in northeastern Utah. Epidemiologic and environmental investigations were undertaken to determine the cause of the outbreak. A clinical case was defined by the presence of at least two of the following symptoms: self-reported fever, shortness of breath, or cough. Ten workers met the clinical case definition; 9 had serologic confirmation of coccidioidomycosis, and 8 were hospitalized. All 10 were present during sifting of dirt through screens on June 19; symptoms began 9–12 days later (median 10). Coccidioidomycosis also developed in a worker at the site in September 2001. A serosurvey among 40 other Dinosaur National Monument workers did not find serologic evidence of recent infection. This outbreak documents a new endemic focus of coccidioidomycosis, extending its known geographic distribution in Utah northward by approximately 200 miles.