    Learning in Consumer Choice.

    In economics, consumption decisions are typically modeled as all-encompassing one-shot decisions; making a consumption decision is thought of as determining a complete consumption plan, which gives a full specification of all present and future consumption. In contrast, this dissertation sets up a learning framework for consumption decisions. This alternative approach considers a sequence of smaller decisions, each of which determines only current consumption and savings, and it models how making (efficient) trade-offs between consumption and savings is learned from the consequences of choices made in earlier periods. The thesis then extensively studies the theoretical relations between the traditional and the alternative approach.
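
    The dissertation's actual learning rule is not spelled out in this abstract, so the sketch below is only a stylized illustration of the idea: an agent repeatedly picks a savings rate, compares it against a slightly perturbed alternative using a crude two-period utility comparison, and keeps whichever rule performed better. The log-utility specification, the update rule, and all numbers are assumptions made for illustration, not the model studied in the thesis.

        import math
        import random

        def u(c):
            # log period utility; a small floor avoids log(0)
            return math.log(max(c, 1e-9))

        def learn_savings_rate(T=500, income=1.0, interest=0.05, step=0.01, seed=0):
            random.seed(seed)
            s = 0.2                       # initial guess for the savings rate
            wealth = 0.0
            for _ in range(T):
                resources = income + (1 + interest) * wealth
                trial = min(max(s + random.choice([-step, step]), 0.0), 0.95)
                # crude two-period look-ahead used as the "consequence" signal
                keep = u((1 - s) * resources) + u((1 + interest) * s * resources)
                alt  = u((1 - trial) * resources) + u((1 + interest) * trial * resources)
                if alt > keep:            # keep whichever rule did better last period
                    s = trial
                wealth = s * resources
            return s

        print("learned savings rate:", round(learn_savings_rate(), 3))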

    BDES2020 - Decorated Shed <James Cristallo>

    - Bayesian statistics is an alternative form of statistics that provides a way to systematically integrate new information with existing information.
    - Bayesian methods are very suitable for evidence synthesis.
    - Bayesian outcomes are easier to interpret than standard statistical outcomes.
    - For instance, Bayesian methods allow for determining the probability that a difference in effect between two treatments will be clinically relevant.
    - The use of Bayesian methods is becoming more prevalent.
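
    As an illustration of that last example, the sketch below computes the posterior probability that the difference in effect between two treatments exceeds a clinically relevant threshold, using a conjugate normal prior and likelihood with known variance. All numbers (prior, trial estimate, threshold) are invented for the example and are not taken from any of the listed studies.

        from scipy.stats import norm

        prior_mean, prior_sd = 0.0, 2.0      # existing information about the difference
        data_mean,  data_se  = 1.2, 0.5      # new trial estimate and its standard error
        mcid = 0.8                           # minimal clinically important difference

        # Normal-normal update: precision-weighted combination of prior and data
        w_prior, w_data = 1 / prior_sd**2, 1 / data_se**2
        post_sd   = (w_prior + w_data) ** -0.5
        post_mean = (w_prior * prior_mean + w_data * data_mean) / (w_prior + w_data)

        p_relevant = 1 - norm.cdf(mcid, loc=post_mean, scale=post_sd)
        print(f"Posterior mean difference: {post_mean:.2f} (sd {post_sd:.2f})")
        print(f"P(difference > {mcid}): {p_relevant:.2%}")

    With these invented numbers the posterior probability that the difference exceeds the threshold comes out around 75%, which is the kind of directly interpretable statement the bullet points refer to.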

    Cost-effectiveness on a local level: whether and when to adopt a new technology

    Cost-effectiveness analysis has become a widely accepted tool for decision making in health care. The standard textbook cost-effectiveness analysis focuses on whether to make the switch from an old or common-practice technology to an innovative technology, and in doing so it takes a global perspective. In this article, we are interested in a local perspective, and we look at the questions of whether and when the switch from old to new should be made. A new approach to cost-effectiveness from a local (e.g., hospital) perspective is proposed, by means of a mathematical model for cost-effectiveness that explicitly incorporates time. A decision rule is derived for establishing whether a new technology should be adopted, as well as a general rule for establishing when it pays to postpone adoption by one more period, and a set of decision rules that can be used to determine the optimal timing of adoption. Finally, a simple example is presented to illustrate our model and how it leads to optimal decision making in a number of cases.
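
    The article's exact decision rules are not reproduced in this abstract; the sketch below is only a hedged stand-in for the "adopt now vs. postpone one period" comparison under simple assumptions: a one-off acquisition cost, a constant per-period incremental net monetary benefit, discounting, and a possible price drop next period. The function names and all numbers are hypothetical.

        # Hedged sketch, not the article's actual model: compare adopting a new
        # technology now versus postponing adoption by one period, from a local
        # (e.g. hospital) perspective.
        def npv_adopt_now(K, b, d, horizon):
            # pay K today, collect discounted benefits from period 0 onward
            return -K + sum(b * d**t for t in range(horizon))

        def npv_adopt_next_period(K_next, b, d, horizon):
            # pay the (possibly lower) future price next period, forgo one period of b
            return d * (-K_next + sum(b * d**t for t in range(horizon - 1)))

        K, K_next = 500_000, 450_000      # acquisition cost now vs. one period later
        b, d, horizon = 80_000, 0.97, 10  # incremental net benefit, discount factor, periods

        now   = npv_adopt_now(K, b, d, horizon)
        later = npv_adopt_next_period(K_next, b, d, horizon)
        print(f"adopt now: {now:,.0f}")
        print(f"postpone:  {later:,.0f}")
        print("decision:", "adopt now" if now >= later else "postpone one period")

    With these made-up numbers, the benefit forgone by waiting outweighs the price drop, so adopting now wins; repeating the same comparison period by period gives a simple adoption-timing rule in the spirit of the article.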

    Quantifying short run cost-effectiveness during a gradual implementation process

    This paper examines the short-run inefficiencies that arise during the gradual implementation of a new cost-effective technology in healthcare. These inefficiencies arise when the health gains associated with the new technology cannot be obtained immediately because the new technology cannot yet serve all patients, and when there is overcapacity for the old technology in the short run because the supply of care is divided between two mutually exclusive technologies. Such efficiency losses are not taken into account in standard textbook cost-effectiveness analysis, which presents a steady state in which costs and effects are assumed to be unchanging over time. A model is constructed to quantify these short-run inefficiencies and to inform the decision maker about the optimal implementation pattern for the new technology. The model operates by integrating the incremental net benefit equations for both the period of co-existence of the mutually exclusive technologies and the period after complete substitution of the old technology. It takes into account the rate of implementation of the new technology, the depreciation of capital of the old technology, and the demand curves for both technologies. The model is applied to the real-world case of converting from screen-film to digital mammography in the Netherlands.
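
    The paper's integrated net-benefit model is not given in the abstract; the sketch below only illustrates the two loss channels it names (health gains forgone while the new technology ramps up, and idle old-technology capacity during co-existence), assuming a linear implementation ramp and invented parameters.

        # Hedged sketch with invented parameters: quantify the short-run efficiency
        # loss during a gradual switch from an old to a new technology.
        def short_run_loss(N_patients, b, c_idle, ramp_periods):
            loss_forgone, loss_idle = 0.0, 0.0
            for t in range(ramp_periods):
                share_new = (t + 1) / ramp_periods           # fraction served by new tech
                # patients still on the old technology forgo the incremental benefit b
                loss_forgone += (1 - share_new) * N_patients * b
                # old-technology capacity freed up but not yet retired sits idle
                loss_idle += share_new * c_idle
            return loss_forgone, loss_idle

        forgone, idle = short_run_loss(N_patients=10_000, b=150.0,
                                       c_idle=200_000.0, ramp_periods=5)
        print(f"forgone health gains (monetised): {forgone:,.0f}")
        print(f"idle old capacity:                {idle:,.0f}")
        print(f"total short-run loss:             {forgone + idle:,.0f}")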

    Combined N-of-1 trials to investigate mexiletine in non-dystrophic myotonia using a Bayesian approach; study rationale and protocol

    BACKGROUND: Obtaining evidence for the clinical and cost-effectiveness of treatments for patients with rare diseases is a challenge. Non-dystrophic myotonia (NDM) is a group of inherited, rare muscle diseases characterized by muscle stiffness. The reimbursement of mexiletine, the expert-opinion drug for NDM, has been discontinued in some countries due to a lack of independent randomized controlled trials (RCTs). It remains unclear, however, which concessions to the level 1 evidence needed for coverage decisions can be accepted for rare diseases. Considering the large number of rare diseases that lack treatment evidence, more experience with innovative trial designs is needed. Both NDM and mexiletine are well suited to an N-of-1 trial design. A Bayesian approach allows for the combination of N-of-1 trials, which enables the assessment of outcomes at the patient and group level simultaneously.

    METHODS/DESIGN: We will combine 30 individual, double-blind, randomized, placebo-controlled N-of-1 trials of mexiletine (600 mg daily) vs. placebo in genetically confirmed NDM patients using hierarchical Bayesian modeling. Our results will be compared and combined with the main results of an international cross-over RCT (mexiletine vs. placebo in NDM) published in 2012, which will be used as an informative prior. The eligibility criteria, treatment regimen, end-points, and measurement instruments are similar to those used in the international cross-over RCT.

    DISCUSSION: The treatment of patients with NDM with mexiletine offers a unique opportunity to compare the outcomes and efficiency of novel N-of-1 trial-based designs and conventional approaches in producing evidence of the clinical and cost-effectiveness of treatments for patients with rare diseases.

    TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT02045667
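
    The protocol's hierarchical model is not specified in detail in this abstract; the sketch below only illustrates the general idea of combining N-of-1 trials under an informative prior, using a simple normal-normal hierarchy on simulated data. The prior values (standing in for the 2012 RCT), the fixed between-patient SD, and all simulated effects are assumptions for illustration, not trial results.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 30
        true_effects = rng.normal(-2.0, 0.8, n)   # simulated per-patient treatment effects
        se = np.full(n, 0.6)                      # per-trial standard errors (assumed)
        y = rng.normal(true_effects, se)          # per-patient N-of-1 effect estimates

        mu0, tau0 = -2.2, 0.5    # informative prior for the group mean (assumed)
        sigma_b   = 0.8          # between-patient SD, fixed here for simplicity

        # Posterior for the group mean: precision-weighted average of the prior
        # and each patient's estimate (marginal variance se^2 + sigma_b^2).
        w = 1.0 / (se**2 + sigma_b**2)
        post_prec = 1.0 / tau0**2 + w.sum()
        mu_post = (mu0 / tau0**2 + (w * y).sum()) / post_prec
        mu_sd = post_prec**-0.5

        # Shrinkage estimate for each patient, given the posterior group mean.
        shrink = sigma_b**2 / (sigma_b**2 + se**2)
        theta = mu_post + shrink * (y - mu_post)

        print(f"group mean effect: {mu_post:.2f} (sd {mu_sd:.2f})")
        print("first three patient-level estimates:", np.round(theta[:3], 2))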

    Calibration of fundamental diagrams for travel time predictions based on the cell transmission model

    Road traffic is constantly increasing, and its negative consequences, in the form of traffic jams, are felt especially in urban areas. In order to provide real-time traffic information to road users and traffic managers, accurate computer models are gaining relevance. A software system called Mobile Millennium Stockholm (MMS) was developed to estimate and predict travel times and has been implemented on a 7 km test stretch in the north of Stockholm. The core of the software is the cell transmission model (CTM), a macroscopic traffic flow model based on aggregated speed observations. This thesis focuses on different calibration techniques for the so-called fundamental diagram, an important input to the CTM. The diagram is the mathematical function that defines the relation between traffic flow, density, and speed. The calibration is performed in different scenarios based on least squares (LS) and total least squares (TLS) error minimization. Furthermore, sources, representing the traffic demand, and sinks, representing the surroundings of the modeled network, are implemented as dynamic parameters to model the change in traffic behavior throughout the day. Split ratios, representing drivers' route choice in the CTM, are estimated and implemented as well. Within the framework of this work, the MMS software is run in a pure prediction mode: the CTM is based only on the source, sink, split, and fundamental diagram parameters and is run forward in time. For each fundamental diagram calibration scenario an independent model run is performed, and the evaluation of the scenarios is based on the output of the model. The results are compared to existing Bluetooth travel time measurements for the test stretch, which are used as ground-truth observations, and a mean absolute percentage error (MAPE) is calculated. This leads to the most reasonable technique for fundamental diagram calibration: total least squares error minimization.
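
    The thesis's calibration pipeline itself is not reproduced here; the sketch below only illustrates the two fitting criteria it compares (LS and TLS) on a synthetic Greenshields-type linear speed-density relation, and shows how a MAPE against ground-truth travel times could be computed. The data and the linear fundamental-diagram form are stand-ins for the MMS/Stockholm setup; only the 7 km stretch length is taken from the abstract.

        import numpy as np

        rng = np.random.default_rng(0)
        k = rng.uniform(5, 120, 200)                 # density [veh/km]
        v_true = 90.0 * (1 - k / 140.0)              # "true" speeds [km/h]
        v = v_true + rng.normal(0, 4, k.size)        # noisy speed observations

        # Ordinary least squares fit of v = a + b*k (errors in v only)
        A = np.column_stack([np.ones_like(k), k])
        (a_ls, b_ls), *_ = np.linalg.lstsq(A, v, rcond=None)

        # Total least squares: orthogonal regression via the smallest
        # right-singular vector of the centred data (errors in both k and v)
        X = np.column_stack([k - k.mean(), v - v.mean()])
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        nk, nv = Vt[-1]                              # normal vector of the fitted line
        b_tls = -nk / nv
        a_tls = v.mean() - b_tls * k.mean()

        for name, a, b in [("LS", a_ls, b_ls), ("TLS", a_tls, b_tls)]:
            vf, kj = a, -a / b                       # free-flow speed, jam density
            print(f"{name}:  vf = {vf:5.1f} km/h,  kj = {kj:6.1f} veh/km")

        # MAPE of predicted vs. observed travel times on a 7 km stretch
        length = 7.0
        tt_obs = length / np.clip(v_true, 5, None)   # stand-in for Bluetooth ground truth
        tt_pred = length / np.clip(a_ls + b_ls * k, 5, None)
        mape = np.mean(np.abs(tt_pred - tt_obs) / tt_obs) * 100
        print(f"MAPE (LS model): {mape:.1f}%")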