209 research outputs found

    Identifying Taste Variation in Choice Models

    Among the many attractive features of the mixed logit model is its ability to take account of taste variation among decision-makers by allowing coefficients to follow pre-specified distributions (usually normal or lognormal). Whilst accounting for heterogeneity in the population, simple applications of the technique fail to identify valuable information on differences in behaviour between market segments. This information is likely to be of use to those involved in policy and investment analysis, product design and marketing. The ‘standard’ approach to overcoming this problem when working with the mixed logit model is to identify segments prior to modelling and either to specify a set of constant coefficients for each market segment, together with an additional error term to ‘mop up’ any residual variation, or to allow separate distributions for each market segment. An alternative approach is to adapt an exciting new methodology that offers the ability to estimate reliable individual-specific parameters (Revelt and Train, 1999). This approach is documented in Section 3 and involves three key stages:
    • First, maximum simulated likelihood is used to estimate distributions of tastes across the population.
    • Next, each individual’s choices are examined to arrive at estimates of their parameters, conditional on the known distributions across the population (including accounting for uncertainty in the population estimates). This process again involves the use of maximum simulated likelihood.
    • Finally, differences in behaviour between market segments are identified by regressing individual ‘part-worths’ against the characteristics of the decision-maker or attributes of the choice alternatives.
    In the first instance the technique is validated under ‘controlled’ circumstances on a simulated data set with known taste distributions. This simulation involves a binary choice situation in which the alternatives are described in terms of time and cost.
The choices of a group of decision-makers are simulated, each with a value of time drawn from a known distribution. The resulting choices are then analysed and individual values recovered with a surprisingly high degree of precision. The findings of this validation are set out in Section 4. Following a successful validation of the technique on simulated data, the methodology is applied to data from two stated preference experiments in which 326 respondents were asked to choose between alternative motor vehicle specifications defined by purchase price, running costs, engine size, emissions and safety features. The results of this analysis are set out in Section 5 and are compared to the findings of previously calibrated models that identified significant differences in tastes across market segments.
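The second stage above — individual-level estimates conditional on known population distributions — can be sketched as follows. This is an illustrative reconstruction in the spirit of Revelt and Train (1999), not the paper's code: the binary time/cost set-up, the population parameters, the number of draws and all function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1 output (assumed here, not estimated): the population value-of-time
# distribution obtained by maximum simulated likelihood, VOT ~ Normal(mu, sigma).
mu, sigma = 8.0, 2.0   # hypothetical population parameters (currency units/hour)
R = 5000               # number of simulation draws

def choice_prob(vot, dt, dc, chose_1):
    """Binary logit probability of the observed choice given a VOT draw.
    dt, dc are the time (hours) and cost differences of option 1 vs option 2."""
    v = -vot * dt - dc                 # utility difference V1 - V2
    p1 = 1.0 / (1.0 + np.exp(-v))
    return p1 if chose_1 else 1.0 - p1

def conditional_vot(choices):
    """Stage 2: simulated posterior mean of one individual's value of time,
    conditional on the population distribution and on that individual's
    observed sequence of choices."""
    draws = rng.normal(mu, sigma, R)   # draws from the population distribution
    lik = np.ones(R)
    for dt, dc, chose_1 in choices:    # likelihood of the full choice sequence
        lik *= choice_prob(draws, dt, dc, chose_1)
    return float(np.sum(draws * lik) / np.sum(lik))
```

For example, an individual who repeatedly picks an option that is half an hour slower but three units cheaper (dt = 0.5, dc = -3) reveals a value of time below 6, so the conditional estimate is pulled below the population mean of 8.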

    Modelling the impact of alternative fare structures on train overcrowding

    The Strategic Rail Authority (SRA) provides the backbone to rail regulation in Great Britain. As part of its responsibilities, the SRA monitors overcrowding on trains, which it measures as the proportion of passengers in excess of seat capacity for longer-distance services, with an allowance for standing passengers on shorter journeys of less than 20 minutes. Overcrowding on Britain’s railways fell during the early 1990s but has been on the increase since 1996, with particularly acute problems in the morning peak for services travelling to London. In a study conducted on behalf of the SRA we developed the PRAISE rail operations model to include penalties for overcrowding based upon journey purpose, journey time and degree of overcrowding. Using demand, fares and timetable information for an actual case study route, we examine how fares and ticketing restrictions can be set to manage demand throughout the day without significantly reducing the overall demand for rail travel.
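The overcrowding measure described above — passengers in excess of capacity as a proportion of the total, with a standing allowance on short journeys — can be sketched as follows. The function name and the 35% size of the standing allowance are assumptions for illustration, not figures from the study.

```python
def excess_load_factor(passengers, seats, journey_minutes, standing_allowance=0.35):
    """Proportion of passengers in excess of capacity on one service.
    For longer-distance services capacity is the seat count; for journeys
    under 20 minutes an assumed standing allowance (a fraction of the seat
    count) is added, in the spirit of the SRA measure described above."""
    capacity = seats * (1 + standing_allowance) if journey_minutes < 20 else seats
    excess = max(0.0, passengers - capacity)
    return excess / passengers
```

A 500-seat long-distance train carrying 600 passengers has one sixth of its passengers above capacity, whereas the same load on a 15-minute journey falls within the assumed standing allowance and scores zero.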

    Direct Demand Models of Air Travel: A Novel Approach to the Analysis of Stated Preference Data

    This paper uses what has been termed the direct demand approach to obtain elasticity estimates from discrete choice Stated Preference data. The Stated Preference data relate to business travellers' choices between air and rail. The direct demand methodology is outlined and some potential advantages over the conventional disaggregate logit model are discussed. However, further research regarding the relative merits of the two approaches is recommended. The direct demand model is developed to explain variations in the demand for air travel as a function of variations in air headway and cost and in train journey time, frequency, interchange and cost. Relatively little has previously been published about the interaction between rail and air; the elasticities estimated here, and the variation in them, are generally plausible. In particular, the results show that large improvements in rail journey times can have a very substantial impact on the demand for air travel and that the rail journey time cross-elasticity depends on satisfying a three-hour journey time threshold.
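One attraction of a log-linear direct demand specification is that the fitted coefficients can be read directly as own- and cross-elasticities. A minimal sketch on synthetic data illustrates this; the variable names, values and elasticities below are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400

# Hypothetical scenario attributes: rail journey time (hours) and air cost.
rail_time = rng.uniform(2.0, 5.0, n)
air_cost = rng.uniform(60.0, 150.0, n)

# Simulate log air demand with known elasticities: a rail journey time
# cross-elasticity of +1.5 (faster rail reduces air demand) and an air
# cost own-elasticity of -0.8. Both values are made up for the sketch.
log_q = (3.0 + 1.5 * np.log(rail_time) - 0.8 * np.log(air_cost)
         + rng.normal(0.0, 0.1, n))

# Log-linear direct demand model: ln Q = a + b1*ln(rail_time) + b2*ln(air_cost).
# The fitted slopes b1, b2 are the elasticity estimates.
X = np.column_stack([np.ones(n), np.log(rail_time), np.log(air_cost)])
beta, *_ = np.linalg.lstsq(X, log_q, rcond=None)
```

With enough observations the ordinary least squares slopes recover the elasticities used to generate the data.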

    Diverting Inter-Urban Car Users to Rail: Results from a Revealed Preference Mode Choice Model

    This paper reports disaggregate mode choice models based on the actual choices made by business and leisure travellers on inter-urban journeys which involved crossing the Pennines. The models explain choices as a function of the times and costs of each mode and of train headway and interchange. The models are an extension of those previously reported by Oscar Faber TPA as part of their Trans-Pennine Rail Strategy Study, involving a more detailed examination of functional form and disaggregation by journey purpose. The research reported here was undertaken as part of an ESRC funded project examining the potential for diverting inter-urban car and air passengers to rail. There have been very few studies of inter-modal interaction for inter-urban travel and therefore little is known about the cross-elasticities. The findings discussed here are based on a preferred functional form of mode choice model. The results for leisure travel are generally very satisfactory, with an important distinction identified between group and solus travel. However, we place more reservations on the results obtained from the business model.

    Passenger Rail Statistics Following the Privatisation of British Rail.

    In common with other major industries, the rail industry requires significant amounts of high quality information for successful operation. Information relating to the demand and supply of rail services is required to ensure that appropriate services are provided at minimum cost. Given the recent explosion of information technology, data on demand and supply are readily accessible and much improved. The benefits to the industry arising from such improvements should be maintained under the new organisational structures brought about by the 1993 Transport Act. It is the aim of this note to outline current sources of information and key statistics, and to raise the issue of what provisions are being made to ensure that they are maintained in a consistent and compatible format following privatisation.

    Designing a broad-spectrum integrative approach for cancer prevention and treatment

    Targeted therapies and the consequent adoption of "personalized" oncology have achieved notable successes in some cancers; however, significant problems remain with this approach. Many targeted therapies are highly toxic, costs are extremely high, and most patients experience relapse after a few disease-free months. Relapses arise from genetic heterogeneity in tumors, which harbor therapy-resistant immortalized cells that have adopted alternate and compensatory pathways (i.e., pathways that are not reliant upon the same mechanisms as those which have been targeted). To address these limitations, an international task force of 180 scientists was assembled to explore the concept of a low-toxicity "broad-spectrum" therapeutic approach that could simultaneously target many key pathways and mechanisms. Using cancer hallmark phenotypes and the tumor microenvironment to account for the various aspects of relevant cancer biology, interdisciplinary teams reviewed each hallmark area and nominated a wide range of high-priority targets (74 in total) that could be modified to improve patient outcomes. For these targets, corresponding low-toxicity therapeutic approaches were then suggested, many of which were phytochemicals. Proposed actions on each target and all of the approaches were further reviewed for known effects on other hallmark areas and the tumor microenvironment. Potential contrary or procarcinogenic effects were found for 3.9% of the relationships between targets and hallmarks, and mixed evidence of complementary and contrary relationships was found for 7.1%. Approximately 67% of the relationships revealed potentially complementary effects, and the remainder had no known relationship. Among the approaches, 1.1% had contrary, 2.8% had mixed and 62.1% had complementary relationships. These results suggest that a broad-spectrum approach should be feasible from a safety standpoint.
This novel approach has potential to be relatively inexpensive, it should help us address stages and types of cancer that lack conventional treatment, and it may reduce relapse risks. A proposed agenda for future research is offered.

    ENIGMA and global neuroscience: A decade of large-scale studies of the brain in health and disease across more than 40 countries

    This review summarizes the last decade of work by the ENIGMA (Enhancing NeuroImaging Genetics through Meta Analysis) Consortium, a global alliance of over 1400 scientists across 43 countries, studying the human brain in health and disease. Building on large-scale genetic studies that discovered the first robustly replicated genetic loci associated with brain metrics, ENIGMA has diversified into over 50 working groups (WGs), pooling worldwide data and expertise to answer fundamental questions in neuroscience, psychiatry, neurology, and genetics. Most ENIGMA WGs focus on specific psychiatric and neurological conditions; other WGs study normal variation due to sex and gender differences, or development and aging; still other WGs develop methodological pipelines and tools to facilitate harmonized analyses of "big data" (i.e., genetic and epigenetic data, multimodal MRI, and electroencephalography data). These international efforts have yielded the largest neuroimaging studies to date in schizophrenia, bipolar disorder, major depressive disorder, post-traumatic stress disorder, substance use disorders, obsessive-compulsive disorder, attention-deficit/hyperactivity disorder, autism spectrum disorders, epilepsy, and 22q11.2 deletion syndrome. More recent ENIGMA WGs have formed to study anxiety disorders, suicidal thoughts and behavior, sleep and insomnia, eating disorders, irritability, brain injury, antisocial personality and conduct disorder, and dissociative identity disorder. Here, we summarize the first decade of ENIGMA's activities and ongoing projects, and describe the successes and challenges encountered along the way. We highlight the advantages of collaborative large-scale coordinated data analyses for testing reproducibility and robustness of findings, offering the opportunity to identify brain systems involved in clinical syndromes across diverse samples and associated genetic, environmental, demographic, cognitive, and psychosocial factors.

    Measurement of the tau lepton lifetime


    Limit on B^0_s oscillation using a jet charge method

    A lower limit is set on the B_s^0 meson oscillation parameter \Delta m_s using data collected from 1991 to 1994 by the ALEPH detector. Events with a high transverse momentum lepton and a reconstructed secondary vertex are used. The high transverse momentum leptons are produced mainly by b hadron decays, and the sign of the lepton indicates the particle/antiparticle final state in decays of neutral B mesons. The initial state is determined by a jet charge technique using both sides of the event. A maximum likelihood method is used to set a lower limit on \Delta m_s. The 95\% confidence level lower limit on \Delta m_s ranges between 5.2 and 6.5 (\hbar/c^2)~ps^{-1} when the fraction of b quarks from Z^0 decays that form B_s^0 mesons is varied from 8\% to 16\%. Assuming that the B_s^0 fraction is 12\%, the lower limit would be \Delta m_s > 6.1 (\hbar/c^2)~ps^{-1} at 95\% confidence level. For x_s = \Delta m_s \, \tau_{B_s}, this limit also gives x_s > 8.8 using the B_s^0 lifetime of \tau_{B_s} = 1.55 \pm 0.11~ps and shifting the central value of \tau_{B_s} down by 1\sigma.
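The quoted x_s limit follows directly from the definition given in the abstract, with the lifetime central value shifted down by one standard deviation:

```latex
x_s \;=\; \Delta m_s \,\tau_{B_s}
    \;>\; 6.1~\mathrm{ps}^{-1} \times (1.55 - 0.11)~\mathrm{ps}
    \;\approx\; 8.8 .
```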

    Measurement of the B^0_s lifetime and production rate with D^-_s l^+ combinations in Z decays

    The lifetime of the B_s^0 meson is measured in approximately 3 million hadronic Z decays accumulated using the ALEPH detector at LEP from 1991 to 1994. Seven different D_s decay modes were reconstructed and combined with an opposite-sign lepton as evidence of semileptonic B_s^0 decays. Two hundred and eight D_s l candidates satisfy selection criteria designed to ensure precise proper time reconstruction and yield a measured B_s^0 lifetime of \result. Using a larger, less constrained sample of events, the product branching ratio is measured to be \pbrresult.