
    Multiproduct Pricing in Major League Baseball: A Principal Components Analysis

    The empirical analysis of multiproduct pricing suffers from a lack of clear theoretical guidance and appropriate data, limitations which often render traditional regression-based analyses impractical. This paper analyzes ticket, parking, and concession pricing in Major League Baseball for the period 1991-2003 using a new methodology based on principal components, which allows inferences to be formed about the factors underlying price variation without strong theoretical guidance or abundant information about costs and demand. While general demand shifts are the most important factor, they explain only half of overall price variation. Also important are price interactions that derive from demand interrelationships between goods and the desire to maximize the capture of consumer surplus in the presence of heterogeneous demand.
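
    The core of the method can be illustrated in a few lines. Below is a minimal Python sketch, using synthetic data rather than the paper's 1991-2003 MLB sample, that extracts principal components from a panel of ticket, parking, and concession prices and reports the share of price variation each component explains.

```python
import numpy as np

# Hypothetical panel: rows are team-seasons, columns are product prices
# (ticket, parking, concession). Illustrative random data only.
rng = np.random.default_rng(0)
prices = rng.lognormal(mean=2.0, sigma=0.3, size=(100, 3))

# Standardise each price series, then diagonalise the correlation matrix.
z = (prices - prices.mean(axis=0)) / prices.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(z, rowvar=False))

# Order components by explained variance, largest first; in the paper the
# leading component is read as a general demand shift, the remainder as
# price interactions between the goods.
order = np.argsort(eigvals)[::-1]
print("variance shares:", eigvals[order] / eigvals.sum())
print("loadings:\n", eigvecs[:, order])
```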

    Improving the utility of LA-ICP-MS for isotope ratio analyses of single particles with application to uranium oxide

    The determination of the isotopic composition of single uranium oxide particles, 0.3–2 μm in size, for nuclear safeguards is currently performed by either thermal ionisation mass spectrometry (TIMS) or secondary ion mass spectrometry (SIMS). Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS), a well-established analytical technique for determining the isotopic composition of solid materials, has the potential to be another method by which single uranium oxide particles can be analysed, complementing established protocols, but requires optimisation. In this study, the ability of LA-ICP-MS to determine the isotopic composition, principally 234U/238U, 235U/238U and 236U/238U, of glass reference materials and sub-micron uranium oxide particles is investigated. To achieve the best detection efficiency, a prototype high-speed ablation cell and injector, designed previously at Loughborough University, was coupled to a high-efficiency multi-collector (MC-)ICP-MS. As a result, an increased signal-to-noise ratio and a measured detection efficiency of 5–7% were achieved for the LA-MC-ICP-MS system. The capability of the LA-MC-ICP-MS system to determine the uranium isotopic composition of single particles was compared with that of a more established low-volume ablation cell. A source of additional uncertainty, blind time arising from incompatibilities with the mixed detector array of the MC-ICP-MS, was identified. The impact of this additional uncertainty on isotope ratio analysis was modelled and a method developed to filter out affected data. LA-ICP-MS and LA-MC-ICP-MS were used to successfully determine the uranium isotopic compositions of sub-micron uranium oxide particles of known certified composition, and particles of two distinct isotopic compositions on a single sample planchet were resolved. The utility of three data evaluation strategies for determining the isotopic composition of single uranium oxide particles was investigated. The necessity and advantages of calculating isotope ratios using the geometric mean are demonstrated, which has application to isotope ratio analysis performed by all forms of mass spectrometry. A novel approach to preparing particulate samples for laser ablation analysis, cytocentrifugation, is described. By using a mixture of nail polish and acetone as the solvent, dispersed particles are held in a strong film layer thin enough to allow embedded particles to be imaged by SEM-EDX. A sample of uranium oxide particles in an environmental matrix prepared using cytocentrifugation was analysed by LA-MC-ICP-MS and their isotopic compositions resolved.
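
    The point about the geometric mean can be illustrated with a small sketch. The Python example below uses synthetic Poisson counts standing in for transient 235U and 238U single-particle signals (the ratio, count rate, and sweep count are all assumptions, not values from the study) and computes the point-by-point ratio with both the arithmetic and the geometric mean so the two estimators can be compared.

```python
import numpy as np

# Synthetic transient signals (counts per sweep) standing in for single-
# particle 235U and 238U data. The ratio, count rate, and sweep count are
# assumptions for illustration, not values from the study.
rng = np.random.default_rng(1)
true_ratio = 0.0072
i238 = rng.poisson(lam=5000, size=200).astype(float)
i235 = rng.poisson(lam=true_ratio * 5000, size=200).astype(float)

# Point-by-point ratios of counting data are positively skewed, so the
# arithmetic and geometric means can differ noticeably at low counts.
mask = (i235 > 0) & (i238 > 0)            # guard against log(0)
r = i235[mask] / i238[mask]
arithmetic = r.mean()
geometric = np.exp(np.log(r).mean())      # mean taken in log space
print(f"arithmetic mean 235U/238U: {arithmetic:.5f}")
print(f"geometric mean  235U/238U: {geometric:.5f}")
```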

    Liberalism and the problem of colonial rule: three stages in Anglo-American thought

    From as early as the fifteenth century, when European explorers rounded the tip of Africa in search of trade routes to the East, until the early twentieth century, the West, through the territorial expansion of empire, established itself as the dominant authority within the global political order. Ideologically inspired conflicts in the first half of the twentieth century, Cold War tensions and the process of decolonization, however, resulted in a fundamental change in the nature of this power and global influence, and led to the construction of a global order that had never existed before. After centuries of being structured around the power of a few European countries with colonial subjects, the post-colonial order was based on formal equality between states, in which territorial expansion and paternal rule were no longer accepted practices. Instead, power within the international system was determined by economic competition, and the notion of 'civilization' was replaced by the ideal of economic development, pursued predominantly through the forces of the international capitalist system. The aim of the following chapters is to highlight the dominant discourse of the Anglo-American liberal tradition within the context of the changing global order and to argue, more specifically, that the process of decolonization can be used as a lens through which changes in how the 'liberal task' was conceived within Anglo-American political thought can be traced. Furthermore, it aims to show that Anglo-American political philosophy in the post-colonial era can be understood as part of a larger historical process dating back to the work of John Stuart Mill in the early nineteenth century. By contrasting the liberalisms of Mill, the British Idealists and Isaiah Berlin, and their responses to the question of colonial rule, this history sheds light on the fundamental impulses of the liberal tradition between the colonial and post-colonial periods. It is widely known that Mill was employed by the East India Company and that the subject of colonial rule, to some extent, informed his liberalism.

    A Visual’s Worth a Thousand Codes: Illustrative Techniques for Grounded Theory Methodology

    Academic research should show a transparent methodology. Transparency is important for replicability, trust in the results, and adapting to new contexts. Due to its subjective nature, transparency is especially important for qualitative work, such as grounded theory methodology (GTM). In this paper, we report aspects of a GTM study that highlights several visuals aimed at increasing transparency. This paper aims to contribute novel, transparency-enhancing GTM illustrations that others can adapt for their purposes. The illustrations are analyzed and discussed with suggestions for implementation.

    How accurate are forecasts of costs of energy? A methodological contribution

    Forecasts of the cost of energy are typically presented as point estimates; however, forecasts are seldom accurate, which makes it important to understand the uncertainty around these point estimates. The scale of the differences between forecasts and outturns (i.e. contemporary estimates) of costs may have important implications for government decisions on the appropriate form (and level) of support, modelling energy scenarios or industry investment appraisal. This paper proposes a methodology to assess the accuracy of cost forecasts. We apply it to levelised costs of energy for different generation technologies because of the availability of comparable forecasts and contemporary estimates; however, the same methodology could be applied to the components of levelised costs, such as capital costs. The estimated “forecast errors” capture the accuracy of previous forecasts and can provide objective bounds to the range around current forecasts for such costs. The results from applying this method are illustrated using publicly available data for onshore and offshore wind, nuclear and CCGT technologies, revealing the possible scale of “forecast errors” for these technologies.
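
    A minimal sketch of the proposed procedure, with invented numbers in place of the paper's published levelised-cost data: compute relative errors from past forecast-outturn pairs, then use their empirical quantiles as objective bounds around a current point forecast.

```python
import numpy as np

# Hypothetical paired data: past point forecasts of levelised cost and the
# later outturn estimates (e.g. GBP/MWh). Invented values for illustration.
forecasts = np.array([60.0, 72.0, 55.0, 90.0, 48.0, 65.0])
outturns = np.array([70.0, 69.0, 61.0, 105.0, 55.0, 80.0])

# Relative "forecast error": how far each outturn landed from its forecast.
errors = (outturns - forecasts) / forecasts

# Empirical quantiles of past errors give objective bounds around a new
# point forecast, here an assumed 75 GBP/MWh.
lo, hi = np.quantile(errors, [0.1, 0.9])
current = 75.0
print(f"80% range around {current:.0f}: "
      f"{current * (1 + lo):.1f} to {current * (1 + hi):.1f}")
```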

    Investigating Endwall-Blade Fillet Radius Variation to Reduce Secondary Flow Losses

    In turbomachinery, the joint between a turbine blade and the endwall often involves a fillet. Previous studies show that this fillet significantly influences the secondary flows despite being regularly omitted from simulation and testing; specifically, a uniform fillet radius of 16% axial chord increased endwall losses by 10%. It was proposed that a variable-radius fillet could reduce secondary flows and the associated endwall losses. This paper describes a computational study to determine what variable-radius fillet is required for optimal performance in the cascade. The variable radius ranges from 0.5% to 16% of axial chord and was found using a genetic algorithm optimisation. Although this is a computational study, the design offers physically plausible mechanisms by which the extra losses introduced by fillets may be reduced. This paper also suggests a generalised rule of fillet radius variation to minimise endwall losses: a large radius is required at the leading edge, reducing slowly along the pressure side but rapidly on the suction side, such that the smallest permitted radius is applied along the suction side, with a medium radius at the trailing edge.
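
    As a rough illustration of the optimisation set-up, the sketch below evolves a fillet radius profile, parameterised at a handful of stations around the blade and bounded by the 0.5% and 16% axial-chord limits quoted above, with a simple genetic algorithm. The loss function here is a hypothetical surrogate; in the study itself each candidate would be scored by a CFD evaluation of endwall loss.

```python
import numpy as np

# Toy sketch of the optimisation loop: evolve a fillet radius profile
# within the 0.5-16% axial-chord bounds. The loss is a stand-in surrogate,
# not the paper's CFD objective.
rng = np.random.default_rng(2)
N_STATIONS, POP, GENS = 8, 40, 60
LO, HI = 0.005, 0.16  # radius as a fraction of axial chord

def loss(profile):
    # Hypothetical surrogate: favours a large leading-edge radius
    # (station 0) and a small suction-side radius (last station).
    return (profile[0] - HI) ** 2 + (profile[-1] - LO) ** 2 + 0.1 * np.var(profile)

pop = rng.uniform(LO, HI, size=(POP, N_STATIONS))
for _ in range(GENS):
    fit = np.array([loss(p) for p in pop])
    parents = pop[np.argsort(fit)[: POP // 2]]       # truncation selection
    kids = parents[rng.integers(0, POP // 2, POP // 2)].copy()
    kids += rng.normal(0, 0.01, kids.shape)          # Gaussian mutation
    pop = np.vstack([parents, np.clip(kids, LO, HI)])

best = pop[np.argmin([loss(p) for p in pop])]
print("best radius profile (fraction of axial chord):", best.round(3))
```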

    Sediment resuspension rates, organic matter quality and food utilization by sea scallops (Placopecten magellanicus) on Georges Bank

    Benthic detritus, bacteria, and settled phytoplankton are transported into the water column by resuspension, potentially providing a high-quality food source to suspension feeders. Two aspects of resuspension must be considered in relation to food supplies for suspension feeders: the flux of particles from the sediments to the water column and the food value of that material. Sediment resuspension rates on Georges Bank and the role of resuspended sediment in the diet of sea scallops (Placopecten magellanicus) were determined in laboratory flume experiments and shipboard feeding experiments, respectively. Resuspended carbon flux was estimated from flume bedload transport rates and the mass of organic carbon associated with the silt-clay fraction eroded from Georges Bank sediment during transport. A comparison of sand erosion thresholds with the frequency distribution of shear velocity estimated from field current meters indicated that tidal sediment resuspension will occur 62% of the time. Resuspended material had a carbon content of 4–8% and a C:N ratio of 5–8. Rates of resuspension (33–229 mg C m⁻² h⁻¹) and settling rates indicate that resuspended sediment in a size range available to scallops (>5 μm) remains in suspension for periods of hours to days. Clearance rates of resuspended sediment by scallops were similar to those for water column particles, and filtration rates increased with increasing concentrations of resuspended material. Feeding experiments demonstrated that scallops absorbed organic matter from resuspended sediments with an efficiency of up to 40%. Therefore, in terms of particle retention, ingestion, and digestion, sea scallops are able to exploit resuspended organic matter from a continental shelf habitat. Furthermore, resuspension occurs with sufficient frequency, and resuspended sediment has a long enough residence time in the water column, to provide a consistent nutritional benefit to scallops.
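
    The 62% figure comes from comparing sand erosion thresholds with the measured frequency distribution of shear velocity; the sketch below reproduces that comparison in miniature with a synthetic shear-velocity record and an illustrative threshold (both assumptions, not the Georges Bank measurements).

```python
import numpy as np

# Synthetic record of shear velocity u* (arbitrary units) standing in for
# a field current-meter time series, plus an illustrative sand erosion
# threshold. Both are assumptions, not the Georges Bank values.
rng = np.random.default_rng(3)
u_star = rng.weibull(2.0, size=10_000) * 1.2  # tidal-like skewed record
threshold = 1.0                               # assumed critical u*

# Fraction of the record in which the threshold is exceeded, i.e. the
# predicted frequency of tidal sediment resuspension.
exceedance = (u_star > threshold).mean()
print(f"resuspension predicted {exceedance:.0%} of the time")
```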

    Psychopathy, adaptation, and disorder

    In a recent study, we found a negative association between psychopathy and violence against genetic relatives. We interpreted this result as a form of nepotism and argued that it failed to support the hypothesis that psychopathy is a mental disorder, suggesting instead that it supports the hypothesis that psychopathy is an evolved life history strategy. This interpretation and subsequent arguments have been challenged in a number of ways. Here, we identify several misunderstandings regarding the harmful dysfunction definition of mental disorder as it applies to psychopathy and regarding the meaning of nepotism. Furthermore, we examine the evidence provided by our critics that psychopathy is associated with other disorders, and we offer a comment on their alternative model of psychopathy. We conclude that there remains little evidence that psychopathy is the product of dysfunctional mechanisms.

    Assessment of the learning curve in health technologies: a systematic review

    Objective: We reviewed and appraised the methods by which the issue of the learning curve has been addressed during health technology assessment in the past. Method: We performed a systematic review of papers in clinical databases (BIOSIS, CINAHL, Cochrane Library, EMBASE, HealthSTAR, MEDLINE, Science Citation Index, and Social Science Citation Index) using the search term "learning curve". Results: The clinical search retrieved 4,571 abstracts for assessment, of which 559 (12%) were published articles eligible for review. Of these, 272 were judged to have formally assessed a learning curve. The procedures assessed were minimal access (51%), other surgical (41%), and diagnostic (8%). The majority of the studies were case series (95%). Some 47% of studies addressed only individual operator performance and 52% addressed institutional performance. The data were collected prospectively in 40% of studies, retrospectively in 26%, and the method of collection was unclear in 31%. The statistical methods used were simple graphs (44%), splitting the data chronologically and performing a t test or chi-squared test (60%), curve fitting (12%), and other model fitting (5%). Conclusions: Learning curves are rarely considered formally in health technology assessment. Where they are, the reporting of the studies and the statistical methods used are weak. As a minimum, reporting of learning should include the number and experience of the operators and a detailed description of data collection. Improved statistical methods would enhance the assessment of health technologies that require learning.
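
    As an illustration of the contrast the review draws between chronological splits and curve fitting, the sketch below fits a power-law learning curve to a synthetic series of operative times (invented data, not drawn from the reviewed studies) and compares it with a simple first-half/second-half split.

```python
import numpy as np

# Hypothetical series: operative time (minutes) for an operator's first
# 50 cases of a new procedure. Invented data for illustration only.
rng = np.random.default_rng(4)
case = np.arange(1, 51)
time = 120 * case ** -0.2 * rng.lognormal(0, 0.05, size=50)

# Fit a power-law learning curve T = a * case**b by linear regression in
# log space; b < 0 indicates improvement with experience.
b, log_a = np.polyfit(np.log(case), np.log(time), 1)
print(f"exponent b = {b:.3f}, initial time a = {np.exp(log_a):.0f} min")

# Contrast with the common (weaker) approach of splitting chronologically
# and comparing group means.
first, second = time[:25], time[25:]
print(f"mean first half {first.mean():.0f} min "
      f"vs second half {second.mean():.0f} min")
```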