    Single-leg airline revenue management with overbooking

    Airline revenue management is concerned with identifying seat allocation policies that maximize revenue. Since a major loss in revenue results from cancellations and no-show passengers, overbooking has received significant attention in the literature over the years. In this study, we propose new models for static and dynamic single-leg overbooking problems. In the static case, we introduce computationally tractable models that give upper and lower bounds on the optimal expected revenue. In the dynamic case, we propose a new dynamic programming model based on two streams of arrivals: the first stream corresponds to booking requests and the second represents cancellations. We also conduct simulation experiments to illustrate the proposed models and the solution methods.
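
    The dynamic model described above lends itself to a compact dynamic-programming sketch. The following is a minimal illustration, not the authors' formulation: each period brings at most one booking request or one cancellation, requests face an accept/reject decision, and a terminal denied-boarding penalty is charged after binomial show-ups. All parameters are hypothetical.

```python
import math
from functools import lru_cache

T = 200          # decision periods before departure
CAP = 100        # physical seat capacity
FARE = 250.0     # revenue per accepted booking
REFUND = 200.0   # refund paid on a cancellation
DENY = 600.0     # penalty per show-up denied boarding
P_BOOK = 0.4     # per-period booking request probability
P_CANCEL = 0.002 # per-period cancellation probability per reservation
P_SHOW = 0.9     # probability a surviving reservation shows up

def terminal(n):
    """Expected denied-boarding penalty with Binomial(n, P_SHOW) show-ups."""
    cost = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * P_SHOW**k * (1 - P_SHOW)**(n - k)
        cost += pk * DENY * max(0, k - CAP)
    return -cost

@lru_cache(maxsize=None)
def V(t, n):
    """Max expected future revenue with t periods left and n reservations."""
    if t == 0:
        return terminal(n)
    p_can = n * P_CANCEL                   # cancellation stream
    p_none = 1.0 - P_BOOK - p_can          # no event this period
    accept = FARE + V(t - 1, n + 1)        # value of accepting a request
    reject = V(t - 1, n)                   # value of rejecting it
    return (P_BOOK * max(accept, reject)
            + p_can * (-REFUND + V(t - 1, n - 1))
            + p_none * V(t - 1, n))

print("Expected revenue from an empty plane:", round(V(T, 0), 2))
```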

    Effects of serum proteins on corrosion behavior of ISO 5832–9 alloy modified by titania coatings

    Stainless steel of the ISO 5832–9 type is often used to fabricate implants that operate in protein-containing physiological environments. The interaction between proteins and the surface of an implant may affect its corrosion properties. The aim of this work was to study the effect of selected serum proteins (albumin and γ-globulins) on the corrosion of ISO 5832–9 alloy (trade name M30NW) whose surface was modified by titania coatings. These coatings were obtained by the sol–gel method and heated at temperatures of 400 and 800 °C. To evaluate the effect of the proteins, corrosion tests were performed with and without the addition of proteins at a concentration of 1 g L−1 to physiological saline solution (0.9 % NaCl, pH 7.4) at 37 °C. The tests were carried out over 7 days. The following electrochemical methods were used: open circuit potential, linear polarization resistance, and electrochemical impedance spectroscopy. In addition, surface analysis by optical microscopy and X-ray photoelectron spectroscopy (XPS) was done at the end of the week-long corrosion tests. The results showed that the M30NW alloy, both uncoated and modified with titania coatings, exhibits very good corrosion resistance during a week-long exposure to the corrosion medium. The best corrosion resistance in 0.9 % NaCl solution is shown by alloy samples modified with the titania coating annealed at 400 °C. The serum proteins have no significant effect on the corrosion of the investigated biomedical steel. The XPS results confirmed the presence of proteins on the alloy surface after 7 days of immersion in protein-containing solutions. The investigations were supported by the National Science Centre, project No. N N507 501339. The authors gratefully acknowledge Dr. Janusz Sobczak and Dr. hab. Wojciech Lisowski from the Institute of Physical Chemistry of the PAS for the XPS surface analyses.

    Closing the sea surface mixed layer temperature budget from in situ observations alone: Operation Advection during BoBBLE

    Sea surface temperature (SST) is a fundamental driver of tropical weather systems such as monsoon rainfall and tropical cyclones. However, understanding of the factors that control SST variability is lacking, especially during the monsoons, when in situ observations are sparse. Here we use a ground-breaking observational approach to determine the controls on SST variability in the southern Bay of Bengal. We achieve this through the first full closure of the ocean mixed layer energy budget derived entirely from in situ observations, collected during the Bay of Bengal Boundary Layer Experiment (BoBBLE). Locally measured horizontal advection and entrainment contribute more significantly than expected to SST evolution, and thus to oceanic variability, during the observation period. These processes are poorly resolved by state-of-the-art climate models, which may contribute to their poor representation of monsoon rainfall variability. The novel techniques presented here provide a blueprint for future observational experiments to quantify the mixed layer heat budget on longer time scales and to evaluate these processes in models.
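
    For readers unfamiliar with the budget being closed, the standard mixed layer temperature budget decomposes the SST tendency into surface heating, horizontal advection, and entrainment terms. The sketch below evaluates these terms for hypothetical daily-mean values; it follows the textbook form of the budget, not BoBBLE's actual processing chain, and all input numbers are made up.

```python
import numpy as np

# Schematic mixed layer temperature budget:
#   dT/dt = Q_net/(rho*cp*h) - (u*dT/dx + v*dT/dy) - w_e*(T - T_below)/h + residual
RHO, CP = 1025.0, 3985.0   # seawater density (kg/m^3) and heat capacity (J/kg/K)

def budget_terms(q_net, h, u, v, dTdx, dTdy, w_e, T_ml, T_below):
    """Return the heating, advection and entrainment tendencies (K/s)."""
    heating = q_net / (RHO * CP * h)          # net surface heat flux term
    advection = -(u * dTdx + v * dTdy)        # horizontal advection
    entrain = -w_e * (T_ml - T_below) / h     # entrainment across the base
    return heating, advection, entrain

# Hypothetical daily-mean values for one observation day:
terms = budget_terms(q_net=80.0,              # W/m^2 into the ocean
                     h=30.0,                  # mixed layer depth (m)
                     u=0.3, v=0.1,            # mixed layer currents (m/s)
                     dTdx=-2e-6, dTdy=1e-6,   # SST gradients (K/m)
                     w_e=1e-5,                # entrainment velocity (m/s)
                     T_ml=29.0, T_below=28.2) # temperatures (deg C)
day = 86400.0
print("tendencies in K/day:", [round(t * day, 3) for t in terms])
```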

    Psychometric precision in phenotype definition is a useful step in molecular genetic investigation of psychiatric disorders

    Affective disorders are highly heritable, but few genetic risk variants have been consistently replicated in molecular genetic association studies. The common method of defining psychiatric phenotypes in molecular genetic research is either a summation of symptom scores or a binary threshold score representing the risk of diagnosis. Psychometric latent variable methods can improve the precision of psychiatric phenotypes, especially when the data structure is not straightforward. Using data from the British 1946 birth cohort, we compared summary scores with psychometric modeling based on the General Health Questionnaire (GHQ-28) scale for affective symptoms in an association analysis of 27 candidate genes (249 single-nucleotide polymorphisms (SNPs)). The psychometric method utilized a bi-factor model that partitioned the phenotype variance into five orthogonal latent variable factors, in accordance with the multidimensional data structure of the GHQ-28, which involves somatic, social, anxiety and depression domains. Results showed that, compared with the summation approach, the affective symptoms defined by the bi-factor psychometric model had a higher number of associated SNPs with larger effect sizes. These results suggest that psychometrically defined mental health phenotypes can reflect the dimensions of complex phenotypes better than summation scores, and therefore offer a useful approach in genetic association investigations.
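
    The bi-factor structure described above can be made concrete with a small sketch: every item loads on a general factor, each block of items additionally loads on one orthogonal group factor, and factor scores (rather than summed items) become the phenotype. The loadings, residual variances, and regression-method scoring below are illustrative assumptions, not the fitted GHQ-28 model.

```python
import numpy as np

# Bi-factor loading structure for 28 items in four blocks of 7 (somatic,
# social, anxiety, depression); loading values are made up for illustration.
n_items, n_groups = 28, 4
Lambda = np.zeros((n_items, 1 + n_groups))
Lambda[:, 0] = 0.6                          # general-factor loadings
for g in range(n_groups):                   # orthogonal group-factor loadings
    Lambda[g * 7:(g + 1) * 7, 1 + g] = 0.4

Psi = np.diag(np.full(n_items, 0.3))        # unique (residual) variances
Sigma = Lambda @ Lambda.T + Psi             # model-implied covariance

def factor_scores(X):
    """Regression-method factor scores for centered item data X (n x 28)."""
    W = np.linalg.solve(Sigma, Lambda)      # 28 x 5 weight matrix, Sigma^-1 Lambda
    return X @ W                            # column 0 = general affective factor

X = np.random.default_rng(0).normal(size=(100, n_items))
print(factor_scores(X).shape)               # (100, 5)
```

    The column of general-factor scores would then play the role of the phenotype in the SNP association analysis, in place of a simple sum score.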

    Mechanical Systems with Symmetry, Variational Principles, and Integration Algorithms

    This paper studies variational principles for mechanical systems with symmetry and their applications to integration algorithms. We recall some general features of how to reduce variational principles in the presence of a symmetry group, along with general features of integration algorithms for mechanical systems. We then describe some integration algorithms based directly on variational principles, using a discretization technique of Veselov. The general idea of these variational integrators is to discretize Hamilton’s principle directly, rather than the equations of motion, in a way that preserves the original system’s invariants, notably the symplectic form and, via a discrete version of Noether’s theorem, the momentum map. The resulting mechanical integrators are second-order accurate, implicit, symplectic-momentum algorithms. We apply these integrators to the rigid body and the double spherical pendulum to show that the techniques are competitive with existing integrators.
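
    As a concrete illustration of the Veselov discretization, the sketch below applies a midpoint discrete Lagrangian to a planar pendulum and solves the discrete Euler–Lagrange equations by fixed-point iteration. It is a minimal example of the scheme's structure, not the paper's rigid-body or double-pendulum implementations, and all parameters are made up.

```python
import math

# Discrete Lagrangian: L_d(q0, q1) = h * L((q0+q1)/2, (q1-q0)/h).
# The discrete Euler-Lagrange equations
#   D2 L_d(q_{k-1}, q_k) + D1 L_d(q_k, q_{k+1}) = 0
# are implicit in q_{k+1}; for the pendulum they reduce to the update below.

g, ell, h = 9.81, 1.0, 0.01   # gravity, pendulum length, time step

def step(q_prev, q_cur):
    """Advance one step by solving the discrete Euler-Lagrange equation."""
    c = h * h * g / (2.0 * ell)
    q_next = 2.0 * q_cur - q_prev          # explicit initial guess
    for _ in range(50):                    # fixed-point iteration (implicit)
        rhs = (2.0 * q_cur - q_prev
               - c * (math.sin((q_prev + q_cur) / 2.0)
                      + math.sin((q_cur + q_next) / 2.0)))
        if abs(rhs - q_next) < 1e-14:
            break
        q_next = rhs
    return q_next

# Simulate from rest at 60 degrees; energy stays bounded (symplectic scheme).
q0 = math.radians(60.0)
q_prev, q_cur = q0, q0                     # zero initial velocity
for _ in range(10000):
    q_prev, q_cur = q_cur, step(q_prev, q_cur)

v = (q_cur - q_prev) / h                   # finite-difference velocity
E = 0.5 * ell**2 * v**2 - g * ell * math.cos(q_cur)   # energy per unit mass
E0 = -g * ell * math.cos(q0)
print(f"energy drift after 10k steps: {E - E0:.2e}")
```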

    Neutralino dark matter in mSUGRA/CMSSM with a 125 GeV light Higgs scalar

    The minimal supergravity (mSUGRA or CMSSM) model is an oft-used framework for exhibiting the properties of neutralino (WIMP) cold dark matter (CDM). However, the recent evidence from ATLAS and CMS for a light Higgs scalar with mass m_h ≃ 125 GeV highly constrains the superparticle mass spectrum, which in turn constrains the neutralino annihilation mechanisms in the early universe. We find that the stau and stop co-annihilation mechanisms -- already highly stressed by the latest ATLAS/CMS results on SUSY searches -- are nearly eliminated if indeed the light Higgs scalar has mass m_h ≃ 125 GeV. Furthermore, neutralino annihilation via the A-resonance is essentially ruled out in mSUGRA, so that it is exceedingly difficult to generate thermally produced neutralino-only dark matter at the measured abundance. The remaining possibility lies in the focus-point region, which now moves out to the m_0 ~ 10-20 TeV range due to the required large trilinear soft SUSY-breaking term A_0. The remaining HB/FP region is more fine-tuned than before owing to the typically large top squark masses. We present updated direct and indirect detection rates for neutralino dark matter, and show that ton-scale noble liquid detectors will either discover mixed higgsino CDM or essentially rule out thermally produced neutralino-only CDM in the mSUGRA model.

    A Profile Likelihood Analysis of the Constrained MSSM with Genetic Algorithms

    The Constrained Minimal Supersymmetric Standard Model (CMSSM) is one of the simplest and most widely studied supersymmetric extensions to the standard model of particle physics. Nevertheless, current data do not sufficiently constrain the model parameters in a way completely independent of priors, statistical measures and scanning techniques. We present a new technique for scanning supersymmetric parameter spaces, optimised for frequentist profile likelihood analyses and based on Genetic Algorithms. We apply this technique to the CMSSM, taking into account existing collider and cosmological data in our global fit. We compare our method to the MultiNest algorithm, an efficient Bayesian technique, paying particular attention to the best-fit points and implications for particle masses at the LHC and dark matter searches. Our global best-fit point lies in the focus point region. We find many high-likelihood points in both the stau co-annihilation and focus point regions, including a previously neglected section of the co-annihilation region at large m_0. We show that there are many high-likelihood points in the CMSSM parameter space commonly missed by existing scanning techniques, especially at high masses. This has a significant influence on the derived confidence regions for parameters and observables, and can dramatically change the entire statistical inference of such scans.
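
    To make the scanning idea concrete, the following is a toy genetic-algorithm maximization of a stand-in two-mode log-likelihood over a 2-D box. The real analysis couples the GA to full CMSSM spectrum and observable calculations; the likelihood function, operators, and settings here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(42)
LO, HI = np.array([0.0, 0.0]), np.array([10.0, 10.0])   # parameter box

def loglike(theta):
    """Hypothetical bimodal log-likelihood standing in for a global fit."""
    a = -0.5 * np.sum((theta - [2.0, 3.0]) ** 2 / 0.3)   # narrow mode
    b = -0.5 * np.sum((theta - [8.0, 7.0]) ** 2 / 1.5)   # broad mode
    return np.logaddexp(a, b)

def ga_scan(pop_size=100, n_gen=200, mut_sigma=0.3, elite=10):
    pop = rng.uniform(LO, HI, size=(pop_size, 2))
    for _ in range(n_gen):
        fit = np.array([loglike(p) for p in pop])
        order = np.argsort(fit)[::-1]
        parents = pop[order[:pop_size // 2]]             # truncation selection
        kids = []
        while len(kids) < pop_size - elite:
            i, j = rng.integers(len(parents), size=2)
            w = rng.random()
            child = w * parents[i] + (1 - w) * parents[j]  # blend crossover
            child += rng.normal(0, mut_sigma, size=2)      # Gaussian mutation
            kids.append(np.clip(child, LO, HI))
        pop = np.vstack([parents[:elite], kids])         # elitism
    fit = np.array([loglike(p) for p in pop])
    return pop[np.argmax(fit)], fit.max()

best, best_ll = ga_scan()
print("best-fit point:", best.round(3), "logL:", round(best_ll, 3))
```

    For a profile likelihood analysis, one would additionally record every point evaluated during the run and profile over nuisance directions, rather than keep only the final population.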

    Calibration of myocardial T2 and T1 against iron concentration.

    BACKGROUND: The assessment of myocardial iron using T2* cardiovascular magnetic resonance (CMR) has been validated and calibrated, and is in clinical use. However, there are very limited data assessing the relaxation parameters T1 and T2 for measurement of human myocardial iron. METHODS: Twelve hearts were examined from transfusion-dependent patients: 11 with end-stage heart failure, obtained either following death (n=7) or cardiac transplantation (n=4), and 1 heart from a patient who died from a stroke with no cardiac iron loading. Ex-vivo R1 and R2 measurements (R1=1/T1 and R2=1/T2) at 1.5 Tesla were compared with myocardial iron concentration measured using inductively coupled plasma atomic emission spectroscopy. RESULTS: In a single myocardial slice in formalin that was examined repeatedly, a modest decrease in T2 was observed with time, from a mean (± SD) of 23.7 ± 0.93 ms at baseline (13 days after death and formalin fixation) to 18.5 ± 1.41 ms at day 566 (p<0.001). Raw T2 values were therefore adjusted to correct for this fall over time. Myocardial R2 was correlated with iron concentration [Fe] (R² = 0.566, p<0.001), but the correlation was stronger between LnR2 and Ln[Fe] (R² = 0.790, p<0.001). The relation between T2 (ms) and myocardial iron (mg/g dry weight) was [Fe] = 5081·T2^(−2.22). Analysis of T1 proved challenging, with a dichotomous distribution: very short T1 (mean 72.3 ± 25.8 ms), independent of iron concentration, in all hearts stored in formalin for more than 12 months. In the remaining hearts, stored for <10 weeks prior to scanning, LnR1 and iron concentration were correlated but with marked scatter (R² = 0.517, p<0.001). A linear relationship was present between T1 and T2 in the hearts stored for a short period (R² = 0.657, p<0.001). CONCLUSION: Myocardial T2 correlates well with myocardial iron concentration, which raises the possibility that T2 may provide information additive to T2* for patients with myocardial siderosis. However, ex-vivo T1 measurements are less reliable due to the severe chemical effects of formalin on T1 shortening, and therefore T1 calibration may only be practical from in-vivo human studies.
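
    The power-law calibration quoted above follows from the linearity of LnR2 in Ln[Fe]. The short sketch below applies the reported relation and shows, on synthetic points, how a straight-line fit in log-log space recovers power-law parameters; the synthetic data and helper names are assumptions for illustration.

```python
import numpy as np

def iron_from_t2(t2_ms):
    """Myocardial iron (mg/g dry weight) from T2 (ms), per the abstract."""
    return 5081.0 * t2_ms ** -2.22

print(iron_from_t2(20.0))   # ~6.6 mg/g at T2 = 20 ms

# Because ln R2 = ln(a) + b * ln([Fe]) is linear, a straight-line fit through
# (ln[Fe], ln R2) points yields the power-law parameters.
fe = np.array([1.0, 2.0, 5.0, 10.0, 20.0])     # mg/g dry weight (synthetic)
t2 = (fe / 5081.0) ** (-1 / 2.22)              # ms, exact inverse of the relation
r2 = 1000.0 / t2                               # R2 = 1/T2, converted to s^-1
b, ln_a = np.polyfit(np.log(fe), np.log(r2), 1)
print("fitted exponent:", round(b, 3))         # ~0.45, i.e. 1/2.22
```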

    Assessing Performance of Orthology Detection Strategies Applied to Eukaryotic Genomes

    Orthology detection is critically important for accurate functional annotation, and has been widely used to facilitate studies on comparative and evolutionary genomics. Although various methods are now available, there has been no comprehensive analysis of their performance, due to the lack of a genomic-scale ‘gold standard’ orthology dataset. Even in the absence of such datasets, the comparison of results from alternative methodologies contains useful information, as agreement enhances confidence and disagreement indicates possible errors. Latent Class Analysis (LCA) is a statistical technique that can exploit this information to infer reasonable estimates of sensitivities and specificities, and is applied here to evaluate the performance of various orthology detection methods on a eukaryotic dataset. Overall, we observe a trade-off between sensitivity and specificity in orthology detection, with BLAST-based methods characterized by high sensitivity and tree-based methods by high specificity. Two algorithms exhibit the best overall balance, with both sensitivity and specificity > 80%: INPARANOID identifies orthologs across two species, while OrthoMCL clusters orthologs from multiple species. Among methods that permit clustering of ortholog groups spanning multiple genomes, the (automated) OrthoMCL algorithm exhibits better within-group consistency with respect to protein function and domain architecture than the (manually curated) KOG database, as well as the homolog clustering algorithm TribeMCL. Using LCA, we are also able to comprehensively assess similarities and statistical dependence between various strategies, and to evaluate the effects of parameter settings on performance. In summary, we present a comprehensive evaluation of orthology detection on a divergent set of eukaryotic genomes, providing insights and guides for method selection, tuning and development for different applications. Many biological questions have been addressed by multiple tests yielding binary (yes/no) outcomes but no clear definition of truth, making LCA an attractive approach for computational biology.
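
    The LCA idea, inferring each method's sensitivity and specificity from agreement patterns alone, can be illustrated with a minimal two-class EM algorithm on simulated binary calls. The generating rates, the conditional-independence assumption, and all names below are hypothetical, not the paper's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate binary calls from M "methods" on N candidate ortholog pairs,
# conditionally independent given the latent true status.
M, N = 4, 5000
true = rng.random(N) < 0.3                       # latent ortholog status
sens = np.array([0.95, 0.85, 0.70, 0.60])        # generating sensitivities
spec = np.array([0.70, 0.80, 0.90, 0.95])        # generating specificities
calls = np.where(true[:, None],
                 rng.random((N, M)) < sens,
                 rng.random((N, M)) >= spec).astype(float)

# EM: estimate prevalence pi, per-method sensitivity se and specificity sp,
# without ever looking at the latent truth.
pi, se, sp = 0.5, np.full(M, 0.6), np.full(M, 0.6)
for _ in range(200):
    # E-step: posterior probability that each pair is a true ortholog.
    l1 = pi * np.prod(se**calls * (1 - se)**(1 - calls), axis=1)
    l0 = (1 - pi) * np.prod((1 - sp)**calls * sp**(1 - calls), axis=1)
    w = l1 / (l1 + l0)
    # M-step: weighted re-estimates.
    pi = w.mean()
    se = (w[:, None] * calls).sum(0) / w.sum()
    sp = ((1 - w)[:, None] * (1 - calls)).sum(0) / (1 - w).sum()

print("estimated sensitivities:", se.round(2))   # close to the generating values
print("estimated specificities:", sp.round(2))
```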