
    Sensitivity of tropical deep convection in global models: Effects of horizontal resolution, surface constraints, and 3D atmospheric nudging

    We investigate the ability of global models to capture the spatial patterns of tropical deep convection. Sensitivity is assessed by changing horizontal resolution, applying surface flux constraints, and constraining the background atmospheric state. We assess two models at typical climate and weather forecast resolutions. Comparison with observations indicates that increasing resolution generally improves the pattern of tropical convection. When the models are constrained with realistic surface fluxes and atmospheric structure, the location of convection improves dramatically and is very similar irrespective of the resolution and parameterisations used in the models. This is the accepted version of the article 'Sensitivity of tropical deep convection in global models: effects of horizontal resolution, surface constraints and 3D atmospheric nudging', to be published in Atmospheric Science Letters; the record will be updated with the citation and DOI after publication.
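
    The 3D atmospheric nudging named in the title is commonly implemented as Newtonian relaxation of the model state toward a reference analysis. The sketch below illustrates that general technique only; the field names, array shapes, and the 6-hour relaxation timescale are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def nudge(field, reference, dt, tau=6 * 3600.0):
    """Relax a model field toward a reference analysis (Newtonian relaxation).

    field, reference : arrays of the same shape (e.g. level, lat, lon).
    dt  : model time step in seconds.
    tau : relaxation timescale in seconds (6 h here, purely illustrative).
    """
    # Discretised form of d(field)/dt = -(field - reference) / tau
    return field + dt * (reference - field) / tau

# Toy example: relax a random temperature field toward an "analysis"
rng = np.random.default_rng(0)
model_T = 280.0 + rng.normal(size=(10, 90, 180))
analysis_T = 280.0 + rng.normal(size=(10, 90, 180))
model_T = nudge(model_T, analysis_T, dt=1800.0)
print(abs(model_T - analysis_T).mean())
```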

    The Impact of Changes in Tropical Sea Surface Temperatures over 1979–2012 on Northern Hemisphere High-Latitude Climate

    While rapid changes in Arctic climate over recent decades are widely documented, the importance of different driving mechanisms is still debated. A previous study proposed a causal connection between recent tropical Pacific sea surface temperature (SST) trends and circulation changes over northern Canada and Greenland (NCG). Here, using the HadGEM3-A model, we perform a suite of sensitivity experiments to investigate the influence of tropical SSTs on winter atmospheric circulation over NCG. The experiments are forced with observed SST changes between an “early” (1979–88) and “late” period (2003–12) and applied across the entire tropics (TropSST), the tropical Pacific (PacSST), and the tropical Atlantic (AtlSST). In contrast to the previous study, all three experiments show a negative 200-hPa eddy geopotential height (Z200) anomaly over NCG in winter, which is similar to the response in AMIP experiments from four other climate models. The positive Z200 NCG anomaly in ERA-Interim between the two periods is inside the bounds of internal variability estimated from bootstrap sampling. The NCG circulation anomaly in the TropSST experiment is associated with a Rossby wave train originating from the tropical Pacific, with an important contribution coming from the tropical Atlantic SSTs connected via an atmospheric bridge through the tropical Pacific. This generates anomalous upper-level convergence and a positive Rossby wave source anomaly near the North Pacific jet exit region. Hence, while a tropics–Arctic teleconnection is evident, its influence on recent Arctic regional climate differs from observed changes and warrants further research
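
    The statement that the reanalysis Z200 anomaly lies within the bounds of internal variability rests on a bootstrap estimate of those bounds. A minimal sketch of that kind of resampling test follows; the synthetic yearly anomalies, sample sizes, and 95% interval are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical winter-mean Z200 anomalies over NCG, one value per year
early = rng.normal(0.0, 30.0, size=10)   # stand-in for the 1979-88 period
late = rng.normal(10.0, 30.0, size=10)   # stand-in for the 2003-12 period
observed_diff = late.mean() - early.mean()

# Bootstrap the late-minus-early difference by resampling pooled years,
# i.e. the spread expected from internal variability alone
n_boot = 10000
pooled = np.concatenate([early, late])
boot = np.empty(n_boot)
for i in range(n_boot):
    a = rng.choice(pooled, size=early.size, replace=True)
    b = rng.choice(pooled, size=late.size, replace=True)
    boot[i] = b.mean() - a.mean()

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"observed diff = {observed_diff:.1f}, 95% bootstrap range = [{lo:.1f}, {hi:.1f}]")
```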

    An Assessment of Recent and Future Temperature Change over the Sichuan Basin, China, using CMIP5 Climate Models

    The Sichuan basin is one of the most densely populated regions of China, making the area particularly vulnerable to the adverse impacts associated with future climate change. As such, climate models are important for understanding regional and local impacts of climate change and variability, such as heat stress and drought. In this study, climate models from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are validated over the Sichuan basin by evaluating how well each model captures the phase, amplitude, and variability of the regionally observed mean, maximum, and minimum temperature between 1979 and 2005. The results reveal that the majority of the models do not capture the basic spatial pattern and observed means, trends, and probability distribution functions. In particular, mean and minimum temperatures are underestimated, especially during the winter, resulting in biases exceeding −3°C. Models that reasonably represent the complex basin topography are found to have lower biases overall. The five most skillful climate models with respect to the regional climate of the Sichuan basin are selected to explore twenty-first-century temperature projections for the region. Under the CMIP5 high-emission climate change scenario, representative concentration pathway 8.5 (RCP8.5), temperatures are projected to increase by approximately 4°C by 2100 (an average warming rate of +0.72°C per decade), with the greatest warming located over the central plains of the Sichuan basin. Moreover, the frequency of extreme months (in which mean temperature exceeds 28°C) is shown to increase in the twenty-first century at a faster rate than in the twentieth century. Funding for this research was provided by the Engineering and Physical Sciences Research Council (EPSRC) as part of the Low Carbon Climate-Responsive Heating and Cooling of Cities (LoHCool) project (EP/N009797/1).
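
    The quoted warming rate of roughly +0.72°C per decade corresponds to a linear trend of the regional-mean temperature series, and the model biases are mean differences from observations over the historical period. A minimal sketch of both calculations follows; the synthetic series and numbers are illustrative, not CMIP5 or station data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical annual-mean temperatures (°C) for the basin, 2006-2100
years = np.arange(2006, 2101)
proj_T = 16.0 + 0.072 * (years - 2006) + rng.normal(0.0, 0.3, years.size)

# Linear trend, reported per decade
slope_per_year = np.polyfit(years, proj_T, 1)[0]
print(f"warming rate: {slope_per_year * 10:.2f} °C per decade")

# Mean bias of a model against observations over a common historical period
obs_hist = np.full(27, 16.5)    # stand-in for observed 1979-2005 annual means
mod_hist = np.full(27, 13.8)    # stand-in for modelled 1979-2005 annual means
print(f"mean bias: {mod_hist.mean() - obs_hist.mean():.1f} °C")  # negative = cold bias
```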

    Can quantitative analysis of multi-parametric MRI independently predict failure of focal salvage HIFU therapy in men with radio-recurrent prostate cancer?

    OBJECTIVES: Focal salvage HIFU is a feasible therapeutic option in some men who have recurrence after primary radiotherapy for prostate cancer. We aimed to determine if multi-parametric quantitative parameters, in addition to clinical factors, might have a role in independently predicting focal salvage HIFU outcomes. METHODS: A retrospective registry analysis included 150 consecutive men who underwent focal salvage HIFU (Sonablate500) (2006-2015); 89 had mpMRI available. Metastatic disease was excluded by nodal assessment on pelvic MRI, a radioisotope bone-scan and/or choline or FDG PET/CT scan. All men had mpMRI and either transperineal template prostate mapping biopsy or targeted and systematic TRUS-biopsy. mpMRI included T2-weighted, diffusion-weighted and dynamic contrast-enhanced sequences. Pre-HIFU quantitative mpMRI data were obtained using Horos DICOM Viewer v3.3.5 for general MRI parameters and the IB DCE v2.0 plug-in. Progression-free survival (PFS) was defined by biochemical failure and/or positive localized or distant imaging results and/or positive biopsy and/or systemic therapy and/or metastases/prostate cancer-specific death. Potential predictors of PFS were analyzed by univariable and multivariable Cox regression. RESULTS: Median age at focal salvage HIFU was 71 years (interquartile range [IQR] 65-74.5) and median PSA before focal salvage treatment was 5.8 ng/ml (IQR 3.8-8). Median follow-up was 35 months (23-47) and median time to failure was 15 months (7.8-24.3). D'Amico low-, intermediate- and high-risk disease was present in 1% (1/89), 40% (36/89) and 43% (38/89) of men prior to focal salvage HIFU (16% missing data). 56% (50/89) failed by the composite outcome. A total of 22 factors were evaluated on univariable and 8 factors on multivariable analysis. The following quantitative parameters were included: Ktrans, Kep, Ve, Vp, IS, rTTP and TTP. On univariable analysis, PSA, prostate volume at the time of radiotherapy failure and median Ve were predictors of failure. Ve represents the extracellular fraction of the whole tissue volume. On multivariable analysis, only median Ve remained as an independent predictor. CONCLUSIONS: One pharmacokinetic quantitative parameter based on DCE sequences seems to independently predict failure following focal salvage HIFU for radio-recurrent prostate cancer. This likely relates to the tumor microenvironment producing heat-sinks which counter the heating effect of HIFU. Further validation in larger datasets and evaluation of mechanisms to reduce heat-sinks are required.
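
    The univariable and multivariable analyses described here are Cox proportional hazards regressions of progression-free survival on candidate predictors. A minimal sketch using the lifelines package is shown below; the toy data frame, column names, and covariate choice are illustrative assumptions, not the study's registry data.

```python
import pandas as pd
from lifelines import CoxPHFitter  # assumes lifelines is installed

# Hypothetical records: time to failure (months), failure indicator,
# and candidate predictors (PSA, prostate volume, median Ve from DCE-MRI)
df = pd.DataFrame({
    "months": [15, 24, 35, 8, 47, 30, 12, 40, 22, 18, 36, 28],
    "failed": [1, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1],
    "psa":    [5.8, 7.2, 3.9, 9.1, 4.0, 6.5, 8.8, 3.5, 6.1, 5.2, 4.4, 7.9],
    "volume": [30, 42, 25, 50, 28, 35, 48, 26, 33, 38, 29, 45],
    "ve_med": [0.35, 0.25, 0.22, 0.48, 0.40, 0.30, 0.45, 0.19, 0.38, 0.27, 0.24, 0.33],
})

# Multivariable Cox proportional hazards model for progression-free survival
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="failed")
cph.print_summary()   # hazard ratios and p-values per covariate
```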

    A genome-wide study of Hardy–Weinberg equilibrium with next generation sequence data

    Statistical tests for Hardy–Weinberg equilibrium have been an important tool for detecting genotyping errors in the past, and remain important in the quality control of next generation sequence data. In this paper, we analyze complete chromosomes of the 1000 Genomes Project by using exact test procedures for autosomal and X-chromosomal variants. We find that the rate of disequilibrium largely exceeds what might be expected by chance alone for all chromosomes. Observed disequilibrium is, in about 60% of the cases, due to heterozygote excess. We suggest that most excess disequilibrium can be explained by sequencing problems, and hypothesize mechanisms that can explain exceptional heterozygosities. We report higher rates of disequilibrium for the MHC region on chromosome 6, regions flanking centromeres, and p-arms of acrocentric chromosomes. We also detected long-range haplotypes and areas with incidental high disequilibrium. We report disequilibrium to be related to read depth, with variants having extreme read depths being more likely to be out of equilibrium. Disequilibrium rates were found to be 11 times higher in segmental duplications and simple tandem repeat regions, and the variants with significant disequilibrium are concentrated in these areas. For next generation sequence data, Hardy–Weinberg disequilibrium seems to be a major indicator of copy number variation.
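
    The quality-control idea is to compare observed genotype counts at each variant with the counts expected under Hardy–Weinberg equilibrium given the observed allele frequency. The paper uses exact test procedures; the sketch below uses the simpler chi-square approximation purely to illustrate the comparison, and the genotype counts are made up.

```python
from scipy.stats import chisquare

def hwe_chisq(n_AA, n_AB, n_BB):
    """Chi-square test of Hardy-Weinberg equilibrium for one biallelic variant.

    Compares observed genotype counts with the counts expected under HWE
    given the observed allele frequency. (The paper itself uses exact tests;
    this approximation only illustrates the comparison being made.)
    """
    n = n_AA + n_AB + n_BB
    p = (2 * n_AA + n_AB) / (2 * n)          # frequency of allele A
    expected = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) * (1 - p)]
    stat, pval = chisquare([n_AA, n_AB, n_BB], f_exp=expected, ddof=1)
    return stat, pval

# Example: marked heterozygote excess gives a small p-value
print(hwe_chisq(n_AA=20, n_AB=70, n_BB=10))
```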

    A regional Bayesian POT model for flood frequency analysis

    Flood frequency analysis is usually based on the fitting of an extreme value distribution to the local streamflow series. However, when the local data series is short, frequency analysis results become unreliable. Regional frequency analysis is a convenient way to reduce the estimation uncertainty. In this work, we propose a regional Bayesian peaks-over-threshold (POT) model for sites with short records. This model is less restrictive than the index flood model while preserving the formalism of "homogeneous regions". The performance of the proposed model is assessed on a set of gauging stations in France. The accuracy of quantile estimates as a function of the degree of homogeneity of the pooling group is also analysed. The results indicate that the regional Bayesian model outperforms the index flood model and local estimators. Furthermore, it seems that working with relatively large and homogeneous regions may lead to more accurate results than working with smaller and highly homogeneous regions.
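
    In a peaks-over-threshold framework, flows above a high threshold are modelled with a generalized Pareto distribution, and return levels follow from the fitted distribution and the exceedance rate. The sketch below shows only this local, non-Bayesian building block; the synthetic flow series, 98th-percentile threshold, and record length are assumptions for illustration, not the French gauging data.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Hypothetical daily streamflow series (m^3/s) and a high threshold
flow = rng.gamma(shape=2.0, scale=50.0, size=20 * 365)   # ~20 years of data
threshold = np.quantile(flow, 0.98)
exceedances = flow[flow > threshold] - threshold

# Fit a generalized Pareto distribution to the exceedances (location fixed at 0)
shape, loc, scale = genpareto.fit(exceedances, floc=0.0)

# T-year flood quantile: exceedance rate lam (events/year), return period T (years)
lam = exceedances.size / 20.0
T = 100.0
q100 = threshold + genpareto.ppf(1.0 - 1.0 / (lam * T), shape, loc=0.0, scale=scale)
print(f"estimated 100-year flood: {q100:.0f} m^3/s")
```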

    Seasonal Arctic sea ice forecasting with probabilistic deep learning

    Anthropogenic warming has led to an unprecedented year-round reduction in Arctic sea ice extent. This has far-reaching consequences for indigenous and local communities, polar ecosystems, and global climate, motivating the need for accurate seasonal sea ice forecasts. While physics-based dynamical models can successfully forecast sea ice concentration several weeks ahead, they struggle to outperform simple statistical benchmarks at longer lead times. We present a probabilistic, deep learning sea ice forecasting system, IceNet. The system has been trained on climate simulations and observational data to forecast the next 6 months of monthly-averaged sea ice concentration maps. We show that IceNet advances the range of accurate sea ice forecasts, outperforming a state-of-the-art dynamical model in seasonal forecasts of summer sea ice, particularly for extreme sea ice events. This step-change in sea ice forecasting ability brings us closer to conservation tools that mitigate risks associated with rapid sea ice loss
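
    A probabilistic forecast of this kind outputs, for each grid cell and lead time, a probability distribution over discrete sea ice concentration classes rather than a single value. The sketch below shows how such per-pixel class probabilities might be handled once a network has produced logits; the grid size, number of classes, and random logits are assumptions for illustration, not IceNet's actual architecture or output.

```python
import numpy as np

def softmax(logits, axis=-1):
    """Numerically stable softmax over the given axis."""
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical network output: per-pixel logits over discrete ice classes
# for each of the next 6 monthly lead times, shaped (lead, y, x, class)
n_lead, ny, nx, n_class = 6, 432, 432, 3
logits = np.random.default_rng(0).normal(size=(n_lead, ny, nx, n_class))

probs = softmax(logits)                 # class probabilities per pixel
p_ice = probs[..., 1:].sum(axis=-1)     # probability of any ice cover (classes > 0)
forecast_class = probs.argmax(axis=-1)  # most likely class per pixel
print(p_ice.shape, forecast_class.shape)
```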

    Risk-Sensitive Mean-Field Type Control under Partial Observation

    We establish a stochastic maximum principle (SMP) for control problems of partially observed diffusions of mean-field type with risk-sensitive performance functionals. Comment: arXiv admin note: text overlap with arXiv:1404.144
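
    Risk-sensitive control replaces the usual expected cost by an exponential-of-integral criterion, and the mean-field dependence enters through the law (here, for simplicity, the expectation) of the state. A generic form of such a performance functional is sketched below in standard notation; the symbols are illustrative and not taken from the paper.

```latex
J^{\theta}(u) \;=\; \mathbb{E}\!\left[\exp\!\left(\theta\left(\int_0^T f\bigl(t, x_t, \mathbb{E}[x_t], u_t\bigr)\,dt \;+\; g\bigl(x_T, \mathbb{E}[x_T]\bigr)\right)\right)\right]
```

    Here θ is the risk-sensitivity parameter, and under partial observation the control u may depend only on a noisy observation process of the state x rather than on x itself.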

    Whole genome association mapping by incompatibilities and local perfect phylogenies

    BACKGROUND: With current technology, vast amounts of data can be cheaply and efficiently produced in association studies, and to prevent data analysis from becoming the bottleneck of such studies, fast and efficient analysis methods that scale to these data set sizes must be developed. RESULTS: We present a fast method for accurate localisation of disease-causing variants in high density case-control association mapping experiments with large numbers of cases and controls. The method searches for significant clustering of case chromosomes in the "perfect" phylogenetic tree defined by the largest region around each marker that is compatible with a single phylogenetic tree. This perfect phylogenetic tree is treated as a decision tree for determining disease status, and scored by its accuracy as a decision tree. The rationale is that the perfect phylogeny near a disease-affecting mutation should provide more information about the affected/unaffected classification than random trees. If regions of compatibility contain few markers, for example because of large marker spacing, the algorithm can allow the inclusion of incompatible markers in order to enlarge the regions prior to estimating their phylogeny. Haplotype data and phased genotype data can be analysed. The power and efficiency of the method are investigated on 1) simulated genotype data under different models of disease determination, 2) artificial data sets created from the HapMap resource, and 3) data sets used for testing of other methods, in order to compare with these. Our method has the same accuracy as single marker association (SMA) in the simplest case of a single disease-causing mutation and a constant recombination rate. However, in more complex scenarios of mutation heterogeneity and more complex haplotype structure, such as found in the HapMap data, our method outperforms SMA as well as other fast data mining approaches such as HapMiner and Haplotype Pattern Mining (HPM), despite being significantly faster. For unphased genotype data, an initial step of estimating the phase only slightly decreases the power of the method. The method was also found to accurately localise a known susceptibility variant in an empirical data set (the ΔF508 mutation for cystic fibrosis) and to find significant signals for association between the CYP2D6 gene and poor drug metabolism, although for this dataset the highest association score is about 60 kb from the CYP2D6 gene. CONCLUSION: Our method has been implemented in the Blossoc (BLOck aSSOCiation) software. Using Blossoc, genome-wide chip-based surveys of 3 million SNPs in 1000 cases and 1000 controls can be analysed in less than two CPU hours.
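
    The core scoring idea is to treat the local perfect phylogeny as a decision tree: chromosomes falling in the same part of the tree predict a common disease status, and the tree is scored by how well that prediction matches the observed case/control labels. The toy sketch below scores a flat set of clades by majority-vote accuracy; it only illustrates the idea and is not the actual Blossoc scoring function.

```python
from collections import Counter

def clade_accuracy(clade_labels, case_status):
    """Score a local tree (represented by its leaf clades) as a decision tree.

    clade_labels : clade assignment of each chromosome (a leaf of the local tree).
    case_status  : 1 for case chromosomes, 0 for controls.
    Each clade predicts the majority status of the chromosomes it contains;
    the score is the overall classification accuracy of those predictions.
    """
    by_clade = {}
    for clade, status in zip(clade_labels, case_status):
        by_clade.setdefault(clade, []).append(status)
    correct = sum(max(Counter(statuses).values()) for statuses in by_clade.values())
    return correct / len(case_status)

# Toy example: clade "A" is enriched for case chromosomes
clades = ["A", "A", "A", "B", "B", "C", "C", "C"]
status = [1,   1,   1,   0,   0,   0,   1,   0]
print(clade_accuracy(clades, status))   # 7/8 = 0.875
```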

    Short-term stability in refractive status despite large fluctuations in glucose levels in diabetes mellitus type 1 and 2

    Purpose: This work investigates how short-term changes in blood glucose concentration affect the refractive components of the diabetic eye in patients with long-term Type 1 and Type 2 diabetes. Methods: Blood glucose concentration, refractive error components (mean spherical equivalent MSE, J0, J45), central corneal thickness (CCT), anterior chamber depth (ACD), crystalline lens thickness (LT), axial length (AL) and ocular aberrations were monitored at two-hourly intervals over a 12-hour period in: 20 T1DM patients (mean age ± SD 38 ± 14 years, baseline HbA1c 8.6 ± 1.9%); 21 T2DM patients (mean age ± SD 56 ± 11 years, HbA1c 7.5 ± 1.8%); and in 20 control subjects (mean age ± SD 49 ± 23 years, HbA1c 5.5 ± 0.5%). The refractive and biometric results were compared with the corresponding changes in blood glucose concentration. Results: Blood glucose concentration at different times was found to vary significantly within the diabetic groups (p < 0.05), whereas the refractive error components and ocular aberrations showed no statistically significant changes (p > 0.05). Minor changes of marginal statistical or optical significance were observed in some biometric parameters. Similarly, there were some marginally significant differences between the baseline biometric parameters of well-controlled and poorly-controlled diabetic subjects. Conclusion: This work suggests that normal, short-term fluctuations (of up to about 6 mmol/l on a timescale of a few hours) in the blood glucose levels of diabetics are not usually associated with acute changes in refractive error or ocular wavefront aberrations. It is therefore possible that factors other than refractive error fluctuations are sometimes responsible for the transient visual problems often reported by diabetic patients.
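
    The refractive error components tracked here (MSE, J0, J45) are the standard power-vector decomposition of a sphero-cylindrical refraction. A small sketch of that conversion follows, using the usual power-vector convention; the example prescription is purely illustrative.

```python
import math

def power_vector(sphere, cyl, axis_deg):
    """Convert a sphero-cylindrical refraction to power-vector components.

    Returns (M, J0, J45) under the usual power-vector convention:
      M   = sphere + cyl / 2            (mean spherical equivalent)
      J0  = -(cyl / 2) * cos(2 * axis)  (with/against-the-rule astigmatism)
      J45 = -(cyl / 2) * sin(2 * axis)  (oblique astigmatism)
    """
    a = math.radians(axis_deg)
    m = sphere + cyl / 2.0
    j0 = -(cyl / 2.0) * math.cos(2.0 * a)
    j45 = -(cyl / 2.0) * math.sin(2.0 * a)
    return m, j0, j45

# Example: -2.00 / -1.00 x 180 gives M = -2.50 D, J0 = +0.50 D, J45 = 0.00 D
print(power_vector(-2.00, -1.00, 180))
```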