
    Inferring mechanisms of copy number change from haplotype structures at the human DEFA1A3 locus

    Background: The determination of structural haplotypes at copy number variable regions can indicate the mechanisms responsible for changes in copy number, as well as explain the relationship between gene copy number and expression. However, obtaining spatial information for regions displaying extensive copy number variation, such as the DEFA1A3 locus, is complex because of the difficulty of phasing and assembling these regions. The DEFA1A3 locus is intriguing in that it falls within a region of high linkage disequilibrium despite its high variability in copy number (n = 3–16); hence, the mechanisms responsible for changes in copy number at this locus are unclear. Results: In this study, a region flanking the DEFA1A3 locus was sequenced across 120 independent haplotypes of European ancestry, identifying five common classes of DEFA1A3 haplotype. Assigning DEFA1A3 classes to haplotypes within the 1000 Genomes Project highlights a significant difference in DEFA1A3 class frequencies between populations of different ancestry. The features of each DEFA1A3 class, for example the associated DEFA1A3 copy numbers, were initially assessed in a European cohort (n = 599) and replicated in the 1000 Genomes samples, showing within-class similarity but between-class and between-population differences in the features of the DEFA1A3 locus. Emulsion haplotype fusion-PCR was used to generate 61 structural haplotypes at the DEFA1A3 locus, showing high within-class similarity in structure. Conclusions: Structural haplotypes across the DEFA1A3 locus indicate that intra-allelic rearrangement is the predominant mechanism responsible for changes in DEFA1A3 copy number, explaining the conservation of linkage disequilibrium across the locus. The identification of common structural haplotypes at the DEFA1A3 locus could aid studies into how DEFA1A3 copy number influences expression, which is currently unclear.

    Effect of Nuclear Quadrupole Interaction on the Relaxation in Amorphous Solids

    Recently it has been experimentally demonstrated that certain glasses display an unexpected magnetic field dependence of the dielectric constant. In particular, echo-technique experiments have shown that the echo amplitude depends on the magnetic field. Analysis of these experiments leads to the conclusion that the effect is related to the nuclear degrees of freedom of tunneling systems. The interaction of a nuclear electric quadrupole moment with the crystal field, and of a nuclear magnetic moment with the magnetic field, transforms the two-level tunneling systems inherent in amorphous dielectrics into many-level tunneling systems. The fact that these features show up at temperatures T < 100 mK, where the properties of amorphous materials are governed by the long-range R^{-3} interaction between tunneling systems, suggests that this interaction is responsible for the magnetic-field-dependent relaxation. We have developed a theory of many-body relaxation in an ensemble of interacting many-level tunneling systems and show that the relaxation rate is controlled by the magnetic field. The results obtained correlate with the available experimental data. Our approach strongly supports the idea that the nuclear quadrupole interaction is the key to understanding the unusual behavior of glasses in a magnetic field.

    Comparison of BMD changes and bone formation marker levels 3 years after bisphosphonate discontinuation: FLEX and HORIZON-PFT Extension I trials

    An ASBMR task force recommends a drug holiday for certain women treated for ≥5 years with oral alendronate or ≥3 years with intravenous zoledronic acid, with reassessment 2-3 years later. It is not known whether changes in BMD or bone turnover markers differ after oral or intravenous therapy. Our goal was to compare changes in BMD and procollagen type I N-propeptide (PINP) after oral or intravenous bisphosphonate use. In the Fracture Intervention Trial Long-term Extension (FLEX), women who had received a mean of 5 years of alendronate were randomized to placebo or continued treatment. In the Health Outcomes and Reduced Incidence with Zoledronic acid Once Yearly-Pivotal Fracture Trial Extension I (HORIZON-PFT E1), women who had received 3 years of zoledronic acid were randomized to placebo or continued treatment. We examined the proportion of participants with BMD loss or PINP gain ≥ the least significant change (LSC), and those whose values exceeded a threshold (T-score ≤ -2.5 or PINP ≥ 36.0 ng/mL, a premenopausal median value). After 3 years of placebo, the FLEX group had greater mean total hip BMD decreases (-2.3% versus -1.2% in the HORIZON-PFT E1 group, p < 0.01) and greater rises in PINP (+11.6 ng/mL versus +6.7 ng/mL, p < 0.01). A greater proportion of individuals in FLEX had total hip BMD loss and PINP increases exceeding the LSC, and PINP values ≥36.0 ng/mL. In contrast, there were only small changes in the proportion of women with femoral neck T-scores ≤ -2.5 in both groups. In conclusion, 3 years after bisphosphonate discontinuation, a considerable proportion of former alendronate and zoledronic acid users had meaningful declines in total hip BMD and elevations in PINP. Despite a longer treatment course, alendronate may have a more rapid offset of drug effect than zoledronic acid.
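
For concreteness, a minimal sketch of the exceedance flags compared between the two placebo arms. The LSC values below are illustrative placeholders (the trials' actual least-significant-change figures are not given in this summary); the PINP and T-score cut-offs are the ones quoted above.

```python
# Minimal sketch of the exceedance flags described above. The LSC values are
# placeholders, not the trials' actual figures; the PINP and T-score
# thresholds are the ones quoted in the abstract.

PINP_PREMENOPAUSAL_MEDIAN = 36.0   # ng/mL
FEMORAL_NECK_T_CUTOFF = -2.5

def exceedance_flags(bmd_change_pct, pinp_ng_ml, pinp_change_ng_ml, t_score,
                     lsc_bmd_pct=3.0, lsc_pinp_ng_ml=10.0):  # placeholder LSCs
    """Flags of the kind compared between the FLEX and HORIZON-PFT E1 arms."""
    return {
        "bmd_loss_ge_lsc": bmd_change_pct <= -lsc_bmd_pct,
        "pinp_gain_ge_lsc": pinp_change_ng_ml >= lsc_pinp_ng_ml,
        "pinp_ge_premenopausal_median": pinp_ng_ml >= PINP_PREMENOPAUSAL_MEDIAN,
        "t_score_le_cutoff": t_score <= FEMORAL_NECK_T_CUTOFF,
    }
```

Averaging each flag over a cohort gives the proportions of participants exceeding each criterion, which is the quantity the two trials' placebo arms are compared on.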

    Motion integration using competitive priors

    Psychophysical experiments show that humans are better at perceiving rotation and expansion than translation [5][9]. These findings are inconsistent with standard models of motion integration, which predict the best performance for translation. To explain this discrepancy, our theory formulates motion perception at two levels of inference: we first perform model selection between the competing models (e.g. translation, rotation, and expansion) and then estimate the velocity using the selected model. We define novel prior models for smooth rotation and expansion using techniques similar to those in the slow-and-smooth model [23] (e.g. Green functions of differential operators). The theory gives good agreement with the trends observed in four human experiments.
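
A minimal numerical sketch of this two-level inference, under simplifying assumptions that are not in the paper: a known centre of rotation/expansion at the origin, and the least-squares residual as a crude stand-in for the model evidence.

```python
import numpy as np

def fit_and_select(pts, vels):
    """Fit translation/rotation/expansion flows to measured local velocities
    and pick the best model by sum of squared errors (SSE).
    pts, vels: (N, 2) arrays of positions and noisy velocity measurements."""
    preds = {}
    # Translation: v(x) = t, fitted as the mean measured velocity.
    preds["translation"] = np.broadcast_to(vels.mean(axis=0), vels.shape)
    # Rotation about the origin: v(x) = w * perp(x), with perp(x) = (-y, x).
    perp = np.stack([-pts[:, 1], pts[:, 0]], axis=1)
    w = (vels * perp).sum() / (perp ** 2).sum()
    preds["rotation"] = w * perp
    # Expansion about the origin: v(x) = s * x.
    s = (vels * pts).sum() / (pts ** 2).sum()
    preds["expansion"] = s * pts
    sse = {m: ((vels - p) ** 2).sum() for m, p in preds.items()}
    best = min(sse, key=sse.get)
    return best, preds[best], sse

# Usage: noisy rotational flow should be classified as "rotation", and the
# velocity is then estimated under the selected model.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(200, 2))
true = 0.5 * np.stack([-pts[:, 1], pts[:, 0]], axis=1)
best, v_hat, sse = fit_and_select(pts, true + 0.05 * rng.normal(size=true.shape))
```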

    Accurate measurement of gene copy number for human alpha-defensin DEFA1A3

    Background: Multi-allelic copy number variants include examples of extensive variation between individuals in the copy number of important genes, most notably genes involved in immune function. The definition of this variation, and analysis of its impact on function, has been hampered by the technical difficulty of large-scale but accurate typing of genomic copy number. The copy-variable alpha-defensin locus DEFA1A3 on human chromosome 8 commonly varies between 4 and 10 copies per diploid genome, and presents considerable challenges for accurate high-throughput typing. Results: In this study, we developed two paralogue ratio tests and three allelic ratio measurements that, in combination, provide an accurate and scalable method for measuring DEFA1A3 gene number. We combined the information from the different measurements in a maximum-likelihood framework, which shows that most samples can be assigned an integer copy number with high confidence, and applied it to type 589 unrelated European DNA samples. Typing the members of three-generation pedigrees provided further reassurance that correct integer copy numbers had been assigned. Our results allowed us to discover that the SNP rs4300027 is strongly associated with DEFA1A3 gene copy number in European samples. Conclusions: We have developed an accurate and robust method for measuring DEFA1A3 copy number. Interrogation of rs4300027 and associated SNPs in genome-wide association study SNP data provides no evidence that alpha-defensin copy number is a strong risk factor for phenotypes such as Crohn's disease, type I diabetes, HIV progression and multiple sclerosis.
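
A minimal sketch of integer copy-number calling in the spirit of the maximum-likelihood framework described above, assuming Gaussian measurement noise and a uniform prior over 4-10 copies; the authors' actual likelihood model is not specified in this summary.

```python
import numpy as np

def call_copy_number(measurements, sigma=0.4, candidates=range(4, 11),
                     min_posterior=0.95):
    """Combine independent raw copy-number estimates (e.g. from PRT and
    allelic-ratio assays) into an integer call with a confidence value.
    sigma is an assumed, illustrative per-measurement noise level."""
    m = np.asarray(measurements, float)
    cns = np.array(list(candidates), float)
    # Joint Gaussian log-likelihood of all measurements per candidate integer.
    loglik = (-0.5 * ((m[:, None] - cns[None, :]) / sigma) ** 2).sum(axis=0)
    post = np.exp(loglik - loglik.max())      # uniform prior -> normalise
    post /= post.sum()
    best = int(cns[post.argmax()])
    return best, float(post.max()), bool(post.max() >= min_posterior)

# Usage: three noisy assays near 7 copies yield an integer call and a flag
# indicating whether the posterior clears the confidence threshold.
cn, conf, confident = call_copy_number([6.8, 7.2, 7.1])
```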

    Superconductivity in graphene stacks: from the bilayer to graphite

    We study the superconducting phase transition both in a graphene bilayer and in graphite. For that purpose we derive the mean-field effective potential for a stack of graphene layers with hopping between adjacent sheets. To describe superconductivity, we assume an on-site attractive interaction between electrons and determine the superconducting critical temperature as a function of the chemical potential. The resulting curve is dome-shaped, in agreement with previous results for two-dimensional Dirac fermions. We show that hopping between adjacent layers increases the critical temperature for small values of the chemical potential. Finally, we consider a minimal model for graphite and show that its transition temperature is higher than that of the graphene bilayer for small values of the chemical potential. This might explain why intrinsic superconductivity is observed in graphite.
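
A schematic numerical sketch, not the paper's mean-field calculation: a BCS-type critical-temperature condition with a Dirac-like linear density of states and a hard cutoff, solved by bisection while scanning the chemical potential. The coupling and cutoff values are illustrative.

```python
import numpy as np

def tc_kernel(T, mu, g=1.5, cutoff=1.0, n=4001):
    """Right-hand side of the BCS-type condition
    1 = (g/2) * integral of N(e) * tanh((e-mu)/2T) / (e-mu) de,
    with N(e) = |e|/cutoff**2 mimicking a Dirac spectrum (illustrative)."""
    e = np.linspace(-cutoff, cutoff, n)
    x = (e - mu) / (2.0 * T)
    ratio = np.ones_like(x)                   # tanh(x)/x -> 1 at x = 0
    nz = np.abs(x) > 1e-12
    ratio[nz] = np.tanh(x[nz]) / x[nz]
    integrand = 0.5 * g * (np.abs(e) / cutoff**2) * ratio / (2.0 * T)
    return integrand.sum() * (e[1] - e[0])

def tc_of_mu(mu, lo=1e-3, hi=2.0):
    """Bisect for the T where the kernel crosses 1 (kernel decreases with T)."""
    if tc_kernel(lo, mu) < 1.0:
        return 0.0                            # no pairing above the lowest T probed
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if tc_kernel(mid, mu) > 1.0 else (lo, mid)
    return 0.5 * (lo + hi)

# Scanning tc_of_mu over mu in (0, cutoff) traces out a T_c(mu) curve of the
# dome-shaped kind described in the abstract.
```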

    Evaluating G2G for use in Rapid Response Catchments: Final Report

    Flood impacts can be severe for rapid response catchments (RRCs). Providing targeted flood warnings is challenging using existing methodologies, in part on account of the typical absence of river flow gauging. The Pitt Review of the Summer 2007 floods recognised the need for new alert procedures for RRCs able to exploit the new distributed flood forecasting capability being progressed from research into operations. Work on the G2G (Grid-to-Grid) distributed hydrological model was accelerated into operational practice to support 5-day countrywide flood outlooks, a major recommendation of the Pitt Review. The present study aims to explore the potential of G2G to support more frequent and detailed alerts relevant to flood warning in RRCs. Integral to this study is the use of emerging rainfall forecast products, in deterministic and ensemble form, which allow the lead-time of G2G flow forecasts to be extended and given an uncertainty context.

This report sets down the overall scope of the project, provides an introduction to G2G by way of background, and then reports on the outcomes of the R&D study. This includes extensive preparatory work on collating historical datasets to support G2G model assessment, relating both to hydrometry and to new rainfall forecast products. A framework is developed for assessing G2G in both simulation-mode and forecast-mode (as a function of lead-time), targeted at the RRC requirement. Relevant to the requirement are the RRC Register of points and areas of interest compiled by the Environment Agency, and the characteristics of RRCs (occurring in isolation or in combination): small catchment area, urban/suburban land-cover and steep slopes.

The assessment framework is first applied assuming perfect knowledge of rainfall observations for past and future times, so as not to confound the analysis with errors from rainfall forecasts. Variability of performance measures across groups of sites is summarised through box-and-whisker plots, groups being differentiated by size of catchment area and nature of the G2G run (simulation, and with the addition of state updating and flow insertion in turn). Skill scores judge how well the model performs in detecting a flood event exceeding a flow threshold, taken as the median annual flood (an indicator of bankfull flow exceedance for natural channels) and fractional multipliers of it. The skill scores include POD (Probability of Detection) and FAR (False Alarm Ratio); a toy computation of both is sketched below. Performance maps of R2 Efficiency, indicating the variability in the observations accounted for by the model, are used to portray the spatial variability of G2G accuracy across the country. G2G performance in small catchments, relevant to the RRC requirement, is best over the South West, North East and North West regions; median performance also appears robust from one year to the next. Larger catchments benefit most in forecast-mode from flow insertion, whilst smaller headwater catchments gain particularly from ARMA (AutoRegressive Moving Average) error-prediction.

An assessment is made of using deterministic rainfall forecasts from NWP UKV (the Numerical Weather Prediction UK Variable Resolution form of the Met Office Unified Model) in a full emulation of G2G in real-time, using foreknowledge of rainfall observations as a reference baseline. Forecast quality can deteriorate strongly beyond 12 hours, especially for smaller catchments, whilst for some locations good performance is maintained even at long lead-times.
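
A toy illustration (not the report's implementation) of the two categorical skill scores named above, computed from paired forecast/observed threshold exceedances:

```python
import numpy as np

def pod_far(forecast_exceeds, observed_exceeds):
    """POD = hits / (hits + misses); FAR = false alarms / (hits + false alarms).
    Inputs are boolean arrays: did the forecast/observation exceed the chosen
    flow threshold (e.g. the median annual flood or a fraction of it)?"""
    f = np.asarray(forecast_exceeds, bool)
    o = np.asarray(observed_exceeds, bool)
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    pod = hits / (hits + misses) if hits + misses else np.nan
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan
    return pod, far
```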
Diagnostic analysis reveals that the UKV rainfall forecasts have patterns of overestimation in some lowland areas (e.g. over London) and leeward of high-elevation areas (e.g. the north and south Pennines). Overall performance is better in Scotland, although there is evidence of UKV overestimating rainfall near the coast at Edinburgh and at Elgin in the north.

The assessment framework is extended to include rainfall forecast ensembles and probabilistic flood forecasting, using a combination of case-study and longer-term analyses. Blended Ensemble rainfall forecasts are assessed in two forms: forecasts out to 24 hours updated 4 times a day, and nowcasts out to 7 hours updated every 15 minutes. The 24-hour forecasts generally perform well as input to G2G in the case studies, the G2G flow forecasts typically signalling a flood peak 12 to 18 hours in advance and ahead of any observed response for small catchments. New regional summary map displays of the probability of flow-threshold exceedances over a forecast horizon, and for increasing levels of severity, are developed to highlight evolving hotspots of flood risk over time.

The first ever continuous assessment of G2G probabilistic flow forecasts is reported, using national maps of probabilistic skill scores - the Relative Operating Characteristic (ROC) Skill Score and the Brier Skill Score (BSS) - to assess their performance spatially; a toy BSS computation is sketched after the bullet summary below. It is noted that the short periods available for assessment - a 7½-month period over England & Wales and 4½ months over Scotland - limit the analyses to low return-period flow thresholds. Half the median (2-year) flood is used, although a regional pooled analysis allows some assessment up to the 5-year level. The G2G probability forecast assessed is the probability of the chosen flow threshold being exceeded at any time over the forecast horizon (taken to be 24 hours). Comparison of these scores when applied to deterministic and probabilistic forecasts from G2G provides strong evidence of the value of G2G ensemble forecasts as an indicator of flood risk over Britain. Noticeably poorer performance indicated by the BSS across Scotland is in part attributed to the short, summer-dominated assessment period.

Operational tools available to the FFC and the SFFS for using G2G flow ensembles are reviewed, and options for improvement are identified drawing on the experience and findings of the study. This leads to identifying some work of an operational nature for consideration in Phase 3 of the project. The report closes with a summary of project achievements grouped thematically, a set of recommendations both of a general nature and specific to FFC and SFFS needs, and finally some proposals for consideration under Phase 3 of the G2G for Rapid Response Catchments project. Some key benefits arising from the project are summarised below.

• Evidence has been produced that shows G2G has good skill in providing strategic forecasts for RRCs. The evidence is stratified by catchment type (area, urbanisation, headwater), form of forecast (simulation or forecast mode) and nature of rainfall input (raingauge, deterministic forecast, ensemble forecast).
• Strong evidence has been presented on the advantage of using an ensemble rainfall forecast as input to G2G to obtain a probabilistic flood forecast for an RRC, relative to an approach where only a single deterministic rainfall and flood forecast is obtained.
This indicates that better guidance can be given on forecast flood risk for RRCs, improving the level of service for such catchments, which are currently not well served.
• An improved G2G model configuration, exploiting gauged flows from 912 sites and including new locally calibrated parameters, has been delivered and made operational for the FFC with England & Wales coverage. The benefit is improved operational flood forecast accuracy. For Scotland, an enhanced configuration will be delivered to the SFFS in Spring 2014.
• Detailed recommendations on how the visual presentation of G2G ensemble results could be improved are set down in this report. When further developed and implemented, these will benefit the preparation of Flood Guidance Statements issued by the FFC and the SFFS across Britain.
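
As referenced above, a toy illustration of the Brier Skill Score used in the continuous probabilistic assessment, taking the observed climatological base rate as the reference forecast; the report's own implementation may differ.

```python
import numpy as np

def brier_skill_score(p, o):
    """BSS for probabilistic threshold-exceedance forecasts.
    p: ensemble probability that the flow threshold is exceeded within the
       forecast horizon; o: binary observed outcome (1 = exceeded).
    Reference forecast: the observed climatological base rate; BSS > 0 means
    the probabilistic forecast beats that reference."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    bs = np.mean((p - o) ** 2)                # Brier score of the forecast
    bs_ref = np.mean((o.mean() - o) ** 2)     # Brier score of climatology
    return 1.0 - bs / bs_ref if bs_ref > 0 else np.nan
```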

    Probing RS scenarios of flavour at LHC via leptonic channels

    We study a purely leptonic signature of the Randall-Sundrum scenario with Standard Model fields in the bulk at the LHC: the contribution from the exchange of Kaluza-Klein (KK) excitations of gauge bosons to the clean Drell-Yan reaction. We show that this contribution is detectable (even with the low luminosities of the LHC initial regime) for KK masses around the TeV scale and for sufficiently large lepton couplings to KK gauge bosons. Such large couplings can be compatible with electroweak precision data on the Zff coupling in the framework of the recently proposed custodial O(3) symmetry, for specific configurations of lepton localizations along the extra dimension. These configurations can simultaneously reproduce the correct lepton masses while generating acceptably small Flavour Changing Neutral Current (FCNC) effects. This LHC phenomenological analysis is realistic in the sense that it is based on fermion localizations that reproduce all the quark/lepton masses and mixing angles and respect FCNC constraints in both the hadron and lepton sectors.

    Modeling the non-Markovian, non-stationary scaling dynamics of financial markets

    A central problem of Quantitative Finance is that of formulating a probabilistic model of the time evolution of asset prices that allows reliable predictions of their future volatility. As in several natural phenomena, the predictions of such a model must be compared with the data of a single process realization in our records. In order to give statistical significance to such a comparison, assumptions of stationarity for some quantities extracted from the single historical time series, such as the distribution of returns over a given time interval, cannot be avoided. Such assumptions entail the risk of masking or misrepresenting non-stationarities of the underlying process, and of giving an incorrect account of its correlations. Here we overcome this difficulty by showing that five years of daily Euro/US-Dollar trading records, restricted to roughly the three hours following the New York market opening, provide a rich enough ensemble of histories. The statistics of this ensemble allow us to propose and test an adequate model of the stochastic process driving the exchange rate. This turns out to be a non-Markovian, self-similar process with non-stationary returns. The empirical ensemble correlators are in agreement with the predictions of this model, which is constructed on the basis of the time-inhomogeneous, anomalous scaling obeyed by the return distribution.
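
A minimal sketch, not the authors' analysis, of the ensemble-of-histories idea: each trading day contributes one realization, and return statistics are computed across days at a fixed time of day rather than along a single long series.

```python
import numpy as np

def ensemble_returns(prices, lag):
    """prices: (n_days, n_ticks) array of intra-window prices, one row per
    trading day's first ~3 hours. Returns one log-return per day, measured
    from the window start over `lag` ticks, so the statistics are taken
    across the ensemble of daily histories."""
    logp = np.log(prices)
    return logp[:, lag] - logp[:, 0]

# The dependence of np.var(ensemble_returns(prices, lag)) on lag probes the
# time-inhomogeneous, anomalous scaling of the return distribution discussed
# above, without assuming stationarity along a single series.
```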

    A Review of Controlling Motivational Strategies from a Self-Determination Theory Perspective: Implications for Sports Coaches

    The aim of this paper is to present a preliminary taxonomy of six controlling strategies, based primarily on the parental and educational literatures, which we believe are employed by coaches in sport contexts. Research in the sport and physical education literature has primarily focused on coaches' autonomy-supportive behaviours; surprisingly, there has been very little research on the use of controlling strategies. A brief overview of the research delineating each proposed strategy is presented, as are examples of the potential manifestation of the behaviours associated with each strategy in the context of sports coaching. In line with self-determination theory (Deci & Ryan, 1985; Ryan & Deci, 2002), we propose that coach behaviours employed to pressure or control athletes have the potential to thwart athletes' feelings of autonomy, competence, and relatedness, which, in turn, undermine athletes' self-determined motivation and contribute to the development of controlled motives. When athletes feel pressured to behave in a certain way, a variety of negative consequences are expected to ensue, to the detriment of the athletes' well-being. The purpose of this paper is to raise awareness of and interest in the darker side of sport participation and to offer suggestions for future research in this area.