
    Erratum to: Can dynamic light improve melatonin production and quality of sleep?


    Impact of calcium on salivary α-amylase activity, starch paste apparent viscosity and thickness perception

    Thickness perception of starch-thickened products during eating has been linked to starch viscosity and salivary amylase activity. Calcium is an essential cofactor for α-amylase, and there is anecdotal evidence that adding extra calcium affects amylase activity in processes such as the mashing of beer. The aims of this paper were (1) to investigate the role of salivary calcium in α-amylase activity and (2) to measure the effect of calcium concentration on apparent viscosity and thickness perception when calcium interacts with salivary α-amylase in starch-based samples. α-Amylase activity in saliva samples from 28 people was assessed using a typical starch pasting cycle (up to 95 °C). The activity of the enzyme (as measured by the change in starch apparent viscosity) was maintained by the presence of calcium, probably because calcium protects the enzyme from heat denaturation. Enhancement of α-amylase activity by calcium at 37 °C was also observed, although to a smaller extent. Sensory analysis showed a general trend of decreased thickness perception in the presence of calcium, but the result was significant for only one pair of samples, suggesting a limited impact of calcium-enhanced enzyme activity on perceived thickness.

    Conformal Symmetry of a Black Hole as a Scaling Limit: A Black Hole in an Asymptotically Conical Box

    We show that the previously obtained subtracted geometry of four-dimensional asymptotically flat multi-charged rotating black holes, whose massless wave equation exhibits $SL(2,\mathbb{R}) \times SL(2,\mathbb{R}) \times SO(3)$ symmetry, may be obtained by a suitable scaling limit of certain asymptotically flat multi-charged rotating black holes, which is reminiscent of near-extreme black holes in the dilute gas approximation. The co-homogeneity-two geometry is supported by a dilaton field and two (electric) gauge-field strengths. We also point out that these subtracted geometries can be obtained as a particular Harrison transformation of the original black holes. Furthermore, the subtracted metrics are asymptotically conical (AC), like global monopoles, thus describing "a black hole in an AC box". Finally, we account for the emergence of the $SL(2,\mathbb{R}) \times SL(2,\mathbb{R}) \times SO(3)$ symmetry as a consequence of the subtracted metrics being Kaluza-Klein type quotients of $AdS_3 \times S^3$. We demonstrate that similar properties hold for five-dimensional black holes. Comment: Sections 3 and 4 significantly augmented.
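    For orientation, here is a hedged sketch of what "subtraction" means in this context, following the general form used in the subtracted-geometry literature rather than anything reproduced from this paper. A static four-dimensional black hole can be written with a warp factor $\Delta(r)$,

    $$ ds^2 = -\Delta^{-1/2} X\, dt^2 + \Delta^{1/2}\left( \frac{dr^2}{X} + d\theta^2 + \sin^2\theta\, d\phi^2 \right), \qquad X = r^2 - 2mr, $$

    and the subtracted geometry replaces the asymptotically flat warp factor with one that alters the environment far from the hole while preserving the near-horizon region; in the Schwarzschild case, $\Delta_0 = r^4$ is replaced by $\Delta_{\rm sub} = (2m)^3 r$, which produces the asymptotically conical behaviour described above.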

    Combining Experiments and Simulations Using the Maximum Entropy Principle

    A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus on how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights into the problem. We highlight each of these contributions in turn and conclude with a discussion of the remaining challenges.
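    To make the idea concrete, here is a minimal sketch (not the authors' code; the observable, the root-finding bracket, and the synthetic data are all hypothetical) of maximum-entropy reweighting: given per-frame values of an observable from a simulation, find the minimally perturbed frame weights, in the relative-entropy sense, whose average matches an experimental value.

    import numpy as np
    from scipy.optimize import brentq

    def maxent_weights(obs, target):
        # Weights w_i proportional to exp(-lam * obs_i), with lam chosen so
        # that the weighted average of obs equals the experimental target;
        # this is the minimal relative-entropy perturbation of uniform weights.
        centered = obs - obs.mean()           # shift for numerical stability
        def residual(lam):
            w = np.exp(-lam * centered)
            w /= w.sum()
            return w @ obs - target
        lam = brentq(residual, -50.0, 50.0)   # assumes target lies inside the sampled range
        w = np.exp(-lam * centered)
        return w / w.sum()

    rng = np.random.default_rng(0)
    obs = rng.normal(5.0, 1.0, size=1000)     # e.g. one NOE-derived distance per frame
    w = maxent_weights(obs, target=5.3)
    print(f"reweighted mean: {w @ obs:.3f}")  # ~5.300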

    Clusters versus Affinity-Based Approaches in F. tularensis Whole Genome Search of CTL Epitopes

    Deciphering the cellular immunome of a bacterial pathogen is challenging due to the enormous number of putative peptidic determinants. State-of-the-art prediction methods developed in recent years make it possible to significantly reduce the number of peptides to be screened, yet the number of remaining candidates for experimental evaluation is still in the tens of thousands, even for a limited coverage of MHC alleles. We have recently established a resource-efficient approach for down-selection of candidates and enrichment of true positives, based on selection of predicted MHC binders located in high-density "hotspots" of putative epitopes. This cluster-based approach was applied to an unbiased, whole-genome search for Francisella tularensis CTL epitopes and was shown to yield a 17–25 fold higher level of responders compared to randomly selected predicted epitopes tested in Kb/Db C57BL/6 mice. In the present study, we further evaluate the cluster-based approach (down to a lower density range) and compare it to the classical affinity-based approach by testing putative CTL epitopes with predicted IC50 values of <10 nM. We demonstrate that while the percentage of responders achieved by the two approaches is similar, the profile of responders is different, and the predicted binding affinity of most responders in the cluster-based approach is relatively low (geometric mean of 170 nM), rendering the two approaches complementary. The cluster-based approach is further validated in BALB/c F. tularensis-immunized mice belonging to another allelic restriction (Kd/Dd) group. To date, the cluster-based approach has yielded over 200 novel F. tularensis peptides eliciting a cellular response, all of which were verified as MHC class I binders, thereby substantially increasing the F. tularensis dataset of known CTL epitopes. The generality and power of the high-density cluster-based approach suggest that it can be a valuable tool for identification of novel CTL epitopes in the proteomes of other bacterial pathogens.
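    As an illustration of the cluster-based selection principle, the following sketch (hypothetical binder calls, window size, and function names; not the authors' pipeline) ranks candidate 9-mers by the local density of predicted MHC binders along the protein rather than by predicted affinity alone.

    from typing import List, Tuple

    def hotspot_scores(is_binder: List[bool], window: int = 30) -> List[float]:
        # For the 9-mer starting at each position, the fraction of predicted
        # binders among the 9-mer start positions within +/- `window` residues.
        n = len(is_binder)
        scores = []
        for i in range(n):
            lo, hi = max(0, i - window), min(n, i + window + 1)
            scores.append(sum(is_binder[lo:hi]) / (hi - lo))
        return scores

    def pick_candidates(peptides: List[str], is_binder: List[bool],
                        top_k: int = 10) -> List[Tuple[str, float]]:
        # Keep predicted binders only, ranked by local binder density,
        # so that peptides inside dense "hotspots" are tested first.
        density = hotspot_scores(is_binder)
        ranked = sorted(((p, d) for p, b, d in zip(peptides, is_binder, density) if b),
                        key=lambda x: -x[1])
        return ranked[:top_k]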

    The politicisation of evaluation: constructing and contesting EU policy performance

    Although systematic policy evaluation has been conducted for decades and has been growing strongly within the European Union (EU) institutions and in the member states, it remains largely underexplored in the political science literature. Extant work in political science and public policy typically focuses on elements such as agenda setting, policy shaping, decision making, or implementation rather than evaluation. Although individual pieces of research on evaluation in the EU have started to emerge, most often regarding policy "effectiveness" (one criterion among many in evaluation), a more structured approach is currently missing. This special issue aims to address this gap in political science by focusing on four key focal points: evaluation institutions (including rules and cultures); evaluation actors and interests (including competencies, power, roles and tasks); evaluation design (including research methods and theories, and their impact on policy design and legislation); and, finally, evaluation purpose and use (including the relationships between discourse and scientific evidence, political attitudes and strategic use). The special issue considers how each of these elements contributes to an evolving governance system in the EU, where evaluation is playing an increasingly important role in decision making.

    Large-scale validation of methods for cytotoxic T-lymphocyte epitope prediction

    Background: Reliable predictions of cytotoxic T lymphocyte (CTL) epitopes are essential for rational vaccine design. Most importantly, they can minimize the experimental effort needed to identify epitopes. NetCTL is a web-based tool designed for predicting human CTL epitopes in any given protein. It does so by integrating predictions of proteasomal cleavage, TAP transport efficiency, and MHC class I affinity. At least four other methods have been developed recently that likewise attempt to predict CTL epitopes: EpiJen, MAPPP, MHC-pathway, and WAPP. In order to compare the performance of prediction methods, objective benchmarks and standardized performance measures are needed. Here, we develop such a large-scale benchmark and corresponding performance measures, and report the performance of an updated version 1.2 of NetCTL in comparison with the four other methods.
    Results: We define a number of performance measures that can handle the different types of output data from the five methods. We use two evaluation datasets consisting of known HIV CTL epitopes and their source proteins. The source proteins are split into all possible 9-mers, and, except for annotated epitopes, all other 9-mers are considered non-epitopes. In the RANK measure, we compare two methods at a time and count how often each of the methods ranks the epitope highest. In another measure, we find the specificity of the methods at three predefined sensitivity values. Lastly, for each method, we calculate the percentage of known epitopes that rank within the 5% of peptides with the highest predicted score.
    Conclusion: NetCTL-1.2 is demonstrated to have a higher predictive performance than EpiJen, MAPPP, MHC-pathway, and WAPP on all performance measures. The higher performance of NetCTL-1.2 as compared to EpiJen and MHC-pathway is, however, not statistically significant on all measures. In the large-scale benchmark calculation consisting of 216 known HIV epitopes covering all 12 recognized HLA supertypes, the NetCTL-1.2 method was shown to have a sensitivity among the 5% top-scoring peptides above 0.72. On this dataset, the best of the other methods achieved a sensitivity of 0.64. The NetCTL-1.2 method is available at http://www.cbs.dtu.dk/services/NetCTL. All used datasets are available at http://www.cbs.dtu.dk/suppl/immunology/CTL-1.2.php.
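    As an illustration, the last of these measures can be computed as in the following sketch (hypothetical data layout and toy data; not the benchmark's actual code): the fraction of known epitopes whose score falls within the top-scoring 5% of all 9-mers.

    import numpy as np

    def sensitivity_at_top_fraction(scores: np.ndarray,
                                    is_epitope: np.ndarray,
                                    fraction: float = 0.05) -> float:
        # Fraction of known epitopes whose predicted score lies within
        # the top `fraction` of all scored 9-mers.
        cutoff = np.quantile(scores, 1.0 - fraction)
        top = scores >= cutoff
        return (top & is_epitope).sum() / is_epitope.sum()

    rng = np.random.default_rng(1)
    scores = rng.normal(size=10_000)          # toy scores for all 9-mers
    labels = np.zeros(10_000, dtype=bool)
    labels[np.argsort(scores)[-30:]] = True   # toy labels: epitopes score high
    print(sensitivity_at_top_fraction(scores, labels))  # 1.0 on this toy data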

    Expanding the evolutionary explanations for sex differences in the human skeleton

    While the anatomy and physiology of human reproduction differ between the sexes, the effects of hormones on skeletal growth do not. Human bone growth depends on estrogen. Greater estrogen produced by ovaries causes bones in female bodies to fuse before males', resulting in sex differences in adult height and mass. Female pelves expand more than males' due to estrogen and relaxin produced and employed by the tissues of the pelvic region, and potentially also due to the greater internal space occupied by female gonads and genitals. Evolutionary explanations for skeletal sex differences (a.k.a. sexual dimorphism) that focus too narrowly on big competitive men and broad birthing women must account for the adaptive biology of skeletal growth and its dependence on the developmental physiology of reproduction. In this case, dichotomizing evolution into proximate-ultimate categories may be impeding the progress of human evolutionary science, as well as enabling the popular misunderstanding and abuse of it.

    Harmonic publication and citation counting: sharing authorship credit equitably – not equally, geometrically or arithmetically

    Bibliometric counting methods need to be validated against perceived notions of authorship credit allocation, and standardized by rejecting methods with poor fit or questionable ethical implications. Harmonic counting meets these concerns by exhibiting a robust fit to previously published empirical data from medicine, psychology and chemistry, and by complying with three basic ethical criteria for the equitable sharing of authorship credit. Harmonic counting can also incorporate additional byline information about equal contribution, or the elevated status of a corresponding last author. By contrast, several previously proposed counting schemes from the bibliometric literature, including arithmetic, geometric and fractional counting, do not fit the empirical data as well and do not consistently meet the ethical criteria. In conclusion, harmonic counting would seem to provide unrivalled accuracy, fairness and flexibility to the long overdue task of standardizing bibliometric allocation of publication and citation credit.
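    For concreteness, the following sketch implements the four counting schemes under their standard formulations in the bibliometric literature (hedged: the paper's own variants may differ in detail). Author positions are 1-indexed, and in each scheme the credits for a single paper sum to 1.

    def fractional(i: int, n: int) -> float:
        return 1.0 / n                          # equal shares

    def arithmetic(i: int, n: int) -> float:
        return 2 * (n + 1 - i) / (n * (n + 1))  # credit declines linearly with rank

    def geometric(i: int, n: int) -> float:
        return 2 ** (n - i) / (2 ** n - 1)      # each author gets twice the next one's share

    def harmonic(i: int, n: int) -> float:
        h = sum(1.0 / j for j in range(1, n + 1))
        return (1.0 / i) / h                    # credit proportional to 1/rank

    # Shares for a four-author paper under each scheme:
    for scheme in (fractional, arithmetic, geometric, harmonic):
        shares = [scheme(i, 4) for i in range(1, 5)]
        print(scheme.__name__, [round(s, 3) for s in shares])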