
    Character building in children's online information behaviours: applying a virtue epistemology perspective to information literacy.

    This paper advances our understanding of the theoretical and practical challenges of developing intellectual character in children's online information behaviours. We argue that widely reported issues such as misinformation and disinformation extend IL education beyond considerations of ability to considerations of disposition, and we highlight this as an understudied topic within IL education. We introduce the classical concept of intellectual character and discuss virtue traits in the IL context. Applying Baehr's nine intellectual virtues to two commonly cited IL models, we show that virtues are only sparsely represented in these models, and we propose an agenda for future research.

    Novel statistical approaches for non-normal censored immunological data: analysis of cytokine and gene expression data

    Background: For several immune-mediated diseases, immunological analysis will become more complex in the future, with datasets in which cytokine and gene expression data play a major role. These data have characteristics that require sophisticated statistical analysis, such as strategies for non-normal distributions and censoring. Additionally, complex and multiple immunological relationships need to be adjusted for potential confounding and interaction effects. Objective: We aimed to introduce and apply different methods for the statistical analysis of non-normal, censored cytokine and gene expression data. Furthermore, we assessed the performance and accuracy of a novel regression approach that allows adjustment for covariates and potential confounding. Methods: For non-normally distributed censored data, traditional methods such as the Kaplan-Meier method and the generalized Wilcoxon test are described. To adjust for covariates, a novel approach, Tobit regression on ranks, was introduced. Its performance and accuracy for the analysis of non-normal, censored cytokine and gene expression data were evaluated in a simulation study and in a statistical experiment using permutation and bootstrapping. Results: If adjustment for covariates is not necessary, traditional statistical methods are adequate for non-normal censored data. If additional adjustment is required, Tobit regression on ranks is a valid alternative; its power, type-I error rate and accuracy were comparable to those of classical Tobit regression. Conclusion: Non-normally distributed censored immunological data require appropriate statistical methods. Tobit regression on ranks meets these requirements and can be used to adjust for covariates and potential confounding in large and complex immunological datasets.
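
    The following is a minimal, illustrative sketch of the idea behind Tobit regression on ranks, not the authors' implementation: the left-censored response is rank-transformed and a censored-normal (Tobit) likelihood is maximized on the ranks. The data, detection limit and covariates below are invented for illustration.

```python
import numpy as np
from scipy import stats, optimize

# Illustrative sketch (not the authors' code): Tobit regression on ranks for a
# left-censored, cytokine-like outcome with a hypothetical detection limit.

rng = np.random.default_rng(0)
n = 200
group = rng.integers(0, 2, n)            # covariate of interest (e.g. case/control)
age = rng.normal(50, 10, n)              # potential confounder
latent = 1.0 + 0.8 * group + 0.02 * age + rng.standard_normal(n)
limit = 1.5                              # hypothetical detection limit
censored = latent < limit
y = np.where(censored, limit, latent)    # observed, left-censored response

# Rank-transform the response; censored values share tied ranks at the limit.
# Ranks are scaled to (0, 1) to keep the optimization numerically well behaved.
ranks = stats.rankdata(y) / (n + 1)
cens_point = ranks[censored].max() if censored.any() else -np.inf

X = np.column_stack([np.ones(n), group, age])

def neg_loglik(params):
    """Negative Tobit (left-censored normal) log-likelihood on the ranks."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    ll = np.where(
        censored,
        stats.norm.logcdf((cens_point - mu) / sigma),        # censored contribution
        stats.norm.logpdf((ranks - mu) / sigma) - log_sigma,  # uncensored contribution
    )
    return -ll.sum()

res = optimize.minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
print("coefficients (intercept, group, age):", res.x[:-1])
```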

    Quantifying the interplay of experimental constraints in analyses of parton distributions

    Parton distribution functions (PDFs) play a central role in calculations for the LHC. To gain a deeper understanding of the emergence and interplay of constraints on the PDFs in global QCD analyses, it is important to examine the relative significance and mutual compatibility of the experimental datasets included in the PDF fits. Toward this goal, we discuss the L2 sensitivity, a convenient statistical indicator for exploring the pulls of individual datasets on the best-fit PDFs and identifying tensions between competing datasets. Unlike the Lagrange multiplier method, the L2 sensitivity can be quickly computed for a range of PDFs and momentum fractions using the published Hessian error sets. We employ the L2 sensitivity as a common metric to study the relative importance of datasets in the recent ATLAS, CTEQ-TEA, MSHT, and reduced PDF4LHC21 PDF analyses at next-to-next-to-leading order and approximate next-to-next-to-next-to-leading order. We illustrate how this method can help PDF users identify datasets that are important for a PDF at a given kinematic point, study quark flavor composition and other detailed features of the PDFs, and compare the data pulls on the PDFs for various perturbative orders and functional forms. We also address the feasibility of computing the sensitivities using Monte Carlo error PDFs. Together with the article, we present a companion interactive website with a large collection of plotted L2 sensitivities for eight recent PDF releases and a C++ program to plot the L2 sensitivities.
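
    As a rough illustration of how an L2 sensitivity can be computed from published Hessian error sets (a sketch of the general idea, not the authors' companion code): the gradients of each experiment's chi-square and of the PDF value are estimated by symmetric finite differences over the error-set pairs, and the sensitivity is the projection of the chi-square gradient onto the PDF-gradient direction. All numbers below are invented.

```python
import numpy as np

# Illustrative sketch of the L2 sensitivity (not an official CTEQ/MSHT tool).
# Inputs are values evaluated on the central set and on the N pairs of Hessian
# error sets: chi2_e[k] is chi^2 of experiment E on error set k, and f[k] is
# the PDF value f(x, Q) on error set k, ordered [central, 1+, 1-, 2+, 2-, ...].

def hessian_gradient(values):
    """Symmetric finite-difference gradient over Hessian eigenvector directions."""
    values = np.asarray(values, dtype=float)
    plus, minus = values[1::2], values[2::2]
    return 0.5 * (plus - minus)

def l2_sensitivity(chi2_e, f):
    """S_{f,L2}(E) = grad(chi^2_E) . grad(f) / |grad(f)|, in units of chi^2."""
    g_chi2 = hessian_gradient(chi2_e)
    g_f = hessian_gradient(f)
    norm = np.linalg.norm(g_f)
    return float(g_chi2 @ g_f / norm) if norm > 0 else 0.0

# Toy usage with made-up numbers for N = 3 eigenvector directions:
chi2_e = [100.0, 101.5, 98.7, 100.2, 99.9, 100.8, 99.4]
f_vals = [0.350, 0.357, 0.344, 0.351, 0.349, 0.352, 0.348]
print(l2_sensitivity(chi2_e, f_vals))
```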

    Prompt atmospheric neutrino fluxes: perturbative QCD models and nuclear effects

    We evaluate the prompt atmospheric neutrino flux at high energies using three different frameworks for calculating the heavy quark production cross section in QCD: NLO perturbative QCD, k_T factorization including low-x resummation, and the dipole model including parton saturation. We use the QCD parameters, charm quark mass, and range of factorization and renormalization scales that provide the best description of the total charm cross section measured at fixed-target experiments, at RHIC and at the LHC. Using these parameters, we calculate differential cross sections for charm and bottom production and compare them with the latest data on forward charm meson production from LHCb at 7 TeV and at 13 TeV, finding good agreement with the data. In addition, we investigate the role of nuclear shadowing by including nuclear parton distribution functions (PDFs) for the target air nucleus using two different nuclear PDF schemes. Depending on the scheme used, we find that the reduction of the flux due to nuclear effects varies from 10% to 50% at the highest energies. Finally, we compare our results with the IceCube limit on the prompt neutrino flux, which is already providing valuable information about some of the QCD models.
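
    As a small illustration of the nuclear-shadowing ingredient (a sketch under stated assumptions, not the paper's code): with the LHAPDF Python bindings one can inspect the gluon nuclear modification factor for an air-like target, the quantity that drives the shadowing suppression of forward charm production at small x. The proton and nuclear set names below are examples and must correspond to grids actually installed locally.

```python
import numpy as np
import lhapdf  # requires the LHAPDF Python bindings and locally installed PDF grids

# Illustrative sketch (not the paper's code): gluon nuclear modification factor
# R_g(x, Q) = f_g^{p/A}(x, Q) / f_g^{p}(x, Q) for a nitrogen (air-like) target.
# Set names are illustrative placeholders; replace with sets you have installed.

proton = lhapdf.mkPDF("CT14nlo", 0)
nitrogen = lhapdf.mkPDF("EPPS16nlo_CT14nlo_N14", 0)   # hypothetical nitrogen set name

Q = 2.0  # GeV, near the charm production scale
for x in np.logspace(-6, -2, 5):
    r_g = nitrogen.xfxQ(21, x, Q) / proton.xfxQ(21, x, Q)  # pid 21 = gluon
    print(f"x = {x:.1e}   R_g = {r_g:.3f}")
```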

    Natriuretic Peptides and Assessment of Cardiovascular Disease Risk in Asymptomatic Persons

    Current tools for cardiovascular disease (CVD) risk assessment in asymptomatic individuals are imperfect. Preventive measures aimed only at individuals deemed high risk by current algorithms neglect large numbers of low-risk and intermediate-risk individuals who are destined to develop CVD and who would benefit from early and aggressive treatment. Natriuretic peptides have the potential both to identify individuals at risk for future cardiovascular events and to help detect subclinical CVD. Choosing the appropriate subpopulation to target for natriuretic peptide testing will help maximize performance and cost-effectiveness. The combined use of multiple risk markers, including biomarkers, genetic testing, and imaging or other noninvasive measures of risk, offers promise for further refining risk assessment algorithms. Recent studies have highlighted the utility of natriuretic peptides for preoperative risk stratification; however, cost-effectiveness and outcomes studies are needed to affirm this and other uses of natriuretic peptides for cardiovascular risk assessment in asymptomatic individuals.

    Blood Signature of Pre-Heart Failure: A Microarrays Study

    BACKGROUND: The preclinical stage of systolic heart failure (HF), known as asymptomatic left ventricular dysfunction (ALVD), is diagnosed only by echocardiography, is frequent in the general population and carries a high risk of progression to severe HF. Large-scale screening for ALVD is a difficult task and represents a major unmet clinical challenge that requires the identification of ALVD biomarkers. METHODOLOGY/PRINCIPAL FINDINGS: 294 individuals were screened by echocardiography. We identified 9 ALVD cases out of 128 subjects with cardiovascular risk factors. White blood cell gene expression profiling was performed using pangenomic microarrays. Data were analyzed using principal component analysis (PCA) and Significance Analysis of Microarrays (SAM). To build an ALVD classifier model, we used the nearest centroid classification method (NCCM) with the ClaNC software package. Classification performance was determined using the leave-one-out cross-validation method. Blood transcriptome analysis provided a specific molecular signature for ALVD, which defined a model based on 7 genes capable of discriminating ALVD cases. Analysis of a validation group of ALVD patients demonstrated that these genes are accurate diagnostic predictors for ALVD with 87% accuracy and 100% precision. Furthermore, receiver operating characteristic curves of expression levels confirmed that 6 out of 7 genes discriminate for left ventricular dysfunction classification. CONCLUSIONS/SIGNIFICANCE: These targets could enhance the ability of general care practitioners to efficiently detect ALVD, facilitating preemptive initiation of medical treatment to prevent the development of HF.
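
    A minimal sketch of the classification step described above, nearest centroid classification evaluated with leave-one-out cross-validation, using scikit-learn rather than the ClaNC package and simulated stand-in expression data:

```python
import numpy as np
from sklearn.neighbors import NearestCentroid
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Illustrative sketch (not the ClaNC pipeline itself): nearest centroid
# classification of expression profiles with leave-one-out cross-validation.
# The data below are simulated stand-ins, not the study's transcriptome data.

rng = np.random.default_rng(1)
n_subjects, n_genes = 40, 7                      # e.g. a 7-gene signature
y = np.repeat([0, 1], n_subjects // 2)           # 0 = control, 1 = ALVD (labels illustrative)
X = rng.normal(size=(n_subjects, n_genes))
X[y == 1] += 0.8                                 # shift the ALVD profiles to create signal

clf = NearestCentroid()
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")
```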

    Educational paper: Abusive Head Trauma Part I. Clinical aspects

    Abusive Head Trauma (AHT) refers to the combination of findings formerly described as shaken baby syndrome. Although these findings can be caused by shaking, it has become clear that in many cases there may have been impact trauma as well. Therefore, a less specific term has been adopted by the American Academy of Pediatrics. AHT is a relatively common cause of childhood neurotrauma, with an estimated incidence of 14–40 cases per 100,000 children under the age of 1 year. About 15–23% of these children die within hours or days after the incident. Studies among AHT survivors demonstrate that approximately one-third of the children are severely disabled, one-third are moderately disabled and one-third have no or only mild symptoms. Other publications suggest that neurological problems can occur after a symptom-free interval and that half of these children have IQs below the 10th percentile. Clinical findings depend on the definitions used, but AHT should be considered in all children with neurological signs and symptoms, especially if no or only mild trauma is described. Subdural haematomas are the most frequently reported finding. The only feature that has been identified as discriminating AHT from accidental injury is apnoea. Conclusion: AHT should be addressed in a structured manner, as with any other (potentially lethal) disease. The clinician can only establish this diagnosis with knowledge of the signs and symptoms of AHT, its risk factors, the differential diagnosis and which additional investigations to perform, the more so since parents will seldom describe the true state of affairs spontaneously.

    Combination Therapy Is Superior to Sequential Monotherapy for the Initial Treatment of Hypertension: A Double-Blind Randomized Controlled Trial

    Background: Guidelines for hypertension vary in their preference for initial combination therapy or initial monotherapy stratified by patient profile; we therefore compared the efficacy and tolerability of these approaches. Methods and Results: We performed a 1-year, double-blind, randomized controlled trial in 605 untreated patients aged 18 to 79 years with systolic blood pressure (BP) ≥150 mm Hg or diastolic BP ≥95 mm Hg. In phase 1 (weeks 0–16), patients were randomly assigned to initial monotherapy (losartan 50–100 mg or hydrochlorothiazide 12.5–25 mg, crossing over at 8 weeks) or initial combination therapy (losartan 50–100 mg plus hydrochlorothiazide 12.5–25 mg). In phase 2 (weeks 17–32), all patients received losartan 100 mg and hydrochlorothiazide 12.5–25 mg. In phase 3 (weeks 33–52), amlodipine with or without doxazosin could be added to achieve target BP. Hierarchical primary outcomes were the difference from baseline in home systolic BP, averaged over phases 1 and 2 and, if significant, at 32 weeks. Secondary outcomes included adverse events and the difference in home systolic BP response between tertiles of plasma renin. Home systolic BP after initial monotherapy fell 4.9 mm Hg (range: 3.7–6.0 mm Hg) less over 32 weeks (P<0.001) than after initial combination therapy but caught up at 32 weeks (difference 1.2 mm Hg [range: −0.4 to 2.8 mm Hg], P=0.13). In phase 1, the home systolic BP response to each monotherapy differed substantially between renin tertiles, whereas the response to combination therapy was uniform and at least 5 mm Hg greater than to monotherapy. There were no differences in withdrawals due to adverse events. Conclusions: Initial combination therapy can be recommended for patients with BP >150/95 mm Hg. Clinical Trial Registration URL: http://www.ClinicalTrials.gov. Unique identifier: NCT00994617.

    Monitoring and modelling landscape dynamics

    Changes in land cover and land use are among the most pervasive and important sources of recent alterations of the Earth's land surface. This special issue also presents new directions in modelling landscape dynamics. Agent-based models have primarily been used to simulate local land use and land cover change processes with a focus on decision making (Le 2008; Matthews et al. 2007; Parker et al. 2003; Bousquet and Le Page 2001).
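
    As a toy illustration of the agent-based modelling approach mentioned above (not any of the cited models): grid cells switch from forest to cropland with a probability driven by already-converted neighbours, a crude stand-in for agent decision rules.

```python
import numpy as np

# Purely illustrative agent-based sketch of local land-cover change: cells convert
# from forest (0) to cropland (1) with probability proportional to the number of
# already-converted four-neighbours. Parameters and rules are invented.

rng = np.random.default_rng(42)
size, steps = 50, 20
land = np.zeros((size, size), dtype=int)
land[size // 2, size // 2] = 1                   # a single initial settlement

for _ in range(steps):
    converted_neighbours = sum(
        np.roll(np.roll(land, dx, axis=0), dy, axis=1)
        for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]
    )
    p_convert = 0.05 * converted_neighbours      # simple neighbourhood-driven rule
    land = np.where((land == 0) & (rng.random(land.shape) < p_convert), 1, land)

print(f"cropland fraction after {steps} steps: {land.mean():.2%}")
```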