66 research outputs found

    Using Technology to Develop Preservice Teachers' Reflective Thinking

    Developing high-level reflection skills proves troublesome for some preservice teachers. To examine the potential of an online environment for increasing productive reflection, students in three sequential undergraduate education classes responded to regular online prompts. We coded student comments for productive and unproductive reflection, knowledge integration, and analysis of the four aspects of teaching (learners and learning, subject matter knowledge, assessment, and instruction) as described by Davis, Bain, and Harrington (2001). We adapted a scoring approach recommended by Davis and Linn (2000) and Davis (2003) to analyze which aspects of teaching preservice teachers included, emphasized, and integrated when they reflected on their own beliefs about teaching. The discussion examines the utility of online environments for producing productive preservice teacher reflection.

    Size and characteristics of the biomedical research workforce associated with U.S. National Institutes of Health extramural grants

    The U.S. National Institutes of Health (NIH) annually invests approximately $22 billion in biomedical research through its extramural grant programs. Since fiscal year (FY) 2010, all persons involved in research during the previous project year have been required to be listed on the annual grant progress report. These new data have enabled the production of the first-ever census of the NIH-funded extramural research workforce. Data were extracted from All Personnel Reports submitted for NIH grants funded in FY 2009, including position title, months of effort, academic degrees obtained, and personal identifiers. Data were de-duplicated to determine a unique person count. Person-years of effort (PYE) on NIH grants were computed. In FY 2009, NIH funded 50,885 grant projects, which created 313,049 full- and part-time positions spanning all job functions involved in biomedical research. These positions were staffed by 247,457 people at 2,604 institutions. These persons devoted 121,465 PYE to NIH grant-supported research. Research project grants each supported 6 full- or part-time positions, on average. Over 20% of positions were occupied by postdoctoral researchers and graduate and undergraduate students. These baseline data were used to project workforce estimates for FYs 2010–2014 and will serve as a foundation for future research.
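    The census computation described above (de-duplicating personnel records and summing person-years of effort) can be sketched as follows. The record layout and the figures are illustrative, not the actual NIH data:

    ```python
    # Toy personnel records: one row per position on a grant (assumed layout).
    records = [
        {"person_id": "A1", "grant": "R01-001", "months_effort": 12},
        {"person_id": "A1", "grant": "R01-002", "months_effort": 3},
        {"person_id": "B2", "grant": "R01-001", "months_effort": 6},
    ]

    unique_people = {r["person_id"] for r in records}      # de-duplicated person count
    positions = len(records)                               # full- and part-time positions
    pye = sum(r["months_effort"] for r in records) / 12.0  # person-years of effort

    print(len(unique_people), positions, round(pye, 2))  # 2 3 1.75
    ```

    The same person can hold positions on several grants, which is why the position count (3) exceeds the unique person count (2) here, mirroring the 313,049 positions versus 247,457 people reported above.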

    Composite likelihood estimation of demographic parameters

    Background: Most existing likelihood-based methods for fitting historical demographic models to DNA sequence polymorphism data do not scale feasibly to whole-genome data sets. Computational economies can be achieved by incorporating two forms of pseudo-likelihood: composite and approximate likelihood methods. Composite likelihood enables scaling up to large data sets because it takes the product of marginal likelihoods as an estimator of the likelihood of the complete data set. This approach is especially useful when a large number of genomic regions constitutes the data set. Additionally, approximate likelihood methods can reduce the dimensionality of the data by summarizing the information in the original data by either a sufficient statistic or a set of statistics. Both composite and approximate likelihood methods hold promise for analyzing large data sets or for use in situations where the underlying demographic model is complex and has many parameters. This paper considers a simple demographic model of allopatric divergence between two populations, in which one of the populations is hypothesized to have experienced a founder event, or population bottleneck. A large resequencing data set from human populations is summarized by the joint frequency spectrum, which is a matrix of the genomic frequency spectrum of derived base frequencies in two populations. A Bayesian
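    The core composite-likelihood idea (treat genomic regions as independent and take the product of their marginal likelihoods, i.e. the sum of their log-likelihoods) can be sketched as follows. The per-region likelihood here is a placeholder Poisson model for a segregating-site count, chosen only for illustration; it is not the paper's demographic model:

    ```python
    import math

    def region_loglik(theta, k):
        # Placeholder marginal model: Poisson log-likelihood of k segregating
        # sites in a region given a scaled mutation rate theta (an assumption
        # for illustration, not the model in the paper).
        return k * math.log(theta) - theta - math.lgamma(k + 1)

    def composite_loglik(theta, regions):
        # Composite log-likelihood: sum of per-region (marginal) log-likelihoods,
        # i.e. the log of the product of marginal likelihoods.
        return sum(region_loglik(theta, k) for k in regions)

    regions = [4, 7, 2, 5, 6]  # invented per-region counts
    # Maximize the composite likelihood over a coarse grid.
    grid = [t / 10 for t in range(10, 101)]
    theta_hat = max(grid, key=lambda t: composite_loglik(t, regions))
    print(theta_hat)  # 4.8 (the per-region mean, as expected for Poisson)
    ```

    Because the regions are treated as independent, the maximizer is found by simple summation across regions, which is what lets this strategy scale to many genomic regions.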

    Influential Periods in Longitudinal Clinical Cardiovascular Health Scores

    The prevalence of ideal cardiovascular health (CVH) among adults in the United States is low and decreases with age. Our objective was to identify specific age windows when the loss of CVH accelerates, to ascertain preventive opportunities for intervention. Data were pooled from 5 longitudinal cohorts (Project Heartbeat!, Cardiovascular Risk in Young Finns Study, The Bogalusa Heart Study, Coronary Artery Risk Development in Young Adults, Special Turku Coronary Risk Factor Intervention Project) from the United States and Finland from 1973 to 2012. Individuals with clinical CVH factors (i.e., body mass index, blood pressure, cholesterol, blood glucose) measured from ages 8 to 55 years were included. These factors were categorized and summed into a clinical CVH score ranging from 0 (worst) to 8 (best). Adjusted, segmented, linear mixed models were used to estimate the change in CVH over time. Among the 18,343 participants, 9,461 (52%) were female and 12,346 (67%) were White. The baseline mean (standard deviation) clinical CVH score was 6.9 (1.2) at an average age of 17.6 (8.1) years. Two inflection points were estimated: at 16.9 years (95% confidence interval: 16.4, 17.4) and at 37.2 years (95% confidence interval: 32.4, 41.9). Late adolescence and early middle age appear to be influential periods during which the loss of CVH accelerates.
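    A segmented (piecewise-linear) trend of the kind fitted above can be sketched as a function whose slope changes at each estimated inflection point. Only the knot locations (16.9 and 37.2 years) come from the study; the slopes and intercept below are invented for illustration:

    ```python
    KNOTS = (16.9, 37.2)           # inflection points from the abstract
    SLOPES = (-0.02, -0.06, -0.03) # hypothetical CVH-score change per year per segment
    INTERCEPT = 7.1                # hypothetical score at the baseline age

    def cvh_trend(age, baseline_age=8.0):
        """Predicted CVH score under a piecewise-linear trend: the slope
        changes at each knot, and segments accumulate left to right."""
        score = INTERCEPT
        prev = baseline_age
        for knot, slope in zip(KNOTS + (float("inf"),), SLOPES):
            seg_end = min(age, knot)
            if seg_end > prev:
                score += slope * (seg_end - prev)
                prev = seg_end
        return score

    print(round(cvh_trend(20), 3))  # 6.736
    ```

    The steeper middle slope mimics the study's finding that decline accelerates after late adolescence; in the actual analysis these slopes are estimated jointly with the knot positions in a mixed model.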

    Genetic Relationship between Schizophrenia and Nicotine Dependence

    It is well known that most schizophrenia patients smoke cigarettes. There are different hypotheses postulating the underlying mechanisms of this comorbidity. We used summary statistics from large meta-analyses of plasma cotinine concentration (COT), the Fagerström test for nicotine dependence (FTND) and schizophrenia to examine the genetic relationship between these traits. We found that schizophrenia risk scores calculated at P-value thresholds of 5 × 10−3 and larger predicted FTND and cigarettes smoked per day (CPD), suggesting that genes most significantly associated with schizophrenia were not associated with FTND/CPD, consistent with the self-medication hypothesis. The COT risk scores predicted schizophrenia diagnosis at P-values of 5 × 10−3 and smaller, implying that genes most significantly associated with COT were associated with schizophrenia. These results imply that schizophrenia and FTND/CPD/COT share some genetic liability. Based on this shared liability, we identified multiple long non-coding RNAs and RNA binding protein genes (DA376252, BX089737, LOC101927273, LINC01029, LOC101928622, HY157071, DA902558, RBFOX1 and TINCR), protein modification genes (MANBA, UBE2D3, and RANGAP1) and energy production genes (XYLB, MTRF1 and ENOX1) that were associated with both conditions. Further analyses revealed that these shared genes were enriched in calcium signaling, long-term potentiation and neuroactive ligand-receptor interaction pathways that play a critical role in cognitive functions and neuronal plasticity.
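    The risk-score analysis above rests on polygenic scores built only from SNPs whose GWAS P-value falls below a chosen threshold. A minimal sketch, with invented SNP names, effect sizes, and genotypes:

    ```python
    # (rsid, gwas_p_value, effect_size) -- all values are hypothetical.
    snps = [
        ("rsA", 1e-6, 0.12),
        ("rsB", 2e-3, -0.05),
        ("rsC", 0.4, 0.08),
    ]
    genotypes = {"rsA": 2, "rsB": 1, "rsC": 0}  # risk-allele counts, 0/1/2

    def risk_score(threshold):
        # Polygenic score: sum of effect_size * allele_count over SNPs
        # whose P-value passes the threshold.
        return sum(beta * genotypes[rsid]
                   for rsid, p, beta in snps if p <= threshold)

    print(round(risk_score(5e-3), 2))  # 0.19: rsA and rsB pass, rsC is excluded
    ```

    Sweeping the threshold, as done in the study, asks whether the predictive signal lives in the most significant SNPs (small thresholds) or in the broader polygenic background (large thresholds).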

    Avoidable costs of physical treatments for chronic back, neck and shoulder pain within the Spanish National Health Service: a cross-sectional study

    Background: Back, neck and shoulder pain are the most common causes of occupational disability. They reduce health-related quality of life and have a significant economic impact. Many different forms of physical treatment are routinely used. The objective of this study was to estimate the cost of physical treatments which, despite the absence of evidence supporting their effectiveness, were used between 2004 and 2007 for chronic and non-specific neck pain (NP), back pain (BP) and shoulder pain (SP), within the Spanish National Health Service in the Canary Islands (SNHSCI). Methods: Chronic patients referred from the SNHSCI to private physical therapy centres for NP, BP or SP, between 2004 and 2007, were identified. The cost of providing physical therapies to these patients was estimated. Systematic reviews (SRs) and clinical practice guidelines (CPGs) for NP, BP and SP available in the same period were searched for and rated according to the Oxman and AGREE criteria, respectively. Those rated positively for ≥ 70% of the criteria were used to categorise physical therapies as Effective, Ineffective, Inconclusive, or Insufficiently Assessed. The main outcome was the cost of physical therapies included in each of these categories. Results: 8,308 chronic cases of NP, 4,693 of BP and 5,035 of SP were included in this study. Among prescribed treatments, 39.88% were considered Effective (physical exercise and manual therapy with mobilization); 23.06% Ineffective; 13.38% Inconclusive; and 23.66% Insufficiently Assessed. The total cost of treatments was €5,107,720. Effective therapies accounted for €2,069,932. Conclusions: Sixty percent of the resources allocated by the SNHSCI to fund physical treatment for NP, BP and SP in private practices are spent on forms of treatment proven to be ineffective, or for which there is no evidence of effectiveness.
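    The cost analysis in the Results amounts to summing treatment costs by evidence category. The two totals below (€5,107,720 overall and €2,069,932 for Effective therapies) come from the abstract; the per-modality split is invented to make the sketch runnable:

    ```python
    # (treatment, evidence_category, cost_eur) -- per-modality costs are hypothetical.
    treatments = [
        ("physical exercise",       "Effective",              1_200_000),
        ("manual therapy",          "Effective",                869_932),
        ("modality X",              "Ineffective",            1_178_000),
        ("modality Y",              "Inconclusive",             683_000),
        ("modality Z",              "Insufficiently Assessed", 1_176_788),
    ]

    totals = {}
    for _, category, cost in treatments:
        totals[category] = totals.get(category, 0) + cost

    total = sum(totals.values())
    for category, cost in totals.items():
        print(f"{category}: {cost} EUR ({100 * cost / total:.1f}% of spending)")
    ```

    Note that the abstract's 39.88% Effective figure is a share of prescribed treatments, not of spending, so the cost shares printed here need not match it.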

    An Exploration of Bitcoin Mining Practices: Miners' Trust Challenges and Motivations

    Bitcoin blockchain technology is a distributed ledger of nodes authorizing transactions between anonymous parties. Its key actors are miners, who use computational power to solve mathematical problems that validate transactions. Like the blockchain itself, mining is a decentralized, transparent and unregulated practice. It has received little attention in HCI, so we know little about miners' motivations and experiences, or how these may affect different dimensions of trust. This paper reports on interviews with 20 bitcoin miners about their practices and trust challenges. The findings contribute to HCI theories by extending the exploration of blockchain characteristics relevant to trust with the competitiveness dimension underpinning the social organization of mining. We discuss the risks of collaborative mining due to centralization and dishonest administrators, and conclude with design implications highlighting the need for tools monitoring the distribution of rewards in collaborative mining, tools tracking data centers' authorization and reputation, and tools supporting the development of decentralized pools.