Improving Rural Educational Attainment
More often than not, policymakers focus on school-based strategies to spur improvements in the educational progress of students. The 2002 No Child Left Behind Act, which demands greater school accountability for student performance, is a case in point. Yet what happens in the classroom is only part of the story. In fact, as Lionel J. Beaulieu, Glenn D. Israel and Ronald C. Wimberley show in their chapter in "Challenges for Rural America in the Twenty-First Century", family characteristics have from 5 to 10 times as much impact as school characteristics on the reading and math scores of rural U.S. eighth graders. In addition, community characteristics have as much impact as school characteristics on test scores, although both community and school characteristics tend to be more important in geographically isolated rural areas than in those adjacent to metropolitan areas. Clearly, helping rural youth succeed academically is the collective responsibility of families, schools, and communities. This issue brief is a joint product of the Rural Sociological Society and the National Coalition for Rural Entrepreneurship, a collaboration of four Regional Rural Development Centers: the Northeast Regional Center for Rural Development, the Southern Rural Development Center, the North Central Regional Center for Rural Development, and the Western Rural Development Center. Funding was also made available by the Ford Foundation. This brief is part of a policy brief series by the Rural Sociological Society and the Regional Rural Development Centers that stresses the importance of community collective action and of developing the capacity of people and organizations to meet the community's needs. The Rural Sociological Society and the Regional Rural Development Centers created the Public Policy Issue Brief series based on their recent book, "Challenges for Rural America in the Twenty-First Century". The briefs synthesize the context and substance of important issues raised in the book and address alternative policy options, with the goal of bringing important research to the policy community.
Validation of an algorithm-based definition of treatment resistance in patients with schizophrenia
Large-scale pharmacoepidemiological research on treatment resistance relies on accurate identification of people with treatment-resistant schizophrenia (TRS) based on data that are retrievable from administrative registers. This is usually approached by operationalising clinical treatment guidelines using prescription and hospital admission information. We examined the accuracy of an algorithm-based definition of TRS, based on clozapine prescription and/or meeting algorithm-based eligibility criteria for clozapine, against a gold-standard definition using case notes. We additionally validated a definition based entirely on clozapine prescription. A total of 139 patients with schizophrenia aged 18–65 years were followed for a mean of 5 years after first presentation to psychiatric services in South London, UK. The diagnostic accuracy of the algorithm-based measure against the gold standard was assessed with sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV). A total of 45 (32.4%) patients met the criteria for the gold-standard definition of TRS; applying the algorithm-based definition to the same cohort identified 44 (31.7%) patients as having TRS, with sensitivity, specificity, PPV and NPV of 62.2%, 83.0%, 63.6% and 82.1%, respectively. The definition based on lifetime clozapine prescription had sensitivity, specificity, PPV and NPV of 40.0%, 94.7%, 78.3% and 76.7%, respectively. Although a perfect definition of TRS cannot be derived from available prescription and hospital registers, these results indicate that researchers can confidently use registries to identify individuals with TRS for research and clinical purposes.
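The reported accuracy figures follow directly from a 2x2 confusion matrix. As a minimal sketch, the four metrics can be computed as below; the counts used are only those roughly implied by the reported percentages (139 patients, 45 gold-standard TRS, 44 algorithm-positive), not the study's raw data.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Standard 2x2 diagnostic accuracy metrics."""
    sensitivity = tp / (tp + fn)   # gold-standard TRS correctly flagged
    specificity = tn / (tn + fp)   # gold-standard non-TRS correctly not flagged
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Counts roughly consistent with the reported figures; treat as an illustration only.
print(diagnostic_accuracy(tp=28, fp=16, fn=17, tn=78))
# ~ (0.622, 0.830, 0.636, 0.821)
```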
Clozapine use in childhood and adolescent schizophrenia: A nationwide population-based study
Early onset schizophrenia (EOS) begins in childhood or adolescence. EOS is associated with poor treatment response and may benefit from timely use of clozapine. This study aimed to identify the predictors of clozapine use in EOS and to characterize the clinical profile and outcome of clozapine-treated youths with schizophrenia. We conducted a nationwide population-based study using linked data from Danish medical registries. We examined all incident cases of EOS (i.e., cases diagnosed prior to their 18th birthday) between December 31st 1994 and December 31st 2006 and characterized their demographic, clinical and treatment profiles. We then used multivariable Cox proportional hazards models to identify predictors of clozapine treatment in this patient population. We identified 662 EOS cases (1.9% of all schizophrenia cases), of whom 108 (17.6%) had commenced clozapine by December 31st 2008. Patients had on average 3 antipsychotic trials prior to clozapine initiation. The mean interval between first antipsychotic treatment and clozapine initiation was 3.2 (2.9) years. Older age at diagnosis of schizophrenia [HR=1.2, 95% CI (1.05-1.4), p=0.01], family history of schizophrenia [HR=2.1, 95% CI (1.1-3.04), p=0.02] and attempted suicide [HR=1.8, 95% CI (1.1-3.04), p=0.02] emerged as significant predictors of clozapine use. The majority of patients prescribed clozapine (n=96, 88.8%) appeared to have a favorable clinical response, as indicated by continued prescription redemption and improved occupational outcomes. Our findings support current recommendations for the timely use of clozapine in EOS.
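As a hedged illustration of the modelling step (not the authors' registry code), a multivariable Cox proportional hazards model for time to clozapine initiation could be fitted in Python with the lifelines package; the simulated cohort and all variable names below are hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200  # simulated cohort, a stand-in for the 662 EOS cases

# Hypothetical covariates; names are illustrative, not the registry variables.
age = rng.integers(13, 18, n)
family_history = rng.integers(0, 2, n)
suicide_attempt = rng.integers(0, 2, n)

# Simulated time (years) to clozapine initiation, right-censored at 8 years.
raw_time = rng.exponential(6.0, n) / np.exp(
    0.2 * (age - 15) + 0.7 * family_history + 0.6 * suicide_attempt)
event = (raw_time < 8).astype(int)
time = np.minimum(raw_time, 8)

df = pd.DataFrame({
    "years_to_clozapine_or_censor": time,
    "started_clozapine": event,
    "age_at_diagnosis": age,
    "family_history_scz": family_history,
    "prior_suicide_attempt": suicide_attempt,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_clozapine_or_censor",
        event_col="started_clozapine")
cph.print_summary()  # hazard ratios with 95% CIs, analogous to the reported HRs
```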
International Federation of Clinical Chemistry (IFCC): Scientific Division, Committee on pH, Blood Gases and Electrolytes: Guidelines for Transcutaneous pO2 and pCO2 Measurement
This document provides guidelines for the terminology, methodology and
interpretation of data obtained from the use of skin (transcutaneous)
pO2 and pCO2 electrodes. The transcutaneous technique has found special
application in newborn infants. The causes of analytical bias with respect
to arterial blood gas values, and of imprecision obtained with transcutaneous
pCO2 electrodes, are reviewed. Electrode temperatures above 44°C should not
be used routinely, and, at a measuring temperature of 44°C, the measuring
site should be changed at least every 4 h to avoid skin burns.
Modelling [18F]LW223 PET data using simplified imaging protocols for quantification of TSPO expression in the rat heart and brain
PURPOSE: To provide a comprehensive assessment of the kinetics of the novel 18 kDa translocator protein (TSPO) radiotracer [(18)F]LW223 in the heart and brain when using a simplified imaging approach. METHODS: Naive adult rats and rats with surgically induced permanent coronary artery ligation received a bolus intravenous injection of [(18)F]LW223 followed by 120 min of PET scanning with arterial blood sampling throughout. Kinetic modelling of the PET data was applied to estimate rate constants, total volume of distribution (V(T)) and binding potential transfer corrected (BP(TC)) using an arterial or image-derived input function (IDIF). The quantitative bias of simplified protocols using the IDIF versus the arterial input function (AIF), and the stability of kinetic parameters for PET imaging data of different lengths (40–120 min), were estimated. RESULTS: PET outcome measures estimated using the IDIF correlated significantly with those derived with the invasive AIF, albeit with an inherent systematic bias. Truncation of the dynamic PET scan duration to less than 100 min reduced the stability of the kinetic modelling outputs. Quantification of [(18)F]LW223 uptake kinetics in the brain and heart required the use of different outcome measures, with BP(TC) more stable in the heart and V(T) more stable in the brain. CONCLUSION: Modelling of [(18)F]LW223 PET showed that the use of a simplified IDIF is acceptable in the rat and that the minimum scan duration for quantification of TSPO expression in rats using kinetic modelling with this radiotracer is 100 min. Carefully assessing kinetic outcome measures when conducting systems-level, as opposed to single-organ-centric, analyses is crucial. This should be taken into account when assessing the emerging role of the TSPO heart-brain axis in the field of PET imaging. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s00259-021-05482-1.
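As a rough sketch of the kind of kinetic modelling described (a generic one-tissue compartment fit, not necessarily the model the authors selected for [18F]LW223), the total volume of distribution can be estimated as VT = K1/k2 after fitting the tissue curve to the convolution of an input function with a decaying exponential; the input function, noise level and parameter values below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic one-tissue compartment model: Ct(t) = K1 * conv(Cp, exp(-k2*t)); VT = K1/k2.
t = np.linspace(0, 120, 241)              # minutes, matching a 120-min scan
cp = 50 * t * np.exp(-t / 4.0)            # synthetic arterial/image-derived input function

def one_tissue(t_grid, k1, k2):
    dt = t_grid[1] - t_grid[0]
    irf = np.exp(-k2 * t_grid)
    return k1 * np.convolve(cp, irf)[: t_grid.size] * dt

# Synthetic "measured" tissue curve with noise (true K1 = 0.3, k2 = 0.1, so VT = 3).
ct = one_tissue(t, 0.3, 0.1) + np.random.default_rng(1).normal(0, 0.5, t.size)

(k1_hat, k2_hat), _ = curve_fit(one_tissue, t, ct, p0=[0.2, 0.05], bounds=(0, 5))
print("VT estimate:", k1_hat / k2_hat)    # should recover roughly 3
```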
The association of glucose metabolism measures and diabetes status with Alzheimer's disease biomarkers of amyloid and tau: A systematic review and meta-analysis
Conflicting evidence exists on the relationship between diabetes mellitus (DM) and Alzheimer's disease (AD) biomarkers. Therefore, we conducted a random-effects meta-analysis to evaluate the correlation of glucose metabolism measures (glycated hemoglobin, fasting blood glucose, insulin resistance indices) and DM status with AD biomarkers of amyloid-β and tau measured by positron emission tomography or cerebrospinal fluid. We selected 37 studies from PubMed and Embase, including 11,694 individuals. More impaired glucose metabolism and DM status were associated with higher tau biomarkers (r=0.11[0.03-0.18], p=0.008; I2=68%), but were not associated with amyloid-β biomarkers (r=-0.06[-0.13-0.01], p=0.08; I2=81%). Meta-regression revealed that glucose metabolism and DM were specifically associated with tau biomarkers in population settings (p=0.001). Furthermore, more impaired glucose metabolism and DM status were associated with lower amyloid-β biomarkers in memory clinic settings (p=0.004), and in studies with a higher prevalence of dementia (p<0.001) or lower cognitive scores (p=0.04). These findings indicate that DM is associated with biomarkers of tau but not with amyloid-β. This knowledge is valuable for improving dementia and DM diagnostics and treatment.
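A hedged sketch of the pooling step: study-level correlations can be combined with a DerSimonian-Laird random-effects model on Fisher-transformed values. The r and n values below are placeholders, not the 37 included studies.

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of correlations via Fisher's z.
r = np.array([0.15, 0.05, 0.20, -0.02, 0.12])   # illustrative study correlations
n = np.array([300, 450, 120, 800, 250])         # illustrative sample sizes

z = np.arctanh(r)                 # Fisher transform
v = 1.0 / (n - 3)                 # within-study variance of z
w = 1.0 / v

# Between-study heterogeneity (tau^2) via the DerSimonian-Laird estimator.
q = np.sum(w * (z - np.sum(w * z) / np.sum(w)) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(z) - 1)) / c)

w_star = 1.0 / (v + tau2)
z_pooled = np.sum(w_star * z) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
i2 = max(0.0, (q - (len(z) - 1)) / q) * 100

print("pooled r:", np.tanh(z_pooled), "95% CI:", ci, "I2 (%):", round(i2, 1))
```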
Genomics: Think Global, Act Local
Long a slogan for environmentalists, "think global, act local" could be a new rallying cry for biologists. As genome-wide techniques advance and their costs drop, scientists are expanding into larger and larger territories: metagenomics, global proteomic approaches, and analyses of thousands of genomes. These massive data sets are opening up new possibilities for understanding some of the smallest details of the genome. Here, we look at four such cases, all of which depend on a combination of global thinking and local action: investigating the evolutionary role of insertions and deletions in the genome, connecting an orphan enzyme with its gene, mapping the fine details of chromatin structure, and characterizing global interactions between proteins and RNA.
An Automated and High Precision Quantitative Analysis of the ACR Phantom
A novel phantom-imaging platform for automated and high-precision imaging of the American College of Radiology (ACR) PET phantom is proposed. The platform facilitates the generation of an accurate μ-map for PET/MR systems with robust alignment based on two-stage image registration using specifically designed PET templates. The automated analysis of PET images uses a set of granular composite volume-of-interest (VOI) templates on a 0.5 mm resolution grid for sampling of the system response to the insert step functions. The impact of activity outside the field of view (FOV) was evaluated using two acquisitions of 30 minutes each, with and without activity outside the FOV. Iterative image reconstruction was employed with and without a modelled shift-invariant point spread function (PSF) and with varying numbers of ordered-subsets expectation-maximisation (OSEM) iterations. Uncertainty analysis of all image-derived statistics was performed using bootstrap resampling of the list-mode data. We found that activity outside the FOV can adversely affect the imaging planes close to the edge of the axial FOV, reducing the contrast, background uniformity and overall quantitative accuracy. The PSF had a positive impact on contrast recovery (although it slowed convergence). The proposed platform may be helpful for a more informative evaluation of PET systems and image reconstruction methods.
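The bootstrap uncertainty analysis can be illustrated schematically. In the study, replicates come from resampled list-mode data followed by reconstruction; the sketch below instead resamples hypothetical voxel values within one insert VOI, as a simplified stand-in, to show only the percentile-interval idea.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-voxel values inside one ACR insert VOI (arbitrary units).
voi_voxels = rng.normal(loc=2.0, scale=0.3, size=500)

# Non-parametric bootstrap of the VOI mean (simplified stand-in for
# resampling list-mode events and reconstructing each replicate).
boot_means = [rng.choice(voi_voxels, size=voi_voxels.size, replace=True).mean()
              for _ in range(1000)]
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
print(f"VOI mean {voi_voxels.mean():.3f}, 95% bootstrap CI ({ci_low:.3f}, {ci_high:.3f})")
```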
Comparing the hierarchy of keywords in on-line news portals
The tagging of on-line content with informative keywords is a widespread
phenomenon from scientific article repositories through blogs to on-line news
portals. In most cases, the tags on a given item are free words chosen by the
authors independently. Therefore, the relations among keywords in a collection
of news items are unknown. However, in most cases the topics and concepts
described by these keywords form a latent hierarchy, with the more general
topics and categories at the top and more specialised ones at the bottom. Here
we apply a recent, co-occurrence-based tag hierarchy extraction
method to sets of keywords obtained from four different on-line news portals.
The resulting hierarchies show substantial differences not just in the topics
rendered as important (being at the top of the hierarchy) or of less interest
(categorised low in the hierarchy), but also in the underlying network
structure. This reveals discrepancies between the plausible keyword association
frameworks in the studied news portals.
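For illustration only (this is a simple heuristic, not the specific extraction method applied in the paper), a co-occurrence-based hierarchy can be built by assigning to each tag, as its parent, the strictly more frequent tag with which it co-occurs most often; the keyword sets below are toy examples.

```python
from collections import Counter
from itertools import combinations

# Toy keyword sets standing in for tagged news items (illustrative only).
news_items = [
    {"politics", "election", "parliament"},
    {"politics", "election", "campaign"},
    {"sports", "football", "championship"},
    {"politics", "parliament", "budget"},
    {"sports", "football"},
    {"sports", "basketball"},
]

freq = Counter(tag for item in news_items for tag in item)
cooc = Counter()
for item in news_items:
    for a, b in combinations(item, 2):
        cooc[(a, b)] += 1
        cooc[(b, a)] += 1

# Parent of a tag: the strictly more frequent tag it co-occurs with most often.
hierarchy = {}
for tag in freq:
    candidates = [(cooc[(tag, other)], freq[other], other)
                  for other in freq
                  if freq[other] > freq[tag] and cooc[(tag, other)] > 0]
    hierarchy[tag] = max(candidates)[2] if candidates else None  # None = root

print(hierarchy)  # e.g. {'election': 'politics', 'football': 'sports', 'politics': None, ...}
```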
Hierarchy measure for complex networks
Nature, technology and society are full of complexity arising from the
intricate web of the interactions among the units of the related systems (e.g.,
proteins, computers, people). Consequently, one of the most successful recent
approaches to capturing the fundamental features of the structure and dynamics
of complex systems has been the investigation of the networks associated with
the above units (nodes) together with their relations (edges). Most complex
systems have an inherently hierarchical organization and, correspondingly, the
networks behind them also exhibit hierarchical features. Indeed, several papers
have been devoted to describing this essential aspect of networks, however,
without resulting in a widely accepted, converging concept concerning the
quantitative characterization of the level of their hierarchy. Here we develop
an approach and propose a quantity (measure) which is simple enough to be
widely applicable, reveals a number of universal features of the organization
of real-world networks and, as we demonstrate, is capable of capturing the
essential features of the structure and the degree of hierarchy in a complex
network. The measure we introduce is based on a generalization of the m-reach
centrality, which we first extend to directed/partially directed graphs. Then,
we define the global reaching centrality (GRC), which is the difference between
the maximum and the average value of the generalized reach centralities over
the network. We investigate the behavior of the GRC considering both a
synthetic model with an adjustable level of hierarchy and real networks.
Results for real networks show that our hierarchy measure is related to the
controllability of the given system. We also propose a visualization procedure
for large complex networks that can be used to obtain an overall qualitative
picture about the nature of their hierarchical structure.
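As a minimal sketch of the quantity defined above, the local reaching centrality of a node can be taken as the fraction of the other nodes reachable from it, and the unweighted GRC as the average gap to the maximum; recent networkx versions also provide their own implementation, which could be used instead.

```python
import networkx as nx

def global_reaching_centrality(g):
    """Unweighted GRC of a directed graph: the average gap between the maximum
    local reaching centrality and each node's own value."""
    n = g.number_of_nodes()
    # Local reaching centrality: fraction of the other nodes reachable from i.
    local = {i: len(nx.descendants(g, i)) / (n - 1) for i in g}
    c_max = max(local.values())
    return sum(c_max - c for c in local.values()) / (n - 1)

# A balanced out-tree (strongly hierarchical) versus a directed cycle (flat).
tree = nx.balanced_tree(2, 3, create_using=nx.DiGraph)   # edges point root -> leaves
cycle = nx.cycle_graph(15, create_using=nx.DiGraph)

print(global_reaching_centrality(tree))    # close to 1 (about 0.9 here)
print(global_reaching_centrality(cycle))   # 0: every node reaches all the others
```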