A qualitative exploration of health-related quality of life and health behaviours in children with sickle cell disease and healthy siblings
Objectives
This study explored the health-related quality of life (HRQL) and health behaviours of children with sickle cell disease (SCD) and healthy siblings, drawing on Gap theory, which suggests HRQL is the discrepancy between current and ideal selves.
Design
Thirty-two interviews, facilitated by children’s drawings of their current and ideal selves, were thematically analysed.
Results
Two themes were identified. First, limitations of SCD and adjusted expectations. Children with SCD report some discrepancy in HRQL, as they would like to participate in more physical activity, but overall they appear to have normalised their condition and adjusted their expectations within the limits it imposes. Healthy siblings worry about their sibling and hold greater expectations, both about engaging in adventurous activities and for their futures. Second, coping with SCD. Children have limited social support, although children with SCD seek support from their mothers. They also modify health behaviours, such as reducing exercise, to help prevent and cope with sickle-related pain.
Conclusion
Children have some discrepancies in their HRQL, but adjusted expectations among children with SCD may reduce this discrepancy. Adapting health behaviours may help children cope with SCD, but it is important that reductions in physical activity do not themselves impair HRQL.
Better together: Integrating biomedical informatics and healthcare IT operations to create a learning health system during the COVID-19 pandemic
The growing availability of multi-scale biomedical data sources that can be used to enable research and improve healthcare delivery has brought about what can be described as a healthcare data age. This new era is defined by the explosive growth in bio-molecular, clinical, and population-level data that can be readily accessed by researchers, clinicians, and decision-makers, and utilized for systems-level approaches to hypothesis generation and testing as well as operational decision-making. However, taking full advantage of these unprecedented opportunities requires revisiting the alignment between traditionally academic biomedical informatics (BMI) and operational healthcare information technology (HIT) personnel and activities in academic health systems. While the history of the academic field of BMI includes active engagement in the delivery of operational HIT platforms, in many contemporary settings these efforts have grown distinct. Recent experiences during the COVID-19 pandemic have demonstrated greater coordination of BMI and HIT activities that has allowed organizations to respond to pandemic-related changes more effectively, with demonstrable and positive impact as a result. In this position paper, we discuss the challenges and opportunities associated with driving alignment between BMI and HIT, as viewed from the perspective of a learning healthcare system. In doing so, we hope to illustrate the benefits of coordination between BMI and HIT in terms of the quality, safety, and outcomes of care provided to patients and populations, demonstrating that these two groups can be better together.
Logarithmically-concave moment measures I
We discuss a certain Riemannian metric, related to the toric Kähler-Einstein equation, that is associated in a linearly-invariant manner with a given log-concave measure in R^n. We use this metric in order to bound the second derivatives of the solution to the toric Kähler-Einstein equation, and in order to obtain spectral-gap estimates similar to those of Payne and Weinberger.
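For orientation, the toric Kähler-Einstein equation referred to here can be written in real variables as a Monge-Ampère equation; the following is a sketch in our own notation (following the moment-measure convention, not taken verbatim from the paper). Given a log-concave target density ρ on R^n, one seeks a convex function φ satisfying

\[
e^{-\varphi(x)} \;=\; \rho\!\left(\nabla\varphi(x)\right)\,\det\nabla^2\varphi(x),
\]

which says precisely that the gradient map ∇φ pushes the measure e^{-φ(x)} dx forward to ρ(y) dy. The Payne-Weinberger estimate mentioned is the classical Neumann spectral-gap bound \(\lambda_1(\Omega) \ge \pi^2/\operatorname{diam}(\Omega)^2\) for a convex domain Ω.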
A protocol to evaluate RNA sequencing normalization methods
Background
RNA sequencing technologies have allowed researchers to gain a better understanding of how the transcriptome affects disease. However, sequencing technologies often unintentionally introduce experimental error into RNA sequencing data. To counteract this, normalization methods are standardly applied with the intent of reducing the non-biologically derived variability inherent in transcriptomic measurements. However, the comparative efficacy of the various normalization techniques has not been tested in a standardized manner. Here we propose tests that evaluate numerous normalization techniques and apply them to a large-scale standard data set. These tests comprise a protocol that allows researchers to measure the amount of non-biological variability present in any data set after normalization has been performed, a crucial step in assessing the biological validity of data following normalization.
Results
In this study we present two tests to assess the validity of normalization methods applied to a large-scale data set collected for systematic evaluation purposes. We tested various RNASeq normalization procedures and concluded that transcripts per million (TPM) was the best performing normalization method based on its preservation of biological signal as compared to the other methods tested.
Conclusion
Normalization is of vital importance to accurately interpret the results of genomic and transcriptomic experiments. More work, however, needs to be performed to optimize normalization methods for RNASeq data. The present effort helps pave the way for more systematic evaluations of normalization methods across different platforms. With our proposed schema, researchers can evaluate their own or future normalization methods to further improve the field of RNASeq normalization.
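For readers unfamiliar with the best-performing method identified above, transcripts per million (TPM) first length-normalizes raw counts, then rescales each sample to a fixed total. The following is a minimal sketch of the standard TPM formula; the gene counts and lengths are invented example values, not data from the study.

```python
def tpm(counts, lengths_bp):
    """Normalize raw read counts (one sample) to transcripts per million.

    counts     : raw read counts per gene
    lengths_bp : gene/transcript lengths in base pairs
    """
    # Step 1: length-normalize counts to reads per kilobase (RPK).
    rpk = [c / (l / 1000.0) for c, l in zip(counts, lengths_bp)]
    # Step 2: rescale so the RPK values in the sample sum to one million,
    # making TPM values comparable across samples.
    scale = sum(rpk) / 1_000_000
    return [r / scale for r in rpk]

# Hypothetical example: three genes of different lengths.
values = tpm([100, 200, 50], [1000, 2000, 500])
print(values)  # the values always sum to 1,000,000
```

Because step 2 fixes the per-sample total, TPM removes differences in sequencing depth, while step 1 removes the bias toward longer transcripts accumulating more reads.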
CO(2) sensitivity of Southern Ocean phytoplankton
The Southern Ocean exerts a strong impact on marine biogeochemical cycles and global air-sea CO(2) fluxes. Over the coming century, large increases in surface ocean CO(2) levels, combined with increased upper water column temperatures and stratification, are expected to diminish Southern Ocean CO(2) uptake. These effects could be significantly modulated by concomitant CO(2)-dependent changes in the region's biological carbon pump. Here we show that CO(2) concentrations affect the physiology, growth and species composition of phytoplankton assemblages in the Ross Sea, Antarctica. Field results from in situ sampling and ship-board incubation experiments demonstrate that inorganic carbon uptake, steady-state productivity and diatom species composition are sensitive to CO(2) concentrations ranging from 100 to 800 ppm. Elevated CO(2) led to a measurable increase in phytoplankton productivity, promoting the growth of larger chain-forming diatoms. Our results suggest that CO(2) concentrations can influence biological carbon cycling in the Southern Ocean, thereby creating potential climate feedbacks.
Is the Helmholtz equation really sign-indefinite?
The usual variational (or weak) formulations of the Helmholtz equation are sign-indefinite in the sense that the bilinear forms cannot be bounded below by a positive multiple of the appropriate norm squared. This is often for a good reason, since in bounded domains under certain boundary conditions the solution of the Helmholtz equation is not unique at wavenumbers that correspond to eigenvalues of the Laplacian, and thus the variational problem cannot be sign-definite. However, even in cases where the solution is unique for all wavenumbers, the standard variational formulations of the Helmholtz equation are still indefinite when the wavenumber is large. This indefiniteness has implications for both the analysis and the practical implementation of finite element methods. In this paper we introduce new sign-definite (also called coercive or elliptic) formulations of the Helmholtz equation posed in either the interior of a star-shaped domain with impedance boundary conditions, or the exterior of a star-shaped domain with Dirichlet boundary conditions. Like the standard variational formulations, these new formulations arise just by multiplying the Helmholtz equation by particular test functions and integrating by parts.
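To make the indefiniteness concrete, the standard weak formulation of the interior impedance problem for the Helmholtz equation \(\Delta u + k^2 u = -f\) involves the sesquilinear form (a sketch in our own notation, not taken from the paper):

\[
a(u,v) \;=\; \int_\Omega \nabla u\cdot\overline{\nabla v}\,\mathrm{d}x \;-\; k^2\int_\Omega u\,\overline{v}\,\mathrm{d}x \;-\; \mathrm{i}k\int_{\partial\Omega} u\,\overline{v}\,\mathrm{d}s .
\]

Taking \(v=u\) gives \(\Re\,a(u,u) = \|\nabla u\|_{L^2(\Omega)}^2 - k^2\|u\|_{L^2(\Omega)}^2\), which can be negative for oscillatory \(u\) once \(k\) is large; this is the sign-indefiniteness the formulations in this paper are designed to avoid.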
Vertical zonation of testate amoebae in the Elatia Mires, northern Greece : palaeoecological evidence for a wetland response to recent climate change or autogenic processes?
The Elatia Mires of northern Greece are unique ecosystems of high conservation value. The mires are climatically marginal and may be sensitive to changing hydroclimate, while northern Greece has experienced a significant increase in aridity since the late twentieth century. To investigate the impact of recent climatic change on the hydrology of the mires, the palaeoecological record was investigated from three near-surface monoliths extracted from two sites. Testate amoebae were analysed as sensitive indicators of hydrology. Results were interpreted using transfer function models to provide quantitative reconstructions of changing water table depth and pH. AMS radiocarbon dates and ²¹⁰Pb measurements suggest the peats were deposited within the last c. 50 years, but do not allow a secure chronology to be established. Results from all three profiles show a distinct shift towards a more xerophilic community, particularly marked by increases in Euglypha species. Transfer function results infer a distinct lowering of water tables in this period. A hydrological response to recent climate change is a tenable hypothesis to explain this change; however, other possible explanations include selective test decay, vertical zonation of living amoebae, ombrotrophication and local hydrological change. It is suggested that a peatland response to climatic change is the most probable hypothesis, showing the sensitivity of marginal peatlands to recent climatic change.
Electronic health record data quality assessment and tools: A systematic review
OBJECTIVE: We extended a 2013 literature review on electronic health record (EHR) data quality assessment approaches and tools to determine recent improvements or changes in EHR data quality assessment methodologies.
MATERIALS AND METHODS: We completed a systematic review of PubMed articles from 2013 to April 2023 that discussed the quality assessment of EHR data. We screened and reviewed papers for the dimensions and methods defined in the original 2013 manuscript. We categorized papers as data quality outcomes of interest, tools, or opinion pieces. We abstracted and defined additional themes and methods through an iterative review process.
RESULTS: We included 103 papers in the review, of which 73 were data quality outcomes of interest papers, 22 were tools, and 8 were opinion pieces. The most common dimension of data quality assessed was completeness, followed by correctness, concordance, plausibility, and currency. We abstracted conformance and bias as 2 additional dimensions of data quality and structural agreement as an additional methodology.
DISCUSSION: There has been an increase in EHR data quality assessment publications since the original 2013 review. Consistent dimensions of EHR data quality continue to be assessed across applications. Despite consistent patterns of assessment, there still does not exist a standard approach for assessing EHR data quality.
CONCLUSION: Guidelines are needed for EHR data quality assessment to improve the efficiency, transparency, comparability, and interoperability of data quality assessment. These guidelines must be both scalable and flexible. Automation could be helpful in generalizing this process.