
    New first trimester crown-rump length's equations optimized by structured data collection from a French general population

    --- Objectives --- Prior to foetal karyotyping, the likelihood of Down's syndrome is often determined by combining maternal age, serum free beta-HCG and PAPP-A levels, and embryonic measurements of crown-rump length and nuchal translucency between 11 and 13 weeks of gestational age. Precise knowledge of the normal values of these scan parameters during the first trimester is therefore important. This paper focuses on crown-rump length. --- Methods --- 402 pregnancies from in-vitro fertilization, allowing a precise estimation of foetal age (FA), were used to determine the best model describing crown-rump length (CRL) as a function of FA. Scan measurements by a single operator from 3846 spontaneous pregnancies, representative of the general population of Northern France, were used to build a mathematical model linking FA and CRL in a context as close as possible to the routine scan screening used in Down's syndrome likelihood determination. We modelled both CRL as a function of FA and FA as a function of CRL. We followed a clear methodology and performed regressions with heteroskedastic corrections as well as robust regressions. The results were compared by cross-validation to retain the equations with the best predictive power. We also studied the errors between observed and predicted values. --- Results --- Data from 513 spontaneous pregnancies allowed us to model CRL as a function of foetal age. The best model was a polynomial of degree 2. Dating with our equation, which models spontaneous pregnancies from a general population, was in good agreement with the objective dates obtained from the 402 IVF pregnancies, supporting the validity of our model. The CRL measurement was most precise where the SD was minimal (1.83 mm), at a CRL of 23.6 mm, for which our model predicted a foetal age of 49.4 days. Our study also allowed us to model the SD from 30 to 90 days of foetal age, offering the opportunity to use Z-scores to detect growth abnormalities in the future.
--- Conclusion --- Using powerful statistical tools, we report a good model of first-trimester embryonic growth in the general population, allowing better knowledge of the date of fertilization, which is useful in ultrasound screening for Down's syndrome. The optimal period to measure CRL and predict foetal age was at 49.4 days of foetal age (9 weeks of gestational age). Our results open the way to detecting foetal growth abnormalities using CRL Z-scores throughout the first trimester.
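The dating approach described above can be sketched as follows: fit a degree-2 polynomial of CRL on foetal age, then use the age-specific SD to standardise an observed CRL as a Z-score. This is an illustrative sketch with invented measurements, not the study's published equation; only the degree-2 model form, the minimal SD of 1.83 mm and the 49.4-day optimum are taken from the abstract.

```python
import numpy as np

# Hypothetical CRL measurements (mm) vs foetal age (days); NOT the study's data.
fa = np.array([35.0, 40.0, 45.0, 50.0, 55.0, 60.0, 65.0, 70.0])
crl = np.array([5.0, 9.0, 14.5, 21.0, 28.5, 37.0, 46.5, 57.0])

# Fit the degree-2 polynomial form the study retained for CRL as a function of FA.
coeffs = np.polyfit(fa, crl, deg=2)
predict_crl = np.poly1d(coeffs)

def crl_zscore(observed_crl, foetal_age, sd_at_age):
    """Standardised deviation of an observed CRL from the model prediction."""
    return (observed_crl - predict_crl(foetal_age)) / sd_at_age

# Example: a CRL of 23.6 mm at 49.4 days, the study's most precise point,
# standardised with the reported minimal SD of 1.83 mm.
z = crl_zscore(23.6, 49.4, 1.83)
```

Once the SD is modelled across 30 to 90 days of foetal age, the same Z-score can be computed at any first-trimester scan to flag growth abnormalities.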

    Appropriate disclosure of a diagnosis of dementia : identifying the key behaviours of 'best practice'

    Background: Despite growing evidence that many people with dementia want to know their diagnosis, there is wide variation in the attitudes of professionals towards disclosure. The disclosure of a diagnosis of dementia is increasingly recognised as a process rather than a one-off behaviour. However, the different behaviours that contribute to this process have not been comprehensively defined, and no intervention studies to improve diagnostic disclosure in dementia have been reported to date. As part of a larger study to develop an intervention to promote appropriate disclosure, we sought to identify important disclosure behaviours and to explore whether supplementing a literature review with other methods would identify new behaviours. Methods: To compile a comprehensive list of disclosure behaviours we conducted a literature review, interviewed people with dementia and informal carers, and used a consensus process involving health and social care professionals. Content analysis of the full list of behaviours was carried out. Results: Interviews were conducted with four people with dementia and six informal carers, and eight health and social care professionals took part in the consensus panel. The interviews, consensus panel and literature review together elicited 220 behaviours, 109 of which overlapped. The interviews and consensus panel elicited 27 behaviours supplementary to the review. Behaviours identified in the interviews appeared self-evident but highlighted deficiencies in current practice; those from the panel focused largely on balancing the needs of people with dementia and family members. Behaviours were grouped into eight categories: preparing for disclosure; integrating family members; exploring the patient's perspective; disclosing the diagnosis; responding to patient reactions; focusing on quality of life and well-being; planning for the future; and communicating effectively.
Conclusion: This exercise has highlighted the complexity of disclosing a diagnosis of dementia in an appropriate manner. It confirms that many of the behaviours identified in the literature (often based on professional opinion rather than empirical evidence) also resonate with people with dementia and informal carers. The presence of contradictory behaviours emphasises the need to tailor the process of disclosure to individual patients and carers. Our combined methods may be relevant to other efforts to identify and define complex clinical practices for further study. This project is funded by the UK Medical Research Council, grant reference number G0300999.

    A qualitative study of Telehealth patient information leaflets (TILs) : are we giving patients enough information?

    BACKGROUND: The provision of patient information leaflets about telehealth has been proposed as a strategy to promote awareness and adoption of telehealth services. However, such leaflets need to be designed carefully if adoption and awareness among potential users are to be promoted. The aims of this study were, first, to see how telehealth is portrayed in existing telehealth leaflets (THLs) and, second, to explore patients' perceptions of existing THLs, their engagement with the concept, and how THLs can be optimised. METHODS: A two-step approach was employed. The first phase used discourse analysis to compare 12 electronically and publicly available THLs with the existing THL guidance "Involve Yorkshire and Humber". The second phase involved 14 semi-structured interviews with potential telehealth users/patients, using the two leaflets that most closely matched the guidance, to gauge their perception of and engagement with the concept. Six interviews were audio-recorded and eight had detailed jotted notes. The interviews were transcribed and thematically analysed to identify key themes. RESULTS: The discourse analysis showed gaps and variations in how the screened leaflets addressed the cost of the telehealth service, confidentiality, patients' choices, equipment use and technical support. Analysis of the interviews revealed patients' need for clear and sufficient information about the telehealth service within THLs, as well as their preference for simpler terminology to describe telehealth and for clear, simple text with pictorial presentations.
The interviews also revealed barriers to the adoption of telehealth by the participants, such as lack of privacy and confidentiality of information, fear of technology breakdown and equipment failure, loss of face-to-face contact with healthcare professionals, and becoming too dependent on the telehealth service. CONCLUSION: The current study showed great variation among the screened THLs and highlighted gaps in the content and presentation of these leaflets. It also identified key issues to consider when designing future THLs to enhance telehealth uptake and use by patients.

    A meta-analysis of long-term effects of conservation agriculture on maize grain yield under rain-fed conditions

    Conservation agriculture involves reduced tillage, permanent soil cover and crop rotations to enhance soil fertility and to supply food from a dwindling land resource. Recently, conservation agriculture has been promoted in Southern Africa, mainly for maize-based farming systems. However, maize yields under rain-fed conditions are often variable, so there is a need to identify factors that influence crop yield under conservation agriculture and rain-fed conditions. Here, we studied maize grain yield data from rain-fed experiments lasting 5 years or more. We assessed the effect of long-term tillage and residue retention on maize grain yield under contrasting soil textures, nitrogen inputs and climates. Yield variability was measured by stability analysis. Our results show an increase in maize yield over time with conservation agriculture practices that include rotation and high input use in low-rainfall areas, but no difference in system stability under those conditions. We observed a strong relationship between maize grain yield and annual rainfall. Our meta-analysis gave the following findings: (1) 92% of the data show that mulch cover in high-rainfall areas leads to lower yields due to waterlogging; (2) 85% of the data show that soil texture is important in the temporal development of conservation agriculture effects, with improved yields more likely on well-drained soils; (3) 73% of the data show that conservation agriculture practices require high inputs, especially N, for improved yield; (4) 63% of the data show that increased yields are obtained with rotation, although calculations often do not include the variation in rainfall within and between seasons; (5) 56% of the data show that reduced tillage with no mulch cover leads to lower yields in semi-arid areas; and (6) when adequate fertiliser is available, rainfall is the most important determinant of yield in southern Africa.
It is clear from our results that conservation agriculture needs to be targeted and adapted to specific biophysical conditions for improved impact.
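Yield stability analysis of the kind mentioned above is commonly implemented as a Finlay-Wilkinson regression: a treatment's yield is regressed on the mean yield of all treatments in each environment, and the slope indicates how responsive (and hence how stable) the treatment is across environments. The abstract does not state which stability method was used, so this is a hedged sketch of one common approach, with invented yields rather than data from the meta-analysis.

```python
import numpy as np

# Illustrative yields (t/ha) for one treatment across six site-seasons;
# NOT data from the meta-analysis.
treatment_yield = np.array([1.2, 2.0, 2.9, 3.8, 4.6, 5.5])
# Environmental index: mean yield of all treatments at each site-season.
env_mean_yield = np.array([1.5, 2.2, 3.0, 3.7, 4.5, 5.2])

# Finlay-Wilkinson style stability: slope of treatment yield on the index.
# Slope ~ 1 means average stability; > 1 means responsive to good seasons but
# less stable in poor ones; < 1 means stable but unresponsive.
slope, intercept = np.polyfit(env_mean_yield, treatment_yield, deg=1)
```

A slope above 1, as in this toy example, would describe the pattern reported for high-input conservation agriculture: larger gains in good seasons but greater sensitivity to poor ones.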

    National audit of post-operative management in spinal surgery

    BACKGROUND: There is some evidence from a Cochrane review that rehabilitation following spinal surgery may be beneficial. METHODS: We conducted a survey of post-operative practice amongst spinal surgeons in the United Kingdom in 2002 to determine whether such interventions were being included routinely in the post-operative management of spinal patients. The survey included all surgeons who were members of either the British Association of Spinal Surgeons (BASS) or the Society for Back Pain Research. Data on the characteristics of each surgeon and his or her pattern of practice and post-operative care were collected via a reply-paid postal questionnaire. RESULTS: Usable responses were provided by 57% of the 89 surgeons surveyed. Most surgeons (79%) had a routine post-operative management regime, but only 35% gave their patients a written set of instructions concerning it. Over half (55%) of surgeons did not send their patients for any physiotherapy after discharge, and those who did refer organised an average of fewer than two sessions of treatment. Restrictions on lifting, sitting and driving showed considerable inconsistency both between surgeons and within the recommendations given by individual surgeons. CONCLUSION: Demonstrable inconsistencies within and between spinal surgeons in their approaches to post-operative management can be interpreted as evidence of continuing and significant uncertainty across the sub-speciality as to what constitutes best care in these areas of practice. Further large, rigorous, randomised controlled trials would be the best method for obtaining definitive answers to these questions.

    Effect of pre-stroke use of ACE inhibitors on ischemic stroke severity

    BACKGROUND: Recent trials suggest that angiotensin-converting enzyme inhibitors (ACEI) are effective in the prevention of ischemic stroke, as measured by reduced stroke incidence. We aimed to compare stroke severity between patients who were taking ACEI before stroke onset and those who were not, to examine the effect of pretreatment with ACEI on ischemic stroke severity. METHODS: We retrospectively studied 126 consecutive patients presenting within 24 hours of ischemic stroke onset, as confirmed by diffusion-weighted magnetic resonance imaging (DWI). We calculated the NIHSS score at presentation as the primary measure of clinical stroke severity, and categorized stroke severity as mild (NIHSS ≤ 7), moderate (NIHSS 8–13) or severe (NIHSS ≥ 14). We analyzed demographic data, risk-factor profile, blood pressure (BP) and medications on admission, and determined stroke mechanism according to TOAST criteria. We also measured the volumes of admission diffusion- and perfusion-weighted (DWI/PWI) magnetic resonance imaging lesions, as a secondary measure of ischemic tissue volume. We compared these variables between patients on ACEI and those who were not. RESULTS: Thirty-three patients (26%) were on ACE inhibitors. The overall median baseline NIHSS score was 5.5 (range 2–21) among ACEI-treated patients vs. 9 (range 1–36) in non-ACEI patients (p = 0.036). Patients on ACEI prior to their stroke had more mild and fewer severe strokes, and smaller DWI and PWI lesion volumes, than non-ACEI-treated patients; however, none of these differences was significant. Predictably, a higher percentage of patients on ACEI had a history of heart failure (p = 0.03). Age, time to imaging or neurological evaluation, risk-factor profile, concomitant therapy with lipid-lowering, other antihypertensive or antithrombotic agents, and admission BP were comparable between the two groups.
CONCLUSION: Our results suggest that ACE inhibitors may reduce the clinical severity of stroke, as measured by NIHSS score. Further larger-scale, prospective studies are needed to validate our findings and to elucidate the mechanism(s) of ACEI-mediated benefits in patients with ischemic stroke.
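The severity bands used in this study (mild ≤ 7, moderate 8–13, severe ≥ 14) are straightforward to encode. A minimal sketch, where the function name and the range check are our own additions:

```python
def nihss_category(score: int) -> str:
    """Categorise stroke severity by NIHSS using the study's cut-offs:
    mild <= 7, moderate 8-13, severe >= 14."""
    if not 0 <= score <= 42:  # NIHSS total scores range from 0 to 42
        raise ValueError("NIHSS scores range from 0 to 42")
    if score <= 7:
        return "mild"
    if score <= 13:
        return "moderate"
    return "severe"
```

For example, the non-ACEI median of 9 falls in the moderate band, while the ACEI-treated median of 5.5 falls in the mild band.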

    Pathophysiological Mechanisms of Severe Anaemia in Malawian Children

    BACKGROUND: Severe anaemia is a major cause of morbidity and mortality in African children. The aetiology is multi-factorial, but interventions have often targeted only one or a few causal factors, with limited success. METHODS AND FINDINGS: We assessed the contribution of different pathophysiological mechanisms (red cell production failure [RCPF], haemolysis and blood loss) to severe anaemia in Malawian children in whom aetiological factors have been described previously. More complex associations between aetiological factors and the mechanisms were explored using structural equation modelling. In the 235 children studied with severe anaemia (haemoglobin < 3.2 mmol/L [5.0 g/dl]), RCPF, haemolysis and blood loss were found in 48.1%, 21.7% and 6.9%, respectively. The RCPF figure increased to 86% when a less stringent definition of RCPF was applied. RCPF was the most common mechanism in each of the major aetiological subgroups (39.7–59.7%). Multiple aetiologies were common in children with severe anaemia. In the final model, nutritional and infectious factors, including malaria, were directly or indirectly associated with RCPF, but not with haemolysis. CONCLUSION: RCPF was the most common pathway leading to severe anaemia, arising from a variety of aetiological factors that were often found in combination. Unlike haemolysis or blood loss, RCPF is a defect that is likely to persist to a significant degree unless all of its contributing aetiologies are corrected. This provides a further explanation for the limited success of the single-factor interventions that have commonly been applied to the prevention or treatment of severe anaemia. Our findings underline the need for a package of measures directed against all of the local aetiologies of this often fatal paediatric syndrome.

    The relationship between spasticity in young children (18 months of age) with cerebral palsy and their gross motor function development

    Background: Spasticity is thought to influence the development of functional motor abilities among children with cerebral palsy (CP). The extent to which spasticity is associated with change in motor abilities in young children with CP has not been established. The objective of this study was to evaluate the relationship between initial spasticity in young children with CP and their gross motor function development over one year. Methods: Fifty children with CP aged 18 months, GMFCS levels I-V, participated in a longitudinal observational study. Change in gross motor functioning (GMFM-66) was measured over one year. The level of spasticity at the first assessment was determined with the Modified Tardieu Scale in three muscle groups of the lower extremity (the adductor muscles, the hamstrings and the m. gastrocnemius). A Spasticity Total Score per child was calculated, with a maximum score of 12 points. Results: Spearman's rho correlation (-0.28) revealed a statistically significant (p < 0.05) but weak relationship between the Spasticity Total Score and the change score of the GMFM-66. Conclusion: Our findings indicate that, when measured over one year, spasticity is only marginally related to gross motor function development in infants with CP. The initial level of spasticity is only one of the many child, environmental and family factors that determine the gross motor development of a young child with CP.
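Spearman's rho, the statistic used in this study, is simply the Pearson correlation of the ranks, which is why it captures monotone but non-linear association. A minimal self-contained sketch with average-rank handling of ties (illustrative only; the study's raw scores are not given in the abstract, where a rho of -0.28 corresponds to a weak negative association):

```python
import numpy as np

def rank(values):
    """1-based average ranks; tied values share the mean of their ranks."""
    values = np.asarray(values, dtype=float)
    order = np.argsort(values)
    sorted_vals = values[order]
    ranks = np.empty(len(values))
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and sorted_vals[j + 1] == sorted_vals[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    return np.corrcoef(rx, ry)[0, 1]
```

A perfectly decreasing pairing gives rho = -1; a monotone increasing but non-linear pairing still gives rho = +1, unlike Pearson's r on the raw values.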

    Bacterial and Archaea Community Present in the Pine Barrens Forest of Long Island, NY: Unusually High Percentage of Ammonia Oxidizing Bacteria

    The Pine Barrens Forest, one of the few preserved areas in the northeastern United States, presents a harsh soil environment for microorganisms to grow and survive in. In the current study we report the use of clustering methods to systematically select sampling locations representative of the entire forest, and we report the microbial diversity present in the various horizons of the soil. Sixty-six sampling locations were selected across the forest, and soils were collected from three horizons (sampling depths): 0–10 cm (horizon O), 11–25 cm (horizon A) and 26–40 cm (horizon B). Based on total microbial substrate utilization patterns and K-means clustering analysis, the soil of the Pine Barrens Forest can be classified into four distinct clusters at each of the three horizons. One soil sample from each of the four clusters was selected, and the archaeal and bacterial populations within the soil were studied by pyrosequencing. The results show that the microbial communities present in each of these clusters are different. Within those communities, microorganisms involved in the nitrogen cycle occupy a major fraction of the soil microbial community. A high level of diversity was observed for nitrogen-fixing bacteria. In contrast, Nitrosovibrio and Nitrosocaldus spp. are the sole bacterial and archaeal populations, respectively, carrying out ammonia oxidation in the soil.
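The K-means clustering applied above to substrate-utilization profiles can be sketched with a minimal Lloyd's-algorithm implementation. This is illustrative only: the study presumably used a statistics package, and the toy two-substrate profiles below are invented.

```python
import numpy as np

def kmeans(data, k, n_iter=100, seed=0):
    """Minimal Lloyd's-algorithm K-means; returns (centroids, labels)."""
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    # Initialise centroids on k distinct random samples.
    centroids = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each profile to its nearest centroid (Euclidean distance).
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned profiles.
        new_centroids = np.array([
            data[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

# Toy substrate-utilization profiles (two substrates per sample); two clear groups.
profiles = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]]
centroids, labels = kmeans(profiles, k=2)
```

In the study, each sampling location's full substrate-utilization pattern would be one row of `data`, and `k=4` recovers the four clusters per horizon.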

    Molecular and cellular mechanisms underlying the evolution of form and function in the amniote jaw.

    The amniote jaw complex is a remarkable amalgamation of derivatives from distinct embryonic cell lineages. During development, the cells in these lineages experience concerted movements, migrations, and signaling interactions that take them from their initial origins to their final destinations and imbue their derivatives with aspects of form including their axial orientation, anatomical identity, size, and shape. Perturbations along the way can produce defects and disease, but also generate the variation necessary for jaw evolution and adaptation. We focus on molecular and cellular mechanisms that regulate form in the amniote jaw complex, and that enable structural and functional integration. Special emphasis is placed on the role of cranial neural crest mesenchyme (NCM) during the species-specific patterning of bone, cartilage, tendon, muscle, and other jaw tissues. We also address the effects of biomechanical forces during jaw development and discuss ways in which certain molecular and cellular responses add adaptive and evolutionary plasticity to jaw morphology. Overall, we highlight how variation in molecular and cellular programs can promote the phenomenal diversity and functional morphology achieved during amniote jaw evolution or lead to the range of jaw defects and disease that affect the human condition