
    Prostaglandin insert dinoprostone versus trans-cervical balloon catheter for outpatient labour induction: a randomised controlled trial of feasibility (PROBIT-F)

    Background The aim was to assess the feasibility of conducting a randomised controlled trial (RCT) of induction of labour comparing two methods in the outpatient setting. Methods An open-label feasibility RCT was conducted in two UK maternity units from October 2017 to March 2019. Women aged ≥ 16 years, undergoing induction of labour (IOL) at term, with intact membranes and deemed suitable for outpatient IOL according to local guidelines were considered eligible. They were randomised to cervical ripening balloon catheter (CRB) or vaginal dinoprostone (Propess). The participants completed a questionnaire and a sub-group underwent detailed interview. Service use and cost data were collected via the Adult Service Use Schedule (AD-SUS). Women who declined to participate were requested to complete a decliners' questionnaire. Results During the study period, 274 eligible women were identified. Two hundred thirty (83.9%) were approached for participation, of whom 84/230 (36.5%) agreed and 146 did not. Of those who agreed, 38 were randomised to Propess (n = 20) or CRB (n = 18). Decliner data were collected for 93 women. The reasons for declining were refusal of IOL altogether (n = 22), preference for inpatient IOL (n = 22) and preference for a specific method, Propess (n = 19). The intended sample size of 120 was not reached due to restrictive criteria for suitability for outpatient IOL, participant preference for Propess and shortage of research staff. The intervention as randomised was received by 29/38 (76%) women. Spontaneous vaginal delivery was observed in 9/20 (45%) women in the dinoprostone group and 11/18 (61%) women in the CRB group. Severe maternal adverse events were recorded in one woman in each group. All babies were born in good condition and all except one (37/38, 97.4%) remained with the mother after delivery. No deaths were recorded. 21% of women in the dinoprostone group were re-admitted prior to diagnosis of active labour compared to 12% in the CRB group. 
Conclusions A third of the approached eligible women agreed to randomisation. An RCT is not feasible in the current service context. Modifications to the eligibility criteria for outpatient IOL, better information provision and round-the-clock availability of research staff would be needed to reach sufficient numbers.

    Impact of treatment on damage and hospitalization in elderly patients with microscopic polyangiitis and granulomatosis with polyangiitis

    OBJECTIVE: Age is a risk factor for organ damage, adverse events, and mortality in microscopic polyangiitis (MPA) and granulomatosis with polyangiitis (GPA). However, the relationship between treatment and damage, hospitalizations, and causes of death in elderly patients is largely unknown. METHODS: Consecutive patients from Sweden, England, and the Czech Republic diagnosed between 1997 and 2013 were included. Inclusion criteria were a diagnosis of MPA or GPA and age 75 years or more at diagnosis. Treatment with cyclophosphamide, rituximab, and corticosteroids during the first three months was registered. Outcomes up to two years from diagnosis included vasculitis damage index (VDI), hospitalization, and cause of death. RESULTS: Treatment data were available for 167 of 202 patients. At two years, 4% had no items of damage. There was a positive association between VDI score at two years and Birmingham Vasculitis Activity Score at onset, and a negative association with treatment using cyclophosphamide or rituximab. Intravenous methylprednisolone dose was associated with treatment-related damage. During the first year, 69% of patients were readmitted to hospital. MPO-ANCA positivity and lower creatinine levels decreased the odds of readmission. The most common cause of death was infection, and this was associated with cumulative oral prednisolone dose. CONCLUSION: Immunosuppressive treatment with cyclophosphamide or rituximab in elderly patients with MPA and GPA was associated with less permanent organ damage and was not associated with hospitalization. However, higher doses of corticosteroids during the first three months were associated with treatment-related damage and fatal infections.

    Brain age predicts mortality

    Age-associated disease and disability are placing a growing burden on society. However, ageing does not affect people uniformly. Hence, markers of the underlying biological ageing process are needed to help identify people at increased risk of age-associated physical and cognitive impairments and, ultimately, death. Here, we present such a biomarker, ‘brain-predicted age’, derived using structural neuroimaging. Brain-predicted age was calculated using machine-learning analysis, trained on neuroimaging data from a large healthy reference sample (N=2001), then tested in the Lothian Birth Cohort 1936 (N=669), to determine relationships with age-associated functional measures and mortality. Having a brain-predicted age indicative of an older-appearing brain was associated with: weaker grip strength, poorer lung function, slower walking speed, lower fluid intelligence, higher allostatic load and increased mortality risk. Furthermore, while combining brain-predicted age with grey matter and cerebrospinal fluid volumes (themselves strong predictors) did not improve mortality risk prediction, the combination of brain-predicted age and DNA-methylation-predicted age did. This indicates that neuroimaging and epigenetic measures of ageing can provide complementary data regarding health outcomes. Our study introduces a clinically relevant neuroimaging ageing biomarker and demonstrates that combining distinct measurements of biological ageing further helps to determine risk of age-related deterioration and death.

    The association between clinical integration of care and transfer of veterans with acute coronary syndromes from primary care VHA hospitals

    BACKGROUND: Few studies report on the effect of organizational factors facilitating transfer between primary and tertiary care hospitals either within an integrated health care system or outside it. In this paper, we report on the relationship between degree of clinical integration of cardiology services and transfer rates of acute coronary syndrome (ACS) patients from primary to tertiary hospitals within and outside the Veterans Health Administration (VHA) system. METHODS: Prospective cohort study. Transfer rates were obtained for all patients with ACS diagnoses admitted to 12 primary VHA hospitals between 1998 and 1999. Binary variables measuring clinical integration were constructed for each primary VHA hospital reflecting: presence of an on-site VHA cardiologist; a referral coordinator at the associated tertiary VHA hospital; and/or a referral coordinator at the primary VHA hospital. We assessed the association between the integration variables and overall transfer from primary to tertiary hospitals, using random effects logistic regression, controlling for clustering at two levels and adjusting for patient characteristics. RESULTS: Three of twelve hospitals had a VHA cardiologist on site, six had a referral coordinator at the tertiary VHA hospital, and four had a referral coordinator at the primary hospital. Presence of a VHA staff cardiologist on site and a referral coordinator at the tertiary VHA hospital each decreased the likelihood of any transfer (OR 0.45, 95% CI 0.27–0.77; OR 0.46, 95% CI 0.27–0.78, p = 0.002). Conversely, having a referral coordinator at the primary VHA hospital increased the likelihood of transfer (OR 6.28, CI 2.92–13.48). CONCLUSIONS: Elements of clinical integration are associated with transfer, an important process in the care of ACS patients. In promoting optimal patient care, clinical integration factors should be considered in addition to patient characteristics.

    Clinicopathological Profile and Surgical Treatment of Abdominal Tuberculosis: A Single Centre Experience in Northwestern Tanzania.

    Abdominal tuberculosis continues to be a major public health problem worldwide and poses diagnostic and therapeutic challenges to general surgeons practicing in resource-limited countries. This study was conducted to describe the clinicopathological profile and outcome of surgical treatment of abdominal tuberculosis in our setting and to compare these with what is described in the literature. A prospective descriptive study of patients who presented with abdominal tuberculosis was conducted at Bugando Medical Centre (BMC) in northwestern Tanzania from January 2006 to February 2012. Ethical approval to conduct the study was obtained from the relevant authorities. Statistical data analysis was performed using SPSS version 17.0. Of the 256 patients enrolled in the study, males outnumbered females. The median age was 28 years (range = 16-68 years). The majority of patients (77.3%) had primary abdominal tuberculosis. A total of 127 (49.6%) patients presented with intestinal obstruction, 106 (41.4%) with peritonitis, 17 (6.6%) with abdominal masses and 6 (2.3%) with multiple fistulae in ano. Forty-eight (18.8%) patients were HIV positive. A total of 212 (82.8%) patients underwent surgical treatment for abdominal tuberculosis. Bands/adhesions (58.5%) were the most common operative finding. The ileo-caecal region was the bowel segment most commonly involved, affected in 122 (57.5%) patients. Release of adhesions and bands was the most frequent surgical procedure, performed in 58.5% of cases. Complication and mortality rates were 29.7% and 18.8%, respectively. The overall median length of hospital stay was 32 days and was significantly longer in patients with complications (p < 0.001). Advanced age (≥ 65 years), co-morbid illness, late presentation, HIV positivity and CD4+ count < 200 cells/μl were statistically significantly associated with mortality (p < 0.0001). Follow-up of patients was generally poor, as only 37.5% of patients were available for follow-up at twelve months after discharge. 
Abdominal tuberculosis constitutes a major public health problem in our environment and presents a diagnostic challenge requiring a high index of clinical suspicion. Early diagnosis, early anti-tuberculous therapy and surgical treatment of the associated complications are essential for survival.

    Integrative analysis reveals CD38 as a therapeutic target for plasma cell-rich pre-disease and established rheumatoid arthritis and systemic lupus erythematosus

    Figure S2. Daratumumab has no impact on T cells and monocytes ex vivo. (A) Total number of CD3+ T cells at each daratumumab concentration at 72 h post-treatment. (B) Quantification of CD38 MFI on CD3+ T cells at 72 h post-culture with isotype control or daratumumab at the indicated concentrations. (C) Total number of CD14+ monocytes at each daratumumab concentration at 72 h post-treatment. (D) Quantification of CD38 MFI on CD14+ monocytes at 72 h post-culture with isotype control or daratumumab at the indicated concentrations. Data shown represent four patients with SLE, six with RA and six healthy control donors. (PNG 2127 kb)

    Dissecting mitosis by RNAi in Drosophila tissue culture cells

    Here we describe a detailed methodology for studying, by dsRNA-mediated interference (RNAi) in cultured cells of Drosophila melanogaster, the function of genes whose products act during mitosis. This procedure is particularly useful for the analysis of genes for which genetic mutations are not available, or for the dissection of complicated phenotypes derived from the analysis of such mutants. With the advent of whole-genome sequencing, it is expected that RNAi-based screens will be one method of choice for the identification and study of novel genes involved in particular cellular processes. In this paper we focus particularly on the procedures for the proper phenotypic analysis of cells after RNAi-mediated depletion of proteins required for mitosis, the process by which the genetic information is segregated equally between daughter cells. We use RNAi of the microtubule-associated protein MAST/Orbit as an example of the usefulness of the technique.

    Declining Burden of Malaria Over two Decades in a Rural Community of Muheza District, North-Eastern Tanzania.

    The recently reported declining burden of malaria in some African countries has been attributed to the scaling-up of different interventions, although in some areas these changes started before the implementation of major interventions. This study assessed the long-term trends of malaria burden over 20 years (1992–2012) in Magoda and over 15 years in Mpapayu village of Muheza district, north-eastern Tanzania, in relation to different interventions as well as changing national malaria control policies. Repeated cross-sectional surveys recruited individuals aged 0–19 years from the two villages, from whom blood smears were collected for detection of malaria parasites by microscopy. Prevalence of Plasmodium falciparum infections and other indices of malaria burden (prevalence of anaemia, splenomegaly and gametocytes) were compared across the years and between the study villages. Major interventions deployed (including a mobile clinic, bed nets and other research activities) and changes in national malaria control policies were also recorded. In Magoda, the prevalence of P. falciparum infections initially decreased between 1992 and 1996 (from 83.5 to 62.0%), stabilized between 1996 and 1997, and further declined to 34.4% in 2004. A temporary increase between 2004 and 2008 was followed by a progressive decline to 7.2% in 2012, a more than 10-fold decrease since 1992. In Mpapayu (from 1998), the highest prevalence was 81.5% in 1999, and it decreased to 25% in 2004. After a slight increase in 2008, a steady decline followed, reaching <5% from 2011 onwards. Bed net usage was high in both villages from 1999 to 2004 (≥88%) but decreased between 2008 and 2012 (range, 28%–68%). After adjusting for the effects of bed nets, age, fever and year of study, the risk of P. falciparum infections decreased significantly by ≥97% in both villages between 1999 and 2012 (p < 0.001). The prevalence of splenomegaly (from >40% to <1%) and of gametocytes (from 23% to <1%) also decreased in both villages. 
Discussion and conclusions A remarkable decline in the burden of malaria occurred between 1992 and 2012, and the initial decline (1992–2004) was most likely due to the deployment of interventions, such as bed nets, and better services through research activities. Apart from changes in drug policies, the steady decline observed from 2008 occurred when bed net coverage was low, suggesting that other factors contributed to the most recent pattern. These results suggest that continued monitoring is required to determine the causes of the changing malaria epidemiology and to track progress towards maintaining low malaria transmission and reaching the related Millennium Development Goals.