
    Outcomes following kidney transplantation in patients with sickle cell disease: The impact of automated exchange blood transfusion

    There are over 12,000 people with sickle cell disease (SCD) in the UK, and 4–12% of patients who develop sickle cell nephropathy (SCN) progress to end-stage renal disease (ESRD). Renal transplantation offers the best outcomes for these patients, but their access to transplantation is often limited. Regular automated exchange blood transfusions (EBT) reduce the complications of SCD and may improve outcomes; however, concerns over alloimmunisation limit its widespread implementation. In this retrospective multicentre study, data were collected on 34 SCD patients who received a kidney transplant across six London hospitals between 1997 and 2017. 20/34 patients were on an EBT programme, pre- or post-renal transplantation. Overall patient and graft survival were inferior to contemporaneous UK data for the ESRD population as a whole, a finding which is well recognised. However, patient survival (95% CI, p = 0.0032), graft survival and graft function were superior at all time points in those who received EBT versus those who did not. 4/20 patients (20%) on EBT developed de novo donor-specific antibodies (DSAs), compared with 3/14 patients (21%) not on EBT. The incidence of rejection was 5/18 (28%) in those on EBT, compared with 7/13 (54%) in those not on EBT. In conclusion, our data, while limited by an inevitably small sample size and differences in the date of transplantation, suggest that long-term automated EBT post renal transplant is effective and safe, with improvement in graft and patient outcomes and no increase in antibody formation or graft rejection.
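
The abstract compares patient and graft survival between the EBT and non-EBT groups at all time points. As a hedged illustration only (not the authors' actual analysis), the sketch below shows how such a two-group survival comparison is commonly set up with Kaplan–Meier curves and a log-rank test; the variable names and toy data are hypothetical.

```python
# Illustrative two-group survival comparison (hypothetical data, not the study's).
# Requires: pip install lifelines
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Toy follow-up data: years to graft loss/death and an event indicator
# (1 = event observed, 0 = censored). Group labels mirror the abstract's EBT split.
df = pd.DataFrame({
    "years": [1.2, 3.5, 5.0, 7.1, 2.0, 0.8, 4.4, 6.3, 1.5, 2.9],
    "event": [0,   0,   1,   0,   1,   1,   0,   1,   1,   0],
    "group": ["EBT"] * 5 + ["no-EBT"] * 5,
})

ebt, no_ebt = df[df.group == "EBT"], df[df.group == "no-EBT"]

# Kaplan-Meier survival curve per group
kmf = KaplanMeierFitter()
kmf.fit(ebt["years"], event_observed=ebt["event"], label="EBT")
print(kmf.survival_function_)

# Log-rank test for a difference between the two survival curves
res = logrank_test(ebt["years"], no_ebt["years"],
                   event_observed_A=ebt["event"],
                   event_observed_B=no_ebt["event"])
print(f"log-rank p-value: {res.p_value:.4f}")
```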

    Impact of organised programs on colorectal cancer screening

    Purpose: Colorectal cancer (CRC) screening has been shown to decrease CRC mortality, and organised mass screening programs are being implemented in France. How these programs are perceived by the general population and by general practitioners is not well known. Methods: Two nationwide observational telephone surveys were conducted in early 2005: the first among a representative sample of subjects living in France aged between 50 and 74 years, covering geographical departments both with and without implemented screening services; the second among general practitioners (GPs). Descriptive analyses and multiple logistic regression were carried out. Results: Twenty-five percent of the persons (N = 1509) reported having undergone at least one CRC screening, and 18% of the 600 interviewed GPs reported systematically recommending a screening test for CRC to their patients aged 50–74 years. The odds ratio (OR) of having undergone a screening test using FOBT was 3.91 (95% CI: 2.49–6.16) for those living in departments with organised screening (referent group: departments without organised screening), almost twice as high as the impact of educational level (OR = 2.03; 95% CI: 1.19–3.47). Conclusion: CRC screening uptake is higher in geographical departments where it is organised by health authorities. In France, organised screening programs decrease inequalities in CRC screening.
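
For readers unfamiliar with how the reported odds ratios are obtained, the following is a minimal, hypothetical sketch of a multiple logistic regression in which ORs and 95% CIs are recovered by exponentiating the fitted coefficients; the covariates and data are invented for illustration and do not reproduce the survey's analysis.

```python
# Minimal logistic-regression sketch (invented data; not the survey's analysis).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1509  # same order as the survey sample; the data themselves are synthetic

df = pd.DataFrame({
    "organised_dept": rng.integers(0, 2, n),   # 1 = department with organised screening
    "high_education": rng.integers(0, 2, n),   # 1 = higher educational level
})
# Synthetic outcome: probability of screening rises with both covariates
logit_p = -1.6 + 1.36 * df.organised_dept + 0.71 * df.high_education
df["screened"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["organised_dept", "high_education"]])
fit = sm.Logit(df["screened"], X).fit(disp=0)

# Odds ratios and 95% CIs come from exponentiating coefficients and CI bounds
print(np.exp(fit.params))      # point estimates (ORs)
print(np.exp(fit.conf_int()))  # 95% CI per covariate
```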

    Hospital outpatient perceptions of the physical environment of waiting areas: the role of patient characteristics on atmospherics in one academic medical center

    Background: This study examines hospital outpatient perceptions of the physical environment of the outpatient waiting areas in one medical center, together with the relationship between patient characteristics and their perceptions of and needs for those areas. Method: The examined medical center consists of five main buildings which house seventeen primary waiting areas for the outpatient clinics of nine medical specialties: 1) Internal Medicine; 2) Surgery; 3) Ophthalmology; 4) Obstetrics-Gynecology and Pediatrics; 5) Chinese Medicine; 6) Otolaryngology; 7) Orthopedics; 8) Family Medicine; and 9) Dermatology. A 15-item structured questionnaire was developed to rate patient satisfaction across four dimensions of the physical environment of the waiting areas: 1) visual environment; 2) hearing environment; 3) body contact environment; and 4) cleanliness. The survey was conducted between November 28, 2005 and December 8, 2005, and a total of 680 outpatients responded. Descriptive, univariate, and multiple regression analyses were applied. Results: All 15 items were rated relatively high, ranging from 3.362 to 4.010 against a neutral score of 3. Using summated scores from a principal component analysis for the four constructed dimensions of satisfaction with the physical environment (visual environment, hearing environment, body contact environment, and cleanliness), multiple regression analyses revealed that patient satisfaction with the physical environment of outpatient waiting areas was associated with gender, age, visiting frequency, and visiting time. Conclusion: Patients' socio-demographic and contextual backgrounds were shown to affect their satisfaction with the physical environment of outpatient waiting areas. Beyond noting the overall rankings of the less satisfactory items, patients' personal characteristics deserve further attention when redesigning more comfortable and customized waiting areas.
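
As a rough illustration of the "summated scores" step (assumed details; the paper's exact item-to-dimension mapping is not given in the abstract), the sketch below groups questionnaire items by their dominant principal-component loading and averages each group into a dimension score that could then be regressed on patient characteristics.

```python
# Hypothetical sketch of PCA-guided summated scores (not the study's actual items).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
responses = rng.integers(1, 6, size=(680, 15)).astype(float)  # 680 patients x 15 items

pca = PCA(n_components=4)   # four satisfaction dimensions, as in the abstract
pca.fit(responses)

# Assign each item to the component on which it loads most strongly.
assignment = np.abs(pca.components_).argmax(axis=0)   # item index -> dimension index

# Average the items within each dimension into one summated score per patient.
dim_scores = {d: responses[:, assignment == d].mean(axis=1)
              for d in range(4) if (assignment == d).any()}
for d, scores in dim_scores.items():
    print(f"dimension {d}: mean score {scores.mean():.2f}")
```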

    Duration of temporary catheter use for hemodialysis: an observational, prospective evaluation of renal units in Brazil

    Background: For chronic hemodialysis, the ideal permanent vascular access is the arteriovenous fistula (AVF); temporary catheters should be reserved for acute dialysis needs. The AVF is associated with lower infection rates, better clinical results, and higher quality of life and survival when compared to temporary catheters. In Brazil, the proportion of patients using temporary catheters for more than 3 months from the beginning of therapy is used to evaluate the quality of renal units. The aim of this study is to evaluate factors associated with the time between the beginning of hemodialysis with a temporary catheter and the placement of the first arteriovenous fistula in Brazil. Methods: This is an observational, prospective non-concurrent study using national administrative registries of all patients financed by the public health system who began renal replacement therapy (RRT) between 2000 and 2004 in Brazil. Eligible incident patients were those undergoing hemodialysis for the first time. Patients were excluded who had hemodialysis reportedly started after the date of death (inconsistent database), were younger than 18 years old, had HIV, had no record of the first dialysis unit, or were dialyzed in units with fewer than twenty patients. To evaluate individual and renal-unit factors associated with the event of interest, a frailty model was used (N = 55,589). Results: Among the 23,824 patients (42.9%) who underwent fistula placement during the study period, 18.2% kept the temporary catheter for more than three months before fistula creation. The analysis identified five statistically significant factors associated with a longer time to first fistula: higher age (hazard ratio, HR 0.99, 95% CI 0.99-1.00); hypertension and cardiovascular diseases as the cause of chronic renal disease (HR 0.94, 95% CI 0.9-0.98); residing in capital cities (HR 0.92, 95% CI 0.9-0.95); certain regions of Brazil - South (HR 0.83, 95% CI 0.8-0.87), Midwest (HR 0.88, 95% CI 0.83-0.94), Northeast (HR 0.91, 95% CI 0.88-0.94), or North (HR 0.88, 95% CI 0.83-0.94); and the type of renal unit (public or private). Conclusion: Monitoring the provision of arteriovenous fistulas in renal units could improve the care given to patients with end-stage renal disease.
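
The frailty model referred to here is a Cox-type survival model with a random effect shared by patients treated in the same renal unit. The abstract does not give the exact specification, so the formulation below is a common gamma-frailty sketch, assumed for illustration:

```latex
% Shared-frailty proportional hazards model (assumed gamma frailty; the
% abstract does not state the authors' exact parameterisation).
% Hazard for patient i in renal unit j:
\lambda_{ij}(t) = Z_j \, \lambda_0(t) \, \exp\!\big(\mathbf{x}_{ij}^{\top} \boldsymbol{\beta}\big),
\qquad Z_j \sim \operatorname{Gamma}\!\left(\tfrac{1}{\theta}, \tfrac{1}{\theta}\right), \quad \mathbb{E}[Z_j] = 1 .
```

Under this formulation, the unit-level frailty Z_j scales the hazard for every patient in unit j, capturing unmeasured between-unit differences. Since the event here is fistula placement, hazard ratios below 1 (e.g. HR 0.99 per additional year of age) correspond to a lower instantaneous rate of placement and hence a longer wait.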

    Using theory to improve low back pain care in Australian Aboriginal primary care: a mixed method single cohort pilot study

    Background: Low back pain (LBP) care is frequently discordant with research evidence. This pilot study evaluated changes in LBP care following a systematic, theory-informed intervention in a rural Australian Aboriginal Health Service. We aimed to improve three aspects of care: reduce inappropriate LBP radiological imaging referrals, increase psychosocially oriented patient assessment, and increase the provision of LBP self-management information to patients. Methods: Three interventions to improve care were developed using a four-step systematic implementation approach. A mixed-methods pre/post cohort design evaluated changes in the three behaviours using a clinical audit of LBP care in a six-month period prior to the intervention and then following implementation. In-depth interviews elicited the perspectives of the General Practitioners (GPs) involved. Qualitative analysis was guided by the theoretical domains framework. Results: The rate of guideline-inconsistent imaging referrals (GICI) improved from 4.1 GICI per 10 patients to 0.4 (95% CI for decrease in rate: 1.6 to 5.6) amongst GPs involved in the intervention. Amongst non-participating GPs (locum/part-time GPs who commenced post-intervention) the rate of GICI increased from 1.5 to 4.4 GICI per 10 patients (95% CI for increase in rate: 0.5 to 5.3). There was a modest increase in the number of patients who received LBP self-management information from participating GPs, and no substantial changes to psychosocially oriented patient assessments by any participants; however, GPs qualitatively reported that their behaviours had changed. Knowledge and beliefs about consequences were important behavioural domains related to the changes. Environmental and resource factors, including protocols for locum staff and clinical tools embedded in patient management software, were identified as future strategies. Conclusions: A systematic intervention model resulted in partial improvements in LBP care. Determinants of practice change amongst GPs were increased knowledge of clinical guidelines, education delivered by someone considered a trusted source of information, and awareness of the negative consequences of inappropriate practices, especially radiological imaging, on patient outcomes. Inconsistent and non-evidence-based practices amongst locum GPs emerged as an issue and will be a significant future challenge. The systematic approach utilised is applicable to other services interested in improving LBP care.
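
As a purely illustrative aside (the audit counts below are invented, and the paper's exact interval method is not stated in the abstract), a pre/post rate per 10 patients and a bootstrap CI for its decrease can be computed along these lines:

```python
# Hypothetical audit sketch: GICI rate per 10 patients, pre vs post intervention,
# with a percentile-bootstrap CI for the decrease. Counts are invented.
import numpy as np

rng = np.random.default_rng(2)
pre = rng.integers(0, 2, 40)                  # 1 = guideline-inconsistent referral
post = (rng.random(50) < 0.05).astype(int)    # rarer events after intervention

rate = lambda x: 10 * x.mean()  # events per 10 audited patients
print(f"pre: {rate(pre):.1f}, post: {rate(post):.1f} per 10 patients")

# Percentile bootstrap for the decrease in rate
diffs = [rate(rng.choice(pre, pre.size)) - rate(rng.choice(post, post.size))
         for _ in range(5000)]
lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"decrease: {rate(pre) - rate(post):.1f} (95% CI {lo:.1f} to {hi:.1f})")
```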

    Three-dimensional drip infusion CT cholangiography in patients with suspected obstructive biliary disease: a retrospective analysis of feasibility and adverse reaction to contrast material.

    BACKGROUND: Computed tomography cholangiography (CTC) is a fast and widely available alternative technique for visualising hepatobiliary disease in patients with an inconclusive ultrasound when MRI cannot be performed. The method has previously been relatively unknown and sparsely used, due to concerns about adverse reactions and about image quality in patients with impaired hepatic function and thus reduced contrast excretion. In this retrospective study, the feasibility of CTC and the frequency of adverse reactions were evaluated for a drip infusion scheme based on bilirubin levels. METHODS: The medical records of patients who had undergone upper abdominal spiral CT with subsequent three-dimensional rendering of the biliary tract by means of CTC over a seven-year period were retrospectively reviewed regarding serum bilirubin concentration, adverse reactions, and the presence of visible contrast medium in the bile ducts at CT examination. In total, 153 consecutive examinations in 142 patients were reviewed. RESULTS: Contrast medium was observed in the bile ducts in 144 examinations. In 110 examinations, the infusion time had been recorded in the medical records; among these, 42 examinations had an elevated bilirubin value (>19 µmol/L). There were nine patients without contrast excretion, 3 of whom had a normal bilirubin value and 6 an elevated value (25–133 µmol/L). Two of the 153 examinations were inconclusive. One subject (0.7%) experienced a minor adverse reaction, a pricking sensation in the face; no other adverse effects were noted. CONCLUSION: We conclude that drip infusion CTC, with the infusion rate of the biliary contrast agent iotroxate governed by the serum bilirubin value, is a feasible and safe alternative to magnetic resonance cholangiography (MRC) in patients with and without impaired biliary excretion.
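
To put the single adverse reaction in 153 examinations (0.7%) in context, an exact binomial (Clopper–Pearson) 95% CI can be attached to that proportion; the short sketch below is an illustrative calculation, not part of the paper's analysis.

```python
# Exact (Clopper-Pearson) 95% CI for 1 adverse reaction in 153 examinations.
# Illustrative only; the paper reports the raw proportion without an interval.
from scipy.stats import beta

k, n, alpha = 1, 153, 0.05
lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
print(f"{k}/{n} = {k/n:.2%}, 95% CI {lower:.2%} to {upper:.2%}")
```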

    An investigation into the validity of cervical spine motion palpation using subjects with congenital block vertebrae as a 'gold standard'

    BACKGROUND: Although the effectiveness of manipulative therapy for treating back and neck pain has been demonstrated, the validity of many of the procedures used to detect joint dysfunction has not been confirmed. Practitioners of manual medicine frequently employ motion palpation as a diagnostic tool, despite conflicting evidence regarding its utility and reliability. The introduction of various spinal models with artificially introduced 'fixations' as an attempted 'gold standard' has met with frustration and frequent mechanical failure. Because direct comparison against a 'gold standard' allows the validity, specificity and sensitivity of a test to be calculated, identifying a realistic 'gold standard' against which motion palpation can be evaluated is essential. The objective of this study was to introduce a new, realistic 'gold standard', the congenital block vertebra (CBV), to assess the validity of motion palpation in detecting a true fixation. METHODS: Twenty fourth-year chiropractic students examined the cervical spines of three subjects with single-level congenital block vertebrae, using two commonly employed motion palpation tests. The examiners, who were blinded to the presence of congenital block vertebrae, were asked to identify the most hypomobile segment(s). The congenital block segments comprised two subjects with fusion at the C2-3 level and one with fusion at C5-6. Exclusion criteria included subjects who were frankly symptomatic, had moderate or severe degenerative changes in their cervical spines, or displayed signs of cervical instability. Spinal levels were marked on the subjects' skin overlying the facet joints from C1 to C7 bilaterally, and the motion segments were then marked alphabetically with 'A' corresponding to C1-2. Kappa coefficients (K) were calculated to determine the validity of motion palpation in detecting the congenitally fused segments as the 'most hypomobile' segments; the sensitivity and specificity of the diagnostic procedure were also calculated. RESULTS: Kappa coefficients showed substantial overall agreement for identification of the segment of greatest hypomobility (K = 0.65), with substantial (K = 0.76) and moderate (K = 0.46) agreement for hypomobility at C2-3 and C5-6 respectively. Sensitivity ranged from 55% at the C5-6 CBV to 78% at the C2-3 level. Specificity of the procedure was high (91–98%). CONCLUSION: This study indicates that relatively inexperienced examiners are capable of correctly identifying inter-segmental fixations (CBV) in the cervical spine using two commonly employed motion palpation tests. The use of a 'gold standard' (CBV) in this study and the substantial agreement achieved lend support to the validity of motion palpation in detecting major spinal fixations in the cervical spine.
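
For readers who want the arithmetic behind the reported agreement statistics, here is a minimal, self-contained sketch (with an invented 2x2 confusion matrix, not the study's data) showing how Cohen's kappa, sensitivity, and specificity are computed from examiner calls against the CBV gold standard.

```python
# Cohen's kappa, sensitivity, and specificity from a 2x2 confusion matrix.
# Counts are invented for illustration; they are not the study's data.
import numpy as np

# Rows: gold standard (segment fused / not fused); columns: examiner call.
#              called fixed   called not fixed
cm = np.array([[45,            15],    # truly fused (CBV) segments
               [20,           320]])   # truly mobile segments

tp, fn = cm[0]
fp, tn = cm[1]
n = cm.sum()

sensitivity = tp / (tp + fn)   # fused segments correctly called fixed
specificity = tn / (tn + fp)   # mobile segments correctly called not fixed

# Cohen's kappa: observed agreement versus agreement expected by chance
p_obs = (tp + tn) / n
p_yes = ((tp + fn) / n) * ((tp + fp) / n)   # chance both say "fixed"
p_no  = ((fp + tn) / n) * ((fn + tn) / n)   # chance both say "not fixed"
kappa = (p_obs - (p_yes + p_no)) / (1 - (p_yes + p_no))

print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, kappa {kappa:.2f}")
```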