
    The value of home urodiagnostics in the assessment of men with lower urinary tract symptoms

    PhD Thesis. A third of all men experience unpleasant lower urinary tract symptoms (LUTS), such as a poor stream and being unable to postpone urination, usually later in life. Two important investigations for these men are a one-off clinic-based measurement of urine flow rate and the patient's handwritten record of volumes passed over the course of several days. Well-acknowledged deficiencies in these tests have spurred research into home-based alternatives. 'Home urodiagnostic' devices have been developed that obtain multiple measurements of flow rate and an electronic voiding diary. However, little conclusive evidence exists as to their clinical utility. The aim of this thesis is to investigate the value of home urodiagnostics in the assessment of men with LUTS. First, the improvement in clinical performance of an average rather than a single flow rate measurement is calculated from the theory of combining variances, predicting benefit for thousands of men per year. Next, finding existing devices deficient, the characteristics and technical performance of a novel device are presented; despite its low cost, it is found to meet the required standard. In a study of conventional versus home urodiagnostics in men with LUTS, the latter was better tolerated, less likely to fail, and gave more reliable measurements of flow rate. A study in which home urodiagnostics was performed before and after prostate surgery reveals large variation in the response of flow rate to surgery; subtle changes within an individual are demonstrable. Finally, home urodiagnostics is piloted within primary care, where the resulting data suggest benefit from a change in management strategy for over a third of the patients studied. In conclusion, home urodiagnostics shows promise for improving the assessment of men with LUTS. The next step is to evaluate the effect on patient-reported outcomes in a large-scale trial. Funding: The Wellcome Trust.
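
    The gain from averaging flow rates follows from how independent sources of variation combine. Below is a minimal sketch of that calculation, with σ_b and σ_w standing for between-patient and within-patient (visit-to-visit) variance; these symbols are introduced here for illustration and are not taken from the thesis itself.

```latex
% Single clinic measurement of flow rate Q versus the mean of n home measurements.
% \sigma_b^2 = between-patient variance, \sigma_w^2 = within-patient variance.
\operatorname{Var}(Q_{\text{single}}) = \sigma_b^2 + \sigma_w^2,
\qquad
\operatorname{Var}(\bar{Q}_n) = \sigma_b^2 + \frac{\sigma_w^2}{n}
```

    Only the within-patient term shrinks with n, so averaging several home recordings narrows the spread due to day-to-day variability and improves discrimination between patients whose underlying flow rates genuinely differ.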

    Self-monitoring blood pressure in patients with hypertension: an internet-based survey of UK GPs.

    BACKGROUND: Previous research suggests that most GPs in the UK use self-monitoring of blood pressure (SMBP) to monitor the control of hypertension rather than for diagnosis. This study sought to assess current practice in the use of self-monitoring and any changes in practice following more recent guideline recommendations. AIM: To survey the views and practice of UK GPs in 2015 with regard to SMBP and compare them with a previous survey carried out in 2011. DESIGN AND SETTING: Web-based survey of a regionally representative sample of 300 UK GPs. METHOD: GPs completed an online questionnaire concerning the use of SMBP in the management of hypertension. Analyses comprised descriptive statistics, tests for between-group differences (z, Wilcoxon signed-rank, and χ2 tests), and multivariate logistic regression. RESULTS: Results were available for 300 GPs (94% of those who started the survey). GPs reported using self-monitoring to diagnose hypertension (169/291; 58%; 95% confidence interval (CI) = 52 to 64) and to monitor control (245/291; 84%; 95% CI = 80 to 88); the former had significantly increased since 2011 (from 37%; 95% CI = 33 to 41; P<0.001), with no change in monitoring for control. More than half of GPs used higher systolic thresholds for diagnosis (118/169; 70%; 95% CI = 63 to 77) and treatment (168/225; 75%; 95% CI = 69 to 80) than recommended in guidelines, and under half (120/289; 42%; 95% CI = 36 to 47) adjusted the SMBP results to guide treatment decisions. CONCLUSION: Since new UK national guidance in 2011, GPs have become more likely to use SMBP to diagnose hypertension. However, significant proportions of GPs continue to use non-standard diagnostic and monitoring thresholds. The use of out-of-office methods to improve the accuracy of diagnosis is unlikely to be beneficial if suboptimal thresholds are used. This study was funded by the British Hypertension Society and the NIHR. Ben Fletcher receives funding from a National Institute for Health Research (NIHR) School for Primary Care Research (SPCR) Doctoral Studentship. Richard McManus holds an NIHR Professorship (RP-02-12-015) and receives funding from the NIHR Oxford CLAHRC. This article presents independent research funded by the NIHR. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR, or the Department of Health. This is the author-accepted manuscript. The final version is available from the Royal College of General Practitioners via https://doi.org/10.3399/bjgp16X68703
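
    As a rough illustration of the kinds of estimates reported above, the sketch below recomputes a Wald 95% confidence interval for a proportion and a two-proportion z-test. The 2015 count (169/291) is taken from the abstract; the 2011 counts passed to the test are placeholders, since the earlier survey's raw numbers are not given here.

```python
# Hedged sketch only: standard normal-approximation formulas, not the paper's exact analysis.
import math
from scipy.stats import norm

def proportion_ci(k, n, alpha=0.05):
    """Wald (normal-approximation) confidence interval for a proportion k/n."""
    p = k / n
    z = norm.ppf(1 - alpha / 2)
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

def two_proportion_ztest(k1, n1, k2, n2):
    """z-test for equality of two independent proportions (pooled standard error)."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * norm.sf(abs(z))

p, lo, hi = proportion_ci(169, 291)                 # ~0.58 (0.52 to 0.64), matching the abstract
z, pval = two_proportion_ztest(169, 291, 110, 297)  # 2011 counts are assumed, for illustration only
```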

    Effective Post-Exposure Treatment of Ebola Infection

    Ebola viruses are highly lethal human pathogens that have received considerable attention in recent years due to an increasing re-emergence in Central Africa and a potential for use as a biological weapon. There is no vaccine or treatment licensed for human use. In the past, however, important advances have been made in developing preventive vaccines that are protective in animal models. In this regard, we showed that a single injection of a live-attenuated recombinant vesicular stomatitis virus vector expressing the Ebola virus glycoprotein completely protected rodents and nonhuman primates from lethal Ebola challenge. In contrast, progress in developing therapeutic interventions against Ebola virus infections has been much slower, and there is clearly an urgent need to develop effective post-exposure strategies to respond to future outbreaks and acts of bioterrorism, as well as to treat laboratory exposures. Here we tested the efficacy of the vesicular stomatitis virus-based Ebola vaccine vector as a post-exposure treatment in three relevant animal models. In the guinea pig and mouse models it was possible to protect 50% and 100% of the animals, respectively, following treatment as late as 24 h after lethal challenge. More importantly, four out of eight rhesus macaques were protected if treated 20 to 30 min following an otherwise uniformly lethal infection. Currently, this approach provides the most effective post-exposure treatment strategy for Ebola infections and is particularly suited for use in accidentally exposed individuals and in the control of secondary transmission during naturally occurring outbreaks or deliberate release.

    Psychosocial Workplace Factors and Healthcare Utilization: A Study of Two Employers

    Background: While a large literature links psychosocial workplace factors with health and health behaviors, there is very little work connecting psychosocial workplace factors to healthcare utilization. Methods: Survey data were collected from two different employers using computer-assisted telephone interviewing as part of the Work-Family Health Network (2008-2013): one in the information technology (IT) service industry and one responsible for a network of long-term care (LTC) facilities. Participants were surveyed four times at six-month intervals. Responses in each wave were used to predict utilization in the following wave. Four utilization measures served as outcomes: having at least one emergency room (ER)/urgent care visit, having at least one other healthcare visit, the number of ER/urgent care visits, and the number of other healthcare visits. Population-averaged models using all four waves controlled for health and other factors associated with utilization. Results: Having above-median job demands was positively related to the odds of at least one healthcare visit, odds ratio [OR] 1.37 (P<.01), and the number of healthcare visits, incidence rate ratio (IRR) 1.36 (P<.05), in the LTC sample. Work-to-family conflict was positively associated with the odds of at least one ER/urgent care visit in the LTC sample, OR 1.15 (P<.05), at least one healthcare visit in the IT sample, OR 1.35 (P<.01), and with more visits in the IT sample, IRR 1.35 (P<.01). Greater schedule control was associated with reductions in the number of ER/urgent care visits, IRR 0.71 (P<.05), in the IT sample. Conclusion: Controlling for other factors, some psychosocial workplace factors were associated with future healthcare utilization. Additional research is needed.
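
    "Population-averaged models" here most likely refers to generalized estimating equations (GEE); the sketch below shows how such models are commonly fitted, with hypothetical column names (participant_id, er_visit, n_visits, job_demands_high, wf_conflict, schedule_control, and covariates) rather than the study's actual variables.

```python
# Hedged sketch of population-averaged (GEE) models for a binary and a count outcome.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_utilization_models(df):
    """df: one row per participant per wave, with exposures lagged one wave
    so that responses in wave t predict utilization in wave t+1."""
    # Odds of at least one ER/urgent care visit (binary outcome -> odds ratios)
    logit = smf.gee(
        "er_visit ~ job_demands_high + wf_conflict + schedule_control + age + baseline_health",
        groups="participant_id", data=df,
        family=sm.families.Binomial(), cov_struct=sm.cov_struct.Exchangeable(),
    ).fit()

    # Number of healthcare visits (count outcome -> incidence rate ratios)
    counts = smf.gee(
        "n_visits ~ job_demands_high + wf_conflict + schedule_control + age + baseline_health",
        groups="participant_id", data=df,
        family=sm.families.Poisson(), cov_struct=sm.cov_struct.Exchangeable(),
    ).fit()

    # Exponentiated coefficients give ORs (binary model) and IRRs (count model)
    return np.exp(logit.params), np.exp(counts.params)
```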

    Detecting Compaction Disequilibrium with Anisotropy of Magnetic Susceptibility

    In clay-rich sediment, microstructures and macrostructures influence how sediments deform when under stress. When lithology is fairly constant, anisotropy of magnetic susceptibility (AMS) can be a simple technique for measuring the relative consolidation state of sediment, which reflects the sediment burial history. AMS can reveal areas of high water content and apparent overconsolidation associated with unconformities where sediment overburden has been removed. Many other methods for testing consolidation and water content are destructive and invasive, whereas AMS provides a nondestructive means to focus on areas for additional geotechnical study. In zones where the magnetic minerals are undergoing diagenesis, AMS should not be used for detecting compaction state. By utilizing AMS in the Santa Barbara Basin, we were able to identify one clear unconformity and eight zones of high water content in three cores. With the addition of the rock magnetic techniques of susceptibility, anhysteretic remanent magnetization, and isothermal remanent magnetization, we excluded 3 of the 11 zones from being compaction disequilibria; the AMS signals for these three zones are the result of diagenesis, coring deformation, and burrows. In addition, using AMS eigenvectors, we are able to accurately show the direction of maximum compression for the accumulation zone of the Gaviota Slide.
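
    The eigenvector step mentioned above is straightforward to reproduce in outline: AMS is a symmetric second-rank tensor whose eigenvalues are the principal susceptibilities and whose eigenvectors give their directions. The tensor values in the sketch below are invented for illustration and are not data from the Santa Barbara Basin cores.

```python
# Hedged sketch: principal susceptibility axes and common anisotropy parameters.
import numpy as np

# Symmetric susceptibility tensor (dimensionless SI volume susceptibility), made-up values
K = np.array([
    [1.02e-4, 0.01e-4, 0.00e-4],
    [0.01e-4, 1.00e-4, 0.00e-4],
    [0.00e-4, 0.00e-4, 0.95e-4],
])

vals, vecs = np.linalg.eigh(K)   # eigenvalues in ascending order: k3 <= k2 <= k1
k3, k2, k1 = vals
k3_axis = vecs[:, 0]             # direction of minimum susceptibility

# Standard anisotropy parameters: lineation L, foliation F, degree of anisotropy P
L, F, P = k1 / k2, k2 / k3, k1 / k3

# For a simple compaction fabric, k3 lies near the pole to bedding; systematic
# deflection of k3 (e.g., in a slide's accumulation zone) records the shortening,
# i.e., maximum compression, direction.
print(f"P = {P:.3f}, F = {F:.3f}, L = {L:.3f}, k3 axis = {k3_axis}")
```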

    Common Genetic Variants Found in HLA and KIR Immune Genes in Autism Spectrum Disorder

    The “common variant—common disease” hypothesis was proposed to explain diseases with strong inheritance. This model suggests that a genetic disease is the result of the combination of several common genetic variants. Common genetic variants are described as those showing a 5% frequency differential between diseased and matched control populations. This theory was recently supported by an epidemiology paper stating that about 50% of the genetic risk for autism resides in common variants. However, rare variants, rather than common variants, have been found in numerous genome-wide genetic studies, and many have concluded that the “common variant—common disease” hypothesis is incorrect. One interpretation is that rare variants are major contributors to genetic diseases and that autism involves the interaction of many rare variants, especially in the brain. Clearly, much remains to be learned about autism genetics. Evidence has been mounting over the years indicating immune involvement in autism, particularly the HLA genes on chromosome 6 and the KIR genes on chromosome 19. These two large multigene complexes have important immune functions and have been shown to interact to eliminate unwanted virally infected and malignant cells. HLA proteins have important functions in antigen presentation in adaptive immunity, and specific epitopes on HLA class I proteins act as cognate ligands for KIR receptors in innate immunity. Data suggest that HLA alleles and activating KIR genes/haplotypes are common variants in different autism populations. For example, class I allele (HLA-A2 and HLA-G 14 bp-indel) frequencies are significantly increased by more than 5% over control populations (Table 2). The HLA-DR4 class II and shared-epitope frequencies are significantly above those of control populations (Table 2). Three activating KIR genes (3DS1, 2DS1, and 2DS2) have increased frequencies of 15%, 22%, and 14% in autism populations, respectively. There is a 6% increase in total activating KIR genes in autism over control subjects. More importantly, there is a 12% increase in activating KIR genes together with their cognate HLA alleles over control populations (Torres et al., 2012a). These data suggest that HLA ligand/KIR receptor pairs encoded on two different chromosomes are more significant in autism as interacting ligand/receptor complexes than as separate genes.
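
    As an aside on how a "frequency differential" of the kind cited above is usually tested, the sketch below runs a 2x2 chi-square comparison of carriage frequencies in cases versus matched controls. All counts are hypothetical placeholders, not the values from Table 2 or Torres et al. (2012a).

```python
# Hedged sketch of a case-control frequency comparison; counts are invented.
from scipy.stats import chi2_contingency

def carriage_frequency_test(case_pos, case_n, ctrl_pos, ctrl_n):
    """2x2 chi-square test of allele/gene carriage in cases vs. matched controls."""
    table = [
        [case_pos, case_n - case_pos],
        [ctrl_pos, ctrl_n - ctrl_pos],
    ]
    chi2, p, dof, _ = chi2_contingency(table)
    diff = case_pos / case_n - ctrl_pos / ctrl_n  # the "frequency differential"
    return diff, chi2, p

# Example: a 7% differential (45% carriage in cases vs. 38% in controls)
diff, chi2, p = carriage_frequency_test(90, 200, 76, 200)
```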

    Barriers impacting the POINT pragmatic trial: the unavoidable overlap between research and intervention procedures in “real-world” research

    Background: This manuscript provides a research update on the ongoing pragmatic trial of Project POINT (Planned Outreach, Intervention, Naloxone, and Treatment), an emergency department-based peer recovery coaching intervention for linking patients with opioid use disorder to evidence-based treatment. The research team has encountered a number of challenges related to the "real-world" study setting since the trial began. Using an implementation science lens, we sought to identify and describe barriers impacting both the intervention and research protocols of the POINT study, which are often intertwined in pragmatic trials because of their focus on external validity. Method: Qualitative data were collected from three peer recovery coaches, two peer recovery coach supervisors, and three members of the research team. Questions and deductive qualitative analysis were guided by the Consolidated Framework for Implementation Research (CFIR). Results: Nine unique barriers were noted, with five of these barriers impacting intervention and research protocol implementation simultaneously. These simultaneous barriers were the timing of intervention delivery, ineffective communication with emergency department staff, lack of privacy in the emergency department, the fast-paced emergency department setting, and patients' limited resources. Together, these barriers represent the intervention characteristics, inner setting, and outer setting domains of the CFIR. Conclusion: Results highlight the utility of employing an implementation science framework to assess implementation issues in pragmatic trials, and how this approach might be used as a quality assurance mechanism given the considerable overlap between research and intervention protocols in real-world trial settings. Previously undocumented changes to the trial design made as a result of the identified barriers are discussed.