
    Organizational factors and depression management in community-based primary care settings

    Abstract Background Evidence-based quality improvement models for depression have not been fully implemented in routine primary care settings. To date, few studies have examined the organizational factors associated with depression management in real-world primary care practice. To successfully implement quality improvement models for depression, there must be a better understanding of the relevant organizational structure and processes of the primary care setting. The objective of this study is to describe these organizational features of routine primary care practice, and the organization of depression care, using survey questions derived from an evidence-based framework. Methods We used this framework to implement a survey of 27 practices comprised of 49 unique offices within a large primary care practice network in western Pennsylvania. Survey questions addressed practice structure (e.g., human resources, leadership, information technology (IT) infrastructure, and external incentives) and process features (e.g., staff performance, degree of integrated depression care, and IT performance). Results The results of our survey demonstrated substantial variation across the practice network of organizational factors pertinent to implementation of evidence-based depression management. Notably, quality improvement capability and IT infrastructure were widespread, but specific application to depression care differed between practices, as did coordination and communication tasks surrounding depression treatment. Conclusions The primary care practices in the network that we surveyed are at differing stages in their organization and implementation of evidence-based depression management. Practical surveys such as this may serve to better direct implementation of these quality improvement strategies for depression by improving understanding of the organizational barriers and facilitators that exist within both practices and practice networks. 
In addition, survey information can inform efforts of individual primary care practices in customizing intervention strategies to improve depression management.

    Simulation Modelling in Ophthalmology : Application to Cost Effectiveness of Ranibizumab and Aflibercept for the Treatment of Wet Age-Related Macular Degeneration in the United Kingdom

    Previously developed models in ophthalmology have generally used a Markovian structure. This approach has a number of limitations, most notably the inability to base patient outcomes on best-corrected visual acuity (BCVA) in both eyes, which may be overcome using a different modelling structure. Simulation modelling allows these outcomes to be modelled more precisely, and may therefore provide more accurate and relevant estimates of the cost effectiveness of ophthalmology interventions.

    Citizen Science Reveals Unexpected Continental-Scale Evolutionary Change in a Model Organism

    Organisms provide some of the most sensitive indicators of climate change, and evolutionary responses are becoming apparent in species with short generation times. Large datasets on genetic polymorphism that can provide an historical benchmark against which to test for recent evolutionary responses are very rare, but an exception is found in the brown-lipped banded snail (Cepaea nemoralis). This species is sensitive to its thermal environment and exhibits several polymorphisms of shell colour and banding pattern affecting shell albedo in the majority of populations within its native range in Europe. We tested for evolutionary changes in shell albedo that might have been driven by the warming of the climate in Europe over the last half century by compiling an historical dataset for 6,515 native populations of C. nemoralis and comparing this with new data on nearly 3,000 populations. The new data were sampled mainly in 2009 through the Evolution MegaLab, a citizen science project that engaged thousands of volunteers in 15 countries throughout Europe in the biggest such exercise ever undertaken. A known geographic cline in the frequency of the colour phenotype with the highest albedo (yellow) was shown to have persisted, and a difference in colour frequency between woodland and more open habitats was confirmed, but there was no general increase in the frequency of yellow shells. This may have been because snails adapted to a warming climate through behavioural thermoregulation. By contrast, we detected an unexpected decrease in the frequency of Unbanded shells and an increase in the Mid-banded morph. Neither of these evolutionary changes appears to be a direct response to climate change, indicating the influence of other selective agents, possibly related to changing predation pressure and habitat change, with effects on micro-climate.

    The appropriateness of prescribing antibiotics in the community in Europe: study design

    ABSTRACT: BACKGROUND: Over 90% of all antibiotics in Europe are prescribed in primary care. It is important that antibiotics are prescribed that are likely to be effective; however, information about antibiotic resistance in the community is incomplete. The aim of our study is to investigate the appropriateness of antibiotic prescribing in primary care in Europe by collecting and combining antibiotic resistance patterns and antibiotic prescription patterns in primary care. We will also evaluate the appropriateness of national antibiotic prescription guidelines in relation to resistance patterns. METHODS/DESIGN: Antibiotic resistance will be studied in an opportunistic sample from the community in nine European countries. Resistance data will be collected by taking a nose swab of persons (N = 4,000 per country) visiting a primary care practice for a non-infectious disease. Staphylococcus aureus and Streptococcus pneumoniae will be isolated and tested for resistance to a range of antibiotics in one central laboratory. Data on antibiotic prescriptions over the past 5 years will be extracted from the electronic medical records of General Practitioners (GPs). The results of the study will include the prevalence and resistance data of the two species and 5 years of antibiotic prescription data in nine European countries. The odds of receiving an effective antibiotic in each country will be calculated as a measure of the appropriateness of prescribing. Multilevel analysis will be used to assess the appropriateness of prescribing. Relevant treatment guidelines of the nine participating countries will be evaluated using a standardized instrument and related to the resistance patterns in that country. DISCUSSION: This study will provide valuable and unique data concerning resistance patterns and prescription behaviour in primary care in nine European countries. 
It will provide evidence-based recommendations for antibiotic treatment guidelines that take resistance patterns into account, which will be useful for both clinicians and policy makers. By improving antibiotic use, we can move towards controlling the resistance problem globally.
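The headline measure described above — the odds of receiving an effective antibiotic — can be sketched as a weighted combination of prescription shares and resistance rates. All numbers below are invented for illustration; the study's actual analysis uses multilevel modelling on patient-level data.

```python
# Hypothetical prescription shares and community resistance rates for a
# single country; none of these values come from the study.
prescription_share = {"amoxicillin": 0.50, "macrolide": 0.30, "tetracycline": 0.20}
resistance_rate = {"amoxicillin": 0.10, "macrolide": 0.25, "tetracycline": 0.15}

# Probability that a randomly chosen prescription is effective: the share of
# each antibiotic weighted by the susceptible (non-resistant) fraction.
p_effective = sum(share * (1.0 - resistance_rate[ab])
                  for ab, share in prescription_share.items())

# Odds of receiving an effective antibiotic.
odds_effective = p_effective / (1.0 - p_effective)

print(f"P(effective) = {p_effective:.3f}, odds = {odds_effective:.2f}")
# → P(effective) = 0.845, odds = 5.45
```

A country whose prescribing is skewed towards antibiotics with high local resistance would show lower odds under this measure, which is what makes it usable as a cross-country appropriateness comparison.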

    A novel framework for validating and applying standardized small area measurement strategies

    Abstract Background: Local measurements of health behaviors, diseases, and use of health services are critical inputs into local, state, and national decision-making. Small area measurement methods can deliver more precise and accurate local-level information than direct estimates from surveys or administrative records, where sample sizes are often too small to yield acceptable standard errors. However, small area measurement requires careful validation using approaches other than conventional statistical methods such as in-sample or cross-validation methods, because these do not solve the problem of validating estimates in data-sparse domains. Methods: A new general framework for small area estimation and validation is developed and applied to estimate Type 2 diabetes prevalence in US counties using data from the Behavioral Risk Factor Surveillance System (BRFSS). The framework combines the three conventional approaches to small area measurement: (1) pooling data across time by combining multiple survey years; (2) exploiting spatial correlation by including a spatial component; and (3) utilizing structured relationships between the outcome variable and domain-specific covariates to define four increasingly complex model types, coined the Naive, Geospatial, Covariate, and Full models. The validation framework uses direct estimates of prevalence in large domains as the gold standard and compares model estimates against it using (i) all available observations for the large domains and (ii) systematically reduced sample sizes obtained through random sampling with replacement. At each sampling level, the model is rerun repeatedly, and the validity of the model estimates from the four model types is then determined by calculating the (average) concordance correlation coefficient (CCC) and (average) root mean squared error (RMSE) against the gold standard. The CCC is closely related to the intraclass correlation coefficient and can be used when the units are organized in groups and when it is of interest to measure the agreement between units in the same group (e.g., counties). The RMSE is often used to measure the differences between values predicted by a model or an estimator and the actually observed values; it is a useful measure of the precision of the model or estimator. Results: All model types have substantially higher CCC and lower RMSE than the direct, single-year BRFSS estimates. In addition, the inclusion of relevant domain-specific covariates generally improves predictive validity, especially at small sample sizes, and their leverage can be equivalent to a five- to tenfold increase in sample size. Conclusions: Small area estimation of important health outcomes and risk factors can be improved using a systematic modeling and validation framework, which consistently outperformed single-year direct survey estimates and demonstrated the potential leverage of including relevant domain-specific covariates compared to pure measurement models. The proposed validation strategy can be applied to other disease outcomes and risk factors in the US as well as to resource-scarce situations, including low-income countries. These estimates are needed by public health officials to identify at-risk groups, to design targeted prevention and intervention programs, and to monitor and evaluate results over time.
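The two validation metrics named in the abstract have simple closed forms. A minimal sketch of both, using Lin's standard formula for the CCC with population moments; the county-level prevalence values in the usage example are hypothetical, not BRFSS data:

```python
import math

def ccc(x, y):
    # Lin's concordance correlation coefficient:
    # 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) / n
    sy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2.0 * sxy / (sx + sy + (mx - my) ** 2)

def rmse(estimates, gold):
    # Root mean squared error of model estimates against the gold standard.
    return math.sqrt(sum((e - g) ** 2 for e, g in zip(estimates, gold)) / len(gold))

# Hypothetical large-domain diabetes prevalences: direct "gold standard"
# estimates versus model estimates for four domains.
gold = [0.08, 0.10, 0.12, 0.09]
model = [0.085, 0.095, 0.115, 0.095]
```

Unlike the Pearson correlation, the CCC penalizes both location and scale shifts away from the 45-degree line, which is why it suits agreement-with-gold-standard validation rather than mere association.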

    Human blood autoantibodies in the detection of colorectal cancer

    Colorectal cancer (CRC) is the second most common malignancy in the western world. Early detection and diagnosis of all cancer types is vital to improved prognosis by enabling early treatment when tumours should be both resectable and curable. Sera from 3 different cohorts — 42 sera (21 CRC and 21 matched controls) from New York, USA, 200 sera from Pittsburgh, USA (100 CRC and 100 controls) and 20 sera from Dundee, UK (10 CRC and 10 controls) — were tested against a panel of multiple tumour-associated antigens (TAAs) using an optimised multiplex microarray system. TAA-specific IgG responses were interpolated against the internal IgG standard curve for each sample. Individual TAA-specific responses were examined in each cohort to determine cutoffs for a robust initial scoring method to establish sensitivity and specificity. Sensitivity and specificity of combinations of TAAs provided good discrimination between cancer-positive and normal serum. The overall sensitivity and specificity of the sample sets tested against a panel of 32 TAAs were 61.1% and 80.9% respectively for 6 antigens: p53, AFP, KRAS, Annexin, RAF1 and NY-CO16. Furthermore, the observed sensitivity in the Pittsburgh sample set in different clinical stages of CRC — stage I (n = 19), stage II (n = 40), stage III (n = 34) and stage IV (n = 6) — was similar (73.6%, 75.0%, 73.5% and 83.3%, respectively), with similar levels of sensitivity for right- and left-sided CRC. We identified an antigen panel of sufficient sensitivity and specificity for early detection of CRC, based upon serum profiling of autoantibody response using a robust multiplex antigen microarray technology. This opens the possibility of a blood test for screening and detection of early colorectal cancer. However, this panel will require further validation studies before it can be proposed for clinical practice.
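The cutoff-based scoring described above can be sketched as a simple rule: a serum is called positive when any antigen-specific response exceeds its cutoff, and sensitivity/specificity follow from counting calls in cases and controls. Antigen names follow the abstract, but every response value and cutoff below is invented for illustration:

```python
# Hypothetical per-antigen cutoffs (arbitrary signal units).
cutoffs = {"p53": 1.2, "AFP": 0.8, "KRAS": 1.0}

def panel_positive(responses):
    # A serum scores positive if any antigen response exceeds its cutoff.
    return any(responses[antigen] > cut for antigen, cut in cutoffs.items())

def sensitivity_specificity(case_sera, control_sera):
    true_pos = sum(panel_positive(s) for s in case_sera)
    true_neg = sum(not panel_positive(s) for s in control_sera)
    return true_pos / len(case_sera), true_neg / len(control_sera)

cases = [{"p53": 1.5, "AFP": 0.5, "KRAS": 0.9},   # positive via p53
         {"p53": 1.0, "AFP": 0.7, "KRAS": 0.8}]   # below every cutoff
controls = [{"p53": 0.4, "AFP": 0.3, "KRAS": 0.5}]

sens, spec = sensitivity_specificity(cases, controls)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
# → sensitivity = 0.50, specificity = 1.00
```

Raising any cutoff trades sensitivity for specificity, which is the tuning the study performs per cohort before combining antigens into the final panel.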

    Maintenance treatment of adolescent bipolar disorder: open study of the effectiveness and tolerability of quetiapine

    Abstract Background: The purpose of the study was to determine the effectiveness and tolerability of quetiapine as a maintenance treatment preventing against relapse or recurrence of acute mood episodes in adolescent patients diagnosed with bipolar disorder. Methods: Consenting patients meeting DSM-IV lifetime criteria for a bipolar disorder and clinically appropriate for maintenance treatment were enrolled in a 48-week open prospective study. After being acutely stabilized (CGI-S ≤ 3 for 4 consecutive weeks), patients were started or continued on quetiapine, and other medications were weaned off over an 8-week period. Quetiapine monotherapy was continued for 40 weeks, and other mood stabilizers or antidepressants were added if clinically indicated. A neurocognitive test battery assessing the most reliable findings in adult patients was administered at fixed time points throughout the study to patients and matched controls. Results: Of the 21 enrolled patients, 18 completed the 48-week study. Thirteen patients were able to be maintained without relapse or recurrence in good quality remission on quetiapine monotherapy, while 5 patients required additional medication to treat impairing residual depressive and/or anxiety symptoms. According to symptom ratings and global functioning scores, the quality of remission for all patients was very good. Neurocognitive test performance over treatment was equivalent to that of a matched control group of never-ill adolescents. Quetiapine was generally well tolerated with no serious adverse effects. Conclusion: This study suggests that a proportion of adolescent patients diagnosed with bipolar disorder can be successfully maintained on quetiapine monotherapy. The good quality of clinical remission and preserved neurocognitive functioning underscores the importance of early diagnosis and effective stabilization. Clinical Trials Registry: D1441L00024

    Regulation of immunity during visceral Leishmania infection

    Unicellular eukaryotes of the genus Leishmania are collectively responsible for a heterogeneous group of diseases known as leishmaniasis. The visceral form of leishmaniasis, caused by L. donovani or L. infantum, is a devastating condition, claiming 20,000 to 40,000 lives annually, with particular incidence in some of the poorest regions of the world. Immunity to Leishmania depends on the development of protective type I immune responses capable of activating infected phagocytes to kill intracellular amastigotes. However, despite the induction of protective responses, disease progresses due to a multitude of factors that impede an optimal response. These include the action of suppressive cytokines, exhaustion of specific T cells, loss of lymphoid tissue architecture and a defective humoral response. We will review how these responses are orchestrated during the course of infection, including both early and chronic stages, focusing on the spleen and the liver, which are the main target organs of visceral Leishmania in the host. A comprehensive understanding of the immune events that occur during visceral Leishmania infection is crucial for the implementation of immunotherapeutic approaches that complement the current anti-Leishmania chemotherapy and the development of effective vaccines to prevent disease. The research leading to these results has received funding from the European Community's Seventh Framework Programme under grant agreement No. 602773 (Project KINDRED). VR is supported by a post-doctoral fellowship granted by the KINDReD consortium. RS thanks the Foundation for Science and Technology (FCT) for an Investigator Grant (IF/00021/2014). This work was supported by grants to JE from ANR (LEISH-APO, France), Partenariat Hubert Curien (PHC) (program Volubilis, MA/11/262). JE acknowledges the support of the Canada Research Chair Program.

    Influenza Infectious Dose May Explain the High Mortality of the Second and Third Wave of 1918–1919 Influenza Pandemic

    BACKGROUND: It is widely accepted that the shift in case-fatality rate between waves during the 1918 influenza pandemic was due to a genetic change in the virus. In animal models, the infectious dose of influenza A virus was associated with the severity of disease, which led us to propose a new hypothesis. We propose that the increase in the case-fatality rate can be explained by the dynamics of disease and by a dose-dependent response mediated by the number of simultaneous contacts a susceptible person has with infectious ones. METHODS: We used a compartment model with seasonality, waning of immunity and a Holling type II function to model simultaneous contacts between a susceptible person and infectious ones. In the model, whether infected persons have mild or severe illness depends both on the proportion of infectious persons in the population and on the level of simultaneous contacts between a susceptible person and infectious ones. We further allowed for a high or low rate of waning immunity and voluntary isolation at different times of the epidemic. RESULTS: In all scenarios, the case-fatality rate was low during the first wave (Spring) due to a decrease in the effective reproduction number. The case-fatality rate in the second wave (Autumn) depended on the ratio of the number of severe cases to the number of mild cases, since for each 1,000 mild infections only 4 deaths occurred, whereas for 1,000 severe infections there were 20 deaths. A third wave (late Winter) was dependent on the rate of waning immunity or on the introduction of new susceptible persons into the community. If a group of persons became voluntarily isolated and returned to the community some days later, new waves occurred. For a fixed number of infected persons, the overall case-fatality rate decreased as the number of waves increased. 
This is explained by the lower proportion of infectious individuals in each wave, which prevented an increase in the number of severe infections and thus in the case-fatality rate. CONCLUSION: The increase in the proportion of infectious persons — as a proxy for the increase in the infectious dose to which a susceptible person is exposed as the epidemic develops — can explain the shift in case-fatality rate between waves during the 1918 influenza pandemic. TD acknowledges the support of the Faculdade de Ciencias e Tecnologia through grant PPCDT/AMB/55701/2004. The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.
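The core mechanism — a saturating, prevalence-dependent force of infection with a dose-dependent severity split — can be illustrated with a minimal SIRS sketch. This is not the authors' model (which also includes seasonality and isolation scenarios); every parameter value below is illustrative.

```python
# Minimal SIRS sketch with a Holling type II saturation term standing in for
# a limit on simultaneous contacts, and a severe fraction of new infections
# that rises with the prevalence of infectious persons.

def simulate(beta=0.5, gamma=0.2, waning=0.01, handling=5.0,
             i0=1e-4, days=300, dt=0.1):
    s, i, r = 1.0 - i0, i0, 0.0       # susceptible, infectious, recovered
    cumulative_severe = 0.0
    for _ in range(int(days / dt)):
        saturation = i / (1.0 + handling * i)   # Holling type II response
        new_inf = beta * s * saturation * dt    # new infections this step
        # Dose-dependent severity: the severe share of new infections grows
        # with the same saturating function of prevalence.
        cumulative_severe += new_inf * saturation
        d_s = -new_inf + waning * r * dt        # waning immunity refills S
        d_i = new_inf - gamma * i * dt
        d_r = (gamma * i - waning * r) * dt
        s, i, r = s + d_s, i + d_i, r + d_r
    return s, i, r, cumulative_severe
```

Because the severe share tracks prevalence, a tall single wave accumulates more severe infections than the same number of cases spread over several smaller waves — the qualitative result the abstract reports.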

    Replacement of α-galactosidase A in Fabry disease: effect on fibroblast cultures compared with biopsied tissues of treated patients

    The function and intracellular delivery of enzyme therapeutics for Fabry disease were studied in cultured fibroblasts and in the biopsied tissues of two male patients to show the diversity of affected cells in response to treatment. In the mutant fibroblast cultures, the final cellular level of endocytosed recombinant α-galactosidases A (agalsidases, Fabrazyme™ and Replagal™) exceeded, by several fold, the amount in control fibroblasts and led to efficient direct intra-lysosomal hydrolysis of [3H]Gb3Cer. In contrast, in the samples from the heart and some other tissues biopsied after several months of enzyme replacement therapy (ERT) with Fabrazyme™, only the endothelial cells were free of storage. Persistent Gb3Cer storage was found in cardiocytes (accompanied by an increase of lipopigment), smooth muscle cells, fibroblasts, sweat glands, and skeletal muscle. Immunohistochemistry of cardiocytes demonstrated, for the first time, the presence of a considerable amount of the active enzyme in intimate contact with the storage compartment. Factors responsible for the limited ERT effectiveness are discussed, namely the post-mitotic status of storage cells preventing their replacement by enzyme-supplied precursors, modification of the lysosomal system by longstanding storage, and a possible relative lack of Sap B. These observations support the strategy of early treatment for prevention of lysosomal storage.