
    PCV17: QUALITY OF LIFE AND PATIENT PREFERENCE AS PREDICTORS FOR RESOURCE UTILIZATION AMONG PATIENTS WITH HEART FAILURE; INTERIM ANALYSIS


    The relationship between area poverty rate and site-specific cancer incidence in the United States

    BACKGROUND The relationship between socioeconomic status and cancer incidence in the United States has not traditionally been a focus of population-based cancer surveillance systems. METHODS Nearly 3 million tumors diagnosed between 2005 and 2009 from 16 states plus Los Angeles were assigned to 1 of 4 groupings based on the poverty rate of the residential census tract at time of diagnosis. The sex-specific risk ratio of the highest-to-lowest poverty category was measured using Poisson regression, adjusting for age and race, for 39 cancer sites. RESULTS For all sites combined, there was a negligible association between cancer incidence and poverty; however, 32 of 39 cancer sites showed a significant association with poverty (14 positively associated and 18 negatively associated). Nineteen of these sites had monotonic increases or decreases in risk across all 4 poverty categories. The sites most strongly associated with higher poverty were Kaposi sarcoma, larynx, cervix, penis, and liver; those most strongly associated with lower poverty were melanoma, thyroid, other nonepithelial skin, and testis. Sites associated with higher poverty had lower incidence and higher mortality than those associated with lower poverty. CONCLUSIONS These findings demonstrate the importance and relevance of including a measure of socioeconomic status in national cancer surveillance.
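    As a minimal sketch of the abstract's effect measure, the snippet below computes a highest-to-lowest poverty rate ratio and a Wald 95% confidence interval from a single pair of hypothetical counts. The case counts and person-years are invented for illustration; the study itself estimated risk ratios via Poisson regression adjusted for age and race.

    ```python
    import math

    # Hypothetical incident cases and person-years for one cancer site.
    high_poverty = {"cases": 120, "person_years": 500_000}
    low_poverty = {"cases": 60, "person_years": 600_000}

    rate_high = high_poverty["cases"] / high_poverty["person_years"]
    rate_low = low_poverty["cases"] / low_poverty["person_years"]
    risk_ratio = rate_high / rate_low

    # Wald 95% CI on the log scale, assuming Poisson-distributed counts.
    se_log_rr = math.sqrt(1 / high_poverty["cases"] + 1 / low_poverty["cases"])
    ci = (math.exp(math.log(risk_ratio) - 1.96 * se_log_rr),
          math.exp(math.log(risk_ratio) + 1.96 * se_log_rr))

    print(round(risk_ratio, 2))  # 2.4
    ```

    A ratio above 1 with a CI excluding 1 would correspond to a site "positively associated" with poverty in the abstract's terms.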

    Long-term efficacy and safety of dichlorphenamide for treatment of primary periodic paralysis

    Introduction/Aim: Long-term efficacy and safety of dichlorphenamide (DCP) were characterized in patients with primary periodic paralysis (PPP). Methods: Patients with PPP in a double-blind, placebo-controlled study were randomly assigned to receive DCP 50 mg twice daily or placebo for 9 weeks, followed by a 52-week open-label DCP treatment phase (DCP/DCP and placebo/DCP populations). Efficacy (attack rate, severity-weighted attack rate) and safety were assessed in patients completing the study (61 weeks). In this post hoc analysis, efficacy and safety data were pooled from hyperkalemic and hypokalemic substudies. Results: Sixty-three adults (age, 19-76 years) completed the double-blind phase; 47 (74.6%) of these patients completed 61 weeks. There were median decreases in weekly attack and severity-weighted attack rates from baseline to week 61 (DCP/DCP [n = 25], −1.00 [P < .0001]; placebo/DCP [n = 20], −0.63 [P = .01] and DCP/DCP, −2.25 [P < .0001]; placebo/DCP, −1.69 [P = .01]). Relatively smaller median decreases in weekly attack and severity-weighted attack rates occurred from weeks 9 to 61 among patients receiving DCP continuously (n = 26; −0.14 [P = .1] and −0.24 [P = .09]) than among those switching from placebo to DCP after 9 weeks (n = 16; −1.04 [P = .049] and −2.72 [P = .08]). Common adverse events (AEs) were paresthesia and cognition-related events, which typically first occurred within 1 month of blinded treatment initiation and in rare cases led to treatment discontinuation. Dose reductions were frequently associated with resolution of these common AEs. Discussion: One-year open-label DCP treatment after a 9-week randomized, controlled study confirmed that DCP remains safe and effective for chronic use. Tolerability issues (paresthesia, cognition-related AEs) were manageable in most patients.

    Grifonin-1: A Small HIV-1 Entry Inhibitor Derived from the Algal Lectin, Griffithsin

    Background: Griffithsin, a 121-residue protein isolated from the red alga Griffithsia sp., binds high mannose N-linked glycans of virus surface glycoproteins with extremely high affinity, a property that allows it to prevent the entry of primary isolates and laboratory strains of T- and M-tropic HIV-1. We used a portion of griffithsin's sequence as a design template to create smaller peptides with antiviral and carbohydrate-binding properties. Methodology/Results: The new peptides were derived from a trio of homologous β-sheet repeats that comprise the motifs responsible for griffithsin's biological activity. Our most active antiviral peptide, grifonin-1 (GRFN-1), had an EC50 of 190.8±11.0 nM in in vitro TZM-bl assays and an EC50 of 546.6±66.1 nM in p24gag antigen release assays. GRFN-1 showed considerable structural plasticity, assuming different conformations in solvents that differed in polarity and hydrophobicity. Higher concentrations of GRFN-1 formed oligomers, based on intermolecular β-sheet interactions. Like its parent protein, GRFN-1 bound viral glycoproteins gp41 and gp120 via the N-linked glycans on their surface. Conclusion: Its substantial antiviral activity and low toxicity in vitro suggest that GRFN-1 and/or its derivatives may have therapeutic potential as topical and/or systemic agents directed against HIV-1.

    Pitfalls of using the risk ratio in meta‐analysis

    For meta-analysis of studies that report outcomes as binomial proportions, the most popular measure of effect is the odds ratio (OR), usually analyzed as log(OR). Many meta-analyses use the risk ratio (RR) and its logarithm instead, because of its simpler interpretation. Although log(OR) and log(RR) are both unbounded, use of log(RR) must ensure that estimates are compatible with study-level event rates in the interval (0, 1). These complications pose a particular challenge for random-effects models, both in applications and in generating data for simulations. As background we review the conventional random-effects model and then binomial generalized linear mixed models (GLMMs) with the logit link function, which do not have these complications. We then focus on log-binomial models and explore implications of using them; theoretical calculations and simulation show evidence of biases. The main competitors to the binomial GLMMs use the beta-binomial (BB) distribution, either in BB regression or by maximizing a BB likelihood; a simulation produces mixed results. Two examples and an examination of Cochrane meta-analyses that used RR suggest bias in the results from the conventional inverse-variance-weighted approach. Finally, we comment on other measures of effect that have range restrictions, including risk difference, and outline further research.
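    The contrast between the two effect measures, and the range restriction on the RR that the abstract warns about, can be illustrated on a single hypothetical 2x2 table (event counts are invented for illustration; the paper's analysis concerns pooling such estimates across studies):

    ```python
    import math

    # One hypothetical study: events out of n per arm.
    events_treat, n_treat = 30, 100
    events_ctrl, n_ctrl = 15, 100

    risk_t = events_treat / n_treat   # 0.30
    risk_c = events_ctrl / n_ctrl     # 0.15

    log_rr = math.log(risk_t / risk_c)            # log risk ratio
    odds_t = risk_t / (1 - risk_t)
    odds_c = risk_c / (1 - risk_c)
    log_or = math.log(odds_t / odds_c)            # log odds ratio

    # Compatibility constraint: with risk_c = 0.15, any RR above
    # 1/0.15 ~ 6.67 would imply an impossible risk_t > 1. The OR
    # has no such bound, since odds are unbounded above.
    max_rr = 1 / risk_c

    print(round(math.exp(log_rr), 2),   # 2.0
          round(math.exp(log_or), 2),   # 2.43
          round(max_rr, 2))             # 6.67
    ```

    This per-study bound on the RR, which depends on the control-arm risk, is what complicates random-effects modelling on the log(RR) scale.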

    Approximating k-Forest with Resource Augmentation: A Primal-Dual Approach

    In this paper, we study the k-forest problem in the model of resource augmentation. In the k-forest problem, given an edge-weighted graph G(V, E), a parameter k, and a set of m demand pairs ⊆ V × V, the objective is to construct a minimum-cost subgraph that connects at least k demands. The problem is hard to approximate: the best-known approximation ratio is O(min{√n, √k}). Furthermore, k-forest is as hard to approximate as the notoriously hard densest k-subgraph problem. While the k-forest problem is hard to approximate in the worst case, we show that with the use of resource augmentation, we can efficiently approximate it up to a constant factor. First, we restate the problem in terms of the number of demands that are not connected. In particular, the objective of the k-forest problem can be viewed as removing at most m − k demands and finding a minimum-cost subgraph that connects the remaining demands. We use this perspective of the problem to explain the performance of our algorithm (in terms of the augmentation) in a more intuitive way. Specifically, we present a polynomial-time algorithm for the k-forest problem that, for every ε > 0, removes at most m − k demands and has cost no more than O(1/ε²) times the cost of an optimal algorithm that removes at most (1 − ε)(m − k) demands.
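    The objective being approximated can be made concrete by exhaustive search on a toy instance. The graph, demands, and k below are invented for illustration, and the exponential enumeration is only a specification of the objective; the paper's contribution is a polynomial-time primal-dual algorithm with resource augmentation, not this brute force.

    ```python
    from itertools import combinations

    # Edge-weighted graph as (u, v, cost); demand pairs drawn from V x V.
    edges = [("a", "b", 1), ("b", "c", 1), ("c", "d", 4), ("a", "d", 2)]
    demands = [("a", "c"), ("b", "d"), ("a", "d")]
    k = 2  # connect at least k of the m = 3 demands

    def satisfied(pairs, chosen):
        # Union-find over the chosen edges, then count connected demand pairs.
        parent = {}
        def find(x):
            parent.setdefault(x, x)
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x
        for u, v, _ in chosen:
            parent[find(u)] = find(v)
        return sum(find(u) == find(v) for u, v in pairs)

    # Minimum cost over all edge subsets connecting at least k demands.
    best = None
    for r in range(len(edges) + 1):
        for subset in combinations(edges, r):
            if satisfied(demands, subset) >= k:
                cost = sum(c for _, _, c in subset)
                if best is None or cost < best:
                    best = cost

    print(best)  # 3: edges a-b and a-d connect demands (b, d) and (a, d)
    ```

    Equivalently, in the paper's restated form, the chosen subgraph here removes m − k = 1 demand, namely (a, c), and pays the minimum cost for the rest.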

    Truncated and Helix-Constrained Peptides with High Affinity and Specificity for the cFos Coiled-Coil of AP-1

    Protein-based therapeutics feature large interacting surfaces. Protein folding endows structural stability to localised surface epitopes, imparting high affinity and target specificity upon interactions with binding partners. However, short synthetic peptides with sequences corresponding to such protein epitopes are unstructured in water and promiscuously bind to proteins with low affinity and specificity. Here we combine the structural stability and target specificity of proteins with the low cost and rapid synthesis of small molecules, towards meeting the significant challenge of binding coiled-coil proteins in transcriptional regulation. By iteratively truncating a Jun-based peptide from 37 to 22 residues, strategically incorporating i→i+4 helix-inducing constraints, and positioning unnatural amino acids, we have produced short, water-stable, α-helical peptides that bind cFos. A three-dimensional NMR-derived structure for one peptide (24) confirmed a highly stable α-helix that was resistant to proteolytic degradation in serum. These short structured peptides are entropically pre-organized for binding with high affinity and specificity to cFos, a key component of the oncogenic transcriptional regulator Activator Protein-1 (AP-1). They competitively antagonized the cJun–cFos coiled-coil interaction. Truncating a Jun-based peptide from 37 to 22 residues decreased the binding enthalpy for cJun by ~9 kcal/mol, but this was compensated by increased conformational entropy (TΔS ≤ 7.5 kcal/mol). This study demonstrates that rational design of short peptides constrained by α-helical cyclic pentapeptide modules is able to retain the parent protein's high helicity, as well as high affinity and specificity for cFos. These are important steps towards small antagonists of the cJun–cFos interaction that mediates gene transcription in cancer and inflammatory diseases.

    Undertaking rapid evaluations during the COVID-19 pandemic: Lessons from evaluating COVID-19 remote home monitoring services in England

    Introduction: Previous research suggests that rapid evaluations can offer evidence on innovations in health and social care that can be used to inform fast-moving policy and practice, and to support their scale-up. However, there are few comprehensive accounts of how to plan and conduct large-scale rapid evaluations, ensure scientific rigour, and achieve stakeholder engagement within compressed timeframes. / Methods: Using a case study of a national mixed-methods rapid evaluation of COVID-19 remote home monitoring services in England, conducted during the COVID-19 pandemic, this manuscript examines the process of conducting a large-scale rapid evaluation from design to dissemination and impact, and reflects on the key lessons for conducting future large-scale rapid evaluations. In this manuscript, we describe each stage of the rapid evaluation: convening the team (study team and external collaborators), design and planning (scoping, designing protocols, study set up), data collection and analysis, and dissemination. / Results: We reflect on why certain decisions were made and highlight facilitators and challenges. The manuscript concludes with 12 key lessons for conducting large-scale mixed-methods rapid evaluations of healthcare services.
We propose that rapid study teams need to: (1) find ways of quickly building trust with external stakeholders, including evidence-users; (2) consider the needs of the rapid evaluation and the resources required; (3) use scoping to ensure the study is highly focused; (4) carefully consider what cannot be completed within a designated timeframe; (5) use structured processes to ensure consistency and rigour; (6) be flexible and responsive to changing needs and circumstances; (7) consider the risks associated with new approaches to collecting quantitative data (and their usability); (8) consider whether it is possible to use aggregated quantitative data, and what that would mean when presenting results; (9) consider using structured processes and layered analysis approaches to rapidly synthesise qualitative findings; (10) consider the balance between speed and the size and skills of the team; (11) ensure all team members know their roles and responsibilities and can communicate quickly and clearly; and (12) consider how best to share findings, in discussion with evidence-users, for rapid understanding and use. / Conclusion: These 12 lessons can be used to inform the development and conduct of future rapid evaluations in a range of contexts and settings.

    Elective Modernism and the Politics of (Bio) Ethical Expertise

    In this essay I consider whether the political perspective of third wave science studies, 'elective modernism', offers a suitable framework for understanding the policy-making contributions that (bio)ethical experts might make. The question arises as a consequence of the fact that I have taken inspiration from the third wave in order to develop an account of (bio)ethical expertise. I offer a précis of this work and a brief summary of elective modernism before considering their relation. The view I set out suggests that elective modernism is a political philosophy and that, although its account of the use of scientific expertise in political and policy-making processes has implications for the role of (bio)ethical expertise, it does not, in the final analysis, provide an account that is appropriate for this latter form of specialist expertise. Nevertheless, it is an informative perspective, and one that can help us make sense of the political uses of (bio)ethical expertise.