    Treatment as required versus regular monthly treatment in the management of neovascular age-related macular degeneration: a systematic review and meta-analysis

    Background: To investigate whether treatment as required ‘pro re nata’ (PRN) versus regular monthly treatment regimens lead to differences in outcomes in neovascular age-related macular degeneration (nAMD). Regular monthly administration of vascular endothelial growth factor (VEGF) inhibitors is an established gold-standard treatment, but this approach is costly. Replacement of monthly by PRN treatment can only be justified if there is no difference in patient-relevant outcomes. Methods: Systematic review and meta-analysis. The intervention was PRN treatment and the comparator was monthly treatment with VEGF inhibitors. Four bibliographic databases were searched for randomised controlled trials comparing both treatment regimens directly (head-to-head studies). The last literature search was conducted in December 2014. Risk of bias was assessed according to the Cochrane Handbook for Systematic Reviews of Interventions. Findings: We included 3 head-to-head studies (6 reports) involving more than 2000 patients. After 2 years, the weighted mean difference in best corrected visual acuity (BCVA) was 1.9 (95% CI 0.5 to 3.3) ETDRS letters in favour of monthly treatment. Rates of systemic adverse events were higher in PRN-treated patients, but these differences were not statistically significant. After 2 years, the total number of intravitreal injections required by patients in the PRN arms was 8.4 (95% CI 7.9 to 8.9) fewer than in those receiving monthly treatment. The studies were considered to have a moderate risk of bias. Conclusions: PRN treatment resulted in a minor but statistically significant decrease in mean BCVA, which may not be clinically meaningful. There is a small increase in the risk of systemic adverse events for PRN-treated patients. Overall, the results indicate that an individualised treatment approach with anti-VEGF agents, using visual acuity and OCT-guided re-treatment criteria, may be appropriate for most patients with nAMD.
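
    A minimal sketch of the inverse-variance pooling behind such a weighted mean difference, in R with the metafor package; the per-study effect estimates and standard errors below are hypothetical placeholders, not the data of the included trials:

        # Hypothetical mean BCVA differences (ETDRS letters, monthly minus PRN)
        # from three head-to-head trials, with standard errors -- illustrative only
        library(metafor)
        yi  <- c(2.4, 1.5, 1.8)   # per-study mean differences
        sei <- c(0.9, 0.7, 1.1)   # per-study standard errors
        # Fixed-effect inverse-variance pooling; summary() reports the pooled
        # estimate with its 95% CI, the quantity cited in the abstract
        res <- rma(yi = yi, sei = sei, method = "FE")
        summary(res)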

    Thermodynamics of C incorporation on Si(100) from ab initio calculations

    We study the thermodynamics of C incorporation on Si(100), a system where strain and chemical effects are both important. Our analysis is based on first-principles atomistic calculations to obtain the important lowest-energy structures, and a classical effective Hamiltonian which is employed to represent the long-range strain effects and incorporate the thermodynamic aspects. We determine the equilibrium phase diagram in temperature and C chemical potential, which allows us to predict the mesoscopic structure of the system that should be observed under experimentally relevant conditions.
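
    For context, the standard ab initio thermodynamics construction assumed here (a generic sketch, not necessarily the authors' exact Hamiltonian) compares structures with N_C incorporated carbon atoms through the grand-canonical formation energy

        E_f(N_C, \mu_C) = E(N_C) - E(0) - N_C \, \mu_C

    so that, at a given temperature T and carbon chemical potential \mu_C, the equilibrium phase is the one minimizing the corresponding free energy; scanning the (T, \mu_C) plane traces out the phase diagram.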

    On composite likelihood in bivariate meta-analysis of diagnostic test accuracy studies

    The composite likelihood (CL) method is among the computational methods used for estimation of the generalized linear mixed model (GLMM) in the context of bivariate meta-analysis of diagnostic test accuracy studies. Its advantage is that the likelihood can be derived conveniently under the assumption of independence between the random effects, but there has been no clear analysis of the merit or necessity of this method. For the synthesis of diagnostic test accuracy studies, a copula mixed model has been proposed in the biostatistics literature. This general model includes the GLMM as a special case and can also allow for flexible dependence modelling, different from assuming simple linear correlation structures, normality, and tail independence in the joint tails. A maximum likelihood (ML) method, based on evaluating the bi-dimensional integrals of the likelihood with quadrature methods, has been proposed, and in fact it eases any computational difficulty that might be caused by the double integral in the likelihood function. Both methods are thoroughly examined with extensive simulations and illustrated with data from a published meta-analysis. It is shown that the ML method has no non-convergence issues or computational difficulties and at the same time allows estimation of the dependence between study-specific sensitivity and specificity, and thus prediction via summary receiver operating characteristic curves.
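
    To make the computational contrast concrete, here is a schematic (assumed, standard) form of the per-study likelihood contributions, with within-study densities f_j(y_{ij} | u_j) and random effects (u_1, u_2) with joint density g:

        L_i^{ML} = \int \int f_1(y_{i1} \mid u_1) \, f_2(y_{i2} \mid u_2) \, g(u_1, u_2) \, du_1 \, du_2

        L_i^{CL} = \left( \int f_1(y_{i1} \mid u_1) \, g_1(u_1) \, du_1 \right) \left( \int f_2(y_{i2} \mid u_2) \, g_2(u_2) \, du_2 \right)

    The CL replaces the bivariate integral with two univariate ones, at the price of discarding the between-outcome dependence, which is precisely what the quadrature-based ML method retains.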

    A Microsoft-Excel-based tool for running and critically appraising network meta-analyses--an overview and application of NetMetaXL.

    BACKGROUND: The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package for conducting network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generating graphical output from network meta-analyses often relies on software packages different from those used for the analyses themselves. METHODS: We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation. RESULTS: We demonstrate the application of NetMetaXL using data from a previously published network meta-analysis comparing combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate the results of the previous publication while demonstrating the result summaries generated by the software. CONCLUSIONS: Use of the freely available NetMetaXL makes running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations, which are frequently Excel-based.

    Funding: CC is a recipient of a Vanier Canada Graduate Scholarship from the Canadian Institutes of Health Research (funding reference number CGV 121171) and is a trainee on the Canadian Institutes of Health Research Drug Safety and Effectiveness Network team grant (funding reference number 116573). BH is funded by a New Investigator award from the Canadian Institutes of Health Research and the Drug Safety and Effectiveness Network. This research was partly supported by funding from CADTH as part of a project to develop Excel-based tools to support the conduct of health technology assessments. This research was also supported by Cornerstone Research Group.
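
    NetMetaXL itself is an Excel/VBA front end to WinBUGS; as a hedged sketch of the analogous workflow (prepare arm-level data, set the model, run, summarize, rank) in R, the example below uses the gemtc package, which calls JAGS rather than WinBUGS. The data frame is a hypothetical placeholder:

        library(gemtc)
        # Hypothetical arm-level binary outcomes -- illustrative only
        data_ab <- data.frame(
          study      = c("S1", "S1", "S2", "S2", "S3", "S3"),
          treatment  = c("A", "B", "A", "C", "B", "C"),
          responders = c(12, 15, 20, 18, 14, 22),
          sampleSize = c(50, 50, 80, 80, 60, 60)
        )
        network <- mtc.network(data.ab = data_ab)              # build the evidence network
        model   <- mtc.model(network, linearModel = "random")  # random-effects NMA
        result  <- mtc.run(model, n.adapt = 5000, n.iter = 20000)
        summary(result)           # posterior relative treatment effects
        rank.probability(result)  # ranking probabilities (rankogram input)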

    A mixed effect model for bivariate meta-analysis of diagnostic test accuracy studies using a copula representation of the random effects distribution

    Diagnostic test accuracy studies typically report the number of true positives, false positives, true negatives and false negatives. There usually exists a negative association between the numbers of true positives and true negatives, because studies that adopt a less stringent criterion for declaring a test positive yield higher sensitivities and lower specificities. A generalized linear mixed model (GLMM) is currently recommended for synthesizing diagnostic test accuracy studies. We propose a copula mixed model for bivariate meta-analysis of diagnostic test accuracy studies. Our general model includes the GLMM as a special case and can also operate on the original scale of sensitivity and specificity. Summary receiver operating characteristic curves are deduced for the proposed model through quantile regression techniques and different characterizations of the bivariate random effects distribution. Our general methodology is demonstrated with an extensive simulation study and illustrated by re-analysing the data of two published meta-analyses. Our study suggests that the copula mixed model can improve on the GLMM in fit to data, which makes the argument for moving to copula random effects models. Our modelling framework is implemented in the package CopulaREMADA within the open-source statistical environment R.
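
    A schematic (assumed) form of the copula representation described above: with x_1, x_2 the study-specific sensitivity and specificity, within-study binomial likelihoods, between-study marginal cdfs F_1, F_2 with densities f_1, f_2, and a copula density c( . , . ; \theta), the contribution of study i is

        L_i = \int_0^1 \int_0^1 \mathrm{Bin}(y_{i1}; n_{i1}, x_1) \, \mathrm{Bin}(y_{i2}; n_{i2}, x_2) \, c\big(F_1(x_1), F_2(x_2); \theta\big) \, f_1(x_1) \, f_2(x_2) \, dx_1 \, dx_2

    Normal margins with a normal copula recover the GLMM, while, for example, beta margins keep the model on the original (0, 1) scale of sensitivity and specificity.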

    Subdecoherent Information Encoding in a Quantum-Dot Array

    A potential implementation of quantum-information schemes in semiconductor nanostructures is studied. To this end, the formal theory of quantum encoding for avoiding errors is recalled and the existence of noiseless states for model systems is discussed. Based on this theoretical framework, we analyze the possibility of designing noiseless quantum codes in realistic semiconductor structures. In the specific implementation considered, information is encoded in the lowest-energy sector of charge excitations of a linear array of quantum dots. The decoherence channel considered is electron-phonon coupling. We show that, besides the well-known phonon bottleneck, which reduces single-qubit decoherence, suitable many-qubit initial preparation as well as register design may enhance the decoherence time by several orders of magnitude. This behaviour stems from the effective one-dimensional character of the phononic environment in the relevant region of physical parameters.
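
    For concreteness, a textbook (not paper-specific) example of such a noiseless code: under purely collective dephasing, where every qubit acquires the same random phase \phi, the two-qubit states

        |0_L\rangle = |01\rangle, \qquad |1_L\rangle = |10\rangle

    span a decoherence-free subspace, since collective dephasing maps \alpha |0_L\rangle + \beta |1_L\rangle to e^{i\phi} (\alpha |0_L\rangle + \beta |1_L\rangle), an overall phase that leaves the encoded coherence intact.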

    Hyper-domains in exchange bias micro-stripe pattern

    A combination of experimental techniques, e.g. vector-MOKE magnetometry, Kerr microscopy and polarized neutron reflectometry, was applied to study the field-induced evolution of the magnetization distribution over a periodic pattern of alternating exchange bias (EB) stripes. The lateral structure is imprinted into a continuous ferromagnetic/antiferromagnetic EB bilayer via laterally selective exposure to He-ion irradiation in an applied field. This creates an alternating frozen-in interfacial EB field competing with the external field in the course of the re-magnetization. It was found that, in a magnetic field applied at an angle to the EB axis (which is parallel to the stripes), the re-magnetization process proceeds through a variety of stages. These include coherent rotation of the magnetization towards the EB axis, precipitation of small random (ripple) domains, formation of a stripe-like alternation of the magnetization, and development of a state in which the magnetization forms large hyper-domains comprising a number of stripes. Each of these magnetic states is quantitatively characterized via a comprehensive analysis of data on specular and off-specular polarized neutron reflectivity. The results are discussed within a phenomenological model containing a few parameters, which can readily be controlled by designing systems with a desired configuration of magnetic moments of micro- and nano-elements.
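
    As an illustration of the kind of few-parameter phenomenological description invoked above (a generic single-domain energy per stripe, not necessarily the authors' specific model): with magnetization angle \theta, external field H applied at angle \theta_H, uniaxial anisotropy constant K, saturation magnetization M, and an effective exchange-bias coupling J_{EB} along the stripe axis,

        E(\theta) = K \sin^2\theta - J_{EB} \cos\theta - \mu_0 M H \cos(\theta - \theta_H)

    minimizing E(\theta) stripe by stripe, with the sign of J_{EB} alternating between neighbouring stripes, captures the competition between the frozen-in EB field and the external field.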

    Treatment of depressive disorders in primary care - protocol of a multiple treatment systematic review of randomized controlled trials

    Background: Several systematic reviews have summarized the evidence for specific treatments of primary care patients suffering from depression. However, it is not possible to answer the question of how the available treatment options compare with each other, because review methods differ. We aim to systematically review and compare the available evidence for the effectiveness of pharmacological, psychological, and combined treatments for patients with depressive disorders in primary care. Methods/Design: To be included, studies have to be randomized trials comparing antidepressant medication (tricyclic antidepressants, selective serotonin reuptake inhibitors (SSRIs), hypericum extracts, other agents) and/or psychological therapies (e.g. interpersonal psychotherapy, cognitive therapy, behavioural therapy, short dynamically-oriented psychotherapy) with another active therapy, placebo or sham intervention, routine care, or no treatment in primary care patients in the acute phase of a depressive episode. The main outcome measure is response after completion of acute-phase treatment. Eligible studies will be identified from available systematic reviews, from searches in electronic databases (Medline, Embase and Central), trial registers, and citation tracking. Two reviewers will independently extract study data and assess the risk of bias using the Cochrane Collaboration's corresponding tool. Meta-analyses (random effects model, inverse variance weighting) will be performed for direct comparisons of single interventions and for groups of similar interventions (e.g. SSRIs vs. tricyclics) and defined time windows (up to 3 months and beyond). If possible, a global analysis of the relative effectiveness of treatments will be estimated from all available direct and indirect evidence present in a network of treatments and comparisons. Discussion: Practitioners want to know not only whether there is evidence that a specific treatment is more effective than placebo, but also how the treatment options compare with each other. Therefore, we believe that a multiple-treatment systematic review of primary-care-based randomized controlled trials of the most important therapies against depression is timely.
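
    A minimal hedged sketch of the planned direct-plus-indirect synthesis, using the R package netmeta on contrast-level data; the effect estimates below are hypothetical placeholders:

        library(netmeta)
        # Hypothetical pairwise log odds ratios (TE) with standard errors -- illustrative only
        d <- data.frame(
          TE      = c(0.30, 0.10, 0.25),
          seTE    = c(0.12, 0.15, 0.10),
          treat1  = c("SSRI",    "SSRI",      "Tricyclic"),
          treat2  = c("Placebo", "Tricyclic", "Placebo"),
          studlab = c("Trial 1", "Trial 2", "Trial 3")
        )
        # Random-effects network meta-analysis combining direct and indirect evidence
        nma <- netmeta(TE, seTE, treat1, treat2, studlab,
                       data = d, sm = "OR", random = TRUE)
        summary(nma)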