
    Recent advances in the detection and management of early gastric cancer and its precursors

    Despite declines in incidence, gastric cancer remains a disease with a poor prognosis and limited treatment options due to its often late stage of diagnosis. In contrast, early gastric cancer has a good to excellent prognosis, with 5-year survival rates as high as 92.6% after endoscopic resection. There remains an East-West divide for this disease, with high-incidence countries such as Japan seeing earlier diagnoses and reduced mortality, in part thanks to the success of a national screening programme. With missed cancers still prevalent at upper endoscopy in the West, and variable approaches to assessment of the high-risk stomach, the quality of endoscopy we provide must be a focus for improvement, with particular attention paid to the minority of patients at increased cancer risk. High-definition endoscopy with virtual chromoendoscopy is superior to white light endoscopy alone. These enhanced imaging modalities allow the experienced endoscopist to accurately and robustly detect high-risk lesions in the stomach. An endoscopy-led staging strategy would mean biopsies could be targeted to histologically confirm the endoscopic impression of premalignant lesions including atrophic gastritis, gastric intestinal metaplasia, dysplasia and early cancer. This approach to quality improvement will reduce missed diagnoses and, combined with the latest endoscopic resection techniques performed at expert centres, will improve early detection and ultimately patient outcomes. In this review, we outline the latest evidence relating to diagnosis, staging and treatment of early gastric cancer and its precursor lesions.

    Complement-Mediated Virus Infectivity Neutralisation by HLA Antibodies Is Associated with Sterilising Immunity to SIV Challenge in the Macaque Model for HIV/AIDS.

    Sterilising immunity is a desired outcome for vaccination against human immunodeficiency virus (HIV) and has been observed in the macaque model using inactivated simian immunodeficiency virus (SIV). This protection was attributed to antibodies specific for cell proteins, including human leucocyte antigens (HLA) class I and II, incorporated into virions during vaccine and challenge virus preparation. We show here, using HLA bead arrays, that vaccinated macaques protected from virus challenge had higher serum antibody reactivity compared with non-protected animals. Moreover, reactivity was shown to be directed against HLA framework determinants. Previous studies failed to correlate serum antibody-mediated virus neutralisation with protection and were confounded by cytotoxic effects. Using a virus entry assay based on TZM-bl cells, we now report that, in the presence of complement, serum antibody titres that neutralise virus infectivity were higher in protected animals. We propose that complement-augmented virus neutralisation is a key factor in inducing sterilising immunity and may be difficult to achieve with HIV/SIV Env-based vaccines. Understanding how to overcome the apparent block of inactivated SIV vaccines to elicit anti-envelope protein antibodies that effectively engage the complement system could enable the development of novel anti-HIV antibody vaccines that induce potent, virolytic serological responses.

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions.
Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment. Funding: Oak Foundation.
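The arithmetic behind the default factor is simple enough to sketch. A minimal illustration, assuming the standard WHO/IPCS split of each factor of 10 into toxicokinetic and toxicodynamic sub-factors; the NOAEL value and function name are hypothetical, not taken from the article:

```python
def tolerable_dose(noael_mg_per_kg_day, interspecies=10.0, intraspecies=10.0):
    """Derive a tolerable human dose from an animal NOAEL by applying
    the default 100-fold uncertainty factor (10 x 10)."""
    return noael_mg_per_kg_day / (interspecies * intraspecies)

# The WHO/IPCS scheme splits each default factor of 10 into toxicokinetic
# (TK) and toxicodynamic (TD) sub-factors:
SUBFACTORS = {
    "interspecies": {"TK": 4.0, "TD": 2.5},    # 4.0 * 2.5 = 10
    "intraspecies": {"TK": 3.16, "TD": 3.16},  # ~10^0.5 each
}

# Hypothetical NOAEL of 50 mg/kg/day -> tolerable dose of 0.5 mg/kg/day
print(tolerable_dose(50.0))
```

As the article argues, nothing in this multiplication guarantees a worst-case margin: whether the resulting tolerable dose actually achieves a stated protection level is an empirical question the default factors leave open.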

    Singlet Portal to the Hidden Sector

    Ultraviolet physics typically induces a kinetic mixing between gauge singlets which is marginal and hence non-decoupling in the infrared. In singlet extensions of the minimal supersymmetric standard model, e.g. the next-to-minimal supersymmetric standard model, this furnishes a well-motivated and distinctive portal connecting the visible sector to any hidden sector which contains a singlet chiral superfield. In the presence of singlet kinetic mixing, the hidden sector automatically acquires a light mass scale in the range 0.1 - 100 GeV induced by electroweak symmetry breaking. In theories with R-parity conservation, superparticles produced at the LHC invariably cascade decay into hidden sector particles. Since the hidden sector singlet couples to the visible sector via the Higgs sector, these cascades necessarily produce a Higgs boson in an order 0.01 - 1 fraction of events. Furthermore, supersymmetric cascades typically produce highly boosted, low-mass hidden sector singlets decaying visibly, albeit with displacement, into the heaviest standard model particles which are kinematically accessible. We study experimental constraints on this broad class of theories, as well as the role of singlet kinetic mixing in direct detection of hidden sector dark matter. We also present related theories in which a hidden sector singlet interacts with the visible sector through kinetic mixing with right-handed neutrinos. Comment: 12 pages, 5 figures.

    Spatially Explicit Data: Stewardship and Ethical Challenges in Science

    Scholarly communication is at an unprecedented turning point created in part by the increasing saliency of data stewardship and data sharing. Formal data management plans represent a new emphasis in research, enabling access to data at higher volumes and more quickly, and the potential for replication and augmentation of existing research. Data sharing has recently transformed the practice, scope, content, and applicability of research in several disciplines, in particular in relation to spatially specific data. This offers exciting potential, but the most effective ways to implement such changes, particularly for disciplines involving human subjects and other sensitive information, demand consideration. Data management plans, stewardship, and sharing impart distinctive technical, sociological, and ethical challenges that remain to be adequately identified and remedied. Here, we consider these challenges and propose potential solutions for their amelioration.

    Theoretical analysis of the dose dependence of the oxygen enhancement ratio and its relevance for clinical applications

    Background: The increased resistance of hypoxic cells to ionizing radiation is usually believed to be the primary reason for treatment failure in tumors with oxygen-deficient areas. This oxygen effect can be expressed quantitatively by the oxygen enhancement ratio (OER). Here we investigate theoretically the dependence of the OER on the applied local dose for different types of ionizing irradiation and discuss its importance for clinical applications in radiotherapy in two scenarios: small dose variations during hypoxia-based dose painting and larger dose changes introduced by altered fractionation schemes. Methods: Using the widespread Alper-Howard-Flanders and standard linear-quadratic (LQ) models, OER calculations are performed for T1 human kidney and V79 Chinese hamster cells for various dose levels and various hypoxic oxygen partial pressures (pO2) between 0.01 and 20 mmHg, as present in clinical situations in vivo. Our work comprises the analysis of both low linear energy transfer (LET) treatment with photons or protons and high-LET treatment with heavy ions. A detailed analysis of experimental data from the literature with respect to the dose dependence of the oxygen effect is performed, revealing controversial opinions as to whether the OER increases, decreases or stays constant with dose. Results: The behavior of the OER with dose per fraction depends primarily on the ratios of the LQ parameters alpha and beta under hypoxic and aerobic conditions, which themselves depend on LET, pO2 and the cell or tissue type. According to our calculations, OER variations with dose in vivo for low-LET treatments are moderate, with changes in the OER of up to 11% for dose painting (1 or 3 Gy per fraction compared to 2 Gy) and up to 22% in hyper-/hypofractionation (0.5 or 20 Gy per fraction compared to 2 Gy) for oxygen tensions between 0.2 and 20 mmHg typically measured clinically in hypoxic tumors.
For extremely hypoxic cells (0.01 mmHg), the dose dependence of the OER becomes more pronounced (up to 36%). For high LET, OER variations of up to 4% over the whole range of oxygen tensions between 0.01 and 20 mmHg were found, much smaller than for low LET. Conclusions: The formalism presented in this paper can be used for various tissue and radiation types to estimate OER variations with dose and help decide in clinical practice whether dose changes in dose painting or in fractionation can bring more benefit in terms of the OER in the treatment of a specific hypoxic tumor.
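In the LQ framework the dose dependence of the OER can be made explicit: for a fixed effect level -ln(S) = E, the OER is the ratio of the hypoxic to the aerobic dose producing that effect. A minimal sketch; the alpha and beta values below are illustrative placeholders, not the fitted T1 or V79 parameters from the paper:

```python
import math

def dose_for_effect(alpha, beta, effect):
    """Invert the LQ relation -ln(S) = alpha*D + beta*D**2 for the dose D,
    i.e. take the positive root of beta*D**2 + alpha*D - effect = 0."""
    return (math.sqrt(alpha**2 + 4 * beta * effect) - alpha) / (2 * beta)

def oer(effect, aerobic, hypoxic):
    """OER at a given effect level: hypoxic dose / aerobic dose."""
    return dose_for_effect(*hypoxic, effect) / dose_for_effect(*aerobic, effect)

# Illustrative (alpha [1/Gy], beta [1/Gy^2]) pairs, chosen so that the
# low-dose limit alpha_ox/alpha_hyp = 3 differs from the high-dose limit
# sqrt(beta_ox/beta_hyp) = 2, making the dose dependence visible.
aerobic = (0.30, 0.030)
hypoxic = (0.10, 0.0075)

for effect in (0.5, 2.0, 5.0):
    print(round(oer(effect, aerobic, hypoxic), 2))
```

With these parameters the OER drifts from about 2.7 at low doses toward 2 at very high doses, echoing the paper's point that the size and direction of the drift are set entirely by the hypoxic-to-aerobic ratios of alpha and beta.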

    Age-related changes in P wave morphology in healthy subjects

    Background: We have previously documented significant differences in orthogonal P wave morphology between patients with and without paroxysmal atrial fibrillation (PAF). However, little data exist concerning normal P wave morphology. This study aimed to explore orthogonal P wave morphology and its variations in healthy subjects. Methods: 120 healthy volunteers were included, evenly distributed in decades from 20–80 years of age: 60 men (age 50+/-17) and 60 women (50+/-16). Six-minute 12-lead ECG registrations were acquired and transformed into orthogonal leads. Using a previously described P wave triggered P wave signal averaging method, we compared similarities and differences in P wave morphologies. Results: Orthogonal P wave morphology in healthy individuals was predominantly positive in Leads X and Y. In Lead Z, one third had negative morphology and two thirds a biphasic one with a transition from negative to positive. The latter P wave morphology type was significantly more common after the age of 50 (P < 0.01). P wave duration (PWD) increased with age, being slightly longer in subjects older than 50 (121+/-13 ms vs. 128+/-12 ms, P < 0.005). Minimal intraindividual variation of P wave morphology was observed. Conclusion: Changes of signal-averaged orthogonal P wave morphology (biphasic signal in Lead Z), earlier reported in PAF patients, are common in healthy subjects and appear predominantly after the age of 50. Subtle age-related prolongation of PWD is unlikely to be sufficient as a sole explanation of this finding, which is thought to represent interatrial conduction disturbances. To serve as a future reference, P wave morphology parameters of the healthy subjects are provided.

    The Rewiring of Ubiquitination Targets in a Pathogenic Yeast Promotes Metabolic Flexibility, Host Colonization and Virulence

    Funding: This work was funded by the European Research Council [http://erc.europa.eu/], AJPB (STRIFE Advanced Grant; C-2009-AdG-249793). The work was also supported by: the Wellcome Trust [www.wellcome.ac.uk], AJPB (080088, 097377); the UK Biotechnology and Biological Sciences Research Council [www.bbsrc.ac.uk], AJPB (BB/F00513X/1, BB/K017365/1); the CNPq-Brazil [http://cnpq.br], GMA (Science without Borders fellowship 202976/2014-9); and the National Centre for the Replacement, Refinement and Reduction of Animals in Research [www.nc3rs.org.uk], DMM (NC/K000306/1). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Acknowledgments: We thank Dr. Elizabeth Johnson (Mycology Reference Laboratory, Bristol) for providing strains, the Aberdeen Proteomics facility for the biotyping of S. cerevisiae clinical isolates, and Euroscarf for providing S. cerevisiae strains and plasmids. We are grateful to our Microscopy Facility in the Institute of Medical Sciences for their expert help with the electron microscopy, and to our friends in the Aberdeen Fungal Group for insightful discussions. Peer reviewed. Publisher PDF.

    Fluids in cosmology

    We review the role of fluids in cosmology, first introducing them in General Relativity and then applying them to an FRW universe model. We describe how relativistic and non-relativistic components evolve in the background dynamics. We also introduce scalar fields to show that they can drive inflationary dynamics at very early times (inflation) and at late times (quintessence). We then study the thermodynamical properties of the fluids and, lastly, their perturbed kinematics. We emphasize the constraints on parameters from recent cosmological probes. Comment: 34 pages, 4 figures; version accepted as an invited review for the book "Computational and Experimental Fluid Mechanics with Applications to Physics, Engineering and the Environment". Version 2: typos corrected and references expanded.
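The background evolution summarized above follows from energy conservation for each fluid with equation of state p = w rho; a standard sketch of the key relation:

```latex
\dot{\rho} + 3H(1+w)\rho = 0
\quad\Longrightarrow\quad
\rho \propto a^{-3(1+w)}
```

so matter (w = 0) dilutes as a^{-3}, radiation (w = 1/3) as a^{-4}, and a cosmological constant (w = -1) does not dilute at all; a slowly rolling scalar field mimics w close to -1, which is how the same formalism covers both inflation and quintessence.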